Differential privacy is a strong privacy model based on the indistinguishability of the statistical outputs of two neighboring datasets, which represent two states of the world: one in which an individual's information is in the dataset and one in which it is not. However, when the information of different individuals is dependent, this representation loses its foundation. To remedy this drawback of differential privacy, we revisit Dalenius's paper [1], which motivated differential privacy, and introduce a new privacy model to control an individual's information disclosure. The rationale of the new model is information-theoretic: the mutual information between one individual's information and the statistical outputs is upper bounded by a small value. The new model precisely captures the weakness of differential privacy when dealing with dependent information. Furthermore, the new model is compatible with differential privacy. When the information of individuals is independent, we prove that any mechanism satisfying $\epsilon$-differential privacy also satisfies the new model. When the information of individuals is dependent, we prove that the group-privacy method used to achieve differential privacy in the dependent case can also be used to achieve the new model. When the dependence among individuals' information is weak, we construct a differentially private mechanism that satisfies the new model with noise magnitude far smaller than that of the group-privacy-based mechanism. These results imply that the new model inherits (almost) all of the mechanisms of differential privacy.
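
As a sketch of the mutual-information requirement (the exact formulation and notation in the body of the paper may differ), for a dataset $X = (X_1, \dots, X_n)$ and a mechanism output $Y = M(X)$, the new model asks that
$$ I(X_i; Y) \le \epsilon \quad \text{for every individual } i \in \{1, \dots, n\}, $$
so that the output leaks at most $\epsilon$ bits of information about any single individual, regardless of dependence among the $X_i$.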