Authors: Javier Parra-Arnau, Josep Domingo-Ferrer, Jordi Soria-Comas
DOI: 10.1016/J.INFFUS.2019.06.011
Keywords:
Abstract: Differential privacy is one of the most prominent notions in the field of anonymization. However, its strong privacy guarantees very often come at the expense of significantly degrading the utility of the protected data. To cope with this, numerous mechanisms have been studied that reduce the sensitivity of the data and hence the noise required to satisfy this notion. In this paper, we present a generalization of classical microaggregation, in which the aggregated records are replaced by the group mean and additional statistical measures, with the purpose of evaluating it as a sensitivity-reduction mechanism. We propose an anonymization methodology for numerical microdata in which the target of protection is a data set microaggregated in this generalized way, and disclosure-risk limitation is guaranteed through differential privacy via record-level perturbation. Specifically, we describe three algorithms in which microaggregation can be applied either to the entire set of attributes or to groups of attributes independently. Our theoretical analysis computes the sensitivities of the first two central cross moments; we apply fundamental results from matrix perturbation theory to derive bounds on the eigenvalues and eigenvectors of the covariance and coskewness matrices. An extensive experimental evaluation shows enhanced utility for medium and large group sizes. For a range of group sizes, we find evidence that our approach can provide not only higher utility but also higher privacy than traditional microaggregation.
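To make the microaggregation step concrete, the following is a minimal sketch of classical mean-based microaggregation on a single numerical attribute: records are sorted, partitioned into groups of at least k records, and each record is replaced by its group mean. This is an illustrative simplification, not the paper's generalized method (which additionally preserves higher statistical moments), and the function name and grouping heuristic are assumptions for illustration.

```python
import statistics

def microaggregate(values, k):
    """Illustrative 1-D mean-based microaggregation.

    Sorts the values, partitions them into consecutive groups of at
    least k records (the last group absorbs any remainder), and
    replaces each record with its group mean. Larger k means lower
    sensitivity, hence less noise needed for differential privacy,
    at the cost of coarser data.
    """
    n = len(values)
    if k < 1 or n < k:
        raise ValueError("need at least k records")
    order = sorted(range(n), key=lambda i: values[i])
    out = [0.0] * n
    i = 0
    while i < n:
        # If fewer than 2k records remain, put them all in one group
        # so that every group contains at least k records.
        j = n if n - i < 2 * k else i + k
        group = order[i:j]
        mean = statistics.fmean(values[g] for g in group)
        for g in group:
            out[g] = mean
        i = j
    return out
```

For example, `microaggregate([1, 2, 3, 10, 11, 12], 3)` forms the sorted groups {1, 2, 3} and {10, 11, 12} and returns `[2.0, 2.0, 2.0, 11.0, 11.0, 11.0]`; a differentially private mechanism would then add record-level noise calibrated to the (reduced) sensitivity of these group means.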