Title: Low-rank mechanism: optimizing batch queries under differential privacy
Authors: Yuan, Ganzhao
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2012
Source: Yuan, G., Zhang, Z., Winslett, M., Xiao, X., Yang, Y., & Hao, F. (2012). Low-rank mechanism: optimizing batch queries under differential privacy. Proceedings of the VLDB Endowment, 5(11), 1352-1363.
Series/Report no.: Proceedings of the VLDB Endowment
Abstract: Differential privacy is a promising privacy-preserving paradigm for statistical query processing over sensitive data. It works by injecting random noise into each query result, such that it is provably hard for an adversary to infer the presence or absence of any individual record from the published noisy results. The main objective in differentially private query processing is to maximize the accuracy of the query results while satisfying the privacy guarantees. Previous work, notably the matrix mechanism, has suggested that processing a batch of correlated queries as a whole can achieve considerable accuracy gains compared to answering them individually. However, as we point out in this paper, the matrix mechanism is mainly of theoretical interest; in particular, several inherent problems in its design limit its accuracy in practice, which almost never exceeds that of naive methods. In fact, we are not aware of any existing solution that can effectively optimize a query batch under differential privacy. Motivated by this, we propose the Low-Rank Mechanism (LRM), the first practical differentially private technique for answering batch queries with high accuracy, based on a low-rank approximation of the workload matrix. We prove that the accuracy provided by LRM is close to the theoretical lower bound for any mechanism answering a batch of queries under differential privacy. Extensive experiments on real data demonstrate that LRM consistently outperforms state-of-the-art query processing solutions under differential privacy, by large margins.
URI: https://hdl.handle.net/10356/102393
Rights: © 2012 VLDB Endowment.
Fulltext: No fulltext available
Appears in Collections: SCSE Journal Articles
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.
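To make the noise-injection idea in the abstract concrete, here is a minimal sketch of the baseline Laplace mechanism for a batch of linear counting queries. This is the naive per-batch approach the paper improves on, not the paper's LRM algorithm; the workload matrix, histogram, and epsilon value below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(workload, data, epsilon):
    """Answer a batch of linear counting queries with Laplace noise.

    Each row of `workload` is one query over the histogram `data`.
    The L1 sensitivity of the batch is the largest column sum of
    |workload| (the most any single record can shift all answers).
    """
    sensitivity = np.abs(workload).sum(axis=0).max()
    true_answers = workload @ data
    noise = rng.laplace(scale=sensitivity / epsilon,
                        size=true_answers.shape)
    return true_answers + noise

# Toy workload: three overlapping range queries over a 4-bin histogram.
W = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [1, 1, 1, 1]])
x = np.array([10, 20, 30, 40])
print(laplace_mechanism(W, x, epsilon=1.0))
```

Because all three queries touch the second bin, the batch sensitivity is 3, so every answer is perturbed with noise of scale 3/epsilon even though each individual query has sensitivity 1; this is the kind of waste that batch optimization targets.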
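The low-rank factorization idea behind LRM can also be sketched: factor the workload as W ≈ B·L, add noise only to the few low-rank query answers L·x, then reconstruct all answers via B. The truncated SVD below is a hypothetical stand-in for illustration; the paper instead obtains the factorization by solving an optimization problem, and the function name and parameters are assumptions, not the authors' API.

```python
import numpy as np

rng = np.random.default_rng(1)

def low_rank_answer(workload, data, epsilon, rank):
    """Illustrative low-rank batch answering (SVD stand-in, not LRM).

    Factor workload ~= B @ L with a rank-`rank` truncated SVD,
    perturb the `rank` intermediate answers L @ data with Laplace
    noise calibrated to L's sensitivity, then map back through B.
    """
    U, s, Vt = np.linalg.svd(workload, full_matrices=False)
    B = U[:, :rank] * s[:rank]   # m x rank reconstruction matrix
    L = Vt[:rank, :]             # rank x n compressed workload
    sensitivity = np.abs(L).sum(axis=0).max()
    noisy = L @ data + rng.laplace(scale=sensitivity / epsilon, size=rank)
    return B @ noisy             # approximate answers to all m queries
```

Only `rank` noisy values are released instead of one per query, so for highly correlated workloads the noise added per final answer can be much smaller than in the per-query baseline.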