Deakin University
Differentially private data analysis

chapter
Posted on 2023-01-30, 01:37, authored by T. Zhu, Gang Li, Wanlei Zhou, P. S. Yu
The essential task of differentially private data analysis is extending current non-private algorithms into differentially private ones. This extension can be realized through several frameworks, roughly categorized into Laplace/exponential frameworks and private learning frameworks. The Laplace/exponential framework incorporates the Laplace or exponential mechanism directly into a non-private analysis algorithm: for example, Laplace noise can be added to the counting steps of the algorithm, or the exponential mechanism can be employed when making selections. Private learning frameworks treat data analysis as a learning problem posed in terms of optimization, solved by defining a series of objective functions. Compared with the Laplace/exponential framework, a private learning framework has a clearer target, and its results are easier to compare in terms of risk bound or sample complexity. However, private learning frameworks can handle only a limited set of learning algorithms, whereas nearly all types of analysis algorithms can be implemented within a Laplace/exponential framework.
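As a rough illustration of the Laplace/exponential framework, the sketch below adds Laplace noise to a simple count query in Python. The dataset, the predicate, and the epsilon value are illustrative assumptions, not taken from the chapter.

    import numpy as np

    def laplace_count(records, predicate, epsilon):
        # A count query has sensitivity 1: adding or removing one record
        # changes the count by at most 1, so Laplace noise with scale
        # 1/epsilon makes this step epsilon-differentially private.
        true_count = sum(1 for r in records if predicate(r))
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    # Usage (hypothetical data): privately count records above a threshold
    # under a privacy budget of epsilon = 0.5.
    ages = [23, 45, 31, 60, 18, 52]
    noisy_count = laplace_count(ages, lambda a: a > 30, epsilon=0.5)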

Volume: 69
Pagination: 49-65
ISSN: 1568-2633
Publication classification: X Not reportable; BN Other book chapter, or book chapter not attributed to Deakin
