Selected article for: "art state and prediction performance"

Author: Zheng, Qinqing; Chen, Shuxiao; Long, Qi; Su, Weijie J.
Title: Federated $f$-Differential Privacy
  • Cord-id: 8wd4o9k6
  • Document date: 2021-02-22
  • ID: 8wd4o9k6
    Snippet: Federated learning (FL) is a training paradigm where the clients collaboratively learn models by repeatedly sharing information without compromising much on the privacy of their local sensitive data. In this paper, we introduce federated $f$-differential privacy, a new notion specifically tailored to the federated setting, based on the framework of Gaussian differential privacy. Federated $f$-differential privacy operates on record level: it provides the privacy guarantee on each individual record …
    Document: Federated learning (FL) is a training paradigm where the clients collaboratively learn models by repeatedly sharing information without compromising much on the privacy of their local sensitive data. In this paper, we introduce federated $f$-differential privacy, a new notion specifically tailored to the federated setting, based on the framework of Gaussian differential privacy. Federated $f$-differential privacy operates on record level: it provides the privacy guarantee on each individual record of one client's data against adversaries. We then propose a generic private federated learning framework {PriFedSync} that accommodates a large family of state-of-the-art FL algorithms, which provably achieves federated $f$-differential privacy. Finally, we empirically demonstrate the trade-off between privacy guarantee and prediction performance for models trained by {PriFedSync} in computer vision tasks.
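
    Note: the paper builds on the Gaussian differential privacy (GDP) framework, which quantifies privacy through trade-off functions. The sketch below recalls the standard, non-federated definitions from that framework as background only; it is not the paper's federated $f$-DP notion, and $M$, $S$, $S'$, $\phi$, and $\mu$ are generic placeholders.

        % Trade-off function of distributions P and Q, taken over rejection rules \phi:
        \[ T(P, Q)(\alpha) \;=\; \inf_{\phi} \bigl\{\, 1 - \mathbb{E}_{Q}[\phi] \;:\; \mathbb{E}_{P}[\phi] \le \alpha \,\bigr\} \]
        % A mechanism M is f-differentially private if, for all neighboring datasets S and S',
        \[ T\bigl(M(S),\, M(S')\bigr) \;\ge\; f \]
        % Gaussian DP is the special case f = G_\mu, the trade-off function of N(0,1) versus N(\mu,1):
        \[ G_\mu(\alpha) \;=\; \Phi\bigl(\Phi^{-1}(1 - \alpha) - \mu\bigr), \qquad \mu \ge 0, \]
        % where \Phi is the standard normal CDF; a larger trade-off function (smaller \mu) means stronger privacy.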

    Search related documents:
    Co-phrase search for related documents:
    • absence presence and local global: 1, 2, 3, 4
    • absence presence and local training: 1, 2
    • academic research and local global: 1, 2, 3, 4
    • academic research and local training: 1, 2