arxiv:2212.05796

Generalizing DP-SGD with Shuffling and Batch Clipping

Published on Dec 12, 2022
Authors:

Abstract

Classical differentially private DP-SGD implements individual clipping with random subsampling, which forces a mini-batch SGD approach. We provide a general differentially private algorithmic framework that goes beyond DP-SGD and allows any first-order optimizer (e.g., classical SGD and momentum-based SGD) in combination with batch clipping, which clips an aggregate of computed gradients rather than summing clipped gradients (as is done in individual clipping). The framework also admits sampling techniques beyond random subsampling, such as shuffling. Our DP analysis follows the f-DP approach and introduces a new proof technique that allows us to derive simple closed-form expressions and to analyse group privacy. In particular, for E epochs of work and groups of size g, we show a √(gE) DP dependency for batch clipping with shuffling.

AI-generated summary

A general differentially private framework extends beyond DP-SGD by enabling first-order optimizers with batch clipping and advanced sampling techniques, providing improved privacy guarantees through f-DP analysis.
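To make the contrast concrete, here is a minimal, hypothetical sketch (not the paper's implementation) of the two clipping styles on a list of per-example gradients: individual clipping as in classical DP-SGD versus batch clipping of a single aggregate, plus a shuffling sampler in place of random subsampling. The function names, the NumPy gradient representation, and the noise calibration are illustrative assumptions.

```python
import numpy as np

def individual_clipping_step(per_example_grads, params, C, sigma, lr, rng):
    """Classical DP-SGD: clip each per-example gradient to norm C,
    sum the clipped gradients, then add Gaussian noise scaled to C."""
    clipped_sum = np.zeros_like(params)
    for g in per_example_grads:
        norm = max(np.linalg.norm(g), 1e-12)
        clipped_sum += g * min(1.0, C / norm)
    noisy = clipped_sum + rng.normal(0.0, sigma * C, size=params.shape)
    return params - lr * noisy / len(per_example_grads)

def batch_clipping_step(per_example_grads, params, C, sigma, lr, rng):
    """Batch clipping: aggregate the gradients first (here a plain mean),
    clip the single aggregate to norm C, then add Gaussian noise."""
    aggregate = np.mean(per_example_grads, axis=0)
    norm = max(np.linalg.norm(aggregate), 1e-12)
    clipped = aggregate * min(1.0, C / norm)
    noisy = clipped + rng.normal(0.0, sigma * C, size=params.shape)
    return params - lr * noisy

def shuffled_batches(n_examples, batch_size, rng):
    """Shuffling sampler: one random permutation per epoch, cut into
    consecutive batches, so every example is used exactly once per epoch
    (contrast with the random subsampling of classical DP-SGD)."""
    perm = rng.permutation(n_examples)
    return [perm[i:i + batch_size] for i in range(0, n_examples, batch_size)]

# Toy usage with random "gradients" standing in for real backprop output.
rng = np.random.default_rng(0)
params = np.zeros(5)
grads = [rng.normal(size=5) for _ in range(8)]
params_ind = individual_clipping_step(grads, params, C=1.0, sigma=1.0, lr=0.1, rng=rng)
params_bat = batch_clipping_step(grads, params, C=1.0, sigma=1.0, lr=0.1, rng=rng)
batches = shuffled_batches(n_examples=32, batch_size=8, rng=rng)
```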
