A Google TechTalk, presented by Jalaj Upadhyay (Rutgers), 2023/05/03

ABSTRACT: Differentially private continual observation has gained renewed interest due to its wide applications, most notably in large-scale private optimization. In such applications, concrete (non-asymptotic) bounds on the error are important, since they allow the privacy parameter to be reduced. The standard mechanism for continual counting is the binary mechanism. We present a novel mechanism and show that its mean squared error is both asymptotically optimal and a factor 10 smaller than the error of the binary mechanism. We also show that the constants in our analysis are almost tight by giving non-asymptotic lower and upper bounds that differ only in the constants of lower-order terms. Our algorithm is a matrix mechanism for the counting matrix and takes constant time per release. We also use our explicit factorization of the counting matrix to give an upper bound on the excess risk of the private learning algorithm of Denisov et al. (NeurIPS 2022). Our lower bound for any continual counting mechanism is the first tight lower bound on continual counting under approximate differential privacy. It is achieved using a new lower bound on a factorization norm in terms of the singular values of the matrix. We believe this technique will be useful in proving lower bounds for a larger class of linear queries. To illustrate the power of this technique, we show the first lower bound on the mean squared error for answering parity queries. Time permitting, we will show some new results on the factorization norm of a more general class of matrices, which allow us to perform counting under decaying sums. Joint work with Monika Henzinger and Sarvagya Upadhyay.
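The matrix-mechanism idea the abstract refers to can be sketched as follows. The counting matrix A (lower-triangular all-ones) maps a stream x to its prefix sums; given any factorization A = L · R, one can release L(Rx + z) with Gaussian noise z, so the error depends on the factors rather than on A itself. The sketch below uses the square-root factorization A = M · M, where M is the lower-triangular Toeplitz matrix built from the Taylor coefficients of (1 - x)^(-1/2); whether this matches the exact factorization used in the talk is an assumption, and the noise scale `sigma` is left uncalibrated (it would have to be set from the desired (ε, δ) budget).

```python
import numpy as np

def sqrt_factor(n):
    # Lower-triangular Toeplitz factor M with M @ M equal to the n x n
    # counting matrix.  Its diagonals carry the Taylor coefficients of
    # (1 - x)^(-1/2):  f_0 = 1,  f_k = f_{k-1} * (2k - 1) / (2k).
    f = np.empty(n)
    f[0] = 1.0
    for k in range(1, n):
        f[k] = f[k - 1] * (2 * k - 1) / (2 * k)
    M = np.zeros((n, n))
    for i in range(n):
        M[i, : i + 1] = f[i::-1]   # M[i, j] = f[i - j] for j <= i
    return M

def private_prefix_sums(x, sigma, rng):
    # Matrix mechanism for A = M @ M: perturb M @ x with Gaussian noise,
    # then map back through M.  Sketch only -- sigma is a placeholder, not
    # a privacy-calibrated value.
    n = len(x)
    M = sqrt_factor(n)
    z = rng.normal(0.0, sigma, size=n)
    return M @ (M @ x + z)

n = 8
A = np.tril(np.ones((n, n)))   # counting matrix: A @ x = prefix sums of x
M = sqrt_factor(n)
assert np.allclose(M @ M, A)   # the factorization is exact at any n
```

The factorization is exact because multiplying two lower-triangular Toeplitz matrices convolves their coefficient sequences, and the coefficients of (1 - x)^(-1/2) convolved with themselves give those of 1/(1 - x), i.e. all ones.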