Privacy papers at NeurIPS 2019


I have curated a list of NeurIPS ’19 papers related to privacy and am beginning to read them. I will keep updating the list with paper summaries. Stay tuned!

| Title | Summary |
| --- | --- |
| Private Hypothesis Selection | Given samples from an unknown probability distribution, select a distribution from a fixed set of candidates that is “close” to the unknown distribution in an appropriate distance measure (see the first sketch below the table). |
| Differentially Private Algorithms for Learning Mixtures of Separated Gaussians | Learns the parameters of Gaussian mixture models; the sample complexity is small and no a priori bounds on the parameters of the mixture components are required. |
| Average-Case Averages: Private Algorithms for Smooth Sensitivity and Mean Estimation | |
| Generalization in Generative Adversarial Networks: A Novel Perspective from Privacy Protection | Shows that DP implies generalization, but the concrete examples (the Lipschitz constraint, etc.) do not show how DP is achieved (?). |
| Differentially Private Bayesian Linear Regression | |
| Minimax Optimal Estimation of Approximate Differential Privacy on Neighboring Databases | |
| Locally Private Gaussian Estimation | Each of n users draws a single i.i.d. sample from an unknown Gaussian distribution, and the goal is to estimate the mean of this Gaussian while satisfying local differential privacy for each user (see the second sketch below the table). |
| Capacity Bounded Differential Privacy | Limits the capability of the adversary, e.g., to an adversary that can perform only linear classification. |
| Practical Differentially Private Top-k Selection with Pay-what-you-get Composition | |
| Privacy-Preserving Classification of Personal Text Messages with Secure Multi-Party Computation | |
| Efficiently Estimating Erdos-Renyi Graphs with Node Differential Privacy | |
| Differentially Private Markov Chain Monte Carlo | |
| Differentially Private Bagging: Improved utility and cheaper privacy than subsample-and-aggregate | |
| Oblivious Sampling Algorithms for Private Data Analysis | |
| Differentially Private Anonymized Histograms | |
| Facility Location Problem in Differential Privacy Model Revisited | |
| Private Learning Implies Online Learning: An Efficient Reduction | |
| Online Learning via the Differential Privacy Lens | |
| Elliptical Perturbations for Differential Privacy | |
| Limits of Private Learning with Access to Public Data | |
| Private Testing of Distributions via Sample Permutations | |
| Private Stochastic Convex Optimization with Optimal Rates | |
| Privacy-Preserving Q-Learning with Functional Noise in Continuous Spaces | |
| Privacy Amplification by Mixing and Diffusion Mechanisms | |
| On Differentially Private Graph Sparsification and Applications | |
| An Algorithmic Framework For Differentially Private Data Analysis on Trusted Processors | |
| User-Specified Local Differential Privacy in Unconstrained Adaptive Online Learning | |
| Differentially Private Covariance Estimation | |
| Differentially Private Distributed Data Summarization under Covariate Shift | |
| Locally Private Learning without Interaction Requires Separation | |
| Differential Privacy Has Disparate Impact on Model Accuracy | If the original model is unfair, the unfairness becomes worse once DP is applied. |
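
To make the hypothesis selection problem concrete, here is a minimal sketch of one standard way to select privately among a finite set of candidate distributions: score each candidate by its total variation distance to the empirical distribution and pick one with the exponential mechanism. This is an illustration under simplifying assumptions (finite domain, candidates given as probability vectors), not the algorithm from the paper; `private_hypothesis_selection` and its parameters are hypothetical names.

```python
import numpy as np

def private_hypothesis_selection(samples, candidates, epsilon, rng):
    """Pick a candidate distribution via the exponential mechanism.

    `candidates` is a list of probability vectors over the finite domain
    {0, ..., k-1}; `samples` is an integer array of draws from the
    unknown distribution. (Hypothetical sketch, not the paper's method.)
    """
    k = len(candidates[0])
    n = len(samples)
    empirical = np.bincount(samples, minlength=k) / n
    # Score = negative total variation distance to the empirical
    # distribution. Changing one sample moves two histogram cells by
    # 1/n each, so each score has sensitivity 1/n.
    scores = np.array([-0.5 * np.abs(c - empirical).sum() for c in candidates])
    # Exponential mechanism: P(candidate j) is proportional to
    # exp(epsilon * score_j / (2 * sensitivity)).
    logits = epsilon * scores / (2 * (1.0 / n))
    logits -= logits.max()  # for numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return rng.choice(len(candidates), p=probs)

rng = np.random.default_rng(0)
true_p = np.array([0.1, 0.2, 0.3, 0.4])
candidates = [true_p, np.full(4, 0.25)]
samples = rng.choice(4, size=5_000, p=true_p)
print(private_hypothesis_selection(samples, candidates, epsilon=1.0, rng=rng))
```

A single draw from this distribution is ε-differentially private regardless of how many candidates are scored, which is the appeal of selection mechanisms over privatizing each score separately.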
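For the locally private Gaussian estimation entry, here is a minimal sketch of the naive local-DP baseline, assuming samples are clipped to a known range [-B, B]: each user adds Laplace noise to their own clipped sample before releasing it, and the aggregator only ever averages the noisy reports. The clipping bound and function names are my assumptions; the paper designs more careful protocols for this problem.

```python
import numpy as np

def local_dp_mean(samples, epsilon, clip_bound, rng):
    # Each user clips their single sample to [-clip_bound, clip_bound];
    # a clipped report has sensitivity 2 * clip_bound, so Laplace noise
    # with scale 2 * clip_bound / epsilon makes each user's report
    # epsilon-locally differentially private on its own.
    clipped = np.clip(samples, -clip_bound, clip_bound)
    noisy_reports = clipped + rng.laplace(
        scale=2 * clip_bound / epsilon, size=len(samples)
    )
    # The aggregator sees only the noisy reports and averages them.
    return noisy_reports.mean()

rng = np.random.default_rng(0)
# 10,000 users, each holding one i.i.d. draw from N(1.5, 1).
samples = rng.normal(loc=1.5, scale=1.0, size=10_000)
print(local_dp_mean(samples, epsilon=1.0, clip_bound=5.0, rng=rng))
```

With n users the per-user noise averages out at a rate of roughly B/(ε√n), which illustrates the usual accuracy cost of the local model compared to central DP.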