Poster
Optimizing Noise Distributions for Differential Privacy
Atefeh Gilani · Felipe Gomez · Shahab Asoodeh · Flavio Calmon · Oliver Kosut · Lalitha Sankar
East Exhibition Hall A-B #E-1006
Protecting sensitive information is a major concern in the age of big data. Differential Privacy (DP) is a popular framework for ensuring privacy by adding random noise to query results, making it difficult to identify any individual. However, choosing the right noise distribution is critical: too much noise degrades accuracy, while too little fails to protect privacy. In this work, we introduce a new way to find the best noise distribution for a given privacy guarantee. Our method improves the accuracy of results while still meeting strong privacy standards. We show that our optimized noise outperforms commonly used distributions, such as Gaussian and Laplace, across different datasets and privacy settings. This approach can help make privacy-preserving machine learning more reliable and effective in real-world applications.
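The optimized distributions themselves are not specified in this summary, but the standard baselines it compares against can be sketched. Below is a minimal illustration of the additive-noise mechanisms from the DP literature: the Laplace mechanism (epsilon-DP) and the classical Gaussian mechanism ((epsilon, delta)-DP). The function names and calibration constants here follow textbook DP, not this poster's method.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value + Laplace(scale = sensitivity / epsilon) noise,
    which satisfies epsilon-DP for a query with the given L1 sensitivity."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5  # uniform on (-0.5, 0.5)
    # Inverse-CDF sampling of a zero-mean Laplace variate.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

def gaussian_mechanism(true_value, sensitivity, epsilon, delta, rng=random):
    """Release true_value + Gaussian noise with the classical calibration
    sigma = sqrt(2 ln(1.25/delta)) * sensitivity / epsilon, which satisfies
    (epsilon, delta)-DP for epsilon < 1 and an L2 sensitivity bound."""
    sigma = math.sqrt(2.0 * math.log(1.25 / delta)) * sensitivity / epsilon
    return true_value + rng.gauss(0.0, sigma)

# Example: privatize a count query (sensitivity 1) at a moderate privacy level.
true_count = 100
print(laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5))
print(gaussian_mechanism(true_count, sensitivity=1.0, epsilon=0.5, delta=1e-5))
```

The trade-off the abstract describes is visible in the scale parameters: as epsilon shrinks (stronger privacy), both noise scales grow and accuracy drops, which is what motivates searching for noise distributions better than these two defaults.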