AI

Faster Algorithms for User-Level Private Stochastic Convex Optimization

1 Min read

We study private stochastic convex optimization (SCO) under user-level differential privacy (DP) constraints. In this setting, there are n users, each possessing m data items, and we need to protect the privacy of each user's entire collection of data items. Existing algorithms for user-level DP SCO are impractical in many large-scale machine learning scenarios because: (i) they make restrictive assumptions on the smoothness parameter of the loss function and require the number of users to grow polynomially with the dimension of the parameter space; or (ii) they are prohibitively slow, requiring at least (mn)^{3/2}…
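For readers unfamiliar with the setting, the following is a standard formulation of user-level DP and SCO; the notation here is chosen for illustration and is not taken from the excerpt above. A randomized algorithm M is (ε, δ)-user-level DP if, for any two datasets D and D' that differ in the entire collection of m items belonging to a single user, and any event S,

Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D') ∈ S] + δ.

The SCO objective is to minimize the population risk F(w) = E_{z∼P}[f(w, z)] over a convex set W, using the mn samples held by the n users. User-level DP is stricter than item-level DP, which would only require indistinguishability when a single data item changes.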


Source link
