Private Stochastic Optimization with Large Worst-Case Lipschitz Parameter

We study differentially private (DP) stochastic optimization (SO) with loss functions whose worst-case Lipschitz parameter over all data points may be huge or infinite. To date, most work on DP SO assumes that the loss is uniformly Lipschitz continuous over data (i.e. stochastic gradients are u...
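The abstract's setting (worst-case Lipschitz constants that may be unbounded) is commonly handled in practice by per-example gradient clipping before noise is added, as in DP-SGD. The sketch below is a generic illustration of that standard technique, not the authors' algorithm; the function name `dp_sgd_step` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(w, per_example_grads, clip_norm, noise_multiplier, lr, rng):
    """One DP-SGD-style step (generic sketch, not the paper's method).

    Each per-example gradient is clipped to L2 norm `clip_norm`, so the
    step's sensitivity is bounded even when raw gradient norms are huge
    or heavy-tailed. Calibrated Gaussian noise is then added to the
    averaged, clipped gradient.
    """
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    avg = np.mean(clipped, axis=0)
    # Noise scale is proportional to the per-step sensitivity
    # clip_norm / batch_size (privacy accounting omitted here).
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_example_grads),
                       size=w.shape)
    return w - lr * (avg + noise)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = np.zeros(2)
    # One heavy-tailed example with an enormous gradient:
    grads = [np.array([1e6, 0.0]), np.array([0.1, 0.1])]
    w_next = dp_sgd_step(w, grads, clip_norm=1.0,
                         noise_multiplier=1.0, lr=0.1, rng=rng)
    print(w_next)
```

Even with a single outlier gradient of norm 10^6, clipping bounds each example's contribution to at most `clip_norm`, so the update stays controlled; without clipping, a worst-case Lipschitz bound would have to cover that outlier.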


Bibliographic Details
Main Authors: Andrew Lowy, Meisam Razaviyayn
Format: Article
Language: English
Published: Labor Dynamics Institute, 2025-03-01
Series: The Journal of Privacy and Confidentiality
Online Access: https://journalprivacyconfidentiality.org/index.php/jpc/article/view/909