Privacy-Preserving SGD on Shuffle Model
In this paper, we study differentially private stochastic gradient descent (SGD) algorithms for stochastic convex optimization (SCO). The majority of the existing literature requires additional assumptions on the losses, such as Lipschitz continuity of the loss functions,...
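The abstract refers to differentially private SGD. As background, the generic DP-SGD update clips each per-example gradient and adds Gaussian noise calibrated to the clipping bound before the parameter step. The following is a minimal sketch of that generic update on a squared loss; the function name, loss choice, and parameter values are illustrative assumptions, not details taken from this paper.

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip=1.0, noise_mult=1.0, rng=None):
    """One illustrative DP-SGD step on the squared loss.

    Per-example gradients are clipped to L2 norm <= `clip`, averaged,
    and Gaussian noise scaled to the sensitivity clip/n is added before
    the update. All names and constants here are illustrative.
    """
    rng = rng or np.random.default_rng(0)
    n = X.shape[0]
    # Per-example gradients of 0.5 * (x . w - y)^2 with respect to w.
    residuals = X @ w - y                   # shape (n,)
    grads = residuals[:, None] * X          # shape (n, d)
    # Clip each example's gradient to L2 norm at most `clip`.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads / np.maximum(1.0, norms / clip)
    # Average, then add Gaussian noise calibrated to clip / n.
    noisy_grad = grads.mean(axis=0) + rng.normal(
        0.0, noise_mult * clip / n, size=w.shape)
    return w - lr * noisy_grad

# Tiny usage example on synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(32, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w = np.zeros(3)
for _ in range(200):
    w = dp_sgd_step(w, X, y, rng=rng)
```

Note that a formal privacy guarantee additionally requires accounting for the noise over all iterations (and, in the shuffle model the title refers to, an anonymizing shuffler between clients and the server); this sketch only shows the per-step mechanics.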
| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2023-01-01 |
| Series: | Journal of Mathematics |
| Online Access: | http://dx.doi.org/10.1155/2023/4055950 |