Characterizing privacy in quantum machine learning
Abstract: Ensuring data privacy in machine learning models is critical, especially in distributed settings where model gradients are shared among multiple parties for collaborative learning. Motivated by the increasing success of recovering input data from the gradients of classical models, this stud...
| Main Authors: | Jamie Heredge, Niraj Kumar, Dylan Herman, Shouvanik Chakrabarti, Romina Yalovetzky, Shree Hari Sureshbabu, Changhao Li, Marco Pistoia |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-05-01 |
| Series: | npj Quantum Information |
| Online Access: | https://doi.org/10.1038/s41534-025-01022-z |
Similar Items
- Quantum compilation toolkit for Rydberg atom arrays with implications for problem hardness and quantum speedups
  by: Martin J. A. Schuetz, et al.
  Published: (2025-08-01)
- Decomposition pipeline for large-scale portfolio optimization with applications to near-term quantum computing
  by: Atithi Acharya, et al.
  Published: (2025-05-01)
- Erratum: Provably Trainable Rotationally Equivariant Quantum Machine Learning [PRX Quantum 5, 030320 (2024)]
  by: Maxwell T. West, et al.
  Published: (2025-06-01)
- Synergizing quantum techniques with machine learning for advancing drug discovery challenge
  by: Zhiding Liang, et al.
  Published: (2024-12-01)
- Performance of quantum approximate optimization with quantum error detection
  by: Zichang He, et al.
  Published: (2025-05-01)