New paper out on quantum differential privacy, which you can find on SciRate and arXiv.
In this paper we propose a novel quantum neighboring relationship, bridging several ideas from prior works on quantum privacy. Differential privacy (DP) is a popular notion of statistical security. If a dataset X contains n users, a DP algorithm can process X without leaking too much information about any individual user. Formally, a DP algorithm maps neighboring inputs to close output distributions.
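To make the classical notion concrete, here is a minimal sketch of the textbook Laplace mechanism (not the mechanism from our paper): changing one user's value moves the mean by at most the sensitivity, and Laplace noise calibrated to that sensitivity masks the change, so neighboring datasets yield close output distributions. The function name and parameters are illustrative.

```python
import numpy as np

def private_mean(data, epsilon, lo=0.0, hi=1.0, rng=None):
    """Release the mean of `data` with epsilon-DP via the Laplace mechanism.

    Values are clipped to [lo, hi], so changing one user's record shifts
    the mean by at most (hi - lo) / n -- the sensitivity of the query.
    """
    rng = rng or np.random.default_rng()
    data = np.clip(np.asarray(data, dtype=float), lo, hi)
    sensitivity = (hi - lo) / len(data)
    # Laplace noise with scale sensitivity/epsilon gives epsilon-DP.
    return data.mean() + rng.laplace(scale=sensitivity / epsilon)
```

With a large epsilon the noise is tiny and the released mean is accurate; shrinking epsilon trades accuracy for privacy.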
How should we define neighboring inputs? The answer is problem dependent. For instance, one could consider x and y neighboring if they're close in Hamming distance, or in some Lp distance. In the quantum setting, the neighboring relationship is often based on the trace distance. However, the trace distance has several drawbacks. First, it doesn't capture the geometry of many quantum encodings used in QML. Moreover, in some settings choosing the trace distance leads to a poor privacy-accuracy tradeoff.
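For readers who want the standard quantity spelled out: the trace distance between density matrices is half the trace norm of their difference, computable from the eigenvalues of the Hermitian difference. A small NumPy sketch:

```python
import numpy as np

def trace_distance(rho, sigma):
    """T(rho, sigma) = 0.5 * ||rho - sigma||_1 for density matrices.

    Since rho - sigma is Hermitian, the trace norm is the sum of the
    absolute values of its eigenvalues.
    """
    eigs = np.linalg.eigvalsh(rho - sigma)
    return 0.5 * np.abs(eigs).sum()

# Example: orthogonal single-qubit basis states |0><0| and |1><1|
# are perfectly distinguishable, so their trace distance is 1.
ket0 = np.array([[1.0, 0.0], [0.0, 0.0]])
ket1 = np.array([[0.0, 0.0], [0.0, 1.0]])
```

Two orthogonal encodings of nearly identical classical inputs sit at maximal trace distance, which is one way the trace distance can miss the geometry of an encoding.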
To overcome these limitations, we introduce a neighboring relationship that combines the trace distance with convertibility by local operations. We demonstrate that this definition captures the underlying structure of many near-term and long-term quantum encodings. We also show how to design private measurements under this definition, combining the injection of classical and quantum noise into the computation. Compared to prior bounds based on the trace distance, our results are exponentially tighter under local Pauli noise. Finally, we outline how quantum machine learning can benefit from our techniques: prior work showed that DP enhances adversarial robustness and generalization, and we performed a simulation to assess the certified adversarial robustness ensured by a private quantum classifier.
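As a point of reference for the local Pauli noise mentioned above, here is a minimal single-qubit depolarizing channel, a standard local Pauli noise model in which each of X, Y, Z is applied with probability p/3. This is only an illustration of the noise model, not the paper's privacy analysis.

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Single-qubit depolarizing channel: with probability p a uniformly
    random Pauli error (X, Y, or Z) is applied; otherwise rho is untouched.
    """
    return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)
```

At p = 3/4 the channel maps every input to the maximally mixed state I/2, destroying all information; privacy mechanisms operate at much milder noise rates, trading a little accuracy for indistinguishability of neighboring states.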