Federated Quantum Machine Learning.

Entropy (Basel)

Computational Science Initiative, Brookhaven National Laboratory, Upton, NY 11973, USA.

Published: April 2021

Distributed training across several quantum computers could significantly reduce training time, and sharing the learned model rather than the data could improve data privacy, since training happens where the data is located. One scheme that achieves these properties is federated learning (FL), in which several clients (local nodes) train on their own data and a central node aggregates the models collected from them. However, to the best of our knowledge, no prior work has studied quantum machine learning (QML) in a federated setting. In this work, we present federated training of hybrid quantum-classical machine learning models, although our framework could be generalized to pure quantum machine learning models. Specifically, we consider a quantum neural network (QNN) coupled with a classical pre-trained convolutional model. Our distributed federated learning scheme achieves almost the same trained-model accuracy while training significantly faster in the distributed setting, demonstrating a promising research direction for both scaling and privacy.
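The aggregation step described above (a central node averaging models collected from local nodes) is commonly implemented as federated averaging. A minimal sketch, assuming each client's model is a list of NumPy parameter arrays and that clients are weighted by local dataset size (the function name `fed_avg` and the toy data are illustrative, not from the paper):

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Weighted average of per-client parameter lists (federated averaging).

    client_weights: list of models, each a list of np.ndarray parameters
    client_sizes:   number of local training samples per client
    """
    total = sum(client_sizes)
    # Average each parameter tensor across clients, weighted by dataset size.
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(len(client_weights[0]))
    ]

# Two toy clients, each holding one 2-element parameter vector.
clients = [[np.array([1.0, 3.0])], [np.array([3.0, 5.0])]]
sizes = [1, 1]
avg = fed_avg(clients, sizes)  # -> [array([2., 4.])]
```

The same averaging applies whether the parameters come from classical layers or from the variational angles of a QNN, which is what makes the hybrid quantum-classical setting fit naturally into this scheme.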

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8069802
DOI: http://dx.doi.org/10.3390/e23040460

Publication Analysis

Top Keywords

machine learning — 16
quantum machine — 12
distributed training — 8
federated learning — 8
local nodes — 8
learning — 7
training — 5
federated — 4
federated quantum — 4
machine — 4

Similar Publications
