Quantum algorithm for logistic regression. (arXiv:1906.03834v3 [quant-ph] UPDATED)

Logistic regression (LR) is an important machine learning model for
classification, with wide applications in text classification, image analysis
and medical diagnosis. However, training LR generally entails an
iterative gradient descent method and is quite time-consuming on big data
sets. To address this problem, we present a quantum algorithm for LR that
implements the key task of the gradient descent method: obtaining the
classical gradients in each iteration. We show that our algorithm achieves
exponential speedup over its classical counterpart in each iteration when the
dimension of each data point M grows polylogarithmically with the number of
data points N, i.e., M = O(polylog N). Notably, the optimal model
parameters are ultimately derived by performing simple calculations on the
gradients obtained in this way, so once they are determined, one can use
them to classify new data at little cost.
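
For context, here is a minimal classical sketch (plain Python/NumPy; the
function names gradient and train and the hyperparameters lr and iters are
illustrative choices, not from the paper) of the per-iteration gradient
computation for LR that the abstract says the quantum subroutine is meant
to speed up. Classically this step costs O(N*M) per iteration.

import numpy as np

def sigmoid(z):
    # Logistic function applied elementwise.
    return 1.0 / (1.0 + np.exp(-z))

def gradient(X, y, theta):
    # Gradient of the cross-entropy loss for logistic regression.
    # X: (N, M) data matrix, y: (N,) labels in {0, 1}, theta: (M,) parameters.
    return X.T @ (sigmoid(X @ theta) - y) / X.shape[0]

def train(X, y, lr=0.1, iters=1000):
    # Plain gradient descent; each call to gradient() is the classical
    # per-iteration step the quantum algorithm targets.
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta -= lr * gradient(X, y, theta)
    return theta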

Article web page: