Quantum-classical truncated Newton method for high-dimensional energy landscapes. (arXiv:1710.07063v1 [quant-ph])
We develop a quantum-classical hybrid algorithm for function optimization
that is particularly well suited to training neural networks, since it
exploits specific features of high-dimensional energy landscapes. Due to a
recent formulation of semi-supervised learning as an optimization problem, the
algorithm can further be used to find the optimal model parameters for deep
generative models. In particular, we present a truncated saddle-free Newton's
method based on recent insights from optimization, the analysis of deep
neural networks, and random matrix theory. By combining these with specific
quantum subroutines, we exploit quantum computing to arrive at a
new quantum-classical hybrid algorithm design. Our algorithm is expected to
perform at least as well as existing classical algorithms while achieving a
polynomial speedup. The speedup is limited by the required classical read-out.
Omitting this requirement can in theory lead to an exponential speedup.
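The classical core of the approach, the saddle-free Newton step, replaces each Hessian eigenvalue by its absolute value so that saddle points repel rather than attract the iterate; truncation keeps only a few dominant eigenpairs. The paper's quantum subroutines are not modeled here; the following is a minimal classical sketch with a full eigendecomposition standing in for the Lanczos-style truncation used in practice, and the function name and `damping` parameter are illustrative choices, not from the paper:

```python
import numpy as np

def saddle_free_newton_step(grad, hess, k=2, damping=1e-3):
    """One truncated saddle-free Newton step: precondition the gradient
    with |H|^{-1}, where |H| has the same eigenvectors as the Hessian H
    but the absolute values of its eigenvalues. Truncation keeps only
    the k eigenpairs with largest |eigenvalue|."""
    vals, vecs = np.linalg.eigh(hess)           # exact; Lanczos in practice
    idx = np.argsort(np.abs(vals))[::-1][:k]    # top-k by |eigenvalue|
    lam = np.abs(vals[idx]) + damping           # damped |eigenvalues|
    V = vecs[:, idx]
    return -V @ ((V.T @ grad) / lam)

# Toy 2-D saddle: f(x, y) = x^2 - y^2, with a saddle point at the origin.
x = np.array([0.5, 0.5])
grad = np.array([2 * x[0], -2 * x[1]])
hess = np.array([[2.0, 0.0], [0.0, -2.0]])
step = saddle_free_newton_step(grad, hess)
```

On this example the step moves downhill in both coordinates: toward the minimum along the positive-curvature direction and away from the saddle along the negative-curvature one, whereas a plain Newton step `-H^{-1} grad` would be attracted back toward the saddle point.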