Quantum machine learning is a tricky subject to discuss, given the divergent views experts hold on it. Critics consider machine learning to be predominantly a linear algebra subject with little overlap with quantum computing. Proponents counter that quantum methods could help train models on datasets that are too large for classical methods. Seth Lloyd of MIT recently gave a talk citing an example. He suggested that analyzing all the topological features of a dataset with 300 × 300 points would require two to the 300th power processing units, an intractable computing problem. He argued that a quantum machine learning algorithm could achieve the feat with a mere 300 × 300 quantum bits, a scale he considers attainable within the next few years. What Lloyd implied was that algorithms that currently take exponential time would take only polynomial time using quantum methods. It sounds really promising, if it can succeed.
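To get a feel for the scale gap Lloyd describes, a quick back-of-the-envelope check in Python (the numbers are just the ones from his example):

```python
# Two to the 300th power: the number of classical processing units
# Lloyd's example calls for.
classical_units = 2 ** 300

# This is a 91-digit number, far beyond any conceivable classical machine,
# while the proposed quantum register needs only hundreds of qubits.
print(len(str(classical_units)))  # → 91
```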
But this isn’t as straightforward as many would like us to believe. For starters, the biggest hurdle to advances in machine learning is limited data, and quantum computing, or any other method, will not solve that. Even advocates agree that quantum machine learning has limited applicability. On the bright side, evidence suggests that many problems in machine learning can be cast as Quadratic Unconstrained Binary Optimization (QUBO), an NP-hard problem. For other cases, things are mostly speculative right now.
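To make the QUBO framing concrete, here is a minimal sketch. The instance is a hypothetical toy (Max-Cut on a triangle graph, one of the standard problems that maps to QUBO), and the brute-force loop is exactly the exponential search that annealers and, it is hoped, quantum methods aim to shortcut:

```python
import itertools
import numpy as np

# Toy instance: Max-Cut on a triangle graph, cast as a QUBO.
# Maximizing the cut sum over edges of (x_i + x_j - 2*x_i*x_j) is the same
# as minimizing x^T Q x with this (upper-triangular) matrix.
Q = np.array([[-2.0,  2.0,  2.0],
              [ 0.0, -2.0,  2.0],
              [ 0.0,  0.0, -2.0]])

def qubo_energy(x, Q):
    """Energy x^T Q x of a binary assignment x."""
    x = np.array(x, dtype=float)
    return float(x @ Q @ x)

# Brute force over all 2^n binary assignments.
best = min(itertools.product([0, 1], repeat=3), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))
```

Any assignment that puts one or two vertices on one side of the cut reaches the minimum energy of −2, corresponding to a cut of size 2, the maximum for a triangle.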
But even if quantum machine learning succeeds in elementary form, it won’t alter the machine learning landscape in a major way. There will be no generic quantum machine learning tool. At best, hard optimization problems could be tackled in polynomial time via tunneling between optima. For the uninitiated: quantum computers are believed to be able to escape local minima in optimization problems by tunneling through energy barriers to reach the global minimum. Some of the attempts at this form of algorithm include D-Wave’s quantum annealing, Quantum Bayesian Nets and Quantum Boltzmann Machines.
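The closest classical analogue is simulated annealing, which hops over barriers thermally rather than tunneling through them. A minimal sketch, where the double-well cost function and all parameters are invented for illustration:

```python
import math
import random

random.seed(0)  # reproducible run

def f(x):
    # Invented double-well landscape: local minimum near x = +1,
    # deeper global minimum near x = -1, barrier at x = 0.
    return (x * x - 1) ** 2 + 0.3 * x

def anneal(x=1.0, steps=20000, t0=2.0):
    """Classical simulated annealing, started inside the *local* well.
    Thermal fluctuations let it hop over the barrier; a quantum annealer
    is hoped to tunnel through the barrier instead."""
    best_x, best_f = x, f(x)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9      # linear cooling schedule
        cand = x + random.gauss(0, 0.2)      # propose a local move
        d_e = f(cand) - f(x)
        if d_e < 0 or random.random() < math.exp(-d_e / t):
            x = cand                         # Metropolis acceptance rule
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x

x_best = anneal()
print(round(x_best, 2))  # best point found; the global minimum sits near x = -1
```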
Also, scalability of qubit entanglement is still just a hypothesis, despite some successes such as Grover’s Algorithm and Shor’s Algorithm. With some success, we might see vendors develop implementation libraries for machine learning developers, reducing the need for the latter to understand quantum computing in depth. So it is important to be aware that some of the journalistic romanticism linking human cognitive powers, machine learning and quantum logic is far-fetched. Also, at small scale, quantum computing hardly has a query complexity (oracle) advantage over classical methods. To an extent, the utility of quantum methods will depend a lot on how large the order-of-magnitude gap in query count between classical and quantum methods turns out to be. However, a recent paper discusses machine learning in quantum computing without needing an oracle.
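The query gap is easy to quantify for unstructured search, where Grover’s Algorithm needs only about (π/4)·√N oracle calls against N for a classical scan. A small sketch comparing the two counts (the linear-scan oracle here is just a stand-in):

```python
import math

def classical_queries(n_items, target):
    """Worst-case linear scan: count oracle calls until the target is found."""
    queries = 0
    for i in range(n_items):
        queries += 1              # one oracle call per candidate checked
        if i == target:
            return queries

for n in (10**2, 10**4, 10**6):
    classical = classical_queries(n, n - 1)         # worst case: target is last
    grover = math.ceil(math.pi / 4 * math.sqrt(n))  # Grover's oracle-call count
    print(f"N={n}: classical={classical}, quantum≈{grover}")
```

The advantage is only quadratic and grows with N; at small N the gap is modest, which is exactly the point about small-scale quantum computing made above.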
Another interesting development is the emerging subfield of Quantum Deep Learning. This field is based on the hypothesis that complex machine learning tasks require the machine to learn models that contain several layers of abstraction of the raw input data. Visual object recognition from image pixels is one such task. Google too has shown interest in the field with its launch of the Quantum Artificial Intelligence Lab. Specifically, Google’s effort is focused on adiabatic minima searching (D-Wave quantum systems) for improving machine learning algorithms. We’ve also had other research publications from Microsoft Research and Yale University (Quantum Neurons).
Whatever the current trends are, one thing is undeniable: both machine learning and quantum computing are fields with plenty of scope still for evolution, and we are definitely in for some really interesting times ahead.