Quantum machine learning sounds like it came straight out of a sci-fi novel. But it’s very much real, and very much a game changer.
Some of the most advanced AI systems today use multiple types of hardware to run their subroutines. These chips range from general-purpose but slower at certain tasks (like a CPU) to specialized and fast (like an application-specific integrated circuit, or ASIC). Quantum devices can be thought of as ASICs with specialized functionality. More advanced quantum devices are closer to field-programmable gate arrays (FPGAs), which can be programmed to run simple circuits.
Quantum machine learning is the use of quantum algorithms inside classical machine learning programs. Quantum computers can speed up particular machine learning subroutines that involve computing over immense amounts of data. Thanks to superposition and entanglement, they can manipulate exponentially large state spaces with a small number of operations.
To learn more about quantum computers, check out this article.
There are a ton of areas where quantum computing can accelerate AI. Here’s a summary:
Optimization with Quantum Annealing
Adiabatic quantum computing and quantum annealing can help with optimization problems by finding the lowest energy state of a high dimensional energy landscape. In English:
Imagine an energy diagram for a single qubit in a superposition of 0 and 1. The 0 and 1 states have the lowest energies, and when you measure the qubit, it has an equal chance of collapsing into 0 or 1.
Quantum computers can manipulate these probabilities by applying biases: magnetic fields that influence the spin of a qubit. A related control is the coupler, which entangles two qubits so that their measured outcomes are either identical or exact opposites.
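To make the probability picture concrete, here’s a minimal classical sketch of a single qubit: its state is a 2-component vector of amplitudes, and measurement probabilities are the squared amplitudes. The bias angle below is a hypothetical stand-in for how a bias skews the distribution away from 50/50.

```python
import numpy as np

# A single-qubit state is a 2-component vector of amplitudes.
# Measurement probabilities are the squared magnitudes of the amplitudes.
def measure_probs(state):
    probs = np.abs(state) ** 2
    return probs / probs.sum()  # renormalize against rounding error

# Equal superposition: 50/50 chance of collapsing to 0 or 1.
equal = np.array([1.0, 1.0]) / np.sqrt(2)

# A bias can be modeled as a rotation that skews the amplitudes.
theta = np.pi / 6  # hypothetical bias angle
biased = np.array([np.cos(theta), np.sin(theta)])

print(measure_probs(equal))   # [0.5  0.5 ]
print(measure_probs(biased))  # [0.75 0.25]
```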
When you have multiple qubits in a system and add couplers, you end up with an energy landscape that gets pretty complex (2^N energy levels for N qubits). With the proper manipulation, these energy landscapes start to resemble hard optimization problems, like the traveling salesman problem. Because nature tends to settle into the lowest-energy configuration, quantum annealers exploit this tendency to quickly find the minimum-energy state of the landscape.
D-Wave is a leading quantum annealing company. In 2008, its annealer was used to optimize the weights of a binary classification model. Annealers can also solve other quadratic unconstrained binary optimization (QUBO) problems, which show up in a wide range of areas, from finance to machine learning.
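To see what an annealer is actually minimizing, here’s a toy QUBO solved by classical brute force. The diagonal of Q plays the role of the biases and the off-diagonal entries play the role of couplings; an annealer searches the same 2^n landscape physically instead of enumerating it.

```python
import itertools
import numpy as np

# QUBO: minimize x^T Q x over binary vectors x. A quantum annealer
# searches this landscape physically; here we brute-force a toy
# instance to show what is being minimized.
def solve_qubo_brute_force(Q):
    n = Q.shape[0]
    best_x, best_energy = None, np.inf
    for bits in itertools.product([0, 1], repeat=n):  # all 2^n candidates
        x = np.array(bits)
        energy = x @ Q @ x
        if energy < best_energy:
            best_x, best_energy = x, energy
    return best_x, best_energy

# Toy Q: diagonal terms are biases, off-diagonal terms are couplings.
Q = np.array([
    [-1.0,  2.0],
    [ 0.0, -1.0],
])
x, e = solve_qubo_brute_force(Q)
print(x, e)  # the positive coupling penalizes turning both bits on
```

The coupling Q[0,1] = 2 makes x = (1, 1) cost as much as (0, 0), so the minimum is a single bit on, with energy -1.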
Optimization with Variational Circuits
Variational circuits are used in conjunction with classical computers to solve optimization problems. The quantum circuit evaluates a cost function that is hard to compute classically, then sends the result to a classical computer, which adjusts the circuit’s parameters to lower that cost.

Variational quantum eigensolvers (VQEs) are an application of this process used in chemistry. The quantum circuit prepares a parameterized trial state and measures the expectation value of a large Hamiltonian matrix (representing the energy of the molecule). It sends this value to the classical computer, which adjusts the circuit’s parameters in a way that reduces it. This process is iterated until the expectation value converges to the smallest eigenvalue of the Hamiltonian, giving researchers the ground-state energies of certain molecules.
This is super useful in drug discovery, and in finding catalysts that accelerate carbon capture!
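The VQE loop above can be sketched entirely classically for the smallest possible case: a one-qubit Hamiltonian (the Pauli-Z matrix, smallest eigenvalue -1), a one-parameter trial state, and plain gradient descent standing in for the classical optimizer.

```python
import numpy as np

# Toy VQE loop for a single qubit, simulated classically.
# Hamiltonian: the Pauli-Z matrix; its smallest eigenvalue is -1.
H = np.array([[1.0,  0.0],
              [0.0, -1.0]])

def ansatz(theta):
    # Parameterized trial state |psi(theta)> = [cos(theta/2), sin(theta/2)]
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def expectation(theta):
    psi = ansatz(theta)
    return psi @ H @ psi  # <psi|H|psi>, what the quantum circuit would measure

# Classical optimizer: crude finite-difference gradient descent on theta.
theta, lr = 0.5, 0.4
for _ in range(200):
    grad = (expectation(theta + 1e-5) - expectation(theta - 1e-5)) / 2e-5
    theta -= lr * grad

print(expectation(theta))  # converges toward -1, the ground-state energy
```

A real VQE replaces `expectation` with measurements on hardware and uses a more robust optimizer, but the feedback loop is the same.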
Linear Algebra
Generally, quantum computers are excellent *chef’s kiss* at linear algebra. A quantum gate acting on n qubits multiplies a state vector with 2^n entries in a single operation, so quantum circuits can, in principle, act as linear layers in a large neural network.
There is one huge caveat, though. Before performing matrix multiplication, the data must first be encoded into a quantum state, and this proves to be quite difficult on current devices.
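Here’s a small numpy sketch of that encoding requirement: amplitude encoding stores a data vector as the amplitudes of a state, which forces it to have unit norm, and a gate (here a Hadamard) is then just a unitary matrix applied to that state.

```python
import numpy as np

# Sketch of amplitude encoding: data becomes the amplitudes of a state,
# so it must be normalized to unit length before a gate can act on it.
def amplitude_encode(data):
    data = np.asarray(data, dtype=float)
    norm = np.linalg.norm(data)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return data / norm  # amplitudes must have unit norm

# A Hadamard gate (a unitary matrix) acting as a tiny "linear layer".
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

state = amplitude_encode([3.0, 4.0])  # -> [0.6, 0.8]
out = H @ state
print(out, np.linalg.norm(out))  # unitaries preserve the norm (still 1)
```

Note the constraint this illustrates: only the *direction* of the data survives encoding, and getting an arbitrary vector loaded onto real hardware efficiently is an open practical problem.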
Sampling
Quantum computers are fundamentally samplers. Thinking back to the single-qubit example earlier, quantum states encode probability distributions, and measuring them lets you “sample” the distribution to get a concrete number. Sampling is useful in training machine learning models: in the past, quantum computers have been used to train Boltzmann machines and Markov logic networks, supplying the random samples used to update the Boltzmann machine’s weights.
Kernel Methods
In classification problems, it’s often hard to linearly separate your data points. Simply put, you can’t draw a straight line that splits the red points from the blue ones. Kernels to the rescue! Kernel functions map the points into a higher-dimensional space, where it’s easier to separate and classify them.
Mathematically, a kernel K(x, y) is defined as ⟨f(x), f(y)⟩, where x and y are n-dimensional inputs and f is a feature map that transforms them into an m-dimensional space (with m > n). ⟨f(x), f(y)⟩ is the inner product of the two mapped vectors: each pair of corresponding elements is multiplied, and the products are summed to give a scalar. For large values of m, these inner products are expensive to compute classically, so quantum computers can be used to estimate them.
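A classical example makes the definition concrete. The degree-2 polynomial feature map below lifts 2-D points into 3-D, where K(x, y) = ⟨f(x), f(y)⟩ is just an inner product; this particular kernel also has a shortcut formula, so we can check both routes agree. (Quantum kernels target feature spaces too large for the explicit route.)

```python
import numpy as np

# Kernel sketch: a feature map f lifts 2-D points into 3-D, where the
# kernel K(x, y) = <f(x), f(y)> is an ordinary inner product.
def feature_map(v):
    # f(x1, x2) = (x1^2, sqrt(2)*x1*x2, x2^2): degree-2 polynomial map
    x1, x2 = v
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

def kernel_explicit(x, y):
    return feature_map(x) @ feature_map(y)  # computed in the big space

def kernel_trick(x, y):
    return (x @ y) ** 2  # same value, computed in the small space

x, y = np.array([1.0, 2.0]), np.array([3.0, 1.0])
print(kernel_explicit(x, y), kernel_trick(x, y))  # both 25.0
```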
Kernel evaluation is widely used in a classification algorithm called the support vector machine (SVM). In quantum SVMs, a classical data point x⃗ is encoded into a quantum state |Φ(𝑥⃗)⟩ using a circuit V(Φ(𝑥⃗)). Another quantum circuit W(Θ) then processes the state so that a measurement returns a classical label of -1 or 1 that classifies the point.
Quantum machine learning is indeed quite awesome. However, it’s important to note that there are physical restrictions on some of these applications: the more qubits you have in a circuit, the harder it is to control their quantum states. Also, the quantum speedup of some of these algorithms (like the QSVM) has not been proven yet. Nonetheless, I’m pretty optimistic about the future of this field as more research is conducted. We might even have quantum processing units replacing GPUs!
Thinking beyond just using quantum computers to speed up existing classical ML algorithms, quantum circuits can train themselves to get better at a task by adjusting their own parameters. This is called differentiable programming.
I think that’s pretty neat.