Quantum Machine Learning

Here is a chart of the computing devices used in AI, arranged from least to greatest generality.

Optimization with Quantum Annealing

Adiabatic quantum computing and quantum annealing can help with optimization problems by finding the lowest energy state of a high-dimensional energy landscape. In plain English: the problem is encoded so that its best solution corresponds to the lowest valley in that landscape, and the annealer lets the system settle into that valley, tunneling through energy barriers instead of having to climb over them.

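To make this concrete, annealers are typically handed a problem in QUBO form (quadratic unconstrained binary optimization), where every assignment of binary variables has an energy and the goal is to find the lowest-energy one. Here's a tiny classical sketch with made-up coefficients; a real annealer would search this landscape with quantum effects rather than brute force:

```python
import itertools

# A toy QUBO (quadratic unconstrained binary optimization) problem -- the kind of
# energy landscape a quantum annealer minimizes. Coefficients are made up.
Q = {
    (0, 0): -1.0,  # linear terms on the diagonal
    (1, 1): -1.0,
    (2, 2):  2.0,
    (0, 1):  2.0,  # couplings between variables
    (1, 2): -3.0,
}

def energy(bits):
    """Energy of one assignment of binary variables under the QUBO."""
    return sum(coeff * bits[i] * bits[j] for (i, j), coeff in Q.items())

# Brute force over all 2**3 assignments. The annealer's job is to find this
# minimum without enumerating the (exponentially large) search space.
best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))
```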

Optimization with Variational Circuits

Variational circuits are used in conjunction with classical computers to perform optimization. The quantum circuit evaluates a cost function that would be expensive to compute classically, then sends the result to a classical computer, which tunes the circuit's parameters based on that cost. Variational quantum eigensolvers (VQEs) are an application of this process used to estimate molecular ground-state energies. The quantum circuit prepares a parameterized state and measures the expectation value of a large Hamiltonian matrix (representing the energy of the molecule). It sends this value to the classical computer, which adjusts the parameters of the quantum circuit in a way that reduces the expectation value. This process is iterated until the expectation value converges to the smallest eigenvalue of the Hamiltonian, giving researchers the ground-state energy of the molecule.
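
Here is a minimal, classically simulated sketch of that loop, using a toy 2×2 Hamiltonian and a one-parameter ansatz (both made up for illustration), with SciPy playing the role of the classical optimizer:

```python
import numpy as np
from scipy.optimize import minimize

# Toy 2x2 Hamiltonian standing in for a molecule's (much larger) Hamiltonian.
H = np.array([[ 1.0,  0.5],
              [ 0.5, -1.0]])

def expectation(params):
    # "Quantum" step, simulated classically: prepare a parameterized state
    # and compute <psi(theta)| H |psi(theta)>.
    theta = params[0]
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return float(psi @ H @ psi)

# Classical step: adjust the circuit parameter to push the expectation value down.
result = minimize(expectation, x0=[0.1], method="COBYLA")

print("VQE estimate:            ", result.fun)
print("True smallest eigenvalue:", np.linalg.eigvalsh(H)[0])
```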


Linear Algebra

Generally, quantum computers are excellent *chef's kiss* at linear algebra. A circuit on n qubits holds a state vector with 2^n amplitudes, and each gate applies a unitary matrix to that exponentially large vector in a single operation. In principle, this lets quantum gates act as linear layers in a large neural network.
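
As a rough illustration (simulated classically with NumPy, so the matrix is built explicitly here), a single layer of Hadamard gates on just 10 qubits is already a 1024 × 1024 unitary acting on the whole state vector at once:

```python
import numpy as np

n = 10                         # 10 qubits -> a state vector with 2**10 = 1024 amplitudes
state = np.zeros(2 ** n)
state[0] = 1.0                 # start in |00...0>

# One layer of Hadamard gates, assembled as a 1024 x 1024 unitary from 2x2 blocks.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
U = H
for _ in range(n - 1):
    U = np.kron(U, H)

state = U @ state              # a single "linear layer" on a 2**n-dimensional vector
print(U.shape, state[:4])      # (1024, 1024) and uniform amplitudes of 1/32
```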

Sampling

Quantum computers are fundamentally samplers. Thinking back to the coin example earlier, quantum states define probability distributions over measurement outcomes. When you measure a state, you "sample" that distribution and get a concrete result. Sampling is useful in training machine learning models: in the past, quantum computers have been used to train Boltzmann machines and Markov logic networks, where the quantum hardware supplies samples from the model's distribution that are used to update the Boltzmann machine's weights.
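
As a toy illustration of what "sampling" means here (simulated with NumPy rather than real hardware, and with made-up amplitudes): the squared magnitudes of a state's amplitudes give the probability of each measurement outcome, and repeated measurements are draws from that distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 2-qubit state (amplitudes, made up for illustration); squared magnitudes
# give the probability of each measurement outcome (the Born rule).
state = np.array([0.6, 0.0, 0.0, 0.8])    # superposition of |00> and |11>
probs = np.abs(state) ** 2                # [0.36, 0.0, 0.0, 0.64]

# "Measuring" the circuit 1000 times is just drawing 1000 samples from that distribution.
samples = rng.choice(4, size=1000, p=probs)
counts = np.bincount(samples, minlength=4)
print(dict(zip(["00", "01", "10", "11"], counts.tolist())))
```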

Kernel Evaluations

In classification problems, it's often hard to linearly separate your data points: simply put, you can't draw a straight line that splits the red points from the blue points. Kernels to the rescue! Kernel functions map the points into a higher-dimensional space where it's easier to separate and classify them. A quantum computer can help by encoding data points into quantum states and estimating their overlaps, which evaluates a kernel that may be hard to compute classically; this is the idea behind the quantum support vector machine (QSVM).
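
Here's a small classical toy of the feature-map idea, using concentric circles and a hand-picked map into 3D; a quantum kernel would instead get this kind of separation by encoding points into quantum states and measuring their overlaps:

```python
import numpy as np

# Two classes arranged as concentric circles: no straight line in 2D separates them.
angles = np.linspace(0, 2 * np.pi, 50)
inner = np.stack([np.cos(angles), np.sin(angles)], axis=1) * 1.0   # class A, radius 1
outer = np.stack([np.cos(angles), np.sin(angles)], axis=1) * 3.0   # class B, radius 3

def feature_map(x):
    # Map (x1, x2) -> (x1, x2, x1^2 + x2^2). In this 3D space a flat plane
    # (e.g. third coordinate = 4) separates the two circles perfectly.
    return np.array([x[0], x[1], x[0] ** 2 + x[1] ** 2])

print(feature_map(inner[0])[2])   # 1.0 for every inner point
print(feature_map(outer[0])[2])   # 9.0 for every outer point
```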

Final Thoughts

Quantum machine learning is indeed quite awesome. However, it's important to note that there are physical restrictions on some of these applications, because the more qubits you have in a circuit, the harder it is to control their quantum states. Also, the quantum speedup of some of these algorithms (like the QSVM) has not been proven yet. Nonetheless, I'm pretty optimistic about the future of this field as more research is conducted. We might even have quantum processing units to replace GPUs!

a rare image of a quantum computer training itself
