What is a Quantum Computer?

Because driving all our cars, hoarding all our knowledge and building smart robots wasn’t enough, Google has now managed to accurately simulate a hydrogen molecule for the first time in history on its swanky new quantum computer. Sounds impressive, but is Google really one step closer to the ultimate dream machine? And what makes Google’s computer so much quicker than ‘classical’ computers, which would take days and days to calculate even a fragment of what Google has achieved?

The information in a classical computing device (like the device you’re reading this on) is stored in a binary format consisting of ones and zeros. Every letter or number you type, as well as every image or video you see, is translated into this binary language using particular encoding standards, such as ASCII or JPEG. The computer uses this information by translating 1s and 0s into ON and OFF states: either current is flowing through that little wire in your hardware, or not.

For example, ‘thefullapple’ translates to ‘01110100 01101000 01100101 01100110 01110101 01101100 01101100 01100001 01110000 01110000 01101100 01100101 00100000’ in binary, where the final byte, 00100000, encodes a trailing space. (Thank god for the alphabet.) Each binary digit is called a bit, and 8 bits make a byte.
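You can reproduce this translation yourself. Here is a minimal Python sketch (the function name `to_binary` is just for illustration) that converts a string into the same 8-bit ASCII representation:

```python
# Convert each character to its 8-bit ASCII code, one byte per character.
def to_binary(text):
    return ' '.join(format(ord(ch), '08b') for ch in text)

print(to_binary('thefullapple'))
# 01110100 01101000 01100101 ... (the 't', 'h', 'e' bytes from above)
```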

Superposition and Entanglement – the two key buzzwords

In a quantum computer, however, the information is encoded using quantum bits, fondly referred to as ‘qubits’. The basic idea of qubits is that they can simultaneously contain the classical ‘1’ and ‘0’ information in a so-called superposition. In some sense, the qubit can be in state 0 and 1 at the same time.  However, when we perform a measurement on the state, according to the laws of quantum mechanics, it must return either a 1 or a 0 – it is reduced back to the classical information. This sounds crazy, but is common in the quantum physics world. The most striking example of this is the baffling riddle known as Schrödinger’s cat, a thought experiment allowing a cat to be in the states ‘dead’ and ‘alive’ at the same time.
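A real qubit obeys the laws of quantum mechanics, but the bookkeeping of superposition and measurement can be sketched with ordinary code. The toy simulation below (hypothetical names, not a real quantum library) stores a qubit as two amplitudes and shows that measuring an equal superposition returns 0 or 1 at random, each about half the time:

```python
import random

random.seed(0)  # for reproducibility

# A qubit is described by two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement returns 0 with probability
# |alpha|^2 and 1 with probability |beta|^2 -- the superposition is
# reduced back to a classical bit.
def measure(alpha, beta):
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Equal superposition: amplitude 1/sqrt(2) on both |0> and |1>.
amp = 2 ** -0.5
results = [measure(amp, amp) for _ in range(10000)]
print(results.count(0), results.count(1))  # roughly 5000 each
```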


The full power of quantum computing is unleashed when we entangle many qubits together. Entanglement is a quantum process where groups of particles interact in such a way that the quantum state of each particle can no longer be described independently. The system must be described as a whole, since the behaviour of the particles is fundamentally linked together. This may all sound like one gigantic mess, and even Einstein believed that entanglement was impossible within the usual quantum mechanics definition of physical reality, calling it “spooky action at a distance”.

The effect of entanglement is that for a pair of entangled particles, actions performed on one particle also have an effect on the other, no matter the distance between them. This allows information to be encoded non-locally across a set of many particles.
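The same toy bookkeeping can illustrate entanglement. The sketch below (again with illustrative names, not a real quantum library) represents the simplest entangled pair, a Bell state, as amplitudes over the four two-qubit basis states. Neither qubit has a definite value on its own, yet every measurement finds the two in perfect agreement:

```python
import random

random.seed(1)  # for reproducibility

# Bell state (|00> + |11>)/sqrt(2): amplitudes over the four basis states.
state = {'00': 2 ** -0.5, '01': 0.0, '10': 0.0, '11': 2 ** -0.5}

def measure_pair(state):
    # Sample one basis state with probability equal to its squared amplitude.
    r, total = random.random(), 0.0
    for basis, amp in state.items():
        total += abs(amp) ** 2
        if r < total:
            return basis
    return basis  # guard against floating-point rounding

# The individual outcomes are random, but the qubits always agree.
outcomes = {measure_pair(state) for _ in range(1000)}
print(sorted(outcomes))  # never contains '01' or '10'
```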

How is quantum computing superior?

Because the qubits exist as superpositions, the quantum computer can perform the same sequence of operations on a superposition of many input strings, whereas the classical computer is limited to a single string at a time.

This allows for something known as quantum speed-up to occur. While a classical computer with N bits works on a single N-bit string at a time, a system of N qubits can, in a sense, operate on 2^N strings simultaneously, making a quantum computer exponentially faster for certain problems. A quantum computer with 1000 qubits could therefore operate on 2^1000 strings simultaneously. This is far more than the number of atoms in the observable universe, implying that even if the entire universe acted together as a perfect classical computer, the little collection of 1000 qubits would still beat it computationally.
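These numbers are easy to check. A common order-of-magnitude estimate puts the atoms in the observable universe at around 10^80, while 1000 qubits span 2^1000 basis states:

```python
# 2**1000 has about 302 decimal digits; 10**80 is the usual rough
# estimate for the number of atoms in the observable universe.
n_states = 2 ** 1000
atoms_in_universe = 10 ** 80
print(len(str(n_states)))            # 302
print(n_states > atoms_in_universe)  # True
```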

This makes quantum computers an extremely powerful tool for examining complex data sets. Instead of looking at each piece individually, a quantum computer can, in some sense, look at all of them at once and pick out the right one.

How close are we to realising these computers?

To achieve conditions in which qubits can exist, we require very low temperatures of around a hundredth of a degree above absolute zero, colder than outer space. This is because a system of a large number of qubits is highly sensitive to interactions with its surroundings.  If qubits start interacting and entangling with their surroundings, information can quickly become inaccessible, and thus lost forever. To avoid this, the interactions of the system are frozen out by the low temperatures. These conditions are very difficult to achieve, which is why progress so far has been quite slow.

Quantum computers are starting to go commercial at D-Wave, a small Canadian company now heavily supported by Google and NASA. Google has also announced that it has hired a team of academics to develop its own quantum computers based on D-Wave’s approach. According to Microsoft’s research lab, we could crack the quantum computing code within the next 10 years.

Algorithms already exist that could make a quantum computer 100 million times faster than a conventional one. This would suddenly allow us to solve, in a few days, problems that would take millions of years on a classical computer.

So, Google simulating a hydrogen molecule accurately is a pretty huge breakthrough. By harnessing the incredible power of a quantum computer, they’ve been able to efficiently simulate the energy and behaviour of a single molecule, and it won’t stop there. Next, they hope to scale up and simulate larger molecules in larger chemical systems, which could have a huge real-world impact.

Quantum computing is almost here, and when it arrives, it promises to revolutionise the way we look at computing entirely.

 

Writing an essay or want to learn more?
Interesting articles on Quantum Computing

A comprehensive book on the subject

  • Quantum Computation and Quantum Information, Nielsen, M. A., and Chuang, I. L., Cambridge University Press, 2010.

Websites

  • The Quantised World: A great introduction to quantum theory from the educational section of the Nobel Prize website. Includes links to biographies of the various Nobel Prize winning physicists who contributed to the foundations of quantum theory.
