It’s a Small World

The advent of modern computing came startlingly quickly, but had been in gestation for over a hundred years. In the early 19th century, Charles Babbage designed the first ‘programmable’ computers, based on thousands of hand-cranked gears, yet by the start of the 20th century analogue electronics were achieving increasingly sophisticated calculations.

During the mid-to-late 20th century, semiconductor transistors went from laboratory oddities to ubiquitous commodities, revolutionising the modern world in the process. They now appear in everything from washing machines to university supercomputers.

In 1965, Intel co-founder Gordon E. Moore published a paper modelling the rate of development of computer hardware, observing that the number of transistors that could be placed on an integrated circuit had been doubling roughly every year, a rate he later revised to approximately every two years. Somewhat surprisingly, this trend continued to hold in the decades that followed, and the fulfilment of ‘Moore’s Law’, as it came to be known, soon became a benchmark for the success of computer manufacturing.

To keep the pace set by Moore’s Law, improvements to traditional computers are continually proposed. Optical fibres, which use light rather than electricity to transmit data, have replaced electrical wires and increased broadband internet speeds across the country. Graphene, a chicken-wire lattice of carbon just one atom thick, has unique material properties that are only just being explored and could revolutionise microchip design.

However, all of these technologies are based on the classical design of a computer. To increase the power of such electronic devices, their circuits are shrinking towards atomic scales. Unfortunately, this is where classical physics begins to break down, and traditional chip design becomes inadequate. At the atomic level we must instead rely on quantum mechanics, which explains the uncertainties and non-ideal performance of devices at these scales. But can we use quantum mechanics to our advantage? Is there a more fundamental improvement we could make?

Computing relies on processing digital units of information known as bits (represented mathematically by 0s and 1s). In its simplest form, a bit could be represented by a particle being in one position or another, or by a light switch being on or off. Quantum mechanics permits a strange property called superposition, which allows the particle to be in both positions at the same time, as long as it is not observed, as if the switch were both on and off simultaneously. If we use this quantum particle as our unit of information, it can be both 0 and 1 at the same time, and is known as a quantum bit, or ‘qubit’.
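In the standard notation of quantum mechanics (a brief aside, not part of the original article), this idea is written as a weighted combination of the two classical states:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]

where the complex amplitudes \(\alpha\) and \(\beta\) determine the probabilities \(|\alpha|^2\) and \(|\beta|^2\) of finding 0 or 1 when the qubit is finally measured.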

In classical computing, 8 regular bits can represent any whole number from 0 to 255, whereas 8 qubits in superposition can represent all 256 numbers simultaneously. If we wanted to perform a calculation on each number, classically we would need 256 separate calculations (one per number). But since the qubits represent all the numbers at the same time, we can effectively perform the calculation on every number at once.
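To make the counting concrete, here is a minimal sketch (an illustrative classical simulation with made-up variable names, not something from the article) of why 8 qubits are described by 256 numbers: an n-qubit state is a list of 2^n amplitudes, one per whole number from 0 to 2^n − 1.

```python
# A minimal classical simulation (illustrative only) of an 8-qubit register:
# its state is a vector of 2**8 = 256 complex amplitudes, one for each
# whole number from 0 to 255.
import numpy as np

n = 8                      # number of qubits
dim = 2 ** n               # 256 basis states, i.e. the numbers 0..255

# A uniform superposition: every number 0..255 carries equal amplitude.
state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# An example 'calculation': flip the sign of the amplitude of every number
# divisible by 7 (a simple phase-marking step).
numbers = np.arange(dim)
state[numbers % 7 == 0] *= -1

print(dim)                                           # 256
print(np.isclose(np.sum(np.abs(state) ** 2), 1.0))   # still a valid quantum state
```

The caveat, of course, is that this classical simulation still loops over all 256 entries behind the scenes; the promise of a real quantum computer is that those 256 amplitudes live physically in just 8 qubits, so one operation on the register acts on all of them at once.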

Unlike current computers, which are based on electrons in silicon transistors, quantum computers have no standard method of construction. Experiments have demonstrated that these theoretical musings can be physically realised, but the form in which they should be implemented has yet to be decided. And if the quantum computer is to become an everyday reality, it has major hurdles to overcome.

So far, prototypes can only perform simple calculations, and they are susceptible to decoherence, a problem in which the fragile superposition of the qubits is destroyed by interactions with the environment (see ‘A State of Collapse’ on p 20).

Furthermore, a standard microchip stores billions of bits at any one time. In comparison, cutting-edge quantum experiments have only managed to control 12 qubits. Clearly there is still a long way to go.

Quantum computing could improve on its classical analogue, advancing technology to unimaginable regimes of speed, power and versatility. Moore’s Law may not hold for much longer for semiconductor-based computers, but the quantum revolution could supersede it.

The battle of human ingenuity against the physical limitations of computation started slowly and has accelerated exponentially for over half a century. Can we keep up the pace?

Philip Sibson is a 3rd year undergraduate in Engineering at Exeter College.
Art by Sam Roots.
