# August 21, 2023 - Bits, Bytes and Qubits—Here Comes the Quantum Computer

Although the idea of quantum computing dates back to the 1980s, public interest and wonder in this revolutionary technology have seen a recent renaissance.

In 2019, Google claimed that it had achieved quantum supremacy. The tech giant had solved a problem with a quantum computer that would take thousands of years for a conventional computer to complete.

You should take that claim with a grain of salt, though: IBM, another tech giant with a lineup of some of the world's most impressive conventional supercomputers, countered that Google had underestimated what *those* machines could do on the same problem.

Still, whichever way you slice it, quantum computers have incredibly impressive potential, as does the invention underneath it all: the qubit.

The most basic unit of information stored on a computer is a bit, whose value is a 0 or a 1 (an ‘on’ or an ‘off’). A computer decides whether a bit is a 0 or a 1 based on electrical charge, and all of its machine logic comes down to various combinations of those 0s and 1s.

The standard collection of bits is called a byte, which holds exactly 8 bits. Bits are so small that you rarely work with just one, so the byte became the common unit. A byte is just enough information to store a single ASCII character such as ‘a’, which translates to 01100001 in the binary language of 0s and 1s.
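The byte example above is easy to verify yourself. A short Python snippet shows the 8-bit binary pattern behind any ASCII character:

```python
# ord() gives a character's ASCII code point; the '08b' format spec
# renders it as an 8-digit binary string, i.e. one byte.
print(format(ord('a'), '08b'))  # -> 01100001
print(format(ord('b'), '08b'))  # -> 01100010
```

Every character you type is ultimately stored as a pattern like this.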

And then comes the superposition revolution.

As computer components get smaller and smaller, we step into a world governed by fundamentally different physical laws. At this level, there is a whole new unit of information: the qubit. The difference between a bit and a qubit is that a qubit can be in a superposition. That means a qubit can be both a 0 and a 1 at the same time, an ‘on’ and an ‘off’ at once.

In practice, a qubit only assumes a definite value when it is measured. Until then, superposition lets a collection of qubits represent vastly more states than the same number of bits: while n bits hold exactly one n-bit pattern at a time, n qubits can exist in a superposition over all 2^n possible patterns at once. It took 8 bits to store the ‘a’ ASCII character, one specific pattern out of 256. A register of 8 qubits could hold a superposition of all 256 of those byte values simultaneously.
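The superposition idea can be sketched with a tiny, pure-Python toy model of a single qubit (a minimal illustration, not a real quantum SDK):

```python
import math

# A single qubit is described by two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.

def hadamard(state):
    """Apply a Hadamard gate, which puts a definite state into superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

zero = (1.0, 0.0)      # the definite state |0> ('off')
plus = hadamard(zero)  # an equal superposition of |0> and |1>

p0 = abs(plus[0]) ** 2  # probability of reading a 0
p1 = abs(plus[1]) ** 2  # probability of reading a 1
print(round(p0, 3), round(p1, 3))  # -> 0.5 0.5
```

Until the qubit is read, it genuinely carries both amplitudes; measurement forces it to one definite value, 0 or 1, with the probabilities shown.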

At that scale, though, it's easy to miss exactly how impressive qubits are, because each additional qubit doubles the number of states in the superposition. A register of 3 qubits spans 2^3 = 8 possible states, while a register of just 13 qubits spans an incredible 2^13 = 8192.
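That doubling is exactly why classical machines struggle to simulate quantum ones. A quick sketch of the growth (assuming roughly 16 bytes per complex amplitude in double precision, a common convention for simulators):

```python
# Each qubit doubles the number of complex amplitudes needed to
# describe the register on a classical computer.
for n in (3, 13, 30, 50):
    amplitudes = 2 ** n
    approx_bytes = amplitudes * 16  # ~16 bytes per double-precision complex
    print(f"{n} qubits -> {amplitudes} amplitudes (~{approx_bytes} bytes)")
```

By 50 qubits, the full state would need petabytes of classical memory, which is why claims like Google's are framed around what conventional supercomputers cannot keep up with.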

You can already picture what that means for storage and computation. Is this the next great evolution in the realm of human technology? Or are quantum computers going to be so rare and highly specialised that they only exist in news articles?

Only time will tell. In the meantime, enjoy this random YouTube comment and stay excited about tech!