My previous article laid the foundation for how data is represented with current technology: at its most granular level, data is represented by a “bit”, which is simply an electronic state that is either on or off. Much like a light switch can only have two positions, the bit can only have two possible values. Last month’s article can be found here: Quantum Computing Primer - Understanding Bits and Bytes, and it is the basis for this month’s article.
A bit is stored and manipulated at the electrical level and is represented by current flowing to a circuit. There is no middle ground, nor is it possible for the current to be both on and off at the same time. A bit can only have one of two states – on or off.
The concept of quantum computing is not new – it has been around since the early 1980s – but only recently have there been advances suggesting that quantum computing is more than a theory and is possible in the real world, although for now only on a very small scale. What makes this technology a new frontier is that it is moving out of the realm of the theoretical: recent experiments suggest that it is genuinely achievable.
In 2012, two scientists were awarded a Nobel Prize for experiments in which they demonstrated that the same light particle could be in two different spots at the same time.
Just as the “bit” is the most granular unit of conventional technology, quantum computing introduces its own version of the bit called a “qubit” (quantum bit).
The following is admittedly a bit mind-blowing, especially for someone like me who is not a science major, because the qubit represents a revolutionary approach to data storage.
What makes the technology of quantum computing so intriguing is that the qubit can be on or off like the traditional bit, but it can also be on and off at the same time. What makes the qubit so powerful is that it is not limited to these three states (on, off, on AND off); it can also be in any state BETWEEN on and off, all at the same time, a property physicists call “superposition”.
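To make that idea a little more concrete, here is a minimal sketch of my own (purely illustrative, not taken from any real quantum hardware or library) of how a single qubit’s state is usually described: as two weights, one attached to “off” (0) and one attached to “on” (1), whose squared values must always add up to 1.

    import math

    # A classical bit: exactly one of two states at any moment.
    bit = 1                        # on

    # A qubit, in this illustration: two weights ("amplitudes"), one for off
    # and one for on. Their squared values must always add up to 1.
    weight_off = 1 / math.sqrt(2)
    weight_on = 1 / math.sqrt(2)

    # If we measured this qubit, these would be the chances of reading each value.
    chance_off = weight_off ** 2   # 0.5, i.e. a 50% chance of reading "off"
    chance_on = weight_on ** 2     # 0.5, i.e. a 50% chance of reading "on"
    assert abs((chance_off + chance_on) - 1.0) < 1e-9

In this particular sketch the qubit is an equal blend of on and off; dialing the two weights up or down gives you every “state between on and off” described above.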
Quantum computing is not based on electrical signals and current; instead, data is stored and manipulated at the atomic level.
How exactly does this benefit technology? Well, recall from the previous article that each character we type is comprised of 8 bits. The word “Hello” uses 5 bytes, and since there are 8 bits per byte, the most that can be done with 40 bits of information is to store a five-character word.
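Just to spell out that arithmetic (a trivial sketch of my own):

    word = "Hello"
    bits_per_byte = 8
    total_bits = len(word) * bits_per_byte   # 5 characters x 8 bits
    print(total_bits)                        # prints 40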
The generally accepted theory is that a quantum computer containing “only” 30 qubits (roughly the equivalent of the word “hello” in existing bit technology) would be literally millions of times more powerful than today’s computers, thanks to a concept called “inherent parallelism”, which suggests that a quantum computer can work on millions of operations simultaneously instead of just one at a time.
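A rough way to see where that “millions of times” figure comes from (again, a back-of-the-envelope sketch of my own, not a benchmark): a register of 30 ordinary bits holds exactly one of its possible values at any given moment, while a register of 30 qubits carries a weight for every one of those values at the same time.

    n = 30

    # A classical register of n bits can represent 2**n different values,
    # but it only ever holds ONE of them at any instant.
    values_one_at_a_time = 2 ** n              # 1,073,741,824

    # A register of n qubits carries a weight (amplitude) for EVERY one of
    # those 2**n values at once: the "inherent parallelism" mentioned above.
    weights_all_at_once = 2 ** n               # 1,073,741,824

    print(f"{values_one_at_a_time:,} possible values, one at a time (classical)")
    print(f"{weights_all_at_once:,} weights carried at once (30 qubits)")

Over a billion weights carried at once, versus a single value at a time, is the intuition behind that claim, though actually extracting useful answers from all of that parallelism is far harder than this sketch lets on.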
Even though this technology is still primarily theoretical, it is laying the groundwork for a technological revolution unlike anything any of us has experienced. In the early 1800s, Charles Babbage conceived of the first programmable machine, the precursor to the computer. I consider the quantum computer likely to have the same kind of impact.
How long it will take this technology to become mainstream is difficult to say. From the little that I’ve read, it will probably take at least 5 years to move this technology out of the labs and into a personal computer prototype, and then perhaps another 5 years before the technology matures to the point that it could be mass produced. As always, these new technologies are very expensive, since there is so much research and development cost to recoup up front, but I’m still excited at the thought of what this will mean for the computing world.