Roadblocks to the Widespread Adoption of Quantum Computing: Part 1

In previous posts, I've outlined the importance of Quantum computing at a basic level. Even so, I've recently realized that one particular question deserves a bit more explanation: why aren't Quantum computers being mass-produced yet?

First and foremost, it’s important to reiterate that Quantum computers are rare for one key reason: their production is highly siloed. The materials and expertise needed to build such a machine are held almost exclusively by the companies with the most capital to spend. In other words, those who can already access the power of Quantum computers are mostly very large, well-known firms like Google. Beyond that, even though the technology exists, it hasn’t yet been “industrialized,” as several of the sources below point out. In that sense, even the Googles of the technology industry face significant roadblocks in bringing Quantum computing to the masses.

One roadblock that comes to the forefront is that even the famous Moore’s Law has its own limits. In short, the trend of processing power doubling every two years as transistors shrink can only go on for so long, both because we are running out of space for transistors on computer chips and because their underlying structures can only handle so much processing. If we really want to understand how all of this relates to the rise of Quantum computing, it may help to turn to an expert on the subject. Unfortunately, in most cases, one thing seems to be missing from their content.
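To get a rough feel for what “doubling every two years” actually implies, here is a small, purely illustrative Python sketch. The starting transistor count, starting feature size, and the square-root scaling rule are simplified assumptions, not exact industry figures, but they show how quickly the numbers run into atomic scales.

```python
# Rough illustration of Moore's Law: transistor counts doubling every two years.
# Starting values below are illustrative assumptions, not exact industry figures.

start_year = 2020
start_transistors = 50e9   # assume ~50 billion transistors on a high-end chip
start_feature_nm = 5.0     # assume a ~5 nm process node

for years_out in range(0, 21, 2):
    # Each two-year step doubles the transistor count...
    transistors = start_transistors * 2 ** (years_out / 2)
    # ...which loosely means features shrink by a factor of sqrt(2) per doubling,
    # since twice as many transistors must fit into roughly the same area.
    feature_nm = start_feature_nm / (2 ** (years_out / 4))
    print(f"{start_year + years_out}: ~{transistors / 1e9:.0f}B transistors, "
          f"~{feature_nm:.2f} nm features")
```

Run the sketch and the projected feature size drops below a nanometer within a couple of decades, which is roughly the scale of a handful of atoms. That, in a nutshell, is why the trend cannot continue forever.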

There doesn’t seem to be a primer that puts the shortcomings of traditional computing and the advantages of Quantum computing into easily understandable terms. Many come close, but most are full of industry jargon.

One way to do so might be to home in on a single case study like the Blockchain.

In doing so, we could then ask: how do the limitations of traditional computers play out in the Blockchain space?

Of course, there’s the mining process. As anyone with experience in Cryptocurrencies has already realized, traditional computers simply can’t keep up with application-specific integrated circuits (ASICs) when it comes to verifying blocks on a given network. The reason this trend continues to dominate Crypto mining is that the processing power required to solve a network’s hashing puzzle keeps increasing over time. Because of this, the average computer has quickly become obsolete for mining, while ASICs have flourished for those with the capital and time to build them.

Beyond mining, there’s the question of maintaining network security as traditional computing gives way to Quantum computing. In some circles, it is already theorized that if current Blockchain networks do not update their security measures in response, it will effectively be their undoing. Unfortunately, whether or not this is true can’t really be proven until someone connects one or more Quantum computers to a Blockchain network and tries to break it. This is not to say that anyone should do this, but rather that it is likely to happen.
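To put the mining point above in concrete terms, here is a minimal, toy proof-of-work loop in Python. It is not a real miner: the block data, the difficulty values, and the leading-zeros rule are simplified assumptions. Still, it shows why rising difficulty punishes general-purpose CPUs and rewards specialized hardware.

```python
import hashlib


def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    """Find a nonce whose SHA-256 hash of (block_data + nonce) starts with
    `difficulty` hex zeros -- a toy version of Bitcoin-style proof of work."""
    target_prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest
        nonce += 1


# Each extra leading zero multiplies the expected number of attempts by 16,
# which is why general-purpose CPUs fall further behind ASICs as difficulty climbs.
for difficulty in range(1, 6):
    nonce, digest = mine("example block header", difficulty)
    print(f"difficulty {difficulty}: nonce={nonce}, hash={digest[:16]}...")
```

Even on this toy example, each step up in difficulty takes noticeably longer on an ordinary machine; real networks push the difficulty far beyond what a laptop can handle.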

Beyond mining and security, there’s also the potential for Blockchain networks to perform at exponentially higher levels if they are populated with Quantum computers. This, again, is an untested argument, though an easier one to understand once the fundamental features of a Quantum computer are in mind. In our next post, I’ll dig into what those features are and how they drive improvements in computing. For now, I’ll leave you with one final thought on the subject: right now, your computer processes data with a horde of transistors.

An easy way to conceptualize transistors is to think of gates that open and close to let electrons pass from one end of a circuit to the other. If we take Moore’s Law into account, we can conclude that as transistors continue to shrink, they will eventually approach the size of a cluster of atoms. Once that happens, it is widely theorized that a process called Quantum tunneling will kick in: electrons will simply slip through a closed gate rather than being stopped by it. With this in mind, it becomes relatively easy to see how the architecture of a traditional computer could become obsolete. Some have even said that we are only a few nanometers away from this phenomenon.
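If it helps to picture the “gate” analogy, here is a tiny Python sketch that models a transistor as an ideal on/off switch and builds an AND gate from two of them in series. The tunneling problem, loosely speaking, is that an atom-scale switch would sometimes conduct even when it is told to be off; the `leak_probability` parameter below is a made-up stand-in for that effect, not a physical model.

```python
import random


def transistor(gate_on: bool, leak_probability: float = 0.0) -> bool:
    """Idealized transistor: conducts when the gate is on.
    `leak_probability` is a made-up stand-in for quantum tunneling,
    where electrons occasionally get through a closed gate anyway."""
    if gate_on:
        return True
    return random.random() < leak_probability


def and_gate(a: bool, b: bool, leak: float = 0.0) -> bool:
    # Two transistors in series: current flows only if both conduct.
    return transistor(a, leak) and transistor(b, leak)


# With no leakage, the gate behaves exactly as expected.
print([and_gate(a, b) for a in (False, True) for b in (False, True)])

# With leakage ("tunneling"), a closed gate sometimes reads as open,
# which is what would make atom-scale transistors unreliable.
random.seed(0)
errors = sum(and_gate(False, False, leak=0.1) for _ in range(10_000))
print(f"Outputs that leaked from False to True: {errors} out of 10,000")
```

The point of the sketch is simply that classical logic depends on the gate being a reliable switch; once electrons can sneak through a closed gate, the whole scheme starts to break down.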

In our next post, we’ll bring the rest of Quantum computing’s key driving factors to light, while examining just how close we are to its true inception.

Resources:

https://www.wired.com/2017/01/d-wave-turns-open-source-democratize-quantum-computing/

https://www.bloomberg.com/news/videos/2018-09-12/quantum-computing-is-one-step-closer-to-reality-video

https://medium.com/@qilimanjaro/introducing-quantum-computing-523dcf8a4284

https://towardsdatascience.com/the-need-promise-and-reality-of-quantum-computing-4264ce15c6c0

https://www.investopedia.com/terms/m/mooreslaw.asp

https://www.simonsfoundation.org/2018/07/18/ar2017-scott-aaronson/

https://www.youtube.com/watch?v=JhHMJCUmq28

Testing your own Quantum Program: https://medium.com/qiskit/the-atoms-of-computation-ae2b27799eaa
