Is Quantum Computing About to Transform the World of Tech As We Know It?
Mar 2, 2021

Landmark technologies have a tendency to take a while before they realise their full potential. As with cloud computing, IoT, and 5G mobile, we seem to have been hearing just how ‘transformative’ quantum computing will be for quite some time.

The longer it takes for the hype to be realised, the more doubts are raised.

Quantum computing has been a ‘what if’ technology for so long now that many have already written it off as a nice idea that will never be viable.

The first machines built on the principles of quantum computing appeared in the early 2000s. D-Wave claimed the first launch of a product marketed as a quantum computer in 2010, and IBM started running quantum computers on the IBM Cloud way back in 2016.

Throughout this period, we’ve been promised the biggest breakthrough in computer processing since the silicon chip, with a vast acceleration in speed and capacity to match the data-hungry world we now live in. We have waited with bated breath. And waited. And waited…

Yet there are signs that quantum computing could almost be ready to emerge from its chrysalis and spread its wings. Amidst a flurry of quantum-related announcements in the early weeks of 2021, perhaps the most significant so far is the news from Microsoft that it is going live with a quantum service on its Azure platform. IBM might have scored a world first with its pioneering quantum-driven cloud offer five years ago. But Azure is the world’s second-largest cloud infrastructure platform. If that’s not a sign of quantum tech going mainstream, nothing is.

So what exactly is quantum computing, and why might it be time to get just a little bit excited?

Beyond binary

Understanding the details of how quantum computing works is best left to quantum physicists. The important point is that quantum computing handles information processing in a very different way to a traditional computer.

The basic unit of conventional digital data is the bit, short for binary digit – binary because it has two possible values, 1 or 0. This is a reflection of the fact that an electronic logic gate can have two physical states – current on or current off, gate open or gate closed. It sounds simple, but that’s the foundation of the astonishing digital revolution that has fundamentally changed the world we live in.

By contrast, a quantum bit, or qubit, doesn’t have such a straightforward either/or value. In fact (and this is where you really need a degree in quantum physics to make full sense of it), a qubit can have a value of 1 and 0 at the same time. This is because the fundamental quantum laws of sub-atomic matter tell us that particles can exist in two apparently opposite states or positions at once, or at least hold the probability of being one or the other simultaneously. This is a concept called superposition.
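For readers who prefer code to physics, the idea of superposition can be sketched in a few lines. The snippet below is a simplified illustration, not how a real quantum computer works: it describes a single qubit as a pair of complex amplitudes and computes the probability of measuring 0 or 1 (the so-called Born rule). The function name `measurement_probabilities` is ours, chosen for this example.

```python
import math

# A single qubit is described by two complex amplitudes (alpha, beta),
# normalised so that |alpha|^2 + |beta|^2 = 1. Measuring the qubit
# yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
def measurement_probabilities(alpha: complex, beta: complex):
    norm = abs(alpha) ** 2 + abs(beta) ** 2  # re-normalise defensively
    return abs(alpha) ** 2 / norm, abs(beta) ** 2 / norm

# An equal superposition: the qubit is '1 and 0 at the same time',
# in the sense that either outcome is equally likely on measurement.
alpha = beta = 1 / math.sqrt(2)
p0, p1 = measurement_probabilities(alpha, beta)
print(p0, p1)  # 0.5 and 0.5 (to floating-point precision)
```

Until it is measured, the qubit genuinely carries both amplitudes at once – that is the extra information a quantum machine can exploit.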

Quantum computers marry the mind-bending yes/no/maybe possible states of sub-atomic particles to ingenious algorithms that are able to work out the probabilities of how qubits interact. The result is information processing systems with a power and capacity far beyond anything conventional bit-based computers could conceivably achieve.
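A rough way to see why that power grows so fast: describing an n-qubit register classically takes 2 to the power n complex amplitudes, one per possible bit pattern. This back-of-the-envelope sketch just prints how quickly that number explodes.

```python
# Describing n qubits classically takes 2**n complex amplitudes -
# one per basis state. The count doubles with every qubit added,
# which is why conventional machines cannot keep up.
for n in (10, 50, 100):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```

At 50 qubits the count already exceeds a quadrillion, and at 100 qubits it dwarfs any conceivable classical memory – which is part of why the 50-to-100-qubit machines discussed below sit right at the edge of what conventional computers can simulate.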

And now we are starting to see this potential have significant real-life impact.

Scaling the concept

One of the big barriers to quantum computing becoming commercially viable to date has been the difficulty of scaling qubits. This is largely because it is so difficult to hold sub-atomic particles in a quantum state for long enough to make them a viable medium for information processing. At present, because of the physical barriers involved, the most powerful quantum processors are still only capable of running between 50 and 100 qubits.

But technological breakthroughs are now happening at pace.

IBM, for example, has announced a new open source software development kit, Qiskit, that is compatible with a hybrid quantum-conventional execution environment. Scheduled for release in 2021, IBM says Qiskit increases execution speeds up to 100-fold. That’s only a fraction of the potential uplift quantum-only systems are expected to achieve. But significantly, IBM expects Qiskit to reduce the timescale before we start seeing commercial developers writing applications designed to take advantage of quantum-assisted processing speeds.

By 2023, IBM expects to have commercially viable systems running 1,000-plus qubits – seen as a critical tipping point after which the value of the benefits gained from quantum systems is likely to be higher than the costs involved. In another significant development, Microsoft and a team at the University of Sydney have announced the development of a new chip dubbed the ‘Gooseberry’ that is capable of running thousands of qubits at temperatures close to absolute zero – necessary for stabilising sub-atomic particles in quantum states for long enough to use.

In the meantime, we are already seeing real-world benefits of quantum processing, albeit at a relatively small scale. Canadian grocery chain Save-On-Foods, for example, has enlisted quantum computing pioneer D-Wave to resolve a logistical problem of gargantuan mathematical complexity. Sure enough, D-Wave’s solution has reduced compute time for some tasks from 25 hours down to seconds.

We already live in a world transformed by algorithms. After the breakthroughs achieved by Big Data and AI, it is largely this promise of taking data processing to a whole new level again that is driving massive investment in the technology. There is still some way to go, but we may be entering the home straight.