Quantum 'toggle switch' will eliminate computing errors, researchers claim
NIST says the new approach will cut down on so-called 'noise' that produces inconsistent results in many quantum computers
The National Institute of Standards and Technology (NIST) has published research into a ‘toggle switch’ aimed at making quantum computers more versatile and helping to cut down on noisy circuits.
Noise, essentially random behaviour that can create errors in qubit calculations, has long bedevilled the designers of quantum computers.
IBM recently published research revealing how errors can be mitigated to the point where a quantum computer could outperform classical supercomputing methods.
Rather than suppressing errors, IBM’s researchers explored a portfolio of mitigation techniques, including probabilistic error cancellation and Twirled Readout Error eXtinction (TREX).
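To give a flavour of how probabilistic error cancellation works in principle, here is a toy sketch, not IBM’s implementation, for a single qubit corrupted by a bit-flip channel: the inverse of the noise channel is written as a signed mixture of operations, sampled at random, and the re-weighted outcomes average out to the noise-free value at the cost of extra variance.

```python
# Toy sketch (not IBM's implementation): probabilistic error cancellation
# on a single qubit. A bit-flip channel with probability p corrupts the
# measurement of <Z> on |0>; we "undo" it on average by sampling from a
# quasi-probability decomposition of the inverse channel and re-weighting.
import numpy as np

rng = np.random.default_rng(0)
p = 0.1                      # bit-flip probability of the noise channel
shots = 200_000

# Inverse of the bit-flip channel as a quasi-probability mix of I and X:
#   E^{-1} = q_I * I(.)I + q_X * X(.)X, with q_I = (1-p)/(1-2p), q_X = -p/(1-2p)
gamma = 1.0 / (1.0 - 2.0 * p)          # sampling overhead (sum of |q_i|)
probs = np.array([1.0 - p, p])         # |q_i| / gamma
signs = np.array([+1.0, -1.0])         # sign(q_i)

noisy_vals, pec_vals = [], []
for _ in range(shots):
    bit = 0                             # ideal state |0>, so ideal <Z> = +1
    if rng.random() < p:                # noise: bit flip with probability p
        bit ^= 1
    noisy_vals.append(1 - 2 * bit)

    # PEC: sample a correction (I or X), apply it, and re-weight the outcome
    k = rng.choice(2, p=probs)
    corrected = bit ^ (1 if k == 1 else 0)
    pec_vals.append(gamma * signs[k] * (1 - 2 * corrected))

print("ideal <Z> = 1.0")
print(f"noisy <Z> ~ {np.mean(noisy_vals):.3f}")   # about 1 - 2p = 0.8
print(f"PEC   <Z> ~ {np.mean(pec_vals):.3f}")     # about 1.0, with larger variance
```

The re-weighting factor grows as the noise gets stronger, which is why mitigation of this kind trades a larger number of circuit repetitions for accuracy rather than removing errors outright.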
The NIST approach relies on what is referred to as a “toggle switch”, a device that connects the qubits to a circuit called a “readout resonator”. When the switch is off, the elements are isolated from one another; when it is on, the qubits can interact and perform calculations. Once a calculation is complete, the switch can connect a qubit to the readout resonator so the result can be read out.
The toggle switch and readout circuit are built from superconducting components, and the switch itself is a superconducting quantum interference device (SQUID). Interactions between the qubits and the readout resonator are induced by driving a microwave current through a nearby antenna loop.
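A rough way to picture the switch’s role is as a coupling between a qubit and the resonator that can be turned on and off. The minimal model below is my own illustration rather than NIST’s device Hamiltonian: it restricts the pair to a single excitation, so that with the coupling off the qubit keeps its excitation, and with it on the excitation swaps into the resonator, which is the basic operation needed for readout.

```python
# Minimal toy model (an illustration, not NIST's device model): a qubit and a
# readout resonator restricted to a single shared excitation. With the
# "switch" off the coupling g is zero and the excitation stays in the qubit;
# with the switch on, an exchange interaction swaps it into the resonator.
import numpy as np
from scipy.linalg import expm

def evolve(state, g, t, detuning=0.0):
    """Evolve |qubit, resonator> in the single-excitation basis {|10>, |01>}."""
    # H/hbar = [[detuning/2, g], [g, -detuning/2]]  (exchange coupling g)
    H = np.array([[detuning / 2.0, g],
                  [g, -detuning / 2.0]])
    return expm(-1j * H * t) @ state

excitation_in_qubit = np.array([1.0 + 0j, 0.0 + 0j])   # state |10>

g = 2 * np.pi * 5e6          # illustrative 5 MHz coupling when the switch is "on"
t_swap = np.pi / (2 * g)     # time for a full qubit -> resonator swap

off = evolve(excitation_in_qubit, g=0.0, t=t_swap)     # switch off: isolated
on = evolve(excitation_in_qubit, g=g, t=t_swap)        # switch on: swap

print("switch off, P(excitation still in qubit) =", abs(off[0]) ** 2)  # ~1.0
print("switch on,  P(excitation in resonator)   =", abs(on[1]) ** 2)   # ~1.0
```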
While currently at the research stage, the approach also has the benefit of introducing programmability to an otherwise static architecture.
Does this mean that quantum computers are just around the corner?
Possibly. The NIST research so far involves only two qubits and a single resonator, and the device must be operated at very low temperatures, but plans are afoot to add more qubits and resonators.
However, actually building a quantum computer with such techniques and enough qubits to solve currently insurmountable problems remains some way off.
That said, quantum computing does seem to be nearing utility. Following its error mitigation research, IBM expects to complete its Quantum systems over the next year. The machines will be powered by a minimum of 127 qubits and, according to IBM, will provide access to computational power “large enough to surpass classical methods for certain applications”.
IBM has coined the term ‘utility-scale’: the point at which quantum computers could serve as scientific tools to explore a new scale of problems that classical systems may never be able to solve. With the company’s IBM Quantum systems due for completion by the end of 2024, that point is now in sight.
Other companies, such as Intel and Microsoft, have made similar strides toward usefulness.
What could this technology be used for?
Quantum computing has historically been long on promise and short on results. However, the IBM team has been able to generate large, entangled states that simulate the dynamics of spins in a model of a material and accurately predict properties such as its magnetisation.
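For a sense of what “simulating the dynamics of spins” means in practice, the sketch below runs a Trotterised transverse-field Ising chain on a handful of spins and tracks the average magnetisation. The sizes and angles are illustrative only; IBM’s experiment ran a far larger, 127-qubit version of this kind of calculation on hardware, beyond what brute-force classical simulation can reach.

```python
# Toy classical sketch of the kind of spin-model calculation described above:
# Trotterised dynamics of a small transverse-field Ising chain, tracking the
# average magnetisation <Z>. All parameters here are illustrative.
import numpy as np

n = 6                        # number of spins (kept tiny for exact simulation)
theta_x = 0.6                # transverse-field kick angle per Trotter step
theta_zz = 0.3               # Ising coupling angle per Trotter step
steps = 10

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(site_ops):
    """Tensor single-site operators (dict: site -> 2x2 matrix) into an n-spin operator."""
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, site_ops.get(i, I2))
    return out

# One Trotter step: exp(-i*theta_x*X_i) on every site,
# then exp(-i*theta_zz*Z_i Z_{i+1}) on every bond.
step = np.eye(2 ** n, dtype=complex)
for i in range(n):
    Xi = op_on({i: X})
    step = (np.cos(theta_x) * np.eye(2 ** n) - 1j * np.sin(theta_x) * Xi) @ step
for i in range(n - 1):
    ZZ = op_on({i: Z, i + 1: Z})
    step = (np.cos(theta_zz) * np.eye(2 ** n) - 1j * np.sin(theta_zz) * ZZ) @ step

state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0               # start with all spins up

Z_avg = sum(op_on({i: Z}) for i in range(n)) / n
for t in range(steps + 1):
    mag = np.real(state.conj() @ (Z_avg @ state))
    print(f"step {t:2d}: average magnetisation <Z> = {mag:+.3f}")
    state = step @ state
```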
Other uses include healthcare and life sciences, where researchers are looking into using quantum chemistry and quantum machine learning to accelerate molecular discovery. There are potential applications in high energy physics and optimization as well as further work in materials simulation.
The phrase “quantum machine learning” might give some observers pause. Dell CTO John Roese recently argued that the technology would prove far more disruptive than the current upheaval caused by generative AI.
There is also concern that quantum computers, once useful, could be used to decrypt data secured with, for example, the RSA algorithm.
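The worry is concrete: RSA’s security rests on the difficulty of factoring the public modulus, and Shor’s algorithm would make that factoring step efficient on a sufficiently large quantum computer. The toy example below, using textbook-sized numbers rather than real key lengths, shows that anyone who can factor the modulus can reconstruct the private key.

```python
# Toy illustration (textbook-sized numbers, not a real key) of why factoring
# breaks RSA: anyone who can factor the public modulus n can recompute the
# private exponent d. Shor's algorithm would make the factoring step
# efficient on a large enough quantum computer.
p, q = 61, 53                    # secret primes (tiny, for illustration only)
n = p * q                        # public modulus
e = 17                           # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)              # private exponent

message = 42
ciphertext = pow(message, e, n)          # encrypt with the public key (n, e)
assert pow(ciphertext, d, n) == message  # decrypt with the private key d

# The "attack": factor n by trial division, then rebuild d from the public key alone.
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_recovered = pow(e, -1, (p_found - 1) * (q_found - 1))
print("recovered plaintext:", pow(ciphertext, d_recovered, n))   # prints 42
```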
Quantum computing has long proven elusive, but recent developments suggest that a useful machine, capable of outpacing classical approaches, is now years rather than decades away.
Richard Speed is an expert in databases, DevOps and IT regulations and governance. He was previously a Staff Writer for ITPro, CloudPro and ChannelPro, before going freelance. He first joined Future in 2023 having worked as a reporter for The Register. He has also attended numerous domestic and international events, including Microsoft's Build and Ignite conferences and both US and EU KubeCons.
Prior to joining The Register, he spent a number of years working in IT in the pharmaceutical and financial sectors.