“Hyperchaos”: Qubit Behaviour That Allows the Simulation of Complex Quantum Systems Without Extensive Computational Power


Complex Quantum Systems

For some years now, the development of quantum computers has been held back by the processing speed of classical CPUs. However, scientists from Loughborough University, the University of Nottingham and Innopolis University in Russia have discovered what could turn out to be a transformational characteristic of quantum bit behaviour, moving the technology forward. If it holds up, this could allow researchers to mimic complex quantum systems without the level of computational power currently demanded of the world’s fastest supercomputers, with positive ramifications for the evolution of newer, better and more powerful quantum cryptography tools, for example.



The paper, Emergence and control of complex behaviors in driven systems of interacting qubits with dissipation, was published in npj Quantum Information. In it, the researchers present their findings on how to avoid the need for large amounts of computing power by utilizing the chaotic behaviour of qubits, which demonstrate the phenomenon known as hyperchaos.
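The hyperchaos in the paper is a property of the driven, dissipative qubit system itself; reproducing that model is beyond a short snippet. But the underlying idea, that nearby starting points of a chaotic system diverge exponentially fast (hyperchaos strengthens this to divergence along more than one direction at once), can be illustrated with a generic classical example. The sketch below uses the textbook logistic map, not anything from the paper:

```python
# Generic illustration of chaotic sensitivity (NOT the paper's qubit model):
# the logistic map x -> r*x*(1 - x) at r = 4 is a standard chaotic system.
# Two almost identical initial conditions end up macroscopically apart.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.200000000, 0.200000001   # initial conditions differing by 1e-9
max_gap = 0.0
for step in range(100):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# The tiny initial difference is amplified to order one within ~30 steps.
print(f"largest separation over 100 steps: {max_gap:.3f}")
```

In a hyperchaotic system, this kind of exponential divergence happens along two or more independent directions simultaneously, which is the behaviour the researchers identified in the coupled-qubit dynamics.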


Dr. Alexandre Zagoskin, one of the authors of the paper from Loughborough’s School of Science and coincidentally a cofounder of D-Wave Systems (the world’s oldest quantum computing company, founded in 1999), said: “A good analogy is aircraft design. In order to design an aircraft, it is necessary to solve certain equations of hydro(aero)dynamics, which are very hard to solve and only became possible way after WWII, when powerful computers appeared. Nevertheless, people had been designing and flying aircraft long before that. It was because the behaviour of the airflow could be characterized by a limited number of parameters, such as the Reynolds number and the Mach number, which could be determined from small scale model experiments.”

The analogy over, he then talked about the real-world implications of the problem:

“Without this, direct simulation of a quantum system in all detail, using a classical computer, becomes impossible once it contains more than a few thousand qubits. Essentially, there is not enough matter in the Universe to build a classical computer capable of dealing with the problem. If we can characterize different regimes of a 10,000-qubit quantum computer by just 10,000 such parameters instead of 2^(10000) — which is approximately 2 times a 1 with three thousand zeros — that would be a real breakthrough.”
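The scale gap in the quote can be sanity-checked with a few lines of arithmetic: a full description of n qubits needs 2^n complex amplitudes, while the characterization Zagoskin describes would need on the order of n parameters. A quick sketch:

```python
# Full quantum-state description of n qubits: 2^n complex amplitudes.
# Proposed "aerodynamics-style" characterization: roughly n parameters.
n = 10_000

full = 2 ** n                 # exact integer (Python ints are arbitrary precision)
digits = len(str(full))       # number of decimal digits in 2^10000

print(f"2^{n} has {digits} decimal digits")   # 3011 digits, about 2 x 10^3010
print(f"proposed effective parameters: {n}")
```

That is, the full description needs a number of parameters with over three thousand digits, consistent with the "2 times a 1 with three thousand zeros" in the quote, versus just 10,000 effective parameters.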

Building on this, the researchers will be able to determine the critical values of these parameters by constructing and testing scale models, measuring the system to find out whether the parameters of the quantum processor permit it to work properly or not.


“The results in this work are insightful for understanding complex quantum dynamics. Future quantum computers consist of thousands of quantum bits (qubits), which will be orders of magnitude more powerful than the fastest classical computer on the market,” said Dr. Weibin Li, from the School of Physics and Astronomy, the University of Nottingham. “Here, full control and characterization of quantum computers is the key to performing correct and massive computing. In the quantum realm, the number of degrees of freedom of a system grows exponentially with its size. As full-scale quantum computing on a true quantum computer is not available yet, the bottleneck is that only small-scale quantum computers, up to dozens of qubits, can be simulated using classical supercomputers.”

James Dargan
James Dargan is a contributor at The Quantum Daily. His focus is on the QC startup ecosystem, and he writes articles on the space in a tone accessible to the average reader.
