Quantum computers walk a fine line between order and disorder; only in this regime can the quantum-physical processes these systems compute with take place. But as an analysis using three different methods has now shown, some current quantum computers from IBM, Google and others are dangerously close to the threshold of chaotic collapse. This could mean that not all of these systems, which are based on superconducting transmon quantum bits, can readily be scaled up. To keep such systems from sliding into chaos, the research team explains, this proximity to instability must be taken into account when designing qubit processors on the superconducting platform.
Quantum computers are considered the computers of the future: thanks to quantum-physical effects such as superposition and entanglement, they can explore many possible solutions at once and can therefore outperform conventional computers on certain problems. Companies such as IBM and Google have already built the first commercially usable quantum computers, even if these are still too small for most applications and contain too few quantum bits. Most of these systems are based on so-called transmon qubits: charge islands in superconducting circuits that behave like artificial particles. For these qubits to perform computational operations, they must remain in their superposition state, as undisturbed as possible, for the duration of the computation. That requires the best possible shielding against external interference. At the same time, however, the qubits must be coupled to one another to form working circuits.
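For readers who want the underlying model: the article does not spell it out, but an array of coupled transmons is commonly described as a disordered Bose-Hubbard-type system, sketched here with standard (not article-specific) symbols:

H = \sum_i \Big( \omega_i \,\hat n_i - \tfrac{E_C}{2}\, \hat n_i(\hat n_i - 1) \Big) + J \sum_{\langle i,j \rangle} \big( \hat a_i^\dagger \hat a_j + \hat a_j^\dagger \hat a_i \big)

Here the ω_i are the individual qubit frequencies, E_C is the charging energy that makes each transmon anharmonic, and J is the coupling between neighboring qubits. The order-disorder balance discussed below is the competition between the coupling J and the spread of the frequencies ω_i.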
Between coupling and disorder
Today's quantum computers thus operate in a fragile balance between order, in the form of the coupling between the qubits, and a disorder that gives the individual qubits enough freedom for independent fluctuations. “The transmon chip not only tolerates but actually requires random qubit-to-qubit imperfections,” explains first author Christoph Berke from the University of Cologne. The reason: the coupled quantum bits resemble a system of coupled pendulums, whose fluctuations can easily build up into uncontrollably large oscillations with catastrophic consequences. The principle is similar to resonance in bridges: large groups crossing a bridge must avoid marching in step, otherwise resonant oscillations can build up.
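The pendulum analogy can be made concrete in a few lines of code. The following sketch (our illustration, not taken from the study; the frequencies and coupling strength are arbitrary) integrates two coupled oscillators and shows that energy sloshes over completely when their frequencies match, but barely at all when they are detuned:

import numpy as np
from scipy.integrate import solve_ivp

def coupled(t, y, w1, w2, k):
    # Two oscillators with frequencies w1, w2 and spring coupling k.
    x1, v1, x2, v2 = y
    return [v1, -w1**2 * x1 - k * (x1 - x2),
            v2, -w2**2 * x2 - k * (x2 - x1)]

def max_transfer(w1, w2, k=0.05):
    # Start all motion in oscillator 1 and report the largest share
    # of the energy that ever appears in oscillator 2.
    sol = solve_ivp(coupled, [0, 2000], [1, 0, 0, 0],
                    args=(w1, w2, k), max_step=0.1)
    x1, v1, x2, v2 = sol.y
    e1 = 0.5 * v1**2 + 0.5 * w1**2 * x1**2
    e2 = 0.5 * v2**2 + 0.5 * w2**2 * x2**2
    return (e2 / (e1 + e2)).max()

print("identical frequencies:", max_transfer(1.0, 1.0))   # close to 1: resonant build-up
print("detuned (disordered): ", max_transfer(1.0, 1.3))   # close to 0: transfer suppressed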
In quantum computers, too, deliberately introduced disorder is meant to prevent such chaotic resonance build-ups. Locally detuning the qubit frequencies keeps the qubits from coupling too strongly and maintains the delicate balance between order and disorder in multi-qubit processors. Current quantum computer systems use different strategies to preserve this balance. IBM uses a layout in which a few fixed frequencies alternate across the lattice, so that directly neighboring qubits are detuned from one another and unwanted coupling between them is suppressed. However, resonant couplings to next-nearest qubits can still occur, as Berke and his team explain. The quantum computers at TU Delft and Google, on the other hand, rely on actively and selectively tuning individual qubits to block unwanted resonances.
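A toy calculation illustrates why an alternating pattern still leaves loopholes. The sketch below (our own illustration; couplings and frequencies are hypothetical round numbers) uses the two-level Rabi formula, in which the peak probability of an excitation swapping between two coupled qubits is 4J²/(4J² + Δ²) for coupling J and detuning Δ. In an A-B-A-B pattern every next-nearest pair sits at Δ = 0 and can swap completely, however weak the residual coupling, whereas a well-spread set of frequencies suppresses both channels:

import numpy as np

def swap_prob(delta, J):
    # Peak excitation-swap probability between two coupled qubits
    # with detuning delta and coupling J (two-level Rabi formula).
    return 4 * J**2 / (4 * J**2 + delta**2)

J_nn, J_nnn = 0.005, 0.0005   # hypothetical nearest / next-nearest couplings (GHz)

patterns = {
    "alternating": np.array([5.0, 5.3] * 4),   # IBM-style A-B-A-B frequency pattern
    "spread out":  np.array([5.00, 5.17, 5.31, 5.09, 5.24, 5.03, 5.28, 5.13]),
}

for name, f in patterns.items():
    nn  = max(swap_prob(f[i + 1] - f[i], J_nn)  for i in range(len(f) - 1))
    nnn = max(swap_prob(f[i + 2] - f[i], J_nnn) for i in range(len(f) - 2))
    print(f"{name:11s}  worst nearest pair: {nn:.4f}   worst next-nearest pair: {nnn:.4f}")

The full story is subtler, since a random frequency pattern can also contain accidentally close pairs, which is part of the gray area the study maps out; but the toy model captures why next-nearest resonances are a concern.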
IBM system potentially more prone to chaos
How well these different strategies work is what Berke and his colleagues have now investigated using three different techniques. “In our study, we investigate the question of how reliable the principle of ‘stability by chance’ is in practice,” says Berke. On the one hand, it turned out that even before the collapse into chaos there is a surprisingly extensive gray area in which unwanted resonances already degrade the quantum states, even though the threshold to “hard” quantum chaos has not yet been crossed. On the other hand, the tests showed that at least some of the system architectures pursued in industry are dangerously close to instability. “When we compared the Google chips with the IBM chips, we found that in the latter case the qubit states could be coupled to such an extent that controlled computing operations could be impaired,” reports Berke’s colleague Simon Trebst. This architecture is thus more susceptible to chaotic fluctuations.
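One standard way to draw the line between a merely disordered spectrum and “hard” quantum chaos is level-spacing statistics. The sketch below is a generic textbook diagnostic in the spirit of such analyses, not a reproduction of the team's methods: it computes the so-called r-ratio of consecutive level spacings, which averages to about 0.386 for uncorrelated “regular” spectra (Poisson statistics) and approaches the random-matrix value of about 0.531 for chaotic ones (Gaussian orthogonal ensemble):

import numpy as np

def mean_r(levels):
    # Ratio of consecutive level spacings, a standard chaos diagnostic
    # that needs no unfolding of the spectrum.
    s = np.diff(np.sort(levels))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

rng = np.random.default_rng(0)
N = 1000

regular = np.cumsum(rng.exponential(size=N))   # uncorrelated (Poisson) spectrum

A = rng.normal(size=(N, N))
chaotic = np.linalg.eigvalsh((A + A.T) / 2)    # GOE random-matrix spectrum

print("regular spectrum: <r> =", round(mean_r(regular), 3))   # about 0.386
print("chaotic spectrum: <r> =", round(mean_r(chaotic), 3))   # about 0.531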
This could have consequences for the planned expansion of quantum computers to more quantum bits: “We would go so far as to speculate that this system cannot support the generalization to larger and two-dimensionally connected array geometries, which are necessary for more complex applications,” the researchers write. Senior author David DiVincenzo from RWTH Aachen University adds: “Our study shows how important it is for hardware designers to combine device modeling with state-of-the-art quantum-randomness methodologies and to make ‘chaos diagnostics’ a routine part of the design of qubit processors on the superconducting platform.”
Source: Christoph Berke (University of Cologne) et al., Nature Communications, doi: 10.1038/s41467-022-29940-y