| | |
|---|---|
| Journal | Europhysics News |
| Issue | Volume 57, Number 1, 2026: Quantum physics |
| Page(s) | 12 - 15 |
| Section | Features |
| DOI | https://doi.org/10.1051/epn/2026107 |
| Published online | 16 March 2026 |
From macroscopic quantum circuits to scalable quantum systems
James Watt School of Engineering, University of Glasgow
Abstract
Superconducting quantum circuits have reached impressive performance, with individual qubits and operations working extremely well. This maturity was recognized by the 2025 Nobel Prize in Physics, awarded for demonstrating that genuinely quantum phenomena – such as energy quantization and tunnelling – can occur in macroscopic electrical circuits. Yet building large, reliable quantum processors remains challenging. The limitation is no longer a single flaw, but the combined effect of many small imperfections spread across the hardware. Subtle losses in materials, tiny variations in components, unwanted heat, and constraints imposed by wiring, packaging, and control electronics all accumulate as systems grow in size. This article explains how such device-level effects ultimately limit the performance of entire quantum processors, and why understanding this link is essential for turning today’s high-quality quantum devices into scalable and reliable quantum technologies.
© European Physical Society, EDP Sciences, 2026
Between the everyday world of wires and switches and the abstract realm of wavefunctions, quantum mechanics found a new voice. The 2025 Nobel Prize in Physics celebrates the moment when quantum behaviour ceased to be confined to atoms and particles, and instead revealed itself in something far more tangible: electrical circuits made from superconductors.
FIG 1 Left: dilution refrigerator system used by Devoret, Martinis, and Clarke in their 1985 experiments on macroscopic quantum tunnelling and energy quantisation. Right: dilution refrigerator system hosting Google Quantum AI’s Sycamore processor used in the 2019 quantum supremacy experiment.
The prize was awarded jointly to John Clarke, Michel H. Devoret, and John M. Martinis for demonstrating macroscopic quantum tunnelling [1] and energy quantisation [2] in electrical circuits. Their experiments showed, with remarkable clarity, that quantum mechanics does not end at the microscopic scale — provided we build systems carefully, cool them deeply, and isolate them from the noisy classical world.
At the heart of this story lies superconductivity, a phenomenon discovered more than a century ago but still central to modern quantum physics. When certain materials such as aluminium and niobium are cooled below a critical temperature, their electrical resistance vanishes entirely. Currents can flow without dissipation, circulating indefinitely without energy loss. Superconductivity is a collective quantum state: electrons bind into Cooper pairs, and the entire system is described by a single macroscopic wave-function with a well-defined quantum phase. This collective coherence, extending over micrometres or even millimetres, allows superconductors to support quantum behaviour on scales vastly larger than atoms.
To observe delicate quantum effects in such systems, thermal motion must be suppressed almost completely. The experiments recognised by the Nobel Committee were performed at temperatures around 10 millikelvin, more than 100 times colder than outer space. At these temperatures, achieved using dilution refrigerators, thermal energy becomes negligible compared to quantum energy scales. Random excitations freeze out, and quantum mechanics is allowed to speak clearly.
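The thermal argument can be checked with simple arithmetic. The snippet below compares the thermal energy k_B·T at 10 mK with the photon energy of an assumed 5 GHz microwave transition (a representative superconducting-qubit frequency chosen for illustration, not a figure quoted in the article):

```python
import math

# Compare the thermal energy scale at 10 mK with the quantum energy
# scale of a microwave transition. The 5 GHz frequency is an assumed,
# representative value for superconducting circuits, not from the text.
h = 6.62607015e-34    # Planck constant, J s
kB = 1.380649e-23     # Boltzmann constant, J / K

f = 5e9               # assumed transition frequency, Hz
T = 0.010             # operating temperature, K (10 mK)

E_quantum = h * f     # energy of one microwave photon
E_thermal = kB * T    # characteristic thermal energy

ratio = E_quantum / E_thermal
occupation = math.exp(-ratio)   # Boltzmann estimate of thermal excitation

print(f"h*f / kB*T ~ {ratio:.1f}")          # roughly 24
print(f"thermal occupation ~ {occupation:.1e}")
```

Under these assumptions the photon energy exceeds the thermal energy by a factor of roughly 24, so the Boltzmann probability of a spurious thermal excitation is astronomically small: random excitations do indeed "freeze out".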
Cooling alone, however, is not enough. At millikelvin temperatures, even a faint electromagnetic disturbance or a barely perceptible vibration can overwhelm the signal. These experiments demanded an extraordinary level of experimental discipline: careful filtering of every electrical line, shielding against stray radiation, meticulous grounding, and mechanical isolation. The success of the work lay not in forcing a signal out of nature, but in removing everything that obscured it.
The crucial element enabling these discoveries is the Josephson junction [3], itself associated with an earlier Nobel Prize. A Josephson junction consists of two superconductors separated by an ultra-thin insulating layer of only a few nanometres. Classically, such a barrier would block current completely, but quantum mechanics allows Cooper pairs to tunnel coherently through the insulator while preserving their phase across the junction.
Quantum tunnelling is familiar at the microscopic level. Electrons tunnel through barriers in semiconductors; atoms tunnel in chemical reactions. What Clarke, Devoret, and Martinis demonstrated was that an entire electrical circuit can tunnel. In their experiments, the phase difference across a Josephson junction behaves like a particle trapped in a potential well defined by the circuit design. Classically, escaping such a well requires sufficient thermal energy to climb over the barrier. As the temperature was lowered, a striking transition occurred: the escape rate became temperature-independent, reflecting the collective tunnelling of many Cooper pairs as a single quantum object. This was macroscopic quantum tunnelling, observed directly and unambiguously.
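The physics behind these escape measurements can be summarised with a few standard formulas (a textbook sketch for a current-biased junction; the symbols, the cubic-well approximation, and the numerical factor 7.2 are standard results, not quantities taken from this article):

```latex
% Josephson relations for the phase difference \varphi across the junction
I = I_c \sin\varphi, \qquad V = \frac{\hbar}{2e}\,\frac{d\varphi}{dt}

% A junction biased by a current I_b sees a tilted "washboard" potential
U(\varphi) = -E_J\!\left(\cos\varphi + \frac{I_b}{I_c}\,\varphi\right),
\qquad E_J = \frac{\hbar I_c}{2e}

% Escape from a well of barrier height \Delta U and plasma frequency \omega_p:
% thermal activation at high temperature,
\Gamma_{\text{thermal}} \sim \frac{\omega_p}{2\pi}\, e^{-\Delta U / k_B T}
% versus temperature-independent quantum tunnelling at low temperature
% (WKB result in the cubic-well approximation)
\Gamma_{\text{tunnel}} \sim \frac{\omega_p}{2\pi}\, e^{-7.2\,\Delta U / \hbar\omega_p}
```

The crossover between the two regimes occurs near $T^\ast \approx \hbar\omega_p / (2\pi k_B)$; below it the measured escape rate stops depending on temperature, which is precisely the signature of macroscopic quantum tunnelling reported in the 1985 experiments.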
FIG 2 Timeline of major IBM Quantum processor generations over the past decade, showing the progression from early few-qubit demonstrators to utility-scale systems with over a thousand qubits.
The same systems demonstrated another defining feature of quantum mechanics: energy quantisation. Rather than occupying a continuous range of energies, the circuits could exist only in discrete energy levels. Using microwave spectroscopy, transitions between these levels were observed as sharp resonances. The circuit behaved like an artificial atom, with properties determined not by chemistry, but by lithography and design.
FIG 3 Schematic of a modern 3D-integrated qubit control and readout scheme proposed by MIT Lincoln Laboratory [7], in which qubit, interposer, and readout/interconnect chips are vertically stacked and connected using indium bump bonds and through-silicon vias (TSVs).
This marked a turning point in experimental physics. Quantum systems were no longer merely discovered in nature — they were engineered. Energy levels could be shaped, coupling could be tuned, and quantum states could be prepared, measured, and controlled within an electrical circuit.
The path from these foundational experiments to today’s quantum computers is unusually direct. In the 1980s and 1990s, macroscopic quantum tunnelling and quantisation established that superconducting circuits obey quantum mechanics. In the early 2000s, these insights were used to design superconducting qubits, where two quantised energy levels encode quantum information [4]. Over the following decades, control techniques improved, coherence times lengthened, and multi-qubit systems became possible. Today, superconducting quantum processors containing tens to hundreds of qubits operate routinely in laboratories around the world.
The move from a handful of qubits to processors with tens or hundreds of elements marks a transition into the noisy intermediate-scale quantum (NISQ) regime. At and beyond this scale, the central question is no longer whether individual qubits can behave quantum mechanically, but how well many such elements can work together as a single quantum system.
In today’s quantum processors, performance is shaped by the collective influence of many device-level effects distributed across the hardware. Materials that are nearly lossless still host minute dissipation at surfaces and interfaces. Josephson junctions can be fabricated with extraordinary precision, yet each junction still exhibits sub-nanometre-scale variations. Even at millikelvin temperatures in a well-shielded cryostat, residual disturbances such as stray electromagnetic radiation, energetic particle impacts, or even cosmic rays can occasionally disrupt delicate quantum states. These effects are subtle and well understood individually, but their accumulation across many qubits can significantly complicate and limit overall processor performance.
As systems grow, performance also becomes increasingly shaped by the interplay between qubits and the supporting infrastructure required to operate them as a coherent whole. Wiring, packaging, filtering, and control electronics must connect hundreds of quantum elements to the outside world while preserving coherence and stability. These classical components are not mere auxiliaries: they shape how qubits are addressed, how densely they can be arranged, and how reliably operations can be repeated over time. Understanding this dynamic interplay, from materials, devices, and components to full systems, is now a key step toward scalable quantum computing.
State-of-the-art superconducting quantum processors developed by groups such as IBM and Google already comprise hundreds to over a thousand qubits, embedded within large dilution refrigerators and connected by thousands of control and readout lines. IBM’s Condor processor, for example, integrates 1,121 qubits and requires over a mile of high-density cryogenic wiring within a single cryostat. At this scale, scalability becomes a tangible constraint: each additional qubit brings added wiring, heat load, calibration complexity, and opportunities for cross-talk and interference. The effort to scale up quantum computing is as much about managing cryogenics, interconnects, and system integration as it is about improving individual qubit performance.
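A toy estimate makes this scaling pressure concrete. The qubit count below is taken from the article, but the two-lines-per-qubit figure is a hypothetical round number (real architectures share and multiplex lines), and the pair count is simply an upper bound on how many line pairs could in principle interfere:

```python
# Toy scaling estimate for a large superconducting processor.
# ASSUMPTION: lines_per_qubit = 2 is a hypothetical round number used
# only for illustration; it is not an IBM specification.
n_qubits = 1121                      # IBM Condor qubit count (from the article)
lines_per_qubit = 2                  # assumed
n_lines = n_qubits * lines_per_qubit

# Cross-talk opportunities grow much faster than qubit count: every
# pair of lines is, in principle, a channel for unwanted coupling.
line_pairs = n_lines * (n_lines - 1) // 2

print(f"{n_lines} cryogenic lines, {line_pairs} potentially interacting pairs")
```

Under these assumptions the line count grows linearly with qubit number while the number of potentially interfering pairs grows quadratically, which is one way to see why wiring, filtering, and multiplexing come to dominate the engineering effort at scale.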
This perspective is reflected in the remarkable breadth of the current research landscape [5]. Efforts spanning materials optimisation, cleaner interfaces, and improved fabrication processes aim to reduce noise, extend quantum coherence, and improve fabrication scalability. In parallel, new qubit designs explore increased protection against environmental disturbances, while error correction and error mitigation have now been demonstrated on working quantum processors. Scaling strategies extend beyond qubit count alone, encompassing higher operating temperatures and frequencies, modular architectures, and the integration of cryogenic control electronics [6].
The present generation of quantum processors has enabled landmark results, including the demonstration of quantum advantage [8] and error-corrected operations [9]. At the same time, a broader ecosystem is taking shape, spanning algorithms, software stacks, cloud platforms, and an expanding user community [10]. Looking forward, roadmaps from major academic and industrial efforts outline a staged progression toward scalable, fault-tolerant quantum computing, where reliable quantum processors will be used to address real-world problems such as quantum chemistry and materials simulation, drug discovery, large-scale combinatorial optimisation, cryptography, security, and machine learning. Accordingly, scalable, fault-tolerant quantum computing is not a single breakthrough, but the sustained alignment of device physics, system engineering, and application-driven development across the entire quantum community.
Designated as the International Year of Quantum Science and Technology, 2025 highlighted how quantum physics is reshaping technology beyond the laboratory. The challenge in the decade ahead is to translate this momentum in research and development into reliable, scalable technological capabilities.
References
- M. Devoret, J. M. Martinis and J. Clarke, Physical Review Letters, Oct. 1985, doi: https://doi.org/10.1103/physrevlett.55.1908
- J. M. Martinis, M. H. Devoret and J. Clarke, Physical Review Letters, Oct. 1985, doi: https://doi.org/10.1103/physrevlett.55.1543
- B. D. Josephson, Physics Letters, Jul. 1962, doi: https://doi.org/10.1016/0031-9163(62)91369-0
- J. Clarke and F. K. Wilhelm, Nature, Jun. 2008, doi: https://doi.org/10.1038/nature07128
- P. Krantz, M. Kjaergaard, F. Yan, T. P. Orlando, S. Gustavsson and W. D. Oliver, Applied Physics Reviews, Jun. 2019, doi: https://doi.org/10.1063/1.5089550
- J. Brennan et al., APL Quantum, Dec. 2025, doi: https://doi.org/10.1063/5.0273490
- D. Rosenberg et al., npj Quantum Information, Oct. 2017, doi: https://doi.org/10.1038/s41534-017-0044-0
- F. Arute et al., Nature, Oct. 2019, doi: https://doi.org/10.1038/s41586-019-1666-5
- R. Acharya et al., Nature, Dec. 2024, doi: https://doi.org/10.1038/s41586-024-08449-y
- J. Preskill, Quantum, Aug. 2018, doi: https://doi.org/10.22331/q-2018-08-06-79