In a groundbreaking advance that pushes the boundaries of quantum computing stability and precision, researchers at the Niels Bohr Institute have developed a rapid detection system capable of tracking fluctuations in the delicate quantum states of superconducting qubits in real time. This pioneering work addresses a longstanding limitation in quantum technology—the inability to swiftly characterize and adapt to the fleeting changes in qubit energy-loss rates, a phenomenon that has hampered progress toward reliable quantum processors.
The quantum bit, or qubit, stands as the fundamental building block of quantum computers, analogous to classical bits but vastly more fragile and complex. Superconducting qubits, favored for their scalability and integration potential, suffer from energy dissipation caused by environmental interactions. Critically, the rate at which this energy is lost does not remain constant; it fluctuates unpredictably due to microscopic defects and noise within the materials composing the processor. Until now, typical measurement protocols, which rely on extended averaging over long durations, have failed to capture these rapid fluctuations, which can occur hundreds of times per second.
Traditional characterization routines for qubit relaxation times involve prolonged measurement sequences lasting minutes, effectively smoothing over the transient dynamics inherent to the qubit environment. This averaging obscures the true temporal behavior of the qubit’s performance and limits the ability of quantum engineers to implement effective error mitigation or real-time correction techniques. The inability to “catch” these fast fluctuations means that quantum processors have been calibrated on incomplete data, potentially jeopardizing accuracy and stability during computations.
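To make the contrast concrete, the conventional protocol amounts to fitting heavily averaged data to an exponential decay, P(t) = exp(-t/T1). The following is a minimal sketch of that approach in Python; the decay model is standard physics, but the simulated qubit and all names and parameters below are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of a conventional (time-averaged) T1 measurement.
# The simulated qubit and all parameters are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

TRUE_T1 = 50e-6  # assumed "true" relaxation time, 50 microseconds

def simulate_shot(delay, t1):
    """One projective readout: qubit prepared in |1>, measured after `delay`."""
    p_excited = np.exp(-delay / t1)        # exponential decay model
    return rng.random() < p_excited        # True = measured in |1>

delays = np.linspace(1e-6, 200e-6, 25)     # wait times swept by the protocol
shots_per_delay = 2000                     # heavy averaging -> long acquisition

# Average many single-shot outcomes at each delay; this averaging is what
# smooths over any T1 fluctuation faster than the total acquisition time.
p1 = np.array([
    np.mean([simulate_shot(d, TRUE_T1) for _ in range(shots_per_delay)])
    for d in delays
])

decay = lambda t, t1: np.exp(-t / t1)
(t1_fit,), _ = curve_fit(decay, delays, p1, p0=[30e-6])
print(f"fitted T1 = {t1_fit * 1e6:.1f} us")
```

Because thousands of shots are averaged before the fit, any relaxation-rate fluctuation faster than the total acquisition time is blurred into a single effective value, which is precisely the limitation the new method removes.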
By leveraging a real-time adaptive measurement approach, the research team, led by Dr. Fabrizio Berritta, has overcome this limitation. Using a custom-built classical controller powered by a field-programmable gate array (FPGA), the system continuously monitors and updates its estimate of the qubit's relaxation rate with unprecedented speed, on the order of milliseconds. This nearly matches the qubit's intrinsic fluctuation timescale, enabling precise tracking of the fast-varying environment that degrades quantum coherence and fidelity.
A key innovation lies in running the inference algorithm directly on the FPGA hardware. Conventional setups shuttle data back and forth between the qubits and a general-purpose computer; executing the inference on the FPGA instead drastically reduces latency, allowing the Bayesian model of the qubit relaxation rate to be updated after each individual measurement. This closed-loop, adaptive protocol dynamically guides the timing of subsequent measurements, maximizing information gain and enabling the detection of relaxation-rate changes approximately one hundred times faster than previous methods.
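The paper's exact estimator and its FPGA implementation are not reproduced here, but the underlying idea can be sketched in a few lines: maintain a posterior over candidate relaxation rates Γ1 on a grid, update it with Bayes' rule after every single-shot readout, and choose each wait time to maximize the expected information gain. Everything below (the grid range, candidate delays, and the simulated qubit) is an illustrative assumption.

```python
# Illustrative grid-based Bayesian tracker for a relaxation rate Gamma1.
# A sketch of the general technique, not the authors' FPGA implementation.
import numpy as np

rng = np.random.default_rng(1)

gammas = np.linspace(1e3, 1e5, 400)        # candidate rates Gamma1 (1/s)
log_post = np.zeros_like(gammas)           # flat prior, stored as log-probs

def likelihood(outcome, delay):
    """P(outcome | Gamma1) for one shot: qubit prepared in |1>, then decays."""
    p1 = np.exp(-gammas * delay)
    return p1 if outcome else 1.0 - p1

def normalized(log_p):
    p = np.exp(log_p - log_p.max())
    return p / p.sum()

def expected_info_gain(delay, post):
    """Expected drop in posterior entropy from one shot at `delay`."""
    p1 = np.sum(post * np.exp(-gammas * delay))   # marginal P(outcome = 1)
    h_now = -np.sum(post * np.log(post + 1e-12))
    h_after = 0.0
    for outcome, p_out in ((True, p1), (False, 1.0 - p1)):
        cond = post * likelihood(outcome, delay)
        cond /= cond.sum()
        h_after -= p_out * np.sum(cond * np.log(cond + 1e-12))
    return h_now - h_after

true_gamma = 2e4                           # hidden rate the loop must find
candidate_delays = np.linspace(5e-6, 200e-6, 20)

for _ in range(200):
    post = normalized(log_post)
    # Adaptive step: pick the most informative wait time right now.
    delay = max(candidate_delays, key=lambda d: expected_info_gain(d, post))
    outcome = rng.random() < np.exp(-true_gamma * delay)   # simulated shot
    log_post += np.log(likelihood(outcome, delay) + 1e-12) # Bayes' rule

post = normalized(log_post)
print(f"estimated Gamma1 ~ {np.sum(post * gammas):.3g} 1/s (true {true_gamma:.3g})")
```

In the hardware described above, this per-shot update runs directly on the FPGA, eliminating the round trip to a host computer that would otherwise dominate the loop latency.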
Programming FPGAs for such sophisticated adaptive control is a nontrivial task, given their complexity and low-level hardware description requirements. Yet, through the Quantum Machines OPX1000 platform, the team crafted a highly efficient coding framework reminiscent of Python, a language widely accessible to physicists. This accessibility democratizes advanced quantum measurement techniques, opening doors for other research groups worldwide to emulate or extend these methods.
The FPGA’s rapid processing and flexible architecture allow it to perform qubit state readout, Bayesian updating, and control-pulse timing adjustments simultaneously and with minimal overhead. Combined with state-of-the-art superconducting quantum hardware from Chalmers University of Technology, the setup achieves a remarkable synergy: the qubit and its classical controller evolve together in time, enabling near real-time calibration and feedback during quantum operations. This approach mitigates the impact of rapidly fluctuating defect states that are invisible to slower characterization methods.
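Putting the pieces together, the closed loop described above has a simple shape: read out, update, retime, repeat. The skeleton below shows that structure; the `QubitInterface` stand-in and the estimator methods are hypothetical placeholders, and on the real system these steps are compiled to the FPGA rather than executed in host-side Python.

```python
# Hypothetical skeleton of the closed measurement loop; on the actual
# hardware each step runs on the FPGA, not in host-side Python.
class QubitInterface:
    """Stand-in for instrument access; not a real driver API."""
    def readout_after(self, delay: float) -> bool:
        """Prepare |1>, wait `delay` seconds, return one readout bit."""
        raise NotImplementedError

def run_tracker(qubit: QubitInterface, estimator, n_shots: int = 10_000):
    """Closed loop: readout -> Bayesian update -> retimed next pulse."""
    delay = estimator.best_delay()           # initial adaptive wait time
    for _ in range(n_shots):
        bit = qubit.readout_after(delay)     # 1) single-shot readout
        estimator.update(bit, delay)         # 2) Bayes update, one shot
        delay = estimator.best_delay()       # 3) adapt the next wait time
    return estimator.mean_rate()             # current Gamma1 estimate
```

The `estimator` here could be the grid-based tracker sketched earlier, wrapped with `update`, `best_delay`, and `mean_rate` methods; the essential point is that all three steps complete between shots.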
Crucially, the newfound ability to detect such fast fluctuations has unveiled surprising details about the qubit environment. Fabrication defects and microscopic fluctuators, once thought to vary only over long timescales, can alter qubit relaxation properties within fractions of a second. This revelation challenges existing assumptions and calls for renewed efforts to understand and engineer materials and architectures at microscopic levels to stabilize quantum processors.
The impact of this research extends beyond fundamental physics. Quantum computing promises revolutionary advances in cryptography, materials science, and artificial intelligence. Yet the road to practical quantum advantage hinges on qubits that maintain coherence reliably and consistently. Real-time adaptive tracking and control, as demonstrated by this work, represent a critical step in moving from static, slow calibrations toward dynamic, error-resilient quantum systems.
Moreover, this study highlights the powerful role of interdisciplinarity—melding classical high-speed electronics, sophisticated statistical inference, and cutting-edge quantum devices. It underscores how commercially available classical processors can be repurposed and optimized for complex quantum control tasks, bridging the gap between theoretical quantum science and practical engineering implementations.
As Dr. Berritta notes, current quantum processors’ performance is often bottlenecked by poorly behaving qubits overshadowing the “good” ones. This adaptive monitoring technique rapidly identifies such problematic qubits and tracks them in real time, enabling focused remediation and potentially informing hardware redesign choices. In time, this could lead toward feedback-driven quantum processors whose overall system quality dynamically adapts to moment-to-moment environmental changes.
This research is the fruit of an international collaboration spanning the Niels Bohr Institute, Norwegian University of Science and Technology, Leiden University, and Chalmers University of Technology. The collective expertise has yielded a quantum measurement platform that redefines the temporal resolution of qubit characterization and opens new avenues for quantum processor calibration and error correction frameworks.
In the broader context of quantum technology development, this progress exemplifies how a nuanced understanding of physical qubit environments paves the way for robust, scalable quantum architectures. The dynamic interplay between quantum states and classical control electronics is becoming ever more central to the roadmap for reliable quantum hardware.
In conclusion, the real-time adaptive tracking of superconducting qubit relaxation rates ushers in a new era of quantum hardware diagnostics and control. By harnessing the power of FPGA-based adaptive measurement, researchers have lifted a veil on previously undetectable fluctuations that challenge qubit performance. This advancement not only deepens scientific understanding of quantum decoherence but also accelerates practical progress toward fault-tolerant quantum computing, pushing the boundaries of what quantum machines can achieve.
Subject of Research: Real-time adaptive measurement and control of relaxation rate fluctuations in superconducting qubits.
Article Title: Real-Time Adaptive Tracking of Fluctuating Relaxation Rates in Superconducting Qubits
News Publication Date: 13-Feb-2026
DOI: 10.1103/gk1b-stl3
Image Credits: Fabrizio Berritta
Keywords: quantum computing, superconducting qubits, FPGA, real-time measurement, relaxation rate fluctuations, Bayesian inference, adaptive control, quantum hardware, decoherence, quantum processor calibration, quantum error correction

