Crucial leap in error mitigation for quantum computers

Phys.org  December 9, 2021
Coherent errors severely limit the performance of quantum algorithms in an unpredictable manner, and mitigating their impact is necessary for realizing reliable quantum computations. The average error rates measured by randomized benchmarking and related protocols are not sensitive to the full impact of coherent errors and therefore do not reliably predict the global performance of quantum algorithms. Randomized compiling is designed to overcome these limitations by converting coherent errors into stochastic noise, dramatically reducing unpredictable errors in quantum algorithms and enabling accurate predictions of algorithmic performance from error rates measured via cycle benchmarking. An international team of researchers (USA – UC Berkeley, Lawrence Berkeley National Laboratory; Canada) has demonstrated significant performance gains under randomized compiling for the four-qubit quantum Fourier transform algorithm and for random circuits of variable depth on a superconducting quantum processor. They accurately predicted algorithm performance using experimentally measured error rates. The results demonstrate that randomized compiling can be utilized to leverage and predict the capabilities of modern-day noisy quantum processors, paving the way forward for scalable quantum computing…read more. Open Access TECHNICAL ARTICLE
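The core mechanism behind randomized compiling is Pauli twirling: averaging an error channel over random Pauli gates converts a coherent (unitary) error into a stochastic Pauli channel. The following minimal numpy sketch is not from the paper; it simply illustrates the effect for a single qubit, with an illustrative over-rotation angle `theta` and test state chosen for demonstration.

```python
import numpy as np

# Single-qubit Pauli matrices
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [I, X, Y, Z]

theta = 0.3  # illustrative coherent over-rotation angle (radians)
# Coherent error: an unwanted Z-rotation U = exp(-i*theta*Z/2)
U = np.cos(theta / 2) * I - 1j * np.sin(theta / 2) * Z

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+| test state

# Pauli twirl: average the conjugated error (P U P) rho (P U P)^dag
# over the Pauli group (Paulis are self-inverse, so P^dag = P)
twirled = sum(P @ U @ P @ rho @ P @ U.conj().T @ P for P in paulis) / 4

# The coherent Z over-rotation becomes a stochastic phase-flip channel
# with flip probability sin^2(theta/2)
p_z = np.sin(theta / 2) ** 2
expected = (1 - p_z) * rho + p_z * (Z @ rho @ Z)
print(np.allclose(twirled, expected))  # True
```

The interference terms of the coherent error cancel in the average, leaving only incoherent Pauli noise, which is what makes the resulting error rates predictive of algorithm performance.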

Experimental realization of noise tailoring via randomized compiling on a superconducting quantum processor. Credit: Phys. Rev. X 11, 041039, 24 November 2021
