Quantum Computing Asked by Abstract Acumen on March 19, 2021
I recently listened to a presentation for Introductory Quantum Computing. It was very general and meant to give the listeners an idea about the potential for Quantum Computing.
The speaker ended the talk with a kind of “On the Other Hand” talking point by mentioning that some people are concerned that the noise in Quantum Computers will grow at such a rate that limits their overall potential.
I was wondering if anyone here has experience with or knowledge about this issue. Is this a theoretical limit of physics or an engineering problem? What is the current state of thinking on this issue, and do recent breakthroughs in Quantum Error Correction do anything to address it?
According to the so-called threshold theorem, it is possible to suppress errors in a quantum computation to arbitrary precision, provided the physical error rate is below a certain threshold. However, there is an assumption that you have enough qubits.
To illustrate the idea, you can encode one qubit $|q\rangle = \alpha|0\rangle + \beta|1\rangle$ in more qubits, for example $|q\rangle = \alpha|0000\rangle + \beta|1111\rangle$, and after the calculation decide on the result based on a majority rule.
As you can see, you need a high number of qubits to find and correct errors, which is a main obstacle given that current quantum computers have only a few qubits. E.g. the IBM Tokyo processor has 20 qubits, and a processor with 53 qubits is planned. Publicly available processors have even fewer qubits, e.g. 5 qubits on IBM Q, or fewer still, such as 2 qubits on the 2-spin Quantum Inspire processor.
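The majority-rule idea above can be sketched with a minimal classical simulation of a repetition code. This is only an illustration of the voting logic: real quantum error correction must detect errors via syndrome measurements without measuring the encoded data directly, which this sketch ignores. All function names here are made up for the example.

```python
import random

def encode(bit, n=3):
    """Encode one logical bit as n physical copies (repetition code)."""
    return [bit] * n

def apply_noise(codeword, p):
    """Flip each physical bit independently with probability p (bit-flip noise)."""
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):
    """Majority vote: recover the logical bit."""
    return int(sum(codeword) > len(codeword) / 2)

# Compare the raw error rate against the encoded (majority-decoded) rate.
random.seed(0)
p, trials = 0.1, 100_000
raw_errors = sum(apply_noise([0], p)[0] for _ in range(trials)) / trials
enc_errors = sum(decode(apply_noise(encode(0, 3), p)) for _ in range(trials)) / trials
print(raw_errors)  # close to p = 0.1
print(enc_errors)  # close to 3p^2 - 2p^3 = 0.028
```

For a 3-qubit repetition code, decoding fails only when two or more copies flip, so the logical error rate drops from $p$ to $3p^2 - 2p^3$, which is an improvement whenever $p < 1/2$. This is the sense in which adding qubits buys lower noise.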
To conclude, it is possible to reduce noise, but we need more qubits. Increasing the number of qubits would theoretically lead to a quantum processor with no (or at least low-level) noise.
Correct answer by Martin Vesely on March 19, 2021