Quantum Computing: Asked by Konrad on August 20, 2021
I've recently tried to build a random number generator using 5 Hadamard gates (shown as U2 below), measured into 5 classical bits in parallel, as shown in the circuit image.
I've executed this circuit for 8192 shots (and repeated this many times), hoping to get a roughly flat histogram over each of the 32 possible states. Instead, I found that the probability decreases in an almost linear fashion from $|00000\rangle$ to $|11111\rangle$, which is bizarre. I'm very new to quantum computing – could someone explain to me why such a strong linear dependence is visible?
Or maybe this is expected, but why?
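For reference, a minimal Qiskit sketch of the circuit described above (the exact construction from the image is not shown, so this is an assumed reconstruction using a recent Qiskit / qiskit-aer install):

```python
# Reconstruction of the 5-qubit "coin flip" circuit: H on every qubit, then measure.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

n = 5
qc = QuantumCircuit(n, n)
qc.h(range(n))                    # on IBM hardware H is typically compiled to a U2 gate
qc.measure(range(n), range(n))

# On a noiseless simulator every 5-bit string appears roughly 8192/32 = 256 times.
sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=8192).result().get_counts()
print(counts)
```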
What I have tried so far:
Can anybody help me understand this behaviour?
Regards
Konrad
The issue is that you are running on noisy hardware with imperfect operations and measurements. In particular, the most likely problem here is that after you prepare a qubit it immediately begins decaying towards the ground state $|0\rangle$ via interactions with the environment ($T_1$ relaxation). Each qubit will therefore be slightly more likely to be measured as 0 instead of 1 than you'd expect from a noiseless machine. Try grouping the results by the number of 1s in the bitstring and the effect will stand out even more.
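To do that grouping, a small helper like the (hypothetical) `group_by_ones` below works directly on the counts dictionary returned by Qiskit's `result().get_counts()`; on an ideal device the weights would follow the binomial pattern (1, 5, 10, 10, 5, 1)/32, while a per-qubit bias toward 0 skews the totals toward low weights:

```python
# Group measured bitstrings by their number of 1s (Hamming weight).
from collections import Counter

def group_by_ones(counts):
    """Sum shot counts over all bitstrings with the same number of 1s."""
    grouped = Counter()
    for bitstring, shots in counts.items():
        grouped[bitstring.count("1")] += shots
    return dict(sorted(grouped.items()))

# Example with a made-up counts dict in Qiskit's get_counts() format:
print(group_by_ones({"00000": 300, "10001": 260, "11111": 180}))
# {0: 300, 2: 260, 5: 180}
```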
It's difficult to say for sure exactly what is going wrong. For example, given only this circuit you can't really tell whether the errors occur during state preparation, measurement, or the intermediate operations. Also, my understanding is that IBM may modify your circuits before executing them (e.g. inserting random bit flips to depolarize errors, or purposefully adding a bit of noise to see which results go up and then extrapolating backwards), which makes it particularly hard to assign blame to any one part of the circuit.
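As an illustration of the decay hypothesis (not a claim about how the actual device behaves), a simple Aer noise model with an asymmetric readout error – a prepared $|1\rangle$ being recorded as 0 much more often than the reverse – reproduces the same downward slope; the error probabilities below are made up:

```python
# Sketch: an asymmetric readout error mimicking relaxation toward |0> before readout.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator
from qiskit_aer.noise import NoiseModel, ReadoutError

p_1_read_as_0 = 0.15   # assumed: |1> decays and is recorded as 0
p_0_read_as_1 = 0.02   # assumed: |0> misread as 1 (much rarer)
readout = ReadoutError([[1 - p_0_read_as_1, p_0_read_as_1],
                        [p_1_read_as_0, 1 - p_1_read_as_0]])

noise_model = NoiseModel()
noise_model.add_all_qubit_readout_error(readout)

qc = QuantumCircuit(5, 5)
qc.h(range(5))
qc.measure(range(5), range(5))

sim = AerSimulator(noise_model=noise_model)
counts = sim.run(transpile(qc, sim), shots=8192).result().get_counts()
# Bitstrings with more 1s now show up less often, as on the real device.
print(sorted(counts.items(), key=lambda kv: kv[0].count("1")))
```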
Correct answer by Craig Gidney on August 20, 2021