Physics Asked on November 29, 2020
I used a $\rm NaI$ scintillation detector linked to a scaler unit that counts the number of gamma rays detected from a $^{22}\rm Na$ source. I repeated the measurement over equal 40-second intervals at a range of distances from the source, from 0.02 m to 0.5 m.
I subtracted an average background count for the time interval, converted the counts into counts per second, and then used `scipy.optimize.curve_fit` to fit a function of the form:
count rate = $Ar^{b}$
where $r$ is the distance and $A$ and $b$ are constants.
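The fit described above might look something like the following sketch with `scipy.optimize.curve_fit`. The distances and count rates here are illustrative placeholders, not the questioner's actual data:

```python
# Sketch of the power-law fit described above.
# The data arrays are illustrative placeholders, not the original measurements.
import numpy as np
from scipy.optimize import curve_fit

def power_law(r, A, b):
    return A * r**b

r = np.array([0.02, 0.05, 0.1, 0.2, 0.3, 0.5])            # distances in metres
rate = np.array([2500.0, 420.0, 110.0, 28.0, 13.0, 4.8])  # counts per second

# p0 seeds the optimiser near the physically expected values (A > 0, b ~ -2)
popt, pcov = curve_fit(power_law, r, rate, p0=[1.0, -2.0])
A_fit, b_fit = popt
perr = np.sqrt(np.diag(pcov))  # 1-sigma uncertainties from the covariance matrix

print(f"A = {A_fit:.3g} +/- {perr[0]:.2g}")
print(f"b = {b_fit:.3g} +/- {perr[1]:.2g}")
```

Note that an unweighted fit like this is dominated by the large count rates at small distances; passing measurement uncertainties via the `sigma` argument would weight the points properly.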
I expected that, due to the inverse square law, I would find $b = -2$; instead, the fit gave $b \approx -1.45$.
Can anyone explain why the dependence on distance of the detector count rate is not described by the inverse square law?
Try fitting to
raw count rate = $Ar^b + C$.
I know you think you have taken care of the background, but what if your background isn't exactly right?
Even then, you might not get the expected result because of the nature of statistical counting. The uncertainty in the count at a given distance is the square root of the number of counts, so the value you get from any one measurement will be different next time! (Do the experiment: make the measurement 10 times at the same distance.)
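The repeated-measurement exercise suggested above can be simulated to see the expected $\sqrt{N}$ spread. The mean of 400 counts per interval is an arbitrary example value:

```python
# Illustration of Poisson counting statistics: repeat the same "measurement"
# many times and compare the observed spread with the sqrt(N) expectation.
# A true mean of 400 counts per 40 s interval is an arbitrary example value.
import numpy as np

rng = np.random.default_rng(seed=1)
mean_counts = 400
counts = rng.poisson(mean_counts, size=10)  # ten repeated measurements

print("counts:", counts)
print(f"sample std dev:   {counts.std(ddof=1):.1f}")
print(f"sqrt(N) estimate: {np.sqrt(mean_counts):.1f}")  # ~20 for N = 400
```

With only ten repeats, the sample standard deviation will scatter around the $\sqrt{400} = 20$ expectation; a 5% statistical fluctuation per point is plenty to perturb a two-parameter fit.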
Another explanation is the finite size of the detector and the resulting change in the solid angle it subtends at different distances. Over your range of distances, that effect could be very significant: at 0.02 m the detector face is comparable in size to the source-detector separation, so the point-source approximation behind the inverse square law breaks down.
Finally, try fitting to
raw count rate = $Ar^{-2} + C$ to see how a fit with the expected exponent handles the background value. The scientist's work is never done with a single analysis attempt or analysis model.
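The fixed-exponent check suggested above is a small modification of the original fit: hold $b = -2$ and let only the amplitude $A$ and a residual background $C$ float. Again, the data here are placeholder values:

```python
# Sketch of the suggested check: fix the exponent at -2 and fit only the
# amplitude A and a residual background C. Data are placeholder values.
import numpy as np
from scipy.optimize import curve_fit

def inverse_square_plus_bg(r, A, C):
    return A * r**-2 + C

r = np.array([0.02, 0.05, 0.1, 0.2, 0.3, 0.5])
rate = np.array([2500.0, 420.0, 110.0, 28.0, 13.0, 4.8])

popt, pcov = curve_fit(inverse_square_plus_bg, r, rate, p0=[1.0, 0.0])
A_fit, C_fit = popt
print(f"A = {A_fit:.3g}, residual background C = {C_fit:.3g} counts/s")
```

A significantly nonzero $C$ would indicate that the background subtraction was imperfect, which is exactly the failure mode that drags a free-exponent fit away from $b = -2$.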
Correct answer by Bill N on November 29, 2020