Physics Asked by Somniare on December 12, 2020
I have been taught that you can find out the size of a nucleus of an atom by firing electrons at it at high velocities. This causes scattering due to the positive charge of the nucleus and diffraction due to the small de Broglie wavelength of the electrons.
Part of this relies on the first minimum of the diffraction pattern occurring at $\sin\theta = \frac{1.22\lambda}{d}$, where $d$ is the diameter of the nucleus. This is a variant of the same equation that I came across in light diffraction, where $d$ is the width of the slit or the spacing between elements of a grating. Why should $d$ here now become the diameter of a nucleus rather than the distance between nuclei?
Thank you for all your help
If we evaluate the de Broglie wavelength of electrons (e.g. with an online calculator), we find that an electron energy of 1 eV gives $1.2\times10^{-9}\,\mathrm m$, or about 1 nm (the size of small organic molecules with 2-3 benzene rings), and about 15 eV corresponds to the size of a hydrogen atom. If we use 1 GeV electron energy we finally get down to $1.2\times10^{-15}\,\mathrm m$, which is roughly nuclear size. So a little tabletop experiment will do for atomic measurements, but measuring the size of nuclei directly by scattering takes a medium-sized accelerator.
Answered by CuriousOne on December 12, 2020
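As a rough numerical check on the estimates in the answer above, here is a minimal Python sketch that evaluates the de Broglie wavelength $\lambda = h/p$ using the relativistic momentum $pc = \sqrt{E_k^2 + 2E_k\,m_e c^2}$ (the constants are rounded CODATA values; the function name is just for illustration):

```python
import math

H_C_EV_NM = 1239.84        # h*c in eV*nm
M_E_C2_EV = 510_998.95     # electron rest energy m_e*c^2 in eV

def de_broglie_wavelength_nm(kinetic_energy_ev: float) -> float:
    """Return lambda = h/p in nanometres, using the relativistic momentum
    p*c = sqrt(E_k^2 + 2*E_k*m_e*c^2)."""
    pc_ev = math.sqrt(kinetic_energy_ev**2 + 2.0 * kinetic_energy_ev * M_E_C2_EV)
    return H_C_EV_NM / pc_ev

for energy_ev, label in [(1.0, "1 eV"), (15.0, "15 eV"), (1e9, "1 GeV")]:
    lam_nm = de_broglie_wavelength_nm(energy_ev)
    print(f"{label:>6}: lambda = {lam_nm:.3e} nm = {lam_nm * 1e-9:.3e} m")
```

This reproduces the quoted numbers: roughly 1 nm at 1 eV and roughly $1.2\times10^{-15}\,\mathrm m$ at 1 GeV.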