Physics Asked on February 25, 2021
We did a classical Hall effect experiment where we measured the Hall coefficient, given by $R_H=\frac{E_H}{J\cdot B}$. The setup was a rectangular germanium semiconductor placed perpendicular to a magnetic field generated by a large coil.
We did several measurements: First, we applied a current of 2 mA and measured the Hall voltage ($E_H \cdot \text{width}$) as a function of the magnetic field strength, from which we derived $R_H$. Then we did the same thing, only keeping the field constant and varying the current.
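For a rectangular sample, the measured Hall voltage relates to the coefficient via $V_H = E_H \cdot w$ and $J = I/(w \cdot t)$, so $R_H = V_H t / (I B)$. A minimal sketch of that calculation (the function name, thickness symbol $t$, and numeric values below are illustrative, not taken from the experiment):

```python
def hall_coefficient(V_H, t, I, B):
    """Hall coefficient R_H = E_H / (J * B) = V_H * t / (I * B), in SI units.

    V_H : Hall voltage [V]
    t   : sample thickness along the field [m]
    I   : drive current [A]
    B   : magnetic flux density [T]
    """
    return V_H * t / (I * B)

# Illustrative values: 5 mV Hall voltage, 1 mm thick sample, 2 mA, 0.5 T
R_H = hall_coefficient(5e-3, 1e-3, 2e-3, 0.5)  # -> 5e-3 m^3/C
```

Note that this expression diverges as $B \to 0$ or $I \to 0$, so any offset voltage that does not scale with $B$ (e.g. from misaligned contacts) will blow up when divided by a small field.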
We expected to get roughly the same values for $R_H$, but we got wildly inconsistent results. We can't even suggest a hypothesis to explain the data. The figures are attached below; $R_H$ is given in arbitrary units.
Three main features trouble me: a) the large difference in Hall coefficient observed between the two experiments; b) the fact that $R_H$ seems to diverge at small magnitudes of the magnetic field; and c) it changes sign when passing from negative to positive values of the magnetic field. As I understand it, this really should not happen unless the conditions are quite exotic.
Any help would be appreciated! Thanks
I just got the same results in an experiment I did :). I believe that what is going on is that for low magnetic fields the Lorentz force is not strong enough to produce any Hall voltage, which shows up as an apparent increase in the Hall coefficient. This reasoning would suggest that as the magnetic field grows the Hall coefficient should keep decreasing. However, because of the thermal interactions involved, once the concentration gradient built up by the Lorentz force is balanced by the concentration gradient due to collisions between the carriers, an equilibrium is reached. This explains the constant behavior of the Hall coefficient at large magnetic fields.
Answered by Iván Mauricio Burbano on February 25, 2021