Physics Asked on October 15, 2020
Let’s suppose we have a point charge with no dimensions (zero volume and surface area) absorbing photons from the Sun. Does the number of photons and the amount of energy absorbed change when we move the point charge closer to, or farther from, the Sun? Since photons do not decrease in energy with distance and come one after another in a line, it seems it shouldn’t matter how far we are from the Sun. How does classical electromagnetism explain this in terms of electromagnetic waves?
Quantum mechanics: although the energy of the individual photons does not change with distance, the density of photons (the number of photons per unit volume) does. This is because every spherical shell around the Sun must contain the same number of photons, but shells with larger radius have greater surface area, so the photons are more spread out.

Classical EM: the frequency of the EM waves is the same everywhere, but their amplitude decreases with distance. Again, this is because the total energy passing through any spherical shell must be the same, but this time the energy is calculated from the square of the fields, that is, of their amplitude.
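The inverse-square spreading of photons can be checked with a quick calculation. This is a minimal sketch, assuming the Sun radiates its luminosity isotropically and, for simplicity, monochromatically at 550 nm (real sunlight is broadband, so the absolute flux here is only illustrative; the distance scaling is not):

```python
import math

H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
L_SUN = 3.828e26        # nominal solar luminosity, W
AU = 1.495978707e11     # astronomical unit, m

def photon_flux(r, wavelength=550e-9):
    """Photons per m^2 per s crossing a sphere of radius r around the Sun."""
    e_photon = H * C / wavelength                  # energy per photon, J
    return L_SUN / (4 * math.pi * r**2 * e_photon)

f1 = photon_flux(1 * AU)
f2 = photon_flux(2 * AU)
print(f"flux at 1 AU: {f1:.3e} photons/m^2/s")
print(f"flux ratio 1 AU / 2 AU: {f1 / f2:.1f}")   # inverse square: doubling r quarters the flux
```

The photon energy cancels out of the ratio, which is why the classical (energy) and quantum (photon-counting) pictures give the same distance dependence.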
Answered by Lucas Baldo on October 15, 2020
First, a free point charge cannot absorb a photon: absorption cannot simultaneously conserve energy and momentum. So let's assume you are talking about scattering.
The scattering cross-section of the particle is a fixed quantity and has a finite value even for point-like particles such as electrons. But the number of photons per unit area, per unit time will decrease as the inverse distance squared as you move the particle away from the Sun (as demanded by conservation of number of photons). Thus the scattering rate also decreases in the same way.
In classical electromagnetism, the Poynting vector (power per unit area) of the radiation decreases as the inverse distance squared (as demanded by conservation of energy). The amount of scattered power is equal to the Poynting flux multiplied by the scattering cross-section, so the scattered power also goes as the inverse distance squared. Since classical scattering does not change the frequency of light (e.g. Thomson scattering) then we infer that the rate of photons scattered goes as the inverse distance squared - the same result.
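The argument above can be sketched numerically. This is a hedged illustration, not part of the original answer: it takes the Thomson cross-section of an electron as the fixed, distance-independent cross-section, multiplies it by the Poynting flux of sunlight, and checks that the scattered power falls off as the inverse distance squared:

```python
import math

SIGMA_T = 6.6524587e-29   # Thomson cross-section of an electron, m^2
L_SUN = 3.828e26          # nominal solar luminosity, W
AU = 1.495978707e11       # astronomical unit, m

def scattered_power(r):
    """Power scattered by one electron at distance r from the Sun."""
    poynting_flux = L_SUN / (4 * math.pi * r**2)   # W/m^2
    return SIGMA_T * poynting_flux                  # W

p1 = scattered_power(1 * AU)
p3 = scattered_power(3 * AU)
print(f"{p1:.3e} W at 1 AU, {p3:.3e} W at 3 AU, ratio {p1 / p3:.1f}")
# Tripling the distance reduces the scattered power by a factor of 9.
```

Dividing the scattered power by the (unchanged) photon energy gives the scattering rate, which therefore obeys the same inverse-square law, as the answer states.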
Answered by Rob Jeffries on October 15, 2020
Photons do not come "in a line" and are not absorbed by a point particle. Photons are described by a wave solution of Maxwell's equations that gives the probability of photon absorption. To a good approximation, for electric dipole transitions the probability is proportional to $E^2$ by Fermi's golden rule. Sunlight can be viewed as spherical waves, for which $E^2 \propto r^{-2}$, and the probability of absorbing a photon follows this.
Answered by my2cts on October 15, 2020
I'm afraid this question does not make sense as posed. There is no point charge with zero volume, so if you do in fact have one, then pigs can fly. The reason for making idealizing assumptions such as point charges is that they work in some contexts, and we need to know when they work and when they don't. What does "work" mean here? If the idealization is a limit on some continuous spectrum, then conclusions drawn from it may translate to approximately correct conclusions in the real world. For instance, when dealing with the electrostatic force between charged objects, we often make the idealizing assumption that they are point charges, which is fine when they are sufficiently far apart. The approximation gets worse as you bring the objects closer, and breaks down badly if the objects are touching, interlocked, or one is enclosing the other.
Similarly, we assume that particles are point-like only when we are concerned with distant interactions. Assuming you are talking about a particle absorbing a photon (since point charges don't), this is not a valid idealizing assumption, because the photon has to get sufficiently close to the particle, and photon absorption is an inherently quantum-mechanical phenomenon, not a classical interaction.
Secondly, you said "Since photons do not decrease in energy with distance and come one after another in a line, it really doesn't matter how far we are from the Sun." This is wrong. Classically, light intensity decreases with increasing distance from the source. Quantum-mechanically, each photon can be understood as an indivisible packet of energy that travels with bulk velocity $c$ in some direction but spreads out as it propagates. One difference from the classical picture is that a photon cannot be partially absorbed; either it is completely absorbed or it is not. See this video for an animation illustrating a traveling wave packet. The likelihood that a photon emitted from the Sun in a uniformly random direction is absorbed by the particle does in fact decrease as distance increases, in line with the classical observations. It is meaningless to ask about the case where the photon's direction points exactly towards the particle, since that is simply impossible, in the same way that there is no such thing as a truly point particle.

(Disclaimer: I included the video solely for the animation. I did not check its contents, and it contains conceptual errors, for instance concerning wavefunction collapse: the wavefunction cannot just be "zeroed out" in the part that is inconsistent with the measurement; rather, it changes to a new wavefunction.)
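The geometric point about a randomly directed photon can be made quantitative. As a hedged sketch (the absorber and its effective cross-section here are illustrative, not from the original answer): the probability that a photon emitted in a uniformly random direction passes through a small absorber of effective cross-section sigma at distance r is the fraction of the full sphere the absorber subtends, sigma / (4 pi r^2):

```python
import math

def hit_probability(sigma, r):
    """Fraction of the sphere of radius r subtended by cross-section sigma."""
    return sigma / (4 * math.pi * r**2)

AU = 1.495978707e11    # astronomical unit, m
sigma = 6.6524587e-29  # e.g. an electron's Thomson cross-section, m^2

p_near = hit_probability(sigma, 1 * AU)
p_far = hit_probability(sigma, 10 * AU)
print(f"ratio near/far: {p_near / p_far:.0f}")
# Moving 10x farther away makes absorption 100x less likely.
```

This reproduces the inverse-square falloff from pure geometry, consistent with the classical intensity argument in the other answers.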
Answered by user21820 on October 15, 2020