Suppose there is a 50 watt infrared bulb and a 50 watt UV bulb. Do they emit light of the same energy?
Physics Asked by Matthew jk on March 13, 2021
If both bulbs are 50 watts shouldn’t they emit the same frequency light since they both convert 50 J of electrical energy into light energy per second?
But this is the example I came across. I don’t understand.
The infrared (IR) bulb will emit more photons per second than the ultraviolet (UV) bulb because each IR photon has less energy than a UV photon.
After some follow-up questions, I've realized that there is an ambiguity in the phrase "50-watt bulb." Is this a bulb that emits 50 watts of light, or a bulb that consumes 50 watts of electrical power? If the former, then my first paragraph is still true. However, most light bulbs are labeled based on the latter. So, in the case of a 50-watt IR bulb and a 50-watt UV bulb, it matters how the light is generated.
For example, if both bulbs are incandescents (the electrical power heats up a filament to thousands of degrees to emit light) with filters in front of them that only let through a desired spectrum, then the IR light power emitted will be much greater than the UV power emitted. This is because the filament is a black body radiator that emits much more power in the lower energy (IR) part of the spectrum. A UV blacklight works by taking a light bulb and putting a filter in front of it that absorbs visible and lower energy light, leaving UV to pass through. Even if the blacklight consumes 50 watts of power, the light emitted will have much less than 50 watts of power because most of the radiated power will be blocked by the filter.
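To make the black-body point concrete, here is a minimal numerical sketch (not from the original answer; the 2800 K filament temperature and the band limits are my own illustrative assumptions). It integrates Planck's law to estimate how the radiated power splits between the UV and IR bands:

```python
# Sketch: fraction of an ideal black body's radiated power in the UV vs. the IR,
# for an assumed tungsten-like filament temperature of 2800 K.
import numpy as np

h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s
kB = 1.381e-23   # Boltzmann constant, J/K

def planck(lam, T):
    """Planck spectral radiance at wavelength lam (m) and temperature T (K)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

def band_power(lam_lo, lam_hi, T, n=20000):
    """Simple Riemann-sum integral of the Planck spectrum over a wavelength band."""
    lam = np.linspace(lam_lo, lam_hi, n)
    return np.sum(planck(lam, T)) * (lam[1] - lam[0])

T = 2800  # K
total = band_power(10e-9, 100e-6, T)  # effectively the whole spectrum
print(f"UV (100-400 nm):    {band_power(100e-9, 400e-9, T) / total:.2%} of radiated power")
print(f"IR (700 nm-100 um): {band_power(700e-9, 100e-6, T) / total:.1%} of radiated power")
```

At this temperature the UV band receives well under 1% of the radiated power while the IR band receives the large majority, which is why a filtered incandescent "UV" bulb wastes most of its input.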
Different methods of light generation produce different amounts of light given an amount of input electrical power.
Correct answer by Mark H on March 13, 2021
It should be clarified that they do not emit photons of the same frequency, since you specified that one was UV and the other IR, which are quite distinct frequency bands.
Otherwise, as the other answer points out, if both bulbs have the same efficiency (which is unlikely with the usual cheap designs), they will emit the same amount of energy. But each photon will not have the same energy: Planck's law says the energy of an IR photon is lower than the energy of each UV photon. So for the energy emitted to be the same, there must be many more IR photons, again assuming the same efficiency. Lasers could easily be designed to have the same efficiency.
One could also "design" the incandescent bulbs to have the same inefficiency by crippling the more efficient one with a potentiometer or dimmer so it emits the same amount of energy as the worse bulb.
In summary, there is no principled connection between the power and the frequency emitted. But Planck's law does give an exact relation between each individual photon's frequency and its individual energy. So 50 watts in a beam of IR light means more photons per second than in a 50 watt beam of UV light.
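For a monochromatic beam of constant power $P$, Planck's relation gives the photon emission rate directly (this worked relation is an illustration, not part of the original answer):
$$\dot{n} = \frac{P}{hf}, \qquad \frac{\dot{n}_{IR}}{\dot{n}_{UV}} = \frac{f_{UV}}{f_{IR}} = \frac{\lambda_{IR}}{\lambda_{UV}}$$
For example, with $\lambda_{IR} = 1000$ nm and $\lambda_{UV} = 300$ nm, the IR beam carries roughly $3.3$ times as many photons per second for the same 50 watts.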
That said, the crucial point is whether we are talking about a beam of light or just one photon. "Watts" means energy per second, so clearly we are talking about a continuous process. A more energetic bulb therefore does not have to produce its extra energy by emitting higher-frequency photons; it could emit more photons of the same frequency. In fact, the frequency of an emitted photon depends on the exact relation between the energy levels of whatever atom emits it. So at the micro-level, for a single emission event, an atom jumping between two widely spaced energy levels releases more energy because it emits a photon of higher frequency. Conversely, if you wanted to design a one-shot emitter that emits a single photon and then stops, and you wanted more energy, you would have to go to higher frequencies. But that is not a bulb, and not really watts anymore.
Answered by joseph f. johnson on March 13, 2021
If the two kinds of light had the same frequency, we would not have classified them as two different EM waves in the first place.
The $50$ W rating doesn't mean that they will emit photons of the same frequency. It only means that the electrical energy input to both bulbs is the same.
If both bulbs were equally efficient (but one gives IR and the other UV rays), then the energy output would also be the same.
The total energy given out is $$E = nhf$$ where $n$ is the number of photons, $h$ is Planck's constant and $f$ is the frequency of the light.
Although $$E_{IR}=E_{UV}$$ this doesn't mean that the frequency of both photon types will be the same. It means instead that the number of photons emitted by the two bulbs must be different.
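A quick numerical check of this (a minimal sketch; the wavelengths of 1000 nm and 300 nm are illustrative choices, and it assumes the full 50 J per second comes out as light):

```python
# Photons per second for two ideal 50 W sources, from E = n*h*f over one second.
h = 6.626e-34        # Planck's constant, J s
c = 2.998e8          # speed of light, m/s
E_per_second = 50.0  # J of light emitted each second (100% efficiency assumed)

for name, wavelength in [("IR (1000 nm)", 1000e-9), ("UV (300 nm)", 300e-9)]:
    f = c / wavelength           # photon frequency, Hz
    E_photon = h * f             # energy of a single photon, J
    n = E_per_second / E_photon  # number of photons emitted per second
    print(f"{name}: photon energy = {E_photon:.2e} J, photons per second = {n:.2e}")
```

The total energy is the same each second, but the IR source emits about 3.3 times as many photons.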
Here is an analogy:
Suppose you go to a stationery shop that sells two types of pens: Type A pens cost $50$ units each and Type B pens cost $5$ units each.
Suppose you buy one Type A pen in one purchase and $10$ Type B pens in another.
Both times you pay a total of $50$ units, but does that mean the two purchases are the same?
No: the pens are of different types and different in number, yet the overall value of each purchase is the same.
This is exactly what is happening in your case.
Note: You may use units as per your choice.
Hope it helps.
Answered by A Student 4ever on March 13, 2021
It depends what you mean by UV vs IR light. In the case of a bulb in an ideal vacuum (so we can ignore the effects of conduction), both a 50 W UV and a 50 W IR light source will emit 50 W of electromagnetic radiation. However, what is important is the frequency distribution of that radiation.
For the ideal case of a 100% efficient monochromatic source, both would produce 50 W of radiation at the intended frequency, with the ultraviolet source producing fewer photons, each of higher energy.
However, pretty much all real sources behave to some extent like a blackbody radiator; for LEDs this applies only to the relatively small fraction of the input that is inefficiently converted to heat, while incandescent bulbs are almost purely blackbody radiators. A blackbody radiator still outputs the energy as radiation, but spread over a much wider spectrum, so less of it falls in the part of the spectrum the bulb is designed to emit. This is what counts as the inefficiency: only a small proportion is converted into radiation in the band the UV bulb is actually being used for.
Additionally, if you do factor in conduction, then in the case of incandescent bulbs the IR bulb will emit slightly less radiation for the same electricity consumed. Its filament runs cooler, and at lower filament temperatures a larger fraction of the input power is carried away by conduction rather than radiated. This does not apply to other forms of light generation that are not based directly on blackbody radiation.
Answered by user1937198 on March 13, 2021
UV photons have more energy than IR photons. One way to look at it is that fewer UV photons will be released per second than IR photons (i.e., for the same power, the UV beam contains fewer photons per second than the IR beam).
Answered by TKA on March 13, 2021
The energy input to a light source is, in general, unrelated to the amount of light emitted
You can't generalise from the amount of energy going into a light source to the amount or type of radiation emitted by that source.
The most important reason is that there are a large number of fundamentally different ways to create light. Incandescent bulbs do so by black body thermal emission. Discharge lamps do so by exciting gas molecules via an electrical discharge so their electrons get excited to higher energy levels and then emit light when falling back to lower energy levels. LEDs emit light when excited electrons in a semiconductor fall to lower energy levels in the semiconductor.
These mechanisms are so different that comparing across classes of emitter is meaningless. And different types of light source have very different profiles of emitted light. Black-body emitters emit a broad range of wavelengths depending on how hot they are. Discharge lamps and LEDs tend to emit narrow bands of specific wavelengths (though these can be considerably broadened in high-pressure discharge lamps). The type of light (or more specifically the wavelength or colour) is often independent of the power input. The mix of wavelengths varies a little for a given incandescent light depending on power but it can't vary much because there are few materials that can survive over a large enough temperature range to make much difference to the colour.
What we measure as the power of any light source is the energy going in. In many lamps, most of this is lost and not emitted as visible light. In incandescent bulbs only about 5% of the energy appears as visible light (arguably the rest is emitted in the infra-red). Other sources are far more efficient, breaking any simple relationship between the power going in and the light coming out (when comparing across types of emitter).
Even if we consider incandescent bulbs alone as a class, the relationship between input power and output is not simple. Incandescent UV bulbs are available, but they don't work by making the heated filament hot enough to move the peak of the black-body emission into the UV (or not by much), since temperatures hot enough to do that would exceed the melting point even of tungsten. They usually work by filtering the visible light out of the bulb's emission. IR bulbs, by contrast, might use lower-temperature filaments to minimise the visible emission. What this means is that, if you compare the output of a 50 W UV bulb to an IR bulb, they will have very large differences in the total energy of the specific light emitted. Only if you counted all the energy lost as heat as IR emission for both bulbs would you get the same total energy output.
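A quick check of that melting-point argument using Wien's displacement law (a sketch; the 350 nm target wavelength is my own illustrative choice):

```python
# Wien's displacement law: lambda_peak * T = b.
b = 2.898e-3             # Wien's displacement constant, m K
T_melt_tungsten = 3695   # K, approximate melting point of tungsten

T_for_uv_peak = b / 350e-9                   # temperature needed to peak at 350 nm
peak_at_melting = b / T_melt_tungsten * 1e9  # peak wavelength at the melting point, nm

print(f"Temperature needed for a 350 nm peak: {T_for_uv_peak:.0f} K")            # ~8300 K
print(f"Peak wavelength at tungsten's melting point: {peak_at_melting:.0f} nm")  # ~780 nm, near IR
```

So even a filament right at tungsten's melting point still peaks in the near infrared; more than twice that temperature would be needed to push the peak into the UV.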
But, while incandescent bulbs are pretty good at emitting IR, there are far better ways to create UV. Mercury discharge lamps, for example, have a very strong narrow emission line in the near-UV (as well as many weaker lines, some of which are visible). Far more of the input energy will go into that UV emission. While those lamps will also see significant conversion losses, a great deal more UV will be emitted. But in discharge lamps the frequency of the light is nearly independent of the power input. A 1 W discharge lamp will emit less in total but will still emit the same proportion of its output as UV (very unlike an incandescent bulb). So, if you had a typical IR light with 50 W of power, it would make no sense to compare it to a UV source using a totally different mechanism to create light, even with similar power input.
In summary, the only type of light where the wavelength varies much by power input is the incandescent bulb. But they lose most power to heat and are very limited by the materials used in their filaments. Other types of emitter produce light of specific frequencies that don't depend on the power input.
Answered by matt_black on March 13, 2021
Question: If two vehicles each consume 500 ml of fuel, will they both travel the same distance at the same speed?
Answer: No, because they are two different systems which convert fuel into motion.
In the same way, IR and UV bulbs are basically two different "engines" which convert electricity to two different types of EM energy.
I know other people explained this much more thoroughly and completely... and I like those answers. I just thought this analogy might help some people understand.
Answered by Dumb Rock on March 13, 2021
One photon of 300 nm (soft UV) light carries 0.7 attojoules of energy. One photon of 1000 nm (near IR) light carries 0.2 attojoules of energy. Therefore a 100%-efficient 50 W UV lamp outputs 71 quintillion photons each second, obeying conservation of energy, while an equally efficient IR lamp outputs 250 quintillion photons per second. Both of those numbers are kind of... a lot, but the difference between them accounts exactly for the difference in energy per photon due to the different wavelengths.
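Those figures can be reproduced directly from $E = hc/\lambda$ (a small sketch assuming, as above, a 100%-efficient lamp):

```python
# Reproduce the photon-rate figures: 50 W divided by the energy per photon.
h, c = 6.626e-34, 2.998e8  # Planck's constant (J s), speed of light (m/s)
P = 50.0                   # watts of emitted light

for label, lam in [("300 nm (soft UV)", 300e-9), ("1000 nm (near IR)", 1000e-9)]:
    E_photon = h * c / lam  # joules per photon
    print(f"{label}: {E_photon * 1e18:.2f} aJ per photon, {P / E_photon:.2e} photons per second")
```

This gives roughly $0.66$ aJ and $7.5\times 10^{19}$ photons per second for the UV lamp, and $0.20$ aJ and $2.5\times 10^{20}$ for the IR lamp, consistent with the 71 and 250 quintillion quoted above (which used the rounded 0.7 aJ figure).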
Answered by hobbs on March 13, 2021
If they are rated at 50 watts, it means that they both draw that amount of power as a load on the system. But the light they emit is very different: because of their shorter wavelength, UV photons each carry far more energy than IR photons, so the IR bulb has to emit many more photons per second to deliver the same 50 joules each second.
Answered by popopo champ on March 13, 2021