Electrical Engineering Asked on December 4, 2021
I’m hoping to resolve an argument here. Someone else said that light-switch dimmers don’t actually cut back on the power used, since they’re just resistors and simply divert the power into additional heat. I agree that dimmers are just resistors, but from my thinking, V=IR and P=IV give P=V^2/R. The supply voltage is constant in this system, so as the resistance of the dimmer in series with the light bulb increases, the total power will go down. Who is correct?
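To make the arithmetic concrete, here is a quick numerical check of my reasoning (a minimal Python sketch; the 240 ohm figure is just an assumed hot-filament resistance for a 60 W, 120 V bulb from R=V^2/P, not a measurement, and the dimmer resistances swept are illustrative):

```python
# Series-resistor "dimmer": constant supply voltage, bulb of fixed resistance.
V = 120.0        # supply voltage, volts (RMS)
R_bulb = 240.0   # assumed hot filament resistance: 120^2 / 60 W

for R_dimmer in (0.0, 60.0, 120.0, 240.0):
    I = V / (R_dimmer + R_bulb)     # series current
    P_total = V * I                 # total power drawn from the supply
    P_bulb = I**2 * R_bulb          # power dissipated in the bulb
    P_dimmer = I**2 * R_dimmer      # power wasted as heat in the dimmer
    print(f"R_dimmer={R_dimmer:5.0f} ohm  total={P_total:5.1f} W  "
          f"bulb={P_bulb:5.1f} W  dimmer={P_dimmer:5.1f} W")
```

With R_dimmer = 0 the supply delivers 60 W; with 240 ohms in series it delivers only 30 W, split evenly between the bulb and the resistor, so the total power drawn does fall as resistance is added.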
I was wondering how the actual power dissipated in the series resistor compares with the bulb power over a wide dimming range, and in particular how much effect the variable filament resistance due to temperature has. I tested a 120 volt, 60 watt bulb and also calculated the power assuming no change in filament resistance. The results are shown below.
Wba and Wra are the actual bulb and resistor power plots from the test results. Wbc and Wrc are calculated assuming resistance remains at the hot value during dimming. Wttl is the total bulb plus resistor power.
The bulb was labeled 60 watts, but it only took about 55 watts at about 120 volts. Room temperature was about 72F.
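For reference, the calculated curves Wbc, Wrc, and Wttl can be reproduced with a short script like the one below. It is only a sketch: it takes the measured full-power figures quoted above (about 55 watts at about 120 volts) and assumes the filament stays at that hot resistance while dimmed, which is exactly the simplification the actual Wba/Wra test data do not share; the series-resistance values swept are illustrative.

```python
# Calculated bulb (Wbc), resistor (Wrc), and total (Wttl) power, assuming
# the filament resistance stays at its hot value while the bulb is dimmed.
V = 120.0                 # supply voltage, volts (RMS)
P_full = 55.0             # measured bulb power at full voltage
R_hot = V**2 / P_full     # hot filament resistance, roughly 262 ohms

for R_series in (0.0, 100.0, 200.0, 400.0, 800.0):   # illustrative values
    I = V / (R_hot + R_series)
    Wbc = I**2 * R_hot        # calculated bulb power
    Wrc = I**2 * R_series     # calculated series-resistor power
    Wttl = Wbc + Wrc          # total power drawn from the line
    print(f"R_series={R_series:5.0f} ohm  Wbc={Wbc:5.1f} W  "
          f"Wrc={Wrc:5.1f} W  Wttl={Wttl:5.1f} W")
```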
Answered by Charles Cowie on December 4, 2021
Even if dimmers were just resistors (they aren't, as explained in the comments), it should be obvious that as the resistance goes up, the current goes down and the lamp gets dimmer. Since power = voltage × current and the voltage is held constant, the power being consumed scales directly with the current.
Some of the power gets dissipated in the resistor, some of it gets dissipated in the lamp, but the total still adds up to less than the lamp uses at full brightness.
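As a quick worked example (ignoring the change in filament resistance with temperature): a 60 W bulb at 120 V draws 0.5 A, so its hot resistance is about 240 ohms. Putting another 240 ohms in series halves the current to 0.25 A, and the total power drawn becomes 120 V × 0.25 A = 30 W, with 15 W in the bulb and 15 W in the resistor. That is half the original 60 W, even though some of it is now wasted as heat in the resistor.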
Answered by Dave Tweed on December 4, 2021