Physics Asked by code noob on January 7, 2021
In an exam, I am given a situation where a student investigates how the current varies with potential difference for two bulbs of the same type.
For the same voltage, the current of bulb A is twice the current of bulb B.
The mark scheme concludes that, since the power dissipated by bulb A is twice the power dissipated by bulb B, bulb A is twice as bright as bulb B.
My question is: is power directly proportional to brightness? The question definitely seems to assume so, but where does this relationship break down and when is it a fair assumption?
This is GCSE level physics (so physics in the UK for 16 year olds).
I'm going to make some assumptions:

1. We are talking about bulbs with tungsten filaments.
2. They are designed to operate at the same temperature (somewhat below the melting point of tungsten).
3. The filaments are the same length.
4. The brightness is proportional to the surface area of the filament.

Then the current is
$$I = \frac{V}{R} = \frac{V\pi r^2}{\rho L} = \frac{V\pi}{\rho L}\left[\frac{A}{2\pi L}\right]^2,$$
where $A = 2\pi r L$ is the surface area of the filament. It would appear that the current (and the power) is proportional to the square of the surface area, and hence to the square of the brightness.
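To make the scaling concrete, here is a minimal numerical sketch of this model applied to the exam scenario (bulb A drawing twice the current of bulb B at the same voltage). The resistivity, length, voltage, and radius are illustrative values I have chosen, not figures from the question; only the ratios matter.

```python
import math

# Illustrative values (my assumptions, not from the question):
rho = 5.6e-7   # roughly the resistivity of hot tungsten, ohm·m
L = 0.5        # filament length, m (same for both bulbs)
V = 12.0       # supply voltage, V (same for both bulbs)
r_B = 2.0e-5   # radius of bulb B's filament, m

def current(r):
    # Model from the answer: I = V / R = V * pi * r^2 / (rho * L)
    return V * math.pi * r**2 / (rho * L)

def surface_area(r):
    # Radiating (side) surface of a cylindrical filament: A = 2 * pi * r * L
    return 2 * math.pi * r * L

# For bulb A to carry twice the current at the same voltage, its
# cross-section must be doubled, i.e. r_A = sqrt(2) * r_B.
r_A = math.sqrt(2) * r_B

I_B, I_A = current(r_B), current(r_A)
P_B, P_A = V * I_B, V * I_A
A_B, A_A = surface_area(r_B), surface_area(r_A)

print(f"current ratio  I_A/I_B = {I_A / I_B:.2f}")   # 2.00
print(f"power ratio    P_A/P_B = {P_A / P_B:.2f}")   # 2.00
print(f"surface ratio  A_A/A_B = {A_A / A_B:.2f}")   # 1.41
```

So in this model, doubling the power increases the radiating surface, and hence the "brightness" of assumption 4, only by a factor of $\sqrt{2}$: the power goes as the square of the brightness rather than in direct proportion to it.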
Answered by R.W. Bird on January 7, 2021
"Brightness" isn't very well defined, but it's commonly used as if proportional to the power received by the eye (or other detector). I assume the question writers didn't want to make a lengthy and confusing digression into the details of this. More rigorously, the luminous intensity is a measure of power, and what we perceive as brightness is related (though not really linearly) to this.
Answered by Raghu Parthasarathy on January 7, 2021