Physics Asked by Xplane on April 21, 2021
Why is the speed of light defined as $299{,}792{,}458\ \mathrm{m/s}$? Why did they choose that number and no other number?
Or phrased differently:
Why is a metre $1/299{,}792{,}458$ of the distance light travels in a second in a vacuum?
The speed of light is 299 792 458 m/s because people used to define one meter as 1/40,000,000 of the Earth's meridian - so that the circumference of the Earth was 40,000 kilometers.
Also, they used to define one second as 1/86,400 of a solar day, so that the day could be divided into 24 hours, each containing 60 minutes of 60 seconds.
In our Universe, it happens to be the case that light moves at a speed such that in 1 second, as defined above, it travels approximately 299,792,458 meters, as defined above. In other words, during one solar day, light changes its position by $$\Delta x = \frac{86{,}400 \times 299{,}792{,}458}{40{,}000{,}000}\ \text{circumferences of the Earth}.$$ The number above is approximately 647,552. Try it: instruct light to orbit along the surface of the Earth and you will find that between two noons, it completes about 647,552 orbits. Why exactly this number? Well, it is because of how the Earth was created and evolved.
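The arithmetic above is easy to check. A minimal sketch, using only the three historical definitions quoted in the answer:

```python
# Verify the "orbits per solar day" figure from the answer.
C_LIGHT = 299_792_458             # m/s, the defined speed of light
SECONDS_PER_DAY = 86_400          # 24 h x 60 min x 60 s
EARTH_CIRCUMFERENCE = 40_000_000  # m, from the original definition of the metre

distance_per_day = C_LIGHT * SECONDS_PER_DAY     # metres light covers in one day
orbits = distance_per_day / EARTH_CIRCUMFERENCE  # circumferences per solar day
print(round(orbits))  # 647552
```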
If the Earth hadn't hit a big rock called Megapluto about 4,701,234,567.31415926 years ago, it would have been a few percent larger and would be rotating with a frequency smaller by 1.734546346 percent, so 647,552 would be replaced by 648,243.25246; but because we did hit Megapluto, the ratio eventually became what I said.
(There were about a million of similarly important big events that I skip, too.)
The Earth's size and rate of spin were OK for a while, but they are not really regular or accurate, so people ultimately switched to wavelengths and periods of electromagnetic waves emitted by atoms. Spectroscopy remains the most accurate way in which we can measure time and distances. They chose the new meter and the new second as multiples of the wavelength or period of the photons emitted by various atoms, so that the new meter and the new second agreed with the old ones - those defined from the circumference of the Earth and from the solar day - within the available accuracy.
For some time, people would use two independent electromagnetic waves to define 1 meter and 1 second. In those units, they could measure the speed of light, and found it to be $299{,}792{,}458 \pm 1.2$ meters per second or so. (The accuracy was not that great for years; the error of 1.2 meters per second was the final accuracy achieved in the early 1980s.)
Because the speed of light is so fundamental - adult physicists use units in which $c=1$, anyway - physicists decided in the 1980s to redefine the units so that both 1 meter and 1 second are defined via the same type of electromagnetic wave. 1 meter was defined as 1/299,792,458 of a light-second which, once again, agreed with the previous definition based on two different electromagnetic waves within the accuracy.
The advantage is that today the speed of light is known exactly, by definition. Up to the inconvenient numerical factor of 299,792,458 - which is otherwise convenient for communicating with ordinary and not-so-ordinary people, especially those trained to use meters and seconds based on the solar day and the meridian - it is as robust as the $c=1$ units.
Correct answer by Luboš Motl on April 21, 2021
The number is arbitrary. We could have chosen it to be 1 m/s, but that would have been a little awkward, as we'd be going around telling people to go left at the next corner .00000084 meters down the road.
It used to be that the meter was defined in terms of two scratches on a stick. Therefore, we could measure the speed of light experimentally. Before redefining things, metrologists measured it as accurately as they could and then used the result as the new official standard.
Defining the speed of light means that the meter is no longer defined in terms of scratches on a stick. If it were, we'd have the second, the meter, and the speed of light all defined independently, while really they are dependent. That means that future, more accurate measurements could show that our system was inconsistent. As a result, the meter is now simply the distance that light moves in 1/299792458 seconds.
I think the main motivation behind this is that speed of light is always the same everywhere (for a local measurement) and is fairly easy to measure. Before the redefinition, the accuracy in the measurement of the speed of light was getting to be so good that it was limited by the accuracy with which we could measure a meter.
Many precise distance measurements actually are made using light rather than meter sticks. See, for example, LIDAR, in which light is bounced off a target, and you measure how long it takes to come back. Precise distance measurements (or measurements of change in a distance) can also be made with light using interferometry. See LIGO for example.
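The LIDAR idea mentioned above reduces to one formula: distance = $c \cdot t / 2$, where $t$ is the round-trip time of the pulse (the factor of 2 accounts for the out-and-back path). A minimal sketch, with a made-up round-trip time for illustration:

```python
# Ranging by light travel time, as in LIDAR.
C = 299_792_458  # m/s, defined speed of light

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to a target from the round-trip time of a light pulse."""
    return C * round_trip_seconds / 2.0

# A pulse that returns after 2 microseconds came from roughly 300 m away.
print(lidar_distance(2e-6))  # ~299.79 m
```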
Answered by Mark Eichenlaub on April 21, 2021
It's a rather nice historical full circle.
The meter was originally defined as 1/10,000,000 the distance from the equator to the north pole - through Paris - as Luboš explained above.
This was dreamed up by some guys in the French revolution with the idea that units of physics should depend only on the universe (or at least Paris - which is pretty much the same thing) and not on historic details like the length of a king's arm or foot.
This also meant that you could independently arrive at your own impartial measurement of the unit - there was no master metre in a vault somewhere. Thus making all men, and their measurements, equal.
Like many of the French revolution's ideas this didn't quite work out exactly as planned - the survey was off - and soon the demands of industry meant that the only sufficiently accurate standard was a master metre in a vault somewhere.
By redefining the metre in terms of the speed of light - which is universal (even outside Paris) we are going back to that original noble principle.
Extra Note: You don't need to go to the north pole. It's easy to measure the latitude (the angle north/south of the equator) of a place with just a telescope; you can then measure the distance on the ground between two places north/south of each other and so get a definition of the metre. This led to the discovery that the Earth isn't a sphere, so the method doesn't work quite as simply as hoped, but it did lead to a whole new science of measuring the Earth.
Answered by Martin Beckett on April 21, 2021
For historical reasons. Before the current definition, a meter was defined as the length of a particular piece of material, the prototype meter bar. Then the speed of light was measured against that length, and it turned out to be 299,792,458 bar lengths per second. This was then used to redefine the meter as the distance light travels in vacuum in 1/299,792,458 seconds, so that the old and new definitions match as closely as possible.
Answered by noah on April 21, 2021
Clearly calculations would be easier if we had $c \stackrel{\text{def}}{=} 3\times10^{8}\,\mathrm{m\,s^{-1}}$, and other physical constants would take convenient values: the free-space electric constant would be $\frac{1}{36\pi}\times 10^{-9}\,\mathrm{F\,m^{-1}}$ and the free-space wave impedance would be $120\pi\,\Omega$.
However, a primary goal in the revision of metrology standards is to cause the smallest possible disruption to former practice when any changes are made. Revised systems of measurement have to be as backwards compatible as possible. There was already a definition of the meter in place, and that ultimately came from the physical meter bar, as described in noah's answer. Changing the defined speed of light from $299{,}792{,}458\,\mathrm{m\,s^{-1}}$ to $300{,}000{,}000\,\mathrm{m\,s^{-1}}$ would change the meter by about 700 parts per million ($7\times 10^{-4}$), and that would cause major disruption to current practices in science, industry, and just about every human endeavor.
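The "about 700 parts per million" figure can be verified directly from the two values of $c$:

```python
# How much would the metre shift if c were rounded up to 3e8 m/s?
c_defined = 299_792_458  # m/s, the defined value
c_round = 300_000_000    # m/s, the hypothetical "convenient" value

relative_change = (c_round - c_defined) / c_defined
print(f"{relative_change * 1e6:.0f} ppm")  # 692 ppm, i.e. ~7e-4
```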
Answered by Selene Routley on April 21, 2021