Physics Asked by surd100 on January 5, 2021
Some authorities have stated publicly, without explanation, that if the theories of Special and General Relativity were not taken into account in the design of GPS (by building the satellite clocks to run about 38 µs/day slower than GPS time before launch, the so-called 'factory offset'), the position indicated by an earthbound GPS user device would drift by about 11 km/day. I have considered this for various GPS models but can only predict much smaller effects. Multiplying the uncorrected 38.6 µs/day difference from GPS time by the speed of light does yield about 11.6 km/day, but that product does not seem to me to relate to how a GPS receiver functions. I'd be very glad for any pointers.
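For reference, the arithmetic behind that headline figure is simply the daily clock-rate offset multiplied by the speed of light:

```python
# Sanity check of the figure quoted above: the ~38.6 us/day relativistic
# clock-rate offset, multiplied by the speed of light, gives the
# often-quoted ~11.6 km/day number.
C = 299_792_458.0          # speed of light, m/s
offset_per_day = 38.6e-6   # factory clock offset, seconds per day

range_error_per_day = C * offset_per_day  # metres per day
print(f"{range_error_per_day / 1000:.1f} km/day")  # -> 11.6 km/day
```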
See also Why does GPS depend on relativity?
The satellites' clocks are corrected for GR and SR, but this is (mostly) irrelevant to how the GPS system works. Your receiver compares the differences between the times broadcast by a number of different satellites.
Whether this time is in 'Earth' seconds or in 'space' seconds sped up by about 1 part in 10^10 is, to first order, irrelevant, so long as all the satellites experience the same effect. So the choice is either to broadcast at 10.23 MHz and let the signal arrive at a slightly different frequency on the ground, or to adjust the onboard frequency to 10.22999999543 MHz so that it is 10.23 MHz on the ground.
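The onboard frequency adjustment can be reproduced numerically; a minimal sketch, assuming the 38.6 µs/day rate offset and the 10.23 MHz fundamental quoted above:

```python
# Sketch of the onboard frequency offset described above. The satellite
# clock runs fast by a fractional rate of ~4.47e-10 (about 38.6 us/day),
# so the 10.23 MHz fundamental is set slightly low before launch.
F_NOMINAL = 10.23e6                      # Hz, as observed on the ground
SECONDS_PER_DAY = 86_400.0
rate_offset = 38.6e-6 / SECONDS_PER_DAY  # fractional frequency offset

f_onboard = F_NOMINAL * (1.0 - rate_offset)
print(f"{f_onboard:.5f} Hz")  # -> 10229999.99543 Hz, i.e. 10.22999999543 MHz
```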
I think this is where the urban legend of the 'USAF didn't believe in relativity and weren't going to correct the clocks' comes from.
Of course, although your position relative to the satellites is unaffected by the time dilation, each satellite's own knowledge of time, and hence of its position in its orbit, would accumulate an error. To allow you to find your absolute position, each satellite also broadcasts its own orbit data and the time, allowing your receiver to calculate the satellite's position in space.
The satellites are in orbit at about 20,000 km altitude, roughly 26,500 km from the centre of the Earth, so each orbit is about 166,500 km, which they cover every 12 hours. An error of 38.6 µs/day over a path of 333,000 km/day still gives a position error (of the satellite) of only a fraction of a metre, although this accumulates with time.
This could be corrected by giving the satellites an adjusted figure for their orbital speed, or by updating their ephemeris as they pass over the ground station.
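The fraction-of-a-metre estimate can be checked directly from the quoted orbit numbers; a minimal sketch, assuming a circular orbit at ~26,500 km radius:

```python
import math

# Rough check of the satellite along-track position error quoted above:
# GPS orbital radius ~26,500 km, period ~12 h, so a 38.6 us timing error
# maps to only a fraction of a metre along the orbit.
R_ORBIT = 26_500e3                 # m, orbital radius (quoted figure)
PERIOD = 12 * 3600.0               # s, orbital period
clock_error = 38.6e-6              # s, accumulated per day

v_orbit = 2 * math.pi * R_ORBIT / PERIOD   # orbital speed, m/s
along_track_error = v_orbit * clock_error  # m, per day of clock drift
print(f"speed = {v_orbit:.0f} m/s, error = {along_track_error:.2f} m/day")
```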
Answered by Martin Beckett on January 5, 2021
If you look at the Wikipedia page about GPS and relativistic corrections, it makes clear that this 10 km/day drift applies to the 'pseudoranges', the initial distances calculated between the receiver and each satellite. This error would cancel out in solving the triangulation problem to obtain the receiver position, since it is an equal error in all the satellite clocks.
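The cancellation can be demonstrated with a toy four-satellite fix. The geometry below is invented for illustration (not real ephemerides), and the solver is a plain Gauss-Newton iteration on position plus receiver clock bias:

```python
import numpy as np

# Toy pseudorange fix (assumed geometry, not real ephemerides): solve for
# receiver position and clock bias from four satellites, then show that
# adding the same ~11.6 km error to every pseudorange leaves the position
# unchanged -- it is absorbed entirely into the receiver clock-bias term.

def solve_fix(sats, pseudoranges, iters=10):
    """Gauss-Newton solve for (x, y, z, clock_bias_in_metres)."""
    est = np.zeros(4)
    for _ in range(iters):
        d = np.linalg.norm(sats - est[:3], axis=1)       # geometric ranges
        residual = pseudoranges - (d + est[3])
        # Jacobian rows: unit vector from satellite towards receiver, then 1
        J = np.hstack([(est[:3] - sats) / d[:, None], np.ones((len(sats), 1))])
        est += np.linalg.lstsq(J, residual, rcond=None)[0]
    return est

sats = np.array([[26560e3, 0, 0],
                 [0, 26560e3, 0],
                 [0, 0, 26560e3],
                 [15335e3, 15335e3, 15335e3]])  # all at GPS orbital radius
receiver = np.array([6371e3, 0.0, 0.0])         # on the Earth's surface

rho = np.linalg.norm(sats - receiver, axis=1)   # ideal pseudoranges, bias = 0
common = 11.6e3                                 # 38.6 us * c, same on all sats

fix_clean = solve_fix(sats, rho)
fix_offset = solve_fix(sats, rho + common)

pos_shift = np.linalg.norm(fix_offset[:3] - fix_clean[:3])
print(f"position shift: {pos_shift:.6f} m")                 # effectively zero
print(f"bias shift: {fix_offset[3] - fix_clean[3]:.1f} m")  # ~11,600 m
```

Because the same error sits on every pseudorange, it is mathematically indistinguishable from a receiver clock offset, which the fix solves for anyway.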
Here is my guess as to why they chose to correct this effect: the individual clocks all have some drift as well, and are periodically synced to a master timebase on earth. If they were allowed to drift so drastically from the master, it would be necessary to adjust every clock simultaneously, or navigation would be completely out of whack. Somewhat simpler to just adjust individual units as their drift becomes noticeable.
Answered by user2963 on January 5, 2021
Found the answer after drawing a blank with several experts. Two US professors of high GPS pedigree independently explained that the '10 km/day' claim presupposes that between one and three of the satellites used for a four-satellite fix do not incorporate the 38 µs/day clock-rate ('factory') offset. They also remarked that GPS is often used as a time source, where observed time shifts are clearly important.
I and others have been vexed by several scientific authorities publicly repeating the 10 km/day position-error claim without any mention of that presupposition. The question is resolved, but the presupposition seems strange, because relativity shifts all the observed satellite clock rates approximately equally. It seems only to serve to show that GPS position finding is about as susceptible to transmitter clock differences as radio-location systems such as Loran, where relativity is not a consideration.
Sincere thanks to those who replied to my question.
Answered by surd100 on January 5, 2021
I looked at the 10 km/day-if-38 µs/day-uncorrected claim several years ago and found it was based on a model in which either one, or all but one, of the GPS satellites used to fix the observer's position has an uncorrected clock. This model bears no relation to any sensible GPS system, but many lecturers appear not to realise it, so 10 km/day became received wisdom. Not offsetting the satellite clocks by 38 µs/day would yield only millimetre-level location errors, but it would use up the ±1 ms range of the time-correction system described below.
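To see where the kilometre-scale figure comes from, here is a toy four-satellite fix (invented geometry, not real ephemerides) in which only one pseudorange carries the 11.6 km error corresponding to an uncorrected clock:

```python
import numpy as np

# Toy illustration of the model behind the 10 km/day claim: apply the
# ~11.6 km range error (38.6 us * c) to only ONE of the four satellites
# in a fix, instead of to all of them, and the position solution is
# thrown off by kilometres rather than millimetres.

def gauss_newton_fix(sats, rho, iters=15):
    est = np.zeros(4)  # (x, y, z, receiver clock bias in metres)
    for _ in range(iters):
        d = np.linalg.norm(sats - est[:3], axis=1)
        J = np.hstack([(est[:3] - sats) / d[:, None], np.ones((len(sats), 1))])
        est += np.linalg.lstsq(J, rho - (d + est[3]), rcond=None)[0]
    return est

sats = np.array([[26560e3, 0, 0], [0, 26560e3, 0],
                 [0, 0, 26560e3], [15335e3, 15335e3, 15335e3]])
truth = np.array([6371e3, 0.0, 0.0])            # receiver on Earth's surface
rho = np.linalg.norm(sats - truth, axis=1)      # ideal pseudoranges

err = np.zeros(4)
err[0] = 11.6e3                                 # one uncorrected satellite clock
fix = gauss_newton_fix(sats, rho + err)
pos_err = np.linalg.norm(fix[:3] - truth)
print(f"position error: {pos_err / 1000:.1f} km")  # several kilometres
```

The error no longer looks like a common receiver clock offset, so it leaks directly into the position solution, scaled by the geometry of the fix.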
I also learnt that every satellite transmits data on its own clock error, derived from earthbound time standards and uploaded from ground stations, allowing the observer to determine precise time.
Hope this fills in the picture on timing a bit. I would have loved to know more but could not find anything on how those satellite time error signals are derived.
Answered by surd100 on January 5, 2021