Signal Processing Asked on January 4, 2022
My receiver contains AGC, timing recovery, and carrier recovery blocks.
I am using Gardner timing recovery with a loop filter and a Farrow parabolic interpolation filter.
The control input to the Farrow filter is the fractional delay. I expected the fractional delay to settle to a roughly constant value, with only small variations, by the end of the preamble, but it does not converge within the preamble time.
Between two data bursts only a pure carrier is sent, so there are no transitions for the
timing recovery to work on. As a result, immediately after a data burst there are large variations in the fractional delay, which cause timing errors. Any suggestions?
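For context, here is a minimal Python/NumPy sketch of the blocks described above: a piecewise-parabolic Farrow interpolator driven by a fractional delay mu, a Gardner TED, and a proportional-plus-integrator loop filter. The function names, loop gains, and indexing convention are illustrative assumptions, not the poster's actual code.

```python
import numpy as np

def farrow_parabolic(x, m, mu, alpha=0.5):
    """Piecewise-parabolic (Farrow) interpolant between x[m] and x[m+1].
    mu is the fractional delay in [0, 1); alpha = 0.5 is the usual
    parabolic choice.  Uses samples x[m-1] .. x[m+2]."""
    v2 = alpha * (x[m + 2] - x[m + 1] - x[m] + x[m - 1])
    v1 = (-alpha * x[m + 2] + (1 + alpha) * x[m + 1]
          - (1 - alpha) * x[m] - alpha * x[m - 1])
    v0 = x[m]
    return (v2 * mu + v1) * mu + v0       # Horner evaluation in mu

def gardner_ted(curr, prev, mid):
    """Gardner timing error at 2 samples/symbol: midpoint strobe times the
    symbol-to-symbol transition.  It is ~0 when there is no transition,
    which is why a carrier-only gap gives the loop nothing to work with."""
    return np.real(np.conj(mid) * (curr - prev))

def pi_loop_filter(e, integrator, kp=0.01, ki=1e-4):
    """Proportional-plus-integrator loop filter producing the control that
    updates the fractional delay / NCO (gains here are placeholders)."""
    integrator += ki * e
    return kp * e + integrator, integrator
```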
If your data isn't white, or with certain TEDs, there is a lot of self-noise (jitter) in the timing error.
If you're using a preamble, why not do a data-aided least-squares or grid-search timing estimate over the preamble and then heavily damp your non-data-aided (NDA) detector? Or don't even bother tracking and just lock the timing down until the next burst.
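To make that concrete, here is a minimal sketch (Python/NumPy, reusing the farrow_parabolic helper from the sketch above) of a grid search over the fractional delay on the known preamble, followed by freezing the loop during the carrier-only gap. The function name, grid size, and correlation metric are assumptions for illustration.

```python
import numpy as np

def preamble_timing_search(x, preamble, start, sps=2, n_grid=64, alpha=0.5):
    """Data-aided timing estimate: try a grid of fractional delays over the
    known preamble and keep the one whose interpolated strobes correlate
    best with the reference symbols.  This resolves only the fractional
    part; the integer offset `start` is assumed known from frame sync."""
    best_mu, best_metric = 0.0, -np.inf
    for mu in np.linspace(0.0, 1.0, n_grid, endpoint=False):
        strobes = np.array([farrow_parabolic(x, start + k * sps, mu, alpha)
                            for k in range(len(preamble))])
        metric = abs(np.vdot(preamble, strobes))   # coherent correlation magnitude
        if metric > best_metric:
            best_mu, best_metric = mu, metric
    return best_mu

# During the carrier-only gap: hold mu at the preamble estimate and set the
# loop gains to zero (or near zero), e.g.
#   kp_gap, ki_gap = 0.0, 0.0
# then restore the tracking gains at the start of the next burst.
```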
Answered by FourierFlux on January 4, 2022