Signal Processing Asked by AtoM_84 on October 24, 2021
I am defining the required antialiasing (analog) filter for a chirp generated by a DDS. The chirp is produced in an FPGA (NCO generation) and forwarded to the DAC for sample generation (with interpolation in the DAC). Because the dynamic range is below 16 bits and the DAC naturally produces high-frequency nonlinearities, I need to include an antialiasing filter, which I am trying to define.
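For reference, here is a quick back-of-the-envelope check in Python of where the first DAC images land and how much sinc droop to expect, using the 400 MSPS / 50 MHz numbers discussed below. This is a rough sketch that models the DAC as a simple zero-order hold at the base sample rate and ignores the interpolation:

```python
import numpy as np

# Rough numbers from the question: 400 MSPS DAC, chirp content up to 50 MHz
fs = 400e6      # DAC sample rate
f_out = 50e6    # highest frequency of interest

# The first DAC image pair sits at fs - f and fs + f
print(f"First images: {(fs - f_out)/1e6:.0f} MHz and {(fs + f_out)/1e6:.0f} MHz")

# Zero-order-hold (sinc) response of a non-interpolating DAC at frequency f
def zoh_droop_db(f, fs):
    return 20 * np.log10(np.abs(np.sinc(f / fs)))  # np.sinc(x) = sin(pi*x)/(pi*x)

print(f"Sinc droop at {f_out/1e6:.0f} MHz: {zoh_droop_db(f_out, fs):.2f} dB")
print(f"Sinc rolloff at the 350 MHz image: {zoh_droop_db(fs - f_out, fs):.2f} dB")
```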
I guess I need to fulfill these requirements:

As I am not very used to filters, I read articles on the web and could work out a few rules of thumb for designing the best-suited filter for my application:
For the case considered, with a stopband at 350 MHz and a gain of ~1 at 50 MHz, it seems possible to achieve the filtering, especially with Bessel filters (order 3 or 4 is OK). But if I want to decrease the cost, and hence the sampling frequency of my DAC, it quickly becomes impossible because the stopband and the cutoff frequency get closer together, and only a Butterworth filter could do the job? But because its group delay is not flat, it is not suitable for chirp generation, am I right?
I tried to simulate this with ADIsimDDS (an interesting online tool), for instance with the AD9859 (400 MSPS, 10 bit), and my case study looks good, but I can't see any of the group delay impact with respect to the different proposed external analog filters.
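To visualize the group delay difference myself, I tried a quick comparison in Python with scipy. The 4th-order designs and the 100 MHz corner below are illustrative only, not my actual filter values:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import signal

fc = 100e6  # illustrative corner frequency, not an actual design value

# 4th-order analog prototypes; norm='mag' puts the Bessel -3 dB point at fc
# so both filters are compared at the same corner
b_bes, a_bes = signal.bessel(4, 2 * np.pi * fc, analog=True, norm='mag')
b_but, a_but = signal.butter(4, 2 * np.pi * fc, analog=True)

w = 2 * np.pi * np.linspace(1e6, 400e6, 2000)  # rad/s evaluation grid
for b, a, name in [(b_bes, a_bes, 'Bessel'), (b_but, a_but, 'Butterworth')]:
    _, h = signal.freqs(b, a, worN=w)
    gd = -np.gradient(np.unwrap(np.angle(h)), w)  # group delay = -dphi/dw
    plt.plot(w / (2 * np.pi) / 1e6, gd * 1e9, label=name)

plt.xlabel('Frequency (MHz)')
plt.ylabel('Group delay (ns)')
plt.legend()
plt.grid(True)
plt.show()
```

The Bessel trace stays essentially flat across the passband while the Butterworth group delay peaks near the corner, which is the distortion I am worried about for the chirp.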
Thanks for your answers and advice.
I recommend a digital FIR filter to compensate for both the DAC sinc droop and the added group delay distortion, combined with a simpler analog filter focused on rejecting the higher images out of the DAC with the minimum number of components (this would rule out a Bessel filter, while a Butterworth or elliptic filter would be a viable candidate). The analog filter cost and complexity can be minimized by using digital pre-distortion to equalize the analog filter's group delay variation to within a reasonable tolerance.
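As a minimal sketch of the sinc-droop part of that FIR (the group delay equalization would be a separate, complex-valued design on top of this), here is one way to fit an inverse-sinc equalizer with scipy, assuming the 400 MSPS rate and 50 MHz band from the question; the tap count and taper band are illustrative:

```python
import numpy as np
from scipy import signal

fs = 400e6   # assumed DAC sample rate, as in the question
ntaps = 31   # illustrative tap count

# Desired equalizer magnitude: 1/sinc(f/fs) over the chirp band, tapered
# to zero above it so the FIR does not boost frequencies the analog
# filter will reject anyway
freqs = np.linspace(0, fs / 2, 257)
desired = 1.0 / np.sinc(freqs / fs)               # np.sinc(x) = sin(pi*x)/(pi*x)
desired *= np.clip((100e6 - freqs) / 50e6, 0.0, 1.0)

taps = signal.firwin2(ntaps, freqs, desired, fs=fs)

# Sanity check: the FIR response times the DAC sinc should be flat to 50 MHz
w, h = signal.freqz(taps, worN=2048, fs=fs)
combined_db = 20 * np.log10(np.abs(h) * np.abs(np.sinc(w / fs)) + 1e-12)
print(f"Passband flatness 0-50 MHz: {np.ptp(combined_db[w <= 50e6]):.3f} dB")
```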
For further details on the anti-alias filter design, please see this post: Where should I set my anti-aliasing filter corner frequency for this signal?
This post may also be of interest as an implementation approach where I detail all the code for an optimized chirp for a flat FFT response which would also have minimum aliasing effects under the same sampling conditions. Instead of a time domain filter, the response is scaled with a Tukey window which provides a constant envelope chirp over most of its duration, while scaling the very start and end of the chip minimize aliasing effects. This could be done with a weighting factor of the amplitude of the chirp at the output of the DDS. Please see the very bottom of this post for more details showing a flat FFT response over most of the chirp duration: How can I plot the frequency response on a bode diagram with Fast Fourier Transform?
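A minimal sketch of that Tukey-windowed chirp idea follows; the sample rate matches the question, but the duration, sweep band, and alpha are illustrative choices, not values from the linked post:

```python
import numpy as np
from scipy import signal

fs = 400e6                     # assumed sample rate, as in the question
T = 10e-6                      # illustrative chirp duration
t = np.arange(0, T, 1 / fs)
f0, f1 = 10e6, 50e6            # illustrative chirp start/stop frequencies

x = signal.chirp(t, f0=f0, t1=T, f1=f1, method='linear')

# Tukey window: unity over most of the sweep, cosine tapers on the
# first and last 5% of the samples (alpha = 0.1)
win = signal.windows.tukey(len(t), alpha=0.1)
x_tapered = x * win

# Compare spectra: the tapered chirp has far lower out-of-band skirts
f = np.fft.rfftfreq(len(t), 1 / fs)
raw_db = 20 * np.log10(np.abs(np.fft.rfft(x)) + 1e-12)
tap_db = 20 * np.log10(np.abs(np.fft.rfft(x_tapered)) + 1e-12)
```

Since the window only touches the edges of the sweep, the FFT magnitude stays flat over most of the chirp band while the spectral skirts drop sharply, which is what reduces the aliasing products at the DAC output.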
Answered by Dan Boschen on October 24, 2021