Asked on April 27, 2021
I have just begun setting up an interferometer to take some measurements. I am trying to reproduce a paper from the literature on 3-DoF homodyne interferometry. The detector is merely a CMOS sensor. Here is a picture from the paper I am trying to reproduce: from the spatial interference pattern, the authors managed to extract information about relative tilts and displacements between the mirrors by applying FFT algorithms to each pixel line.
Applying an FFT to their picture, I got something similar to this (for only one pixel line), and it seems to agree fairly well apart from the scale.
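For reference, my per-line analysis is essentially the following (a simplified sketch with a synthetic stand-in image; the array names and parameters are placeholders, not my actual camera data):

```python
import numpy as np

# placeholder frame: 20 vertical fringes plus noise, standing in for the CMOS image
x = np.arange(640)
img = 0.5 + 0.5 * np.cos(2 * np.pi * 20 / 640 * x) + 0.01 * np.random.randn(480, 640)

line = img[240, :]                          # one pixel line
spectrum = np.fft.rfft(line - line.mean())  # remove DC, one-sided FFT
freqs = np.fft.rfftfreq(line.size)          # spatial frequency, cycles/pixel

k = int(np.abs(spectrum).argmax())          # dominant fringe-frequency bin
print(f"fringe frequency: {freqs[k]:.4f} cycles/pixel, "
      f"phase: {np.angle(spectrum[k]):.3f} rad")
```

As I understand the paper, tracking how this peak frequency and phase change from one pixel line to the next is what yields the relative tilt and displacement information.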
However, when I try to do the same with a fringe pattern from my own setup, I obtain this:
The intensity is not sinusoidal and there are plateaus where the maximum intensity is detected: is there a way to get more spacing, or smoother fringes?
Your problem is partly due to the small dynamic range (the ratio between the maximum and minimum measurable light intensities) of your detector: where the fringes overexpose the sensor, the recorded intensity saturates, and the fringe tops flatten into the plateaus you see.
This problem is often encountered when one tries to "photograph" fringe patterns, and you can usually tell whether an image is real or simulated by checking for overexposed fringes.
Sufficiently reducing the intensity of the fringes reaching your detector should let you recover the "expected" sinusoidal fringe intensity pattern.
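As a rough numeric illustration (a sketch with made-up signal parameters, not your data), clipping the tops of a sinusoidal fringe produces exactly the plateaus you describe and injects harmonics that distort the per-line FFT:

```python
import numpy as np

x = np.arange(1024)
f = 20 / 1024                                  # exactly 20 fringes across the line
ideal = 0.5 + 0.5 * np.cos(2 * np.pi * f * x)  # well-exposed sinusoidal fringe
clipped = np.clip(1.5 * ideal, 0.0, 1.0)       # gain too high: tops clip into plateaus

for name, sig in (("ideal", ideal), ("clipped", clipped)):
    spec = np.abs(np.fft.rfft(sig - sig.mean()))
    k = int(spec.argmax())                     # fundamental fringe-frequency bin
    print(f"{name}: peak bin {k}, 3rd-harmonic/fundamental ratio "
          f"{spec[3 * k] / spec[k]:.3f}")
```

Keeping the exposure low enough that no pixel reaches full scale (for instance, by checking the frame histogram before analysing) should avoid this distortion.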
Answered by Farcher on April 27, 2021