Photography Asked by legokangpalla on July 14, 2021
Background:
This question is largely inspired by one of the answers in:
What limits the size of digital imaging sensors?
That answer mentions very large image sensors made by “quilting” or “tiling” smaller CMOS sensors onto a single piece of silicon. I’ve seen quite a few astronomy images that use this kind of humongous sensor.
Here is one example image:
Question:
How can you eliminate the horizontal and vertical strips of missing image data? Is this even possible, or are these astrophotography images heavily photoshopped?
How to do it depends on the width of the lines. Narrow ones, for "pretty picture" purposes, can be healed using basic nearest-neighbour calculations. (Obviously this can't be used for scientific analysis of the raw data!)
Phase One, in some of their earlier sensors, had "bad pixel columns", which appear as thin black lines in the resulting images if not corrected. They "heal" these in software using weighted nearest-neighbour calculations.
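The weighted nearest-neighbour idea can be sketched in a few lines of NumPy. This is a hypothetical illustration, not Phase One's actual algorithm: for each masked (dead) pixel, scan in the four cardinal directions for the nearest valid pixel and average those values, weighting each by inverse distance.

```python
import numpy as np

def heal_dead_lines(img, mask):
    """Fill dead pixels (mask == True) with a weighted average of the
    nearest valid neighbour found in each cardinal direction.
    Hypothetical sketch of the weighted nearest-neighbour healing
    described above; not suitable for scientific analysis."""
    out = img.astype(float).copy()
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            if not mask[r, c]:
                continue
            neighbours = []
            # scan left, right, up, down for the nearest valid pixel
            for dr, dc in ((0, -1), (0, 1), (-1, 0), (1, 0)):
                rr, cc, dist = r + dr, c + dc, 1
                while 0 <= rr < rows and 0 <= cc < cols and mask[rr, cc]:
                    rr += dr
                    cc += dc
                    dist += 1
                if 0 <= rr < rows and 0 <= cc < cols:
                    # weight each neighbour by inverse distance
                    neighbours.append((img[rr, cc], 1.0 / dist))
            if neighbours:
                vals, wts = zip(*neighbours)
                out[r, c] = np.average(vals, weights=wts)
    return out

# Usage: a 3x3 gradient with a dead middle column
img = np.array([[0, 0, 4],
                [0, 0, 4],
                [0, 0, 4]])
mask = np.zeros_like(img, dtype=bool)
mask[:, 1] = True  # simulate a bad pixel column
healed = heal_dead_lines(img, mask)
```

Each dead pixel in the middle column gets the average of its left (0) and right (4) neighbours, i.e. 2.0. For the wide inter-tile gaps of a quilted sensor, real pipelines instead dither the telescope between exposures so the gaps are covered by actual data.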
Answered by RobbieAB on July 14, 2021