Photography Asked by user34145 on February 17, 2021
For analog photography it's pretty easy: the film is exposed to light during the exposure time. But what about digital sensors?
As far as I know, a sensor can't "be exposed" for a long period of time; it just captures an image at an exact moment. So what I think is happening is that the "computer" in the camera polls the sensor frequently and then combines all those images into a single, long-exposure photo.
Am I correct?
No, digital exposure works the same way: the shutter is opened for a length of time and the sensor records whatever light strikes its surface over that time, just like film.
There is some technical information here: What is the structure of a photosite?
One difference between digital and film is that digital doesn't suffer from reciprocity failure.
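To make that difference concrete, here is a minimal Python sketch (my own illustration, not part of the original answer) contrasting a sensor's linear response with film's behaviour under the commonly used Schwarzschild approximation; the exponent value is purely illustrative, not a measured figure for any particular film.

```python
def digital_signal(intensity, seconds):
    """Linear accumulation: double the exposure time, double the signal."""
    return intensity * seconds

def film_effective_exposure(intensity, seconds, p=0.9):
    """Schwarzschild approximation: long exposures 'lose' sensitivity (p < 1)."""
    return intensity * seconds ** p

# Same scene brightness, increasing exposure times.
for t in (1, 10, 100, 1000):
    print(f"{t:>5} s   digital: {digital_signal(1.0, t):8.1f}   film-like: {film_effective_exposure(1.0, t):8.1f}")
```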
Correct answer by MikeW on February 17, 2021
As far as I know a sensor can't "be exposed" for a long period of time
That is wrong; you can't capture an image at an "exact moment". An image, whether on film or a digital sensor, is formed by photons hitting it over an exposure period. In the case of a film camera the photons cause chemical changes; in the case of a digital camera they create electron-hole pairs, which produce electric charges at a semiconductor junction.
The exposure period on a film camera is defined by a mechanical shutter. On a digital camera it may be defined either by a mechanical shutter or electronically, by the timings used to reset and read out the sensor.
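As a rough illustration of that accumulation process, the sketch below models a single pixel whose charge builds up from photon arrivals during the exposure; the photon rate and quantum-efficiency numbers are assumed for the example, and Poisson arrival is approximated by a Gaussian with matching mean and variance.

```python
import math
import random

def expose(photon_rate_per_s, seconds, quantum_efficiency=0.5, rng=None):
    """Toy model of one pixel: photons arrive at random during the exposure and
    each detected photon frees one electron, so charge simply builds up over time."""
    rng = rng or random.Random(0)
    mean_photons = photon_rate_per_s * seconds
    photons = max(0.0, rng.gauss(mean_photons, math.sqrt(mean_photons)))
    return int(photons * quantum_efficiency)

# Longer exposure -> more accumulated charge for the same scene brightness.
for t in (0.01, 0.1, 1.0, 10.0):
    print(f"{t:6.2f} s exposure -> roughly {expose(1000, t)} electrons")
```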
The challenge with long exposures on digital sensors is that they are relatively noisy. As well as the photoelectrically generated electron-hole pairs, there are also thermally generated ones. Many cameras perform dark frame subtraction, which helps with this but doesn't entirely solve the problem. In the serious astronomy world, cryogenically cooled CCDs are sometimes used to reduce noise, but they are impractical for regular portable cameras.
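For reference, the dark frame subtraction step itself is straightforward; this is a minimal sketch of the idea using NumPy, with array sizes and noise levels invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# 'light_frame' is a long exposure of the scene; 'dark_frame' is an equally long
# exposure taken with the shutter closed, so it records only the thermally
# generated signal (plus its own random noise), not the scene.
scene = np.zeros((4, 4))
scene[1:3, 1:3] = 200.0                                   # toy "subject"
thermal_pattern = rng.normal(50.0, 5.0, size=(4, 4))      # fixed thermal offsets

light_frame = scene + thermal_pattern + rng.normal(0.0, 2.0, size=(4, 4))
dark_frame = thermal_pattern + rng.normal(0.0, 2.0, size=(4, 4))

# Subtracting the dark frame removes most of the thermal pattern, but the random
# part of the noise in both frames remains: it helps, it doesn't fully solve it.
corrected = light_frame - dark_frame
print(corrected.round(1))
```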
Sometimes stacked exposures are used instead; the main advantage of stacking over a single long exposure is that you can throw out bad frames before merging. This can be very useful for removing the effects of atmospheric turbulence from images.
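A minimal sketch of that stacking idea, assuming the frames are already aligned and using a simple per-frame quality score to reject bad ones; the scores, threshold and data here are all made up for the example.

```python
import numpy as np

def stack_frames(frames, quality, keep_fraction=0.7):
    """Average only the best frames.

    frames  : list of 2-D arrays, assumed already registered/aligned
    quality : one score per frame, higher is better (e.g. a sharpness metric)
    """
    order = np.argsort(quality)[::-1]                     # best frames first
    keep = order[: max(1, int(len(frames) * keep_fraction))]
    return np.mean([frames[i] for i in keep], axis=0)

# Toy example: 10 noisy frames of the same scene, two of them ruined.
rng = np.random.default_rng(1)
truth = np.full((3, 3), 100.0)
frames = [truth + rng.normal(0.0, 10.0, truth.shape) for _ in range(10)]
quality = [1.0] * 10
frames[3] = frames[3] + 80.0; quality[3] = 0.1            # e.g. blurred by turbulence
frames[7] = frames[7] - 80.0; quality[7] = 0.1

print(stack_frames(frames, quality).round(1))
```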
Answered by Peter Green on February 17, 2021
A CCD sensor has a capacitor-like element in each pixel, so each pixel acts as an analog unit comparable to a silver halide crystal in the film plane: the longer light comes in, the greater the voltage that builds up, until the processor decides to read out the matrix. A CMOS sensor is read out differently: each pixel has its own MOSFET amplifier, so the data are buffered into memory as a mapped image, and the processor reads the very dim signal, converts it to a digital code, and then uses an algorithm to encode the final image. That is quite a complex process compared to analog photography.
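To connect this with the earlier answers, here is a rough sketch of the readout step the answer describes: the accumulated charge is clipped at the pixel's capacity, amplified, and quantized by an ADC into the digital value the processor then encodes. The full-well, gain and bit-depth figures are invented for the example, not taken from any real camera.

```python
def read_out(electrons, full_well=30000, gain_e_per_dn=8.0, bits=12):
    """Toy readout: clip the accumulated charge at the pixel's full-well capacity,
    apply the amplifier gain, and quantize the result to an ADC code ('digital number')."""
    electrons = min(electrons, full_well)
    dn = int(electrons / gain_e_per_dn)
    return min(dn, 2 ** bits - 1)

# Even a very dim pixel ends up as a small but non-zero digital value
# once its accumulated charge is amplified and digitized.
for e in (40, 400, 4000, 40000):
    print(f"{e:>6} electrons -> DN {read_out(e)}")
```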
Answered by Lan... on February 17, 2021