
Why are camera sensors green?

Photography · Asked on December 31, 2020

When I look at a CMOS sensor, it's green.

But photos of CCD sensors on the internet show pink sensors.

So what exactly defines the color of a camera sensor? In particular, what defines the colors of the sensors in a gorgeous 3CCD camcorder?

I looked at the CMOS sensor in sunlight. Would there be a difference if I looked at it in a dark room with a perfectly white flashlight in my hand?

4 Answers

The color you see when you look at a "sensor" is usually determined by the combined colors of the color filter array placed directly in front of the actual silicon chip, together with the other filters (low-pass, IR, UV) in the "stack" in front of the sensor.

Although we call them "red", "green", and "blue", the colors of most Bayer masks are as follows (the sketch after this list illustrates the layout):

  • 50% "green" pixels that are centered around 530-540 nm and significantly sensitive to light ranging from about 460 nm to past 800 nm, at the edge of the infrared range. The "color" of 540 nm light is perceived by most humans as a slightly bluish green.
  • 25% "blue" pixels that are centered around 460 nm and significantly sensitive to light ranging from the non-visible ultraviolet range to about 560 nm. The "color" of 460 nm light is perceived by most humans as a bluish violet.
  • 25% "red" pixels that are centered around 590-600 nm and significantly sensitive to light ranging from about 560 nm to well into the infrared range. The "color" of 600 nm light is perceived by most humans as a yellowish orange. (What we call "red" is on the other side of orange, at about 640 nm.)
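To make the layout concrete, here is a minimal Python sketch (assuming NumPy) that tiles an RGGB mosaic and reports the proportions above; the peak wavelengths are just the rough figures from this list, not measurements of any particular sensor.

```python
# Minimal sketch of an RGGB Bayer mosaic (assumes NumPy). The peak wavelengths
# are the approximate figures quoted above, not data for any specific sensor.
import numpy as np

def bayer_pattern(height, width):
    """Return an array of filter labels ("R", "G", "B") laid out as an RGGB mosaic."""
    pattern = np.empty((height, width), dtype="<U1")
    pattern[0::2, 0::2] = "R"   # even rows, even columns
    pattern[0::2, 1::2] = "G"   # even rows, odd columns
    pattern[1::2, 0::2] = "G"   # odd rows, even columns
    pattern[1::2, 1::2] = "B"   # odd rows, odd columns
    return pattern

approx_peak_nm = {"R": 600, "G": 540, "B": 460}  # yellow-orange, bluish green, blue-violet

mosaic = bayer_pattern(6, 6)
labels, counts = np.unique(mosaic, return_counts=True)
for label, count in zip(labels, counts):
    print(f"{label}: {count / mosaic.size:.0%} of photosites, peak near {approx_peak_nm[label]} nm")
# Prints 25% B, 50% G, 25% R, matching the proportions listed above.
```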

The "color" components of the Bayer mask can be seen by looking at spectral response curves for various sensors:

[Images: spectral response curves for various sensors]

The "colors" to which each type of cone in the human retina is most sensitive are similar:

[Image: spectral sensitivity curves of the three cone types in the human retina]

Here is a representation of the "colors" humans perceive at various wavelengths of light:

[Image: the visible spectrum, showing the perceived color at each wavelength]

Please compare the peaks of the sensitivities above with the "colors" of those wavelengths along the visible spectrum.

There is no filter on most tri-color imaging sensors that is centered on what we call "red", all of the drawings on the internet depicting CMOS sensors with Bayer filter arrays notwithstanding.

[Image: typical internet illustration of a Bayer filter array drawn with red, green, and blue squares]

Most CMOS sensors placed in cameras used for the kinds of images we consider "photography" here have a "stack" of filters in front of the Bayer color filter array that includes both infrared (IR) and ultraviolet (UV) cut filters. Most also include a low-pass "anti-aliasing" filter. Even sensor designs said to have "no low-pass filter" tend to have either a cover glass with the same refractive index or the two components of a low-pass filter oriented so that the second cancels the effect of the first.

[Images: diagrams of the filter stack in front of a CMOS sensor]
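The optical low-pass filter itself works by birefringence, splitting each point of light into slightly displaced copies about one photosite apart. Purely as a rough digital analogy (not the actual optical mechanism), averaging each pixel with its immediate neighbours shows the same kind of suppression of detail finer than the pixel pitch:

```python
# Rough digital analogy for an optical low-pass filter (not the real optical
# mechanism): spread each pixel over its right, lower, and lower-right
# neighbours, mimicking a 4-way point split of about one pixel pitch.
import numpy as np

def four_point_spread(image):
    padded = np.pad(image, ((0, 1), (0, 1)), mode="edge")
    return (padded[:-1, :-1] + padded[:-1, 1:] +
            padded[1:, :-1] + padded[1:, 1:]) / 4.0

# A one-pixel checkerboard is the worst case for aliasing; after the spread it
# becomes nearly uniform, i.e. detail at the pixel pitch is removed before sampling.
checker = (np.indices((6, 6)).sum(axis=0) % 2).astype(float)
print(four_point_spread(checker))
```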

What one sees when looking into the front of a camera at an exposed CMOS sensor is the combined effect of light reflecting off all of these filters. It is dominated by the slightly bluish-green tint of the "green" filtered portions of the Bayer mask, combined with half as many blue-violet and orange-yellow filtered portions that we call "blue" and "red". When the sensor is sitting inside an actual camera, most of the light striking it and the stack in front of it comes from a fairly narrow range of angles and is usually fairly uniform in color. (The purple tint on the edge of the Sony sensor is probably due to reflections of light at just the right angles off the UV and/or IR cut filters.)

[Images: exposed camera sensors showing a greenish reflection]
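As a back-of-the-envelope illustration of why that blend reads as green, averaging one RGGB cell's worth of filter tints lands on a green-dominated color; the RGB triplets below are only illustrative guesses at the tints described above, not measured values.

```python
# Back-of-the-envelope mix of one 2x2 RGGB cell. The RGB triplets are
# illustrative guesses at the filter tints, not measured values.
bluish_green  = (60, 200, 140)   # "green" filter
blue_violet   = (70, 62, 220)    # "blue" filter
orange_yellow = (230, 150, 40)   # "red" filter (really yellow-orange)

cell = [bluish_green, bluish_green, blue_violet, orange_yellow]  # 2 G : 1 B : 1 R
mix = tuple(round(sum(channel) / len(cell)) for channel in zip(*cell))
print(mix)  # (105, 153, 135): green dominates, so the blended reflection
            # reads as a muted bluish green.
```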

When light from a wide range of angles falls on such a sensor without the filter "stack" in front of it, a prismatic effect also becomes evident, showing a fuller range of colors due to the shapes of the microlens surfaces on top and the colors of the Bayer mask sandwiched between the microlenses and the sensor.

[Image: a sensor without its filter stack, showing a fuller range of prismatic colors]

Correct answer by Michael C on December 31, 2020

An unfiltered CCD or CMOS sensor looks much like any other silicon integrated circuit with a very regular, repeating structure of similar feature size: semi-metallic gray (from the silicon, quartz, and aluminum), with some iridescence probably resulting from diffraction-grating effects in the fine, repeating structures. Compare a bare DRAM or flash memory chip.
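For a sense of how that iridescence arises, the grating equation d·sin(θ) = m·λ puts different visible wavelengths at slightly different first-order diffraction angles for a photosite-scale pitch; the 4 µm pitch below is just a typical figure, not the pitch of any particular chip.

```python
# Grating-equation sketch: d * sin(theta) = m * wavelength. The 4 micron pitch
# is just a typical photosite spacing, not the pitch of any particular chip.
import math

pitch_nm = 4000  # assumed ~4 micrometre photosite pitch

for wavelength_nm in (450, 550, 650):  # blue, green, red
    theta = math.degrees(math.asin(wavelength_nm / pitch_nm))  # first order, m = 1
    print(f"{wavelength_nm} nm diffracts at about {theta:.1f} degrees")
# Each wavelength leaves at a slightly different angle, so the reflection off
# the repeating structure shifts color as you tilt the chip.
```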

A filtered sensor typical of a color video or still camera will appear greenish because the color filter matrices in common use are heavily green-biased (two green pixels for every red and blue pixel), since a similar perceptual bias is well known to exist in the human eye (even in non-green-eyed individuals :) ).
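The same perceptual bias shows up in how luminance is conventionally computed from RGB. For instance, the Rec. 709 luma weights (a broadcast standard, not anything specific to Bayer sensors) give green by far the largest share, which is the bias the 2:1:1 layout plays to:

```python
# Rec. 709 relative-luminance weights (a standard definition, not specific to
# Bayer sensors) show how heavily perceived brightness leans on green.
REC709 = {"R": 0.2126, "G": 0.7152, "B": 0.0722}

def luma(r, g, b):
    """Relative luminance of a linear-light RGB triplet per Rec. 709."""
    return REC709["R"] * r + REC709["G"] * g + REC709["B"] * b

print(luma(1, 0, 0), luma(0, 1, 0), luma(0, 0, 1))
# 0.2126 0.7152 0.0722 -- pure green contributes far more perceived brightness
# than pure red or blue.
```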

Answered by rackandboneman on December 31, 2020

I've personally seen sensors of various colors in different cameras: green, pink, blue, etc. Without the specific dimensions and construction details it is tough to say, but I'd imagine that the color on most sensors is set by the thickness of the coatings on top of the sensor. Different thicknesses produce different colors due to thin-film interference: depending on the coating thickness, some wavelengths of light (i.e. colors) destructively interfere with themselves in the coating, and the wavelengths that are not cancelled reflect back to give you the color you see.
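As a rough sketch of that idea: for an ideal single-layer coating in which both interface reflections pick up the same phase shift, two-beam interference makes the reflected intensity vary as cos²(2πnd/λ), so changing the thickness d shifts which wavelengths reflect strongly. The refractive index and thicknesses below are made-up illustrative numbers, not data for any real sensor coating.

```python
# Rough two-beam thin-film sketch: for an ideal single layer where both
# interface reflections get the same phase shift, reflected intensity varies
# as cos^2(2*pi*n*d / wavelength). n and d are made-up illustrative numbers.
import math

def relative_reflectance(wavelength_nm, thickness_nm, n=1.45):
    phase = 2 * math.pi * n * thickness_nm / wavelength_nm
    return math.cos(phase) ** 2

for thickness_nm in (90, 120, 150):  # different coating thicknesses
    visible = range(400, 701, 10)
    strongest = max(visible, key=lambda w: relative_reflectance(w, thickness_nm))
    weakest = min(visible, key=lambda w: relative_reflectance(w, thickness_nm))
    print(f"{thickness_nm} nm film: reflects ~{strongest} nm most, ~{weakest} nm least")
```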

Answered by Shamtam on December 31, 2020

Photographic film is naturally sensitive only to violet and blue light. Hermann Vogel, a professor at the Berlin Technical Institute, was attempting to solve the problem of "halation". He had some emulsions dyed yellow to arrest blue-light exposure caused by reflections from the emulsion-base interface. It worked, but to his amazement the film also gained sensitivity to green light (orthochromatic). His graduate students later discovered other dyes that sensitize emulsions to red light. This was an important step: emulsions sensitive to red, green, and blue yielded correct monochromatic rendering. These tweaked emulsions made later color films possible.

As the CCD and CMOS sensor evolved, it was also necessary to tweak them for RGB sensitivity. Bryce Bayer of Eastman Kodak developed a subpixel matrix scheme that coats the individual photosites with strong additive color filters. The scheme is approximately 50% green, 25% blue, and 25% red filters. This tweaks the overall sensitivity so that a more faithful image results.

Because the image sensor is highly sensitive to infrared radiation, the entire imaging surface is filtered, and this flat cover glass does dual duty by also protecting the fragile surface from abrasion. A cover glass is highly polished, so, like a polished lens, it incurs light loss due to surface reflection.

Robert Taylor, a London optician, discovered that aged lenses acquire a natural coat of grime from air pollution. These "bloomed" lenses reflected away only about 2% of the light, whereas a new lens reflected away about 8%. Artificial blooming (coating) took hold in the 1930s.

A coated lens or cover glass appears dichroic: it looks one color by transmission and the complementary color by reflection. Say the coating is designed to control red and blue reflections; the lens then appears green by reflected light and magenta by transmitted light. Because most such glass is multi-coated, casual observation gives little clue as to which color is being mitigated.
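As a toy illustration of that complementarity (treating the coating as lossless, so whatever fraction of each channel is reflected, the remainder is transmitted; the reflectance numbers are made up):

```python
# Toy illustration of dichroic complementarity: for a lossless coating,
# transmitted = 1 - reflected per channel, so the reflected and transmitted
# tints are complements. The reflectance values are made-up numbers.
reflectance = {"R": 0.05, "G": 0.60, "B": 0.05}  # a coating that reflects mostly green

reflected   = dict(reflectance)
transmitted = {c: 1.0 - r for c, r in reflectance.items()}

print("reflected:  ", reflected)    # green-dominated -> looks green by reflection
print("transmitted:", transmitted)  # red/blue-dominated -> looks magenta by transmission
```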

Answered by Alan Marcus on December 31, 2020
