
How does shooting on dedicated monochrome digital cameras compare to shooting in monochrome mode on full-colour digital cameras?

Photography · Asked on July 9, 2021

For an example of a dedicated monochrome camera, see Leica’s M10 Monochrom.

To the best of my understanding, the major optical difference here is that the use of a monochrome sensor means you don’t need a Bayer (or similar) filter, since your sensor can only pick up a B&W signal anyway.

What repercussions does this have in terms of the shooting experience? Does the removal of the Bayer filter mean you get measurably more light to the sensor, allowing for better low-light shooting?

I’m mentioning the Leica here specifically since it’s got a full-colour-shooting cousin, the normal M10, which can probably provide a good baseline in case anyone here’s shot on both, but a more general/theoretical answer is also welcome.

3 Answers

The biggest advantage of shooting with an unmasked monochrome sensor is that no demosaicing is needed, so there is no accompanying loss of resolution. Demosaicing is required because of the color filter array. The loss when using color sensors is not as bad as one might first think, though. Since a Bayer mask uses four-cell groups with R-G-G-B color filters, it would seem at first glance that a Bayer masked sensor would have 1/4 the resolution of a non-masked sensor with the same number and size of photosites. But Bayer masked sensors don't combine four masked sensels to get the values for one RGB pixel in the resulting image. They use demosaicing algorithms to interpolate all three values¹ for each RGB pixel, producing an image with the same number of pixels as the sensor has photosites (a/k/a sensels or pixel wells). It turns out that, assuming the lens can outresolve the sensor (which is nowhere near as often the case now, with many high resolution cameras in common use for creative photography, as it was a decade or more ago), the absolute resolution capability of a Bayer masked sensor is about 1/√2 that of an unmasked monochrome sensor of the same size and with the same number of sensels.
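To make the demosaicing step concrete, here is a minimal sketch of plain bilinear interpolation over an RGGB Bayer mosaic, written in Python with NumPy/SciPy. It is only an illustration of the idea; the function name, kernels, and layout assumption are mine, and real raw converters use far more sophisticated, edge-aware algorithms.

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(mosaic):
        """Bilinear demosaic of an RGGB Bayer mosaic (H x W float array).

        Assumed layout per 2x2 tile:  R G
                                      G B
        Returns an H x W x 3 RGB image with as many pixels as the sensor
        has photosites, which is why a Bayer sensor does not simply have
        1/4 the resolution of an unmasked one.
        """
        h, w = mosaic.shape
        r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
        b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
        g_mask = 1 - r_mask - b_mask

        # Standard bilinear kernels: green sites form a checkerboard,
        # red and blue sites a sparser grid, hence the two kernels.
        k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

        rgb = np.empty((h, w, 3))
        rgb[..., 0] = convolve(mosaic * r_mask, k_rb, mode='mirror')
        rgb[..., 1] = convolve(mosaic * g_mask, k_g,  mode='mirror')
        rgb[..., 2] = convolve(mosaic * b_mask, k_rb, mode='mirror')
        return rgb

    # Usage with a stand-in mosaic (real data would come from a raw file):
    # rgb = demosaic_bilinear(np.random.rand(8, 8))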

On the other hand, there is one obvious advantage of using Bayer masked digital cameras instead of B&W sensors or film: color information is preserved.

With color sensors we can apply different white balance and colored filters to the same shot after the fact. This flexibility after the fact can be a huge advantage.

It's not obvious at first glance, but even monochrome sensors are more responsive to some wavelengths of light than others. In the early days of chemical photography, this was even more pronounced. The earliest B&W emulsions spread upon glass plate negatives were only sensitive to blue light. Orthochromatic emulsions gradually extended that response to include green and yellow light. Eventually panchromatic film emulsions that were sensitive to the entire visible spectrum were created. But even panchromatic film does not have a perfectly flat response to all wavelengths of light. Neither does an unmasked monochrome sensor.

But beyond the theoretical aim of a monochromatic sensor that is equally sensitive to all wavelengths of light, many times we want to make our monochrome photo give objects of certain colors more or less brightness than other objects of other colors that are equally bright in the scene we are photographing. The use of color filters to alter the tonal values of various objects of different colors when using B&W film has been around almost as long as photography has. In a sense, B&W film photographers had to do their "past processing" with regard to tonal values based on color before they exposed their film. I call it "past processing" because once the film is exposed, the opportunity to do it is now in the past.

Even if the final output is intended to be monochrome, changing color temperature and white balance after the fact allows us to alter the tones (brightness levels along the grey scale between full black and full white) of differently colored objects. The adjustment of the ratio of tonal values between objects of different colors can be even more pronounced when using digitally applied color filters.

Having latent color information in a raw digital file allows us to apply various color filters after the fact, just like we can alter the color temperature and white balance after the fact. Digitally applied color filters can be customized and optimized individually for a specific photo in much the same way that we can use an almost infinite variety of color temperature and white balance correction combinations. Color temperature and white balance modifications pull all of the colors in an image in one direction or another. Color filters, whether digital or physical, influence different colors by more varied amounts than global CT/WB adjustments do. Bayer masked digital sensors allow us to create the equivalent of as many customized film emulsions as we have the time to create, either color or B&W, for every shot we save in raw format. They also allow us to create as many customized color filters with a specific color and specific density as we have time to create. And we can do all of this after the shot has been captured!
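As a rough illustration of what a "digital color filter" amounts to in post, the sketch below (my own, not tied to any particular raw converter) mixes the three channels of an already-demosaiced, linear RGB image into a single grey channel with adjustable weights. Weighting red heavily mimics a red filter; weighting green heavily mimics a green filter.

    import numpy as np

    def filtered_monochrome(rgb, weights):
        """Mix R, G, B into one grey channel with the chosen weights.

        rgb is an H x W x 3 array in linear light; weights is a 3-vector,
        e.g. (0.8, 0.15, 0.05) for a strong "red filter" look or
        (0.1, 0.8, 0.1) for a "green filter" look. Normalising the weights
        keeps overall brightness roughly comparable between "filters".
        """
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        return rgb @ w  # per-pixel weighted sum of the three channels

    # The same shot rendered two different ways, after the fact:
    # red_look   = filtered_monochrome(img, (0.8, 0.15, 0.05))
    # green_look = filtered_monochrome(img, (0.1, 0.8, 0.1))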

When shooting with an unmasked monochrome sensor, the tonal relationships between objects of different colors are locked in as soon as the sensor is exposed. Any filtering to make similarly bright objects of one color a brighter shade of grey than equally bright objects of another color must be done using a physical filter at the time of exposure. The monochrome sensor only records a shade of grey. The raw files can no longer differentiate between objects that were different colors.

If we use a red filter when we shoot in monochrome without a Bayer mask, we can't go back based only on the information contained in the raw data and make it look like we used a green filter, or even an orange filter after the fact. With a digital raw file from a Bayer masked sensor, the possibilities of adjusting relative tonal values based on the colors of objects in the scene are near endless!

With a Bayer masked color sensor, there is an advantage in certain situations to using a physical color filter in front of the lens at the time an image is captured when we intend to make the final output a monochrome image. If the light is heavily biased towards the color of one of the filters on the Bayer mask, we can use an opposing colored filter to reduce that color's influence. This allows us to raise exposure so that the other two channels are brighter and have a better signal-to-noise ratio (a/k/a S:N, S/N ratio, or SNR) without totally saturating the stronger color channel, preserving detail in the stronger channel that would be lost if we allowed it to blow out. This relatively late answer to this question covers this concept in greater detail. Using a physical color filter at the time of capture with an unmasked monochrome sensor has this same advantage if the light illuminating a scene, or certain very bright objects within the scene, is dominated by a particular color.
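A small numeric sketch of that headroom idea, with made-up raw values and a hypothetical 14-bit white level, just to show why taming the dominant channel with an opposing filter lets exposure come up:

    import math

    def stops_until_clipping(channel_max, white_level):
        """How many stops exposure can rise before the hottest channel clips.

        channel_max maps channel name -> brightest raw value recorded;
        white_level is the sensor's clipping value. Values are invented
        for illustration only.
        """
        hottest = max(channel_max.values())
        return math.log2(white_level / hottest)

    # Red-dominated stage light: red is nearly clipped, green/blue far down.
    print(stops_until_clipping({'R': 15000, 'G': 3000, 'B': 2500}, 16383))  # ~0.13 stops
    # With an opposing (greenish) filter knocking red down, there is room
    # to raise exposure and improve the SNR of the other two channels.
    print(stops_until_clipping({'R': 6000, 'G': 3500, 'B': 2500}, 16383))   # ~1.45 stops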

In the end, it depends upon what tradeoffs are more important to you. How these considerations are weighted will vary based on what one is shooting and the pace at which it may be shot. A fast-paced environment with rapidly changing lighting conditions would weight things one way. A more methodical shooting situation in which a plethora of physical color filters are available and can be swapped out without losing the shot because the scene has changed in the interim would tend to be weighted the other way. My work lives more in the former situation, but yours might be in the latter.

If your primary outlet for publishing your images is at typical web-based sizes (and, even worse, subject to the compression schemes many web hosting sites use), the importance of the difference in resolution and low light performance is practically nil. If you intend to print very large, the differences in resolution and low light performance will be noticeable.

For better performance in terms of resolution and low light performance, choose the monochrome sensor.

The relative differences in terms of resolution and low light performance between monochrome sensors and Bayer masked sensors aren't huge, but they are there.

For the ultimate in post-processing capability, choose the Bayer masked sensor.

The difference in post-processing flexibility is the difference between black and white. Bayer masked sensors have this capability. Monochrome sensors do not.


A few more technical explanations of some of the above information.

Does the removal of the Bayer filter mean you get measurably more light to the sensor, allowing for better low-light shooting?

It does, but not by as much as one might think. This is because a Bayer masked sensor has a green filter over half of its photosites (a/k/a sensels or pixel wells), and a blue or red¹ filter over only one-quarter of the photosites each.
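As a trivial check of that ratio, counting the filters in one repeating 2×2 RGGB tile is enough, since the same tile repeats across the whole sensor:

    from collections import Counter

    # One 2x2 RGGB tile repeats across the sensor, so counting a single
    # tile gives the fraction of photosites behind each filter colour.
    tile = ['R', 'G',
            'G', 'B']
    fractions = {c: n / len(tile) for c, n in Counter(tile).items()}
    print(fractions)  # {'R': 0.25, 'G': 0.5, 'B': 0.25}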

If one is shooting in daylight at the Earth's surface, or under artificial light sources that simulate the spectral distribution of daylight at the Earth's surface, the distribution of different wavelengths in that light also contains more energy in the wavelengths near green than it does in red (longer) or blue (shorter) wavelengths.

[Image: spectral distribution of daylight]

The cones in our human retinas are also most responsive to light in the middle of the visible spectrum. So the light that is most plentiful in most natural light sources is also the light we perceive as brightest, and it is also the light that our cameras with Bayer masked sensors are most efficient at detecting.

In low light situations, the size of this advantage depends heavily on the exact nature of the light source. Incandescent lights emit the full visible spectrum, but they are weighted towards the longer wavelengths that we perceive as red, orange, and yellow. Other light sources may peak at higher color temperatures. Many of the artificial light sources we now use that are optimized for energy efficiency have strong tints along the Green ←→ Magenta axis, which is more or less orthogonal to the Amber ←→ Blue color temperature axis.

[Image: Green ←→ Magenta tint axis versus Blue ←→ Amber color temperature axis]

For instance, in addition to having a color temperature of about 3700 K, traditional fluorescent bulbs also emit a green tint along the Green ←→ Magenta axis and need correction in the magenta direction. On the other hand, many of the popular LED stage lights found in small clubs are also at about 3700 K but have a decidedly magenta tint that requires compensation in the green direction along the Green ←→ Magenta axis. Both types of light are the same basic color temperature but look very different without compensation on the Green ←→ Magenta axis that is approximately orthogonal to the Blue ←→ Amber color temperature axis.
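A crude sketch of those two adjustment axes, assuming a simple per-channel-gain model (the gain numbers are invented; real converters derive them from the selected color temperature and tint and work on raw data):

    import numpy as np

    def apply_wb(rgb, red_gain=1.0, blue_gain=1.0, green_gain=1.0):
        """Toy white balance: scale the channels of a linear RGB image.

        Changing red_gain/blue_gain moves along the Blue <-> Amber
        (color temperature) axis; changing green_gain relative to the
        other two moves along the Green <-> Magenta (tint) axis.
        """
        gains = np.array([red_gain, green_gain, blue_gain])
        return np.clip(rgb * gains, 0.0, 1.0)

    # Fluorescent-style green cast: pull green down (push toward magenta).
    # corrected = apply_wb(img, red_gain=1.15, blue_gain=0.95, green_gain=0.85)
    # Magenta-cast LED stage light at the same ~3700 K: push toward green.
    # corrected = apply_wb(img, red_gain=1.15, blue_gain=0.95, green_gain=1.15)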

Color Filters

Let's make a B&W image of two vases. One is blue, one is orange. They are both about the same brightness under daylight. (A rough numeric sketch of this example follows the list below.)

  • If we take a photo of them with an unmasked monochrome sensor (or panchromatic B&W film) they will both turn out about the same shade of grey.
  • If we take a photo of them with a monochrome sensor, use a blue filter, and adjust exposure slightly brighter to account for the filter, the blue vase will turn out about the same shade of grey as in the first image, but the orange vase will be darker. It probably won't be totally black, though. Nor will all of the yellow, red, green, magenta, etc. objects in the scene be totally black just because we placed a blue filter in front of the lens. Colors will be reduced in brightness, with the amount of reduction based on their 'distance' around the color wheel from blue, but you'll still see those objects in the picture. They'll just be rendered darker, relative to the blue vase, than they actually are.
  • If we take a photo of them with a monochrome sensor, use an orange filter, and adjust exposure slightly to account for the filter, the orange vase will turn out about the same shade of grey as it did in the first photo but the blue vase will now be darker. But it probably won't be totally black, either. Ditto for objects that are yellow, red, green, magenta, purple, etc.
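Here is the promised sketch of the vase example. The RGB triples for the vases and the per-channel transmissions of the "filters" are invented, and the normalisation stands in for the exposure bump mentioned above; the point is only the ordering of the grey values, not their exact numbers.

    import numpy as np

    # Rough linear-RGB stand-ins for two vases of similar overall brightness.
    blue_vase   = np.array([0.15, 0.25, 0.80])
    orange_vase = np.array([0.80, 0.40, 0.10])

    # Hypothetical per-channel transmissions (not measured filter data).
    filters = {
        'none':   np.array([1.00, 1.00, 1.00]),
        'blue':   np.array([0.15, 0.40, 1.00]),
        'orange': np.array([1.00, 0.50, 0.15]),
    }

    def grey_value(rgb, transmission):
        """Light reaching a monochrome sensor through a colored filter,
        normalised so an equal-energy white renders the same under every
        filter (a stand-in for adjusting exposure for the filter)."""
        return float((rgb * transmission).sum() / transmission.sum())

    for name, t in filters.items():
        print(name,
              round(grey_value(blue_vase, t), 2),    # blue vase
              round(grey_value(orange_vase, t), 2))  # orange vase
    # none:   both vases land around 0.4 (similar greys)
    # blue:   blue vase ~0.60, orange vase ~0.25 (orange darkened, not black)
    # orange: blue vase ~0.24, orange vase ~0.62 (relationship reversed)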

Look at the sensitivity curves for RGGB sensors. The color filters they use are not that much different from the color filters used in B&W photography. If anything, they are weaker and reduce light less than those typically used in B&W film photography. A good bit of red and some blue light makes it through the green filter. A good bit of green light and even a little blue makes it through the red filter. Even a tiny bit of red, as well as a bit more green, makes it through the blue filter. For a very detailed discussion of how all of this works, please see: RAW files store 3 colors per pixel, or only one?

For how Bayer masked sensors mimic the way our eyes (which, by the way, also use only a portion of the total light that falls on each "R", "G", or "B" cone in our retinas) and brains work to create color, please see: Filter for RGB separation and its effect on the image

So how does a monochrome sensor stack up against a Bayer-masked sensor whose images have a black-and-white conversion applied to them?

In the end you can't really say that either a monochrome sensor or a Bayer-masked sensor can do everything the other can. They're two different ways of recording light. There will always be differences in the way each works that will affect how well each is suited for a particular task.

Monochromatic film had its heyday for about a century, during which time it advanced tremendously. Color film, which is basically three monochromatic layers tuned to be most sensitive to three different sections of the visible spectrum, had its heyday for about half a century. With a Bayer sensor, we can have both. With a monochrome sensor we give up the flexibility of color information, which comes in very handy even for images whose final form is in monochrome, in exchange for higher resolution and slightly better low light performance with the same number of photosites.


A practical example.

Here's an image I shot a couple of years ago after dark at an outdoor music festival with a color digital camera. In raw conversion I applied a green filter as well as used the color temperature, white balance correction, contrast, shadow, and highlight sliders to set differences in brightness based on color and reduce the influence of the "fog" on stage.

[Image: monochrome rendering with a green digital filter applied]
EOS 7D Mark II + EF 70-200mm f/2.8 L IS II, ISO 3200, f/2.8, 1/500 second.

Here's the same image with a red filter applied. All other processing settings are exactly the same.

[Image: the same photo with a red digital filter applied]

From the significant increase in brightness, we can see that the light in the scene at the time was mostly red. Using a green filter under mostly red lighting has the effect of local contrast enhancement as well as decreasing brightness.

Yet even adjusting for brightness by reducing exposure 1.67 stops for the red-filtered version, we still see differences in contrast between some objects and others compared to the green-filtered version.

[Image: the red-filtered version with exposure reduced 1.67 stops]

The bright parts are brighter, the dark parts are darker. Compare the detail in the wrinkles on the t-shirt (which was actually black). Look particularly at how the multicolored guitar strap is rendered in each example. Also, notice the tuner assembly on the out-of-focus guitar tuning head low in the foreground. Look at the differences in the beard and the hair on the arm. The effect of the fog reflecting the red light is still there. Try getting the first image above with an unmasked sensor and a green filter in front of the lens. You might get close. Or, even harder, try getting that green-filtered result with an unmasked sensor if a red filter was in front of the lens when it was shot. You can't.

Just for demonstrative purposes, here's the image in color with the CT/WB/HSL settings used to create the above monochrome images, and with exposure set (+0.17 stops) halfway between the green-filtered (+1 stop) and red-filtered (-0.67 stop) brightness adjustment settings used:

[Image: the same photo rendered in color]
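For anyone not used to working in stops: exposure adjustments expressed in stops are simply powers of two in linear light, so the settings above relate like this (a quick arithmetic check, nothing more):

    def stops_to_gain(stops):
        """Convert an exposure adjustment in stops to a linear multiplier."""
        return 2.0 ** stops

    print(round(stops_to_gain(1.00), 2))   # green-filtered version: +1 stop  -> 2.0x
    print(round(stops_to_gain(-0.67), 2))  # red-filtered version: -0.67 stop -> ~0.63x
    print(round(stops_to_gain(0.17), 2))   # color version: +0.17 stop -> ~1.13x
    # The 1.67-stop gap between the two monochrome versions is about a
    # 3.2x difference in linear brightness (2 ** 1.67).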

Here is what the stage lighting looked like about forty-five seconds earlier:

[Image: the stage lighting about forty-five seconds earlier]

A few minutes later, after I'd moved to the other side of the stage, the stage lighting was predominantly blue, but the red cans were also dimly on and the guitarist was being illuminated by a white spot from the tower out in the field.

[Image: the stage under predominantly blue lighting, shot from the other side]

So just how many color filters would I have needed, and how much time would I have spent swapping them out every time the lights changed? How many images would I even have been able to take before the light changed again while I was still swapping filters? Out of about 87 images I edited and published from the set, I used monochrome on 13 of them. For those 13, I used the following five filter choices, followed by the number of images for which I used each choice:

  • None - 1
  • Yellow - 2
  • Orange - 1
  • Red - 4
  • Green - 5

Red and green are pretty much polar opposites. What a red filter darkens the most, a green filter barely affects. What a green filter darkens the most, a red filter barely affects.

¹ The "red" filters in our Bayer masks aren't really red. They're a yellow-orange color that is closer to the yellow-green color to which the 'L' (long wavelength) cones in our human retinas are most sensitive. The "blue" and "green" filters aren't exactly the same colors as those used in RGB color reproduction systems, either.

Correct answer by Michael C on July 9, 2021

Doesn't affect light but does affect sharpness. The following quote from this article explains why:

"The Bayer filter itself does NOT make the image less sharp. The Bayer filter is simply a way to designate a color filtration for each individual pixel. If the image were rendered without the interpolation of this color information, it would be just as sharp as a camera/sensor without this filter. However in that case you would not have accurate colors. So it is technically the interpolation of what the Bayer filter captures, creating the final colors, which makes the image appear 'less sharp'."

And don't forget this camera uses a 40 MP sensor. Note that many other high-pixel-count cameras also use alternative filter setups because of sharpness and moiré concerns; think of the Nikon D800E, which cancels out its AA filter.

Plus, other solutions for color sensors exist that try to work around that sharpness loss; think of Fujifilm's X-Trans sensor layout. Also a good read on this topic.

So much for the filters. There are other benefits to a monochrome-only solution as well. It's like having a camera where you can drop all the features, menu settings, processing power, and filters (see above) that are only needed for color processing, and then put those resources toward improving the B&W image from a 40 MP raw capture.

I don't have details on that, but it makes sense that if you only need to produce B&W images within a similar product and budget (a much higher margin per unit, but fewer sales), you can optimize for that and excel at it.

Answered by hcpl on July 9, 2021

About a full stop of light is lost, and the resolution does go down significantly. The loss of detail is hard to quantify, though, because advanced interpolation algorithms are used to reduce it. These algorithms are very effective, but they do sometimes produce 'artifacts'.

This answer provides a bit more details: https://photo.stackexchange.com/a/117510/40887

Answered by Orbit on July 9, 2021
