Photography Asked by Hamish Downer on August 12, 2020
Digital photos can have colour filters applied after the fact by software, so are there any good reasons to use colour filters with a digital camera? As I understand it, the main reason for them originally was for effects with black and white film, but now even black and white is a post-processing effect.
I know that UV filters are good for lens protection, and that ND filters allow you to use longer exposures, but what use are colour filters?
There's a difference between color filters and color-correction filters, although both are colored.
Color-correction filters are useful in digital photography for getting more even exposure across all channels under certain types of lighting.
For example, you'd probably get more exposure, and thus less noise, in the blue channel if you used a blue color-correction filter (82A/B/C) under tungsten lighting. Note that these filters have a filter factor, meaning a one-stop gain in noise can cost a stop in exposure time.
Underwater photography is another domain where light is tricky and physical filters are recommended: mostly warming filters, though fluorescent-correction filters may also apply.
In this example, two pictures were taken in the same conditions under tungsten lighting (a street light in winter): the first shows the blue channel from a picture without any filtration, the second the blue channel from a picture taken with a fairly weak 80D filter. Note the difference in noise. The white balance reference for both shots was taken from a gray card; the blue channel shows more noise in the unfiltered case because that channel was amplified more.
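The original comparison images aren't reproduced here, but a minimal numpy sketch captures the mechanism. All numbers below (channel levels, noise amounts, filter transmission) are invented for illustration; the point is that white-balancing a weak blue channel in post multiplies its noise, while a filter balances the channels before capture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-channel exposure under tungsten light (red-heavy,
# blue-starved), normalized so 1.0 = full well.
signal = np.array([0.9, 0.6, 0.2])            # R, G, B
read_noise = 0.01                             # same for every channel

def capture(light):
    # One simulated exposure: photon (shot) noise scales with the square
    # root of the signal; read noise is constant.
    shot = rng.normal(0.0, 0.02 * np.sqrt(light))
    return light + shot + rng.normal(0.0, read_noise, size=light.shape)

# No filter: white balance in post multiplies the weak blue channel
# (and all of its noise) by a large gain.
raw = capture(signal)
wb_gains = signal[1] / signal                 # balance everything to green
print("post-WB gains (R,G,B):", wb_gains)     # blue gain ~3x -> noise ~3x

# With a blue (80-series) filter: red is attenuated before capture, the
# channels arrive closer to balanced, and the needed gains stay small.
# The cost is the filter factor -- roughly a stop of total exposure here.
transmission = np.array([0.35, 0.7, 1.0])     # assumed filter curve
filtered = capture(signal * transmission)
print("channels through filter:", filtered.round(2))
```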
The usual color filters for B&W film are not very useful in the digital world, as they can easily overexpose one channel while leaving the others underexposed and noisy. Putting a strong color filter in front of your lens means using your digital camera inefficiently: with a red or blue filter you're using just 25% of your available pixels, and with a green filter 50%.
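A toy sketch makes the pixel-utilization claim concrete; the 4x4 mosaic below stands in for a real RGGB Bayer sensor:

```python
import numpy as np

# A standard RGGB Bayer tile, repeated across the sensor.
tile = np.array([["R", "G"],
                 ["G", "B"]])
mosaic = np.tile(tile, (2, 2))    # a tiny 4x4 "sensor" for illustration

# A strong red filter (e.g. Wratten #25) passes almost no green or blue
# light, so only the red photosites record a useful signal.
print(f"useful pixels, red filter:   {(mosaic == 'R').mean():.0%}")  # 25%
print(f"useful pixels, green filter: {(mosaic == 'G').mean():.0%}")  # 50%
```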
A list of filters with their Wratten numbers and descriptions can be found in the Wikipedia article on Wratten numbers.
Correct answer by Karel on August 12, 2020
With the flexibility offered by digital image programs such as Photoshop, Lightroom, and Aperture, there is little reason to use a colored filter on a digital camera. ND filters and polarizers can achieve effects that aren't possible purely through software, but for adding a color cast to an image there's no reason to purchase or carry around a physical filter.
Answered by ahockley on August 12, 2020
Yes: if you want to spend less time behind your computer, attach a color filter to your lens.
Answered by Marc on August 12, 2020
If you shoot RAW, then there isn't much reason to use color filters anymore.
If you shoot jpeg, then you're better off getting it right the first time rather than doing processing afterwards, so color filters are quite useful.
Answered by chills42 on August 12, 2020
It depends why you're using a coloured filter in the first place -- if it's to enhance contrast in a B&W shot, you may be better off doing this at the digital processing stage (unless you've set the camera to shoot B&W JPEGs).
If you want to correct or enhance different lighting, you can simply adjust the white balance settings on the camera.
Some artistic effects can be achieved with coloured filters, but software like Photoshop can emulate all of these settings -- it's just a question of where you want to spend the time: shooting or processing.
Answered by Rowland Shaw on August 12, 2020
You have to take two main variables into account: the quality you can get straight out of the camera, and the time you will spend post-processing.
Taking that into account, what you want to do is maximize the output of your shot and minimize post-processing. That is the rationale behind using filters in digital photography: to maximize quality right out of the shot.
In practice, if you use RAW you will be able to do color filtering with minimal impact.
Answered by Rezlaj on August 12, 2020
I would say that the biggest advantage of using filters on the camera, rather than post-processing on the computer, is that you can see the result on the spot and make any necessary adjustments. The same goes for in-camera double exposure instead of stacking frames on the computer: you get immediate feedback on the result.
Answered by Fredrik Mörk on August 12, 2020
Let's go to an extreme case so that we can think about what a filter does.
Take an arbitrary image and try to reconstruct what it would have been if there had been an R72 filter, an IR longpass filter, on the camera.
You really can't take what the sensor recorded and work backwards from there to reconstruct the actual wavelengths (or polarization) of the light that came through the lens.
If you could, everyone would be doing IR and UV photography without filters at all. The thing is, once the light has hit the sensor, you've lost some of the information about it.
Light itself isn't RGB; it's an entire range of wavelengths whose summation our eye perceives as color. With a colored filter, you can reduce the significance of certain parts of that spectrum, either to balance out the light (as when correcting for UV) or to remove specific parts of it for a specific purpose.
That removal of specific parts is an effect you can often see. My favorite is the didymium filter (aka Red Enhancer), whose transmission spectrum has a sharp drop around 580nm.
That drop sits near the sodium line (think of those yellow street lights). The same glass is used in safety glasses for glass blowers, so that they can filter out the sodium-yellow color of the flame and see the piece they are working on more clearly.
In photography, brown fall leaves aren't just brown; they're red, and orange, and yellow, and a bunch of other wavelengths. By removing some of the wavelengths near red, the red comes through more clearly.
(Example image: http://photoframd.com/2010/10/15/enhance-fall-colors-with-an-intensifier-filter/)
You can find similar filters in astrophotography. A 'skyglow' filter helps reduce specific forms of light pollution in the night sky; some are didymium filters, since those cut out some of the pollution from sodium-vapor lamps (mercury-vapor lamps are much harder to deal with; see Types of Lamp for more on this). Alternatively, you might want to photograph only the hydrogen-alpha line, using a filter that passes just a 7nm band around 656.3nm. Again, these are things that cannot be reconstructed after the fact, once the image has been captured.
Gels and color-correcting filters allow only the light that you want to photograph through to your sensor. Once the entire spectrum of light has been collapsed to an RGB value, you cannot pull it apart again to remove specific parts.
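A small numerical sketch illustrates that collapse. The curves below are invented (real sensor and filter responses differ), but the structure of the problem is real: a physical filter multiplies the spectrum before integration, while the sensor output is only three numbers:

```python
import numpy as np

wavelengths = np.arange(400, 701, 10)         # 31 samples across the visible

def bump(center, width):
    # Invented Gaussian curve, standing in for a real response.
    return np.exp(-((wavelengths - center) / width) ** 2)

# Made-up sensor sensitivities for the three channels.
sensor_rgb = np.stack([bump(600, 50),         # "red"
                       bump(540, 45),         # "green"
                       bump(460, 40)])        # "blue"

scene = 0.6 + 0.4 * np.sin(wavelengths / 40.0)   # arbitrary scene spectrum

# A physical filter multiplies the spectrum wavelength by wavelength
# BEFORE the sensor integrates it:
didymium_like = 1.0 - 0.9 * bump(580, 15)     # notch near the sodium line
rgb_filtered = sensor_rgb @ (scene * didymium_like)

# Without the filter, 31 spectral samples collapse into 3 numbers.
# Recovering the spectrum from them means solving 3 equations in 31
# unknowns -- underdetermined, so no software step can re-create the
# notch exactly after capture.
rgb_plain = sensor_rgb @ scene
print("filtered RGB:  ", rgb_filtered.round(2))
print("unfiltered RGB:", rgb_plain.round(2))
```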
Answered by user13451 on August 12, 2020
Color filters can also be used to get a more accurate representation of the colors in a scene: take multiple pictures of the scene through a large number of different color filters, then combine them to yield a more detailed color picture.
Your camera sensor uses only 3 filters; each pixel detects a gray value of light passed through one of them. The 2 missing gray values at each pixel are then obtained by interpolation (see the sketch below). Even if we ignore the inevitable artifacts of this step, it is theoretically impossible to reconstruct how we would perceive the colors in the scene from gray values obtained through 3 filters that weight the light spectrum differently than the cone cells in our eyes.
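As a rough illustration of that interpolation step, here is a toy bilinear demosaic of the green channel on a tiny RGGB mosaic (random stand-in values, and edges wrap around for simplicity):

```python
import numpy as np

rng = np.random.default_rng(2)

# A tiny 4x4 sensor readout: each photosite holds one gray value.
# Random numbers stand in for real measurements.
raw = rng.uniform(0.0, 1.0, size=(4, 4))

# Green sites of an RGGB mosaic form a checkerboard.
green_mask = np.zeros((4, 4), dtype=bool)
green_mask[0::2, 1::2] = True
green_mask[1::2, 0::2] = True

# Bilinear demosaicing of green: at each non-green site, average the
# four green neighbors. np.roll wraps at the edges, fine for a toy.
green = np.where(green_mask, raw, 0.0)
neighbor_sum = sum(np.roll(green, shift, axis)
                   for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)))
estimate = np.where(green_mask, raw, neighbor_sum / 4.0)

print("interpolated green plane:\n", estimate.round(2))
```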
The transform from the detected gray values to the picture on your computer screen involves assumptions that can in some cases be quite inaccurate, and the displayed colors will then be visibly different from reality. It's impossible to correctly display all the colors we can see using a combination of only 3 primaries, so a conventional monitor will always fall short; but even if the scene contained only colors within your monitor's gamut, those colors would still not be displayed correctly.
The only way to get a better representation of the colors is to make more independent measurements of gray values using different filters. A simple approach is to take pictures with different cameras that have different color filter arrays in their sensors; e.g., an additional low-quality picture taken with your smartphone can be used to improve the colors in a high-quality picture taken with a DSLR. But you can also take many pictures using different filters and then use these to estimate the correct representation of the colors more accurately.
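Here is a minimal sketch of that idea, estimating a coarse spectrum from filtered gray values with least squares. The filter curves are random placeholders rather than measured transmission data, but they show how more independent filtered measurements pin the estimate down:

```python
import numpy as np

rng = np.random.default_rng(1)
n_bands = 12                      # coarse spectral bands to estimate

# True (unknown) spectral signal for one scene patch.
truth = rng.uniform(0.2, 1.0, size=n_bands)

for n_filters in (3, 12):
    # Each row stands in for one filter's transmission curve; random
    # placeholders here, real measured curves in practice.
    A = rng.uniform(0.0, 1.0, size=(n_filters, n_bands))
    grays = A @ truth + rng.normal(0.0, 0.01, size=n_filters)

    # Least-squares estimate of the spectrum from the filtered readings.
    estimate, *_ = np.linalg.lstsq(A, grays, rcond=None)
    err = np.linalg.norm(estimate - truth)
    print(f"{n_filters:2d} filters -> reconstruction error {err:.3f}")
```

With 3 filters the system is badly underdetermined (3 equations, 12 unknowns), so the error stays large; with 12 independent filters it drops sharply.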
Answered by Count Iblis on August 12, 2020
It depends, especially when producing images that will be viewed in monochrome/B&W.
If digital sensors had unlimited dynamic range it wouldn't matter so much, but we all know that they are limited by their noise floor.
By using the color filter at the time you shoot, you can reduce a particular color channel that might otherwise be blown out while still preserving the brightness of the other two color channels. For instance, if the scene has a lot more brightness in the red channel than I want in the final image I can use a green filter to reduce the amount of red without reducing the green (and to a lesser extent the blue) as well. The green filter might also allow me to expose so that the greens and blues are even brighter while still keeping the reds below full saturation.
But today, with digital cameras, would I just instead shoot in color, apply the yellow filter in post (or whatever other color filters I want), and then convert the image to black and white?
Not exactly. Digital filters don't always work the same way that actual physical filters do, and so they don't always give the same results. You may be able to get very close, but there's still no substitute for using actual filters if you're planning on presenting the image with a particular balance between certain colors and the gray tones they produce in monochrome.
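One concrete difference is clipping. In this sketch (the scene values and filter transmissions are invented), a physical green filter attenuates an over-bright red channel before the sensor clips it, while the digital equivalent can only scale a value that has already been clipped:

```python
import numpy as np

# Linear scene light, R,G,B. The red channel exceeds the sensor's
# full-well capacity (1.0); these numbers are invented for illustration.
scene = np.array([1.6, 0.8, 0.5])

def capture(light):
    # The sensor clips anything above full well.
    return np.minimum(light, 1.0)

green_filter = np.array([0.4, 1.0, 0.6])   # assumed transmission values

# Physical filter: red is attenuated BEFORE clipping, so highlight
# detail in the red channel survives (0.64, still below full well).
at_capture = capture(scene * green_filter)

# Digital "filter": the sensor clips first, then software scales the
# result. Every red value above 1.0 ends up at the same 0.40 -- the
# highlight gradation is gone and cannot be brought back.
in_post = capture(scene) * green_filter

print("filtered at capture:", at_capture)
print("filtered in post:   ", in_post)
```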
Most general raw converters with a dedicated "Monochrome" tab offer a fairly limited number of filter colors, usually something like Red→Orange→Yellow→None→Green. You often can't alter the density/strength of a specific filter color, and if you want a color between these choices, or say a blue filter, you're frequently out of luck.
Dedicated B&W/monochrome editing applications or plugins like Nik's Silver Efex Pro or Topaz B&W Effects add many more choices, including specific filters in varying strengths. They may even be labeled with the names of their analog counterparts, e.g. Lee #8 Yellow or B&W Light Red 090. But they still act on the light after it has been recorded by your sensor, rather than before, so the limits of a camera's dynamic range constrain, to one degree or another, how close post-processing can get to using an actual filter.
What you set for color temperature, and for fine-tuning along the Blue←→Yellow and Magenta←→Green axes, will have an effect, but it won't always be the same as using a color filter. When you adjust the color temperature, pretty much all of the colors shift in one direction or another; color filters are much more selective about which colors are affected. You could use the Hue/Saturation/Luminance (HSL) tool in many post-processing applications to fine-tune further, but you've still unnecessarily limited your camera's dynamic range, compared with filtering the light before exposure so that the full dynamic range is spent only on the light you want to capture.
You can reduce contrast in post, for example, to mimic the effect of a blue filter, but it may not give exactly the same result. Again, you are sacrificing dynamic range by applying the filter to the digital information after it was recorded, rather than to the light before it was recorded.
Answered by Michael C on August 12, 2020
From personal experience, it can look better.
If we're talking about the strong colour filters used for black and white, I just find the result more pleasing to the eye; there is likely a different 'equation' at work compared to whatever a 'digital red' adjustment does (my camera even has a B&W+red mode built in).
I find that processing that affects dynamic range can produce artifacts when combined with a digital filter. Of course, other options like curves or hue/saturation could be used, but if I know I want red, then I use red: you look outside and see what type of sky it is, just as Ansel Adams did.
Otherwise, the main filters generally said to require a physical version are the polariser and perhaps those 'natural night' ones (they reject quite a specific 'sodium' frequency band, and I don't know whether that's easy to dial in digitally).
With strong colour filters, though, I just think having the effect happen before the exposure is read is a good idea; it seems to look better. Maybe this is because colour-balance adjustments can clip, or because a different 'algorithm' is at work: a filter acts on the actual photons, while changing the 0-255 values for each colour is a mathematical function, so perhaps it's better for the change to happen in the real world and then be represented in those values. Bright lights may look less strange, and there is more 'latitude', since saturation sliders can push bright areas into clipping; you then reduce luminosity to compensate and it can look unnatural. The obvious disadvantages are that filters can increase ghosting and require exposure compensation; you may get flares or ghosts around bright areas, as when shooting straight into the sun.
As others mention, I have this 'belief' that digital post-processing always degrades the image and is a trade-off, while an analogue process is somehow pure from the start. That's not necessarily correct, and certainly not correct all the time; filters and grain can screw things up for sure, but they might still look better than slamming a red saturation slider to 100%.
Answered by jorgepeterbarton on August 12, 2020