Photography Asked on January 30, 2021
I have been taking a photo class for the past month or so, and the professor told us to shoot in raw for a few assignments. When I view the pictures shot in RAW (on a Canon Rebel T7) on my computer, the picture looks great for a few seconds, then it changes to a grainy, undersaturated mess. It makes me question what the image actually looks like. Any idea why this happens and/or how to fix it?
Quite likely your application is displaying the built-in JPEG thumbnail from the RAW file while it decodes the RAW data. When it is done, it replaces the thumbnail with its own interpretation of the full RAW.
So it all depends on your application's RAW decoding defaults and its support for your particular camera.
Given the size of SD cards these days, you can always shoot in RAW+JPEG, and use the RAW when necessary.
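If you want to see both versions that come out of the same file, here is a minimal sketch, assuming the third-party Python packages rawpy (a LibRaw wrapper) and imageio are installed; the file name is a placeholder (the Rebel T7 writes .CR2 raw files):

```python
import rawpy
import imageio

# Placeholder file name for illustration only.
with rawpy.imread("IMG_0001.CR2") as raw:
    # 1) The camera-generated JPEG preview embedded in the raw file --
    #    this is what many viewers flash on screen first.
    thumb = raw.extract_thumb()
    if thumb.format == rawpy.ThumbFormat.JPEG:
        with open("camera_preview.jpg", "wb") as f:
            f.write(thumb.data)  # already JPEG-encoded bytes, written as-is

    # 2) A decode of the raw sensor data itself -- the kind of rendering
    #    that replaces the preview once the application finishes decoding.
    rgb = raw.postprocess(use_camera_wb=True)
    imageio.imwrite("decoded_from_raw.jpg", rgb)
```

Comparing the two output files side by side reproduces the before/after change you see in your viewer.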
Answered by xenoid on January 30, 2021
Anytime you open a raw file and look at it on your screen, you are not viewing "THE raw file." ¹ You are viewing one among a near-countless number of possible interpretations of the data in the raw file. The raw data itself contains a single (monochrome) brightness value measured by each pixel well.
With Bayer-masked camera sensors (the vast majority of color digital cameras use Bayer filters), each pixel well has a color filter in front of it that is either 'red', 'green', or 'blue' (the actual 'colors' of the filters in most Bayer masks range anywhere from a slightly yellowish-green to an orange-yellow for 'red', a slightly bluish-green for 'green', and a slightly bluish-violet for 'blue' - these colors more or less correspond to the centers of sensitivity of the three types of cones in our retinas). For a more complete discussion of how we get color information out of the single brightness values measured at each pixel well, please see RAW files store 3 colors per pixel, or only one?
¹ Please see: Why are my RAW images already in colour if debayering is not done yet?
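To confirm that the file really does hold only one brightness value per photosite, a minimal sketch along these lines (rawpy assumed, placeholder file name) prints the shape of the mosaic and the color filter assigned to each well:

```python
import rawpy

with rawpy.imread("IMG_0001.CR2") as raw:   # placeholder file name
    mosaic = raw.raw_image_visible   # 2-D array: one brightness value per pixel well
    cfa = raw.raw_colors_visible     # filter index over each well: 0=R, 1=G, 2=B, 3=second G

    print(mosaic.shape, mosaic.dtype)  # height x width, unsigned integers -- no color axis
    print(raw.color_desc)              # e.g. b'RGBG' for a Bayer sensor
    print(raw.raw_pattern)             # the repeating 2x2 Bayer tile
    print(cfa[:2, :2])                 # filter indices of the top-left 2x2 block
```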
Anytime you view a "raw" image on any device with any particular viewing/editing application, one of two things is happening:
You are seeing a preview jpeg generated by the camera that took the shot. This preview image is appended to the file containing the raw image data, along with the metadata generated by the camera. Many devices will use this preview image when you open a "raw" photo.
The raw data in the file is being processed and interpreted by the application you are using to view the image. That application may be a simple photo viewer built into the device's firmware, or it may be a sophisticated photo editor such as Lightroom or Photoshop. There is no single "correct" interpretation of the data in a raw image file. Each application can interpret the raw data in the file differently. There is no "one" way to render the linear 12-14 bit monochromatic luminance values contained in a raw file in color on an 8-bit, three-color device. The raw data must be processed to be viewed; a rough sketch of one such mapping follows this list.
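As a rough, hedged illustration of the choices involved, the sketch below (rawpy, NumPy, and imageio assumed; file name and gamma value are arbitrary) maps the linear sensor values to 8 bits without even attempting demosaicing; the output is a monochrome mosaic, not "the" photograph:

```python
import numpy as np
import rawpy
import imageio

with rawpy.imread("IMG_0001.CR2") as raw:   # placeholder file name
    mosaic = raw.raw_image_visible.astype(np.float32)

    # Subtract the sensor's black level and scale by its white level
    # so the linear 12-14 bit values land in the range 0..1.
    black = float(np.mean(raw.black_level_per_channel))  # crude: one value for all channels
    linear = np.clip((mosaic - black) / (raw.white_level - black), 0.0, 1.0)

    # Apply one arbitrary-but-common display gamma and quantize to 8 bits.
    # No demosaicing, white balance, or noise reduction has been applied,
    # so this is still a monochrome mosaic rather than "the" photograph.
    gray8 = (255.0 * linear ** (1.0 / 2.2)).astype(np.uint8)
    imageio.imwrite("undemosaiced_8bit.png", gray8)
```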
Some applications display the preview image until they can render an image created by interpreting the raw data itself. It sounds like this is what is happening in your case. Many applications have user-selectable options that determine what is displayed when a raw image file is opened: the JPEG preview, or one of many possible interpretations of the raw data using an automated routine or one of many selectable default processing profiles.
Related questions:
RAW files store 3 colors per pixel, or only one?
Why is there a loss of quality from camera to computer screen
Why do RAW images look worse than JPEGs in editing programs?
While shooting in RAW, do you have to post-process it to make the picture look good?
Why do my photos look different in Photoshop/Lightroom vs Canon EOS utility/in camera?
Why do my RAW pictures look fine in Lightroom preview but become faded when exported?
Are paler raw images normal for a newer sensor with higher dynamic range?
"It makes me question what the image actually looks like."
It is vital to understand that the two different versions of an image you see on your screen when you load a raw file are both equally "accurate" representations of the raw file. Both images are different interpretations of exactly the same raw data. Neither is more original than the other. Neither is more "correct" than the other in terms of being a valid representation of the data contained in the raw file. They are both perfectly legitimate ways of using the data in the raw file to produce an 8-bit image.
If you have not altered your viewing application's default settings, most raw editors will open a raw file with no noise reduction, sharpening, increased contrast, color saturation, etc. applied to the information in the file. They leave the decisions about how much of each to apply to the editor of the image (you). In contrast (pun intended), most cameras' JPEG engines that generate the preview image attached to a raw file will apply a good bit of all of the above, using preset amounts or even automated routines that analyze the information in the file and make "educated guesses" about things such as color temperature and white balance, contrast, noise reduction, etc.
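As a concrete, hedged illustration, the sketch below uses rawpy to produce two renderings of the same placeholder file: a flat one with no auto-brightening, roughly what a raw editor's neutral defaults give you, and a punchier one that crudely imitates the kind of boost a camera's JPEG engine applies. The parameter choices are illustrative, not any camera's actual recipe, and neither result is more "correct" than the other:

```python
import rawpy
import imageio

# "Flat" rendering: camera white balance, no auto-brightening --
# closer to what a raw editor shows before you make any choices.
with rawpy.imread("IMG_0001.CR2") as raw:   # placeholder file name
    flat = raw.postprocess(use_camera_wb=True, no_auto_bright=True)

# "Punchy" rendering: auto white balance plus an exposure push --
# a crude stand-in for the adjustments baked into the camera's JPEG preview.
with rawpy.imread("IMG_0001.CR2") as raw:
    punchy = raw.postprocess(use_auto_wb=True, bright=1.5)

imageio.imwrite("flat_interpretation.jpg", flat)
imageio.imwrite("punchy_interpretation.jpg", punchy)
```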
If you could only get one possible image from a raw file, it would defeat the entire purpose of being able to save raw files. There is only one correct way to display a properly written JPEG. JPEG images are intended to be the 'final' form for distribution and sharing. This normally includes some "punching up" of color, contrast, and sharpening by in-camera processing routines as well as by many raw processing applications' default settings.
Raw files, on the other hand, are a starting point. There are almost countless possible interpretations that can validly be derived from a single raw file. None of those interpretations is more legitimately "THE raw image" than any of the others.
For more please see:
Why does my Lightroom/Photoshop preview change after loading?
Are paler raw images normal for a newer sensor with higher dynamic range?
How to know correct exposure for RAW shooting when camera show JPEG Histogram
Why don't cameras show an "accurate" histogram?
How to make camera LCD show true RAW data in JPG preview and histogram?
Why can software correct white balance more accurately for RAW files than it can with JPEGs?
Answered by Michael C on January 30, 2021