Asked on April 29, 2021
I have the X-Rite i1Profiler and there are several things I need to understand.
1.) Calibration seems to mean running the software and creating a profile. The profile seems to be located on the computer. Which raises a slew of questions. First of all, it means that you're not calibrating the screen – you're calibrating the computer output. Second, it therefore means that the program would have to be run separately for every computer (output device). Third, if there's no output device that can run a computer program (a DVD player, for instance), then calibration isn't practically possible. But that can't be right, can it?
2.) …and fourth, it means that you're limiting the signal, and therefore not using the hardware's potential to its maximum. THIS video https://www.youtube.com/watch?v=h_TT9O2I1b4 has an X-Rite expert talking about what sounds like just that – that the calibration can end up in the hardware calibration of the screen. What I want is exactly that: to calibrate the monitor/projector/TV, not to adjust (limit) the output to that screen. I don't see how this process could be doing that, but I'd like that to be the case.
SO THE ULTIMATE QUESTIONS ARE:
A. If I have to use my X-Rite i1Profiler to calibrate a screen for use with a game console or DVD player that can't run software, how do I do that?
B. If I have to use my X-Rite i1Profiler to calibrate a wall-sized plasma TV (for example) for use without any player at all, how do I do that?
I’d really love to have a thorough grasp of the principles and possibilities (minus impossibilities) involved. My thanks in advance!
You are asking very valid questions, and the very fact that you're asking them means you understand the essence of the problem; you only need to know the technology to deal with it. There are good answers already, all of them basically correct, but I'll add my own take in the hope of resolving the confusion.
Michael mentioned that there are two things typically involved in the colour management workflow: calibration and profiling. Calibration physically adjusts the display to a known standard. Profiling describes what we've got as the result, for the benefit of the software that can utilise it.
Why would we need profiling if we can do calibration? Well, let me explain with a single-dimensional analogy.
Suppose we produce rulers rather than displays. There are several common standards: 30 cm (ISO1), 35 cm (ISO2), etc. We buy one that is advertised as "ISO1-capable". How do we know it actually is? We measure it with a trusted device, say an X-Rite TapeMeasure. It tells us: the device is actually 31.2 cm (and, if we want to be advanced with this analogy, it can detect that the divisions on its first half are slightly denser than on the second half: it is not even linear). What we've done is already a form of profiling. We could now record all these discrepancies, create a correction table/formula, and call it a profile.
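To make the analogy concrete, here's a minimal sketch of such a correction table and formula – i.e. a "profile" for the ruler. All numbers are invented for illustration:

```python
# A "profile" for the imperfect ruler: a table of (marked, true) positions
# measured against a trusted reference. All numbers are invented.
measured = [
    (0.0, 0.0),    # the 0 cm mark is truly at 0 cm
    (15.0, 14.6),  # divisions are slightly denser on the first half
    (30.0, 31.2),  # the "30 cm" mark actually sits at 31.2 cm
]

def true_length(marked: float) -> float:
    """Correct a reading on the bad ruler by interpolating the profile."""
    for (m0, t0), (m1, t1) in zip(measured, measured[1:]):
        if m0 <= marked <= m1:
            frac = (marked - m0) / (m1 - m0)
            return t0 + frac * (t1 - t0)
    raise ValueError("reading outside the profiled range")

print(true_length(15.0))  # 14.6 - the correction a profile-aware consumer applies
print(true_length(22.5))  # 22.9 - interpolated between profiled points
```

The profile itself changes nothing: it just lets anyone who knows about it translate readings into true values.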
But who will apply this correction? Right, the computer. By definition, it can "manipulate the output", so it can compensate for imperfections of the output device. Doing this correction is, essentially, calibration. But just as it's better to "fix" the ruler instead of doing mental corrections every time, it's usually better to "fix" the display instead of correcting the computer's output.
Typically (though not always for cheap monitors), the display offers higher-quality adjustments.
An adjusted (calibrated) display will be correct for all inputs, even for devices that can't apply corrections themselves.
But only as long as we calibrated to a common standard. The device simply needs to be told to output in that standard (say, sRGB), which for many devices is implicit.¹ This was less true for analog video cables, where calibration would also compensate for individual imperfections of the cables and the on-board DAC.
In our ruler analogy, this is equivalent to "massaging" the ruler until it is linear and cutting it off at the standard 30 cm. Then all the consumers need to know is that the device is ISO1-compliant, or at least that it is metric – which is often an implicit assumption, but still something to be aware of.
On a computer, different software may compete for these adjustments. A game or a screen saver may ruin the correction table – worst of all, unbeknownst to the user. This necessitates a resident program (usually from the supplier of the calibration device) that monitors and reinstates the calibration adjustments. Not to mention, this has to be done on every computer restart.
Having established that, we can consider three scenarios.
1. Professional/advanced monitors have full built-in calibration capabilities. They have high-quality lookup tables (LUTs) and can be calibrated to any reasonable target. Sometimes they even have built-in colorimeters; sometimes colorimeters can be connected directly to them; sometimes it all goes via the computer, but only for convenience and flexibility of control.
In order to see correct colours from a DVD player on such a monitor, you calibrate it for Rec. 601 or Rec. 709 as appropriate (or just sRGB – they are very similar), typically using the display's proprietary software, store the settings, and that's it.²
2. Normal consumer displays (including most TVs) have, at best, separate colour adjustments (R-G-B). The calibration process will ask you to adjust these to obtain neutral greys with respect to your chosen target (see Michael's answer). This gets you some of the way, and it's the best you can do for your DVD player. But you don't know for certain how red your red is or how correct your yellow is. This is similar to "linearising" our ruler to some extent, so that the middle (15 cm) is in the middle, but the resultant "total length" and the remaining non-linearities remain known only to the computer.
2a. Many modern displays (including TVs) have standard presets (typically sRGB, even though they may go by a different name). You can check how correct they are by profiling the display in this mode with your colorimeter. Good software will show you the actual gamut and let you compare it with the theoretical sRGB. As a minimum, you can check how close the measured colour temperature is to 6500 K (a small sketch of how to do this yourself follows below). If the result is reasonable, using such a preset is probably the best option for home devices if you want "correct" colours.
3. Laptop displays often don't have any controls (except for backlight brightness). Inevitably, they have to be fully calibrated by the video card.
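Regarding checking a preset's colour temperature: if your profiling software reports the white point as CIE 1931 xy chromaticity, you can convert that to a colour temperature yourself with McCamy's well-known approximation. A small sketch – the second xy reading is an invented example:

```python
def cct_mccamy(x: float, y: float) -> float:
    """Approximate correlated colour temperature (K) from CIE 1931 xy
    chromaticity via McCamy's formula (valid near the daylight locus)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point should come out close to 6500 K:
print(round(cct_mccamy(0.3127, 0.3290)))  # ~6504

# A hypothetical bluish reading from the colorimeter:
print(round(cct_mccamy(0.3010, 0.3150)))  # well above 6500 K -> too cool
```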
Now, why would you want to sacrifice the display's capabilities by calibrating it? If you need, say, to reduce the blue gain to compensate for the excessive bluishness of the backlight, surely you are sacrificing the "bluest blue" you can produce. Indeed. But do you want correct colours or fancy colours? It surprises some people that after calibration the image may look "duller", but this is consistent with the original intention. If this is not important to you, you don't need to calibrate.
There are practical consequences here. Sometimes displays have a wider gamut than the desired standard, even after calibration for greys (like our ruler being a bit longer than 30 cm). What to do then? Wouldn't it be good to use this extra capability? Yes! But...
¹ It's a separate big topic what the assumed/default standard (profile) is for different devices.
² Of course, you'll need to refer to the manual for exact instructions. There could be several pitfalls, like calibration being separate for different inputs, etc.
Correct answer by Zeus on April 29, 2021
You're conflating calibration with profiling. They're two different things.
Let's start with basic definitions of those two terms:
Calibration is what happens when we manually adjust a control and then measure the result, where the measurement itself does not change the device's output – it only tells us the result of our manual adjustment. We then change the adjustment to try to get closer to the target output and measure again. Rinse and repeat until we get as close as we can.
Profiling is when we use a measuring device to create and load a set of instructions that does change the device's output.
Almost all displays can be calibrated to one degree or another, but not all displays can be profiled.
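To picture the difference: calibration is a manual feedback loop in which the measuring device is read-only. A schematic sketch – the target numbers and the idea of typing in readings are mine, purely for illustration:

```python
# Schematic illustration of the calibrate-by-hand loop: the colorimeter only
# reports what the screen does; a human changes the monitor's own controls.
TARGET_CCT = 6500   # chosen target white point, in kelvin (illustrative)
TOLERANCE = 100     # how close counts as "as close as we can", in kelvin

def measure_white_point() -> float:
    # Hypothetical stand-in for a real colorimeter reading.
    return float(input("Enter the CCT the colorimeter reads (K): "))

while True:
    cct = measure_white_point()            # measuring changes nothing
    if abs(cct - TARGET_CCT) <= TOLERANCE:
        print("Close enough - calibration done.")
        break
    hint = "too blue: lower the blue gain" if cct > TARGET_CCT \
        else "too warm: raise the blue gain"
    print(f"Measured {cct:.0f} K ({hint}); adjust the controls and remeasure.")
```

Profiling, by contrast, replaces the human in that loop with a generated set of instructions that actively reshapes the signal.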
Ideally, a screen should be calibrated (and/or profiled, if the screen has that capability) using its own adjustment controls before it is profiled to work with a specific device that has profiling capability.
Only after using the display's own adjustment controls to get it as close as possible to the targeted output do you run the profiling section of the software, which adjusts the output of the device providing the signal (usually a computer's GPU). In most cases, if you calibrated the screen properly using its internal controls, the generated profile should add very little correction to the signal sent to the screen (which, of course, assumes that the display has adjustment controls available to calibrate it well).
SO THE ULTIMATE QUESTIONS ARE: A. If I have to use my X-Rite i1Profiler to calibrate a screen for use with a game console or DVD player that can't run software, how do I do that? B. If I have to use my X-Rite i1Profiler to calibrate a wall-sized plasma TV (for example) for use without any player at all, how do I do that?
A and B are basically the same question, except that the output from your gaming console, DVD player¹, or whatever other device is feeding programming to your TV screen (which could be an internal tuner module, but probably isn't for most folks) may use different standards/color spaces. This is where it gets a bit messy. You might want to use the same screen to play games, watch Blu-ray movies, stream programming via an Amazon Fire or Roku, or watch programming coming from your cable TV box. They might not all have the same color space capability. You might have to choose one device for the screen to be optimized for and let the results from lower-fidelity (or less frequently used) input devices fall wherever they land.
How do I do that?
It all depends upon the capabilities of the screen.
If your screen has no capability to load LUTs (basically, profiles), then you're stuck getting the screen as close as you can using the screen's internal RGB, brightness, and contrast adjustments while it is hooked up to your computer. While doing this, you want to hit the target white point using the RGB controls and the target luminance using the brightness and contrast controls.
If your screen has independent profiling capability, then you use your computer and software to load the correction profile into the screen itself, rather than into your computer's GPU. Different screens allow this in so many different ways that you'll have to dig into the screen's User Manual or Technical Manual to see how to do it for a specific model.
If your screen has the ability to load multiple LUTs/profiles for different input devices connected to their own input port (HDMI1, HDMI2, DP3, etc.), then you can rinse and repeat for each use case scenario. Do a profile for watching movies from your streaming device or Blu-ray player in a dim room with warm lighting. Do another profile for playing games in brighter, cooler light during the day when there's lots of sunlight illuminating the room, etc.
¹ Please - FOR THE LOVE OF ALL THAT IS GOOD - upgrade to Blu-ray if you're using a plasma TV or other really nice and/or large screen!
I've still got an older X-Rite colorimeter and the ancient software that came with it. The first step is to use the monitor's own contrast, RGB, and brightness adjustments until the colorimeter measures the output as close as possible to the desired output.
Here's the closest I can get RGB and brightness to target using the monitor's own controls:
This monitor is a fairly cheap one, and it's also fairly new, which means it's still very bright. Monitors dim as they age, so it's nice to get one that has plenty of headroom when it's new. I had a Dell monitor that I used for 6-8 years: it started out having to be set to about 50% brightness, and I replaced it when it couldn't reach the target brightness even when set all the way to 100%. With the current monitor, I had to do a LOT of correction using its internal controls.
Brightness and contrast were set to:
Brightness: 14/100
Contrast: 66/100
The RGB controls were set to:
Red: 36/100
Green: 25/100
Blue: 25/100
For whatever reason, moving the red control above 36 does not increase the red response measured by the colorimeter, even though the screen is noticeably pink-tinted with red set to higher values, so the other two colors had to be set to match red at 36/100. (Reducing the red control below 36 did reduce the red response measured by the colorimeter, so I'm guessing 36-37 is all it takes to fully saturate the red channel.)
After running the automatic profiling routine, here's the screen showing me the curves applied via the monitor profile:
Notice that there is very little correction being done via the profile! The red, green, and blue "curves" are almost a perfectly straight slope, with the three colors almost on top of one another. If I had instead left the internal RGB controls at equal values, say R:36, G:36, B:36, then the profile would show deep curves reducing green and blue and a mild curve increasing red.
Here's the "test pattern" on the final page of the calibration software. The first is with the profile applied. The second is with a generic sRGB profile applied.
There's almost no visually distinguishable difference! So if you hook up your monitor or TV to your computer and calibrate it well using its own controls, then you can use it attached to things such as DVD/Blu-ray players or gaming consoles and it will still be so close that you can't tell the difference. Changes in ambient lighting conditions will have much more influence on your perception of the color output than the minor inaccuracies introduced by not having a proper color profile.
[OK, so I later realized that the visual differences seen when looking at the monitor with and without the profile applied would not be reflected in a screenshot, which is created from the signal before any profile is applied to the signal sent to the monitor. But I promise, I didn't notice any difference in the actual monitor output when viewing it with and without the custom profile. The hardware calibration got it that close. When I tried to do the same thing with the monitor basically set to 6500K and then profiled to 5500K, I realized my error: they looked markedly different on the monitor, but the screenshots were still identical! LOL]
Just for fun I went back and changed the monitor's hardware RGB settings to R:36, G:36, B:36.
Here's what the colorimeter measured.
It looks much worse than it is. It doesn't take much adjustment to push one color all the way to the end of the scale. Notice the colorimeter measured the output as 6400K, which is probably the monitor's native white point (against a goal of 6500K).
Here's the correction provided by the color profile generated with that hardware setting. As expected, green and blue are reduced compared to red. Notice also that blue measured slightly stronger than green before the profile was generated, so to compensate, the blue line runs slightly below the green line in the generated profile.
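For anyone curious what such correction curves amount to numerically, here's a toy sketch of per-channel correction ramps in the spirit of the ones described above. The gains are invented to mimic this situation (green and blue too strong, blue slightly more so), and real profiles store non-linear curves rather than a flat gain:

```python
# Toy model of the 1-D correction ramps a profile might carry.
# Invented gains: with all hardware controls equal, green comes out ~43%
# and blue ~45% too strong relative to red.
measured_gain = {"red": 1.00, "green": 1.43, "blue": 1.45}

def correction_ramp(gain: float, size: int = 256) -> list[int]:
    """Map each input level to a corrected output level (clipped to 8 bits)."""
    return [min(255, round(i / gain)) for i in range(size)]

ramps = {channel: correction_ramp(g) for channel, g in measured_gain.items()}

# Mid-grey input: red passes through, green and blue are pulled down,
# with blue slightly below green - matching the curves described above.
print(ramps["red"][128], ramps["green"][128], ramps["blue"][128])  # 128 90 88
```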
Answered by Michael C on April 29, 2021
It is possible to calibrate either, but calibrating the display is much better. Unfortunately, not all displays can be calibrated, so many people resort to calibrating the graphics card instead. The problem is that this can introduce artifacts, causing banding or reduced color reproduction.
The difference is crucial if you want to see smooth color gradations, or even a smooth grey scale. Most graphics cards have 8-bit per-channel output, which gives 256 levels for each primary color (10-bit output has been available for a long time, but 10-bit displays have only recently become common). So when you calibrate the graphics card, the output is adjusted but precision is easily lost. Imagine, for example, that your screen is too red and the calibration determines that red must be multiplied by 0.8; then the graphics card can only send values from 0 to 204 (255 × 0.8), so gradations are lost.
When you calibrate the monitor itself, the mapping from the 8-bit input to the panel is performed inside the display at much higher precision. High-end displays (even relatively affordable ones) feature 14-bit 3D LUTs precisely to perform this mapping. This is far more precise and allows 8-bit or 10-bit color to be rendered at its full precision.
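To put numbers on the precision argument, here's a small sketch counting how many distinct levels survive the 0.8 correction from the example above, applied once at 8 bits (graphics card) and once at 14 bits (a display LUT of the depth just mentioned):

```python
# How many distinct levels survive a 0.8 "red" correction?
levels = range(256)  # 8-bit input levels 0..255

# Case 1: the graphics card applies the correction in 8 bits.
gpu_8bit = {round(i * 0.8) for i in levels}
print(len(gpu_8bit))   # 205 -> 51 of the 256 input levels have merged

# Case 2: the display applies the same correction in a 14-bit LUT
# (8-bit input scaled to 14 bits, corrected, kept at 14-bit precision).
lut_14bit = {round(i * (16383 / 255) * 0.8) for i in levels}
print(len(lut_14bit))  # 256 -> every input level remains distinct
```

The merged levels in the first case are exactly what shows up on screen as banding in smooth gradients.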
Keep in mind that when you calibrate a display, it is calibrated to a certain target, because the translation from the input signal to the panel is specific to that target. So if you want to calibrate a screen for use with a device that does not support loading a profile, you must calibrate the screen to show the colors that device expects to display. For entertainment devices, this is often NTSC or sRGB color.
Summarizing to answer your questions: in both cases A and B, calibrate the display itself (not a graphics card) to the color space the source device expects – typically sRGB or the relevant video standard – since there is no computer in the chain to apply a profile afterwards.
Answered by Itai on April 29, 2021