
Color Calibrating - the display? or the output? Key Basic Issues - help!

Photography Asked on April 29, 2021

I have the Xrite i1Profiler and there are several things I need to understand.

1.) Calibration seems to mean running the software and creating a profile. The profile seems to be located on the computer, which raises a slew of questions. First of all, it means that you’re not calibrating the screen – you’re calibrating the computer’s output. Second, it therefore means that the program would have to be run separately for every computer (output device). Third, if the output device can’t run a computer program (a DVD player, for instance), then calibration isn’t practically possible. But that can’t be right, can it?

2.) …and fourth, it means that you’re limiting the signal, and therefore not using the hardware’s potential to its maximum. THIS video https://www.youtube.com/watch?v=h_TT9O2I1b4 has an X-Rite expert talking about what sounds like just that – that the calibration can end up stored in the screen’s hardware. That’s what I want: to calibrate the monitor/projector/TV itself, not to adjust (limit) the output sent to that screen. I don’t see how this process could be doing that, but I’d like that to be the case.

SO THE ULTIMATE QUESTIONS ARE:
A. if I have to use my Xrite i1Profiler to calibrate a screen for use with a game player or DVD-player that can’t run software, how do I do that?
B. If I have to use my Xrite i1Profiler to calibrate a wall-sized plasma TV (for example) for use without a player at all, how do I do that?

I’d really love to have a thorough grasp of the principles and possibilities (minus impossibilities) involved. My thanks in advance!

3 Answers

You are asking very valid questions, and the very fact that you ask them means you understand the essence of the problem. You only need to know the technology to deal with it. There are good answers already, all of them basically correct, but I'll add my own take in the hope of resolving the confusion.

Michael mentioned that there are two things typically involved in the colour management workflow: calibration and profiling. Calibration physically adjusts the display to a known standard. Profiling describes what we've got as the result, for the benefit of the software that can utilise it.

Why would we need profiling if we can do calibration? Well, because

  • Calibration is never perfect.
  • As you say, we may want to use the full capabilities of the display. (At times we don't: I'll come to that later.) In this case we want to do "minimal" calibration.
  • Even if we calibrated perfectly to a standard, the resulting trivial 1:1 profile at least certifies this result, and we (the colour-managed software) at least know what we are doing.

Let me use this single-dimensional analogy. We produce rulers rather than displays. There are several common standards: 30 cm (ISO1), 35 cm (ISO2), etc. We buy one that is advertised as "ISO1-capable". How do we know it actually is? We measure it with a trusted device, say X-Rite TapeMeasure. It tells us: the device is actually 31.2 cm (and, if we want to be advanced with this analogy, it can detect that the divisions on its first half are slightly denser than on the other half: it is not even linear). What we've done is already some profiling. We could now record all these discrepancies, create a correction table/formula, and call it a profile.
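
To make the analogy concrete, here is a toy sketch of what such a profile amounts to: a table of measured discrepancies plus interpolation. All numbers except the 31.2 cm from above are invented for illustration.

```python
# Toy sketch of the ruler "profile": a table of claimed vs. measured
# positions, applied by interpolation. Only the 31.2 cm figure comes
# from the text above; the rest is invented.
printed_marks = [0, 10, 20, 30]          # what the ruler claims (cm)
true_positions = [0.0, 9.8, 20.1, 31.2]  # what the TapeMeasure found (cm)

def profile_lookup(claimed: float) -> float:
    """Convert a reading on the imperfect ruler into a true length by
    linear interpolation between the measured points."""
    for i in range(len(printed_marks) - 1):
        lo, hi = printed_marks[i], printed_marks[i + 1]
        if lo <= claimed <= hi:
            t = (claimed - lo) / (hi - lo)
            return true_positions[i] + t * (true_positions[i + 1] - true_positions[i])
    raise ValueError("reading outside the profiled range")

print(profile_lookup(15))  # 14.95 -- the ruler over-reads here (denser divisions)
print(profile_lookup(30))  # 31.2  -- "full scale" is longer than claimed
```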

But who will apply this correction? Right, the computer. By definition, it can "manipulate the output", so it can compensate for imperfections of the output device. Doing this correction is, essentially, calibration. But like it's better to "fix" the ruler instead of doing mental corrections every time, it's usually better to "fix" the display instead of correcting the computer's output.

  • Typically (but not always for cheap monitors), the display offers higher-quality adjustments.

  • An adjusted (calibrated) display will be correct for all inputs, even for devices that can't apply correction themselves.

    • But only as long as we calibrated to a common standard. The device simply needs to be told to output in that standard (say, sRGB), which for many devices is implicit.1 This was less true for analog video cables, where calibration would compensate for individual imperfections of the cables and on-board DAC.

      In our ruler analogy, this is equivalent to "massaging" the ruler to be linear and cutting it off to the standard 30 cm. Then all the consumers need to know is that the device is ISO1-compliant, or at least that it is metric - which is often an implicit assumption, but still something to be aware of.

  • On a computer, different software may compete for adjustments. A game or a screen saver may ruin the correction table - worst of all, unbeknownst to the user. This necessitates the presence of a resident program (usually from the supplier of the calibration device) that monitors and reinstates the calibration adjustments (see the sketch just below). Not to mention, this has to be done after every computer restart.
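
The resident program's job is conceptually tiny; here is a hypothetical sketch (get_ramps and set_ramps stand in for the platform-specific calls that read and write the video card's lookup tables; they are not a real API):

```python
import time

def calibration_watchdog(stored_ramps, get_ramps, set_ramps, interval=5.0):
    """Hypothetical sketch: if anything (a game, a screen saver) has
    replaced the video card's per-channel lookup tables, quietly put
    the calibrated ramps back. Runs for the lifetime of the session."""
    while True:
        if get_ramps() != stored_ramps:
            set_ramps(stored_ramps)  # reinstate the calibration
        time.sleep(interval)
```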

Having established that, we can consider three scenarios.

  1. Professional/advanced monitors have full built-in calibration capabilities. They have high-quality lookup tables (LUTs) and can be calibrated to any reasonable target. Sometimes they even have built-in colorimeters; sometimes colorimeters can be connected directly to them; sometimes it all goes via the computer, but only for convenience and flexibility of control.

    In order to see correct colours from a DVD player on such a monitor, you calibrate it for Rec.601 or Rec.709 as appropriate (or just sRGB - they are very similar), typically using the display's proprietary software, store the settings and that's it.2

  2. Normal consumer displays (including most TVs) have, at best, separate colour adjustments (R-G-B). The calibration process will ask you to adjust these to obtain neutral greys with respect to your chosen target (see Michael's answer). This gets you some of the way, and it's the best you can do for your DVD player. But you don't know for certain how red your red is or how correct your yellow is. This is similar to "linearising" our ruler to some degree, so that the middle (15 cm) lands in the middle, but the resulting "total length" and the remaining non-linearities remain known only to the computer.

    2a. Many modern displays (including TVs) have standard presets (typically sRGB, even though they may go by a different name). You can check how correct they are by profiling the display in this mode with your colorimeter. Good software will show you the actual gamut and let you compare it with the theoretical sRGB. As a minimum, you can check how close the measured colour temperature is to 6500K (a sketch of this check follows the list). If the result is reasonable, using such a preset is probably the best option for home devices if you want "correct" colours.

  3. Laptop displays often don't have any controls (except for backlight brightness). Inevitably, they have to be fully calibrated by the video card.
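
The colour-temperature sanity check mentioned in 2a can even be done by hand from the (x, y) chromaticity that most profiling software reports. A sketch using McCamy's well-known approximation (accurate to a few tens of kelvin near the daylight locus, which is plenty for a 6500K check):

```python
def cct_mccamy(x: float, y: float) -> float:
    """Approximate correlated colour temperature (kelvin) from CIE 1931
    (x, y) chromaticity, using McCamy's formula."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point should come out close to 6500 K:
print(round(cct_mccamy(0.3127, 0.3290)))  # 6505
```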

Now, why would you want to sacrifice the display's capabilities when calibrating it? If you need, say, to reduce the blue gain in order to compensate for excessive bluishness of the backlight, surely you are sacrificing the "bluest blue" you can produce. Indeed. But do you want correct colours or fancy colours? It surprises some that after calibration the image may look "duller", but it is consistent with the original intention. If this is not important to you, you don't need to calibrate.

There are practical consequences here. Sometimes displays have wider gamut than the desired standard, even after calibration for greys. (Like our ruler was a bit longer than 30 cm). What to do then? Wouldn't it be good to use this extra capability? Yes! But...

  • The display is now non-standard, or at least not the assumed standard. If you naively output an image to it 1:1, it will appear oversaturated (and possibly tinted in some colour regions). (The "max" signal will produce 31.2 cm instead of the assumed 30 on our ruler.) Some action needs to be taken.
  • A computer can take action if it knows the difference. This measured difference is recorded in the profile. There are three possibilities here:
    • The computer needs to know the "intention", i.e. the colour space of the image. If it decides that the image is "narrow" (ISO1 in our allegory), it will compress the output and you'll get a desaturated (but correct) result.
    • If the image is "wide" (wider in gamut than your display), the computer will try its best to "massage" it into your available range. You will benefit from this extra bit you have (provided that the image actually has such saturated colours).
    • If, for some reason, the display profile is ignored (i.e. colour management is off), you will get distorted colours. And this is very common. The moral is: you have to be constantly aware of the fact that you are using a non-standard device and need "smarter" software to deal with that.
  • A device that can't adjust the output (can't colour manage), like a DVD player, will inevitably display distorted colours. For such cases, you are better off clipping your display to the standard. (A toy sketch of the naive vs. colour-managed difference follows.)
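
Here is a toy per-channel sketch of that list (all numbers invented): a display whose red primary overshoots the standard by the factor recorded in the profile.

```python
# Toy model: a "standard" display reaches 1.0 on each primary;
# ours overshoots on red. The factor is invented for illustration.
DISPLAY_RED_MAX = 1.2  # measured overshoot, recorded in the profile

def naive_output(intended_red: float) -> float:
    """No colour management: send the code 1:1. An sRGB red of 1.0
    lights up the display's full 1.2 -- oversaturated."""
    return intended_red * DISPLAY_RED_MAX

def managed_output(intended_red: float) -> float:
    """Colour-managed: the computer consults the profile and scales the
    code, so the displayed stimulus matches the intention. Colours up
    to 1.2 (wider than the standard) become genuinely reachable."""
    code = min(intended_red / DISPLAY_RED_MAX, 1.0)
    return code * DISPLAY_RED_MAX

print(naive_output(1.0))     # 1.2  -- distorted (too red)
print(managed_output(1.0))   # 1.0  -- correct
print(managed_output(1.15))  # 1.15 -- the extra gamut is actually used
```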

1 It's a separate big topic what is the assumed/default standard (profile) for different devices.

2 Of course, you'll need to refer to the manual for exact instructions. There could be several pitfalls, like calibration being separate for different inputs etc.

Correct answer by Zeus on April 29, 2021

You're conflating calibration with profiling. They're two different things.

Let's do a basic definition of those two terms:

Almost all displays can be calibrated to one degree or another, but not all displays can be profiled.

  • Adjusting a simple brightness/black level control is a kind of calibration.
  • Adjusting a simple contrast/white point control is a kind of calibration.
  • Adjusting simple RGB controls is the next higher level of calibration.
  • Adjusting different sets of RGB controls for dark, mid, and bright grayscale levels is the next higher level of calibration.
  • ... and so on as more refined adjustments are offered by a device's user adjustable controls.

Calibration is what happens when we manually adjust a control and then measure the result in a way that the measurement does not change the device's output. The measurement only tells us the result of our manual adjustment. We then manually change the adjustment to try and get closer to the target output and measure again. Rinse and repeat until we get as close as we can.
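
That adjust-measure loop is simple to express; here's a sketch where the measure and adjust callables stand in for the colorimeter reading and the human turning the on-screen-display control (both hypothetical):

```python
def calibrate_control(measure, adjust, target, tolerance=0.5, max_rounds=20):
    """Sketch of the manual calibration loop: measure, nudge the
    control one notch toward the target, measure again, repeat."""
    for _ in range(max_rounds):
        error = target - measure()
        if abs(error) <= tolerance:
            break  # as close as this control can get us
        adjust(+1 if error > 0 else -1)

# Example with a fake monitor whose control moves the reading 2 units/notch:
reading = {"value": 90.0}
calibrate_control(
    measure=lambda: reading["value"],
    adjust=lambda step: reading.update(value=reading["value"] + 2 * step),
    target=100.0,
)
print(reading["value"])  # 100.0
```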

Profiling is when we use a measuring device to create and load a set of instructions that does change the device's output.

Ideally, a screen should be calibrated using its own adjustment parameters (and/or profiled, if the screen has that capability) before it is profiled to work with a specific device that has profiling capability.

Only after using the display's own adjustment controls to get it as close as possible to the targeted output do you run the profiling section of the software, which adjusts the output of the device providing the signal (usually a computer's GPU). In most cases, if you calibrated the screen properly using its internal controls, the generated profile should add very little correction to the signal sent to the screen (which, of course, assumes the display has adjustment controls capable of calibrating it well).


SO THE ULTIMATE QUESTIONS ARE: A. if I have to use my Xrite i1Profiler to calibrate a screen for use with a game player or DVD-player that can't run software, how do I do that? B. If I have to use my Xrite i1Profiler to calibrate a wall-sized plasma TV (for example) for use without a player at all, how do I do that?

A and B are basically the same question, except that the output from your gaming console, DVD player¹, or whatever other device is feeding/streaming programming to your TV screen (which could be an internal tuner module, but probably isn't for most folks) may use different standards/color spaces. This is where it gets a bit messy. You might want to use the same screen to play games, watch Blu-ray movies, stream programming via an Amazon Fire or Roku, or watch programming coming from your cable TV box! They might not all have the same color space capability. You might have to choose one device for the screen to be optimized to pair with and let the results from lower-fidelity (or less frequently used) input devices fall wherever they land.

How do I do that?

It all depends upon the capabilities of the screen.

If your screen has no capability to load LUTs (basically, profiles), then you're stuck getting the screen as close as you can using the screen's internal RGB, brightness, and contrast adjustments while it is hooked up to your computer. While doing this, you want to:

  • Set the target color space to either the screen's capability or that of the primary device that you plan to use with the screen, whichever is more limited.
  • Set the color temperature target (5000K, 6500K, D55, D65, etc.) to whatever the prevailing ambient light will be where and when you plan to use the screen the most. A lot of folks will say to set it at D65 regardless of ambient light. That works well in dark movie theaters or dark home theaters, but doesn't always work as well in places that are fairly bright where the color, tint, and spectral distribution of the light aren't close to direct, mid-day sunlight.
  • Set the brightness target (120 cd/m², 160 cd/m², etc.) to whatever is appropriate for the anticipated brightness of the ambient light in which you plan to use the screen the most.

If your screen has independent profiling capability, then you use your computer and software to load the correction profile to the screen, rather than to your computer's GPU. Different screens may allow this in so many different ways that you'll have to dig into the screen's User Manual or Technical Manual to see how to do it for a specific model.

If your screen has the ability to load multiple LUTs/profiles for different input devices connected to their own input port (HDMI1, HDMI2, DP3, etc.), then you can rinse and repeat for each use case scenario. Do a profile for watching movies from your streaming device or Blu-ray player in a dim room with warm lighting. Do another profile for playing games in brighter, cooler light during the day when there's lots of sunlight illuminating the room, etc.

¹ Please - FOR THE LOVE OF ALL THAT IS GOOD - upgrade to Blu-Ray if you're using a plasma TV or other really nice and/or large screen!


I've still got an older X-Rite colorimeter and the ancient software that came with it. The first step is to use the monitor's own contrast, RGB, and brightness adjustments until the colorimeter measures the output as close as possible to the desired output.

Here's the closest I can get RGB and brightness to target using the monitor's own controls:

[screenshots: colorimeter readouts showing the RGB balance and brightness closest to target]

This monitor is a fairly cheap one, and it's also fairly new, which means it's still very bright. Monitors dim as they age, so it's nice to get one that has plenty of headroom when it's new. I had a Dell monitor that I used for 6-8 years; it started out having to be set to about 50% brightness, and I replaced it when it couldn't reach the target brightness even set all the way to 100%. With the current monitor, I had to do a LOT of correction using its internal controls.

Brightness and contrast were set to:

Brightness: 14/100
Contrast: 66/100

The RGB controls were set to:

Red: 36/100
Green: 25/100
Blue: 25/100

For whatever reason, moving the red control above 36 does not increase the red response of the colorimeter, even though the screen is noticeably pink-tinted with red set to high values, so the other two colors had to be set to match red at 36/100. (Reducing the red control to values less than 36 did reduce the amount of red response by the colorimeter, so I'm guessing 36-37 is all it takes to fully saturate the colorimeter's red channel.)

After running the automatic profiling routine, here's the screen showing me the curves applied via the monitor profile:

[screenshot: correction curves applied via the monitor profile]

Notice that there is very little correction being done via the profile! The red, green, and blue "curves" are almost a perfectly straight slope with the three colors almost on top of one another. If I had set the internal RGB controls to the same values, say R:36, G:36, B:36, then the profile would show deep curves reducing Green and Blue and a mild curve increasing Red.
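
Those profile "curves" are just one lookup table per channel. Here's a sketch with invented channel gains chosen to mimic that 36/36/36 case (green and blue running hot, red slightly weak):

```python
# One 256-entry table per channel: output code as a function of input
# code. The gains are invented to mimic the R:36 G:36 B:36 case.
measured_gain = {"red": 0.98, "green": 1.35, "blue": 1.40}

curves = {
    channel: [min(round(code / gain), 255) for code in range(256)]
    for channel, gain in measured_gain.items()
}

# Red stays a near-1:1 line (mild increase); green and blue are pulled down.
print(curves["red"][200], curves["green"][200], curves["blue"][200])
# -> 204 148 143
```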

Here's the "test pattern" on the final page of the calibration software. The first is with the profile applied. The second is with a generic sRGB profile applied.

[screenshots: the test pattern with the custom profile applied vs. with a generic sRGB profile]

There's almost no visually distinguishable difference! So if you hook up your monitor or TV to your computer and calibrate it well using its own controls, then you can use it attached to things such as DVD/Blu-ray players or gaming consoles and it will still be so close that you can't tell the difference. Changes in ambient lighting conditions will have much more influence on your perception of the color output than the minor inaccuracies that not having a proper color profile will introduce.

[OK, so I later realized that the visual differences seen looking at the monitor with the profile applied and not applied would not be reflected in a screen shot, which is created from the signal before any profile is applied to the signal sent to the monitor. But I promise, I didn't notice any difference in the actual monitor output when viewing it with and without the custom profile. The hardware calibration got it that close. When I tried to do the same thing with the monitor basically set to 6500K and then profiled to 5500K I realized my error. They looked markedly different on the monitor, but the screenshots were still identical! LOL]


Just for fun I went back and changed the monitor's hardware RGB settings to R:36, G:36, B:36.

Here's what the colorimeter measured.

[screenshot: colorimeter measurement with the RGB controls at 36/36/36]

It looks much worse than it is. It doesn't take much adjustment to push one color all the way to the end of the scale. Notice the colorimeter measured the output as 6400K, which is probably the monitor's native white point (with a goal of 6500K).

Here's the correction provided by the color profile generated with that hardware setting. As expected, green and blue are reduced compared to red. Notice also that blue measured slightly stronger than green before the profile was generated, so to compensate, the blue line runs slightly below the green line in the generated profile.

[screenshot: correction curves in the profile generated from the 36/36/36 setting]

Answered by Michael C on April 29, 2021

It is possible to calibrate both, but calibrating the display is much better. Unfortunately, not all displays can be calibrated, and so many people resort to calibrating the graphics card. The problem is that this can introduce artifacts causing banding or reduced color reproduction.

The difference is crucial if you want to see smooth color gradations, or even a smooth grey scale. Most graphics cards have 8-bit per-channel output, which gives 256 levels for each primary color (10-bit output has been available for a long time, but 10-bit displays are only recently becoming common). So when you calibrate the graphics card, the output is adjusted but precision can easily be lost. Imagine, for example, that your screen is too red and the calibration determines that red must be multiplied by 0.8; the graphics card can then only send values from 0 to 204 (255 × 0.8), so gradations are lost.
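
You can count the loss directly. A quick sketch of the 0.8 red correction applied in the graphics card's 8-bit output:

```python
# How many distinct 8-bit levels survive a 0.8 correction done in 8 bits?
levels = {round(v * 0.8) for v in range(256)}
print(len(levels))  # 205 -- roughly 50 of the 256 levels are gone

# Worse, neighboring input codes collapse onto the same output code,
# turning a smooth 0..255 ramp into a staircase (visible banding).
collapsed = sum(1 for v in range(255) if round(v * 0.8) == round((v + 1) * 0.8))
print(collapsed)    # 51 adjacent pairs become indistinguishable
```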

When you calibrate the monitor itself, the mapping between the input signal and the panel is performed inside the display at much higher precision. High-end displays (even relatively affordable ones) feature 14-bit 3D LUTs exactly to perform this mapping. This is far more precise and allows 8-bit or 10-bit color to be rendered at its full precision.
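
For contrast, here is the same 0.8 correction applied the way a display with a high-bit-depth LUT would do it (14-bit chosen to match the figure above): the 8-bit input indexes a much finer table, so no input codes collapse.

```python
# Same 0.8 correction, but performed in a 14-bit table driving the panel.
LUT_MAX = 2**14 - 1  # 16383 drive levels instead of 255

lut = [round(code / 255 * 0.8 * LUT_MAX) for code in range(256)]

print(len(set(lut)))  # 256 -- every 8-bit input keeps its own output level
```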

Keep in mind that when you calibrate a display, it is calibrated to a certain target, because the translation from the input signal to the panel is specific to that target. So if you want to calibrate a screen for use with a device that does not support loading a profile, you must calibrate it to show the colors that device expects to display. For entertainment devices this is often NTSC or sRGB.

Summarizing to answer your questions:

  1. Yes, you can calibrate the output, but you can also calibrate the display, when the hardware supports it.
  2. Yes, you generally have to run the software on each device to load the intended calibration.
  3. Yes, it is possible to calibrate a screen for a device that is not calibration-aware, but you must calibrate the display to the color space output by that device.
  4. Yes, if you calibrate the output, there can be artifacts.
  5. It depends on the screen. If the screen has a built-in calibration LUT and the software that comes with the calibrator can set it, you can do that. Some screens even support multiple LUTs, and you can switch between them from the display controls.

Answered by Itai on April 29, 2021
