How to process DSLR RAW files for display on modern HDR-capable TVs & computer monitors? (actual 10bit/ch or 12bit/ch dynamic range)

Photography Asked by RMEnger on December 10, 2020

10-bit/ch display panels are becoming common. The latest generation of UHD TVs can display 10 bits per channel, and some consumer TVs reach peak output as high as 1000 nits. Very recently, HDR computer monitors have (finally) come on the market, and both LG and Sony now make smartphones that claim to have HDR-capable display screens.

How should we process our still photo RAW files to display them (with maximum dynamic range fidelity) on these HDR-capable displays? (Actual 10-bit/ch, or greater, dynamic range.)

What software should we be using to process the RAW files?

What output format should we be using?

One would think that JPEG2000 is a logical choice, given it is used in Digital Cinema to support 12bit/ch. But there is little support for it in the still-camera community. (E.g. do any still cameras generate JPEG2000 internally?)

The disparity is likely to get worse, as we're probably going to see home theater projectors with 12-bit/ch and high-nit capability in the future; we already have them in commercial theaters. The Dolby Vision encoding standard for home video supports 12 bits per channel (maybe HDR10+ does too). Consumer use of 12-bit seems only a matter of time.

How do we leverage the fancy sensors in our DSLR cameras (that often cost more than our expensive HDR-capable UHD TV set) to actually display 10bit/ch wide dynamic range still photos to those TVs, computer monitors and smart phones? What is the RAW file processing software chain? What is the output file standard we should use?
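For concreteness, here is a minimal sketch of one possible chain, not a calibrated or endorsed workflow: demosaic the RAW to linear floating point, map it to an absolute luminance, apply the SMPTE ST 2084 (PQ) transfer function, and save 16 bits per channel for a downstream tool or display that knows the data are PQ-encoded. It assumes the Python packages rawpy and tifffile; the file name "photo.CR2" and the 1000-nit peak mapping are placeholders.

```python
# Sketch: RAW -> linear float -> SMPTE ST 2084 (PQ) -> 16-bit TIFF
# Assumes: pip install rawpy tifffile numpy
import numpy as np
import rawpy
import tifffile

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (cd/m^2, 0..10000) to a PQ signal (0..1)."""
    y = np.clip(nits / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

with rawpy.imread("photo.CR2") as raw:
    # Demosaic to linear 16-bit (gamma 1.0, no auto-brightening),
    # keeping as much of the sensor's dynamic range as possible.
    rgb16 = raw.postprocess(
        gamma=(1, 1),
        no_auto_bright=True,
        output_bps=16,
        use_camera_wb=True,
    )

linear = rgb16.astype(np.float32) / 65535.0

# Crude mapping of scene-linear values to display luminance:
# here diffuse white (1.0) -> 1000 nits. A real grade would be a
# deliberate tone mapping, not a single multiply.
pq = pq_encode(linear * 1000.0)

# Store as 16-bit integers; the top 10-12 bits carry the HDR signal.
tifffile.imwrite("photo_pq.tif", np.round(pq * 65535).astype(np.uint16))
```

Note that the TIFF itself carries no transfer-function metadata in this sketch; whatever displays it has to be told (or assume) that the data are PQ-encoded, which is exactly the interoperability gap the question is asking about.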

2 Answers

Your first step is to find out what formats the TV can decode. JPEG2000 is one candidate; TIFF is another, but the TIFF spec is a can of worms, allowing creation of new tags with alternative decoding mechanisms, and many TIFF decoders don't understand all the variants.

Second test: can you actually see 10-12 bits of dynamic range? Paper doesn't even support 8 bits -- the contrast range is about 100:1 for a well-made print (roughly log2(100) ≈ 6.6 stops, i.e. under 7 bits of linear range), and less than that for halftoned images.

The advantage of deeper bit depths is the ability to remap to a lower contrast range without artifacts.
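To put a rough number on that claim, here is a minimal sketch (assumes only NumPy; the gamma-style shadow lift is just a stand-in for whatever contrast remap you actually apply). It quantizes a ramp at a given source bit depth, lifts the shadows, re-quantizes to 8 bits, and counts how many distinct codes survive in the darkest tenth of the range, where banding shows up first.

```python
# Sketch: why extra source bits matter when remapping contrast.
import numpy as np

def surviving_shadow_levels(bits, gamma=2.4):
    """Quantize a full ramp at `bits`, lift shadows with a gamma curve,
    re-quantize to 8 bits, and count distinct output codes in the
    darkest 10% of the input range."""
    levels = 2 ** bits
    ramp = np.round(np.linspace(0, 1, 4096) * (levels - 1)) / (levels - 1)
    remapped = ramp ** (1 / gamma)            # contrast remap (shadow lift)
    out8 = np.round(remapped * 255).astype(np.uint8)
    shadows = out8[: len(out8) // 10]
    return len(np.unique(shadows))

for b in (8, 10, 12):
    print(f"{b}-bit source -> {surviving_shadow_levels(b)} distinct shadow codes")
```

An 8-bit source has only a couple of dozen distinct codes in those shadows to begin with, so the lifted result bands visibly; a 10- or 12-bit source fills nearly every available output code.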

Due to the way the human eye reacts to light, I suspect the only way you can see 10-bit dynamic range is if:

  • Lighting external to the screen is minimal (trip over the dog level)
  • The image itself has very few Zone IX and X regions, and not much Zone VII. Too much white and your eye constricts, which means you can't see squat in the shadows.

Zone system: https://en.wikipedia.org/wiki/Zone_System

Answered by Sherwood Botsford on December 10, 2020

Canon's HDR PQ HEIF might be one of the answers: shoot 10-bit PQ in-camera, then view on an HDR TV. See this article.

Answered by Steven Wang on December 10, 2020
