jonpais

Say Your Goodbyes to SDR!


I uploaded my very first HDR video to YouTube about a week ago, and since then I've had no desire to look back. Every filmmaker is hunting for the camera with the highest dynamic range, good low-light performance and nice highlight rolloff, and as a consequence we keep upgrading our cameras every six months to two years - a costly enterprise! But if your camera shoots RAW or LOG, you are already prepared to edit and deliver in HDR, which offers a wider color space than SDR as well as greater dynamic range - and not just a paltry one- or two-stop improvement.

Many balk at investing in HDR because of the supposed expense, but for less than the price of a Lumix GH5S you can purchase a Ninja Flame or Inferno, a BMD UltraStudio Mini Monitor and the very best 55" OLED television, and begin editing high dynamic range video right away.

And if, like me, you primarily distribute your videos on YouTube, you'll be happy to learn that the quality is greatly improved as well: fewer compression artifacts, less macroblocking, less noise in the shadows and much sharper-looking images. Why sharper? Because of the insanely higher local contrast of OLED displays, images that looked soft and smudgy in SDR are suddenly crisp again. Possibly for the first time in your life, you'll see inky blacks and brilliant highlights.

And what about the workflow? It's little different from any other, really. You simply select wide dynamic range for your library and project in Final Cut Pro and away you go. If you shoot with the GH5, HLG is a bit simpler than V-Log, if only because you won't need any LUTs for recording, viewing your footage in the timeline, or delivery. The GH5 has built-in LUTs for monitoring the image while recording, and you can use the AtomHDR feature on the Ninja for grading (you can also use AtomHDR while recording). If you prefer working with V-Log, however, there are LUTs available for all of the above.
I prefer HLG because, in my opinion, there's less work in post, the shadows are less noisy and it has a more pleasing highlight rolloff, but you're free to choose whichever you think best. If you have clients who require Rec.709, conversion LUTs are also available.
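Since HLG comes up a lot in this thread: its transfer function is published in ITU-R BT.2100 and is simple enough to write down. A minimal Python sketch of the HLG OETF (scene-linear light in, signal value out), purely for illustration:

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A            # 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Map scene-linear light e in [0, 1] to an HLG signal value in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root segment (shadows/midtones)
    return A * math.log(12 * e - B) + C  # logarithmic segment (highlights)

print(hlg_oetf(1 / 12))  # 0.5 at the segment boundary
print(hlg_oetf(1.0))     # ~1.0 at peak scene light
```

The log segment above the knee is what gives HLG its gentle highlight rolloff, and the curve needs no display metadata - which is part of why the workflow is lighter than PQ-based HDR.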

Share this post


Link to post
Share on other sites

Agreed 100%!

Now we just need the displays to catch up.

I have an LG OLED and it's wonderful, and there are a few HDR projectors (mostly 1080p), but there's a real lack of HDR computer displays and laptops out there - and to be honest, I think I know why: the brightness would cause eyestrain at such close viewing distances.


@AaronChicago For now, I'm planning to use the Ninja Inferno as a grading monitor. Not ideal, I know, but we're not living in an ideal world - and many of us are still editing SDR on less-than-ideal monitors as it is. 😬 There is always the Dell UP2718Q, which is HDR Premium certified and has gotten excellent reviews, but as far as I know it's not compatible with Apple, and it's pricey ($1,659.00 USD) - even if it were compatible, I believe you'd be looking at $3,000 once you add a premium I/O device. I just picked up my iMac from the service center and bought a Thunderbolt 2 cable and a USB-C to Thunderbolt 2 adapter to hook up the UltraStudio Mini Monitor. I won't even tell you how much those cables set me back! I'll be installing the firmware for the device this evening, praying my computer doesn't crash. Will keep everyone updated.


A key issue in all this is what an uploaded HDR video looks like on YouTube for viewers who don't have HDR capability.

The good news is that YouTube converts the HDR video (provided it has the correct metadata) to an SDR version, and serves the SDR version if it detects no ability to display the HDR video correctly. That is, if the video is HLG and the display can't handle HLG, it plays the SDR version.

And in my experience, the SDR versions of HLG videos converted by YouTube look good.

The downside is confusion on the part of viewers as to whether they are seeing HDR or SDR. The uploader usually labels the video as HDR, so many viewers may think they're watching HDR when they're not - which can be true even on a genuinely HDR-capable display that, say, doesn't do HLG.

All in all, if YouTube is where you upload, I'm leaning toward always creating and uploading HDR videos.

 

42 minutes ago, markr041 said:

The good news is that YouTube converts the HDR video (provided it has the correct metadata) to an SDR version, and serves the SDR version if it detects no ability to display the HDR video correctly. That is, if the video is HLG and the display can't handle HLG, it plays the SDR version.

And in my experience, the SDR versions of HLG videos converted by YouTube look good.

I haven't tried it, but it seems you can also provide hints for the SDR down-conversion with your own 3D LUTs. Check the "More control over SDR conversion" section here: https://support.google.com/youtube/answer/7126552?hl=en
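I haven't verified what YouTube actually accepts, but for anyone curious what such a LUT even is: a .cube file (per Adobe's Cube LUT specification) is just a text header followed by RGB triples. A hedged Python sketch that writes a small identity 3D LUT - the file name and grid size here are arbitrary choices, and an identity LUT changes nothing; a real conversion LUT would map HDR code values to SDR ones instead:

```python
# Write a minimal identity .cube 3D LUT file.
def write_identity_cube(path: str, size: int = 17) -> None:
    with open(path, "w") as f:
        f.write('TITLE "identity"\n')
        f.write(f"LUT_3D_SIZE {size}\n")
        # .cube sample order: red varies fastest, then green, then blue
        for b in range(size):
            for g in range(size):
                for r in range(size):
                    f.write(f"{r/(size-1):.6f} {g/(size-1):.6f} {b/(size-1):.6f}\n")

write_identity_cube("identity.cube")
```

The 17×17×17 grid is a common compromise between file size and precision; players interpolate between the grid points.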


It's still not clear to me what's required for viewing HDR.

The Ninja Inferno can do it, but its screen isn't true 10-bit - it's an 8-bit panel plus 2-bit FRC (frame rate control, iirc).

Now, I currently have a (three-year-old) LG 27" screen that is also 8-bit + 2 FRC, but I don't believe I can watch HDR content on it - or if I can, I've been unable to find out how. I guess something extra is needed for compatibility with HLG / HDR10 / Dolby Vision?

I've got an RX 480 graphics card, which can output 10-bit afaik, but how all these things play together for actually being able to view HDR (or not) is a bit of a confusing mess, in my opinion.


@Grumble The Ninja Inferno is just a poor man's way to get started grading HDR. The monitor is admittedly too small and is not true HDR. 

Your three-year-old monitor is not HDR, and it can't be made HDR-compatible. Neither is your computer.

To learn more about HDR, read this article.


The easiest way to see HDR content is with a Mobile HDR-compatible phone like the Samsung Galaxy S8, S8+ or Note 8 (HDR10 and, I think, HLG too).

The YouTube app recognizes the capability on these phones and switches the video to HDR mode (it is labeled as such) when played full screen - if, again, the video has the HDR metadata. The brightness of the screen increases substantially when playback starts, and you can see the difference from the same video played in a small box, which will be SDR. Sure, it's not "true" HDR like on a big-screen TV, but it is certified as true mobile HDR, it comes close to the big-screen HDR experience, and it's really impressive. That alone demonstrates that HDR is worth it.

If you want to show your friends your HDR video in HDR, just whip out the relevant phone and play it using the YouTube app.

This is the Samsung PR about the phone: "The Galaxy S8 is the first ever Mobile HDR Premium certified smartphone, so you can watch shows and films the way they were meant to be seen. And the Quad HD+ Super AMOLED display brings films and shows, games and images to life in vivid detail."

I think some Sony phones are HDR-capable, as are LG phones. I just have the Galaxy S8, so I can only attest that it really works on that phone. So far the S8 (and S8+) is the only phone certified as Mobile HDR Premium.

So what are the requirements for Mobile HDR Premium on smartphones? 10-bit color, a dynamic range of 0.0005-540 nits, and 90% of the P3 color gamut.

And do not scoff at the nits - held up close, as one normally views a phone, that level of brightness is almost blinding.
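For perspective, the black-level and peak-brightness figures in that spec imply a contrast ratio of over a million to one - simple arithmetic:

```python
# Mobile HDR Premium spec figures: 0.0005 nits black level, 540 nits peak
black_nits = 0.0005
peak_nits = 540
contrast = peak_nits / black_nits
print(f"{contrast:,.0f}:1")  # 1,080,000:1
```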

The link to the UHD Alliance press release on Mobile HDR Premium: 

https://alliance.experienceuhd.com/news/uhd-alliance-defines-premium-viewing-experience-battery-operated-devices

1 hour ago, DaveAltizer said:

Do the new MacBook Pros with the P3-gamut display not do HDR? Theoretically, the MacBook Pro is fully equipped to handle the richer colors of HDR content.

To my eye, the contrast ratio and brightness on my late-2016 model aren't even close to HDR-ready. If Apple labels the display HDR-certified then technically it's HDR-certified, but imo it would be deceptive of them to claim it.

Unlike every previous gamma, HDR values are absolute, not relative. It's not just contrast ratio (which is also very poor on the MBP relative to HDR displays) but absolute brightness that gets you an HDR certification. (You need a wide-gamut display too, of course, and maybe it does fulfill that requirement.) When an HDR display can't reach a certain level, the excess is tone-mapped, but all brightness levels below that point are set in absolute terms. So you need a REALLY bright display. HDR starts being meaningful/noticeable around 1,000 nits of brightness, which I believe is the lowest level at which a device can be certified, and starts to look really good around 4,000 nits. But 10,000 nits is where you get "full-spec" HDR.
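The "absolute, not relative" point is baked into the PQ transfer function (SMPTE ST 2084): each code value corresponds to a fixed luminance in nits, regardless of the display. A small Python sketch of the PQ EOTF using the constants published in the spec, just to make the idea concrete:

```python
# PQ (SMPTE ST 2084) EOTF: signal value in [0, 1] -> absolute luminance in nits
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    p = signal ** (1 / M2)
    return 10_000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_to_nits(1.0))  # code 1.0 is always 10,000 nits, on any display
print(pq_to_nits(0.5))  # mid-scale code is only ~92 nits
print(pq_to_nits(0.0))  # 0.0
```

Anything the signal places above a display's actual peak has to be tone-mapped away, which is why the nit figures above matter so much.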

I would be shocked if the MacBook Pro is much brighter than 400 nits at its absolute brightest, and the contrast ratio is very poor anyway compared with an OLED. Apple might say differently, but I don't trust them on this stuff. Vimeo and Netflix have "HDR" content for the iPhone X. While the iPhone X's display is dramatically better than the MacBook Pro's, it still reaches 600 nits at best. I think the Samsung Galaxy S8 can hit 1,000 nits. That might be the only phone that can show off true HDR (granted, only at full brightness, and only at the lowest possible spec - the equivalent of 480p being labelled as HDTV). The Pixel 2 is worse than either at 500 nits.

If Apple claims the MacBook Pro is HDR-compatible, they should be sued for false advertising. It's not even close to being close. If 10,000 nits is 4K, it's barely 240p.


HDR is still too much of a unicorn. On my PS4, HDR content looks great, but there isn't much of a difference because of the contrast ratio of my display - which defeats the purpose of having the HDR feature to begin with. Let's not even talk about Windows' interpretation of HDR, which is either buggy or inferior, because it looks severely washed out.

When I was at the Panasonic B&H workshop the other day, you could see people struggling to tell the difference between the SDR and HDR displays sitting right next to each other. Maybe if they were set really bright we could see the difference, but showing night scenes on two displays at around 90-120 cd/m² was a hard sell.

I think the only people who will benefit from viewing HDR content are OLED monitor owners, and those are so expensive, with such a small market share, that I'd rather shoot for the SDR 2.2-gamma Rec.709 spec that I know the majority of displays are calibrated for. Give it a couple of years and my opinion will change.

51 minutes ago, HockeyFan12 said:

To my eye, the contrast ratio and brightness on my late-2016 model aren't even close to HDR-ready. If Apple labels the display HDR-certified then technically it's HDR-certified, but imo it would be deceptive of them to claim it.

Unlike every previous gamma, HDR values are absolute, not relative. It's not just contrast ratio (which is also very poor on the MBP relative to HDR displays) but absolute brightness that gets you an HDR certification. (You need a wide-gamut display too, of course, and maybe it does fulfill that requirement.) When an HDR display can't reach a certain level, the excess is tone-mapped, but all brightness levels below that point are set in absolute terms. So you need a REALLY bright display. HDR starts being meaningful/noticeable around 1,000 nits of brightness, which I believe is the lowest level at which a device can be certified, and starts to look really good around 4,000 nits. But 10,000 nits is where you get "full-spec" HDR.

I would be shocked if the MacBook Pro is much brighter than 400 nits at its absolute brightest, and the contrast ratio is very poor anyway compared with an OLED. Apple might say differently, but I don't trust them on this stuff. Vimeo and Netflix have "HDR" content for the iPhone X. While the iPhone X's display is dramatically better than the MacBook Pro's, it still reaches 600 nits at best. I think the Samsung Galaxy S8 can hit 1,000 nits. That might be the only phone that can show off true HDR (granted, only at full brightness, and only at the lowest possible spec - the equivalent of 480p being labelled as HDTV). The Pixel 2 is worse than either at 500 nits.

If Apple claims the MacBook Pro is HDR-compatible, they should be sued for false advertising. It's not even close to being close. If 10,000 nits is 4K, it's barely 240p.

HDR vs Resolution vs Brightness.

I believe they are different and separate from each other. Or do they affect each other?

It's almost like asking "Does the world continue to move when I have my eyes closed?" or "If a tree falls in a forest and no one is there to hear it, does it make a sound?"

 

2 minutes ago, mkabi said:

HDR vs Resolution vs Brightness.

I believe they are different and separate from each other. Or do they affect each other?

It's almost like asking "Does the world continue to move when I have my eyes closed?" or "If a tree falls in a forest and no one is there to hear it, does it make a sound?"

 

They all contribute to contrast and perceived resolution. I'm being metaphorical, though.

To my eyes, the difference between 4k and 240p is about as dramatic as the difference between a 10,000 nit display and my late 2016 rMBP.

59 minutes ago, andrgl said:

Or you can wait for the real world to catch up with the standards, saving yourself a shitload of cash. 4K hasn't even overtaken 1080p yet. And now HDR has made non-10-bit 4K panels obsolete?

Early investment in hardware only benefits manufacturers.

Exactly. There isn't even a display calibration standard for HDR. The industry created the tech and rolled it out totally prematurely to sell units. I think everyone knows we are hitting the ceiling in terms of tech and display quality, so manufacturers are dropping as many buzzwords and gimmicks as possible to sell products. It's not even overkill, it's just useless at the moment. When OLED tech gets cheap enough to be the baseline standard, then OK, HDR is a must - but until then it's just something to fantasize about, kind of like 8K.

Side note: the 8-bit 4:2:0 "noisy" Panasonic GH4 is still being used in Hollywood productions and cut seamlessly with Varicam footage. Friendly reminder to the forum.

1 hour ago, webrunner5 said:

Well, I have a Samsung Note 8 and I can say HDR is some impressive stuff, even on my phone. So if I had the money, I would go for it. It certainly is the future once you have seen it.

This video here is pretty unbelievable in HDR.

 

Yes, seeing HDR on an HDR-capable phone really shows off what HDR does. Who would have thought phones would beat TVs for usability? YouTube just puts the phone straight into HDR mode when you push play (and go full screen) in the app. Bam! It's really difficult then to go back to dynamics-compressed, dull SDR video!

 


@HockeyFan12 @DaveAltizer The MacBook Pros aren’t HDR. hehe 

Words like HDR Ready, HDR Compatible or anything with HDR in the name does not mean anything in and of itself. It is certainly disappointing when even manufacturers like LG, who make among the best HDR displays today, consider it necessary to deceive buyers with garbage like HDR Pro or HDR Effect on their entry level televisions which are not HDR at all. In the very loosest sense of the term, HDR can mean as little as that a piece of equipment will process the signal! 😳

As far as brightness is concerned, the requirements differ depending on whether the display is OLED or LCD. And brightness is only one factor - LCD displays must use full-array local dimming in order to increase local contrast. A MacBook Pro with a full-array backlight would be chunky as heck, and full-array backlights are extremely expensive - witness Dell's UP2718Q. Which is why I believe we'll only see some semblance of HDR in notebooks once laptops with OLED screens appear.
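To illustrate why full-array local dimming matters - a toy sketch, not a model of any particular panel: the backlight is divided into zones, and each zone's LEDs are driven only as hard as the brightest pixel in that zone requires, so a dark zone can sit right next to a bright one.

```python
# Toy full-array local dimming: one backlight level per zone,
# set by the brightest pixel inside that zone (pixel values in [0, 1]).
def zone_backlight(frame, zone_w, zone_h):
    rows, cols = len(frame), len(frame[0])
    levels = []
    for top in range(0, rows, zone_h):
        row_levels = []
        for left in range(0, cols, zone_w):
            zone = [frame[r][c]
                    for r in range(top, min(top + zone_h, rows))
                    for c in range(left, min(left + zone_w, cols))]
            row_levels.append(max(zone))
        levels.append(row_levels)
    return levels

# A frame that is black on the left half and bright on the right:
frame = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
print(zone_backlight(frame, 2, 2))  # [[0.0, 1.0], [0.0, 1.0]]
```

An edge-lit panel effectively has one zone per column (or just one overall), so black next to white forces a compromise; more, smaller zones get closer to OLED's per-pixel contrast.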

The UHD Alliance published standards for television manufacturers, but you don't necessarily want to apply the same criteria to desktop monitors - for one thing, as Andrew pointed out, sitting a couple of feet away from a 10,000-nit display would probably fry your eyeballs. 🤩 VESA also recently came up with badly needed HDR standards for desktop monitors, but they are intentionally vague and all but useless, in my opinion.

