
HDR on YouTube - next big thing? Requirements?


Axel

Consider HDR already?   

57 members have voted


    • Not interested at all.
      7
    • Don't need it now, will evaluate it when it's everywhere.
      27
    • I wasn't aware of the latest developments, but I'm looking into it now.
      16
    • I am already updating my workflow and hardware, HDR is the next big thing.
      7


Recommended Posts

It's a taste thing, right, trading color saturation for greater dynamic range.  We certainly wouldn't want HDR if it did that, because people who favor saturation over DR would then be left with inferior images.  We need both.  When I say "saturation" (maybe someone can give me a better term), I mean the amount of color information we need to discern all the colors within the display gamut.  Banding is the clearest example of what I mean.  As I mentioned elsewhere, if you display, say, 20 shades of yellow on an 8-bit, 6-stop-DR display, you will see banding, because your eye can tell the difference.  Here are some examples I created.

The first is all 255 shades of green in an 8-bit image, which should render "bandless" on a 6-stop-DR screen:

[image: 8bit.jpg]

I can already see some banding, which tells me that the website might re-compress images at a lower bit depth.

Here's a version with 18% of the colors removed; let's call it 7-bit:

[image: 8bit_minus18pct.jpg]

And now with 32% removed; call it 6-bit:

[image: 8bit_minus32pct.jpg]

The fewer colors (saturation information) there are, the more our eyes/brains detect a difference in the scene.  HOWEVER, what the above examples show is that we don't really need even 8 bits to get good images out of our current display gamuts.  Most people probably wouldn't notice the difference if we were standardized on 6-bit video. But that's a whole other story ;)
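For anyone who wants to reproduce test strips like these, here's a minimal sketch of the idea (my own illustration, not necessarily how the images above were made; assumes numpy and Pillow are installed):

```python
# Generate a green gradient strip, optionally keeping only a fraction of
# the 256 possible 8-bit levels, so banding becomes visible.
import numpy as np
from PIL import Image

def green_ramp(keep_fraction=1.0, width=1024, height=200):
    # Pick N evenly spaced green levels out of 0..255.
    levels = np.linspace(0, 255, int(256 * keep_fraction)).astype(np.uint8)
    # Stretch the chosen levels across the image width.
    row = np.repeat(levels, width // len(levels) + 1)[:width]
    img = np.zeros((height, width, 3), dtype=np.uint8)
    img[:, :, 1] = row  # green channel only
    return Image.fromarray(img)

green_ramp(1.00).save("8bit.png")            # all 256 levels
green_ramp(0.82).save("8bit_minus18pct.png") # ~18% of the levels removed
green_ramp(0.68).save("8bit_minus32pct.png") # ~32% removed: clear banding
```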

How does this relate to HDR?  The more you shrink the gamut (making the image more contrasty), the less difference you see between the colors, right?  In a very high-contrast scene, a sky will just appear as one solid blue.  It's as we increase the gamut that we can see the gradations of blue.  That is, there must always be enough bit depth to fill the maximum gamut.

For HDR to work for me, and it sounds like for you too (I believe we have the same tastes), the bit depth needs to keep up with the expansion in gamut.  So, doing some quick back-of-envelope math (which someone can hopefully fix): let's say that for every stop of DR we need about 42 shades of any given color (255 levels / 6 stops).  That's what we have in 8-bit currently, I believe.  Therefore, every extra stop of DR will require 297 (255 + 42) shades in each color channel, or 297 × 297 × 297 = 26,198,073 colors.

In 10 bits, we can represent 1,024 shades, so, roughly, 10-bit should cover about 24 stops of DR in total; that is, with 10-bit, we should be able to show "bandless" color on a screen with 14 (even 20+) stops of DR.
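To make that back-of-envelope math concrete, here's a tiny sketch (assuming, as above, roughly 42 distinguishable shades per stop):

```python
# Rough banding math: 255 levels spread over 6 stops of DR gives about
# 42 shades per stop; how many stops can each bit depth then cover?
SHADES_PER_STOP = 255 / 6  # ≈ 42.5

def max_bandless_stops(bit_depth):
    levels = 2 ** bit_depth - 1
    return levels / SHADES_PER_STOP

for bits in (8, 10, 12):
    print(f"{bits}-bit: ~{max_bandless_stops(bits):.1f} stops")
# 8-bit: ~6.0 stops, 10-bit: ~24.1 stops, 12-bit: ~96.3 stops
```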

What I think it comes down to is that better video is not a matter of improved bit depth (10-bit), or codecs, etc.; it's a matter of display technology.  I suspect that when one sees good HDR, it's not the video tech that's giving a better image, it's the display's ability to show deeper blacks and more subtle DR.  That's why I believe someone's comment about the GH5 being plenty good enough for HDR makes sense (though I'd extend it to most cameras).

Anyway, I hope this articulates what I mean about color saturation.  The other thing I must point out is that, though I've argued 10-bit is theoretically suitable for HDR, I still believe one needs RAW source material to get a good image in non-studio environments.

And finally, to answer the OP: I don't believe you need any special equipment for future HDR content.  You don't even need a full 8 bits to render watchable video today.  My guess is that any 8-bit video graded to an HDR gamut will look just fine to 95% of the public.  They may well notice the improvement in DR even though they're losing color information, because again, in video, we seldom look at gradient skies.  For my tastes, however, I will probably complain, because LOG will still look like crap to me in many situations, even in HDR ;)  10-bit?  Well, we'll just have to see!

 



According to Nick Driftwood, you should always be using 10-bit when shooting HLG (not that you've got a choice!):

10-bit is where you should ALWAYS be when using V-Log L and HLG because you're making use of the bigger headroom - luminance detail and chrominance depth data levels - that 10-bit offers you.

Source


Sorry to interrupt, but my colleague with a Samsung HDR TV, which has a YouTube app that actually displays HDR videos, reports that my HDR video looks fine and is clearly HDR - it looks luminously realistic, not washed out (and not oversaturated). He also had to manually set his TV to HDR mode, otherwise the picture looked washed out. Another colleague, an expert in TVs, reports that the latest YouTube app for LG TVs does not display HDR. And yes, my video was shot in 10-bit and rendered in 10-bit, from S-Log/S-Gamut.


lol, this guy xD

"So it's a stupid idea to shoot HLG, please ignore those people who say that you need to shoot HLG for more high dynamic range, they obviously have no idea what they're talking about. [..] Ignore it completely."

Uhhhh... bro! Like they'll just add a feature for shits 'n giggles. I found his earlier analysis stuff interesting, but to dismiss and discourage HLG is a bit messed up.


3 hours ago, jonpais said:

I don’t know which YT app you are referring to, @markr041, but my LG C7 (2017) has the YT app, and it flags HDR content with the appropriate type (Dolby Vision, etc.); all the videos on the HDR Channel play correctly for me as well...

I am trying to be helpful. Your picture conclusively showed you were not watching the video in HDR - it was what an HDR video looks like when not watched in HDR. I see it in HDR, and my colleague does on his Samsung TV - he can reproduce what you see when HDR viewing is manually turned off on his TV. This is his report exactly:

"At first the HDR video was very disappointing. The SDR looked better. Then I went looking in the Samsung TV menu. There is an ON/OFF for HDR. It was off. I turned it on and the HDR video was much better."

The full information about the YouTube app on LG TVs is: the latest YouTube app in the operating system of all LG TVs does not appropriately play HDR10 videos, but does OK with Dolby Vision and HLG. My video's metadata indicates the video is HDR10. Maybe that's the problem, maybe not. But there is a problem viewing my video on your TV, and not on my player or on a Samsung HDR TV.  I am grateful you took the care to play the video and supply the pictures of what you were seeing. It is very helpful.
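(For anyone who wants to check what their own file is flagged as, the color metadata can be read with ffprobe, e.g. from Python. The filename below is a placeholder.)

```python
# Read the video stream's color metadata with ffprobe (must be installed).
import subprocess, json

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=color_transfer,color_primaries,color_space",
     "-of", "json", "myvideo.mp4"],  # placeholder filename
    capture_output=True, text=True).stdout
print(json.loads(out)["streams"][0])
# color_transfer 'smpte2084' (PQ) indicates HDR10; 'arib-std-b67' is HLG.
```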

I am not blaming anyone, just trying to figure out what is going on. I am also concerned that you or someone else would conclude that the video is washed out with no blacks, when it is not. You are not an expert on TVs or HDR and neither am I, but there is something wrong, and I am reasonably confident it is not the video (I am fine with being proved wrong, since the video can easily be changed).

This thread is about YouTube HDR; this experience shows that there are still problems with the YouTube HDR ecosystem. 


10 hours ago, markr041 said:

I am not blaming anyone, just trying to figure out what is going on. I am also concerned that you or someone else would conclude that the video is washed out with no blacks, when it is not. You are not an expert on TVs or HDR and neither am I, but there is something wrong, and I am reasonably confident it is not the video (I am fine with being proved wrong, since the video can easily be changed).

I wasn't aware there was any doubt, in anyone's mind, that HDR technology is buggy in the real world ;)  I also can't believe anyone on this forum would doubt that you shot what you said you shot.  They're just reporting what they see. I mean, even in this late, mature state of PHOTOGRAPHY, one can print/view someone else's JPG and have it look all F'd up!  Even using color calibration systems, I have found they create unexpected problems in software that doesn't recognize them.

So don't get down, Mark!  I'm looking forward to looking at your stuff when I have HDR equipment.


@markr041

I personally thank you for pioneering. I can't monitor HDR right now, but it's good to know about the intricacies beforehand. You made the effort and tested it - that's awesome!

@maxotics

Very interesting read. I absorb every word and read your posts several times in order not to miss anything.

@jonpais

You always find the right words and explain things very well. What I like most about your postings are the excellent videos that show your good taste and commitment to beauty.

I am glad I started this thread. Invaluable information, great forum. Thumbs up for all.


2 hours ago, maxotics said:

I wasn't aware there was any doubt, in anyone's mind, that HDR technology is buggy in the real world ;)  I also can't believe anyone on this forum would doubt that you shot what you said you shot.  They're just reporting what they see. I mean, even in this late, mature state of PHOTOGRAPHY, one can print/view someone else's JPG and have it look all F'd up!  Even using color calibration systems, I have found they create unexpected problems in software that doesn't recognize them.

So don't get down, Mark!  I'm looking forward to looking at your stuff when I have HDR equipment.

Precisely. I was just reporting what I saw. I've got no doubt Mark shot HDR. @markr041 Sorry if that wasn't clear.


On 11/5/2017 at 7:32 AM, Cinegain said:

lol, this guy xD

"So it's a stupid idea to shoot HLG, please ignore those people who say that you need to shoot HLG for more high dynamic range, they obviously have no idea what they're talking about. [..] Ignore it completely."

Uhhhh... bro! Like they'll just add a feature for shits 'n giggles. I found his earlier analysis stuff interesting, but to dismiss and discourage HLG is a bit messed up.

This just concerns the Rec.709 version using the Leeming LUT, but here’s what Paul’s got to say about HLG vs V-Log and dynamic range:

I've found that using HLG in a Rec709 timeline/colour space is giving fantastic DR (same as V-LogL) plus more tonal precision thanks to using 5-95% of IRE as opposed to V-LogL's 16-79% IRE. It's my main goto profile now other than when shooting high speed 8 bit, where I switch to Cine-D.

and further on, he writes:

In all my testing for HLG vs V-LogL, I've found they have the same dynamic range. Both see equally into the shadows.

source
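A quick back-of-envelope check of those IRE figures (assuming the IRE ranges map linearly onto the available code values):

```python
import math

hlg_range = 95 - 5    # HLG uses 5-95% IRE -> 90% of the scale
vlog_range = 79 - 16  # V-Log L uses 16-79% IRE -> 63% of the scale
ratio = hlg_range / vlog_range
print(f"HLG spreads its tones over {ratio:.2f}x more code values")
print(f"that's about {math.log2(ratio):.1f} bits of extra tonal precision")
# ~1.43x, i.e. roughly half a bit more precision at the same bit depth
```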

Incidentally, not only does wolfcrow seem dead set against HLG unless you’re shooting for the BBC or NHK or own a Panasonic television set, but he’s also pretty militantly opposed to HDR altogether until displays are capable of 10,000 nits. He calls HDR ‘old wine in a new bottle’.


Think about it - even if you had a television that output 10,000 nits (I think that's what Dolby's shooting for, if I remember correctly), the amount of light reflected off the walls of your average room would most likely just about negate any advantage. I think nits are a linear measurement, and my current set emits some 690 of the buggers - if it gave off 10,000 nits, that'd be around 15 times the present amount of light - maybe it would fry my eyeballs!
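Putting rough numbers on that (nits are linear, stops are log2, so the jump is smaller than it sounds):

```python
import math

current, target = 690, 10_000  # nits, figures from the post above
ratio = target / current       # ≈ 14.5x more linear light
print(f"{ratio:.1f}x brighter, i.e. about {math.log2(ratio):.1f} extra stops")
```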

Now that Atomos gear is finally becoming available in Vietnam (yay!), I might invest in the Ninja Inferno and give HLG Rec.2020 a shot - even though I'm not working for the BBC, I don't have a Panasonic TV, I can't tell 4K from 1080p, and I'm not even sure I'd recognize HDR unless it was shown alongside SDR... The Inferno should arrive on these shores just about the same time that Apple releases Final Cut Pro 10.4. Anyhow, even for those who, correctly or not, consider HDR just a marketing gimmick, my advice at this point in the game would still be to invest in the very best television set you can afford, because even SDR content looks a gazillion times more awesome on one of the newer flat screens than on any small desktop or laptop display.

Edit: I guess I'll throw in one more word: regardless of whether you think focusing with the GH5 or another mirrorless camera is good enough using just the built-in LCD screen, I'd strongly encourage picking up a 5" monitor for precise focusing. Especially when shooting flat profiles like V-Log, HLG or even Cinelike D, with sharpening and saturation dialed down, I often can't see focus peaking. With an external monitor, I can distinguish whether the camera's focused on the subject's glasses, their hair, their cheeks or their eyes - something that was next to impossible with the LCD.


3 minutes ago, jonpais said:

I think nits are a linear measurement, and my current set emits some 690 of the buggers - if it gave off 10,000 nits, that'd be around 15 times the present amount of light - maybe it would fry my eyeballs!

From Dolby test laboratory: https://blog.dolby.com/2013/12/tv-bright-enough/

"...At Dolby, we wanted to find out what the right amount of light was for a display like a television. So we built a super-expensive, super-powerful, liquid-cooled TV that could display incredibly bright images. We brought people in to see our super TV and asked them how bright they liked it.Here’s what we found: 90 percent of the viewers in our study preferred a TV that went as bright as 20,000 nits. (A nit is a measure of brightness. For reference, a 100-watt incandescent lightbulb puts out about 18,000 nits.)..."

 

On an average sunny day, the illumination of ambient daylight is approximately 30,000 nits.  http://www.generaldigital.com/brightness-enhancements-for-displays


19 hours ago, Kisaha said:

https://www.atomos.com/sumo19

that seems like an excellent choice for a lot of different people and uses. If anyone has ever shopped for, or even worked with, a broadcast monitor, they will find its price extremely low for what it offers.

From the field, to the studio and beyond!

People in the professional color grading world think the screens on the Atomos units are rubbish for actual grading and that the HDR is just a marketing gimmick.
e.g. http://liftgammagain.com/forum/index.php?threads/atomos-sumo-19-hdr-recorder-monitor.8947/
I was quite interested in the monitor-only Sumo, but the specs are really not that interesting. It's not really 10-bit but 8+2 FRC, and it only does Rec.709 according to the spec sheet. It's an IPS panel with up to 1,200 nits, so I'd expect the blacks to turn gray whenever there's max brightness in the frame (I couldn't quickly find any data about contrast).
It even says "Brightness 1200nit (+/- 10% @ center)", so I wouldn't expect great uniformity.

So I'd say it's still better to get a used FSI or an LG OLED.
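For anyone wondering what 8+2 FRC means in practice, here's a minimal sketch of the usual temporal-dithering scheme (an illustration of the general technique, not Atomos's actual implementation):

```python
# Approximate a 10-bit level on an 8-bit panel by flashing the two
# nearest 8-bit levels over a 4-frame cycle; the eye averages them.
def frc_frames(level_10bit, cycle=4):
    base, frac = divmod(level_10bit, 4)  # 8-bit base + 2-bit remainder
    base = min(base, 255)
    return [min(base + 1, 255)] * frac + [base] * (cycle - frac)

print(frc_frames(513))  # [129, 128, 128, 128]; average 128.25 = 513/4
```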

 

 

Unrelated: my only HDR-capable device is a Samsung Galaxy S8+, and I hate how oversaturated the demo videos I've watched so far are. It didn't help my taste that they were in 1080p60.


Since this thread has already wandered, Jon, do you have access to Netflix or Amazon?  If so, I think it will be quite easy for you to spot the difference with 4K HDR material.  The Ridiculous Six is shot on film and supposedly mastered in 4K Dolby Vision.  Beautiful, natural-looking imagery.  Another thing to keep in mind is that your LG is upscaling HD automatically, as well as applying various detail, sharpening, contrast enhancement, etc. settings. So unless you have turned all of that off, you won't really have a good idea of what the footage looks like.  Also, another thing to think about is a BMD Mini Monitor, DeckLink, etc. combined with Resolve 14, so you can send HDR metadata over HDMI (not to mention a video signal unadulterated by your Mac/PC/GPU).


@sam I’ve got Netflix, I’ll have to check out that show. But don’t underestimate my inability to distinguish HDR from SDR. I can’t say I’ve ‘calibrated’ my set exactly, but I do turn off all the enhancements and stuff. Your advice about the DeckLink, etc. is probably the most sensible; I’m just not sure how deep I want to dive into all this just yet.

