Axel

HDR on Youtube - next big thing? Requirements?

Consider HDR already?   

57 members have voted

  1. Consider HDR already?

    • Not interested at all.
      7
    • Don't need it now, will evaluate it when it's everywhere.
      27
    • I wasn't aware of the latest developments, but I'm looking into it now.
      16
    • I am already updating my workflow and hardware, HDR is the next big thing.
      7



On 11/20/2016 at 3:06 AM, Policar said:

If that's what Dolby is saying, it's ridiculous. Step outside during a sunny day and you're exposing yourself to brightness far in excess of 10,000 nits all around you. Orders of magnitude brighter. Having seen 10,000 and 4000 side-by-side I can confirm that neither is fatiguing, even in indoor light, though you might choose to turn brightness down at night as you would with any display. And simply because a display can get that bright doesn't mean it will. It's like saying 8k is too sharp. It might be unnecessarily sharp, but you can always show a less sharp image. Only very small parts of the frame ever get that bright.

Dolby, who doesn't have a horse in this race as Philips does (hardware manufacturers want lower levels to be perceived as preferable so they can sell displays before technology advances, whereas Dolby just wants to have the best standard that will eventually get adopted) has found that 10,000 nits is what 90% of consumers consider "adequate." Based on my experiences with HDR, I wouldn't consider anything below 4000 nits even "true" HDR (though it might still be quite impressive) and 10,000 nits for me at least is the baseline of what's ideal, similar to how I think 2k or 1080p is an ideal "good enough" resolution. Anything beyond that is great but diminishing returns and only really appeals to that last 10%. Of course, 2000 nits will still look much better than 100, which is the standard now. It's considerably more than halfway to 10000 in log base 2.
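That last log-scale claim is easy to check. A minimal Python sketch, using only the nit values from the quote (100-nit SDR reference, 2000- and 10,000-nit peaks):

```python
import math

def stops_above(nits, base=100):
    """Brightness expressed in stops (log base 2) above a reference level."""
    return math.log2(nits / base)

# Stops above the 100-nit SDR standard mentioned in the post.
sdr_to_2000 = stops_above(2000)    # ~4.32 stops
sdr_to_10000 = stops_above(10000)  # ~6.64 stops

# 2000 nits covers about 65% of the way to 10000 nits on a log2 scale,
# i.e. indeed "considerably more than halfway".
fraction = sdr_to_2000 / sdr_to_10000
print(f"{sdr_to_2000:.2f} of {sdr_to_10000:.2f} stops ({fraction:.0%})")
```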

As regards future proofing, I don't think any camera I have will come close to matching HDR spec so I don't worry about it. The C300 Mk II with the new SLOG 2 upgrade is the cheapest thing on the market that is HDR-ready, and it was actually designed as such. 

Well, I don't watch movies "outside during a sunny day". 10,000 nits is too much in a relatively dark room. It's like a 10 W LED torch flashing unexpectedly right in front of your face while your pupil is still wide open. Besides, it's not commercially feasible yet, and probably won't be in the next ten years.

You have three wrong assumptions there. First, you think HDR is only defined by a group of cine-geeks in Los Angeles, and that they're the only ones who determine what is HDR and what is not, based on their wishful standard. But they just set their own limits, their own road map; they don't write the bible. Second, you think 15 stops is always necessary. No, it's not: in almost 95% of situations, the whole scene is fully covered by only 9 stops. We don't always point our lens towards the sun. Third, you believe the C300 Mark II is a 15+ stop camera. Sorry, that's not true. Lecture us as much as you want about its color science, but DR? No thanks. The king of the low noise floor on the DXO list is the Nikon D810, and it's just 14.8 EV at a miraculous ISO 64, but its "useful" DR is only 13.5, because they measure down to 1:1 SNR, which means a floor where for every piece of data there is an equivalent piece of noise, i.e. garbage quality. In big-pixel sensors it tends to be even worse. I think RED and Canon use some temporal noise reduction tricks to clean that up, but as with any NR practice, it sacrifices resolution and color accuracy.
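The gap between a headline DR figure and a "useful" one follows directly from where you put the SNR floor. A rough sketch of that arithmetic (the full-well and read-noise figures are made-up illustrative values chosen to land near the numbers in the post, not measured D810 data):

```python
import math

def dynamic_range_ev(full_well_e, read_noise_e, snr_floor=1.0):
    """Engineering DR in stops: log2 of max signal over the level where
    SNR drops to snr_floor (DXO-style charts measure down to SNR = 1:1)."""
    return math.log2(full_well_e / (read_noise_e * snr_floor))

# Hypothetical sensor, illustrative numbers only:
full_well = 78000   # electrons at base ISO
read_noise = 2.73   # electrons

print(dynamic_range_ev(full_well, read_noise))                 # ~14.8 EV at SNR 1:1
print(dynamic_range_ev(full_well, read_noise, snr_floor=2.5))  # ~13.5 EV "useful" DR
```

Raising the acceptable SNR floor from 1:1 to 2.5:1 costs log2(2.5) ≈ 1.3 stops, which is exactly the 14.8 vs. 13.5 EV gap described above.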


First of all, thank you guys for providing some background, controversial as it may be. 

For my part, I am just curious whether the so-called cinematic look is the ultimate answer to all our questions. Because if it were, we would have practically reached the summit of simulating film stock, would have the holy grail of manipulating colors to a degree never deemed possible a decade ago, and with *just* HD would have enough resolution to depict anything we could dream of. Even analog film projection (I know what I'm talking about here; I was both an analog and a digital projectionist for many years) effectively didn't have more resolution and dynamic range than we can produce today.

With the advance of technology these standards are about to change. The UHD specifications call for more DR, finer color differentiation, higher resolution and higher frame rates. Particularly the latter is something cinéastes swear they will never get used to, and I am one of them. While I still think that all these quality improvements do not expand the vocabulary of film, I can now imagine that contemporary SDR films may look faded a few years from now.


So... I'm in The Netherlands visiting my parents... they've got a new Samsung UHD HDR tv. The screen was absolutely awful to look at. Factory settings of course. Full brightness. Full contrast. Samsung colors... way too vivid, Chernobyl Barbie skintones, things blown out and crushed down... So, I went into the settings, changed it from Dynamic or whatevs to HDR, just curious what it would do, went to expert settings, changed things around so that it would actually look good, natural and not as eye straining as it had been before. My dad: 'ah, ok, sure, I guess... whatever'. Mom comes in after work 'what the hell happened to the tv?', I'm expecting her to be struck by the newfound beauty and quality of the screen... 'it's like I'm looking through a frosted milkbottle, it's horrible'. Wow... they're completely ruined by watching crap (bright, vivid, blown, crushed) all the time (same people that don't dim their phones at night). Not only don't they know any better, they do not even want any better. I'm baffled and ashamed to be quite honest. :confounded:

4 hours ago, Cinegain said:

 Wow... they're completely ruined by watching crap (bright, vivid, blown, crushed) all the time (same people that don't dim their phones at night). Not only don't they know any better, they do not even want any better. I'm baffled and ashamed to be quite honest. :confounded:

I think most people on this site would have similar experiences with their families' viewing preferences. I downloaded an episode of Boardwalk Empire from YouTube and it ended up being 480p. So I ran it from my laptop into the plasma TV. I asked my wife what she thought of the quality and she said it was great, just as good as the DVDs we'd watched the other episodes on. I've got to admit that I was a bit surprised just how good 480p can look.

 

5 hours ago, Cinegain said:

So... I'm in The Netherlands visiting my parents... they've got a new Samsung UHD HDR tv. The screen was absolutely awful to look at. [...] I'm baffled and ashamed to be quite honest. :confounded:

I was really proud of my family for appreciating when I fixed the motion settings on their new tv. They're no experts, but they knew that was a bunch of bullshit <3 


An HDR TV automatically sets panel brightness and contrast to 100% and switches very vivid colors on when watching HDR material. Watching normal SDR content with those settings gives a picture that is too bright and too colorful.

I watch my GH4 videos with panel brightness and contrast at 100% and the vivid gamut switched on on my Panasonic 4K TV. When shot and adjusted properly before it reaches the TV, the video image is stunning.


Hi,

I'm saving up for a G85, and with that I'm looking for a 4K screen.

The screen is going to be used as a monitor for my PC. I'd like it to be 40" or larger.

My priorities are: 1) viewing angle, 2) colour accuracy, 3) no flickering when set to low brightness, 4) latency, 5) HDR readiness.

My budget hovers about € 700.

Any recommendations?

- Thanks!

On 12/3/2016 at 9:55 PM, Stanley said:

I think most people on this site would have similar experiences with their families' viewing preferences. I downloaded an episode of Boardwalk Empire from YouTube and it ended up being 480p. So I ran it from my laptop into the plasma TV. I asked my wife what she thought of the quality and she said it was great, just as good as the DVDs we'd watched the other episodes on. I've got to admit that I was a bit surprised just how good 480p can look.

 

Your wife is right.  DVDs are 480i so 480p should look "just as good".  Why is that surprising?

On 12/3/2016 at 11:14 PM, Liam said:

I was really proud of my family for appreciating when I fixed the motion settings on their new tv. They're no experts, but they knew that was a bunch of bullshit <3 

I don't typically "fix" people's TVs, but the few whose sets I have fixed have appreciated the difference. I've fixed TVs and computer monitors. At least with the monitors I can use a calibration tool. It's amazing how out of whack some setups are. I honestly recommended someone buy a new monitor. Then I "fixed" it one day and sheepishly said, never mind. It literally looked that bad before calibration.

44 minutes ago, Damphousse said:

Your wife is right.  DVDs are 480i so 480p should look "just as good".  Why is that surprising?

 

Presumption, and technical ignorance on my part. And my wife is always right!!

1 hour ago, Stanley said:

Presumption, and technical ignorance on my behalf. And my wife is always right!!

Lol.  Well you did a nice experiment and got a good result.  I actually had to read your post a couple of times before I caught the resolution thing.

With the cameras most of us use, the resolution on the box is a lie. Only cameras that produce their output from a downsampled readout give you true resolution. I'm sure most people would think 720p output from a Samsung NX1 had plenty of resolution. If you have good lighting, good sound and a good story, no one is going to complain that the video isn't true 4K. Nice colors and dynamic range are a bigger selling point for me. I like 4K for downsampling and avoiding artifacts. At this point I don't need 4K output for my own personal use.

On 13/11/2016 at 7:32 PM, tugela said:

Unlikely. 4K enabled sets are the bulk of models on sale today. Anyone buying a mid to high end set is going to have a 4K screen. In a few years those will be the only screens you can buy other than bargain basement models. HDR will NOT be a mature feature before 4K is.

Anyone who is buying a new TV and buys a 1080p screen is being very shortsighted.

I concur, except on the HDR part... : )

http://www.trustedreviews.com/opinions/hdr-tv-high-dynamic-television-explained

https://www.avforums.com/article/what-is-hdr.11039

:-)

On 13/12/2016 at 10:21 PM, Cas1 said:

Hi, I'm saving up for a G85, and with that I'm looking for a 4K screen. [...] Any recommendations?

No idea on all of those. I find the Chinese offerings pretty appealing anyway:

https://www.avforums.com/review/hisense-H65m5500-M5500-uhd-4k-hdr-tv-review.12991


OK, at the moment I've arrived at the Sony KD-49XD8099. It looks like it ticks all the boxes, except it's a bit over budget at ~€900. As a bonus: a few more inches, and H.265 decoding!

Apparently the 43" model has a VA panel, even though on Sony's website both models are listed as having a "Triluminos™ Display". http://www.displayspecifications.com/en/model-display/a20c636 shows which panel each display has. The Samsung KS8000 has a VA panel = instant disqualification.

I'll look into the Hisense. http://www.displayspecifications.com/en/model/6f565f4

Sorry, no bueno, because it's a VA panel.

 


Well, good luck finding a good IPS match with similar features... ; )

Apples to oranges: a 49" is not a 65" display; it's a real pity indeed :-D


A year has passed, and I came by a small-town consumer electronics shop, stuffed with the usual overpriced TV sets. The three biggest screens in the window front carry the HDR label, their prices reduced from €3000 to €1800.

HLG is another aspect not widely discussed a year ago. It can be/should be 10-bit, but obviously not necessarily (see the Sony a7R III announcement). Allegedly, broadcasters will use this as their HDR standard because of its manageable bandwidth.

HDR10 (meaning 10-bit, a billion colors) is limited to 1000 nits. Netflix adopted it, as did the iTunes Store.

Dolby Vision demands 12-bit (68 billion colors) and has had experimental support from many companies. It supports playback on screens from 600 to 10,000 nits, but right now it is unlikely to win the race, except for cinema distribution.
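The color counts quoted for HDR10 and Dolby Vision are simply the per-channel code values cubed; a quick check:

```python
def color_count(bits_per_channel):
    """Number of representable RGB triplets at a given bit depth per channel."""
    return (2 ** bits_per_channel) ** 3

print(f"10-bit: {color_count(10):,} colors")  # 1,073,741,824 (~1 billion, HDR10)
print(f"12-bit: {color_count(12):,} colors")  # 68,719,476,736 (~68 billion, Dolby Vision)
```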

YouTube currently supports PQ (= HDR10) and HLG in rec_2020 (P3, a slightly wider color space than rec_709 and currently the standard for DCPs, is not supported). Getting this right seems to be quite a hassle.
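For reference, the PQ curve mentioned above (SMPTE ST 2084, the transfer function behind HDR10) maps absolute luminance up to 10,000 nits onto a 0–1 signal. A sketch of the encoding direction using the published constants:

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute luminance in nits -> 0..1 signal.
M1 = 2610 / 16384          # 0.1593...
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Encode an absolute luminance value (0..10000 nits) as a PQ signal (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# 10,000 nits hits the top of the signal range; SDR reference white
# (100 nits) lands at roughly half signal (~0.508) - most of the code
# values sit in the shadows and midtones, where the eye is most sensitive.
print(pq_encode(10000))  # 1.0
print(round(pq_encode(100), 3))
```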

Apple announced HDR for their upcoming FCP 10.4 update. Since FCP already supports rec_2020, this probably means they will offer standardized sharing options. And although only a minor percentage of video enthusiasts/prosumers use FCP, this could be another nail in the SDR coffin.

Today, I would add another possible answer to the poll: 

⎕ I'm already worried ...

24 minutes ago, jonpais said:

I think the learning curve is the biggest hurdle - getting an Inferno and HDR TV to use as monitors is the easy part!

I'd postpone any investment in editing/monitoring hardware until it gets a) cheaper and b) good enough. Even the 27" Dell HDR monitor (~$2000) covers only ~70% of rec_2020 and is deemed merely "HDR ready" by most reviewers. Imagine you are planning an ambitious short film to be shot next spring: do you need to buy an URSA Mini (I wouldn't: too big and cumbersome)? Or at least a GH5? I am really worried.

How BAD will rec_709 videos look in comparison? 

