
HDR on YouTube - next big thing? Requirements?


Axel

Consider HDR already?   

57 members have voted

  1. Consider HDR already?

    • Not interested at all.
      7
    • Don't need it now, will evaluate it when it's everywhere.
      27
    • I wasn't aware of the latest developments, but I'm looking into it now.
      16
    • I am already updating my workflow and hardware, HDR is the next big thing.
      7


Recommended Posts

59 minutes ago, kidzrevil said:

Well I can say this much: I am not switching over to HDR anytime soon. It's too new a display technology, and there are as many arguments for it as there are against it.

Let's not forget that each display manufacturer adds their own mojo to their units' color science to stand out from other companies, thus creating yet another variable in how your HLG content is displayed. Even though some of these TVs claim HDR and wide gamut color, most barely hit 60% of the DCI-P3 color space. These are some of the many reasons why I am personally sticking to SDR until the tech matures. HDR is reminding me of the 3D TV hype - that's my personal take.

Shooting with an HDR camera is one thing... how it will be displayed is another. "Shoot for the lowest common denominator" is my personal mantra, and it has not failed me yet. The lowest common denominator is still 1080p Rec.709 8-bit at 2.2 gamma. I know that if my images look good within that sRGB color space, and are exposed so the important details fall within the Rec.709 6-7 stop dynamic range, they're going to look good everywhere, period. If your display has upscaling (which most do for full HD content natively), it's going to fit in with the rest of the content in your home theater, i.e. Blu-ray movies, tablets and mobile devices.
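To put rough numbers on that target (a quick sketch of my own, not from the post above - it uses a plain 2.2 power law, whereas the real Rec.709 OETF adds a linear toe segment this ignores):

def encode_8bit_gamma22(linear):
    """Encode a linear-light value in [0.0, 1.0] to an 8-bit code value."""
    linear = min(max(linear, 0.0), 1.0)         # clip to the SDR range
    return round((linear ** (1 / 2.2)) * 255)   # gamma-encode, quantize to 8 bits

# Middle grey (18% reflectance) lands around code 117 of 255, leaving only
# ~138 codes for everything brighter - the whole scene has to fit inside
# the 6-7 stop window described above.
print(encode_8bit_gamma22(0.18))   # -> 117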

So several years ago at NAB, Dolby had a booth where they showed off HDR. This was pretty new at the time. They had two monitors: one showing a 4K image, the other showing an HD image with HDR. It was a kind of blind test. Most people would choose the HD image with HDR over the 4K image as the better, higher-res image. There are probably a zillion variables that could influence perception beyond the few they highlighted, but the takeaway is that HDR was a more noticeable image improvement than 4K but, unlike 4K, only required a 25% increase in data rate.

However, tech companies and TV manufacturers still went with 4K. Why? I think because resolution is an easier sell to a consumer than dynamic range. It's easier to understand. With that said, consumer resolution will top out at 4K for the long, long foreseeable future. Broadcast and streaming are still not really ready even for 4K. Even though 4K TVs are inexpensive, I still know only one or two people with a 4K TV. But I do think 4K TVs will start building steam. And I think HDR will also become a thing if manufacturers can make the image improvement noticeable. Current TVs just look horrible. The bar is low, so it should be achievable.

TV manufacturers need a next big thing. 8K is just too much data and too stupid at this point. 3D never looked great and was more of a novelty. I still can't understand why anyone would want to watch a 3D movie.

4K HDR is still a bit off, but will be the thing. 
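One plausible reading of that 25% figure, by the way (my assumption - the anecdote doesn't spell it out): HDR mainly needs 10 bits per sample instead of 8, while 4K quadruples the pixel count. A back-of-the-envelope check in Python:

HD_PIXELS = 1920 * 1080
UHD_PIXELS = 3840 * 2160

hd_8bit = HD_PIXELS * 8      # baseline: 8-bit HD
hd_10bit = HD_PIXELS * 10    # "HD + HDR": 10-bit HD
uhd_8bit = UHD_PIXELS * 8    # 8-bit 4K

print(f"HD 10-bit vs HD 8-bit: +{hd_10bit / hd_8bit - 1:.0%}")   # +25%
print(f"4K 8-bit vs HD 8-bit:  +{uhd_8bit / hd_8bit - 1:.0%}")   # +300%

Real codec bitrates don't scale this linearly, of course, but it shows why HDR is the far cheaper upgrade.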



@Jonesy Jones yeah, I'm not opposed to the format at all, I just think it's too young to worry about at the moment. I do debate ditching all my 8-bit bodies and settling down with a GH5 to meet the upcoming standard, but I am currently focusing on mastering the current standard. I'm waiting for HDR to eventually become more widespread and saturate the market.


Right now there is absolutely no need for HDR. I am cooling myself down, concentrating on my 8-bit stuff and workflow, and waiting for the next big wave to happen. We've got at least the whole of 2018 to think things through. It is not advisable to be at the tip of the spear (or the edge of the razor, the tip of the iceberg, etc.) unless you have plenty of money to waste, or you are at least 80% positive about your "investment". In any case, the HDR dust hasn't settled yet; let them (or the market) decide on most standards and specs, and we'll go from there. But I do agree that HDR is a big deal - anyone can see that it is much better than SDR, they just can't truly see it yet!


35 minutes ago, Kisaha said:

Right now there is absolutely no need for HDR. I am cooling myself down, concentrating on my 8-bit stuff and workflow, and waiting for the next big wave to happen. We've got at least the whole of 2018 to think things through. It is not advisable to be at the tip of the spear (or the edge of the razor, the tip of the iceberg, etc.) unless you have plenty of money to waste, or you are at least 80% positive about your "investment". In any case, the HDR dust hasn't settled yet; let them (or the market) decide on most standards and specs, and we'll go from there. But I do agree that HDR is a big deal - anyone can see that it is much better than SDR, they just can't truly see it yet!

Well said. 

I'll follow this wise advice then, thanks for joining the discussion.


The Top Ten OLED HDR Reference Monitors for Under $800

Just kidding! Before I delve further into the current monitor situation for HDR, let me first rattle off some of the specs of the 27" Dell UP2718Q, talk a bit about why it's so freakin' awesome, then why you might not want to pull the trigger just yet, and lastly suggest some options that won't require you to sell your kidney to pay the bills.

Dell proudly boasts that the UP2718Q is the first certified Ultra HD Premium monitor. What exactly does that mean? It means that it meets criteria regarding resolution, screen brightness, bit depth, color gamut and so on, all of which were developed with television sets in mind - some of which may not be necessary or even desirable for a reference monitor that you'll be seated three feet away from in a dimly lit, smoke-filled room decorated with heavy metal posters.

Now for the specs: the UP2718Q's got an impressive 3840x2160 screen; a ridiculous 384-zone full-array local dimming backlight (making the monitor a little chunky compared to its svelte competitors, and probably accounting for a fair share of the hefty price tag); 1,000-nit peak brightness; hardware calibration; 10-bit color; numerous connections, including DisplayPort 1.4, which allows for HDR support with high-end (read: expensive) graphics cards; an IPS panel; and pretty nifty color accuracy: 100% of sRGB, Adobe RGB and Rec.709, 97.7% of DCI-P3 and 76.9% of Rec.2020. Before you go ballistic, no monitor can currently display 100% of Rec.2020, and somewhere in the neighborhood of 77% is supposedly an altogether respectable result. The monitor also happens to be rather well-built. While the unit itself is excellent, it's got downsides that have nothing at all to do with Dell (afaik): (1) it's not compatible with Apple; (2) some behaviors with Windows will be a little aggravating (if memory serves, having to do with how it handles non-HDR content and screen brightness); (3) PC graphics cards will be terribly expensive; (4) HDR movies from streaming services such as Netflix, Amazon Prime and YouTube are not available on PC; and (5) HDR PC games are in short supply. Oh, and did I mention that the monitor retails for $1,500?

To be continued (bear with me!)


The Top Ten OLED HDR Reference Monitors for Under $800 (continued)

I'm writing this after having sat through hours and hours of podcasts and webinars, some of which lasted as long as an hour and a half, which went on and on about the benefits of HDR, introduced dozens of new terms and acronyms I'd never heard of before, and didn't get to the 'what about the monitor?' part until the last ten minutes, when you'd learn that the industry standard, the one used by practically all the studios, is the Sony BVM-X300, which costs $45,000. What the heck!?

At this point, you might be wondering: why the push for HDR? The short answer would be money - after all, these manufacturers have to sell their television sets! Seriously though, cinema cameras and even consumer cameras have long been able to shoot high dynamic range images (10-16 bit, 10+ stops of DR), but it's only recently that display technology has followed suit. And now reference monitors are playing catch-up with premium TV sets. It couldn't have been the other way around, because the displays used in monitors all roll off the same assembly lines as those used for consumer televisions, mobile phones, tablets and watches.

To make a long story short, I listened to Alister Chapman giving a webinar - and this is someone who's got a fair amount of experience both behind the camera and as a colorist - and he claims to have graded a project or two with the Ninja Flame, a 7" external recorder/monitor that runs $800, and when he took the finished work to Sony's Pinewood Studios and looked at it on the Sony X300, he says the grade was nearly spot-on. Not perfect, mind you, but very good. So while it's obviously not an ideal situation (I personally dislike editing on anything smaller than a 27" monitor), anybody who's able to get hold of an Atomos or other HDR external monitor should be able to begin editing HDR right away.

While he hadn't done so himself as of the time of the webinar, Chapman cautioned against using televisions as reference monitors because they each have different curves (not sure if I remember correctly!) and auto-enhancement features to make their picture stand out; and while I've heard several colorists talk about picking up an OLED TV, I've yet to read about anyone's experiences grading on one - if I'm not mistaken, I think they're mostly used to demo work to clients...

So there you have it: at least one monitor (the Dell) that I'd consider buying myself if I had a powerful PC; a number of budget options (an Atomos or SmallHD monitor); or you can wait it out till next year or beyond, when several other affordable monitors hit retailers' shelves. I've already put a deposit down on a Ninja Inferno, which is due to arrive in three weeks.

@Vesku Okay, fire away! :) 


Monitoring is one nightmare, recording the other. Van Hurkman does say that HDR is no excuse for buying a new camera, as long as yours can record log - but that is probably only true if you have an Alexa, an Epic or at least an Ursa sitting on your shelf.

It seems as if stretching 8-bit log (at least log without an HLG mode) to Rec.2020 values will result in banding and posterisation (by which I mean the colors in the upper midtones become noticeably thin).
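A rough sketch of why (my own illustration; it uses a generic power-law expansion as a stand-in for a real log-to-PQ or log-to-HLG transform, but the gaps are the point):

# The flat log curve packs the whole scene into 256 codes; expanding it
# back out for display spreads those codes apart, leaving unreachable
# levels in between - visible as banding/posterisation.
expanded = [round(((c / 255) ** 2.4) * 1023) for c in range(256)]

# Of the 1024 possible 10-bit output levels, only this many are
# reachable from an 8-bit source:
print(f"distinct output levels: {len(set(expanded))} of 1024")

# Worst-case step between neighbouring reachable levels in the upper
# range, where the "noticeably thin" upper midtones show up:
upper = expanded[192:]
print(f"largest step near the top: {max(b - a for a, b in zip(upper, upper[1:]))} codes")

A 10-bit log source starts with four times as many codes, which is why 10-bit survives the same stretch so much better.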

Pocket ProRes (10-bit 4:2:2 with up to 13 stops of DR) looked better than raw (too many artifacts). As of yesterday, there is another 4K 10-bit option with good DR available: the Micro Studio Camera 4K got a firmware update and now allegedly holds 13 stops as well. My buddy has three of those for live events, and he has the Shogun. Have to give it a try.

The upcoming Sony A7R III features HLG in 8-bit. How well this is going to work remains to be seen. As a matter of fact, those Sony hybrids have very good DR, the current models included. Wouldn't it be technically possible to add an HLG profile via firmware update?
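For what it's worth, the HLG curve itself (from ARIB STD-B67 / ITU-R BT.2100) is simple enough that a firmware implementation seems plausible; here is the OETF sketched in Python:

import math

# HLG OETF per ARIB STD-B67 / ITU-R BT.2100: maps normalized scene-linear
# light E in [0, 1] to a non-linear signal E' in [0, 1].
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # ~0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # square-root segment (shadows)
    return A * math.log(12 * e - B) + C      # log segment (highlights)

# The two segments join at E = 1/12, where E' = 0.5; the lower half of the
# signal behaves much like a conventional gamma, which is what makes HLG
# backwards-compatible with SDR displays.
print(hlg_oetf(1 / 12))   # -> 0.5

Whether the bottleneck is the curve or the 8-bit container is another matter - see the banding sketch above.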

22 minutes ago, jonpais said:

But please promise not to tell any of the snobs at liftgammagain!

;-)


Every single person I know of has recommended 12-bit or, if absolutely necessary, 10-bit. If you've got the GH5, you can begin shooting for HDR today. I've never shot the Ursa Mini, but if I'm not mistaken, Daniel Peters has shown that the dynamic range of the GH5 compares favorably to the BM camera's.

Edit: fwiw, Matthew Scott, a longtime advocate of RAW and a cameraman and colorist whom I greatly respect, recently sold his BMMCC (I know, not the same!) and bought the GH5, saying the image quality is better.


21 hours ago, Jonesy Jones said:

3D never looked great and was more of a novelty. I still can't understand why anyone would want to watch a 3D movie.

I agree with your post, but thought I would add my opinion on the 3D failure:

180-degree stereoscopic 3D with a VR headset looks good (if done correctly) and adds immersion. Cinema/TV-style 3D glasses make the image look bad (to me at least), and 3D doesn't really add any immersion when it's confined to a rectangular box. Shooting a whole movie in 180-degree stereoscopic 3D would require a completely different take on filmmaking, but in my opinion that's when it will add value. (360-degree video is a dead end when it comes to films with a narrative.)


23 hours ago, kidzrevil said:

@Jonesy Jones yeah, I'm not opposed to the format at all, I just think it's too young to worry about at the moment. I do debate ditching all my 8-bit bodies and settling down with a GH5 to meet the upcoming standard, but I am currently focusing on mastering the current standard. I'm waiting for HDR to eventually become more widespread and saturate the market.

The work you've posted here is awesome, Kidz. You continue to work to get the most out of your gear, and this is exactly the right approach. I have already learned a lot from you, and may even reach out one day with questions. Keep up the great work.


4 hours ago, jonpais said:

Chapman cautioned against using televisions as reference monitors because they each have different curves (not sure if I remember correctly!) and auto-enhancement features to make their picture stand out; and while I've heard several colorists talk about picking up an OLED TV, I've yet to read about anyone's experiences grading on one - if I'm not mistaken, I think they're mostly used to demo work to clients...

A good television shows reference-quality colors. Enhancements can be turned off.

A VA-panel TV is not good for close monitoring because the viewing angle makes the edges of the image fade. An IPS-panel monitor/TV cannot show HDR-like contrast (blacks turn gray). OLED should be the best choice.


4 hours ago, Axel said:

Pocket ProRes (10-bit 4:2:2 with up to 13 stops of DR) looked better than raw (too many artifacts). As of yesterday, there is another 4K 10-bit option with good DR available: the Micro Studio Camera 4K got a firmware update and now allegedly holds 13 stops as well. My buddy has three of those for live events, and he has the Shogun. Have to give it a try.


Hmm, where are you guys situated in Germany? I might just have to witness some of that Micro Studio glory. :) I was filming with an FS700 and a Flame once, and even that was a tiny, fun package for 4K.

When I watched Blade Runner 2049 in the cinema, it looked like a perfectly fine projection, but no different from a projection of a good Blu-ray (Rec.709) as far as my observation and experience of dynamic range go. Just a fine and awesome SDR projection. Not sure if DCP has a place for HDR yet - I'm no technical expert.


@Vesku No disagreement there - OLED's got incredible black levels, it's insane. Weak blacks just destroy the viewing experience. I once bought something like two different pressings of the 2009 masterpiece City of Life and Death (from different countries) in search of a decent copy, and finally ended up getting the Blu-ray because it was the only one that resembled what I experienced at the movie theater - rich, deep blacks. Unfortunately (or fortunately, depending on your wallet!), HDR OLED reference monitors for consumers don't yet exist. The LG OLED TVs are reference quality, I just don't think I could really edit on a 55" screen! Also, OLED's supposedly got burn-in issues; I wonder how it would hold up after hours of displaying the borders of Final Cut as I edit? Not sure why Alister Chapman was worried about auto-enhancement - he's owned several OLEDs himself and must be aware you can turn those off.


@PannySVHS I don't want to start cluttering up this thread with YT videos, but just to give you a rough idea of how much better HDR is, particularly with films that have man-made colors (i.e. not from nature - neon colors, fluorescent greens, automobile finishes and so on), and especially science fiction and action films with explosions (though it looks equally stunning for dramas to me), have a look at this demonstration from 16:30 to 18:30. The bolt of lightning in the Rec.709 space looks drab, while the HDR version really flashes! It's those values above white (the highlights) that really pop.

HDR is also capable of great subtlety. 
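To put numbers on those "values above white" (a sketch using the SMPTE ST 2084 PQ curve, not anything from the linked video; HLG differs, but the headroom story is similar):

# SMPTE ST 2084 (PQ) EOTF: non-linear signal in [0, 1] -> luminance in
# nits (cd/m^2), up to 10,000. Constants straight from the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

# Half the signal range only gets you to about 92 nits - roughly SDR
# reference white. The entire upper half of the signal is headroom for
# highlights "above white".
for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f} -> {pq_eotf(s):7.1f} nits")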


3 hours ago, jonpais said:

@Vesku Also, OLED’s supposedly got burn-in issues, I wonder how it would hold up after hours of displaying the borders of Final Cut as I edit?

I've had the ThinkPad X1 Yoga with an OLED screen for a bit over a year without any burn-in issues. The odd thing is that I don't know of any other laptops with an OLED screen, and I wonder why.

