
HDR on Youtube - next big thing? Requirements?


Axel

Consider HDR already?   

57 members have voted

  1. Consider HDR already?

    • Not interested at all.
      7
    • Don't need it now, will evaluate it when it's everywhere.
      27
    • I wasn't aware of the latest developments, but I'm looking into it now.
      16
    • I am already updating my workflow and hardware, HDR is the next big thing.
      7


Recommended Posts

43 minutes ago, kidzrevil said:

 I mean, I'm still taking my 0-255 and compressing it down to 16-235, because anything outside of that range clips on most devices, so I think you can understand why I'm not so eager to entirely abandon my current SDR workflow. 7 stops of DR is plenty if you know what you are doing. 
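
The full-range-to-video-range squeeze described here is just a linear remap of code values. A minimal sketch, assuming 8-bit material and numpy (the function name is only for illustration):

```python
import numpy as np

def full_to_legal_range(x):
    """Remap 8-bit full range (0-255) into video/legal range (16-235).
    x: uint8 array of luma code values. Illustrative only."""
    y = 16.0 + (x.astype(np.float32) / 255.0) * (235.0 - 16.0)
    return np.clip(np.round(y), 0, 255).astype(np.uint8)

# Example: pure black and pure white land on the legal-range limits.
print(full_to_legal_range(np.array([0, 128, 255], dtype=np.uint8)))  # [ 16 126 235]
```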

One day, I'll be looking back nostalgically at delivering SDR 4K to YouTube, when all you had to do was hit 'upload', and your content could be effortlessly watched in as many as seven different resolutions. =) 



SDR vs HDR. Is it necessary to always grade SDR videos down to 6-7 stops of dynamic range? If our cameras can capture 10-12 stops, why must we grade SDR video to clip the highlights and crush the shadows? Not many people are using such bad displays anymore. Most TVs and monitors can show quite bright whites and good contrast with 8-bit SDR material. In most cases, clipping looks worse than keeping the highlights.

I have shot for many years using "HDR"-like profiles and viewing. I shoot with a flat profile (GH3/4/5 Natural, contrast -5, iDynamic low). When I watch my videos, I set my TV's brightness and contrast both to 100 and the colors to the TV's largest native gamut, and I use a darker gamma in the TV or computer player to compensate for the flat profile. The result is an HDR-like viewing experience with 8-bit SDR material. My videos look a little flat on weak, low-contrast screens but just fine on good ones.

We watch very high quality photos with good dynamic range and colors on our 8-bit screens. Why must 8-bit videos look crushed and clipped on the same screens? Why must there be blown-out highlights on faces and plain white clouds? Why do national and commercial TV programs have such low dynamic range? Are professional cameras bad, or is there some rec709 rule that images must be adjusted to look like crap?

I think it is possible to have HDR video with 8-bit. 10-bit is of course better, but 8-bit is not always so bad. Modern TVs have sophisticated filters to show 8-bit video on a 10-bit panel so that we don't see the banding we would on a "dumb" computer screen.
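
The banding-hiding mentioned here is essentially dithering: adding a tiny amount of noise before quantisation so that smooth gradients turn into fine grain instead of visible steps. A toy sketch in Python/numpy, not any TV vendor's actual processing:

```python
import numpy as np

def quantize_with_dither(img, bits=8):
    """Quantise a float image in 0..1 to the given bit depth, adding
    triangular-PDF noise of about one code value so that banding
    becomes fine noise instead of visible steps. Toy example only."""
    levels = 2 ** bits - 1
    tpdf = np.random.rand(*img.shape) - np.random.rand(*img.shape)  # -1..1, triangular
    return np.clip(np.round(img * levels + tpdf), 0, levels) / levels

# A smooth gradient quantised naively to 3 bits shows hard bands;
# the same gradient with dither averages out to the right values.
gradient = np.linspace(0.0, 1.0, 1920)
banded = np.round(gradient * 7) / 7
dithered = quantize_with_dither(gradient, bits=3)
```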


@Vesku Are you asking why we can't give more exposure in rec 709 without clipping the highlights? When you've reached 100 nits on your display, that's as bright as the whites (or highlights) are ever going to get. With HDR, fire, the sun, a bright sky and specular highlights can be reproduced at up to 1,000 nits on some of today's displays.

When you say most of today's TVs and monitors can show good highlights and shadows in rec 709, are you saying they're capable of much more than five or six stops of dynamic range? Because that contradicts everything professionals in the industry are saying: rec 709 was only designed to accommodate around five stops of dynamic range. Look it up.

I'm not sure what you mean when you ask why 8-bit videos must look crushed and clipped on 8-bit screens - as long as they're properly exposed and graded, they should look fine. I'm also having trouble understanding what you're getting at when you ask why commercial television has such low dynamic range. It sounds like you're contradicting yourself here: on the one hand, you're telling us that when you shoot rec 709, you get images that resemble HDR, but when you look at professionally shot video on TV, it looks terrible?

Lastly, I fail to understand why you keep insisting on 8-bit for HDR when every single industry professional is recommending 10-bit. It's all the more baffling, as you've got a camera that can shoot 10-bit internally. I think you're missing the point about 10-bit as well: it has never been the most desirable bit depth for grading in the first place (most colorists say they prefer working with higher bit depth footage), and according to everything I've read, heard and watched, it can be even more problematic when grading for HDR. I believe it has something to do with banding.
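
To relate the nit figures above to stops: a stop is a doubling of luminance, so the usable range is roughly the log2 of the ratio between the brightest and darkest levels that actually hold detail. A back-of-envelope sketch; the luminance numbers below are illustrative assumptions, not measurements:

```python
import math

def stops(peak_nits, black_nits):
    """Rough dynamic range in stops: log2 of the luminance ratio."""
    return math.log2(peak_nits / black_nits)

# Illustrative numbers only:
print(round(stops(100, 0.5), 1))    # ~7.6  - a 100-nit SDR grade with slightly lifted blacks
print(round(stops(1000, 0.05), 1))  # ~14.3 - an HDR display with 1,000-nit highlights
```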

Since you already own a computer and I'm assuming an HDR TV, I would think you'd be able to instantly see the difference between SDR and HDR. It's like night and day. Only today, I compared the experience of watching some videos on both my LG C7 and on my 5K iMac, and guess which one had deeper blacks and brighter highlights? I also watch my own YT videos from time to time on TV and they look incredible, but they're still in the rec 709 color space: no amount of playing with the highlights and shadows is ever going to make them magically look like HDR footage. I should add that it is absolutely possible to grade your rec. 709 footage for HDR delivery, and it can indeed look very good. But not as good as HDR.

Below is a screen shot from the intensely beautiful The Lost Castle by Christian Maté Grab. Viewed on an SDR monitor, it looks fine, but if it were graded in HDR, the bright sky would glow and the image would have greater overall depth. It would benefit from more subtle color gradation (particularly in the sky - 10-bit rec 2020 can reproduce bright intense colors that rec 709 cannot), as well as an impression of greater sharpness as a result of the higher local contrast of HDR. Fortunately, it was shot with James Miller's LOG profile (created in partnership with Atomos) and can always be graded again in HDR should the filmmaker be so inclined.

Screen Shot 2017-11-19 at 7.40.16 PM.png


@jonpais It's good that we are trying to understand each other and this HDR phenomenon. I don't have an HDR TV or monitor. I have a 65-inch Panasonic AX800 4K TV, which covers 98% of the DCI-P3 color space at about 500 nits of brightness. It is not HDR ready, so it has no HDR profile or YouTube HDR feature. It can play 10-bit HEVC files from an SD card, but it can't interpret HDR mode. My computer monitor is a vivid and bright 27" 2.5K IPS, so it has gray blacks but otherwise a very good image.

I know the difference between true HDR and SDR; I have seen many demos in stores. I am still wondering why many SDR videos today must have a 30-year-old look with very limited dynamic range even when our displays are so much better. I don't think many people have calibrated their bright TVs down to the standard 100 nits - that is too dim in normal room lighting. Of course the industry has standards and they can't grade videos wildly for the best TVs, but still many videos have an unnecessarily contrasty and clipped look. Our vision is very flexible. In my mind it would be better to keep some highlight detail in sunsets and bright objects; our brains can understand the scene without blinding "true" HDR brightness.

HDR is coming and I am going to use it when I get an HDR TV. Meanwhile, I am getting very good results with my 8-bit SDR system, using the GH5's roughly 12 stops of dynamic range in 8-bit, the 0-255 video range and my TV's bright and vivid colors. 4K 60p also helps to make my videos more lifelike. 4K 60p 10-bit HDR would be nice, but my camera is not capable of it (internally).

Sharing these "extreme" videos is difficult. It is fun for our own pleasure, but for a big audience it is still just arriving. There is not even HDR for movie theaters: no video projector can show HDR brightness or HDR colors, and the big screen is dim and faint. What is the point of DCI-P3 or rec2020 if the maximum brightness in a movie theater is 50-80 nits?


Broadcasters and post production houses have been shackled by standards that are nearly thirty years old. Better displays have been the driving force behind the new standards. Even if your Panasonic can display more brightness, richer color and higher dynamic range than the run-of-the-mill set, Technicolor couldn't arbitrarily start grading and distributing content solely for premium televisions in the current SDR space.

The idea has occurred to me, though, that since colorists grade shows in dimly lit rooms under D65 bulbs with neutral walls, and content is meant to be viewed that way, TV sets should use their light sensors to automatically switch off when operated under unfavorable conditions. =) 

I think they're looking at doubling the brightness in movie theaters.


For me, 4K 60p vs 30p is a bigger improvement than HDR vs SDR, because I already have such good, nuanced, bright and vivid colors with my 8-bit system that it is hard to imagine it being even better. Of course true HDR has a better image, but smooth, sharp and well-defined motion is more important to me. I wonder when the industry is going to HFR HDR? Who will be the first to shoot a movie in 4K 60p HDR?


3 hours ago, jonpais said:

There are some stunning looking 4K 60p HDR videos on The HDR Channel on YouTube, but they look pretty horrible on an SDR display. 

I just watched some 4K 60p 10-bit HDR demos on my non-HDR TV via SD card and they looked stunning. Even though my TV is not HDR ready, it has such a large color gamut and so much brightness that the HDR videos looked almost as they should. I used maximum brightness and contrast with the darkest gamma possible, so the image was quite OK, just a little too bright. 10-bit is very good: no banding at all with the settings maxed out. I think my TV shows HDR demos better than mid-range LCD "HDR ready" TVs, because my old top model has a larger color gamut and better contrast.

 I wonder when photography is going to adopt a 10-bit standard. HDR photos would look nice. It is funny that we must render a 10-bit video of our photos to watch them in HDR.


@Vesku I think you may have misunderstood how SDR displays work, as well as the camera itself. What you are seeing on your 1080p-capable television is a rec709 2.2 gamma image. Rec709 will not exceed 6-7 stops - that's why HDR was created. Since your camera is fully capable of going past 7 stops of DR, it compresses the stops of light above the 7th stop (roughly at 90 IRE), and you can start to see the image losing texture there. HDR maintains the texture in those upper ranges. What I meant by shooting for the limited DR of rec709 is simple: you just decide where you want to maintain texture - in the highlights (clipped shadows but highlight texture), the midtones (a balanced image) or the shadows (blown-out highlights and sometimes midtones). That's it. All HDR is doing is giving you a greater margin of error compared to SDR... it doesn't matter what the DR of your camera is; rec709 can't display as much as rec2020 and HDR.
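
The highlight compression described above can be pictured as a knee in the transfer curve: the signal passes through linearly up to a threshold and is squeezed above it. A simplified sketch - the knee point and slope are arbitrary values for illustration, not the GH5's actual curve:

```python
import numpy as np

def highlight_knee(x, knee=0.8, slope=0.2):
    """Toy tone curve: linear up to `knee`, then values above it are
    compressed by `slope` so they roll off instead of clipping hard.
    x is the signal normalised to 0..1."""
    out = np.where(x <= knee, x, knee + (x - knee) * slope)
    return np.clip(out, 0.0, 1.0)

# Values above the knee end up only slightly brighter than the knee itself:
print(highlight_knee(np.array([0.5, 0.9, 1.0])))  # [0.5  0.82 0.84]
```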


17 minutes ago, kidzrevil said:

@Vesku I think you may have misunderstood how SDR displays work...

Let's forget SDR, HDR, rec709 and rec2020 for a while. I shoot an 8-bit file with my GH5. I record about 12 stops of dynamic range. I watch it and I can see those 12 stops of dynamic range on my TV. There is texture in the highlights and the shadows. The image looks stunning and vivid in the highs, mids and shadows. How is that possible?


With what picture profile? They all employ serious highlight compression. Cinelike D for sure doesn't have anything close to 12 stops of DR. V-Log is at 12, and that has to be converted from log to linear to watch it. And if it's 8-bit, the file is definitely being saved in a rec709 format, so you can't choose not to talk about rec709 when your DR is being limited to a set value because of it. @Vesku


2 minutes ago, kidzrevil said:

With what picture profile because they all employ serious highlight compression...

GH5 RAW photos have about 13 stops. Video in Natural, contrast -5, with iDynamic standard has about a stop less than RAW, so at least 11 stops. As I said, I am not talking about rec709 now; the GH5 can record much more. When we grade by rec709 rules we must clip and crush that good GH5 file, but must we always do it?


32 minutes ago, Vesku said:

 I shoot with my GH5 a 8bit file. I record about 12 stop dynamic range. I watch it and I can see that 12 stops dynamic range in my TV. There is texture in highlights and shadows. Image looks stunning and vivid in highs, mids and shadows. How is that possible?

??????

@Vesku I thought @kidzrevil's explanation was very clear and concise. Instead of cluttering up this thread with how you can see 12 stops of dynamic range on your SDR television set, please start a tutorial in a new thread. Thanks!


42 minutes ago, jonpais said:

??????

@Vesku I thought @kidzrevil's explanation was very clear and concise...

I think this is relevant to the HDR topic; sorry if not. Here are examples of what I see on my TV. I shot a RAW photo and a video (Natural, contrast -5, iDynamic) at the same spot on a hiking trip in Norway. As you can see from the frames, the dynamic range is quite close. My videos and photos look on my TV just as these examples do - almost the same. Maybe not 12 stops, but more than six.

 

JPG from RAW

GH5-RAW.jpg

 

JPG from video

GH5-video.jpg


@Vesku Trust me, man, as beautiful as that image is, it is not a 12-stop image. A true HDR image on an HDR display looks far different from this. The sensor may be 12+, but the tone curve used to store and display the image in 8-bit sRGB isn't; it will always compress the DR to fit the rec709 spec. Using different picture profiles merely redistributes the DR - raising shadows, lowering highlights, compressing midtones, etc. - to display what the sensor sees in accordance with the rec709 or rec2020 specifications.

HDR10 and Dolby Vision allow the viewer to see the 13 stops of, let's say, a GH5 with detail in what was once the textureless highlight range of these tone curves, without the use of highlight compression.
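
Both HDR10 and Dolby Vision build on the SMPTE ST 2084 "PQ" transfer function, which maps absolute luminance (up to 10,000 nits) to the signal instead of using a relative gamma. A minimal sketch of the PQ encoding step, for illustration only:

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0..1 code value.
    Constants are the published ST 2084 values."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0      # PQ is defined up to 10,000 nits
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# SDR reference white (100 nits) sits at roughly half of the PQ code range,
# leaving the upper half for highlights that SDR cannot carry.
print(round(pq_encode(100), 2))   # ~0.51
print(round(pq_encode(1000), 2))  # ~0.75
```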

I'm still learning the intricacies of how HDR works myself, but I understand and currently work within the constraints of SDR material.


It’s interesting, to me at least, how HDR seems to hold little or no appeal for EOSHD members, while over at other sites there is keen anticipation. Not a single doubting Thomas saying ‘my clients don’t ask for it’ or ‘I’ll wait till it’s in every home’. I would think that being able, at the very minimum, to finally see the full twelve stops of dynamic range your camera is capable of, or to display your work on a client monitor in your home or office, would be of interest to more than one or two filmmakers. Especially as most already own a camera that can shoot log, and purchasing an external monitor and last year’s top-rated OLED would run little more than the last camera body you purchased. For example, a Ninja Flame runs around $800, and last year’s top-rated 55” OLED is going for something like $1,500 - just a few hundred dollars more than the GH5 + V-Log. None of this has anything whatsoever to do with whether you shoot 1080p, whether your clients can afford it, or whether your local broadcasters only just implemented HD. 


3 hours ago, jonpais said:

It’s interesting, to me at least, how HDR seems to hold little or no appeal for EOSHD members, while over at other sites, there is keen anticipation.

It doesn't mean a lack of appeal or interest, or at least I don't think it does. I discussed the matter with my buddy, who does a lot of corporate video work. I showed him the Alistair Chapman clip, and he said, well, it seems I'll have to invest some 20,000 bucks. Why, I said, you have the FS7, that's 10-bit S-Log3 (and it will probably get the FS7 Mark II's rec_2020 profiles soon via firmware update), you want to sell the Shogun anyway to get the Inferno, so what's the problem? He said: if you are working professionally, you can't have your client staring at a field monitor. And don't be so naive as to think it stops with a monitor! What else? I asked. 

But I knew the answer already: you can't know in advance. Things you never thought of in the beginning are suddenly missing. It's like a law. 

For myself, an occasional wedding videographer but really a hobbyist? It could very well mean I'd have to sell the whole A6500 ecosystem I've barely used in earnest and buy the GH5 instead, with all the necessary accessories. An Inferno? I wouldn't carry that around, no way! I borrowed the Shogun once and found it too bulky. There's also little to gain from ProRes over the GH5's internal codecs. And then of course a TV set. A few thousand!

Apart from these worries, I think I'm a born HDR guy. I graded the Ursa shot above, as well as Neumann's GH5 sunset, "with HDR in mind" - paradoxically, by accentuating the shadows and carelessly letting the upper highlights clip. At first glance, this seems to contradict the common understanding of what HDR means. But *just* looking for detail in every part of the image, preserving highlights at any cost for rec_709 and so forth, doesn't result in a convincing image, imo. I play a lot with Blender, and I like the CGI most where light is the star, not millions of dispensable fabric texture details that often distract more than they add to the overall impression. 

I'm setting my hopes on FCP 10.4. I'm not sure whether the (few) "HDR tools" will enable me to preview some sort of basic HDR on my 500-nit display - it's probably wishful thinking. As I have learned by listening to FCP experts, the new color correction tool integration will be a dream: a dedicated inspector tab for color where they all live, with a set of very comprehensive tools that allegedly surpass the third-party plugins available so far (minus, alas, a tracker)...

 


Some scenes are less than 12 stops. Shooting those in HDR would be a waste, the same way shooting log in low-contrast scenarios is. I've watched content on my HDR display and some of it looks worse than the SDR content. I have no clue why, because my PlayStation games look way better in HDR than SDR, so it's a very hit-or-miss technology right now. There aren't even reference patterns to calibrate a TV for HDR yet, and HDR10 looks different from Dolby Vision HDR. Some displays are even using dynamic-contrast-style trickery to convert SDR content to HDR. This is why I'm sticking to my decision that it's too early for HDR: you don't know what people's displays are doing to your content. SDR is more predictable.

