
Say Your Goodbyes to SDR!


jonpais


3 minutes ago, jonpais said:

The UHD Alliance published standards for television manufacturers, but you don’t necessarily want to apply the same criteria to desktop monitors - for one thing, as Andrew pointed out, sitting a couple of feet away from a 10,000 nit display would probably fry your eyeballs. VESA also recently came up with badly needed HDR standards for desktop monitors, but they are intentionally vague and all but useless in my opinion.

The UHD Alliance publishes different standards for cell phones, tablets, and notebook computers.



Update on my UltraStudio Mini Monitor adventure. The device wasn’t recognized on installation last night because High Sierra introduced new security restrictions, so it’s necessary to go into the Security & Privacy preferences to allow the driver. Will keep everyone updated.


Here are the steps you need to take to work with BMD's UltraStudio Mini Monitor on a Mac. I've got to take my Mac back into the shop again - they screwed up, so I can't say how or if the Ninja is working yet! (When they put my Mac back together, the Thunderbolt ports weren't aligned properly or something, so I can't plug anything into them.) :)

 


There's a fair bit to shooting, grading, delivering, and displaying HDR properly. Art Adams' write-up is the most comprehensive I've seen, and a goldmine for anyone looking to dig into the topic further.

https://www.provideocoalition.com/a-guide-to-shooting-hdr-day-1-what-is-hdr/amp/

Sadly, most viewers simply don't have the display or the desire for proper HDR content, and the majority of "HDR TVs" don't have anywhere near the contrast of an OLED or a premium LCD. It also doesn't help that there's no universal standard for HDR encoding: HDR10, HDR10+, Dolby Vision, HLG... it's early, messy days, and I doubt we'll see widespread adoption for a few years yet.


As it's been said, 99.99% of people don't have an HDR display (let alone 4K...).

In my country the delivery format for broadcast TV series is rec709 1080p. Just saying. 

Is all of this really worth the hassle, at least for now? All this good money and effort could be spent on something more useful - a short film, a model, whatever is more important than being a lonely pioneer...


You guys have obviously not looked at the links I’ve provided... First off, HDR is not just ‘anything above seven stops of dynamic range’. It is also about more than just dynamic range. As for the different formats, there are only two you need to concern yourself with: HDR10 and HLG. If you own a recent HDR television, it will automatically recognize whether the material is HDR10, Dolby Vision or HLG. Whether 99% of the universe is ignorant about HDR is irrelevant: we have the information at our fingertips. As far as cost goes, if you’re shooting with a camera with LOG or RAW, you’re already capturing HDR: investing in a recorder (which you might have already), a device like the Mini Monitor and a TV will cost less than a new GH5s. Regardless of what your average Joe says, any sentient being can distinguish between SDR and HDR. And yes, just as when shooting for SDR delivery, you’ll want to take care that there isn’t a bald sky occupying 90% of the frame (unless that’s your artistic intent), or a bright lamp shining directly behind your talent’s ears. My very first HDR YouTube video was never even intended to be delivered in HDR; I uploaded it without any grading whatsoever; the color is off, for sure, but in every other respect, it kills the SDR version.


10 hours ago, kidzrevil said:

Give it a couple of years and my opinion will change

Give it a couple of years, yeah. That's reasonable. I won't try to shoot a wedding in HDR now or in the summer. I do, however, have some more ambitious shorts in planning, and their visual quality concerns me now.

10 hours ago, mkabi said:

HDR vs Resolution vs Brightness.

I believe that they are different and separate from each other. Do they affect each other?

In short: I think so.

My iMac display is *not* HDR, but it does ~500 nits. It's an LCD, which means the blacks are grey. I have it backlit with a 6500 K LED bar for perceived contrast. I can't stand to watch Netflix on my ~4-year-old Samsung TV anymore - in comparison, all its images look muddy and faded. Once it's replaced by an HDR TV, it's clear to me that I won't want to invest any further effort into producing rec_709 images.

There are two sources that discuss HDR vs Resolution vs Brightness in detail: 

1. The Yedlin "Resolution Myths Debunked" video, the bottom line of which is that with true 1080p we have passed a threshold: we won't be able to see individual pixels at reasonable viewing distances. What is more, since all images are scaled - always! - upscaling on a higher-resolution device improves the perceived resolution dramatically. HD on a UHD display looks better than UHD on an HD display. Fact.

Resolution is only good if you can't see (or rather feel) its limits. So resolution must be "invisible".

Resolution is often confused with perceived sharpness. Beyond the said threshold, contrast adds more sharpness, brilliance and clarity than more pixels do.
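To put a rough number on that threshold, here is a quick back-of-the-envelope sketch in Python. It assumes the commonly cited figure of about one arcminute of visual acuity per pixel and uses a 55" 16:9 screen as an example - both values are illustrative, not taken from Yedlin's video:

```python
import math

# Rough back-of-the-envelope check of the "resolution threshold" argument.
# Assumption (not from the video): ~1 arcminute per pixel as the limit of
# normal visual acuity, and a 55" 16:9 screen as the example.
ACUITY_RAD = math.radians(1 / 60)  # one arcminute in radians

def distance_where_pixels_vanish(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Viewing distance (metres) beyond which a single pixel subtends
    less than one arcminute, i.e. individual pixels stop being visible."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_m = (width_in / horizontal_pixels) * 0.0254
    return pixel_pitch_m / ACUITY_RAD

for label, px in [("1080p", 1920), ("UHD", 3840)]:
    print(f'55" {label}: pixels blend together beyond ~{distance_where_pixels_vanish(55, px):.1f} m')
```

By that estimate, a 55" 1080p panel has already crossed the threshold from a couple of metres away, which is the point being made above.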

2. The lectures on rec_2020, which cover 4K (UHDTV1), 8K (UHDTV2), HFR, wide color gamut and HDR. This is complicated material, but all engineers agree that extended dynamic range contributes more to perceived image quality than any other factor.

As a side note regarding resolution: it's an indisputable *fact* that 4K at standard frame rates is only HD for moving images. 4K demands bigger pictures (interchangeable with shorter viewing distances, as with a retina display), and motion blur then diminishes the spatial resolution. 50/60p for 4K, 120p for 8K. Like it or not. You can't be a pixel peeper and resolution fundamentalist and at the same time insist on cinematic 24p.
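Here is the motion-blur side of that argument in the same back-of-the-envelope style - a minimal sketch assuming a 180-degree shutter and a pan that carries the image across the frame in five seconds; both numbers are only illustrative:

```python
# A minimal sketch of the motion-blur point above, assuming a 180-degree
# shutter and a pan that carries the image across the full frame width in
# five seconds. Both numbers are illustrative, not from any spec.
def blur_in_pixels(frame_rate, h_pixels, seconds_to_cross_frame, shutter_angle=180):
    exposure_time = (shutter_angle / 360) / frame_rate     # seconds the shutter is open
    pixels_per_second = h_pixels / seconds_to_cross_frame  # image motion in pixels/s
    return pixels_per_second * exposure_time               # smear during one exposure

for fps in (24, 60):
    smear = blur_in_pixels(fps, h_pixels=3840, seconds_to_cross_frame=5)
    print(f"{fps}p UHD, 5 s pan: detail smears across ~{smear:.0f} pixels")
```

With those assumptions a detail smears across roughly 16 pixels at 24p but only about 6 at 60p, which is why higher spatial resolutions are argued to need higher frame rates.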

8 hours ago, andrgl said:

Early investing in hardware only benefits manufacturers.

We have to define the word "benefit" here. At the present point, it may not be reasonable or economically advisable to buy the hardware - but if those were generally accepted arguments, EOSHD would probably die.


11 hours ago, jonpais said:

You guys have obviously not looked at the links I’ve provided... First off, HDR is not just ‘anything above seven stops of dynamic range’. [...]

I own an HDR set that supports both. I frequently attend B&H workshops, and the average eye won't notice the difference on current displays that aren't OLED, especially if the DP and the colorist know good lighting ratios to bring contrast to an image. HDR does look very good at the professional display and high-end camera level, but on a consumer level it's way too early. Most displays can barely show the P3 color gamut at the moment, even the ones that show HDR. You can confirm this with a Spyder Elite or equivalent on your monitor. I'll bet we won't see good HDR-shot content from the indie market until a couple of years down the line. Early adopters will have fun with the tech, but like anything, the consumer market has to catch up. Aren't we still waiting for 4K to saturate the market? I remember there being a gold rush for that too during the GH4 era.


4 hours ago, kidzrevil said:

I own an HDR set that supports both. [...]

I don't get this - I can see an immense difference between HDR and SDR on my Samsung Galaxy phone for almost every HDR video on YouTube. It is not subtle. The key difference is the brightness/dynamic range, not so much the color gamut. I can show (and have shown) the HDR videos to anyone with my phone, and they are all amazed by the difference (I show them the SDR version too). The Samsung phone does at least 90% of the P3 color gamut, btw. I think some people here are in denial.

I also don't get the concept of "good HDR-shot content" - all content that purports to depict reality rather than fantasy (documentaries, travel, etc.) benefits from being able to display improved colors and greater dynamic range, especially if the scene has spectacular (not just specular) highlights. If you shoot log gammas or HLG properly, you can get good HDR. HLG is the easiest, since you shoot in Rec.2020 color and thus need no translation from some other color gamut for HDR.
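For anyone wondering what that "translation from some other color gamut" involves in practice, here is a minimal sketch of the linear-light matrix step from Rec.709 to Rec.2020. The coefficients are the widely published primaries-conversion matrix (per ITU-R BT.2087); transfer-function (gamma/log) handling is deliberately left out, so this is an illustration rather than a complete pipeline:

```python
import numpy as np

# Minimal sketch of the gamut "translation" step mentioned above: converting
# linear-light Rec.709 RGB into the Rec.2020 container with the standard
# primaries-conversion matrix (as published in ITU-R BT.2087). The transfer
# function (gamma/log) handling is deliberately left out.
BT709_TO_BT2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def rec709_to_rec2020(rgb_linear):
    """Map a linear Rec.709 RGB triplet into Rec.2020 RGB."""
    return BT709_TO_BT2020 @ np.asarray(rgb_linear, dtype=float)

# Pure Rec.709 red lands well inside the Rec.2020 gamut:
print(rec709_to_rec2020([1.0, 0.0, 0.0]))  # -> [0.6274, 0.0691, 0.0164]
```

Shooting HLG directly in Rec.2020 skips this remapping step entirely, which is the convenience described above.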


I don't know what the answer is though. I have to admit the things I have seen on my Note 8 are pretty breathtaking. But in reality it is over the top from hell. I am not too sure I want to be just bombarded with colors, detail, DR out the ass! I think it may get old, and get old damn fast.

And hell, I am old, with one good eye that I am sure is a bit gimpy from age, and to me it is well too crazy good. What the heck is it to someone who is 16 years old? I am sure it is not a bad thing overall, but will it be a relaxing, soothing, really enjoyable experience, or do you see it and your blood pressure is up 30 points, you need to pee, and maybe need to rest for 10 minutes!

I just remember back in the day comparing Technicolor to a great B&W movie: both were great, but I still remember the B&W as being more rewarding, a lasting memory, an almost spiritual experience, rather than the in-my-face Techno blast from hell.

I am sure the Technicolor stuff was more fast-paced compared to B&W drama and scary movie stuff, hell I don't know. But I think I can see myself tiring of HDR in a pretty short time. I doubt it will be some everlasting memory of something spiritual. Maybe we have come too far?

It is sort of like IMAX for TV. Talk about over the top!! Clown, cartoon colors with the music set to 11, a 120' wide screen - Christ, it gives me the f-ing heebie-jeebies to even think about it.


1 hour ago, cantsin said:

It shouldn't be over the top - digital cinema/DCP has always been HDR. HDR just means that the gap in color reproduction quality between cinema and consumer video is closing.

Well, you are telling that to a person who in reality thinks even 4K is over the top! The only 4K I really like is out of a 1D C. And that is because Canons are sort of soft and have sort of muted colors. It works in 4K, not so hot in 1080p. Now the downsampled C100 and C300's OOC images look good in 1080p.


Revised cost alert. Any pro videographer shooting V-Log or HLG with the GH5 should probably own the Inferno anyhow, since it unlocks 4K 60p 10-bit. So if, like Mark and Ron, you already own an HDR smartphone, the investment is only $145.00 for a converter. hehe

And in no way, shape or form am I denigrating the work of Art Adams and the dozens of others who generously give their time to share shooting, monitoring, grading and delivery recommendations - it is invaluable information - but it's often so technical that it's likely to scare off anyone who gets dizzy just reading the terminology. Many people shooting today, if they'd had to read this many technical considerations about SDR - about color gamuts, bit depth, learning to read scopes, how dynamic range and color depth are compressed, chroma subsampling, gamma, log curves, debayering, color volumes - would never even have picked up a camera. Remember, when 4K arrived not so long ago, there were just as many, if not more, dire warnings against such things as shooting close-ups of talent, concerns about the additional diligence required of makeup artists and costume designers, and outright dismissal because of the absence of 4K projectors in theaters and 4K televisions in the home. Is it just a coincidence that the ones here most vehemently opposed to HDR are the very ones who insist on 1080p and diffusion filters? Do yourselves a favor and watch an episode of Chef’s Table on Netflix in Dolby Vision.

Rather than reading pages and pages about taking extra care with specular highlights and large bright areas on set, why not just go out and shoot some tests yourself? You'll quickly discover some of these precepts on your own. Then there's a lot of hand-wringing about how the video will be viewed in the home or at the theater - but that is something that has plagued filmmakers and colorists from day one! I shoot 4K, and I usually make my titles small because I don't like chunky titles and I expect my work to be viewed on a 27” or larger display - so for sure anyone watching my stuff on a 5” iPod is going to be squinting.

The same applies to highlight roll-off when grading HDR10. Should you deliver for 1,000, 5,000 or 10,000 nit displays? Realistically, most HDR sets today fall anywhere from around 500 to 1,000 nits, so the answer should be pretty obvious, at least for those of us who aren't shooting for theatrical release. Fifty years down the road, when every suburban housewife has a 10,000 nit display on the refrigerator door, you can go back and deliver the project at 10,000 nits. Some may not be aware of this, but studios already make several trim passes - deliverables for the myriad distribution options - which is how I'm able to enjoy a Dolby Vision program on a 600 nit display. For professionals, SMPTE has come up with the Interoperable Master Format (IMF), whose purpose is to make versioning simpler by wrapping all the versions in one container. Should YouTubers and wedding photographers be concerned with all this? Probably not.
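For those curious why the nit target matters at all when grading HDR10, here is a short sketch of the SMPTE ST 2084 (PQ) transfer function that HDR10 uses. It maps signal values to absolute luminance up to 10,000 nits, so the peak you grade to determines how much of the signal range you actually use. The constants are from the published spec; the function names are just for illustration:

```python
# A short sketch of the SMPTE ST 2084 (PQ) transfer function used by HDR10:
# it maps a normalised signal (0..1) to absolute luminance in nits, up to
# 10,000. Constants are taken from the published spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """PQ signal value (0..1) -> display luminance in nits (cd/m^2)."""
    e = signal ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def nits_to_pq(nits):
    """Inverse: luminance in nits -> PQ signal value (0..1)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

print(f"100 nits (typical SDR peak) ~ PQ {nits_to_pq(100):.2f}")
print(f"1,000 nits ~ PQ {nits_to_pq(1000):.2f}")
print(f"10,000 nits ~ PQ {nits_to_pq(10000):.2f}")
```

Roughly speaking, a 1,000-nit peak already sits around three quarters of the way up the PQ signal range, which is why grading to today's 500-1,000 nit sets is a sensible compromise.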


7 hours ago, webrunner5 said:

It is sort of like IMAX for TV. Talk about over the top!! Clown, cartoon colors with the music set to 11, a 120' wide screen - Christ, it gives me the f-ing heebie-jeebies to even think about it.

Demo reels and "test shots" featuring once-new technology always look cringeworthy in retrospect. On your comparison of B&W with Technicolor: in one of his documentaries on cinema history, Martin Scorsese shows that color was used creatively early on. They learned to hold back very quickly.

7 hours ago, cantsin said:

It shouldn't be over the top - digital cinema/DCP has always been HDR. HDR just means that the gap in color reproduction quality between cinema and consumer video is closing.

2D projection had to be 48 nits peak (14 foot-lamberts). With brighter projection, you'd lose contrast again. The blacks have never been very convincing in cinema either. It's true that analog film in particular (but digital cinema packages too) can hold more stops of light - more than could be shown.

This whole HDR affair is about new display technology more than about camera technology. Cameras that can record 10-15 stops in 10-bit or higher have been with us for a few years now. It's just that, until recently, most of it was "lost in translation" for distribution.

The downside will probably be that HDR is less 'forgiving'. Many affordable cameras were just a tad better than what the 8-bit rec_709 YouTube clips they were bought for demanded.
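As rough arithmetic for why that is: if a log curve spreads the captured stops more or less evenly across the available code values (a simplification - real log curves don't distribute them exactly evenly), an 8-bit recording leaves very little room per stop once it's stretched out for an HDR display. The stop count and range handling below are illustrative assumptions:

```python
# Rough arithmetic behind "less forgiving": if a log curve spreads the captured
# stops more or less evenly over the available code values (a simplification -
# real log curves don't), the room per stop at each bit depth looks like this.
def code_values_per_stop(bit_depth, stops, video_range=True):
    # Video ("legal") range keeps head/footroom like 8-bit 16-235; full range
    # would use every code value.
    levels = (235 - 16) * 2 ** (bit_depth - 8) if video_range else 2 ** bit_depth - 1
    return levels / stops

for bits in (8, 10, 12):
    print(f"{bits}-bit, 14 stops of log: ~{code_values_per_stop(bits, 14):.0f} code values per stop")
```

Around 16 code values per stop in 8-bit versus roughly 60 in 10-bit is the difference between visible banding and a grade you can push around.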

5 hours ago, jonpais said:

Remember, when 4K arrived not so long ago, there were just as many, if not more, dire warnings against such things as shooting close-ups of talent, concerns about the additional diligence required of makeup artists and costume designers, and outright dismissal because of the absence of 4K projectors in theaters and 4K televisions in the home. Is it just a coincidence that the ones here most vehemently opposed to HDR are the very ones who insist on 1080p and diffusion filters? Do yourselves a favor and watch an episode of Chef’s Table on Netflix in Dolby Vision.

I was among the 4K skeptics. Resolution is not about image quality. The trek moved in the wrong direction. If 4K was sharper than HD, it was because it hadn't been true HD before. And for the sake of more pixels, everybody was happy to allow heavier compression - although storage costs have become so low (I have to think of John Oliver's "How Is This Still a Thing?"), and in spite of the warnings that compression artifacts degrade perceived image quality the most (see Yedlin again).

But I wasn't "opposed" to 4K. To those who feared problems with makeup and the like, I said: why would it make a difference? Would I light and frame differently? No. Why? I had been shooting DV. Did I avoid long shots with a lot of background detail? No. Why?

The same with HFR. We've discussed this ad nauseam. I had reservations, but I could name the reason: the comparative lack of motion blur takes away momentum. If you know about that, you can shoot accordingly. As I see it, you can still shoot 24p in UHD. The resolution goes down as the camera moves? So be it. If you are fixated on resolution, you will eventually stop motion altogether. No more fights, no more car chases. Pristine calendar stills of the graveyard, soon with a 1,000-nit sun playing behind the headstones...


26 minutes ago, jonpais said:

@Axel My mistake! It was when 1080p arrived that there were already concerns about seeing lines, wrinkles and blemishes on talents’ faces. 

I was skeptical about HD as well. I first saw it in autumn 2004 at a trade show, with the then-new Sony FX-1. The worries were unsubstantiated, since the images were still being viewed on SD TVs at the time. I wasn't impressed.

The first time I saw UHD was at a trade show again. Some JVC camcorder, stitching four HD videos together. Horrible colors, terrible edge sharpening. The best part was where they showed a fish tank that was supposed to look real. The audience was impressed. I said, no, the fish look dead - I have a more convincing screensaver...

Groundhog Day.


9 hours ago, markr041 said:

I don't get this - I can see an immense difference between HDR and SDR on my Samsung Galaxy phone for almost every HDR video on YouTube. [...]

Isn't the Samsung an OLED screen? I've definitely said before that it's a screen brightness thing that makes HDR pop. An HDR and an SDR display calibrated to the same brightness look roughly the same. It's the high contrast ratio of OLED screens like your Samsung's that gives it that pop, and the color gamut is in fact an integral part of HDR, so that matters as well. The Rec.2020 spec calls for 12-bit color depth (and the screen displays at most P3 colors, which are nowhere close to the Rec.2020 gamut). There isn't a consumer camera shooting 12-bit color depth, so you're not even getting the full benefit of shooting Rec.2020 for HDR; it's just a nice feature to have when you need to match your footage to higher-quality cameras as a B cam. Most of your colors in 8-bit HLG will be out of range of the color gamut, kind of like shooting Sony's S-Gamut in 8-bit. You are treating changes in luminance as "good HDR" when that is only a small piece of what is supposed to make HDR the next big thing.
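As a rough check on how far P3 falls short of Rec.2020, here is a quick comparison of gamut triangle areas in the CIE 1931 xy diagram using the published primaries. Comparing xy areas is a crude, non-perceptual measure (a u'v' comparison would give somewhat different numbers), but it makes the gap concrete:

```python
# A crude check on the P3-vs-Rec.2020 point: compare gamut triangle areas in
# CIE 1931 xy chromaticity space. This is a rough, non-perceptual measure, but
# it shows how far a P3-limited display falls short of the Rec.2020 container.
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec2020_area = triangle_area(PRIMARIES["Rec.2020"])
for name, pts in PRIMARIES.items():
    print(f"{name}: ~{triangle_area(pts) / rec2020_area:.0%} of the Rec.2020 xy area")
```

By that crude measure, a P3-limited panel reaches only around 70% of the Rec.2020 container, and Rec.709 only a bit over half.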

HDR is not replacing SDR any time soon - unless your audience cares for side-by-side A/B comparisons in a theater setting or on YouTube, which I can assure you no consumer does. As long as the content is engaging, who cares? No one is skipping out on the latest film because it's shot in SDR instead of HDR. I'm not against HDR as a format, but this talk of it replacing SDR right now is gimmicky at best.

