
How Important is 10-Bit Really?


Mark Romero 2


At what point does one really NEED 10-Bit 4K?

How serious does your grading have to be for 8-bit to not work well?

Is Sony's 8-bit S-Log 4:2:0 really that prone to banding?

(I read through the recent thread about "converting" 8-bit to 10-bit using noise removal software, and thought it has some potential for Sony cameras, although it isn't technically converting to 10-bit.)

Thanks in advance.



Well, I guess if you are really good at exposure, proper lighting, not shooting into the sun, etc., 8-bit works. It has somehow worked for many years.

But in a professional venue they have every base covered: better cameras, great cameramen, great audio, great lighting, great graders, great editors, great actors, on and on.

So it is hard for the average person to compete with the pros, but I think the gap has now closed a lot. Still, you are not going to do green screen or VFX work in 8-bit very well.

But I think we beat ourselves up too much these days over output quality. I am not saying not to try as hard as possible to achieve it, but at the cost of always being in debt and worrying about it all the time? Bah, I say.

Look back at how bad 8mm cameras were, Super 8 even. Nobody said, damn, that sucked. They were happy to see what had been shot, not complaining about the quality. And I would imagine most of the people that watch videos have no clue that 90% of the problems we piss and moan about even exist. The thing that drives me crazy most is poor audio, not poor video, in a short or a long. And other than one person on here, we rarely ever even talk about that subject.


Dave Dugdale proved to me with this video that 10-bit is fairy dust if you are a run & gun shooter, a wedding cinematographer, or in love with architecture cinematography.

I have never faced banding issues with an 8-bit codec, and if you shoot with the correct WB you don't have to twist your footage till it breaks. And if it breaks, it's probably because of the 100 Mbps bitrate and not the 8 bit...

I think for most cases 10 bit gives you just a little more advantage over 8 bit if you nail your exposure and WB.

And I have worked with the BMPCC in ProRes/RAW and with 5D RAW before.

It seems like the next big thing is shooting in HLG with 8-bit, which works perfectly fine.


1. Bit rate

2. Color sampling

x. Bit depth

 

100 Mbps can fall short, where 400 Mbps (or 200 Mbps for a non-intra codec, AFAIK, like the one used on the X-H1) can overcome further issues.
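To put rough numbers behind ranking bit rate and color sampling ahead of bit depth, here's a quick back-of-the-envelope sketch (my own arithmetic, not from any camera spec): the uncompressed bits per pixel for common bit depth and chroma subsampling combinations.

```python
# Uncompressed bits per pixel for bit depth / chroma subsampling combos.
# 4:4:4 stores 3 full samples per pixel; 4:2:2 averages 2; 4:2:0 averages 1.5.
def bits_per_pixel(bit_depth, subsampling):
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return bit_depth * samples

for depth in (8, 10):
    for sub in ("4:2:0", "4:2:2", "4:4:4"):
        print(f"{depth}-bit {sub}: {bits_per_pixel(depth, sub)} bits/pixel")
```

Note that going from 8-bit 4:2:0 to 10-bit 4:2:2 raises the raw payload from 12 to 20 bits per pixel, yet a codec starved at 100 Mbps will throw most of that away anyway, which is the point.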

 

Here's an informative link:

https://fstoppers.com/education/can-you-see-difference-between-10-bit-and-8-bit-images-and-video-footage-166977

 

And this one endlessly posted over here before:

[attached frame grab: 8-bit vs 10-bit gradient comparison]

sample extracted from (minute 28:30):

https://vimeo.com/114978513

 


There's no simple answer here. You really need to do your own tests. Imo, the difference between 10 bit and 8 bit is on its own nearly invisible in every circumstance, even with graded log footage. But it's only part of the story.

In theory, if your footage isn't heavily compressed and the camera's sensor is sufficiently noisy (noise is dithering), even if the footage is shot log, increasing the bit depth will only improve tonality and reduce the appearance of noise. It won't have an effect on whether there's banding or not. And on a standard 8 bit display, this difference will be basically invisible. (HDR is another story; you need more tonality for the wider gamut and flatter recording gamma.)
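The "noise is dithering" point can be shown with a toy sketch (my own illustration, with made-up numbers): quantize a very flat gradient to 8-bit codes with and without sensor-like noise, and count the distinct code values that come out.

```python
import random

def quantize(x, bits):
    """Quantize x in [0, 1) to an integer code at the given bit depth."""
    levels = 2 ** bits
    return min(int(x * levels), levels - 1)

# A very flat gradient spanning only ~4 8-bit code values: a classic
# banding scenario (think a clear sky). Values are illustrative only.
gradient = [0.500 + 0.000015 * i for i in range(1000)]

clean_8bit = [quantize(v, 8) for v in gradient]
print("distinct 8-bit codes, no noise:", len(set(clean_8bit)))  # a few hard steps

random.seed(0)
# Sensor noise on the order of one code value acts as dither:
# the hard steps get broken up and average out to a smooth ramp.
dithered = [quantize(v + random.gauss(0, 1 / 256), 8) for v in gradient]
print("distinct 8-bit codes, with noise:", len(set(dithered)))
```

The clean quantization collapses the ramp into four flat bands, while the noisy version scatters codes across the boundaries, which is why a noisy sensor can mask banding that a clean one shows.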

But add in in-camera noise reduction and macro blocking (and Premiere having a broken codec for years when interpreting dSLR footage, if I remember correctly) and you get the potential for tons of banding, some of which 10 bit color will reduce (generally in the sky, and into more bands rather than a smooth gradient, but more is better than fewer), some of which 10 bit color won't obviate at all (macro blocking). The FS5 and F5 have some banding issues in SLOG2 in the highlights even in 10 bit in certain codecs, in my experience. Certain cameras shooting in 8 bit, however (in the right codec), won't exhibit any banding or even suffer from significant tonality problems. (The C300 will rarely exhibit banding with an external ProRes recorder, except perhaps at high ISO with noise reduction on. But the built-in codec can exhibit banding for sure. That sensor is very noisy. Fwiw, the noisiest sensor I've seen might be the Alexa, or maybe it just has less in-camera NR. Can't get banding from that camera no matter what. I found a lot of banding with Canon Technicolor on dSLRs, though. None ever with any raw camera except my Sigma point and shoot.)

The idea of "upscaling" bit depth is specious. I think that thread might have been posted as a troll or a joke. The examples posted in the thread basically just show noise reduction reducing macro blocking (not very well, I might add), which is sort of beside the point. But yes, you can use noise reduction to reduce macro blocking, which I guess is the look people have blamed on codec bit depth, but it really has to do with codec bitrate.
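A toy illustration of why the "conversion" is specious (my own example, not from that thread): smoothing banded 8-bit codes does produce in-between values, which hides the hard steps, but it's pure interpolation; detail that was never recorded doesn't come back.

```python
# Three hard bands, as an 8-bit codec might record a gentle sky gradient.
banded = [128] * 10 + [129] * 10 + [130] * 10

def box_filter(codes, radius=3):
    """Average each sample with its neighbours (a simple 1-D blur)."""
    out = []
    for i in range(len(codes)):
        lo, hi = max(0, i - radius), min(len(codes), i + radius + 1)
        out.append(sum(codes[lo:hi]) / (hi - lo))
    return out

smoothed = box_filter(banded)
print("distinct values before:", len(set(banded)))   # 3 hard bands
print("distinct values after:", len(set(smoothed)))  # many in-between steps
```

The filter invents intermediate values between the recorded codes, so the steps look softer, exactly what noise reduction does to macro blocking, but nothing about the scene's real tonality has been recovered.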

Imo, 10 bit vs 8 bit is hugely overrated and essentially a moot point. But it's one of many many factors that are responsible for thin footage from certain cameras. Rent and run your own tests. And yes, the A7 line in SLOG 2 can exhibit serious banding. You can clean it up in post, though (with other tools than noise reduction, generally), or shoot to mitigate it. But it is absolutely a problem with those cameras, and whether it's a problem for you or you're fine dealing with it in post when it appears, is up to you. 

So it's two separate questions. Is the difference between 10 bit and 8 bit big? Generally, no. Is banding a problem on certain 8 bit cameras? It sure is.

Rent, test, double check, then buy. But, imo, bit depth is far less important than bitrate when it comes to avoiding banding in the image.


We just had an interview graded with the director of a big franchise release that was shot on an A7RII in SLog. The producer says he was surprised, because this shooter was supposed to bring an FS7. The interviewee was wearing a black shirt that was reflecting some purple/pink gelled light (the movie has a vibrant color palette), and the shadows were a disaster because of this. Noise, no tonality, just crunchy garbage. There was also a skin tone issue with a lack of tonality between yellows and pinks. The colorist did what he could to clean it up. Keep an eye open for an HBO First Look debuting in a week to see what I mean.


7 minutes ago, Emanuel said:

1. Bit rate

2. Color sampling

x. Bit depth

...

Here's an informative link:

https://fstoppers.com/education/can-you-see-difference-between-10-bit-and-8-bit-images-and-video-footage-166977


I'm gonna check this link out! Never seen this, but that's exactly what I'd expect the difference to be. Subtle difference when increasing bit depth, huge difference when increasing bit rate.


19 minutes ago, HockeyFan12 said:

I'm gonna check this link out! Never seen this.

Well, it's been my usual post on this topic for the last two and a half years at least... We need to actually demystify this thing : ) If 16 million colors is enough, why spend 4x what's strictly necessary to handle the same thing?
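For reference, the arithmetic behind the "16 million colors" figure (my own sketch of the numbers):

```python
# 8-bit = 2**8 levels per channel, 10-bit = 2**10;
# total displayable colors = levels cubed (R x G x B).
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, {levels ** 3:,} colors")
```

So 8-bit gives roughly 16.7 million colors and 10-bit roughly 1.07 billion; per channel, 10-bit stores 4x the levels, which is where the "4x" comes from.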


1 hour ago, Emanuel said:

Well, it's been my usual post on this topic for the last two and a half years at least... We need to actually demystify this thing : ) If 16 million colors is enough, why spend 4x what's strictly necessary to handle the same thing?

Different people see the world differently. The clip you posted shows 8 bit and 10 bit gradients looking nearly identical to me, but for others perhaps the slight difference is very pronounced, based on how important 8 bit vs 10 bit is to some.

For high end HDR (15+ stops) the color gamut and dynamic range are MUCH wider and there I truly do believe 10 bit color makes a significant difference with tonality.

But for me it doesn't matter at all with the cameras I use. And I think everyone has to try this out for themselves!


25 minutes ago, HockeyFan12 said:

Different people see the world differently. The clip you posted shows 8 bit and 10 bit gradients looking nearly identical to me, but for others perhaps the slight difference is very pronounced, based on how important 8 bit vs 10 bit is to some.

For high end HDR (15+ stops) the color gamut and dynamic range are MUCH wider and there I truly do believe 10 bit color makes a significant difference with tonality.

But for me it doesn't matter at all with the cameras I use. And I think everyone has to try this out for themselves!

Agreed. Different needs apply. In any case, it's a matter of bit rate, with color sampling as the second factor, both weighing more heavily than the usual bit depth yadda yadda yadda... my fair guess.

Practice variably redefines theory, and people tend to neglect that :-)


1 hour ago, Emanuel said:

And this one endlessly posted over here before:

vlcsnap_2015_04_18_01h38m52s199.thumb.png.41f71e4ed1000283d1a0babf9b422d97.png

 

If we go by this picture.... then some parts of House of Cards (first season, on Netflix) were probably filmed and/or delivered in 8-bit 4:2:0. I saw banding in the low-light (lamp) scenes, even in the first two episodes, and those two episodes were directed by David Fincher (who is a RED guy), and by the looks of it, it may have been shot on a RED cam.

http://www.red.com/shot-on-red/television  - go into search and start typing "house of cards"


29 minutes ago, mkabi said:

If we go by this picture.... then some parts of House of Cards (first season, on Netflix) were probably filmed and/or delivered in 8-bit 4:2:0. ...

Streaming can be 8-bit 4:2:0 because viewers have no need of more. Take a look here:

https://www.rtings.com/tv/learn/chroma-subsampling

Delivered or even produced for Netflix? No way; no less than 10-bit 4:2:2.

But the issues you mention quite likely come from low bit rate and compression.


8 minutes ago, mkabi said:

If we go by this picture.... then some parts of House of Cards (first season, on Netflix) were probably filmed and/or delivered in 8-bit 4:2:0. ...

Those are two very different things. Other than over-the-air broadcast television, which the government has mandated to be of a certain quality, everything else is subject to heavy compression. Maybe at some point in the future they will come up with better compression algorithms, but until then just about anything you see on TV other than over-the-air broadcast is going to have an unacceptable amount of compression artifacts... and broadcast ain't perfect either. It's just that most of the time it's the best of the worst.

It is what it is, but there is simply no way you can work backwards from a lossy compressed final output and start commenting on the rest of the pipeline all the way back to the camera and the DP's brain.

I see banding all the time on Netflix.  That is one of the reasons I find 4k and HDR so baffling.  Can we get at least broadcast quality 1080i first?!


41 minutes ago, mkabi said:

If we go by this picture.... then some parts of House of Cards (first season, on Netflix) were probably filmed and/or delivered in 8-bit 4:2:0. ...

Did you work on the online or are you referring to the show as viewable on Netflix?

Of course the Netflix version has tons of banding. Fincher aims for a very clean image, and then it's massively compressed for Netflix. And generally web content is not compressed with the same care that's taken with Blu-rays, which are processed on high-end proprietary Linux machines using special software that can mitigate banding. (Which, yes, is an issue with Blu-rays. I'm not saying bit depth is totally irrelevant, particularly in shadows and highlights.)

If you were working on the online, however, that's a very interesting point. Denoising can help with compression efficiency, but a cleaner image at 8 bit from a high-quality codec can exhibit far more banding than a noisier image at the same bitrate in the same codec. I believe the Prometheus Blu-ray, for instance, had added grain.

House of Cards was shot in HDR (dual exposures, then merged) on Reds, I believe. Not sure it was all HDR. So it was something like 28-bit raw in effect.


Practically, I don't need it. IF a shot has a gradient that creates banding, I can fudge over it with some post voodoo. Or, more likely, just live with it.

*gasp!*  

Okay, I'll pardon you to retreat to the chaise and allow recovery of your delicate constitution...

Hey, I'm a low-fi filmmaker, got better stuff to worry about.

Now, if I were in a more up-market situation, it would matter. But I ain't, so it doesn't. And, on a side note, the IQ tech will be so democratized in 5 years it REALLY won't matter.

----------

Funny story: I was running a film fest, and one filmmaker had the most engaging feature-length narrative film about love/family/redemption. Beautiful characters, etc., etc. Looked like garbage though. Shot on a Sony HDR-SR1. Bad lighting, exposure, color grading, composition, cinematography; the highlight roll-off was disgusting! ... Still somehow an engaging film! The writing and acting were so good.

However, the director came up to me an hour before the screening, distraught that he hadn't given us the ProRes 4:4:4 file (we were going to screen the .mp4): "Oh, please fix this problem! Please!"

Dude, what problem, I thought. Your film looks like shit and we still like it.  4:4:4 ain't an issue.

Anyway, we screened the ProRes 4:4:4 and he felt better. People is crazy, y'all.

On a side note, I'm watching The Wire for the first time. Looks like crap too. (Yes, even shot on film.) Sets are so poorly lit or overlit, and there's nothing really interesting going on with the cinematography. Oh well. Still good stuff to watch.


14 minutes ago, fuzzynormal said:

People is crazy, y'all.

Indeed!

What season / episode have you made it up to in The Wire??? Far and away one of the best television series ever made.

I remember in the book about the series, the DP was asked how he made the dreary settings in The Wire so authentic, and he replied, "I'm an expert in ugly."


18 minutes ago, Mark Romero 2 said:

Indeed!

What season / episode have you made it up to in The Wire??? Far and away one of the best television series ever made.

I remember in the book about the series, the DP was asked how he made the dreary settings in The Wire so authentic, and he replied, "I'm an expert in ugly."

Season One. When McNulty, Bubbles, and Det. Kima go over their work and Bubbles has a little soliloquy, he was so overlit it looked like a high school stage play!


5 hours ago, HockeyFan12 said:

There's no simple answer here...

...Imo, 10 bit vs 8 bit is hugely overrated and essentially a moot point. But it's one of many many factors that are responsible for thin footage from certain cameras. Rent and run your own tests. And yes, the A7 line in SLOG 2 can exhibit serious banding. You can clean it up in post, though (with other tools than noise reduction, generally), or shoot to mitigate it. But it is absolutely a problem with those cameras, and whether it's a problem for you or you're fine dealing with it in post when it appears, is up to you.

Thanks for sharing your insights and for the advice.

A few follow up questions if I may;

Could you better define "thin footage" for me? I hear lots of people say this (in particular about 8-bit footage), but I never seem to get a clear idea of what specifically they mean. Does the footage lack saturation? Vibrancy? Acutance? DR? A combination of missing a little bit of a LOT of things? (Meaning, a combination of lacking a bit of contrast, a bit of vibrancy, a bit of saturation, a bit of DR?)

Also, regarding ways to clean up banding in post. Besides the aforementioned noise reduction in post, what other ways are there to deal with it?

And finally, you mentioned "shoot to mitigate it." Any tips on how to shoot to mitigate it?

Thanks in advance.

2 hours ago, fuzzynormal said:

Practically, I don't need it.  IF a shot has a gradient that creates banding I can fudge over it with some post voodoo.  Or, more likely, just live with it.

Can you share any of your secret sauce for taking care of it in post??? Thanks in advance.

5 hours ago, Zak Forsman said:

We just had an interview graded with the director of a big franchise release that was shot on an A7RII in SLog. ... Keep an eye open for an HBO First Look debuting in a week to see what I mean.

Thanks for the tip.

1 hour ago, jagnje said:

10 bit is just as unheard of on big productions as 8 bit is; by no means is it regarded as a pro bit depth.

Don't expect miracles with it; in most cases it is pretty much the same. The biggest difference, from what I notice, is 4:2:2 vs 4:2:0.

Thanks for the insights.

I am surprised about the difference between 4:2:2 and 4:2:0

