
4K 8bit to 2K 10bit - Let's get to the bottom of this please!



Guest Ebrahim Saadawi

While I appreciate (and learned a lot from) all the mathematical debates and talk, I want to get to the bottom of this using real examples, once and for all.

- The 10bit advantage over 8bit is reduced banding and finer color gradients, right? So here you go:

This is a 4K 8 bit file showing chroma banding (upper left area), exported to TIFF for no loss in image quality. Download it, downscale it yourself, and tell us what you find:

https://drive.google.com/file/d/0Bz2AvpYrksB1OVdZRjNmNXNJMXM/edit?usp=sharing


(For me, it didn't reduce banding. I downscaled with MPEG Streamclip to Avid DNxHD 444 10bit and saw no reduction in banding. But I'll assume I'm doing something horribly wrong, so I'll leave the downscaling to you!)

___________________________

If there's anything wrong with my frame, please take a JPEG picture showing banding with any camera you have (it will be more than 4K resolution and also 8 bit), then downscale it to 1080p and tell us the results.


This is such an easy debate to settle; I have no idea why there are still all these wars going on in the forums.

Let's get this over with, once and for all!


  • Administrators

I believe what you're seeing in that frame is not the result of 8bit colour but actually heavy compression.

 

Bland areas like skies are more heavily compressed than areas with fine detail.

 

Also remember that colour remains 8bit... it is only the depth of the luma channel and colour sampling which benefit from the downscaling from 4K to 1080p.


How did you get that source file? Something that looks that horrible won't turn into something usable, of course. It doesn't look like a proper 8 bit file; 8 bit isn't that bad. It looks heavily processed, either in post or with a combination of extreme settings in camera.


  • Administrators

It looks like mega extreme push or pull, no wonder it is shot to pieces.

 

Try instead shooting a detailed surface which is unevenly lit, say with a light source at one edge of it. Shoot it in 4K, then don't touch the image; just convert it to 1080p 10bit ProRes. Only after that, see how it grades compared to the 8bit 1080p files direct from the GH4.


And here's an example from me

 

 

We're cross-posting here, this was on another thread this morning...

 

First off, 100Mbps at 25p. Then the same shot a few seconds later, no changes other than to switch to 4k. Then import to FCPX, optimizing footage to ProRes. Drop both shots onto a 1080p timeline. Add a Hue/Saturation filter, crank up the Saturation to 2.0. Export as 1080p.

 

Obviously nobody would actually add that much saturation in real life, but it exaggerates the colours in a grey Newcastle sky.

 

I think the difference really stands out.

 

Settings: Cinelike V tweaked to Mr AR’s recommendations, cloudy WB, shutter speed 1/50, iso 200, f3.5. Not sure what the colour balance difference is due to - I changed nothing on the camera between shots other than changing between HD and 4k.


Guest Ebrahim Saadawi

gh4-4k-gradation.jpg

 

Here's an example from me...

 

Top shot is 100% crop of the 4K image. You can see some banding. Bottom shot is 100% crop of the 4K down sampled to 1080p. Much smoother looking and crisper.


Thank you Andrew. This is the first proper test I've seen regarding the 4K 8 bit to 2K 10 bit question! It does look like the chroma gradient is improved, though I'm not sure (detail and resolution certainly are). I hope some others post their examples too.


Regarding my frame: it's shot on the GH4 in 4K, and it's heavily pulled in post to make the banding much more apparent, of course. But the banding was there, it just wasn't that apparent. I thought it would be easier to judge banding reduction when the bands are that obvious. If that's wrong, please post yours. Again, it can easily be done with any JPEG still from any camera that shows banding.


So you will notice that nobody wanted to use your image in their tests, Ebrahim... but what you posted is on the right track as a way to test this theory (although that level of banding is not necessary): you need to try to fix something with clear 8bit 4:2:0 artifacts and then look at it up close while grading at 2K. Andrew, I could not really see clear artifacts in your source 4K image, so to me it just looks like a little anti-aliasing, and a 50% zoom out is what you are reading as a bit depth conversion improvement.

 

Keep in mind that nobody at CineForm, not even the person who wrote the conversion app (which is just adding 4 pixels), has claimed that the technique will give you the same latitude in post as footage recorded natively at 10bit depth. They have only confirmed the math, not what you can do with the results.

 

This animation I did in Nuke should suffice mathematically/visually, but if you want to use the conversion app, it will have to be run by someone on a Mac. The second half of the video is 4K reformatted to 2K in 32bit float space. All the 8bit gradients look the same, but the gradients that started at 10bit or higher still grade vastly smoother. In other words, the 4K 8bit 4:2:0 or 4:2:2 gradients did not turn into 10bit gradients when reformatted to 2K, and do not prevent banding when graded.
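The Nuke test described above can be approximated in a few lines of numpy. This is only a sketch of the idea, not the actual comp: the ramp range, gain and sizes are made-up values. Quantise a dark gradient at 8bit and at 10bit, "reformat" both to half width in float space, push hard, and count the distinct output levels.

```python
import numpy as np

# A dark horizontal ramp, like a shadow-to-sky gradient (values are made up).
w = 4096
ramp = np.linspace(0.0, 0.05, w)

q8 = np.round(ramp * 255) / 255       # quantised to 8bit codes
q10 = np.round(ramp * 1023) / 1023    # quantised to 10bit codes

# "Reformat to 2K in 32bit float": average neighbouring pixel pairs.
down8 = (q8[0::2] + q8[1::2]) / 2
down10 = (q10[0::2] + q10[1::2]) / 2

# Push the footage hard (20x gain), then count distinct 8bit output levels.
def push_levels(x):
    return np.unique(np.round(np.clip(x * 20, 0, 1) * 255)).size

print(push_levels(down8), push_levels(down10))
```

On this ramp the 8bit source still yields only a couple of dozen output levels after the float downscale and push, while the 10bit source yields several times as many: the 2K reformat did not turn 8bit gradients into 10bit ones.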

 


Guest Ebrahim Saadawi

 

 

This animation I did in Nuke mathematically/visually should suffice, but if you want to use the conversion app, it will have to be from someone on a mac. The second half of the video is 4k reformatted to 2k in 32bit float space. All the 8bit gradients look the same, but the gradients that started in 10bit or higher still look vastly smoother. In other words, the 4k 8bit 4:2:0 or 4:2:2 gradients did not turn into 10bit gradients when reformatted to 2k.

 

Thank you. Very helpful.

So you don't believe the theory... while Andrew does.

I also don't see banding in Andrew's photo above. So what do you think, Andrew? His tests above seem to prove 4K 8 bit has the same amount of banding when downscaled to 2K.

The only way this will be settled is with a clear 4K 8bit banded image. 
 

"you need to try to fix something with clear 8bit 4:2:0 artifacts/banding." So how can I create an image with proper chroma banding that would be worthy of testing on, Sunyata? 
 


No problem... I just want to help explain these issues of colorimetry and color correction; it's already confusing enough without adding to it.

 

One thing about my test that might not be clear: I'm not just converting already-pushed footage. I'm converting a clean gradient first, and then pushing it, in both the 4k and 2k tests. The gradient source has only been converted to n-bit depth and chroma, with no heavy grade until after it was reformatted to 2k.

 

Here are some 4k 8bit 4:2:0 TIFFs that have not been "pushed" yet, if you want to test on them. The key is to make the gradient very dark; that shows the limited range of 8bit. I then did the 4:2:0 conversion using ffmpeg's "-pix_fmt yuv420p" option and saved as high-quality TIFF, to prevent more compression from entering into the test.

 

I'm also not using studio swing range, so this is a little bit better than rec709.

 

http://www.collectfolder.com/420test.zip
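For anyone who wants to build a similar test frame themselves, here is a rough numpy sketch (sunyata used ffmpeg; this simulates the 4:2:0 step directly, with approximate full-range BT.601 maths, and the gradient colours are my own made-up values):

```python
import numpy as np

# Dark blue-ish gradient, like an underexposed sky (values are made up).
h, w = 256, 256
ramp = np.linspace(0.0, 0.08, h)[:, None].repeat(w, axis=1)
r, g, b = 0.3 * ramp, 0.5 * ramp, ramp

# Full-range BT.601-style RGB -> YCbCr.
y = 0.299 * r + 0.587 * g + 0.114 * b
cb = 0.5 + (b - y) * 0.564
cr = 0.5 + (r - y) * 0.713

def subsample_420(chan):
    # Average each 2x2 block and repeat it back up: a rough 4:2:0 model.
    blocks = chan.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return blocks.repeat(2, axis=0).repeat(2, axis=1)

cb420, cr420 = subsample_420(cb), subsample_420(cr)

# Quantise to 8bit codes, as the camera would, before saving as TIFF.
y8, cb8, cr8 = [np.round(c * 255) for c in (y, cb420, cr420)]
print(np.unique(cb8).size)   # very few distinct chroma codes in a dark ramp
```

The point of making the gradient very dark shows up immediately: the whole Cb channel collapses into a handful of 8bit codes, which is exactly the limited range that banding tests need to exercise.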

 

Also, as a side note: RGB 8bit uncompressed grades really well. It dithers nicely and doesn't really show heavy banding; I was surprised. I think the real challenge is with the chroma subsampling.


  • Administrators

I have a suggestion...

 

Some of you will have access to 10bit (or better) already. Blackmagic Pocket Cinema Camera, 5D Mark III raw, external HDMI on the GH4 itself.

 

It is this you need to compare the 8bit 4K to when doing tests for banding.

 

The banding test on its own is a bit pointless.


  • Administrators

I also don't see banding in Andrew's photo above. So what do you think, Andrew? His tests above seem to prove 4K 8 bit has the same amount of banding when downscaled to 2K.

 

I already told you several times what I think...

 

Compression is a factor, as is exposure.

 

I also said countless times before... 10bit luma is not the same as having 10bit colour.

 

I also told you that David Newman of CineForm / GoPro backs up the 8bit 4K -> 10bit 1080p argument, as does Thomas Worth of Rarevision. If you can't listen to the experts, I, as a non-colour-science engineer, will have no chance of putting you straight!!


  • Administrators

8bit easily has enough to give you a sky with more than a 4-band gradient. It has enough for 256 shades in that sky.

 

The banding in Ebrahim's sample was caused by compression in-camera and then a heavy luma push in post, with high contrast. I can break 10bit footage in the same way if I ramp contrast all the way up. I just tried it.

 

I just did a quick test, nothing special... But clearly the gradient is handled just fine by 8bit HDMI and 8bit 4K. The 8bit 1080p has banding caused by compression. The compression removes the dithering between shades.

 

Peep carefully.... :)

 

gh4-hdmi-banding-comparison.jpg

 

In conclusion... you only REALLY need 10bit if you want to grade heavily. You shouldn't normally notice the difference between 10bit and 8bit footage. That bodes well for the A7S, doesn't it?


I also told you that David Newman of CineForm backs up the 8bit 4K -> 10bit 1080p argument as does Thomas Worth. Please listen to the experts! I'm not a colour science engineer!

 

Andrew, everyone agrees that adding four 8-bit numbers gives a 10-bit number, including Newman and Worth. However, this does not mean scaling down a 4K image to 2K gives useful 10-bit luma, especially for one of the main reasons to shoot at 10 or more bits: reduced banding and improved color tonality.
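The arithmetic point can be shown in a couple of lines (a toy sketch with a made-up code value, not anyone's conversion app):

```python
import numpy as np

# Summing four 8-bit samples does span a 10-bit range...
assert 4 * 255 == 1020      # fits in 10 bits (0..1023)

# ...but inside a flat band of a clean, already-quantised gradient, every
# 2x2 neighbourhood carries the same code, so the sum is just 4 * code:
band = np.full((2, 2), 117, dtype=np.uint16)
print(int(band.sum()))      # 468 == 4 * 117, a multiple of 4
# Intermediate "10-bit" levels only appear where the four source pixels
# differ, i.e. where the signal contains noise or dither.
```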

 

Just about everyone here (including me) is using 8-bit displays! How can we possibly see the results of greater than 8-bits on 8-bit displays?

 

Here's an image created in Photoshop CC to illustrate sky banding using a simple diagonal blue gradient (PNG- no JPG DCT compression):

16BitScreenShot8Bit.png

 

A 16-bit-per-channel image producing banding? How is this possible? This image was captured via a screenshot, so it's 8-bit per color channel, with no chance for Photoshop to HDR-compress the image to 8-bit: it's quantized. If we add .25% Uniform Monochromatic noise and convert to 8-bit:

 

16Bitp25pUMnoise.png

 

What is banding, really, and why is it so noticeable when it happens? It's due to the human visual system's edge detection, AKA Mach banding: http://en.wikipedia.org/wiki/Mach_bands . If the image has even a very small amount of noise, it can stop the "Mach band" trigger, and the gradients look smooth.

 

What if we add .25% noise to the 8-bit image?

8BitP25noise.png

 

How about 4x the noise or 2%?

8Bit2pNoise.png

 

It's clear that adding noise (dithering) before quantizing to 8-bit is required to eliminate banding. For signal processing in general, it's common practice to apply a low-pass filter as well as dither before sampling down to a lower-frequency signal (or a lower image resolution/bit-depth).

The 4K GH4 signal has a fine noise grain: thus when sampling from 4K to 2K, the effect is a low-pass filter plus noise-dithering, which produces an improved result vs. in-camera 1080p (which isn't performing as high-quality a resample as is possible in post). The Sony A7S, however, appears to have a very high-quality resampler in hardware, and thus looks great at 1080p 8-bit.

Noise and compression behavior is the likely reason for the reduced macroblocking/banding artifacts in this example: http://www.shutterangle.com/2014/shooting-4k-video-for-2k-delivery-bitdepth-advantage/ . The reason 4K to 2K scaling can reduce banding is noise / effective dithering and macroblock scale effects, not virtual 10-bit luma: quantized 10-bit luma will still (Mach) band, as did the 16-bit example above without dithering.

 

If shooting material that may have issues with banding, try a higher ISO- it may provide effective noise dither, reducing banding once quantized to 8-bit.
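The dither-before-quantise argument is easy to reproduce numerically. This is a minimal sketch on a synthetic ramp (the ramp range, noise width and filter size are my own choices, not the Photoshop workflow itself): rectangular dither of about one code width applied before rounding lets a local average recover the ramp far more accurately than a straight quantise.

```python
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 0.1, 1 << 16)    # smooth dark gradient in 0..1

hard = np.round(ramp * 255) / 255        # straight quantise to 8bit
# Rectangular dither of +/- half a code width before rounding:
dith = np.round((ramp + rng.uniform(-0.5, 0.5, ramp.size) / 255) * 255) / 255

# A 64-pixel box filter stands in for the eye's (or a downscaler's)
# local averaging; compare how well each version tracks the true ramp.
box = np.ones(64) / 64
err_hard = np.abs(np.convolve(hard, box, "same")[64:-64] - ramp[64:-64]).mean()
err_dith = np.abs(np.convolve(dith, box, "same")[64:-64] - ramp[64:-64]).mean()
print(err_hard, err_dith)   # the dithered version tracks the ramp more closely
```

The hard quantise collapses the ramp into flat bands whose offsets no amount of averaging can undo, while the dithered pixels hop between adjacent codes, so their average lands back on the true value.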


  • Administrators

Well that is exactly what I am saying about the noise and compression.

 

Compression hurts the noise-dither, making banding more apparent.

 

And by the way, I shrunk your banding image...

 

16bit.jpg

 

Guess what, the bands are thinner on the shrunk version... Whudda thought!?

 

Downsampling 4K to 2K clearly makes gradation finer...

 

And if everyone agrees that four 8bit numbers = one 10bit number, then surely when we take dithering, finer gradients and grading a ProRes file into consideration, you are going to see less banding, aren't you?

 

For those with 10bit LCDs: can you download this 10bit TIFF, made from a 16bit Photoshop gradient, and tell us if you see the banding? I do on my iMac display. http://www.eoshd.com/uploads/16bit.tif


JCS--

 

Noise is a tried and true trick to prevent banding in CG, and I've used it many times when delivering dark luminous glows created in post, also adding a matching film-stock grain based on whatever plate I'm trying to match. But it can only do so much. Even when rendering 10bit log Cineon from a float comp for delivery to film prints or DCP, adding noise on top of log can be insurance against banding.

 

The 8bit monitor issue is an old one too. If you're working on an 8bit monitor, it is common to sample the image to make sure any artifacts you are seeing are not in your monitor: you have to hover over the image with a pixel sampler and make sure the numbers are changing.

 

In fact, if I had to get a new monitor, it would be HD 10bit over 4K 8bit, for all these reasons. I have monitors I can test on, but I'm working on an 8bit IPS that I'm used to.

 

But you can tell the difference between 8bit from your monitor and an 8bit file that has 4:2:0 subsampling on it. The chroma subsampling is very blocky and distinct.


  • Administrators

Thanks for the post by the way JCS. I don't disagree with a lot of what you're bringing up.

 

So, monochromatic noise in Photoshop... can we do the exact same thing in Premiere to the GH4's 4K output before downsampling, to smooth out the banding?
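Premiere aside, the noise-then-downscale pipeline being asked about can be sketched in numpy (the frame size, gradient range and noise level are illustrative assumptions, not GH4 measurements):

```python
import numpy as np

rng = np.random.default_rng(1)

def make_8bit_frame(h, w, noise_sigma):
    # A "4K-ish" sky: vertical dark gradient plus optional sensor-style
    # noise, quantised to 8bit (all values here are illustrative).
    ramp = np.linspace(0.0, 0.1, h)[:, None].repeat(w, axis=1)
    return np.round((ramp + rng.normal(0.0, noise_sigma, (h, w))) * 255) / 255

def downscale_2x(img):
    # 2x2 box average in float space, like a simple 4K -> 2K resize.
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

clean = downscale_2x(make_8bit_frame(512, 512, 0.0))
noisy = downscale_2x(make_8bit_frame(512, 512, 0.5 / 255))

# The pre-noised frame ends up with far more intermediate levels after
# the downscale, i.e. the banding has been dithered away.
print(np.unique(clean).size, np.unique(noisy).size)
```

This matches the thread's conclusion: the downscale only creates new in-between levels where the source pixels within each 2x2 block actually differ, which is exactly what adding noise before quantisation guarantees.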


Guest Ebrahim Saadawi

Such an informative discussion. Thank you all for contributing. So the conclusion is: converting 4K 4:2:0 8bit files to 1080p gives us files with 4:4:4 chroma sampling and 10bit luma, but still 8bit chroma depth. However, the chroma banding effects are reduced because of the fine noise dithering and the low-pass filter effect.

The adding-noise trick to eliminate banding is awesome. I'd heard of it before but never thought it would work this well! This is something I'm going to use a lot on all my footage. Thanks!

