10 Bit vs 8 Bit - Sony FS5 vs Sony a7Sii a7Rii - Is 10 Bit Always Better?


Django

Dave's a nice guy and yes, he qualifies that he's only talking about noise, but that doesn't change the fact it's a nonsense test. The FS5 vs A7SII side by side under the bridge was the funniest. He was saying the A7 looked better, but his face in the A7 shot looked like a purple version of the Elephant Man. 

Also he completely failed to acknowledge that, for a professional camera like the FS5, noisy shadows (again vs the A7SII) that hold detail are far more desirable than clean shadows turned to mush by a weak codec and/or noise reduction. Noise can be cleaned up in post while retaining detail. If the NR is in-camera it's a lost cause.

It's a bit like saying "This is a test to see who is the fastest runner, but I'm only going to base it on how fast the athlete's arms move"


I am surprised he posted the video. He sort of got "raked over the coals" by the "heavy-hitters" over at Lift Gamma Gain when he asked them to critique it. See: http://liftgammagain.com/forum/index.php?threads/10-bit-vs-8-bit-%E2%80%93-sony-fs5-vs-sony-a7sii-%E2%80%93-is-10-bit-always-better.6657/

Dave even admitted, " ... From the comments here it sounds like I screwed up this video."

As Marc Wielage (a noted colorist) said: "One thing I would argue is that it's hard to demonstrate 8-bit vs. 10-bit online, for the simple reason that 99.9% of all online video and computer displays are 8-bit. But... I think Dave's intentions were good. I'm not of the opinion that if you put 8 gallons of wine in a 10-gallon barrel it'll taste any better. I think once it's stepped on, it's stepped on. However, I think 8-bit video can look acceptable under certain conditions, especially when you're not really having to stretch it out, bend it, or key it."


Dave's video is fine. 4K 8-bit 422 becomes pseudo-10-bit-luma, 8-bit-chroma 444 at 1080p: 'pseudo' because summing each group of four 8-bit samples up to 10 bits only works when noise/dither varies the samples. Everything consumers see anywhere outside the theater is 8-bit 420. Only filmmakers/hobbyists/specialists will ever see more than 8 bits (you first need a 10+ bit display, graphics card, and OS/app support). Dithering/noise helps reduce banding, and downsampling 4K to 1080p helps reduce noise. Regardless of the math/tech, the A7x II holds up well against the FS5 in the real world, and that's what really matters. The FS5 is a dedicated video camera, so it has other advantages beyond image quality.
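The summing point is easy to sketch in code. A minimal Python illustration (my own, not from the thread; the "true level" of 100.25 and the ordered dither offsets are invented for the demo) of why averaging four 8-bit samples can recover ~10-bit precision, but only when dither decorrelates the codes:

```python
# Sketch: four 8-bit samples average to an in-between value only with dither.

def quantize8(x):
    """Round a 0..255 signal to the nearest 8-bit code."""
    return max(0, min(255, round(x)))

true_level = 100.25                      # sits between two 8-bit codes

# No dither: all four samples land on the same code; averaging learns nothing.
clean = [quantize8(true_level) for _ in range(4)]
avg_clean = sum(clean) / 4               # stuck at 100.0

# With ~1 LSB of ordered dither the codes straddle 100/101, and their
# average recovers the in-between level.
offsets = [-0.375, -0.125, 0.125, 0.375]
dithered = [quantize8(true_level + d) for d in offsets]
avg_dithered = sum(dithered) / 4         # back to 100.25

print(avg_clean, avg_dithered)
```

In a real camera the "dither" is simply sensor noise, which is why the noisier 8-bit 4K sources in these tests downsample so gracefully.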


3 hours ago, jcs said:

Dave's video is fine. 4K 8-bit 422 becomes pseudo-10-bit-luma, 8-bit-chroma 444 at 1080p: 'pseudo' because summing each group of four 8-bit samples up to 10 bits only works when noise/dither varies the samples. Everything consumers see anywhere outside the theater is 8-bit 420. Only filmmakers/hobbyists/specialists will ever see more than 8 bits (you first need a 10+ bit display, graphics card, and OS/app support). Dithering/noise helps reduce banding, and downsampling 4K to 1080p helps reduce noise. Regardless of the math/tech, the A7x II holds up well against the FS5 in the real world, and that's what really matters. The FS5 is a dedicated video camera, so it has other advantages beyond image quality.

JCS, I challenge thee!

I've been wanting to do this here for a while but haven't, for fear it would devolve into silliness. However, because I believe you're a stand-up guy and would do it properly, should you have the means and wish to accept my challenge:

Post 5 pairs of full-resolution frame grabs (JPEGs are fine) of a subject with rich, deep and subtly varied colours: plants, flowers, skin, natural stuff. Each image pair should be identical except that one is 8-bit and the other 10-bit or higher (i.e. either shot simultaneously internally and externally on the same camera, or consecutively in a reasonably controlled environment with no major light changes etc). They should have identical, conservative/neutral grades applied, so that they have a level of contrast and saturation close to what it would be at the point of delivery, but no stylistic adjustments (i.e. correction, but no grading beyond conservative curves and saturation).

If you wish you can apply dithering/noise, and the 8-bit image can be downsampled from 4K.

Post all the 8-bit images on one side and all the higher bit-depth images on the other (or top/bottom), or put them in separate folders named A and B.

I will tell you which are the 8 bit images. I will do this purely by eye - no analysis/scopes, etc. It will be plain as day.

 


Guest Ebrahim Saadawi

You can easily do this test by shooting in JPEG+RAW mode on any stills camera.

JPEG is an all-I, 8-bit, typically 4:2:2 format.

RAW is effectively a 12-bit (Nikon) or 14-bit 4:4:4 format after TIFF conversion.


5 hours ago, jcs said:

4K 8-bit 422 becomes pseudo 10-bit Luma 8-bit Chroma 444 1080p. 'pseudo' as the 4 8-bit sample's variation summed to 10-bit is helped by noise/dither.

I think that you mean to say that 4k (UHD), 422, 8-bit image has equivalent color depth to a hypothetical HD, 444, 10-bit-luma/8-bit-chromas image, which is correct.  However, I don't think that hypothetical end result is an accurate description of Dugdale's conversion.  As I recall, he actually converted a 4k, 422, 8-bit image to HD, 10-bit, 422, which, if properly executed, also retains the full color depth of the original 4k, 422, 8-bit image.  Of course, HD, 10-bit, 422 has an equivalent color depth to your hypothetical HD, 444 10-bit/8-bit image.

 

By the way, if the pixels are properly summed in the down-conversion, there is no "pseudo" necessary.  All three scenarios have equivalent color depth, with or without dithering.  The dithering primarily helps eliminate banding artifacts.
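The banding point above can be sketched quickly. A toy Python example (my own, not from the thread; the 9-level quantizer and dither amplitude are invented): quantize a smooth ramp to very few codes with and without dither, and count the flat "bands" (runs of identical consecutive values):

```python
# Sketch: dither trades visible stair-step bands for fine-grained noise.
import random

def quantize(x, levels=9):
    """Snap a 0..1 value to one of `levels` evenly spaced codes."""
    step = 1.0 / (levels - 1)
    return round(x / step) * step

def runs(xs):
    """Count runs of identical consecutive values (the visible 'bands')."""
    return 1 + sum(1 for a, b in zip(xs, xs[1:]) if a != b)

random.seed(0)
ramp = [i / 999 for i in range(1000)]                 # smooth gradient

banded = [quantize(v) for v in ramp]                  # 9 wide, flat steps
dithered = [quantize(min(1.0, max(0.0, v + random.uniform(-0.0625, 0.0625))))
            for v in ramp]                            # +/- half a step of dither

print(runs(banded), runs(dithered))                   # few long bands vs. many tiny ones
```

Both signals use the same 9 codes; the dithered one just scrambles the transitions so the eye averages them out instead of seeing contours.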


3 hours ago, Lintelfilm said:

JCS, I challenge thee!

I've been wanting to do this here for a while but haven't, for fear it would devolve into silliness. However, because I believe you're a stand-up guy and would do it properly, should you have the means and wish to accept my challenge:

Thanks for the challenge Lintelfilm- sorry that's not something I can do right now. Why not try it yourself? You can guide the thread you start away from chaos ;). I did something similar recently when shooting the 8-bit 420 4K A7S II (downsampled to 1080p) against that 10-bit 422 C300 II and 8-bit 422 1DX II. For that studio environment, nobody could really tell which camera was which.


Just now, jcs said:

Thanks for the challenge Lintelfilm- sorry that's not something I can do right now. Why not try it yourself? You can guide the thread you start away from chaos ;). I did something similar recently when shooting the 8-bit 420 4K A7S II (downsampled to 1080p) against that 10-bit 422 C300 II and 8-bit 422 1DX II. For that studio environment, nobody could really tell which camera was which.

That's cool. I don't have time either as I'm currently juggling 3 jobs and a big pitch. Besides I want to be the one who guesses! :)


1 hour ago, tupp said:

I think that you mean to say that 4k (UHD), 422, 8-bit image has equivalent color depth to a hypothetical HD, 444, 10-bit-luma/8-bit-chromas image, which is correct.  However, I don't think that hypothetical end result is an accurate description of Dugdale's conversion.  As I recall, he actually converted a 4k, 422, 8-bit image to HD, 10-bit, 422, which, if properly executed, also retains the full color depth of the original 4k, 422, 8-bit image.  Of course, HD, 10-bit, 422 has an equivalent color depth to your hypothetical HD, 444 10-bit/8-bit image.

 

By the way, if the pixels are properly summed in the down-conversion, there is no "pseudo" necessary.  All three scenarios have equivalent color depth, with or without dithering.  The dithering primarily helps eliminate banding artifacts.

Hey tupp, 420 is full-resolution Y (luma) with chroma (UV, AKA CbCr) at 1/2 resolution both vertically and horizontally (1/4 the samples); 422 halves chroma horizontally only. If we downsample 4K 420, we average together four Y samples to get one new, low-pass-filtered (noise- and alias-reduced) pixel. The U and V planes were at 1/2 resolution (horizontal and vertical), which is exactly the 1080p grid, so each pixel of the 1080p result gets its own chroma sample: full color resolution per pixel. So Y was oversampled (and filtered) into a new 10-bit Y (which needs noise/dither to do anything really useful, e.g. reduce banding); that's not the same as taking analog off the sensor and quantizing it to 10-bit digital. U and V are now full resolution at 1080p, so we have 444: full-resolution YUV pixel data.

Dave showed the results: the 4K downsampled 8-bit 420 images looked as good as or better than the 10-bit FS5. That's all that really matters in the real world: real results. In theory, to be fair, we'd need to use the same camera for such a comparison: take a 10-bit 422 camera, shoot at 4K, convert to 4K 420 8-bit in post, shoot again at 1080p 10-bit 422, then compare that against the 4K 420 8-bit downsampled to 1080p.
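The plane geometry above can be sketched with toy arrays. A minimal Python illustration (mine; an 8x8 "luma" plane stands in for 4K and 4x4 for 1080p, with made-up sample values): in 4:2:0 the chroma planes are already at half resolution, so 2x2-binning the luma leaves exactly one Y, one U and one V sample per output pixel, i.e. 4:4:4 at the smaller size:

```python
# Toy 4:2:0 frame: full-res luma, half-res (both axes) chroma planes.
W, H = 8, 8                                           # stand-in for "4K"
y = [[(r * W + c) % 256 for c in range(W)] for r in range(H)]
u = [[128] * (W // 2) for _ in range(H // 2)]         # 4:2:0 chroma plane
v = [[128] * (W // 2) for _ in range(H // 2)]

def bin2x2(plane):
    """Box-downsample: average each 2x2 block into one sample."""
    return [[(plane[2*r][2*c] + plane[2*r][2*c + 1]
              + plane[2*r + 1][2*c] + plane[2*r + 1][2*c + 1]) / 4
             for c in range(len(plane[0]) // 2)]
            for r in range(len(plane) // 2)]

y_small = bin2x2(y)                                   # luma: 8x8 -> 4x4

# Chroma was never binned, yet it now matches the luma grid exactly:
# one Y, one U, one V per pixel -- 4:4:4 at the "1080p" size.
print(len(y_small), len(y_small[0]), len(u), len(u[0]))
```

A real downscaler would use a better filter than a box average, but the resolution bookkeeping is the same.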


^ The FS5 shoots 4K at 8-bit 420 (same as the A7SII), but I agree it would have been cool and very easy to compare both cameras' 4K downsampled to 1080p. That wasn't the main goal, I guess - just to show that downsampled 4K 8-bit holds up well against 10-bit 422 1080p, and in that I think he succeeds, proving something I wasn't aware of. The whole 10-bit 422 1080p advantage of cameras like the FS5 seems more like a sales pitch to me now. I'm still interested in the FS5 for other reasons, but its twice-as-expensive price tag doesn't seem that justified, especially as far as IQ goes, imo.


The reason to get the FS5 is the ability to swap between an ergonomically friendly video camera with a good, but not amazing, image (similar to A7r/s 4K) and a much less ergonomically friendly raw/ProRes-shooting monster like the FS700, thanks to the new update (2.00 just came out today). The auto ND feature works and is actually REALLY useful for my event shooting. Can't wait to test it more today.


The Red looked worse than anything else? The Canon colors were off? To be honest, ALL colors were terrible. I think the test was a worst case scenario also insofar as it was maximally biased.

We didn't need another test to prove that 8-bit can look better than raw. That was the conclusion of the 2012 Zacuto shootout, where they tried to make everything look as good as possible - and what they found was that the more 'forgiving' codecs made the DPs sloppy.

The test further proves that Sony colors look bad under almost all circumstances, and that you'd better have 10-bit to be able to make them acceptable in post.

FS5 has better ergonomics for video than A7rii? Absolutely. That's because Sony tried very hard to make the latter as video-unfriendly as possible. And they succeeded.


Guest Ebrahim Saadawi

TBH, in my experience, I just throw away the spec sheet and look at the image. I don't care if it's 8-bit or 10-bit; each camera has its own distinct image.

I like the 8-bit 4:2:0 HD image from a C100 better than the 10-bit 4:2:2 4K image off the GH4. Blackmagic 4K 12-bit RAW grades worse than a 1DC 8-bit image, because it has fixed pattern noise and a generally very high noise floor.

I mean, I don't give a damn about 8-bit, 10-bit, 12-bit, 4:2:0/4:2:2/4:4:4 - some images simply look good and others don't.

 

Screw all the math and specixel-peeping...


16 minutes ago, Ebrahim Saadawi said:

TBH, in my experience, I just throw away the spec sheet and look at the image. I don't care if it's 8-bit or 10-bit; each camera has its own distinct image.

I like the 8-bit 4:2:0 HD image from a C100 better than the 10-bit 4:2:2 4K image off the GH4. Blackmagic 4K 12-bit RAW grades worse than a 1DC 8-bit image, because it has fixed pattern noise and a generally very high noise floor.

I mean, I don't give a damn about 8-bit, 10-bit, 12-bit, 4:2:0/4:2:2/4:4:4 - some images simply look good and others don't.

 

Screw all the math and specixel-peeping...

You've said this really well. 

I think it depends most on "implementation" - how the codec and image pipeline have been designed to create the data. For example, the FS5 is 10-bit, but the codec is shoddy XAVC-L, while the FS7's 10-bit (XAVC-I) is astonishingly superior. Meanwhile, the C100, with its shitty AVCHD, is an advanced implementation of wonderful engineering.

That said, many people don't understand that 10-bit, 12-bit etc. isn't an "instant" image quality improvement. It's just more data; what you do with that data, done correctly, is what delivers the boost everyone craves. So if you are crap at grading but want 10-bit, forget it. Learn to grade first.
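The "it's just more data" point shows up clearly once you bend that data. A rough Python sketch (mine, with invented numbers: a 0.25-0.50 slice of a ramp stands in for an aggressive contrast stretch): count how many distinct 8-bit output codes survive the stretch from an 8-bit versus a 10-bit source:

```python
# Sketch: an aggressive stretch leaves gaps (banding) in an 8-bit source
# that a 10-bit source still fills.

def stretch(levels_in, lo=0.25, hi=0.50):
    """Quantize a 0..1 ramp at `levels_in` codes, then expand [lo, hi]
    to fill the full 8-bit output range; return the surviving codes."""
    codes = set()
    for i in range(levels_in):
        x = i / (levels_in - 1)
        if lo <= x <= hi:
            codes.add(round((x - lo) / (hi - lo) * 255))
    return codes

out8 = stretch(256)      # 8-bit source
out10 = stretch(1024)    # 10-bit source

# The 8-bit source fills only ~64 of 256 output codes (visible steps);
# the 10-bit source fills nearly all of them.
print(len(out8), len(out10))
```

Which is Oliver's point in miniature: the extra bits only pay off at the moment the grade stretches the data.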


3 hours ago, Axel said:

Sony tried very hard to make the latter as video-unfriendly as possible. And they succeeded.

I imagine them sitting around the Tokyo board room table thus:
"You say we put Slog in camera, will sell like hot cake?"
"Yes. Hot cake."
"And what about add strange colour and white balance also. Will be problem?"
"No problem. Gimmick!"
"Ahhh... gimmick. Yes."
"What about next year?"
"Lip my stocking function."
"Ahhh yes... very good."
 

 


4 hours ago, Axel said:

The Red looked worse than anything else? The Canon colors were off? To be honest, ALL colors were terrible. I think the test was a worst case scenario also insofar as it was maximally biased.

This^^^^^

BTW, if you look at the comment section on youtube, I'm iFM...

(On Red) Benefits of blackshading (scroll down to see before and after pictures of blackshading): http://www.reduser.net/forum/showthread.php?111628-BLACK-SHADING-For-Dummies-amp-Experts/page14

 


1 hour ago, Oliver Daniel said:

You've said this really well. 

I think it depends most on "implementation" - how the codec and image pipeline have been designed to create the data. For example, the FS5 is 10-bit, but the codec is shoddy XAVC-L, while the FS7's 10-bit (XAVC-I) is astonishingly superior. Meanwhile, the C100, with its shitty AVCHD, is an advanced implementation of wonderful engineering.

That said, many people don't understand that 10-bit, 12-bit etc. isn't an "instant" image quality improvement. It's just more data; what you do with that data, done correctly, is what delivers the boost everyone craves. So if you are crap at grading but want 10-bit, forget it. Learn to grade first.

The codec definitely is an important variable, and the FS5 & A7SII differ on that as well. Interesting you guys mention the C100 - I love the ergonomics of that cam, and the mk1 is really affordable right now. The only thing keeping me away is that 24Mb/s codec, yet I've seen some stunning footage out of it. I wonder if it's the fact that it's a 4K sensor downsampling to 1080p, meaning perhaps it actually has more info than your standard 8-bit 420 DSLR video. But the out-of-camera image is one thing; how about gradability on that "shitty AVCHD"? If the image falls apart quickly then that's a deal breaker for me.


2 hours ago, Django said:

How about gradability on that "shitty AVCHD"? If the image falls apart quickly then that's a deal breaker for me.

Nail your exposure and it holds up very well, with decent highlight retention and rolloff. If it had another two stops of DR it would be a monster. Canon are working their mojo with this, and it's the only image south of the 1DC/1DXII that I get excited about, especially when paired with the right glass.
At the end of the day, there is a lot to be said for usability when you are under pressure.


Guest Ebrahim Saadawi
22 hours ago, Django said:

i wonder if it's the fact that it's a 4K sensor downsampling to 1080p, 

As a matter of fact, it is: a 4K (4096x2160) 8.9MP sensor (the same one inside the 4K-shooting C500), just downsampled internally (instead of externally on your computer) to 1080p. This is one of the reasons the image is at the verge of the highest resolution an HD container can carry. And no, it doesn't fall apart quickly at all when you nail exposure, even in C-Log.

The MkI is one of the best video-camera deals on the market right now for a real workhorse, but only if you 1) don't need slow motion and 2) don't require 4K delivery. In return you get 1080p, dual slots, efficient high-quality images, the nicest colours, very strong low-light performance, XLR inputs with superb in-camera audio quality (even from the 3.5mm input), a rotating grip, an ever-lasting battery, waveform, LOG, etc.

