Jump to content

Panasonic GH5 10 bit internal recording not good enough


interceptor121

Recommended Posts


I think we need to stop looking at numbers so much and start focusing on results. The effectiveness of codecs and bit depth depends entirely on the sensor and processing. My old Sony FS100 shot 1080p at 24 Mb/s, 8-bit, and looked fantastic! The GH5, on the other hand, looks terrible at 24 Mb/s 8-bit. The Sony Alpha cameras can look great at 100 Mb/s 8-bit with S-Log2 and such, but GH4/GH5 V-Log in 8-bit does not. The GH5 was really built to shoot V-Log at 10-bit, end of story. It also does the Cinelike D and Natural profiles great in 8-bit if you want; it just depends on the look you are going for. Try the internal codecs, and if you aren't getting the results you want, try a recorder.

Rent a camera and do some tests. Know the look you are going for, and find a camera, lens, and color profile that can effectively capture that look.

Here are some screenshots from a short film I just directed. GH5s, Leica R primes, and the lowly internal 10-bit codec, of course...

[Screenshots attached.]


I am not unhappy with the camera; I was just expecting more out of the 4:2:2 10-bit mode, as I am not planning to buy a recorder and was hoping not to need V90 cards.

From what I can see with my naked eye, footage straight out of the camera without LOG looks better in 8-bit mode than it does in 10-bit; I play it back on my TV, which has a 10-bit panel. I cannot see any benefit of 4:2:2 or additional colour using Cinelike in the Rec.709 colour space compared to 4:2:0 8-bit. I white balance all my clips on a grey card, so they are generally not off and look good at the outset.

I do not use log at all; it is another form of compression that uses metadata, and what comes out depends on many other factors.

The considerations I have made are purely technical; they are generally useless for comparing different cameras, but hold pretty well when you compare the codecs offered by the same camera.

The considerations are objective, not subjective, and when looking at the interframe codecs (100 and 150 Mbps) all of them are subject to the same motion interpolation errors, therefore the size of the I-frames is the measure of image quality.
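
For anyone who wants to repeat that kind of check, one way to pull I-frame sizes out of a clip is with ffprobe. This is only a sketch: it assumes FFmpeg is installed, the file names are placeholders, and key-frame packets are treated as the I-frames.

    import json, statistics, subprocess

    def keyframe_sizes(path):
        """Sizes (bytes) of key-frame packets in the first video stream.
        For the long-GOP codecs discussed here these are the I-frames."""
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "packet=size,flags", "-of", "json", path],
            capture_output=True, text=True, check=True).stdout
        packets = json.loads(out)["packets"]
        return [int(p["size"]) for p in packets if "K" in p.get("flags", "")]

    for clip in ["gh5_100mbps.mp4", "gh5_150mbps.mp4"]:   # placeholder file names
        sizes = keyframe_sizes(clip)
        print(clip, "mean I-frame size:", round(statistics.mean(sizes) / 1024), "KiB")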

My impression, based on a combination of codec analysis and playing the clips straight out of the camera on a 10-bit panel, is that the camera is not capable of resolving 10 bits anyway and that you can't see any difference between 4:2:2 and 4:2:0, so for the majority of cases 100 and 150 Mbps look the same, and I do not see any banding on my screen in dark scenes or blue skies.

In terms of editing I have done no tests, but you can't edit H.264 decently, so you need to convert to ProRes 422 (I use a Mac) or skip H.264 altogether and use All-Intra.

Based on my previous experience I would always work without conversion, as any operation degrades the clip. However, I find it puzzling that you can't tell the difference between ProRes HQ footage acquired with an external recorder and All-Intra; if that is really the case, the only conclusion would be that the camera does not actually manage to resolve 10 bits at all, and therefore the spare 300 Mbps go to waste.

This theory, which I cannot verify, is corroborated by DxOMark's raw measurement of the GH5, which says the camera can only resolve 23.9 bits of colour within an image, which is roughly 8 bits per channel.
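
The arithmetic I am using there is just this; note that dividing the DxOMark score by three channels is my rough interpretation, not DxOMark's own methodology.

    # Rough reading of the DxOMark colour depth score for the GH5.
    # Dividing by three channels is an approximation, not DxOMark's methodology.
    colour_bits = 23.9                        # DxOMark colour depth score, in bits
    distinct_colours = 2 ** colour_bits       # ~1.6e7 distinguishable colours
    per_channel_bits = colour_bits / 3        # ~7.97, i.e. roughly 8 bits per channel
    print(f"{distinct_colours:.3g} colours, about {per_channel_bits:.2f} bits per channel")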

Using the resources at your disposal as well as you can, and knowing how things work, is a good thing, not a bad thing, so I am a bit surprised that people keep beating up my quantitative analysis and comparing it with subjective statements.

Quantitative and qualitative evaluations are two different things; if you are happy with what you see, you don't need to worry about 8 vs 10 bits, colour spaces, etc.


If you are not shooting LOG or HLG, or don't want to grade, there is little point in 10-bit; 8-bit is plenty.

My own tests of 150 Mbps vs 400 Mbps show the only advantage (I only shoot LOG or HLG) is during grading, where the 400 Mbps codec is easier on the hardware. Visually I can't tell the difference, and I also can't 'break' the footage in terms of revealing artefacts, even when applying hefty tonal changes. One reason you may want an external recorder is that if you are recording a lot of footage it may be cheaper than V60/V90 cards.

I can see 10-bit vs 8-bit differences when I grade, as it's visible in the scopes.
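
If you want to see what the scopes show, here is a rough sketch of why an 8-bit capture combs after a grade while a 10-bit one stays smooth. It uses numpy and a synthetic shallow gradient rather than real footage, so it only illustrates the principle.

    import numpy as np

    # A shallow gradient standing in for a sky or a wall (synthetic, not real footage).
    scene = np.linspace(0.4, 0.6, 1920)

    def capture_and_grade(signal, bits):
        """Quantise to the given bit depth, then apply a hefty contrast push
        and view the result on an 8-bit display."""
        levels = 2 ** bits - 1
        captured = np.round(signal * levels) / levels        # what the codec stored
        pushed = np.clip((captured - 0.5) * 4 + 0.5, 0, 1)   # strong contrast stretch
        return np.round(pushed * 255)

    for bits in (8, 10):
        distinct = len(np.unique(capture_and_grade(scene, bits)))
        print(f"{bits}-bit capture -> {distinct} distinct output levels after the push")

The 8-bit version ends up with big gaps between the surviving levels, which is exactly the combing you see in the scopes.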

You have not understood the bit depth data that DXO state - more research needed ;)

 


19 minutes ago, Shirozina said:

One reason you may want an external recorder is that if you are recording a lot of footage it may be cheaper than V60/V90 cards.

You have not understood the bit depth data that DXO state - more research needed ;)

 

I record underwater video; the housing for the Atomos costs $3,000, plus the cost of the recorder itself, so I am not planning to buy one.

V90 cards are much more affordable in that context.

I am not planning to shoot LOG either, as underwater there can be quite a bit of noise, and everyone who has tried it, even in ProRes HQ, has had poor results.

So my plan is currently Cinelike D; I need to decide between 8 and 10 bits and whether to go All-Intra.

Generally, All-Intra should save me some time, as my workstation is not strong enough to process H.264 in real time, so I would otherwise need to convert to ProRes 422.
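
The conversion I am talking about is just a straight transcode, something along these lines; a sketch using FFmpeg's prores_ks encoder, with placeholder file names, and any other ProRes-capable transcoder would do the same job.

    import subprocess

    # Transcode a GH5 H.264 clip to ProRes 422 for smoother editing.
    subprocess.run([
        "ffmpeg", "-i", "gh5_clip.mp4",          # placeholder input file
        "-c:v", "prores_ks", "-profile:v", "2",  # profile 2 = ProRes 422 (3 would be HQ)
        "-c:a", "copy",                          # leave the audio untouched
        "gh5_clip_prores.mov",
    ], check=True)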

As for your comment on DxOMark that 'enlightened' me: raw images do not have a colour space concept, so those are tonalities that may or may not fit in a video colour space, but it is unlikely they will all fit into Rec.709.

If you were shooting HLG you could have deep colours.

I don't know what transfer matrix LOG footage uses, but I am not planning to shoot LOG regardless; best case I would shoot HLG on land, but that would require monitors etc.

 


32 minutes ago, interceptor121 said:

As for your comment on DxOMark that 'enlightened' me: raw images do not have a colour space concept, so those are tonalities that may or may not fit in a video colour space, but it is unlikely they will all fit into Rec.709.

 

 

RAW does have a colour space concept, otherwise it would not be possible to convert RAW images into TIFFs or JPEGs that can be used in a colour-managed workflow. The camera can do Adobe RGB in stills and Rec.2020 in video, so it's safe to assume it has a large enough colour space to fit the Rec.709 (sRGB) colour space. The Panasonic specs and DXO describe the GH5 as having a 12-bit sensor - are you disputing this?

 


2 hours ago, Shirozina said:

RAW does have a colour space concept, otherwise it would not be possible to convert RAW images into TIFFs or JPEGs that can be used in a colour-managed workflow. The camera can do Adobe RGB in stills and Rec.2020 in video, so it's safe to assume it has a large enough colour space to fit the Rec.709 (sRGB) colour space. The Panasonic specs and DXO describe the GH5 as having a 12-bit sensor - are you disputing this?

 

The sensor is 12-bit, and raw does not have a colour space: the camera saves files with sensor data and metadata, the processing is then done by a program that works in an intermediate colour space for editing and correction, and then outputs in an RGB format such as JPEG.

The fact that the camera can work in a colour space as wide as Adobe RGB or Rec.2020 does not mean it can resolve 10-bit colours.

A pixel may have 8-bit resolution with colours coming from a wider colour space but still not be able to resolve 12 bits in a single image.

Even a DSLR with a 14-bit sensor stops at 26 bits, which is about 8 and 1/2 bits per channel, and in most cases there is no additional information between 12-bit and 14-bit RAW in terms of colour or resolution.

 


43 minutes ago, interceptor121 said:

The sensor is 12-bit, and raw does not have a colour space: the camera saves files with sensor data and metadata, the processing is then done by a program that works in an intermediate colour space for editing and correction, and then outputs in an RGB format such as JPEG.

 

A sensor has a colour space in the sense that its colour response can be measured and quantified, and this information is used to enable a RAW converter to interpret the data and create the desired output.

As no camera on the market seems to be able to meet your requirements for 10-bit colour, why are you complaining about this one in particular?

Bit depth, as most people use the term, is to do with how many times you can slice the cake rather than how big the cake is.


8 hours ago, interceptor121 said:

I am not unhappy with the camera; I was just expecting more out of the 4:2:2 10-bit mode

Please explain/articulate what you mean by more!

8 hours ago, interceptor121 said:

From what I can see with my naked eye, footage straight out of the camera without LOG looks better in 8-bit mode than it does in 10-bit; I play it back on my TV, which has a 10-bit panel. I cannot see any benefit of 4:2:2 or additional colour using Cinelike in the Rec.709 colour space compared to 4:2:0 8-bit. I white balance all my clips on a grey card, so they are generally not off and look good at the outset.

 

The belief that 8-bit is better is confirmation bias tricking your mind.

The various color profiles (ignoring V-Log L for a moment) by nature don't change the amount of color that ends up in the footage; they change how the color captured by the sensor is transformed. A given color is made brighter or darker, shifted towards red or green, etc.

Your eyes can't even discern the difference between two similar shades of 8-bit color, let alone 10-bit. You can verify this yourself by going into any 8-bit editor, like Paint or Photoshop. Draw a big box on the screen and fill it with one color, say pure red (255,0,0). Mask off half the box so it stays pure red, and then make the other, unmasked side (254,0,0). You will not be able to tell the difference between the two shades. Keep dropping down from 254 to 253, 252, etc. until you are sure you can see the line where the color changes.

If you can only tell the difference between 245 and 255, then you need a spread of 10 shades to see a color difference. Now realize that in 10-bit that becomes a spread of 40 shades. Your ability to discern the difference gets worse with age, and it can also be skewed for a given color if you have any kind of color blindness.
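
If you don't want to fiddle with masks in Photoshop, the same test image can be generated directly. This is just a sketch using Pillow; the values and image size are arbitrary.

    from PIL import Image

    def shade_test(left=(255, 0, 0), right=(254, 0, 0), size=(800, 400)):
        """Two solid patches side by side; lower the right-hand red value
        step by step until you can actually see the dividing line."""
        img = Image.new("RGB", size, left)
        img.paste(Image.new("RGB", (size[0] // 2, size[1]), right), (size[0] // 2, 0))
        return img

    shade_test(right=(254, 0, 0)).save("red_255_vs_254.png")
    shade_test(right=(250, 0, 0)).save("red_255_vs_250.png")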

Dynamic range is a similar affair: the GH5 sensor is capable of a little over 12 stops at base ISO. Your eyes are only capable of about 10 stops, and again it can get worse with age and various medical conditions.

 

8 hours ago, interceptor121 said:

Using the resources at your disposal as well as you can, and knowing how things work, is a good thing, not a bad thing, so I am a bit surprised that people keep beating up my quantitative analysis and comparing it with subjective statements.


You're getting beaten up because your analysis reads as someone who has a serious lack of understanding of bit depth, chroma subsampling, and their benefits, or lack thereof, when it comes to video production. It's a similar thing when it comes to codecs.

Right out of any camera, 10-bit isn't inherently better than 8-bit, and 4:2:2 isn't better than 4:2:0 either. Higher bit depths and higher chroma subsampling are only really beneficial if you are going to push the footage around in post: higher bit depth and chroma subsampling can be pushed further before the footage falls apart. This is important when it comes to major motion pictures and the like, because what comes out of the camera is usually drastically different from what ends up on the screen.
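
For concreteness, this is all the subsampling numbers mean in terms of stored color information, before any compression even happens; a back-of-the-envelope sketch that only counts samples.

    # Chroma samples stored per 2x2 block of pixels:
    # 4:4:4 -> 4 Cb + 4 Cr, 4:2:2 -> 2 Cb + 2 Cr, 4:2:0 -> 1 Cb + 1 Cr
    def samples_per_frame(width, height, scheme):
        luma = width * height
        chroma_factor = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[scheme]
        return luma, int(luma * chroma_factor)

    for scheme in ("4:4:4", "4:2:2", "4:2:0"):
        y, c = samples_per_frame(3840, 2160, scheme)
        print(f"{scheme}: {y:,} luma samples, {c:,} chroma samples per UHD frame")

Whether that missing chroma ever becomes visible depends entirely on how hard you push the image afterwards.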

Codecs are the same: no one codec is inherently better than another by default. ProRes or DNxHR isn't better than H.264, or H.265, or any of the other million codecs out there. They each have their pros and cons, and what is best is very situation specific.



 

2 hours ago, interceptor121 said:

The sensor is 12-bit, and raw does not have a colour space: the camera saves files with sensor data and metadata, the processing is then done by a program that works in an intermediate colour space for editing and correction, and then outputs in an RGB format such as JPEG.

 

It sounds like you are confusing dynamic range with bit depth.


1 hour ago, Shirozina said:

A sensor has a colour space in the sense that its colour response can be measured and quantified, and this information is used to enable a RAW converter to interpret the data and create the desired output.

As no camera on the market seems to be able to meet your requirements for 10bit colour then why are you complaining about this one in particular?

Bit depth, as most people use the term, is to do with how many times you can slice the cake rather than how big the cake is.

They are two different things.

An image can contain 24 bits' worth of different colours, but they do not necessarily fit into 8-bit RGB: you could have values in a gradation fall between your levels, and some values completely missing, hence it is important to capture a higher-bit colour sample, 10 or 12 bits if you can, even if you eventually discard them later.

35 minutes ago, Dan Sherman said:

The various color profiles (ignoring V-Log L for a moment) by nature don't change the amount of color that ends up in the footage; they change how the color captured by the sensor is transformed.

It sounds like you are confusing dynamic range with bit depth.

I think you have less understanding than you think you have. I am not sure what a 'colour profile' is; maybe you are talking about a transfer matrix? The amount of information the sensor captures has nothing to do with what ends up after compression, and if you squeeze information into a narrower colour space you get clipping. The GH5 uses BT.709 and BT.2020, and there is a difference between the two, otherwise you would not have colours in HLG that clip in Rec.709.

And your last comment on dynamic range, coming out of the blue after a whole set of statements, is rather interesting; I would rather have no further explanations, if you don't mind, thank you.


1 hour ago, interceptor121 said:

if you squeeze information into a narrower colour space you get clipping.
 

Ummm, no just no

This is just straight-up wrong. If you 'squeeze' the information down to fit into a given color space you do not have clipping; this is what V-Log does. Clipping occurs when you change color spaces and don't 'squeeze'/adjust the dynamic range.

For example, Rec.709 8-bit supports the range of 16-235 (if memory serves), but your camera is capable of capturing from 0-255. If you don't 'squeeze', or in other words decrease the dynamic range of the footage, you will clip the shadows below 16 and the highlights above 235. If you use one of the various log profiles you can keep those shadows and highlights in the source file so that you can manipulate them as you see fit.
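
To put rough numbers on the 'squeeze', here is a sketch in numpy. The curve is a generic log-style stand-in, not Panasonic's actual V-Log formula; the 16-235 legal range is the only thing taken from the paragraph above.

    import numpy as np

    # Scene-linear values: 0 = black, 1.0 = nominal white, up to 4.0 = strong highlights.
    scene = np.linspace(0, 4, 9)

    def straight_to_legal(x):
        """Naive linear mapping of 0..1 into 16-235; everything above 1.0 clips."""
        return np.clip(16 + x * (235 - 16), 16, 235).round()

    def generic_log(x, gain=0.25):
        """A stand-in log curve (NOT the real V-Log formula): the whole 0..4
        range is squeezed into 16-235, so highlights keep their separation."""
        encoded = np.log1p(x / gain) / np.log1p(4 / gain)
        return (16 + encoded * (235 - 16)).round()

    print("scene:   ", scene)
    print("straight:", straight_to_legal(scene))  # 1.0, 2.0 and 4.0 all land on the same 235
    print("log-ish: ", generic_log(scene))        # highlights remain distinguishable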

The 'squeezing' and 'de-squeezing' plus grading in post is often what leads to the ugly gradients people complain about with 8-bit. It's also why some people want the higher bit depth and chroma subsampling options.

 


3 hours ago, Dan Sherman said:

Ummm, no just no

No, it's correct. If you don't transform the data when you go from one color space to another, you are changing the colors.

If you transform from a larger colorspace into a smaller one and keep the colors, you will clip the colors outside it. Your analogy about 16-235 is 100% wrong, as there is no way to address colors outside the color space. The red channel will only get brighter, not redder, past 235, for example.

V-Log has its own color space that happens to fit most other color spaces inside it, and if you convert that to REC709 it's going to clip unless you change the colors. One way to change them to fit would be to simply view the footage as-is on a normal monitor; the colors would look very flat and desaturated but would not clip.
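
To make the clipping concrete, here is a rough numerical sketch. The matrix is the commonly quoted BT.709-to-BT.2020 RGB conversion (the ITU-R BT.2087 values, rounded), the wide-gamut color is arbitrary, and the 'desaturate' step is a simple pull towards grey rather than a proper gamut-mapping algorithm.

    import numpy as np

    # RGB conversion matrix from BT.709 to BT.2020 (BT.2087 values, rounded);
    # its inverse takes BT.2020 RGB back to BT.709.
    M_709_TO_2020 = np.array([[0.6274, 0.3293, 0.0433],
                              [0.0691, 0.9195, 0.0114],
                              [0.0164, 0.0880, 0.8956]])
    M_2020_TO_709 = np.linalg.inv(M_709_TO_2020)

    saturated_green_2020 = np.array([0.10, 0.80, 0.10])   # well inside BT.2020
    in_709 = M_2020_TO_709 @ saturated_green_2020
    print("as REC709 RGB:  ", in_709.round(3))            # red goes negative -> out of gamut

    # Option 1: hard clip (do nothing) -- the hue and saturation shift.
    print("hard clipped:   ", np.clip(in_709, 0, 1).round(3))

    # Option 2: desaturate towards grey until every channel is legal -- flat, but no clipping.
    grey = np.full(3, in_709.mean())
    t = 1.0
    while np.any(grey + t * (in_709 - grey) < 0) or np.any(grey + t * (in_709 - grey) > 1):
        t -= 0.01
    print("desaturated fit:", (grey + t * (in_709 - grey)).round(3))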


25 minutes ago, no_connection said:

If you transform from a larger colorspace into a smaller one and keep the colors, you will clip the colors outside it.

Yes, you will lose colors if you go from a larger color space to a smaller one, but if you use a non-linear transform you can determine where that color loss occurs.


That is not a color space, nonlinear or not. A color space defines where the primary components are, that is, how 'saturated' a pure color can be, if that makes sense. It's usually a triangle, since three components are usually enough. No matter how you dress it up, it makes an unbreakable wall that you can't move outside.

Sure, you could stretch REC709 to BT2020 on a monitor, but it won't be accurate color and will result in oversaturation and too-'rich' colors. Shove any normal content on a quantum dot display and you will see what I mean.

What you are talking about is giving different parts of the luminosity range different amounts of accuracy in where colors can be addressed. That is always used in encoding for human vision, since our vision is non-linear and it does not make sense to have accuracy where it's not needed.

Now the use of log does create another problem: codecs are built around our vision and throw away details we don't see or find important, but since the data has been pushed around so much, they now throw away data that we would see and keep data we would miss once it's transformed back. It might not be as big a thing today, but it is something to be aware of. There is always a trade-off, and you have to choose whether to take it depending on the need.


25 minutes ago, no_connection said:

Sure, you could stretch REC709 to BT2020 on a monitor, but it won't be accurate color and will result in oversaturation and too-'rich' colors.

Now the use of log does create another problem: codecs are built around our vision and throw away details we don't see or find important, but since the data has been pushed around so much, they now throw away data that we would see and keep data we would miss once it's transformed back.

Who is going to transform Rec.709 to Rec.2020 as a workflow? Most will be doing the opposite, so with the correct transformation no data will be lost. Log is not a problem as long as it's captured using enough bits and enough chroma subsampling. The problem is when you shoot LOG in 8-bit 4:2:0 and then, on top of that, use highly lossy compression codecs.


50 minutes ago, Shirozina said:

Most will be doing the opposite, so with the correct transformation no data will be lost.

Wrong. But you could happily change the colors to fit inside if you want, which is why people get paid to correct and manage color.

 

50 minutes ago, Shirozina said:

Log is not a problem as long as it's captured using enough bits

The same could be said about the opposite: no need for log with enough bits.

 

50 minutes ago, Shirozina said:

Who is going to transform Rec.709 to Rec.2020 as a workflow?

That happens simply by viewing REC709 content on a BT2020 monitor without transforming it.

And if you convert SD to HD, I'm pretty sure that is someone's workflow. But then you've missed the point.


I think you are only interested in arguing, but here you go:

1. You can't invent colours that were not captured or recorded in the first place. I get paid to manage colour, but I'm not a magician.

2. Rec.709 and many other non-log camera profiles have limited DR and gamut, so it doesn't matter how many bits the file has if the data isn't there.

3. If you view Rec.709 on a Rec.2020 device without transforming it, that isn't a workflow, it's called operator error. SD to HD - what point are you making in relation to any of the above?

 

