I am experiencing a problem with Adobe Premiere which destroys the quality of Canon 1D C footage. It’s a problem that has always been attributed to the camera and others like it – banding with an 8bit codec. But could the software you use to edit material actually play a role?
8bit DSLR codecs get quite a lot of bashing for banding, and it is something we’re well used to… a sky or a wall taking up half the shot with only 4 or 5 shades where there should be a silky smooth gradient, with thousands of tiny steps between darker and slightly lighter areas of the image.
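To make the banding problem concrete, here is a minimal sketch (my own illustration, not from any camera's actual pipeline) of how a subtle gradient collapses into a handful of shades under 8bit quantisation:

```python
import numpy as np

# Hypothetical illustration: why an 8bit codec bands on a near-flat
# area like a sky. A gradient spanning only a small brightness range
# quantises to a handful of integer code values, which show up on
# screen as distinct bands instead of a smooth ramp.

width = 1920
sky = np.linspace(0.40, 0.42, width)             # a subtle real-world gradient
sky_8bit = np.round(sky * 255).astype(np.uint8)  # what an 8bit codec stores

print(len(np.unique(sky_8bit)))    # 6 distinct shades across the whole sky

# A 10bit codec (1024 levels) keeps roughly 4x as many steps:
sky_10bit = np.round(sky * 1023).astype(np.uint16)
print(len(np.unique(sky_10bit)))   # 22 distinct shades
```

Six flat stripes where the eye expects a continuous ramp is exactly the banding we complain about.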
However, one of the first things I noticed with the Panasonic GH4 was that banding was greatly reduced – it looked like my 10bit Blackmagic footage, even though it was still an 8bit codec (internally, in 4K). Why such a turn-around for an 8bit codec? It turns out the 4K mode on the GH4 holds the key to hiding this banding… yet it is something we can apply to all DSLRs…
Pros are wondering what the benefit of 4K is to them in terms of overall image quality when mastered and delivered at 2K / 1080p. A lot of work is still shot in 1080p, and cameras like the Canon C300 are the workhorses of the moment.
In the case of the GH4 it may appear from the specs that it’s just an 8bit 4:2:0 camera internally.
Actually, the theory is that 8bit 4:2:0 4K material from this camera can be taken through a post workflow that converts it to 10bit 4:4:4 1080p – with all the smoother tonality, better colour and workflow advantages that format brings. This is a big leap for 1080p over the much more expensive C300, which only does 8bit 4:2:2.
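The theory above can be sketched in a few lines of numpy. This assumes a plain 2x2 box-average downscale (the names and numbers are illustrative, not the GH4's actual processing): averaging four 8bit samples per output pixel produces intermediate quarter-step values, roughly 2 extra bits of tonal precision. Only luma is shown; for chroma, 4:2:0 at 4K already carries one chroma sample per 2x2 block, i.e. one per 1080p pixel, which is 4:4:4.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 8bit 4K luma plane: a near-flat sky gradient plus a little
# sensor noise, quantised to 8bit as the camera's codec would do.
h, w = 2160, 3840
gradient = np.linspace(100.0, 104.0, w)            # only ~4 code values wide
noisy = gradient[None, :] + rng.normal(0.0, 0.5, (h, w))
frame_8bit = np.clip(np.round(noisy), 0, 255).astype(np.uint8)

# Downscale to 1080p by averaging each 2x2 block at higher precision.
blocks = frame_8bit.reshape(h // 2, 2, w // 2, 2).astype(np.uint16)
frame_1080 = blocks.mean(axis=(1, 3))              # 1920x1080, quarter-step values

print(len(np.unique(frame_8bit)))   # a handful of 8bit code values
print(len(np.unique(frame_1080)))   # many more intermediate tonal levels
```

The output frame holds far more distinct tonal levels than any single 8bit source pixel could, which is why the banding that was visible at 4K smooths out at 1080p – provided the downscale is done at higher-than-8bit precision.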
I asked GoPro’s David Newman (Sr. Dir. Software – follow on Twitter) whether this theory was correct…
What is the real advantage of installing Magic Lantern for raw video on your Canon DSLR, specifically the powerful full frame 5D Mark III?
Is it possible to finally SHOW it? Yes, it is.
Here is the most in-depth comparison yet between the standard video mode and raw, and why the image quality is worth your attention.