Mattias Burling

The Canon C200 is here and it's a bomb!


6 hours ago, fuzzynormal said:

c200 IQ recognition, to me, reveals a causation vs. correlation issue. 

Are the c200 videos better because they use a c200, or are the people and organizations that use (and can afford to buy) the c200 simply more accomplished?

The question of the ages, right...

If a C200 autofocuses in the woods but there is nobody there to FaceTrack in Raw, did it ever really autofocus at all?

11 hours ago, webrunner5 said:

I think you also have to throw the colorist and editor into the mix. You can shoot the best film in the world, but once it hits post production you'd better hope you have some damn good talent in there.


Absolutely agreed! Which is why I said:

 

16 hours ago, IronFilm said:

"amazing" because the shooter (and everyone else on his team) are talented and experienced people with a large production budget?

 

On 2/12/2018 at 1:39 PM, IronFilm said:

Sony RX100mk4 / etc. "bad" because it is a bad camera...

The RX10mk2 IS a BAD camera. (Notice I switched to the RX10, but they have the same image/sensor, so...)

And I still own one because of the 2-second 250fps recording... I tried the RX10mk4 and it improved three things:

1. 4 sec buffer!

2. Better color profiles (rx10mk2 looks like shite)

3. Autofocus.

And one thing that's worse:

An absolutely disgusting manual focus ring. You can't use it. Sometimes it goes from 1 inch to infinity with a slight touch (or even without touching it); other times you have to roll the wheel four times over.


So it looks like Canon won't release a 10-bit codec for the C200. Man, what a waste of a camera. At least give me an uncompressed 4K SDI output, or at the very least a 10-bit 4K HDMI output. It doesn't make any sense; they've really shot themselves in the foot here. With the RAW output from the Panasonic, and really any of the Blackmagic cameras, it seems like Canon can't compete.


Canon themselves know it should have 10bit internal, but they won't do it (says a lot about Canon):
 

 

In slightly happier news, Wooden Camera now offer an option for the user to swap out the mount for PL or EF on the C200:

https://www.newsshooter.com/2018/04/05/canon-c200-c200b-pl-mount-modification-from-wooden-camera/

 



They will, as soon as the C300mkII is due to be replaced and most (all) other brands are offering 10-bit for cheaper. Even JVC can bring 10-bit for less than $3,999, easily!

But I wouldn't buy a camera (and not a very cheap one, especially in Europe) on ifs and maybes.

Do we even know how many specialized cine/video cameras sell around the world, or in specific markets? For photo/hybrid cameras we at least have CIPA figures to get an indication.


In Canon’s graph for their Spring firmware release, it shows 2K Proxy as a separate format?

Does it have to be used in conjunction with the RawLite or can you record in that format on its own?

Is the Proxy an internal downscale of the RawLite? If so, then it must be at least 10bit, correct?

If you cannot record it separately, then can the 2K Proxy recording be imported without linking the Raw files and then rendered out to ProRes?



Matthew Allard, who seems like a pretty knowledgeable guy to me, says in his article about the firmware update that there will be no "edit friendly internal codec that meets the standards imposed by a lot of broadcasters and production houses."  He goes on to say that "There have been several petitions from Canon customers circulating around that have been pleading with Canon to implement a 4:2:2 10-bit codec in the C200, but they seem to have fallen on deaf ears." 

38 minutes ago, Don Kotlos said:

Proxy is a low-bitrate 8-bit file that is recorded to the SD card along with RAW:


https://www.usa.canon.com/internet/portal/us/home/explore/cinema-eos-c200-cameras/recording

That’s right, thanks. That info isn’t included in their recent graph and I forgot that bit.

29 minutes ago, jonpais said:

Matthew Allard, who seems like a pretty knowledgeable guy to me, says in his article about the firmware update that there will be no "edit friendly internal codec that meets the standards imposed by a lot of broadcasters and production houses."  He goes on to say that "There have been several petitions from Canon customers circulating around that have been pleading with Canon to implement a 4:2:2 10-bit codec in the C200, but they seem to have fallen on deaf ears." 

Yeah, I'd agree it was a mistake on Canon's part not to have a 10-bit middle codec, but then you hear real-life accounts from C200 users on this board saying the 8-bit codec has more latitude than the GH5's 10-bit. So who knows who to believe. I can't afford either right now, so it really doesn't matter much to me.

13 minutes ago, mercer said:

That’s right, thanks. That info isn’t included in their recent graph and I forgot that bit.

Yeah, I'd agree it was a mistake on Canon's part not to have a 10-bit middle codec, but then you hear real-life accounts from C200 users on this board saying the 8-bit codec has more latitude than the GH5's 10-bit. So who knows who to believe. I can't afford either right now, so it really doesn't matter much to me.

Yeah, but some people say they saw Bigfoot also. 

7 minutes ago, jonpais said:

I think he says it grades well enough, it's just that many broadcasters and production houses require 10 bit 4:2:2 (according to Matthew Allard).

 

That makes sense. Again, I'm in no position to buy a C200, so this is just talk on my part, but I don't see the lack of a middle codec as that big of a deal for my uses... though I still think it was a mistake to omit it. The biggest mistake was not including a 2K or 2.5K RawLite variant.

15 minutes ago, mercer said:

Yeah, I'd agree it was a mistake on Canon's part not to have a 10-bit middle codec, but then you hear real-life accounts from C200 users on this board saying the 8-bit codec has more latitude than the GH5's 10-bit.

Yep, the bit rate and bit depth of a codec are just one part of the equation, and it has been shown quite a few times that they are not the only thing that matters. While I would also like 10-bit, when 8-bit is done correctly it can be sufficient for many things (though not broadcast, I guess).

In this comparison, for example, the 10-bit/400Mbps from the GH5 is far worse than the 10-bit/150Mbps from the EVA1. Dave Dugdale also found minimal difference in 4K between the 10-bit GH5 and the 8-bit A6500.

In my personal experience, the 10-bit from the BM Pocket is on another level compared to the 10-bit I have seen from the GH5. Even at 400Mbps there is just too much compression happening for some reason.
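The 8-bit vs 10-bit debate comes down to code values: 8-bit gives 256 levels per channel while 10-bit gives 1024, so 10-bit has four times finer tonal steps before banding appears. A minimal sketch (hypothetical Python with NumPy, not tied to any camera's actual processing pipeline) makes the quantization side of it concrete:

```python
import numpy as np

# A smooth 0..1 ramp, like a clear sky, sampled finely.
gradient = np.linspace(0.0, 1.0, 100_000)

def quantize(signal, bits):
    """Round the signal to the nearest representable code value."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

for bits in (8, 10):
    q = quantize(gradient, bits)
    distinct = np.unique(q).size        # 256 at 8-bit, 1024 at 10-bit
    worst_err = np.abs(q - gradient).max()
    print(f"{bits}-bit: {distinct} distinct levels, "
          f"max quantization error {worst_err:.6f}")
```

This is only the quantization half of the story, though: a heavily compressed 10-bit file can still band worse than a cleanly encoded 8-bit one, because the encoder discards detail before bit depth ever matters.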


In wolfcrow's video review of the GH5 (prior to the firmware update), he went to great lengths to demonstrate that 10-bit would require at minimum 560Mbps to show a difference in quality over the 8-bit 150Mbps footage. But after the update, he said 400Mbps was fine and recommended not using the Inferno. So confusing!
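Some rough data-rate arithmetic puts those bitrates in context. A back-of-the-envelope sketch (hypothetical Python; figures rounded, ignoring audio and container overhead) for UHD 10-bit 4:2:2 at 24fps:

```python
def uncompressed_mbps(width, height, bits, fps, subsampling="4:2:2"):
    # 4:2:2 carries one full-resolution luma sample per pixel plus two
    # half-horizontal-resolution chroma planes: 2 samples per pixel total.
    samples_per_pixel = {"4:4:4": 3, "4:2:2": 2, "4:2:0": 1.5}[subsampling]
    bits_per_frame = width * height * samples_per_pixel * bits
    return bits_per_frame * fps / 1e6

raw = uncompressed_mbps(3840, 2160, 10, 24)   # ~3981 Mbps uncompressed
for codec_mbps in (150, 400):
    print(f"{codec_mbps} Mbps is roughly {raw / codec_mbps:.0f}:1 compression")
```

So even 400Mbps is still around 10:1 compression on the uncompressed signal, and 150Mbps is closer to 27:1, which is why encoder quality matters at least as much as the headline bitrate.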

On 2018-04-02 at 8:23 PM, NathanDrake said:

So it looks like Canon won't release a 10-bit codec for the C200. Man, what a waste of a camera. At least give me an uncompressed 4K SDI output, or at the very least a 10-bit 4K HDMI output. It doesn't make any sense; they've really shot themselves in the foot here. With the RAW output from the Panasonic, and really any of the Blackmagic cameras, it seems like Canon can't compete.

Hardware encoding is determined by what the processor inside is capable of, and the DV6 apparently has a more consumer-oriented encoder than the DV5, probably because it has been optimised for use in DSLRs/MILCs/compacts.

Presumably the DV6 has other advantages over the DV5 when it comes to RAW, and that is probably why they used it in the C200 rather than the DV5.

9 hours ago, mercer said:

Yeah, I'd agree it was a mistake on Canon's part not to have a 10-bit middle codec, but then you hear real-life accounts from C200 users on this board saying the 8-bit codec has more latitude than the GH5's 10-bit. So who knows who to believe. I can't afford either right now, so it really doesn't matter much to me.

People will say the C100 "grades well" too.....

9 hours ago, mercer said:

I think the biggest mistake was not including a 2K or 2.5K RawLite variant.


That would mean a hell of a crop. 


I'm just surprised Canon didn't give the C200 an HD 10-bit codec option.

Just like the FS5 from Sony: it has external 4K raw plus internal 10-bit HD, which makes it a pretty capable camera while still being clearly a lower-end choice than the FS7.

