Mattias Burling

The Canon C200 is here and it's a bomb!

Recommended Posts

9 hours ago, Andrew Reid said:

One day Canon's rep can explain to me why there is 300MBit 4:2:2 on the £1400 XC10 but just 100Mbit 4:2:0 on the £7500 C200 :)

I'm kicking myself for not asking the same question!!
 

2 hours ago, tugela said:

That seems weird, unless the hardware is just not set up for that. If that is the case then no firmware can change it. It may be something to do with the new processor, with perhaps the hardware encoder optimized for the Digic 8 variant so they can enable 4K footage in consumer models of their products.

It has to be because of the different processors used and the hardware encoding options available in each.

The C200 ships with dual Digic 6 processors, which have got to be faster than the single Digic 5 in the XC10. There's no way the hardware couldn't support it. RAW output can't be the reason either, because RAW is *less* processor intensive -- it just takes the sensor feed and writes it to a card; it doesn't have to debayer, add noise reduction, or anything.

Canon's own press release for XF-AVC lists the specs of the codec: for 4K it can do 8/10bit I-Frame 422, and for HD 8/10/12bit 420/422/444. Committing resources to creating an 8bit 420 codec for 4K just to protect the C300mkII would be *insane*, though I wouldn't put it past them. It might even explain why the C200 doesn't ship with XF-AVC.

I walked into the demo planning the business case to finance the C200, and I walked out with an order for a C100 mkII. They're now selling the C100 mkII for $5000 AUD and it comes with an Atomos Ninja Blade kit, vs the C200 which retails for $12499 AUD. Easiest purchase decision I've ever made!
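For a rough sense of the data rates being argued about here, a back-of-the-envelope sketch (my own assumed figures -- 4096x2160, 10-bit readout, 30fps -- not Canon's published numbers) comparing an uncompressed Bayer sensor feed against the 100 Mbit internal codec:

```python
def bayer_feed_mbps(width, height, bit_depth, fps):
    """Raw Bayer sensor feed in Mbit/s: one value per photosite, no chroma planes."""
    return width * height * bit_depth * fps / 1e6

raw_feed = bayer_feed_mbps(4096, 2160, 10, 30)  # assumed 10-bit readout
codec = 100  # the C200's internal 4K MP4 bitrate, Mbit/s

print(f"uncompressed Bayer feed: ~{raw_feed:.0f} Mbit/s")
print(f"ratio vs the 100 Mbit codec: ~{raw_feed / codec:.0f}:1")
```

Under those assumptions the sensor feed is around 2.65 Gbit/s, roughly 27x the internal codec, which at least shows why "just write the feed to a card" isn't trivial at these resolutions.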

23 minutes ago, jhnkng said:

I'm kicking myself for not asking the same question!!
 

The C200 ships with dual Digic 6 processors, which have got to be faster than the single Digic 5 in the XC10. There's no way the hardware couldn't support it. RAW output can't be the reason either, because RAW is *less* processor intensive -- it just takes the sensor feed and writes it to a card; it doesn't have to debayer, add noise reduction, or anything.

Canon's own press release for XF-AVC lists the specs of the codec: for 4K it can do 8/10bit I-Frame 422, and for HD 8/10/12bit 420/422/444. Committing resources to creating an 8bit 420 codec for 4K just to protect the C300mkII would be *insane*, though I wouldn't put it past them. It might even explain why the C200 doesn't ship with XF-AVC.

I walked into the demo planning the business case to finance the C200, and I walked out with an order for a C100 mkII. They're now selling the C100 mkII for $5000 AUD and it comes with an Atomos Ninja Blade kit, vs the C200 which retails for $12499 AUD. Easiest purchase decision I've ever made!

If the hardware doesn't support it, it doesn't matter how fast the processor is. Digic DV6 processors may be optimized differently than the DV5, to maximize thermal performance when doing hardware encoding. That would make sense since the DV6 is the sibling of the Digic 8, and that presumably will need a thermally optimized encoder to enable 4K in consumer cameras so that Canon can compete in those markets.

The processing power of the DV6 outside of hardware encoding may be necessary to handle the compression of the RAW feed, and that could explain why the C200 uses DV6s instead of DV5s.

Otherwise there is no logical reason for not including the sorts of codecs already present in the C300 II and XC10/15. It has to be a hardware compromise. And that compromise may be leaving the middle codecs out of the equation so that the processor can deliver the best of both ends in a variety of product models, not just the C200.

1 hour ago, tugela said:

If the hardware doesn't support it, it doesn't matter how fast the processor is. Digic DV6 processors may be optimized differently than the DV5, to maximize thermal performance when doing hardware encoding. That would make sense since the DV6 is the sibling of the Digic 8, and that presumably will need a thermally optimized encoder to enable 4K in consumer cameras so that Canon can compete in those markets.

The processing power of the DV6 outside of hardware encoding may be necessary to handle the compression of the RAW feed, and that could explain why the C200 uses DV6s instead of DV5s.

Otherwise there is no logical reason for not including the sorts of codecs already present in the C300 II and XC10/15. It has to be a hardware compromise. And that compromise may be leaving the middle codecs out of the equation so that the processor can deliver the best of both ends in a variety of product models, not just the C200.

That's an interesting point. I've just noticed that the C700 uses triple DV5s rather than the DV6. I wonder if Canon's intention was to position the C200 primarily as a RAW-capturing camera, with a low-end codec for fast-turnaround work. From this perspective, launching the DV6 with Cinema RAW Light in a new product line is smart, since Canon already has cameras that do what people want, and it means they can get to market first and add/improve features as time goes on.


No one knows what the specs of the middle codec will be. I'm sure Canon has specifically instructed all employees to remain silent on the matter, so I would give little weight to any rumors at this point. That said, buy this camera for what it is now. We all know that Canon can disappoint on specs. That's not to say they will... but buying in hopes of unconfirmed specs is not recommended.

17 hours ago, jhnkng said:

I'm kicking myself for not asking the same question!!
 

The C200 ships with dual Digic 6 processors, which have got to be faster than the single Digic 5 in the XC10. There's no way the hardware couldn't support it. RAW output can't be the reason either, because RAW is *less* processor intensive -- it just takes the sensor feed and writes it to a card; it doesn't have to debayer, add noise reduction, or anything.

Canon's own press release for XF-AVC lists the specs of the codec: for 4K it can do 8/10bit I-Frame 422, and for HD 8/10/12bit 420/422/444. Committing resources to creating an 8bit 420 codec for 4K just to protect the C300mkII would be *insane*, though I wouldn't put it past them. It might even explain why the C200 doesn't ship with XF-AVC.

I walked into the demo planning the business case to finance the C200, and I walked out with an order for a C100 mkII. They're now selling the C100 mkII for $5000 AUD and it comes with an Atomos Ninja Blade kit, vs the C200 which retails for $12499 AUD. Easiest purchase decision I've ever made!

The RAW data is compressed, and that takes processing power. It is not just written to the card.

2 hours ago, tugela said:

The RAW data is compressed, and that takes processing power. It is not just written to the card.

Without knowing the technology behind Cinema RAW Light (wavelets like JPEG2000, REDCODE?) it is hard to judge the computational power needed, but I think the chips are powerful enough for 4K 10bit 422, at least at 30fps.

My opinion, like many others', is that it is only temporary protection of the C300 II, and once they are OK with releasing the firmware they can implement 10bit 422 at 400 Mbit and/or 8bit 422 at 300 Mbit. One factor may be the ability to record to SD cards, in which case the 300 Mbit option would be chosen. Although it is possible to do 400 Mbit on SDs, with so many card brands and models it can become tricky to support and test. I'm skeptical of the 8bit 420 option because it makes no sense; they already have 8bit 420, so why a 6-month wait just to go All-I? But then again, it's Canon...

Another theory would be that they need more time to finalize/optimize XF-AVC for 4K 60p, although the C700 already does this, but at 810 Mbit, so no SD card...
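To put those candidate bitrates in SD-card terms, a quick sketch (my own arithmetic; the thresholds are the SD Association's Video Speed Class minimum sustained write speeds) of what card class each bitrate would demand:

```python
# Minimum sustained write speed per SD Video Speed Class:
# V30 = 30 MB/s, V60 = 60 MB/s, V90 = 90 MB/s.
V_CLASSES = [(30, "V30"), (60, "V60"), (90, "V90")]

def min_v_class(bitrate_mbit):
    """Return (required MB/s, lowest V class whose guarantee covers it)."""
    mb_per_s = bitrate_mbit / 8  # Mbit/s -> MB/s
    for threshold, name in V_CLASSES:
        if mb_per_s <= threshold:
            return mb_per_s, name
    return mb_per_s, "beyond V90"

for rate in (100, 300, 400):
    mbs, cls = min_v_class(rate)
    print(f"{rate} Mbit/s -> {mbs:.1f} MB/s sustained -> needs {cls}")
```

Both 300 and 400 Mbit work out to 37.5 and 50 MB/s respectively, i.e. within a V60 card's guarantee, which supports the point that the choice is more about qualification and support across card models than a hard speed limit.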

3 minutes ago, gt3rs said:

Without knowing the technology behind Cinema RAW Light (wavelets like JPEG2000, REDCODE?) it is hard to judge the computational power needed, but I think the chips are powerful enough for 4K 10bit 422, at least at 30fps.

My opinion, like many others', is that it is only temporary protection of the C300 II, and once they are OK with releasing the firmware they can implement 10bit 422 at 400 Mbit and/or 8bit 422 at 300 Mbit. One factor may be the ability to record to SD cards, in which case the 300 Mbit option would be chosen. Although it is possible to do 400 Mbit on SDs, with so many card brands and models it can become tricky to support and test. I'm skeptical of the 8bit 420 option because it makes no sense; they already have 8bit 420, so why a 6-month wait just to go All-I? But then again, it's Canon...

Another theory would be that they need more time to finalize/optimize XF-AVC for 4K 60p, although the C700 already does this, but at 810 Mbit, so no SD card...

For RAW, essentially the camera has to create a zip file in real time. How well does your desktop handle that task, to put it into perspective? It will require a significant amount of processing. If it were simply writing the data to storage then little processing would be required; likewise if it were discarding a lot of data during the compression to speed things up. But keeping all of the data while still compressing requires processor muscle to do in real time.

The code to implement higher bit rates/depths already exists, so it would already be there if it were that simple. The absence of those codecs means that the hardware encoder inside the new chip is not set up to enable it.
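The zip analogy can be made concrete with a rough sketch using Python's zlib (DEFLATE, which is an assumption for illustration -- not whatever Canon actually uses): lossless compressors get almost no traction on noisy data, which is part of why compressing a full sensor feed without discarding anything is hard in real time.

```python
import os
import zlib

FRAME_BYTES = 4096 * 2160  # one 4K frame at 8 bits per photosite, ~8.8 MB

# Worst case: noise-like data, a stand-in for high-ISO sensor noise.
noisy = os.urandom(FRAME_BYTES)
ratio_noisy = len(zlib.compress(noisy, 6)) / FRAME_BYTES

# Best case: a perfectly flat frame compresses almost to nothing.
flat = bytes(FRAME_BYTES)
ratio_flat = len(zlib.compress(flat, 6)) / FRAME_BYTES

print(f"noise-like frame: ratio {ratio_noisy:.3f} (>= 1.0 means no gain)")
print(f"flat frame:       ratio {ratio_flat:.4f}")
```

Real sensor frames sit between these extremes, which is why practical RAW codecs accept some loss (the 1:3 to 1:5 figures discussed below) rather than staying strictly lossless.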

38 minutes ago, tugela said:

For RAW, essentially the camera has to create a zip file in real time. How well does your desktop handle that task, to put it into perspective? It will require a significant amount of processing. If it were simply writing the data to storage then little processing would be required; likewise if it were discarding a lot of data during the compression to speed things up. But keeping all of the data while still compressing requires processor muscle to do in real time.

The code to implement higher bit rates/depths already exists, so it would already be there if it were that simple. The absence of those codecs means that the hardware encoder inside the new chip is not set up to enable it.

Zip is lossless compression and is very different from a lossy compression algorithm like JPEG. Cinema RAW Light is a lossy codec at 1:3 to 1:5 (30fps is 1:3, 60fps is 1:5), which is why you get about the same bitrate at 30 and 60 fps. RED's RAW uses wavelets based on the JPEG2000 compression algorithm, so exporting a 4K video from Resolve in JPEG2000 would be a much better comparison than zip. But even that test is of limited use, because the CPU can have hardware optimization for specific algorithms. You can believe they are working furiously to port the code to the new chipset; I believe they are simply protecting the C300 II. In 6 months we will know...
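The 1:3 vs 1:5 point checks out arithmetically (assuming a 4096x2160 Bayer readout at roughly 10 bits and the stated ratios; these are my assumptions, not Canon's published figures): doubling the frame rate doubles the uncompressed feed, so the heavier ratio at 60fps lands the bitrate in the same ballpark.

```python
def raw_light_bitrate_mbps(width, height, bit_depth, fps, ratio):
    """Estimated lossy-RAW bitrate: uncompressed Bayer feed / compression ratio."""
    uncompressed = width * height * bit_depth * fps / 1e6  # Mbit/s
    return uncompressed / ratio

at_30 = raw_light_bitrate_mbps(4096, 2160, 10, 30, 3)  # 1:3 at 30fps
at_60 = raw_light_bitrate_mbps(4096, 2160, 10, 60, 5)  # 1:5 at 60fps

print(f"30 fps at 1:3 -> ~{at_30:.0f} Mbit/s")
print(f"60 fps at 1:5 -> ~{at_60:.0f} Mbit/s")
```

Both land near 1 Gbit/s, consistent with the observation that the recorded bitrate barely changes between 30 and 60 fps.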

8 hours ago, tugela said:

The RAW data is compressed, and that takes processing power. It is not just written to the card.

True, but H.264 is also compressed, and the raw data from the sensor has to be debayered as well. But I take your point that it isn't as straightforward as I had said.

5 hours ago, tugela said:

For RAW, essentially the camera has to create a zip file in real time. How well does your desktop handle that task, to put it into perspective? It will require a significant amount of processing. If it were simply writing the data to storage then little processing would be required; likewise if it were discarding a lot of data during the compression to speed things up. But keeping all of the data while still compressing requires processor muscle to do in real time.

The code to implement higher bit rates/depths already exists, so it would already be there if it were that simple. The absence of those codecs means that the hardware encoder inside the new chip is not set up to enable it.

The desktop computer analogy doesn't really hold water: desktop CPUs are general purpose and have to be good at everything, while specialist processors like the DIGIC series are purpose-built for a select few things. I mean, how beefy a computer do you need to play back 4K 60p CinemaDNG? If we could take that kind of computing power and shrink it to a chip that fits inside a camera, we'd be editing RAW on our phones.

But I agree with your second point: Canon hasn't got production-ready code to implement a higher-bitrate codec. They've said as much, really. But it's just engineering time; whatever problems they have implementing an 8/10bit 422 codec, they can solve given enough time. The only real question is whether they will implement a beefier codec for the C200 or leave it for the C200 mkII.

6 hours ago, jhnkng said:

True, but H.264 is also compressed, and the raw data from the sensor has to be debayered as well. But I take your point that it isn't as straightforward as I had said.

The desktop computer analogy doesn't really hold water: desktop CPUs are general purpose and have to be good at everything, while specialist processors like the DIGIC series are purpose-built for a select few things. I mean, how beefy a computer do you need to play back 4K 60p CinemaDNG? If we could take that kind of computing power and shrink it to a chip that fits inside a camera, we'd be editing RAW on our phones.

But I agree with your second point: Canon hasn't got production-ready code to implement a higher-bitrate codec. They've said as much, really. But it's just engineering time; whatever problems they have implementing an 8/10bit 422 codec, they can solve given enough time. The only real question is whether they will implement a beefier codec for the C200 or leave it for the C200 mkII.

H.264 would be using dedicated processor logic, but the RAW encoding would just be using the generic computational ability of the processor. RAW (and anything similar, such as MJPEG) is implemented through software. Efficient compression formats, such as H.264 and H.265, are implemented using hardware encoders, and once you have those you are pretty much limited to whatever the encoder is set up to handle.

The "middle" codecs would have to use the dedicated encoder in the processor, and if that has been optimized for thermal efficiency so it can be used in consumer cameras, then the processor would not necessarily have the hardware inside to do what the DV5 cameras can do. That can't be corrected by firmware; either the encoder can do it or it can't. If the DV6 had the same encoder inside as the DV5, then those middle codecs would already be in the camera spec and you would not need to wait for anything - they would be there from the start. The fact that they are not implies that the encoder is different, and suggests that it has been designed primarily for consumer applications.

But the computational power of the DV6 enables RAW recording, and that is likely the reason they used them rather than the DV5s, even though it results in the loss of the middle codecs. From Canon's point of view that would be convenient, since it would allow legitimate product differentiation based on the hardware compromises made, not as a result of some arbitrary "crippling". In that scenario the C500, C300, C200 and C100 families would each have their own niche based on the type of footage they are capable of producing, and depending on what exactly your needs were, you would select one camera or the other.

All of that means that while they might be able to implement higher bitrates later on, you are probably not going to get H.264-encoded 10bit 4:2:2.


I’ve been watching some C200 videos on Vimeo now that the camera has been out in the wild for a little while. So I figured I’d link to a few that look pretty good. Actually every one I’ve seen looks pretty good. Wish I had an extra 6 grand lying around. 

 

Here’s another with some lowlight footage...

 

1 hour ago, mercer said:

I’ve been watching some C200 videos on Vimeo now that the camera has been out in the wild for a little while. So I figured I’d link to a few that look pretty good. Actually every one I’ve seen looks pretty good. Wish I had an extra 6 grand lying around. 

 

Here’s another with some lowlight footage...

 

This cam is slowly winning me over. Wish I had an extra 6 grand lying around. 


Got my C200 and it really is awesome. Since I hadn't found a case for it yet, I designed my own custom foam insert in AutoCAD. I precisely measured every part so it fits nicely and smoothly into the insert. The foam layers were sandwiched together, and I had a white sheet added in between to make it look sexier! The case is a Pelican 1610.

Video

Blog

[Photos: Pelican 1610 foam insert, empty and with the C200 kit packed]

On 9/22/2017 at 2:31 PM, mercer said:

I’ve been watching some C200 videos on Vimeo now that the camera has been out in the wild for a little while. So I figured I’d link to a few that look pretty good. Actually every one I’ve seen looks pretty good. Wish I had an extra 6 grand lying around. 

 

Here’s another with some lowlight footage...

 

C200 IQ recognition, to me, reveals a causation vs. correlation issue.

Are the C200 videos better because they use a C200, or are the people and organizations that use (and can afford to buy) the C200 simply more accomplished?

7 minutes ago, fuzzynormal said:

C200 IQ recognition, to me, reveals a causation vs. correlation issue.

Are the C200 videos better because they use a C200, or are the people and organizations that use (and can afford to buy) the C200 simply more accomplished?


This is the problem with *ALL* cameras. 

Are the videos from a Panasonic G7 / Nikon D5500 / BMD Micro Cinema Camera / Sony RX100mk4 / etc  "bad" because it is a bad camera or because the people are newbies with a very limited budget? 
Are the videos from a Sony F35 / Panasonic Varicam LT / Canon C300 mk2 / RED Weapon / etc "amazing" because the shooter (and everyone else on his team) are talented and experienced people with a large production budget?

In the end the answer is: a little from column A, a little from column B.

The only solution is trying it out yourself, or having very controlled shoot-outs.

Or just simply reading as much as you can about everything, while taking it all in with a big grain of salt. 


I think you also have to throw the colorist and editor into the mix. You can shoot the best film in the world, but once it hits post-production you'd better hope you have some damn good talent in there.

Just like photography today: I think you are better off being a whiz at Photoshop than being a great photographer. Now if you are great at both, wowzer.

