
tugela

Members
  • Posts

    840
  • Joined

  • Last visited

Posts posted by tugela

  1. 3 minutes ago, gt3rs said:

    Without knowing the technology behind RAW Light (wavelets like JPEG2000, REDCODE?) it is hard to judge the computational power needed, but I think the chip is powerful enough for 4K 10bit 422, at least at 30fps.

    My opinion, like many others', is that it is only a temporary protection of the C300 II, and once they are OK with releasing the firmware they can implement 10bit 422 at 400 Mbit/s and/or 8bit 422 at 300 Mbit/s. Maybe one factor is being able to save to the SD card, in which case the 300 Mbit/s option would be chosen. Although it is possible to do 400 Mbit/s on SDs, with so many card brands and models it can become tricky to support and test. I'm skeptical about the 8bit 420 because it makes no sense; they already have 8bit 420, so a 6-month wait just for going All-I? But again, it is Canon...

    Another theory would be that they need more time to finalize/optimize XF-AVC for 4K 60p, although the C700 already does this, but at 810 Mbit/s, so no SD card...

    For RAW, the camera essentially has to create a zip file on the fly, in real time. To put that into perspective, think about how well your desktop handles the same task. It requires a significant amount of processing. If the camera were simply writing the data to storage, little processing would be required; likewise if it were discarding a lot of data during compression to speed things up. But keeping all of the data while still compressing takes real processor muscle to do in real time.

    The code to implement higher bit rates/depths already exists, so it would already be there if it were that simple. The absence of those codecs means that the hardware encoder inside the new chip is not set up to enable it.
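
    To give a rough sense of the cost, here is a minimal sketch in Python (purely illustrative; it uses plain zlib, not whatever wavelet scheme Cinema RAW Light actually uses, and the frame dimensions are just assumed) that times lossless compression of one synthetic 4K 10-bit Bayer frame on the CPU:

        import time
        import zlib

        import numpy as np

        # Synthetic 10-bit Bayer frame, roughly 4K sized, stored in 16-bit words
        # the way a sensor readout typically is. Random noise is close to a
        # worst case for the compression ratio, but the timing is the point here.
        frame = np.random.randint(0, 1024, size=(2160, 4096), dtype=np.uint16)

        start = time.perf_counter()
        compressed = zlib.compress(frame.tobytes(), 1)  # fastest zlib setting
        elapsed = time.perf_counter() - start

        print(f"frame size: {frame.nbytes / 1e6:.1f} MB")
        print(f"compression time: {elapsed * 1000:.0f} ms "
              f"-> about {1 / elapsed:.1f} fps on one core")
        print(f"ratio: {frame.nbytes / len(compressed):.2f}:1")

    On a typical desktop this lands well short of 24/30 fps on a single core, which is why keeping every bit while compressing in real time needs either a dedicated block in the image processor or a lot of discarded data; it is nothing like simply streaming the readout to a card.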

  2. 17 hours ago, jhnkng said:

    I'm kicking myself for not asking the same question!!
     

    The C200 ships with Dual Digic 6 processors which has got to be faster than the single Digic 5 in the XC10. There's no way the hardware couldn't support it. RAW output can't be the reason either because RAW is *less* processor intensive -- it just takes the sensor feed and writes it to a card, it doesn't have to debayer or add noise reduction or anything. 

    Canon's own press release for XF-AVC lists the specs of the codec: for 4K it can do 8/10bit I-Frame 422, and for HD 8/10/12bit 420/422/444. Committing resources to creating an 8bit 420 codec for 4K just to protect the C300mkII would be *insane*, though I wouldn't put it past them. It might even explain why it doesn't ship with XF-AVC.

    I walked into the demo planning the business case to finance the C200, and I walked out with an order for a C100 mkII. They're now selling the C100 mkII for $5000 AUD with an Atomos Ninja Blade kit included, vs the C200, which retails for $12499 AUD. Easiest purchase decision I've ever made!

    The RAW data is compressed, and that takes processing power. It is not simply written straight to the card.

  3. 1 hour ago, sam said:

    I'm curious to hear what experience/knowledge led you to form that conclusion? I have played the same track(s) in blind A/B tests in multiple formats (33.3rpm vinyl, DSD 2.8MHz, Tidal's 16-bit 44.1, and Pandora streaming) to guests both young and old, rich, poor, educated and not. So far, everyone has picked vinyl as sounding best. And that is with a lowly entry-level $12k system. Digital may be more accurate, but...

     

    Because your receiver is digital, so it doesn't matter what comes in; what goes out is all digital anyway?

    I grew up before there were such things as CDs, and there is no way vinyl sounds better than modern digital sound. Unless of course you like all the media imperfections that were never intended to be in the soundtrack to start with.

  4. 23 minutes ago, jhnkng said:

    I'm kicking myself for not asking the same question!!
     

    The C200 ships with Dual Digic 6 processors which has got to be faster than the single Digic 5 in the XC10. There's no way the hardware couldn't support it. RAW output can't be the reason either because RAW is *less* processor intensive -- it just takes the sensor feed and writes it to a card, it doesn't have to debayer or add noise reduction or anything. 

    Canon's own press release for XF-AVC lists the specs of the codec: for 4K it can do 8/10bit I-Frame 422, and for HD 8/10/12bit 420/422/444. Committing resources to creating an 8bit 420 codec for 4K just to protect the C300mkII would be *insane*, though I wouldn't put it past them. It might even explain why it doesn't ship with XF-AVC.

    I walked into the demo planning the business case to finance the C200, and I walked out with an order for a C100 mkII. They're now selling the C100 mkII for $5000 AUD with an Atomos Ninja Blade kit included, vs the C200, which retails for $12499 AUD. Easiest purchase decision I've ever made!

    If the hardware doesn't support it, it doesn't matter how fast the processor is. Digic DV6 processors may be optimized differently than the DV5, to maximize thermal performance when doing hardware encoding. That would make sense since the DV6 is the sibling of the Digic 8, and that presumably will need a thermally optimized encoder to enable 4K in consumer cameras so that Canon can compete in those markets.

    The processing power of the DV6 outside of hardware encoding may be necessary to handle the compression of the RAW feed, and that could explain why the C200 uses DV6s instead of DV5s.

    Otherwise there is no logical reason for not including the sorts of codecs already present in the C300 II and XC10/15. It has to be a hardware compromise. And that compromise may be leaving the middle codecs out of the equation so that the processor can deliver the best of both ends across a variety of product models, not just the C200.

  5. On 6/9/2017 at 7:27 AM, DBounce said:

    First review of RAW on the C200:

     

    RAW footage samples... I'm pretty pumped for this camera.

     

    The first video is private. Looking at the second video on my laptop, the resolution looks good, but they did a terrible job grading it; the greens look too yellow. Not the fault of the camera, just user error.

    On 6/23/2017 at 7:01 AM, jhnkng said:

    I was at a C200 demo a few days ago and the Canon rep said he was absolutely sure that the XF-AVC upgrade was going to be 4:2:0 8bit writing to SD cards. I even pushed back asking if there was any chance of seeing a 4:2:2 8bit writing to the CFast card, and he said no. I have no idea how high up he is in the company, and how deep his connections are to the engineers working on the C200, so take it with a big pinch of salt. Considering the XC10 shoots 4K 4:2:2 8bit to CFast it's mindblowing that Canon wouldn't put that in the C200.

    There was a lot of interest in the camera amongst the 12 or so of us in the room, and more than one person asked whether there would be a better codec coming in the future. I can't say for sure but I think if Canon had committed to even a future broadcast spec codec upgrade there might have been a couple of sales right there and then.

    That seems weird, unless the hardware is just not set up for that. If that is the case then no firmware can change it. It may have something to do with the new processor, with the hardware encoder perhaps optimized for the Digic 8 variant so that Canon can enable 4K in consumer models of their products.

    7 hours ago, Andrew Reid said:

    One day Canon's rep can explain to me why there is 300MBit 4:2:2 on the £1400 XC10 but just 100Mbit 4:2:0 on the £7500 C200 :)

    It has to be because of the different processors used and the hardware encoding options available in each.

  6. 2 hours ago, Anaconda_ said:

     

    But going from vinyl to CD and then online, a lot of quality has been lost. People mostly listen to vinyl for the quality of the playback - or so they say. I look at it this way: quality was lost going from film (vinyl) to digital (CD/online), yet audio has continued to get worse for the average listener, while the resolution race is constantly making video 'better'.

    8K is more than enough, but since when was that ever a problem? When you make a film, do you shoot just enough to edit, or do you shoot more than enough? When you cook a meal for friends, is it not better to have some leftovers? You can't cut a piece of string longer, so best to start off with more than you need.

    Vinyl is not better. Digital recordings are far more accurate than any analog recording and don't have any of the artifacts associated with less-than-perfect pressings, scratches and dust. People like to say that vinyl is better basically because it makes them feel more refined than the common people who listen to digital. It is the same as people who drool over rare, expensive wines mostly because they are rare, not because of any intrinsic quality over less rare wines.

    There is an irony here: most of those vinyl recordings people drool over were mastered digitally in the first place, lol. The best quality playback is live. Go support actual artists.

  7. On 6/25/2017 at 5:11 AM, Justin Bacle said:

    Here is the real question. Did you ever sit in a theater projecting a movie in 4K and think "I wish the resolution was better"? I know I didn't.

    I think 8K is way too much as a viewing format. But as always, higher resolutions can find some good uses in VFX and scientific applications.
    As for myself, I shoot 1.5K 4:3 (yep, you read that correctly), so I really don't care about the resolution race :D

    That is not the point. When watching a movie you are watching the content. But having better IQ means that the experience overall will be better.

    It is like getting married. No doubt you love your wife (or husband, as the case may be) and you have eyes only for her. But if she looked like a supermodel, it would be better. You are not going to be SAYING that at your wedding, but that doesn't mean that some things might not improve the overall experience.

     

    2 hours ago, Justin Bacle said:

    Haha, there are definitely some good points in your post :)

    I think, though, that now that the resolution and DR are "similar" to 35mm film, that is why we're saying it is enough for now.
    There will always be people wanting to have the newest, best stuff. And that's good for the market and innovation. I just don't think the resolution race is that interesting. DR, IQ, rolling shutter and recording formats are higher priorities IMO.

    PS: If we apply our thinking to audio formats: people ditched the 33rpm vinyl records and adopted the CD, then everything went online. But there are still people listening to vinyl records; even cassettes are making a comeback! Maybe some people still wanna use 480p cameras for the nostalgic effect. Nothing wrong with it :D

    Yes, but I don't want 35mm film resolution, I want eye resolution. To be truly immersive it has to be as real as possible. That is the ultimate goal.

    That means clean edges, no pixels, NO grain, no softness (I don't have cataracts, so why must my camera have them?) and 20 stops of DR (or whatever the human eye is capable of). We have a very long way to go.

  8. On 6/23/2017 at 2:20 AM, sudopera said:

    I bumped into the video below and just wanted to hear your opinions on the matter.

    Maybe my eyes are not used to this much detail but I don't like it, simply because it's hard to concentrate on content when I'm constantly bombarded with so much small detail that looks almost surreal.

    I really think that 4K-5K is the sweet spot and camera manufacturers should turn their attention to DR, high framerates, efficient codecs and color science, and stop the resolution wars.

    I know you can throw softening filters into the mix, but then what is the point of using 8K in the first place?

    I'm not all against it because it will certainly have its uses in some scenarios, but I wouldn't like it to become the norm in the future.

     

    You do realize that the footage is 1080p, right? It might have been shot on an 8K camera, but that is 8K before debayering. The YouTube clip itself is not 8K. It is not even 4K, it is 2K. And it shows on a 4K screen.

    Existing consumer cameras that oversample already shoot at around 6K, so this "terrible future" you are so afraid of is already here.

  9. 1 hour ago, Deadcode said:

    Did I read in another topic that there is a complaint that the RED Weapon's 8K is too "sharp", and you are complaining here that a 300 EUR camera is too soft even though it shoots raw? I know, your iPhone 7 has better resolution and log with FiLMiC Pro... You should read the article...

    Just saying what it is man. Sorry if you don't like it.

    The paradigm that a lot of people seem to have here is based on viewing devices from a decade ago. Things have moved on, and what was "good enough" back then is not good enough now. And going forward, as viewing devices become even larger and even higher resolution, that decade-old paradigm is going to become even less competitive. At the very least you need oversampled 1080p footage to be acceptable, and 720p par-sampled RAW is not there, no matter how much you try to tweak it.

  10. 21 hours ago, mercer said:

    Read the article, Reid has screengrabs from clips he took with it.

    It is ~720p before debayering, so it is still going to be soft compared to modern hybrid cameras. There is not much you can do about that. Good enough for the average laptop or cell phone playing YouTube, but falling short of an oversampled 4K camera when it comes to displaying on a big 4K TV screen.
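
    To put some rough numbers on that, here is a minimal sketch (Python; the sensor dimensions are just assumed round figures for illustration, not the actual photosite counts of any particular camera) of how much real per-channel detail a Bayer capture contains versus what a 1080p output asks for:

        # On a Bayer sensor, half the photosites are green and a quarter each are
        # red and blue, so per-channel resolution is well below the headline count.
        def bayer_breakdown(width, height, label):
            total = width * height
            print(f"{label}: {total / 1e6:.2f} MP photosites -> "
                  f"G {total / 2 / 1e6:.2f} MP, R/B {total / 4 / 1e6:.2f} MP each")

        bayer_breakdown(1280, 720, "~720p Bayer capture")  # assumed figure for the camera discussed
        bayer_breakdown(3840, 2160, "4K Bayer capture")    # typical oversampling source for 1080p

        print("A 1080p frame needs ~2.07 MP of detail per channel, so the first")
        print("capture has to be interpolated up, while the second is downsampled.")

    That gap between interpolating up and sampling down is essentially the difference between soft-looking footage and the oversampled look on a big 4K screen.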

  11. Bloom has shot footage with a Barbie doll and still did a decent job with it. If someone is creative they can shoot pretty much with anything because for them it is the content that is telling the story, not the camera.

    Less creative shooters need the big-spec cameras because they can hide their lack of creativity behind technical details. That is the main reason people drool over the super-spec cameras: they know that their footage would be lacking without them. If you can't make interesting footage that people will watch no matter what, then you need the image to be as perfect as possible to compensate. There is a reason for that: if your content does not draw the viewer in, they are far more likely to notice IQ defects in the footage. Which means that if your content is uninspired and boring, you had better be damned sure that there are no IQ defects in it. And for that you need the super camera.

     

  12. On 5/10/2017 at 0:31 PM, Bioskop.Inc said:

    Yes the book is far better & slightly different.

    The one thing that the new trailer has completely overlooked is that the original film always had the question of whether Deckard was an android at its core (depending on which version you watch, obviously), and by having an ageing Harrison Ford, that question has now been answered. But what I did notice is that they have basically pushed that question onto Gosling's character - so yet again Hollywood has rehashed the same story into a more updated version, which will be unsatisfying. Don't mess with the classics, as disappointment will be the only result!

    Seen 2 films at the cinema recently: Guardians of the Galaxy Vol. 2 & The Handmaiden. One was a huge waste of time, energy & money - also too long at 2h16min (Baby Groot couldn't save it). The other is almost certainly the best film that I'll see this year, by one of the most talented directors around.

    I really think people need to forget Hollywood rubbish (they're never going to change & will regurgitate everything in a worse wrapper) & learn to embrace reading subtitles, because the best cinema isn't American or even in English!

    Just for accuracy, androids are robots that look and behave like humans, while cyborgs are people with robotic components.

    The characters in the movie were "replicants", basically genetically engineered clones. They were NOT androids. They were as human as anyone else, but with only a very limited world experience and a short shelf life. The movie was about them grappling to find their own humanity and not really understanding what being human was all about (since they were essentially toddlers being exploited as soldiers), not about them being robots.

  13. On 6/20/2017 at 9:39 AM, jgharding said:

    We got the XC10 for this... but no nuts.

    The fuzzy-motion noise reduction, not-great low light, all kinds of little quirks and issues... they do need a dedicated B-cam, but they're too afraid of losing sales.

    A little half-size C50 with a C100 sensor and so on would do it.

    Of course you'd want internal stabilisation, and they'll never give it to us due to wanting to flog stabilised lenses.

    Or not wanting to be sued because competitors hold the IP.

    On 6/20/2017 at 10:23 AM, mercer said:

    The Cinema Raw Light is so interesting to me that I wonder if they will start implementing a 1080p version in their lesser models. A 6D2 with CRL at 1080p up to 60p would be a genius move and nobody would even complain about it not having 4K.

    Obviously, this is Canon we're talking about here, so it is highly unlikely they'll ever do such a thing, but damn it would be a smart move. Especially if they're having technical issues with 4K and overheating 

    The DV6 should have a Digic 8 sibling, so perhaps the new processor will be efficient enough to start implementing 4K in hardware in consumer cameras. We will probably have to wait until the end of 2017, when the new processor starts appearing in PowerShot-style cameras, to get a sense of whether it will be good enough for that. The Digic 7 very likely already has a 4K encoder built into it; they just can't make use of it because the processor would get too hot.

    37 minutes ago, OliKMIA said:

     

    Apparently, Canon did it again. A 6D had sex with an 80D.. taaadaaaaaaaaa 2000$.....

    http://www.canonwatch.com/canon-eos-6d-mark-ii-images-specifications/#disqus_thread

    They are still stuck in 2013, it seems. I suppose that the development cycles for the higher-end products are longer, so it is perhaps unreasonable to expect too much from a 6D Mark II if it is released with the tech Canon has now.

    Given the product cycles Canon has, it may be as late as 2019 before we start to see a competitive Canon hybrid, but by then of course everyone else will have moved on, so they could well still be behind even then.

    The best bet for such a camera would be a 7D Mark III IMO, since we can expect to see a new model in that line sometime around late 2018.

     

  14. On 6/18/2017 at 6:15 PM, mercer said:

    I am not disagreeing with you, I was merely answering why Canon may be holding back investing R&D into things we may find important as a minority in the world of consumer cameras.

    From a business standpoint, I can't say I disagree with their logic. As a consumer, I wish they offered more. 

    But as an artist, can you say your work is any better than in your GH1/GH2 days? Have any of these rush-to-market advancements made your work any better?

    I am seriously asking your opinion on this, because I am a fan of your work. Some of your early GH1 footage is inspiring, even by today's standards. I almost bought a GH2 yesterday after watching one of your earlier cinematography reels.

    They are "holding back" on specs not because they want to, but because their hardware is not up to it yet.

    Don't make the mistake of thinking there is some cunning plan at work here, there is not. 

    1 hour ago, mercer said:

    I wholeheartedly agree Canon makes a great tool, and its DPAF, color science, humble codec and great build quality are all strengths.

    And I don't think they should be dismissed by a lot of video enthusiasts simply because they don't shoot 4K. Other than 4K, I'm unsure what they're lacking that would make a videographer's life any easier. With native lenses you don't need focus peaking due to DPAF. Their humble H.264 codec and amazing color science require less grading... so expose by eye in camera with maybe a little latitude in either direction (I prefer slight underexposure with Canon cameras) and you're good to go. Probably the easiest, most dependable tool around.

    As far as Fury Road goes, I'm sure Blackmagic cameras were used as well, they used a shit ton of cameras in that movie, but it's fairly common knowledge that the 5D Mark II was used extensively throughout the film.

    Focus peaking is necessary for DMF, which is a very useful tool when shooting stills.

    Traditional Canon DSLR 1080p is too soft for the world of large screen 4K TVs that currently form the bulk of the commercial lineup. What is or is not acceptable is determined by the current standard of displays.

  15. 19 minutes ago, Andrew Reid said:

    I'm talking about the way people shoot now, not in the 50's.

    Yes there will always be the exception like a crew with 4 A-cams all with the same purpose.

    But in my mind and in this topic with Oliver we are talking supporting role, small camera, multiple bodies at a price lower than the A cam.

    Improper usage of a term does not become proper just because a lot of people use it that way.

  16. 8 hours ago, UncleBobsPhotography said:

    Is the RX10 series worse than the RX100 series for video? I always assumed they were identical except for the lens.

    One note on the RX-series. I'm using the RX1RM2 and electronic steady-shot completely destroys the image quality for 1080p. It looks more like 480p than 1080 with steady-shot turned on, but without steady shot the image is quite decent.

    They are the same camera, except for lens and body. The package is different but the electronics are basically the same.

    3 hours ago, Papiskokuji said:

    RX10 II could have been an incredible camera if it had:

    1) a sharper lens (don't trust those "amazingly sharp Zeiss lens" bullshit reviews). The lens is barely OK for video; as mercer said, there is something filmic about its softness, but photos are pretty much unusable at the telephoto end, and I'm not exaggerating.

     

    Any zoom lens with a range greater than 3x (or at most 4x) is going to be "unfortunate".

  17. 8 hours ago, AaronChicago said:

    I can't really say which frames are "better" or "worse" but I prefer 2.

    2 has a lot of chromatic aberration. Also a lot of noise. IQ-wise it is the worst of the three.

    1 has quite a bit of noise when blown up, but not as much as 2 (where it is visible even in the whole image). Also a trace of CA. It also has the highest pixel count, so defects on blowing up may just be more apparent as a consequence. The exposure is different from 3, and that may skew perceptions.

    3 appears to be the cleanest shot, no CA that I can see, not a lot of noise. It also has a lot fewer pixels than the other two, so it may just be that the defects are less obvious as a result.

    As far as CA is concerned, that reflects the lens on the camera, while noise may reflect the sensor (but may also indicate different levels of processing).

  18. 18 hours ago, Kisaha said:

    Everywhere I was reading (like the quote above) was mentioning 8-bit! That is a great find, Mercer.

    As I said in a post on the first day of the announcement: just after Christmas, Canon will announce this 10-bit middling codec, and it will make some C200 owners very happy!

    The sad thing is for the C300mkII owners, but that is the case with evolution, and I am glad that Canon takes part in it. At least they will sleep like babies for at least 8-10 more months!

    On an unrelated note, the FS5 just got a $1000 rebate...

    It is hardware encoding, so it will be limited by what the processor has in its logic. If it could handle 10-bit, that would already be implemented. My guess is that the hardware encoder is limited to 8 bits, but other things such as bit rate or 4:2:2 might be available as future options, since those don't require any physical change to the processor, such as register size.

  19. 7 hours ago, horshack said:

    Intel has had H.264 acceleration for quite a while (since Haswell). H.265 acceleration was only introduced in the current-generation Kaby Lake processors.

     

    H.264 support is much older than Haswell in Intel processors, at least since Sandy Bridge, perhaps earlier (my gen 1 system has an X processor, which does not have an onboard GPU, so I don't know what the regular processors had at that point). Perhaps not a full official implementation, but the processors were definitely capable of hardware-accelerated encoding/decoding of H.264 content.

    H.265 acceleration has been possible in hardware since the 4th-generation processors, I believe. I know it is not present in Ivy Bridge, but it is in the generation after.

    http://techreport.com/news/27677/new-intel-igp-drivers-add-h-265-vp9-hardware-decode-support
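
    For anyone who wants to try the hardware path, here is a minimal sketch of handing an encode to Intel Quick Sync through ffmpeg (assuming an ffmpeg build with QSV support and a recent enough iGPU/driver; the file names are just placeholders):

        import subprocess

        # Decode H.264 and re-encode to H.265 on the Intel iGPU via Quick Sync,
        # keeping the CPU cores mostly free.
        cmd = [
            "ffmpeg",
            "-hwaccel", "qsv",        # hardware-accelerated decode
            "-c:v", "h264_qsv",       # QSV H.264 decoder
            "-i", "input.mp4",        # placeholder input file
            "-c:v", "hevc_qsv",       # QSV H.265 encoder (needs a Skylake-era or newer iGPU)
            "-b:v", "20M",
            "output_hevc.mp4",        # placeholder output file
        ]
        subprocess.run(cmd, check=True)

    If the build or the iGPU doesn't expose QSV, ffmpeg errors out rather than silently falling back, so this is also a quick way to check what a given machine actually supports.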

    Well, the thread was derailed by a poster dissing PCs, not the other way around. So let's be accurate.

    If you maintain your system properly there is no problem. The advantage of a PC is that you can get it set up how you want, with the hardware you want. As you point out, that is an option which for the most part is not available for Macs; you are stuck with whatever they choose to give you. Personally I have never had a problem with drivers not playing nicely with software. If you have those sorts of issues it is more likely that you have some odd hardware and that your editor is the problem, not the OS.

    And for the record, hardware support for H.265 has been in Intel CPUs for years; Apple just has not bothered to implement it on the software side until now, hence my comment. I am not sure why any user would regard that situation as generally acceptable.

     
