Posts posted by tugela

  1. Even the original C100 performs a full 4K sensor readout and downsamples it elegantly to oversampled Full HD in real time, without debayering. It applies a very sophisticated downsample algorithm to the raw data, though the algorithm was tweaked in the MKII models (they actually upscale each 1080p RGB channel to 4K before downsampling to HD).
    The Canon cameras doing a 4K-to-HD downsample internally are the XC10, C100, C100 MKII, C300, C300 MKII, C500, and the 1DC HDMI output in 4K mode.

    They are not downsampling; all the camera is doing is debayering to an HD frame space, probably with some pixel binning as well. There is no native 4K image being converted to HD. Since you have more pixels to begin with, the line resolution of the output is relatively high (unlike the XA20/25/G30, which are limited by the pixel resolution of their sensors).

    The C300M2 debayers to a 4K frame space, and since it has a 4K native sensor, the effective color resolution of the output with good glass is around 1400 lines. The C300M2 uses 4K as its native output; HD is downsampled from that. It is not the same as the C100. On paper the XC10 should be similar to the C300M2 (excluding the effects of pixel-level light scatter), but it is limited by the resolution of its fixed 10x zoom (which apparently delivers ~80% of nominal resolution). The single processor in the XC10 probably hurts it in signal processing as well, and shortcuts are likely being taken that don't happen with the C300.
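    The distinction being argued here is easy to sketch. Below is a minimal, purely illustrative Python/NumPy toy (not Canon's actual pipeline; the frame sizes are just for shape-checking): direct 2x2 binning of an RGGB Bayer mosaic into an HD RGB frame, which never builds a 4K RGB image at all, versus a true downsample, which would have to demosaic to full 4K RGB first.

```python
import numpy as np

# A minimal sketch, illustrative only -- not Canon's actual pipeline.
# Cheap path: bin each RGGB quad of the Bayer mosaic into one HD RGB
# pixel, so no intermediate 4K RGB image ever exists.

rng = np.random.default_rng(0)
bayer = rng.integers(0, 4096, size=(2160, 3840)).astype(np.float64)  # raw counts

R  = bayer[0::2, 0::2]                   # red photosites
G1 = bayer[0::2, 1::2]                   # first green photosite per quad
G2 = bayer[1::2, 0::2]                   # second green photosite per quad
B  = bayer[1::2, 1::2]                   # blue photosites
hd = np.dstack([R, (G1 + G2) / 2.0, B])  # (1080, 1920, 3) HD RGB frame

# A genuine 4K-to-HD downsample would instead demosaic the mosaic to a
# full (2160, 3840, 3) RGB image first, then low-pass filter and
# decimate; that intermediate image is the expensive step binning skips.
print(hd.shape)
```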

  2. That is indeed how it works. It downsamples a 4K sensor readout to 1080p very early in the image processing pipeline to improve SNR and color. Here's a long boring white paper from Canon explaining the finer points. http://www.cs.ucsb.edu/~mturk/imaging/Misc/EOS_C300_New_35mm_CMOS_Sensor_WP.pd
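    (The SNR benefit described there is easy to demonstrate in miniature. Here is a toy Python/NumPy model assuming plain Gaussian read noise, nothing Canon-specific: averaging each 2x2 block of photosites into one HD pixel halves the noise standard deviation, a 6 dB gain.)

```python
import numpy as np

# Toy model: a flat scene plus Gaussian read noise (sigma = 10).
rng = np.random.default_rng(1)
frame_4k = 100.0 + rng.normal(0.0, 10.0, size=(2160, 3840))

# Average each 2x2 block down to HD: noise drops by sqrt(4) = 2.
frame_hd = (frame_4k[0::2, 0::2] + frame_4k[0::2, 1::2] +
            frame_4k[1::2, 0::2] + frame_4k[1::2, 1::2]) / 4.0

print(round(frame_4k.std(), 2))  # ~10.0
print(round(frame_hd.std(), 2))  # ~5.0, i.e. 20*log10(2) ~= 6 dB better
```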

    I don't know much about internal vs external recording as I haven't tried them both, but count me as a +1 for exposing to the right on Canon cameras. As long as you avoid clipping any color channels, it seems to respond well to bringing the image down in post. 

    I doubt it. That applies to the C300M2, which has dual Digic DV5 processors, so it can handle the load. The C100 MKII has a single Digic DV4 processor, which is last-generation. Those processors can't handle a full sensor readout, IIRC. They are similar to the processors in newer Canon DSLRs, except that they have been optimized for video. They use the hardware encoder to produce files, which is why the output options are the same as the XA20/25/G30 line (which uses the same processor). In spite of its "Mark II" designation, the C100 MKII has last-generation technology inside and nowhere near the capability of the C300M2.

  3. Did Canon steal your lunch money as a kid?

    No, they failed to live up to my expectations. I have a G30 that I bought when they first came out and was disappointed with its lack of resolution; as an upgrade for my existing equipment it was money not well spent. I was expecting the 7D2 to be a true hybrid with modern 4K capabilities, a camera for the future, but instead it turned out to be a stills camera with token video, much like earlier DSLRs, still stuck in a bygone age.

    So I got tired of waiting for Canon to get a clue. Now I am cynical about them and do not trust that they will live up to their promises. They can be counted on to disappoint, and the XC10 is no exception: for the price it could be so much more. I know that for a fact, since other manufacturers make such cameras with apparent ease, but Canon simply did not deliver. It seems they can't; it would cost them too much money, even though they charge more for their products than others do. How does that work?

    I guess you could call me a fanboi who has been shown the finger once too many times by my true love, so now I no longer believe in her good faith :)

    That said, now you know. The XC10 is just another in a long line of Canon products that could have been great, almost, so close, but corners were cut and it slipped away, leaving a shadow of what could have been. Very sad.

  4. So Mattias, is Zach serious or am I getting false signals due to my broken sense of humor?

    That post reminds me of the typical day at my previous job.  You can give the general public all the scientific facts they need and you will still invariably get results like this.  Now do you understand why people who actually know what they are talking about can't tell if lay people are joking?  I mean the guy didn't even have to look anything up.  All he had to do was read the thread... and that was too much effort.  I like how I predicted this and sure enough a few posts later...  Voilà.

     

    I will mail you $100 if you can intelligently and accurately summarize what you just linked to and its relevance, if any, to this topic.

    By the way, did you see the uranium bowls on eBay? I like how they advertise the glow-in-the-dark "feature."

    I like how people rely on United States government regulations to determine what is and what is not safe. I remember the US government regulating the dangerous yellowcake and nuclear weapons in Iraq. $1+ trillion well spent. To those of you who prefer US government regulations to peer-reviewed journals and textbooks, I leave you with the crown jewel of US government safety regulations. Enjoy...

    [image: moms_demand_action_kinder_surprise.jpg]

    Because chocolate eggs kill kids, not guns.

  5. I get this question a lot, since I use almost exclusively old glass and have made several reviews and video guides about it.
    My thesis has always been that the authorities in charge of radiation and our nuclear facilities wouldn't let it slip through if it were true.

    But a thesis is one thing; scientific proof is another.
    So a test had to be done.

    I think that you should immediately bury your old lenses at least 100 feet underground. And remember, we want to see photographic proof that you have indeed done this.

  6. Well, if they're not ready for 4K, then what could they possibly do to make a really enticing mirrorless? Full frame? Doubtful, because it seems like they are boosting their EF-M lens lineup. Maybe a cine-mirrorless that shoots 1080p C-Log?

    I think the issue for them is that their processors are not up to the task, and in order to produce competitive 4K they would need multiple processors. That would be fine for something like the 5D line, but not for more consumer-oriented products, where the cost would be prohibitive. So, rather than release substandard 4K, they don't release it at all.

  7. If I were Panasonic I'd be tempted to do a 2.4 firmware with some nice goodies in it and locked-down V-Log; some of the stuff Olympus just announced for their next firmware upgrades would be nice. Giving people their money back would be tricky, but the stable door is still wide open and the horse is already in another time zone...

    dwijip - Where did Panasonic admit their mistake?

    If I were Panasonic, I would release another firmware upgrade, this time for free, but only allow it to be installed on cameras with legal free or paid firmware already on board, and increase the cost of the paid firmware fivefold at the same time.

    That would leave the people with the hack locked out of further firmware enhancements unless they ponied up $500 :).

  8. I don't feel guilty. I've already paid Panasonic.

    Maybe now they won't bother developing new firmware, then. They will just develop new hardware, so you have to pay them to get the improvements.

    Much more expensive for the consumer for sure, but if they insist on cheating you, what can you do?

  9. Well, there goes my A7S for sale. Goddayum, the price drops on these Sony cameras are crazy. My 5D III is still almost the same price as when I bought it years ago.

    If the 5DIV were just announced, the resale value of a 5DIII would plummet.

  10. The difference between now and then is that manufacturers are delivering on the video potential of stills cameras, we have products like the GH4, the NX1 and Sony's latest offerings. So the need to hack to get advanced video has largely gone away. There are too many good options for stock cameras, and more are constantly arriving.

  11. I sold TVs for two years, and conducted several tests with co-workers and customers. With a 55" screen, only 58% or so could tell which was which. 
    Most consumers watch Youtube at reduced resolution to improve load times. Netflix still streams 99% of its content in 720p. More than half the customers I've dealt with still buy DVDs. Cable companies still provide mostly SD and 720p content. There's no 4K disk format available yet. We're years away from widespread gigabit internet, so streaming 4K is still a chore (hell, Netflix/Amazon Prime 1080p already shows compression problems).

    Now if you're watching a larger TV fairly close, 4K makes more sense. It makes sense for larger computer monitors. It also makes sense for projectors, although my 1080p DLP resolves just fine (sitting 9 feet from my 120" screen). 


    Believe me, this is not driven by bottom-level consumers. 4K sets are driven by marketing machines. 

    The 1080p OLED display in our store has been mistaken for 4K by almost every customer I've ever dealt with. Why? Great color depth, perfect black levels, great motion rendering, and 178 degree viewing angles. Unless you buy top of the line, which most consumers can't or won't do, LCDs still suffer from motion blur (best case scenario you're resolving 400 lines during moving shots), poor blacks, blooming (on sets with poor local dimming), limited color gamut, flashlighting, cloudy blacks, and poor viewing angles. If I had a choice between a 1080p OLED panel or a 4K LCD, I'd choose 1080p every time. 

    Like I said: 4K is nice, but it's way at the bottom of the priorities list.

    You are missing the part where I said that I have a 65" 4K screen and can see the difference. HD is inadequate, it is pretty damned obvious if you look at the thing.

    As for what people are watching, that is irrelevant and not an argument for opposing improved video. The main reasons they buy DVDs and watch SD content are (A) they don't know any better, and (B) that is what people like you have decided they want and what they will get.

    How is the field supposed to move forward when the people generating content are taking such a backward looking attitude?


  12. Diffraction's going to be the biggest issue. The 5DS already shows diffraction losses at f/8, so not only will you be pushing your lenses to their limit, but you'll also be shooting them at apertures no smaller than f/2-2.8. 
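    (For anyone who wants to check the f/8 figure: the usual back-of-the-envelope test compares the Airy disk diameter, 2.44 x wavelength x f-number, against roughly two pixel pitches. A quick sketch with approximate 5DS numbers:)

```python
# Rough diffraction check for the 5DS; figures are approximate.
wavelength_um = 0.55            # green light, ~550 nm
pixel_pitch_um = 36000 / 8688   # ~4.14 um: 36 mm sensor width / 8688 px

for f_number in (2.8, 4.0, 5.6, 8.0, 11.0):
    airy_um = 2.44 * wavelength_um * f_number   # Airy disk diameter
    print(f"f/{f_number}: Airy disk {airy_um:.1f} um, "
          f"vs ~2x pixel pitch = {2 * pixel_pitch_um:.1f} um")

# At f/8 the Airy disk (~10.7 um) already exceeds two pixel pitches
# (~8.3 um), consistent with visible diffraction losses by f/8.
```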

    We're reaching the point of diminishing returns. SD to HD was a quantum leap in quality. HD to 4K was small but noticeable--more useful for post than anything. Most theaters still use 2K projectors, and almost all films are only mastered in 2K. Most DPs even consider 4K too much resolution to shoot people without using diffusion to offset the harshness. The Epic Dragon with its 6K resolution was crazy. Luckily, RED was smart enough to use a bit of diffusion in the filter stack and digital pipeline, so we still had enough detail to downsample or crop down for a nice crisp 4K delivery. They also touted it as a way to shoot world-class video and still simultaneously, but it hasn't really worked that way in practice due to the lack of flash support and the fact that good video and sharp stills require completely different shutter speeds. Besides, 4K was more than enough for magazine covers as it was. 

    I don't see the point of 8K. To me, the resolution race is over. We have far, far more than we need. I can see one or two 8K cameras for special cases/effects; they'll be for resolution what the Phantom Flex is for slo-mo. But to act like this will be anything like the leap from SD to HD is asinine, because there's only so much detail the human eye can resolve. And guess what? We've already reached that point! At 8-12 feet (the average distance a viewer sits from their TV), the difference between 1080p and 4K is so insignificant that you're basically guessing as to which is which. Contrast ratio, motion rendering, and color gamut/accuracy will make a far bigger difference. In the theater, you're limited by the display resolution, ambient conditions, and how well the projector is maintained. My theater has all 4K projectors, but half of them have a misaligned color panel. Even if it's only off by a couple pixels, there goes your resolution advantage right there.
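    (The viewing-distance numbers can be sanity-checked with simple trigonometry, using the common rule of thumb that the eye resolves about one arcminute. A sketch, assuming a 65-inch 16:9 screen, the size mentioned elsewhere in this thread:)

```python
import math

# Angular size of one pixel on a 65" 16:9 screen at typical distances.
diag_in = 65.0
width_in = diag_in * 16 / math.hypot(16, 9)        # ~56.7" wide

for pixels, label in ((1920, "1080p"), (3840, "4K")):
    pixel_in = width_in / pixels
    for dist_ft in (6, 10):
        arcmin = math.degrees(math.atan(pixel_in / (dist_ft * 12))) * 60
        print(f"{label} at {dist_ft} ft: {arcmin:.2f} arcmin per pixel")

# Against a ~1 arcminute eye limit: 1080p pixels are clearly resolvable
# at 6 ft (~1.4 arcmin) but marginal at 10 ft (~0.85), while 4K pixels
# sit below the limit at both distances.
```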

    So if home viewers can't resolve it and most theaters won't benefit, what is the value of higher-than-4K resolution? You have bigger file sizes, more cost in post, a more limited selection of suitable lenses, and a boost to your ego. What's the point? 

    Manufacturers need to work on color, ease of use, workflow, heat management, bit depth/tonality, and compression. Resolution is at the bottom of the list. The Alexa's already proven it.

    Guess you don't have a 4K set. The difference between 4K and HD is pretty clear on my 65" screen. HD is not good enough. End of story.

  13. Auto ISO would change the exposure as the scene changes, so that would be the source of your problem, IMO. If you don't want the exposure to change, you need to set the ISO to a fixed value.

    Also, if you are using autofocus and there is any movement of the focus point, the field of view can change slightly (focus breathing), and that may alter your overall exposure.

  14. I'm not sure what the point is. Image quality at the pixel level is usually pretty bad with those high density sensors. It would be similar to what the typical superzoom shoots.

    Pixel wars are back... It's pretty sad. Who will use 8K?? Don't say Hollywood, they can barely keep up with 4K with all the CGI!

    People delivering 4K. Having higher resolution on the master footage will allow optimal resolution at 4K with minimal artifacts.

    Basically the same principle as shooting 4K for HD delivery.

  15. What "prosumer cameras" were available when the 5D II, 7D and T2i were all the rage??? There were no cameras that gave you the "cinematic quality" of the Canons... Super 35?? Interchangeable lenses?? 24p??... There is a reason people in the video business have EF glass, and it was BEFORE the C300... how quick and easy we forget!!

    *Some* people shot with DSLRs, mostly hobbyist geeks who did it because they could. The vast majority of real video, however, was shot on camcorder-oriented cameras. How quick and easy we forget!

    I had an S10 before I upgraded my old Rebel to a T3i, a camera that hobbyists remember fondly for video. When I first got the T3i I was excited because I thought I would be able to shoot stills and video with the same camera... until I shot my first clips with the thing and realized that wasn't going to happen. They were terrible, and the camera was in no way even close to the older camcorder. Later on we had Magic Lantern, and while that was better, it was still well behind the camcorder.

    All that began to change when Sony released the RX10 and RX100. Those were the first cameras designed as real hybrids, in that they shot very good stills and video. Panasonic brought in the GH4 later, building on the shoulders of the Sony cameras and the Canon/Panasonic hacks.

  16. True, the BMPCC doesn't shoot stills, so I guess it doesn't count in that category. I'm not talking about quality either, just well known and common. Four years ago you'd always hear about the 5DII, T2i and 7D for video. Of course, by today's standards the internal recording is not that great compared to the newest cameras, but they were still legendary.

    Not really. Most high-end prosumer camcorders were much better as video recorders than the 5DII/T2i/7D. It is not as though that functionality couldn't be done at the time; it simply wasn't (and for the most part in Canon cameras, still isn't even today).

  17. The GH4 now is, in my opinion, a legendary hybrid camera. 5D Mark II, 7D, GH2, possibly BMPCC. A few may be moving up the ranks, like the A7S and 1DC. What else?

    A hybrid camera is one that does both video and stills at a high level of proficiency. The BMPCC is a video camera, not a hybrid. Cameras like the 5D2 and 7D were primarily stills cameras with a video function (that was not that good) tacked on. They excelled at stills, but not so much at video. In that sense they were no different from prosumer camcorders of the day, which excelled at video but not so much at stills. I would place the GH3 and 5D3 in the "not there yet" category, since they required hacks to get decent video.

    The first real hybrid was the GH4, but no one buys that camera for stills, so it is not quite at the level required to be a class leader. The same argument applies to the A7S, which is primarily a video camera but lacks the resolution to compete as a stills camera. The first camera to tick both boxes really well was the NX1, and it remains at the top of the class today. The only real competition for that spot is the A7RII, which is not available yet and is in any case quite a pricey piece of equipment.

  18. Interesting. I'm sure the CPU inside the NX1 is much more powerful, yet it is also limited to 8-bit. I'd like to know why. I think it's a software thing, or related to the H.265 hardware encoder.

    Probably the data bit width is defined by the internal register structure of the processor. It isn't really a question of how fast the processor is, but of the width of the data bus being read off the processor. Presumably Sony and Samsung use conventional 8-bit structures, while Panasonic uses 10 bits.

    If that is the case, it is not something that can be changed in firmware; it would require the next generation of processor with the appropriate hardware.
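    A quick way to see what is at stake in 8-bit versus 10-bit output, wherever the limit actually lives: quantize a smooth gradient at both depths. A toy Python/NumPy sketch, nothing Samsung- or Panasonic-specific:

```python
import numpy as np

# A smooth luminance ramp, quantized to 8-bit and 10-bit precision.
ramp = np.linspace(0.0, 1.0, 4096)
q8  = np.round(ramp * 255) / 255       # 256 code values per channel
q10 = np.round(ramp * 1023) / 1023     # 1024 code values per channel

print(len(np.unique(q8)), len(np.unique(q10)))   # 256 vs 1024 steps

# Four times as many code values means the bands in an 8-bit sky
# gradient are four times wider than the 10-bit ones, and heavy
# grading widens them further -- which is where 10-bit pulls ahead.
```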

  19. Well, if you had marked the frame with in and out points on the timeline and used Premiere's export function to render directly to a TIFF, you would have gotten a much better quality image. (The video preview image in any NLE always sacrifices quality in favor of quick rendering.)

    Could it be that your eyes are not very trained yet? Otherwise you would have noticed the bad color banding in the background, the poor color resolution in the skin tones, the sharpening artifacts on the talent's pores, and the green tint on the skin and parts of the eyes (which might come from energy-saving light bulbs)...

    So, basically all your technical mumbo jumbo was about a screenshot rather than an actual frame?

  20. This camera, like most Canon products, is clearly more than the sum of its spec sheet. The image is cinematic as hell. 

    You are responding to Pavlovian conditioning. Because so much material is shot on Canon cameras, you have come to believe that is what it "should" look like.

  21. The 1DC is three years old and it can handle the workload, along with being one of the two or three best sports cameras ever produced. They could have put that same tech in a 5DC if they wanted. The new C300 MKII does 4K with Dual Pixel AF. They clearly have the ability. They've chosen to keep it upmarket, catering to pros shooting commercials, TV news, reality TV, rental houses and so on, and by all accounts the Cinema line has been a huge success, filling a price point RED was never able to meet.

    I've said it before: top video specs clearly aren't driving the market. People on this forum are still a small niche in the overall market. Most of the people buying cameras are casual shooters looking for a "good" camera to shoot kids' birthdays and school plays.

    Those cameras have a whole lot of extra logic built in, though, in the form of multiple processors. They also have very large bodies that can act as heat sinks for the power used to drive those processors. What Canon can't readily do is produce 4K in a reasonably sized camera with a single processor. The XC10 uses one processor, but it is really shooting 2.5K-resolution material; the processor is probably not fast enough to deal with the oversampled sensor that would be required for full 4K resolution. Therein lies the problem for Canon.

    The rest of the 4K players are achieving their results with a single processor.
