  1. That's interesting, I didn't realize how destructive h.264 could be to color depth. Sorry to put you on the spot, but I'd like to see some links or examples that back that claim (I'll also google it). Thanks hmcindie!
  2. Ok, good question, so let's figure this out. Say you've resampled the 4K/UHD to HD, then cropped to 720, then resampled back up to HD. Scaling from 1280x720 to 1920x1080 needs to add 1 new pixel after every 2 pixels (both horizontal and vertical), which means every 3rd pixel is interpolated in post rather than "recorded" (but remember every pixel from a Bayer sensor is interpolated at some point; what is important is how accurate the interpolation method is). The good news is that if you use a decent scaling algorithm this will still be a 4:4:4 image, since you aren't simply copying pixels.
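To make that arithmetic concrete, here's a rough Python sketch of the "insert one new pixel after every two source pixels" model described in point 2. The neighbor-averaging is a crude stand-in of my own, not a real Lanczos/Mitchell kernel:

```python
def upscale_row_1_5x(row):
    """Upscale a row by 1.5x: copy two source pixels, then insert one
    interpolated pixel (here a simple average of its neighbors), so
    every 3rd output pixel is made up in post rather than recorded."""
    out = []
    for i in range(0, len(row) - 1, 2):
        a, b = row[i], row[i + 1]
        nxt = row[i + 2] if i + 2 < len(row) else b
        out.extend([a, b, (b + nxt) / 2])  # third pixel is interpolated
    return out

row = list(range(1280))              # one 720p-width luma row
print(len(upscale_row_1_5x(row)))    # 1280 * 1.5 = 1920
```

Any real scaler uses a proper filter kernel instead of a bare average, but the 2-copied-to-1-invented ratio is the same.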
  3. No, unfortunately when you crop an image you're straight up throwing away pixels and cutting down resolution. Once you crop below 3840x2160 you lose that exact 2:1 pixel ratio to 1920x1080, which negates any gains from resampling "4K"/UHD to HD. You should resample the UHD to HD with Lanczos first (you want that improved image, so make sure to render it in 10bit), then crop that down to 1280x720. That would get you perfect 720p footage with the ability to reframe. Then, if you're feeling lucky, you can try to up-res that back to HD with a Mitchell interpolation algorithm.
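The geometry of that workflow is easy to check with plain numbers. A sketch (arithmetic only; the actual resample and crop would happen in your NLE with a Lanczos filter):

```python
# UHD -> HD downsample is an exact 2:1 in each dimension (four sensor
# pixels averaged into every HD pixel), and a 1280x720 crop of the
# resulting HD frame leaves room to reframe the shot.
UHD, HD, CROP = (3840, 2160), (1920, 1080), (1280, 720)

scale_x = UHD[0] / HD[0]
scale_y = UHD[1] / HD[1]
print(scale_x, scale_y)          # 2.0 2.0 -- the clean ratio you lose by cropping first

# Maximum reframing travel for a 720p crop window inside the HD frame:
slack_x = HD[0] - CROP[0]        # 640 px horizontally
slack_y = HD[1] - CROP[1]        # 360 px vertically
print(slack_x, slack_y)
```

Cropping the UHD before the 2:1 downsample is what breaks `scale_x == scale_y == 2.0`, which is the whole point of doing the resample first.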
  4. Right, well a proper de-squeeze stretches the image horizontally, which "duplicates" pixels (1920x1080 to 3840x1080). However you could also just "squash" the image vertically (1920x540), which would "resample" the vertical pixels. The "squashing" technique should "technically" result in a 4:2:2 9bit image, that is, if you're averaging the pixels down rather than just deleting them (use a sinc-based filter like Lanczos).
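The "9bit from averaging" claim in point 4 comes down to simple arithmetic: the average of two 8-bit samples can land on a half step, and representing half steps takes one extra bit. A minimal sketch:

```python
def squash_vertical(col_pairs):
    """Average each vertical pair of 8-bit samples (e.g. 1080 -> 540
    lines). The sum of two 8-bit values spans 0..510, so keeping the
    average at half-step precision is effectively a 9-bit value."""
    return [(a + b) / 2 for a, b in col_pairs]

pairs = [(200, 201), (10, 255)]
print(squash_vertical(pairs))    # [200.5, 132.5] -- half steps need a 9th bit
```

If you just delete every other line instead of averaging, those half steps never appear and you stay at 8bit, which is why the filter choice matters.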
  5. Nah, an anamorphic de-squeeze actually adds pixels rather than replacing or resampling them, and since it's essentially duplicating the already-recorded pixels over to the right it would still have the same bit depth/color depth, color space, and dynamic range/latitude. Someone might make the argument of a perceived difference, but the numbers are the numbers and nothing "new" is created in the process. Actually, instead of simply duplicating, you could interpolate the "new" pixels with something like a Mitchell filter (samples the surrounding 16 "source pixels" to make one new pixel) which, "technically"…
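The duplication argument in point 5 is easy to demonstrate: stretching by copying each pixel twice introduces no tonal values that weren't already recorded. A toy sketch:

```python
def desqueeze_duplicate(row):
    """2x anamorphic de-squeeze by plain pixel duplication: every
    source value appears twice, so no new tonal values are created --
    bit depth, color space and dynamic range are untouched."""
    out = []
    for px in row:
        out.extend([px, px])     # copy each recorded pixel to the right
    return out

row = [10, 20, 30]
stretched = desqueeze_duplicate(row)
print(stretched)                       # [10, 10, 20, 20, 30, 30]
print(set(stretched) == set(row))      # True: same set of values, just wider
```

An interpolating filter (Mitchell, etc.) would instead blend neighbors into genuinely new in-between values, which is the alternative the post goes on to mention.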
  6. Ah, that's interesting tosvus. So if the GH4 is like the GH3 it will use a 12bit A/D converter, and if its video processing is anything like most cameras', the algorithm would take 12bit values from the sensor and then "round" those to 8bit values. The question then is what the "round up" threshold in Panasonic's algorithm is. To do what you suggest would require Panasonic engineers to use a variable threshold; (**WARNING TECHNICAL RANT AHEAD**) first they would have to break the pixels into 2x2 blocks, then round the 12bit values of (at least) every "top left pixel" to 10bit before they can se…
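For what a 12bit-to-8bit "round up threshold" could look like, here's one plausible round-to-nearest quantizer. This is a generic sketch, not Panasonic's actual pipeline:

```python
def quantize_12_to_8(v12, round_to_nearest=True):
    """Map a 12-bit sample (0..4095) to 8 bits (0..255) by dropping the
    low 4 bits. With round_to_nearest, values whose low nibble is at or
    above the half step (8 of 16) round up -- one possible threshold."""
    if round_to_nearest:
        v12 = min(v12 + 8, 4095)  # clamp so 4095 doesn't overflow past 255
    return v12 >> 4

print(quantize_12_to_8(2055))   # low nibble 7: rounds down -> 128
print(quantize_12_to_8(2056))   # low nibble 8: rounds up   -> 129
```

A "variable threshold" scheme like the one speculated about above would replace that fixed `+8` with a value that changes per pixel or per 2x2 block.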
  7. I like the analogy of bit depth being like rulers for this argument. I see a "10bit HD" source as one person having to produce a measurement in quarter-centimeters, while "8bit 4K to HD" is like 4 different people measuring in whole centimeters and then averaging their measurements. For example, let's say they have to measure a subject that is 3.25cm. The guy measuring in quarter-centimeters easily and accurately produces the measurement of 3 1/4cm. The four guys measuring in whole centimeters would each have to choose between 3cm or 4cm; if three of the four measured 3cm and one measured 4cm, the average works out to 3.25cm.
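The ruler analogy is just averaging arithmetic, so it can be checked in two lines:

```python
# Four whole-cm measurements of a 3.25 cm subject: three readers call
# it 3, one calls it 4. Averaging recovers the quarter-cm value that a
# single whole-cm ruler could never report -- the same way averaging
# four 8-bit pixels can land between 8-bit code values.
whole_cm_readings = [3, 3, 3, 4]
average = sum(whole_cm_readings) / len(whole_cm_readings)
print(average)   # 3.25 -- matches the quarter-cm ruler
```

The catch, of course, is that this only works when the individual "measurements" disagree in the right proportion, which is why real-world 8bit-4K-to-HD gains are statistical rather than guaranteed.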
  8. http://www.magiclantern.fm/forum/index.php?topic=2764.0 This is interesting (to me): Encoder feed is 1720x974 (550D) and 1904x1072 (5D3). For the 600D, sizes are 1728x972 (crop) and 1680x945 (non-crop). So the T3i was actually only as good as the T2i in its 3x crop mode?? WOW. I'm afraid to hear the numbers for the newer Rebels. And isn't that 1680x945 number very familiar (it's the 7D's HDMI-out image resolution)? I bet the 7D was just as bad as the T3i.
  9. What needs to be understood is that these "2K" DNGs actually have only about 1920x1080 pixels of "actual image"; the rest is black bars. So far each model seems to have a set width for the "actual image", but the image height can be changed by setting the camera's recording resolution in Canon's menus. Also, in the 5x and 10x modes the width can be increased but the height is roughly cut in half. Furthermore, the 1080p and 720p modes seem to use the same "actual image" size of about 1920x1080, but the DNGs also mimic the frame rate, so in 720p mode you get a usable 1080p60. However…
  10. They should try the 7D, it has the biggest/fastest buffer.
  11. Actually the 5D2 and 5D3 have the same buffer sizes, but the 5D3 supports UDMA 7 (1000x CF cards). They are currently performing active tests on the 5D3, 5D2, 6D, 60D and 600D.
  12. Just to clear some stuff up here: 1. Only about a 1920x1080 crop of each 2040x1428 file is the actual image; the rest is black bars. (This is on the 5Ds; the others are different: the 6D seems to be less, and the 60D in 5x mode is a "usable 2520x1080".) 2. DNG is not debayered. I don't know about CHDK's DNG converter that Alex has implemented into Magic Lantern, but I bet that, like Adobe CinemaDNG and DNG, it needs debayering in post. "Images from most modern digital cameras need to be debayered to be viewed..." 3. Magic Lantern's YUV422 video recorder/silent pic is different from their new 1…
  13. It's true that the difference between 4:2:2 and 4:2:0 is hard to see in the end result, but it helps keep everything together in post. However, you should see more of a difference in an uncompressed signal vs 24-50Mbps AVC (IPB). What was the recording bit-rate for this video? Though in most situations I would still prefer the smaller AVC file sizes.
  14. If this ever gets to 24fps it's obviously going to have bandwidth and storage issues; its only use would be for quick "b-roll"/"cut away" shots, and I doubt it would be as useful as a Blackmagic for any other situation. I wonder if this would work better on a camera with a bigger buffer, like the 7D or 1D Mk4.
  15. I agree, it looks like the G6 hardware has the potential to do better than the GH2 and maybe even the GH3 (mostly regarding moiré), but that doesn't mean Panasonic won't hold it back in some other way. I'm trying to stay reasonable and resist the urge to dream about 4:2:2 clean HDMI or a high bit-rate hack within a year of its release, but despite what I've seen in those auto/poorly-exposed YouTube "reviews" I'm hopeful that we will get decent quality with built-in focus peaking :D