Everything posted by cpc

  1. In the ASC mag interview Yedlin readily acknowledges there is nothing new in the video for technically minded people. It is deliberately narrated in language targeting non-technical people. His point is that decision makers who lack technical skills nonetheless impose technical decisions, and they buy into marketing talk easily. Also, in the video he explicitly acknowledges the benefits of high res capture for VFX and cropping.
  2. Testing camera profiles comes with the territory of being a videographer/cinematographer. The easiest (cheapest) way to get an idea of DR distribution is to shoot a chip chart at a few exposures with a constant increment and use these to build a response curve; here is an example I did years ago using a Canon DSLR. Also, it has been mentioned already, but log profiles do not introduce noise; they raise the blacks, making the existing noise visible. If you see more noise in the graded image, you are not exposing the profile properly. Most log profiles need to be rated slower than nominal, because the nominal ISO of the profile is chosen to maximize some notion of SNR, which is not necessarily the rating you want for a clean image after grading. In fact, cameras do this all the time. Take a Sony A7 series camera, for example, and look through the minimum ISOs of its video profiles. See how the minimum ISO moves around between 100 and 3200 depending on the profile? That's because the camera is rated differently depending on how the profile is supposed to redistribute tones.
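     A minimal sketch of the chip-chart approach described above, with made-up patch readings; the noise floor and clip values are illustrative, not from any specific camera:

```python
# Rough sketch: estimate a camera's response curve and usable range from a
# chip chart shot at constant 1-stop exposure increments. All numbers below
# are invented for illustration.
import numpy as np
import matplotlib.pyplot as plt

stops = np.arange(-6, 4)                          # exposure offsets in stops
code_values = np.array([18, 22, 30, 45, 70,       # mean 8-bit code value of a
                        110, 160, 205, 240, 254]) # mid-gray patch at each exposure

noise_floor = 20   # black level + noise, e.g. measured from a capped-lens frame
clip_level = 250   # near sensor/codec clipping

usable = (code_values > noise_floor) & (code_values < clip_level)
print("usable span: roughly", int(usable.sum()), "stops")

plt.plot(stops, code_values, marker="o")
plt.xlabel("exposure offset (stops)")
plt.ylabel("mean patch code value")
plt.title("Response curve from a chip chart sweep")
plt.show()
```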
  3. High detail, deep focus, and bright scenes will give the highest bit rate. Movement is irrelevant because the compression is intra-frame.
  4. The human eye does not perceive bits, so such a claim makes no sense by itself. The necessary precision depends on many factors. There are multiple experiments on the tonal resolution of the eye. Perhaps the most applicable are the ones done during the research and setup of the digital cinema specifications, which determined that for movie theater viewing conditions, for the typical tonal curves dialed in by film colorists during the DI process, and for the encoding gamma of the digital cinema spec (2.6 power gamma), you'd need around 11 bits to avoid noticeable posterization/banding artifacts after quantization (seen in gradients in the darks, by the way). This was rounded up to 12 bits in the actual specification. In an office viewing environment (a brighter environment) you can do fine with less precision. This is about delivery, though: precision for images meant for display. You usually need more tonal precision when you capture an image, because it has to withstand the abuse, the pushing and stretching, of post production. The required precision mostly depends on the transfer curve: you need relatively more for linear images and relatively less for logarithmic images. With today's dynamic ranges, 8 bits is absolutely not enough anymore for log curves (and not even remotely close for linear images). It usually does fine for delivery to consumer devices (some of these displays are 8-bit, some are 6-bit; likely moving to 10 bits in the future).
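     A back-of-the-envelope check of the banding argument, assuming a pure 2.6 power gamma and a ~1% just-noticeable step between adjacent code values (a common rule of thumb; the actual DCI work used a more careful threshold model):

```python
# For a 2.6 power gamma, the relative luminance step between adjacent codes c
# and c+1 is ((c+1)/c)**2.6 - 1, so banding risk concentrates in the darks.
GAMMA = 2.6
THRESHOLD = 0.01  # ~1% adjacent-code step taken as "visible" (assumed)

for bits in (8, 10, 12):
    peak = 2**bits - 1
    first_safe_code = None
    for c in range(1, peak):
        if ((c + 1) / c) ** GAMMA - 1 < THRESHOLD:
            first_safe_code = c
            break
    if first_safe_code is None:
        print(f"{bits}-bit: every adjacent-code step exceeds {THRESHOLD:.0%}")
    else:
        luminance = (first_safe_code / peak) ** GAMMA
        print(f"{bits}-bit: steps stay under {THRESHOLD:.0%} above "
              f"~{luminance:.2%} of peak luminance")
# Under these assumptions 10 bits still risks visible steps in dark gradients,
# while at 12 bits the risky region sits below ~0.1% of peak luminance.
```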
  5. The true power of raw shooting is that you don't need to fix anything in advance: you can develop, edit and color the raw at the same time in a fluid, flexible and creative post workflow. At least in Resolve you can. Proxies shouldn't be necessary. I don't see how a proxy workflow can be simpler; you add a step (possibly two steps, if you round trip).
  6. Well, this kind of makes my argument irrelevant then.
  7. Well, I am the source. I don't need official channels for this; I trust math better than any channel. The ~11 bits effectively used by the BMDFilm curve are orthogonal to how their pipeline constructs a 12-bit digital image. They are the result of the lack of a linear toe (which creates sparsities in the darks, i.e. values which never occur), as well as unused values at the top end of the range. I haven't looked at the specs recently, but 60 fps 4.6K lossless raw would be in excess of 700 MB/s, which I am not sure is even supported by CFast cards currently. edit: FWIW, I've written this, I guess it should be ok as far as credentials go: http://www.slimraw.com/
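     The data rate arithmetic behind the 700 MB/s figure, assuming a 4608x2592 photosite count, 12 bits per photosite, and an average lossless compression ratio of about 1.5:1 (the real ratio varies with image content):

```python
# Rough data-rate estimate for 4.6K 60 fps lossless raw under the
# assumptions stated above.
width, height, bits, fps = 4608, 2592, 12, 60

uncompressed = width * height * bits / 8 * fps   # bytes per second
lossless_est = uncompressed / 1.5                # assumed average lossless ratio

print(f"uncompressed: {uncompressed / 1e6:.0f} MB/s")     # ~1075 MB/s
print(f"~1.5:1 lossless: {lossless_est / 1e6:.0f} MB/s")  # ~717 MB/s
```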
  8. 12 bits is 2 bits more than 10 bits; that is 1.2 times the number of bits (and 4 times the values that can be encoded). These are nominal bits, though. You can think of this as the available coding space. What you put in there is another story. First, the tone curve might not use all the available coding space (for example, the BM 4.6K log curve is closer to 11 bits). Then compression comes in. A BM camera may reduce actual precision to as low as 8-7 bits in HFR (4:1 compression), depending on image complexity. You can read the paper on Canon Log (C-Log); Canon raw uses the same curve, sans debayer.
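     The coding-space arithmetic spelled out; note that the last line is only the average storage per sample implied by a 4:1 ratio, not a direct measure of how much precision the codec actually retains:

```python
bits_a, bits_b = 12, 10
print(bits_a / bits_b)             # 1.2x the number of bits
print(2**bits_a / 2**bits_b)       # 4x the encodable values

compression_ratio = 4              # e.g. a 4:1 HFR raw mode
print(bits_a / compression_ratio)  # 3 bits per sample on disk, on average
```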
  9. Don't get fooled by nominal bit depth. A 10-bit raw frame from the Varicam LT or the Canon C500 contains much more color info than a 12-bit 4:1 Blackmagic raw frame.
  10. It is actually the other way around. For the same compression algorithm, higher bit rates need more processing power than lower bit rates in both encoding and decoding.
  11. Canon cinema camera raw is logarithmic, like ARRIRAW and REDRAW (which are both 12-bit). 12 bits is plenty for log raw, and 10 bits is still pretty good with a log curve.
  12. 24-bit audio, timecode, genlock. Possibly other niceties. Also, the C300 II is priced at $12K, so it is $3K more, not $7K. At 1 Gbps their raw is less data than ProRes 4444; it is around 3:1 compressed. Withholding the XF-AVC codec until 2018 is actually kinda smart, because they buy some time to see what comes next from Pana/Sony and at the same time don't kill the C300 II immediately.
  13. No profile, flat or otherwise, will be able to properly capture what you will be able to recover in post, so you might as well use it to help with other things. A punchy profile like Standard will help with manual focusing. Couple this with digic peaking + magic zoom, and you'll be surprised at what focusing precision is possible with just the camera display. Also, learning how the monitoring profile relates to your specific post workflow pays good dividends, especially if you are in the habit of exposing consistently. And if you know which profile value ends up where in post, then all you need for exposure is the digital spotmeter (which is really the ultimate digital exposure tool anyway). For B&W, Monochrome might be useful in that you get to view the tones only, without color distractions, and this might help with lighting choices and judging separation. But it can also be a detriment, since color cues can help with spatial orientation and framing if there is movement in the shot. And the profile's mapping to grey values will not necessarily match the channel mix you will do in post, so it might be misleading.
  14. It shouldn't really matter even if you are on auto WB. You can always adjust one shot in Resolve and replicate the raw settings to all the others. And not bothering with WB while shooting is one of the advantages of shooting raw. In most cases leaving it at 5600K or 3200K should be fine for monitoring.
  15. I am sure you can link to a hundred examples like these. Also sure that you don't need me to produce links to 900 focus tracking shots for you. These are cool and all, but they are certainly not the norm. It is likely that you remember them because they stand out. Do we really need to establish that a huge part of focusing is entirely technical, with no creative intent whatsoever? You wouldn't use AF so that it gets creative on you; software has no intent of its own, it can't conceive anything (well, unless we dive into some metaphysical depths, which we shouldn't). You'd use it to do the chores, not the thinking.
  16. If you think you can track focus in a moving shot better than decent AF you are overestimating your focus pulling skills. In a few years even pro pullers won't come close to AF success rates. Focus is simply one of these things an algorithm can do vastly better than humans. Yes, there are some specific shots where doing it through AF would require more work, but how often do you actually rack or do fancy creative focus work? 90% of pulling is tracking objects.
  17. No idea. I haven't attempted upscaling to 4K. I don't think it makes much sense. People used to upscale 2.8K Alexa to 4K before Arri enabled open gate, and it looks fine, but 1080p might be a stretch.
  18. At 2K you'll be giving up the full frame benefits for a minuscule increase in pixel count (I deliberately don't say "resolution"; it is likely you won't get truly higher res at 1:1 2K, as 1:1 puts much higher demands on lens sharpness). It is almost certainly better to just upscale 1080p to 2K.
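     The pixel count difference in numbers, assuming DCI 2K (2048x1080) versus 1920x1080:

```python
hd = 1920 * 1080
dci_2k = 2048 * 1080
print(f"{dci_2k / hd - 1:.1%} more pixels")  # ~6.7%
```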
  19. I've been using Resolve exclusively for debayering since 2013. You can surely notice differences side by side, and Adobe Camera Raw is stills oriented (so it doesn't care much about processing speed and uses more sophisticated demosaicing), but Resolve is good enough and way faster (real time!). AE was never even a contender for me (and I use Lightroom for stills all the time). The convenience and smoothness of real-time raw editing and coloring is huge for me. I often abuse the raw processing controls for grading purposes, and I switch from editing to grading and back all the time as I edit, something one can't do (easily) with a proxy workflow or when baking to a log space in advance. This has been Resolve's greatest appeal.
  20. Re transcoding in Windows: you can easily make DNxHR/DNxHD proxies in Resolve itself, as long as it can import the original files. There is also the notion of "optimized media", which is really just internally managed proxies.
  21. So how's the rolling shutter in the higher resolutions?
  22. The idea is that a 3x3 binned image draws from 3*3*1920*1080 photosites (if it is true binning), whereas a 2.7K image (say, 2700*1520; I can't remember the exact resolution) would be based on roughly 4.5 times fewer photosites, which should affect SNR significantly. But yes, if you are consistently going to shoot at 2 stops lower ISO sensitivity, it should be fine.
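     The photosite arithmetic, using the guessed 2700x1520 frame size from the post above; the exact mode resolution is an assumption, so the ratio is approximate:

```python
import math

binned = 3 * 3 * 1920 * 1080   # photosites feeding a 3x3-binned 1080p frame
cropped = 2700 * 1520          # photosites in a 1:1 2.7K crop (assumed size)

ratio = binned / cropped
print(f"{ratio:.1f}x more photosites per output frame")        # ~4.5x
# In the shot-noise-limited case, SNR scales roughly with the square root of
# the number of photosites contributing to each output pixel region:
print(f"~{math.log2(math.sqrt(ratio)):.1f} stops better SNR")  # ~1.1 stops
```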
  23. Yeah, there is literally no correlation between nearby pixels left for the compression algorithm to exploit once dominant random values like noise and moire appear. So is the resolution gain from 1080p to 2.7K actually worth the (probable) loss of sensitivity and wide-angle ability? I'd figure binned full frame 1080p is still the best bang for the buck? Maybe a direct comparison of noise levels will shed light on this.
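     A quick illustration of why noise kills compression; zlib stands in here for a real video codec, compressing a smooth gradient versus the same gradient with random noise added:

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
gradient = np.tile(np.arange(256, dtype=np.uint8), (256, 1))   # smooth ramp
noisy = np.clip(gradient + rng.normal(0, 20, gradient.shape),
                0, 255).astype(np.uint8)                       # ramp + noise

for name, img in (("smooth gradient", gradient), ("gradient + noise", noisy)):
    raw = img.tobytes()
    packed = zlib.compress(raw, 9)
    # The noisy image compresses far worse because neighbouring pixels are no
    # longer predictable from each other.
    print(f"{name}: {len(raw) / len(packed):.1f}:1")
```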
  24. It depends on detail and range. ISO is only one of the factors that increase detail (more noise). If you shoot a highly detailed wide shot, you will get significantly lower compression ratios than with, say, a close-up with lots of defocus in the frame. Also, darker shots will compress better (less range).
  25. If you are going to scan only, you should probably be shooting negative anyway. Ektar and Portra are excellent for stills, and you have the Vision3 series for motion pictures. Ektar in particular delivers chrome-like saturation with very fine grain. Scanning chromes is trickier in a sense and requires more from the scanner. Still, it is pretty cool to have Ektachrome back so soon after its demise; it makes beautiful, punchy pictures.