Posts posted by cpc

  1. In the ASC mag interview Yedlin readily acknowledges there is nothing new in the video (for technically minded people). It is deliberately narrated in language targeting non-technical people. His point is that decision makers often lack technical skills yet impose technical decisions, buying into marketing talk too easily. Also, in the video he explicitly acknowledges the benefits of high res capture for VFX and cropping.

  2. Testing camera profiles comes with the territory of being a videographer/cinematographer. The easiest (cheapest) way to get an idea of DR distribution is to shoot a chip chart at a few exposures with a constant increment and use these to make a DR curve (a rough sketch of the idea is at the end of this post). Here is an example I did years ago using a Canon DSLR:

    chartDR.png

     

    Also, it has been mentioned already, but log profiles do not introduce noise. They raise the blacks, making the noise visible. If you see more noise in the graded image, you are not exposing the profile properly. Most log profiles need to be rated slower than nominal (given more exposure) because the nominal ISO of the profile is chosen to maximize some notion of SNR, which is not necessarily the rating you want for a clean image (after grading). In fact, cameras do it all the time. Take a Sony A7 series camera, for example, and look through the minimum ISOs of its video profiles. See how the minimum ISO moves around between 100 and 3200 depending on the profile? That's because the camera is rated differently depending on how the profile is supposed to redistribute tones.
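
    Here is a minimal sketch (Python) of the curve-building step I mean above; the patch readings are made up for illustration:

    import numpy as np
    import matplotlib.pyplot as plt

    # Exposure offsets in stops relative to "normal" (constant 1-stop increments).
    stops = np.arange(-6, 5)

    # Mean 8-bit code value of the same grey patch at each exposure.
    # In practice: average a crop of the patch from each frame. Values are made up.
    code = np.array([3, 4, 6, 11, 21, 40, 72, 118, 176, 232, 255])

    plt.plot(stops, code, marker="o")
    plt.xlabel("exposure offset (stops)")
    plt.ylabel("mean code value")
    plt.title("Response curve from a grey patch")
    plt.grid(True)
    plt.show()

    # Usable DR is roughly the span in stops between where the curve clips at the
    # top and where it flattens into the noise floor at the bottom.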

  3. 4 minutes ago, Jimmy said:

    Just ran a full test.

    Filled the card with 3072x1308 24fps... ISO100… not much movement.

    Data rate is 82MB/s... Stayed in the green throughout.

    Will test some more difficult shots later. See if it crashes.

    High detail, deep focus and brightness will give the highest rate. Movement is irrelevant because the compression is intra-frame.

  4. 36 minutes ago, mercer said:

    I just read that the human eye cannot perceive anything over 8bit... is that true? If so, I still see the benefit of it for professional colorists, but for the novice, a light grade in 8bit should be more than enough. I also understand why Panasonic gave 10bit because they needed that internal Log to compete with Sony. Since Panny's implementation of VLog with the GH4 was disastrous for anybody who only used it internally, 10bit was necessary. And cLog was designed with 8bit color in mind. I would think the 4:2:2 aspect is more important than the 10bit?

    The human eye does not perceive bits, so such a claim makes no sense by itself. The necessary precision depends on many factors. There are multiple experiments on the tonal resolution of the eye. Perhaps the most applicable are the ones done during the research behind the digital cinema specifications. These determined that for movie theater viewing conditions, the typical tonal curves dialed in by film colorists during the DI process, and the encoding gamma of the digital cinema spec (a 2.6 power gamma), you need around 11 bits to avoid noticeable posterization/banding artifacts after quantization (seen in gradients in the darks, by the way). This was rounded up to 12 bits in the actual specification. In an office viewing environment (a brighter surround) you can do fine with less precision.

    This is about delivery though -- images meant for display. You usually need more tonal precision when you capture an image, because it has to withstand the abuse -- pushing and stretching -- of post production. The required precision mostly depends on the transfer curve -- you need relatively more for linear images and relatively less for logarithmic ones. With today's DRs, 8 bits is absolutely not enough anymore for log curves (and not even remotely close for linear images). It usually does fine for delivery to consumer devices (some of these displays are 8-bit, some are 6-bit; likely moving to 10 bits in the future).
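
    To get a feel for why delivery precision matters, here is a small illustrative sketch (Python): it encodes a smooth luminance ramp with a 2.6 power gamma, quantizes it at different bit depths and looks at the step sizes in the darks, where banding shows up first. The ramp and the metric are just illustrative, not the actual DCI test procedure:

    import numpy as np

    ramp = np.linspace(1e-4, 1.0, 100_000)   # linear-light luminance ramp
    encoded = ramp ** (1 / 2.6)              # 2.6 power-law encoding

    for bits in (8, 10, 12):
        levels = 2 ** bits - 1
        decoded = (np.round(encoded * levels) / levels) ** 2.6
        # Largest relative jump between adjacent reconstructed luminances in the
        # darker half of the ramp -- a rough proxy for visible banding.
        darks = np.unique(decoded[: len(decoded) // 2])
        rel_step = np.diff(darks) / darks[1:]
        print(f"{bits}-bit: max relative step in darks ~ {rel_step.max():.1%}")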

  5. 35 minutes ago, mercer said:

    @fuzzynormal you can process the color from the outset, but you don't have to. For CDNGs, once you bring the Raw files into Resolve, the only real necessity is to turn them into ProRes, unless you only use Resolve from ingest to export. 

    It's probably smart to also fix your WB, if needed, before rendering as ProRes.

     Then open the ProRes files in your NLE, edit and color as you like. If you want more latitude you can send the project back to Resolve for color and then send it back to your NLE for delivery.

    I usually just use the ProRes files in my NLE for edit, color and delivery and never go back to Resolve.

    Some people use different Log profiles. For the 5D3 I use an app called MLRawViewer or a newer one called Footage, with which I can turn the MLV files directly into ProRes files as C-Log, sLog, Log C, etc. I highly doubt they are the exact curves but Log to Rec709 LUTs work well with them. It's a fairly simple process but it is time consuming. There are other processes that make ML even faster (MLVFS). With BlackMagic you can just ingest, edit, color, title, deliver using the native CDNGs.

    With the C200, it seems like it will be possible to edit natively but either way it will be a lot simpler with the proxy workflow. 

    The true power of raw shooting is that you don't need to fix anything in advance and you can raw develop, edit and color at the same time in a fluid, flexible and creative post workflow. At least in Resolve you can. Proxies shouldn't be necessary. I don't see how a proxy workflow can be simpler -- you add a step (possibly two steps, if you round trip). :)

  6. 1 hour ago, Ehetyz said:

    Just went and tested this out, 4,6k60p is available in lossless RAW as well as 3:1 and 4:1. The RAW/Prores compression ratios are completely separate from the frame rate/resolution.

    Well, this kind of makes my argument irrelevant then. :)

  7. 1 hour ago, andrgl said:

    You have a source for your blackmagic claims? None of the information you've posted is available from official channels.

    It's funny because your post omits the other half of the usual internet claim of 2 x 11-bit values conformed to 12-bit log. (15 stops has to come from somewhere.) 60fps on the 4.6k can be captured lossless btw.

    Well, I am the source. I don't need official channels for this, I trust math better than any channel. :) 

    The ~11 bits effectively used by the BMDFilm curve are orthogonal to how their pipeline constructs a 12-bit digital image. They are the result of the lack of a linear toe (which creates sparse regions in the darks, i.e. values which never occur), as well as unused values at the top end of the range.

    I haven't looked at the specs recently, but 60 fps 4.6K lossless raw would be in excess of 700 MB/s (a rough calculation is at the end of this post), which I am not sure is even supported by CFast cards currently.

    edit: FWIW, I've written this, I guess it should be ok as far as credentials go: http://www.slimraw.com/
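
    Rough numbers behind the 700 MB/s figure, assuming a roughly 4608x2592 photosite grid, 12-bit raw and typical-ish lossless ratios (the exact dimensions and ratios are assumptions):

    # Assumed photosite grid and bit depth for the 4.6K sensor.
    width, height, bits, fps = 4608, 2592, 12, 60

    uncompressed_mb_s = width * height * bits * fps / 8 / 1e6
    print(f"uncompressed: ~{uncompressed_mb_s:.0f} MB/s")

    # Lossless raw ratios vary with content; these are just plausible values.
    for ratio in (1.3, 1.5):
        print(f"~{ratio}:1 lossless: ~{uncompressed_mb_s / ratio:.0f} MB/s")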

  8. 1 hour ago, andrgl said:

    How is that possible? Any 12-bit capture will have 4 times the amount of bits than a 10-bit file.

    Tried to find a white sheet on Canon RAW but was unable to. Care to share one?

    12 bits is 2 bits more than 10 bits, that is 1.2 times the number of bits (and 4 times the values that can be encoded). These are nominal bits though. You can think of this as the available coding space. What you put in there is another story. First, the tone curve might not use all the available coding space (for example, the BM 4.6K log curve is closer to 11 bits). Then compression comes in. A BM camera may reduce actual precision to as low as 7-8 bits in HFR (4:1 compression), depending on image complexity.

    You can read the white paper on Canon Log (C-Log); Canon raw uses the same curve, sans debayer.
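
    The bits-vs-values arithmetic, spelled out:

    # Coding space: bits vs. representable values.
    for bits in (10, 12):
        print(f"{bits}-bit: {2 ** bits} code values")

    print(f"bits ratio:   {12 / 10:.1f}x")
    print(f"values ratio: {2 ** 12 / 2 ** 10:.0f}x")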

  9. 7 hours ago, andrgl said:

    12bit RAW is only 30p. :< Sooooooo weak! Damn, basically a perfect camera except for the gimped output.

    RED and Blackmagic still own the throne for true internal 60+fps RAW.

    Hopefully the C200 lights a fire under the asses of Sony and Panasonic, would be amazing to get 12/14 bit RAW internal recording on their consumer grade shit.

    Don't get fooled by nominal bit depth. A 10-bit raw frame from the Varicam LT or the Canon C500 contains much more color info than a 12-bit 4:1 Blackmagic raw frame.

  10. 2 hours ago, tugela said:

    No, I don't think it has anything to do with withholding stuff. I have just noticed from their press release that the camera has a DV6 processor - that is a new one. It probably has a different encoder than the earlier DV5 processors, so hardware encoding for 4K will have different bit rates (as well as higher frame rates). There may not be any higher bit rates as a result. 

    The high bit rates with DV5 were likely due to computational restraints to keep the processor in its thermal envelope. A lower bit rate means more computation and implies that the DV6 is more thermally efficient, which is very good news for Canonites in the greater scheme of things. The implication of the new processor that people are perhaps not picking up on is that there will be a corresponding stills processor, the Digic 8, which will be the sibling of the DV6. It means that hardware encoding of 4K video might finally arrive in Canon consumer cameras (as well as the prosumer ones which currently are forced to use mjpeg as a codec).

    It is actually the other way around. For the same compression algorithm, higher bit rates need more processing power than lower bit rates in both encoding and decoding. :)

  11. 5 minutes ago, Jimmy said:

    And the c300 mk ii

    For now, it is safe due to the codec... But once the c200 gets the full range of codecs... What next for the c300 mk ii... Gotta be worth $7k tops?

    24-bit audio, time code, genlock. Possibly other niceties.

    Also, c300II is priced at $12k, so it is $3K more, not 7k.

    6 minutes ago, tugela said:

    Apparently you get RAW or 8 bit 100/150 mbps H.264 (30/60p), those being the only two options. It makes some sense since RAW has no hardware encoding while H.264 is limited by the thermal envelope of the Digic processor. But, IIRC the other Cx00 cameras can shoot H.264 at higher bit rates, suggesting that the camera the pre-reviewers have is not finished.

    At the price they are charging unless you really need RAW footage, you would be better off with an alternative camera unless something changes before release.

    At 1 Gbps their raw is less data than ProRes 4444. It is around 3:1 compressed. Withholding the XF-AVC codec until 2018 is actually kinda smart because they buy some time to see what comes next from Pana/Sony, and at the same time don't kill the C300 II immediately.
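
    Rough numbers behind the "around 3:1" estimate, assuming a 4096x2160 bayer frame at 12 bits (both figures are assumptions on my part):

    # Assumed frame geometry and bit depth for the raw stream.
    width, height, bits = 4096, 2160, 12
    stream_gbps = 1.0   # roughly the quoted raw data rate

    for fps in (24, 30):
        uncompressed_gbps = width * height * bits * fps / 1e9
        print(f"{fps}p: ~{uncompressed_gbps:.2f} Gbps uncompressed "
              f"-> ~{uncompressed_gbps / stream_gbps:.1f}:1")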

  12. 5 hours ago, mercer said:

    Yeah that's basically how I monitor, unless it's a really cloudy day, I'll bump it a little. What monitoring Profile do you recommend. I'm so used to the Prolost Flat look when shooting, so I have it set to that, but I am open to other ideas if it will yield better results in the long run. For instance, I am working on testing some B&W Raw, do you recommend setting the Picture Profile to Monochrome?

    No profile -- flat or otherwise -- will be able to properly capture what you will be able to recover in post, so you might as well use it to help with other things. A punchy profile like Standard will help with manual focusing. Couple this with DIGIC peaking + magic zoom, and you'll be surprised at what focusing precision is possible with just the camera display. Also, learning how the monitoring profile relates to your specific post workflow pays good dividends, especially if you are in the habit of exposing consistently. And if you know which profile value ends where in post, then all you need for exposure is the digital spotmeter (which is really the ultimate digital exposure tool anyway).

    For B&W, Monochrome might be useful in that you get to view the tones only, without color distractions, and this might help with lighting choices and judging separation. But it can also be a detriment, since color cues help with spatial orientation and framing if there is movement in the shot. And the profile's mapping to grey values will not necessarily match the channel mix you will do in post, so it might be misleading.

  13. It shouldn't really matter even if you are on auto WB. You can always adjust one shot in Resolve and replicate the raw settings to all the others. And not bothering with WB while shooting is one of the advantages of shooting raw. In most cases leaving it at 5600K or 3200K should be fine for monitoring.

  14. 58 minutes ago, tupp said:

    ...

    I am sure you can link to a hundred examples like these. Also sure that you don't need me to produce links to 900 focus tracking shots for you. :)

    These are cool and all, but they are certainly not the norm. It is likely that you remember them because they stand out. Do we really need to establish that a huge part of focusing is entirely technical, with no creative intent whatsoever? You wouldn't use AF so that it gets creative on you; software has no intent of its own, it can't conceive anything (well, unless we dive into some metaphysical depths, which we shouldn't). You'd use it to do the chores, not the thinking.

     

  15. If you think you can track focus in a moving shot better than decent AF, you are overestimating your focus pulling skills. In a few years even pro pullers won't come close to AF success rates. Focus is simply one of those things an algorithm can do vastly better than humans. Yes, there are some specific shots where doing it through AF would require more work, but how often do you actually rack or do fancy creative focus work? 90% of pulling is tracking objects.

  16. On 5/2/2017 at 4:09 PM, Arikhan said:

    @cpc

    Just being curious: How does the 1080p ML footage look when upscaled to 4K? Is the look better when

    A. You upscale to 4K in post or
    B. When you display the 1080p-ML-footage on a 4K TV (as most 4K TVs do a very good upscale of FullHD footage) ?

    No idea. I haven't attempted upscaling to 4K. I don't think it makes much sense. People used to upscale 2.8K Alexa to 4K before Arri enabled open gate, and it looks fine, but 1080p might be a stretch.

  17. 8 hours ago, mercer said:

    Has anybody tried a modest bump to 2.5k or 2.7k. I would love to be able to get 2.5k 2:39 working one day. Hell, even cinema 2K at 48-60p would get me off.

    At 2K you'll be giving up the full frame benefits for a minuscule increase in pixel count (I deliberately don't say "resolution"; it is likely you won't get truly higher res at 1:1 2K, as 1:1 puts much higher requirements on lens sharpness). It is almost certainly better to just upscale 1080p to 2K.

  18. 1 hour ago, squig said:

    Oh and After Effects is the way to go for max quality debayering, but Resolve isn't bad (hearing that it may be better in 14, but haven't tested it yet). If you go with AE use CS6 or CC 2014 for multiprocessor rendering. AE renders are way slower than Resolve. You can use the same Resolve workflow as you would with the Blackmagics provided you use MLVFS.

    I've been using Resolve exclusively for debayering since 2013. You can surely notice differences side-by-side and Adobe Camera Raw is stills oriented (so doesn't care much about processing speed and uses more sophisticated demosaicing), but Resolve is good enough and way faster (real time!). AE was never even a contender for me (and I use Lightroom for stills all the time). The convenience and smoothness of real time raw editing and coloring is huge for me. I often abuse the raw processing controls for grading purposes and I switch from editing to grading and back all the time as I edit, something one can't do (easily) with a proxy workflow or when baking to a log space in advance. This has been Resolve's greatest appeal.

  19. 6 minutes ago, squig said:

    Well I've got an ultra-wide lens so I can still get a 20mm equivalent FOV so that's not an issue. Loss of sensitivity? I don't see any. Noise wise for an equal FOV and DOF to full frame you're shooting at f/2.8 100 ISO instead of f/5.6 400 ISO, so I don't see how it could be noisier. You are zoomed in on the noise though, but to my eyes it's the lesser of two evils. It's definitely worth it for the extra detail, it now has the resolution of an Alexa!

    The idea is that a 3x3 binned image draws from 3*3*1920*1080 photosites (if it is true binning), whereas a 2.7K image (say, 2700*1520, can't remember the exact resolution) would be based on roughly 4.5 times fewer photosites, which should affect SNR significantly. But yes, if you are consistently going to shoot at a 2 stops lower ISO rating, it should be fine.
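
    The arithmetic with the numbers above (the 2.7K figure is approximate, as noted):

    from math import log2

    binned = 3 * 3 * 1920 * 1080   # photosites behind a 3x3-binned 1080p frame
    crop = 2700 * 1520             # photosites behind a ~2.7K 1:1 crop

    ratio = binned / crop
    print(f"photosite ratio: ~{ratio:.1f}x")
    # Matching the total light gathered per frame in the crop would take roughly
    # log2(ratio) stops more exposure, i.e. a correspondingly lower ISO rating.
    print(f"~{log2(ratio):.1f} stops")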

  20. 10 hours ago, hyalinejim said:

    What I'm seeing is stuff at the pixel level really affects compression. So ISO noise is one, as we already know. The bitrate goes absolutely through the roof when I film the moire pattern on my computer screen. I mean, it even kills crop mode 1920x1080!

    Yeah, there is essentially no correlation left between nearby pixels for the compression algorithm to exploit once random-looking content like noise and moiré dominates the frame.

     

    10 hours ago, squig said:

    I tried my Lexar 1066x card, it's a little faster than the Toshiba. 3K 2.39:1 is repeatable at 100 ISO. I had to go back to 2.7-2.8K at 1600 ISO. That works for me, I'll shoot 3K 100 ISO for wide outdoor establishing shots, 2.7K for everything else, and render to 2K scope.

    So is the resolution gain from 1080p to 2.7K actually worth the (probable) loss of sensitivity and wide angle abilities? I'd figure binned full frame 1080p is still the best bang for buck? Maybe a direct comparison of noise levels will shed light on this.

  21. 4 hours ago, hyalinejim said:

    OK, I'm pretty sure that the variable compression is due to ISO. In crop_rec 3K (3072 x 1286 ) I can get continuous recording at ISO 100, but if I whack the ISO up to 6400 I get eleven seconds. Can you check if it's the same for you?

    It depends on detail and range. ISO is only one of the factors that increase detail (noise is detail as far as the encoder is concerned). If you shoot a highly detailed wide shot you will get significantly less compression than in the case of, say, a close-up with lots of defocus in the frame. Also, darker shots will compress better (less range).
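
    A quick way to see both effects: losslessly compress a clean gradient, a noisy version of it, and a darker version of the noisy one. This uses zlib (Python), not the camera's codec, but the pixel-correlation argument is the same:

    import zlib
    import numpy as np

    rng = np.random.default_rng(0)
    h, w = 1080, 1920

    smooth = np.tile(np.linspace(0, 4095, w), (h, 1)).astype(np.uint16)  # clean gradient
    noisy = np.clip(smooth + rng.normal(0, 200, smooth.shape), 0, 4095).astype(np.uint16)
    dark = noisy // 16   # crude stand-in for the same noisy shot ~4 stops darker

    for name, img in (("smooth", smooth), ("noisy", noisy), ("dark", dark)):
        size = len(zlib.compress(img.tobytes(), 6))
        print(f"{name:6s}: compressed to {size / img.nbytes:.2f} of original size")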

  22. If you are going to scan only, you should probably be shooting negative anyway. Ektar and Portra are excellent for stills, and you have the Vision3 series for motion pictures. Ektar in particular delivers chrome-like saturation with very fine grain. Scanning chromes is trickier in a sense and requires more from the scanner. Still, it's pretty cool to have Ektachrome back so soon after its demise; it makes beautiful punchy pictures.
