
CaptainHook

Members
  • Content Count

    69
  • Joined

  • Last visited

About CaptainHook

  • Rank
    Member

Profile Information

  • Location
    Melbourne, Australia

Contact Methods

  • Facebook
    https://www.facebook.com/CaptainH00K
  • Twitter
    http://twitter.com/captainh00k
  • Instagram
    http://instagram.com/hooktastico/

  1. I'm sorry, but where is "Blackmagic" implying that? If it's me, my only intention is to help some understand how to get the best out of the DNG workflow in Resolve with Sigma FP files. I'm here on my own time, including the weekend, to help people, and if my intention were otherwise it would be better for me to let people stumble instead of helping, and not waste my time. We work with companies like Sigma all the time, and I personally think the Sigma FP is a great little camera and have told the product manager so a couple of times in person. I will leave you to it. People know where to find me if they need help.
  2. It's to be expected that a gamut transform won't show colours outside of that gamut; that's sort of the point of it. How you map colours from outside the target gamut into it is another thing entirely. I'd be wary of what the CIE scopes show though. The timeline space sets what space the data is transformed into for processing through nodes etc. in RCM. Personally I would leave this as 709/2.4 because the tools were originally designed to work that way and I'm used to how they respond, but I also don't really use RCM. I would probably not use the CIE scopes for actual grading either and just use the vectorscope, otherwise it may be confusing. Yes, you need to convert BMDFilm Gen 1 into a display/output space, but unless you have information from Sigma on how to do it you won't be able to do it in a "technically correct" way in Resolve and will have to manually grade it.
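To make the "colours outside of that gamut" point concrete, here is a minimal sketch of what a gamut conversion actually does to an out-of-gamut colour. It uses the standard linear Rec.2020-to-Rec.709 3x3 matrix (coefficients from ITU-R BT.2087); the point is that a saturated source colour simply lands outside [0, 1] rather than being "handled" by the matrix itself.

```python
# Linear Rec.2020 RGB -> linear Rec.709 RGB via the standard 3x3
# matrix (ITU-R BT.2087). Out-of-gamut colours come out with
# negative components; nothing in the matrix step clips them.
M_2020_TO_709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def apply_matrix(m, rgb):
    return [sum(m[r][c] * rgb[c] for c in range(3)) for r in range(3)]

pure_2020_green = [0.0, 1.0, 0.0]   # fully saturated Rec.2020 green
rec709 = apply_matrix(M_2020_TO_709, pure_2020_green)
print(rec709)  # R and B are negative: outside Rec.709, but recoverable
```

Whether those negative values are clamped, compressed, or left alone is a separate gamut-mapping decision made after (or instead of) this plain matrix scale.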
  3. If you want "all the colour" with no transform, then with DNGs you can select "Blackmagic Design Film" for gamut/colour space (Colour Science Version/Gen 1), and as mentioned that is sensor space with no transform, only what you select for gamma. So it's the colour as the camera has captured it (I say camera rather than sensor since there are no doubt corrections applied before encoding the raw data). As I also mentioned, though, this is not suitable for display, and the expectation is that you will transform it/grade it for monitoring purposes. I believe Digital Bolex recommended this workflow and then provided a LUT to transform from sensor to 709 for their camera back when they were still around. You will still be able to "view" the colour unmodified from the camera, though. I'm just stressing (for the benefit of others) that the colour from a digital camera in its native sensor space is not intended to be displayed this way, so you can't (shouldn't) really judge "hues, saturation", etc. Some manufacturers like us (and many others) design a "working space" that is generally larger than 709/P3 and ideally a better starting place to manually grade from than sensor space, but also not intended for final display. AFAIK Sigma has not done that, so your options are either native sensor space or another documented colour space. I probably wouldn't say "scaling" myself, but yes, Resolve will do a standard transform from (sensor to) XYZ to Rec.709 or P3. It's not clipped on the output of this step though, so you can still recover data. This is what you would expect to happen if selecting 709 or P3 in the RAW tab. I'm assuming you are compensating for the change in 709/P3 on the display side here, but whether or not you'll see the differences you're expecting will be influenced all the way from the sensor response and what they do in camera, right to the end on the display side and how well you can display P3 versus 709.
I work in the camera team (in a different country to where the Resolve team is based), so my knowledge of the inner workings has come via discussions with them, and I don't know 100% either as the code isn't visible to me. But we do develop Blackmagic RAW and the SDK in the camera team (Resolve uses the SDK almost the same as any 3rd party app does), and we share with them the camera colour science information we develop so that they can implement it into their pipelines (DNG and CST/RCM/etc). They don't need to do as much of that now for our cameras since it's handled in the Blackmagic RAW SDK, which we handle from the camera team side. As for DNG processing, they for the most part follow the Adobe DNG spec AFAIK: https://www.adobe.com/content/dam/acom/en/products/photoshop/pdfs/dng_spec_1.4.0.0.pdf You may also be interested in looking at the DNG SDK if you can read code, as that will give you an even clearer idea of how DNGs are/should be interpreted: https://www.adobe.com/support/downloads/dng/dng_sdk.html
  4. Resolve uses the matrices in the DNG. "Sensor space" isn't really defined like a normal gamut with 3 primaries and a white point. It's usually described by spectral sensitivities, and if you were to attempt to plot the response on an x/y (or u/v) plot it wouldn't be straight lines connecting 3 points. You can get an idea by calculating the primaries from the matrices in the DNG, since they describe XYZ to sensor space transforms, and then plotting that. You would want to do it for both illuminants in the DNG metadata and take both into account. I haven't read every post as it's hard to keep up, but I'm curious as to why you want to know? Depends on how you want to work and what suits, but this is exactly the intention of either Resolve's Colour Management option (rather than YRGB) or using ACES.
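The "calculate the primaries from the matrices" step above can be sketched as follows. The DNG ColorMatrix tag describes XYZ-to-camera, so you invert it to get camera-to-XYZ, then read the chromaticity of each camera channel off the columns. The matrix below is a made-up placeholder, not any real camera's metadata; substitute the ColorMatrix1/ColorMatrix2 values from your own file (and, as the post says, do it for both illuminants).

```python
# Estimating rough "sensor primaries" from a DNG-style XYZ->camera
# matrix. XYZ_TO_CAM here is invented purely for illustration.
XYZ_TO_CAM = [
    [ 0.9, -0.2, -0.1],
    [-0.4,  1.3,  0.1],
    [ 0.0, -0.3,  1.0],
]

def inverse_3x3(m):
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [
        [(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
        [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
        [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det],
    ]

CAM_TO_XYZ = inverse_3x3(XYZ_TO_CAM)

# Each column of CAM_TO_XYZ is the XYZ of one camera channel;
# its chromaticity is x = X/(X+Y+Z), y = Y/(X+Y+Z).
for ch, name in enumerate("RGB"):
    X, Y, Z = (CAM_TO_XYZ[row][ch] for row in range(3))
    s = X + Y + Z
    print(name, round(X/s, 4), round(Y/s, 4))
```

As the post notes, the straight lines you would draw between these three points are only a rough idea, since real spectral sensitivities don't form a triangle.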
  5. Blackmagic Design Film for colour space/gamut (Colour Science Gen 1) in a DNG workflow is 'sensor space'. So it's really how the sensor sees colour. This is not meant to be presentable on a monitor; you're meant to transform to a display-ready space, but to do so requires the proper transforms for that specific camera/sensor (so you can't use the CST plugin for non-BM cameras, for example), and the transform varies depending on the white balance setting you choose. If you're happy to grade from sensor space, that's fine, but I would also join in recommending Rec.709 or an ACES workflow in this case, since neither approach will clip any data and both will do the heavy lifting of transforming the sensor response no matter the WB setting.
  6. This is based on common colour science used by Adobe and many others. If you look at Adobe presets in Camera Raw, or the Resolve WB presets, you will notice the same thing. Daylight is typically green compared to a black body radiator (tungsten), and the daylight locus is used for temperatures above 4000K. Here's the daylight locus compared to the Planckian (tungsten/black body radiator). Notice it's "greener" and therefore would need a tint adjustment towards magenta to compensate. More info here: https://en.wikipedia.org/wiki/Standard_illuminant https://en.wikipedia.org/wiki/Planckian_locus
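If you want to plot the daylight locus the post describes, the CIE formula is straightforward. This is the standard CIE daylight-locus chromaticity as a function of correlated colour temperature; the x coefficients below are valid for roughly 4000K to 7000K (a different set applies above 7000K). It's the same formula behind the D-series illuminants like D65.

```python
# CIE daylight-locus chromaticity from correlated colour temperature.
# Valid for ~4000K-7000K with these x coefficients.
def daylight_xy(T):
    x = (0.244063 + 0.09911e3 / T
         + 2.9678e6 / T**2 - 4.6070e9 / T**3)
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

# D65 nominally sits near T = 6504K -> approximately (0.3127, 0.3291)
print(daylight_xy(6504))
```

Sampling this over a range of temperatures and plotting it against the Planckian locus shows the green offset (and hence the magenta tint compensation) the post is talking about.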
  7. You'll also need Resolve 16.2+ (released today) to open those files in Resolve.
  8. BMDFilm Gen 1 doesn't touch gamut, so for DNGs what you get is the sensor's native response to colour, which is not suitable for a typical display. You can't really map a sensor's spectral response to just 3 chromaticity points like a typical gamut and show it on a CIE plot either, but people do it anyway to give a rough idea. Most sensors will definitely have a response beyond visible colours, so this is expected. I don't work in the Resolve team, but the CIE scope in Resolve will be plotting our Blackmagic Cinema Camera spectral response as best it can, because it assumes that if you're using that colour space you're using our camera. It doesn't try to work out a gamut for whatever camera clip you're currently viewing. So the chromaticity points for the gamut you see are based on the original Blackmagic Cinema/Pocket Camera. When you transform from one space to another, you are scaling values by a 3x3 matrix. There's no algorithm deciding whether a pixel value is within a gamut to leave it alone depending on the target gamut; it just scales it by the 3x3 matrix regardless. Resolve does have a ResolveFX plugin though that will allow you to selectively decide how to deal with out-of-gamut colours when transforming from a larger space to a smaller one; it's under "Gamut Mapping" in the Colour Space Transform plugin. But this isn't typical 'standard colour science', because generally you want your transforms to be "reversible", and when you start doing non-linear manipulations it gets very difficult to reverse. It's best to white balance and do highlight recovery in native camera/sensor RGB space. Blackmagic RAW is not white balanced; it's the native sensor response, and we white balance it during decode.
XYZ is okay to white balance in as an alternative if you can't do it in sensor space, and this is possible because you can transform to XYZ from any gamut. But if you've already white balanced, you can't do common highlight recovery (in XYZ or otherwise) that uses unclipped channels, because during white balance you clip the highlights to ensure white = white.
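A toy illustration of why that ordering matters. The gains and pixel values below are invented for illustration, not any real camera's; values are normalised sensor RGB where 1.0 = clip. Recovery techniques that rebuild a clipped channel from the unclipped ones need to know which channels carried real data, and clipping during white balance destroys exactly that information.

```python
# Toy demo: white balance with and without clipping at 1.0.
wb_gains = [2.0, 1.0, 1.5]  # made-up R, G, B gains for illustration

def white_balance(rgb, gains, clip=True):
    out = [v * g for v, g in zip(rgb, gains)]
    return [min(v, 1.0) for v in out] if clip else out

# A bright highlight: R clipped at the sensor, G and B are real data.
pixel = [1.0, 0.9, 0.8]

no_clip = white_balance(pixel, wb_gains, clip=False)   # B stays distinguishable
with_clip = white_balance(pixel, wb_gains, clip=True)  # B now reads 1.0 too

# After clipped WB, B looks identical to the genuinely clipped R channel,
# so channel-based highlight recovery can no longer tell them apart.
print(no_clip, with_clip)
```

This is the sense in which white balancing first forecloses the usual unclipped-channel recovery, whether you balance in sensor space or XYZ.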
  9. "BMD Film" (version 1, which will be applied to non-BMD DNGs) will just apply a log-type gamma curve and not touch colour (gamut). So you will be getting Sigma fp sensor RGB space in terms of gamut and no colour 'errors', except that a sensor response isn't meaningful on a display. To transform into a common display space would not be straightforward though, unless Sigma or someone provided the correct conversion, so unless you could get something you're happy with by manually correcting the colours, it might be easier and more straightforward to decode into a known defined space like 709 and go from there.
  10. Also head over to our gallery page for some footage downloads. https://www.blackmagicdesign.com/products/blackmagicpocketcinemacamera/gallery
  11. The issue shown at Slashcam is not the same crosshatching issue you link and has a different cause. The "solution" of adjusting pan/tilt in Resolve (I would call it a workaround, not a solution) might help hide the crosshatching issue (that was addressed in a firmware update) but doesn't change the underlying artefacts from the DNG debayer shown at Slashcam.
  12. Ah gotcha. I'm not really directing this at you, but just at anyone in general who isn't aware or wants to learn more, and I'm probably being a 'stickler for accuracy' here, but obviously a sensor/camera has no inherent highlight rolloff, as the sensor is linear (as close as practically possible with calibration). So highlight rolloff is mostly down to the dynamic range of the sensor, how the image is exposed, and how the colourist chooses to roll off the highlights. All typical CMOS sensors will have "hard clipping" though, being linear capture devices. I only say this because I see a few people mention 'highlight rolloff' as part of a log curve or colour science or something, when in that respect it is just a by-product of the log curve optimising the dynamic range of the sensor in the container it's being stored in. It's still assumed the user will create their own highlight rolloff in grading (I'm speaking just of RAW and log captures, not "profiles" or looks applied in camera intended for display on Rec.709 devices). ARRI for example have a lot of stops above where they recommend middle grey be exposed - due partly to their very large pixels and the large dynamic range they have - and when a log curve is calculated to map that dynamic range from middle grey to 940 (video white in 10-bit, which is where they map sensor saturation in their log curves) you get a very flat curve at the top as it maps that range. When you flatten contrast that much it also appears to desaturate. I've seen some mention they believe ARRI purposely desaturate their highlights, but if they did that in processing before creating RAW files or LogC ProRes clips you wouldn't be able to inverse it correctly into linear for ACES workflows etc., because processing like that is non-linear. They possibly do something like that for their LogC to Rec709 LUTs, but people seem to attribute it to their LogC/RAW files too.
For our cameras we map sensor saturation at our "native ISO" to 940 also, but for ISO curves above that we go into "super whites" to make better use of the bit depth available, especially since we deal so much with SDI output (10-bit), and ProRes 422 HQ is common for our customers (also 10-bit). ARRI says ProRes 444 (12-bit) is the minimum for LogC because they don't use the full range available. We may change that in the future, but the caveat would be that you would also need to use 12-bit for best results. In theory you could expose a 10-stop camera/sensor so that you place middle grey at the second-bottom stop, giving you 8 stops to create a very gentle highlight rolloff. You would just have VERY little range for the shadows. So long story short, the question I would suggest people ask is 'what is the dynamic range like compared to other cameras', as that will really tell you what kind of highlight rolloff YOU (the user) can create for your preference, with how you like to expose, given the amount of shadow information you like to retain, tolerance for noise, etc.
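The exposure arithmetic in that last example can be written down directly: given a sensor's total dynamic range in stops and where you place middle grey, the stops available for highlight rolloff and for shadows fall straight out. The numbers are illustrative, not any specific camera's.

```python
# Back-of-the-envelope exposure split: how many stops of highlight
# room (for building a rolloff) versus shadow room you get, given a
# total dynamic range and a middle-grey placement. Illustrative only.
def exposure_split(total_stops, grey_stops_above_black):
    highlights = total_stops - grey_stops_above_black
    shadows = grey_stops_above_black
    return highlights, shadows

# A 10-stop sensor with middle grey placed at the second-bottom stop:
# 8 stops above grey for a gentle rolloff, only 2 stops below for shadows.
print(exposure_split(10, 2))
```

Which is exactly the trade-off in the post: the more highlight range you keep for a gentle rolloff, the less shadow information survives.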
  13. That's an oversight on the Resolve team's part; I alerted them to it last week when a user reported that some options are also missing from the ACES ResolveFX plugin. We are looking into it. It's not specific to the Pocket 4K; it happens with the G2 and other cameras. It's how gamut mapping is done - I've seen the same artefact on footage shot with ARRI/RED/Sony/etc. from test files with saturated red highlights clipping, and also seen it on publicly broadcast TV shows and movies with huge budgets shooting Alexa etc., where the gamut mapping is not handled by the colourist (it's very common on TV series on Netflix, HBO, Amazon, etc.). FYI, ARRI Wide Gamut is very similar in size and location of primaries to BMD Wide Gamut Gen 4. The issue is that it's a non-linear transform to address the problem, so if you apply that correction to footage it's not easily reversible anymore in standard colour science workflows like ACES. So if you applied it in camera to ProRes footage and then transformed it to another colour space, "technically" it would be wrong. Same if it were an option in Blackmagic RAW decode and you took that output to a VFX workflow etc. that requires linear or some other transform. So the user has to be careful about when to use this. But we are looking into it to make it easier. Otherwise, for problem shots people can decode into another space and handle the gamut mapping themselves (I personally think this is preferable when possible, so it can be tailored to each shot and target gamut, but it does require the user to have a certain amount of knowledge and time to address it in post, which is not reasonable to assume).
  14. Sorry, what do you mean by Magic Lantern's highlight rolloff then? I think I missed something.
  15. Incorrect. And we provide our colour science data to 3rd party apps, post houses, and studios like Netflix on request so they have the data to transform to ACES outside of Resolve if needed.