About CaptainHook

Profile Information

  • Location
    Melbourne, Australia

  1. I'm sorry, but where is "Blackmagic" implying that? If it's me, my only intention is to help some understand how to get the best out of the DNG workflow in Resolve with Sigma FP files. I'm here on my own time, including the weekend, to help people; if my intention were otherwise, it would be better for me to let people stumble instead of helping, and not waste my time. We work with companies like Sigma all the time, and I personally think the Sigma FP is a great little camera and have told the product manager so a couple of times in person. I will leave you to it. People know where to find m
  2. It's to be expected that a gamut transform won't show colours outside of that gamut; that's sort of the point of it. How you map colours from outside the target gamut into it is another thing entirely. I'd be wary of what the CIE scopes show, though. The timeline space sets what space the data is transformed into for processing through nodes etc. in RCM. Personally, I would leave this as 709/2.4 because the tools were originally designed to work that way and I'm used to how they respond, but I also don't really use RCM. I would probably not use the CIE scopes for actual grading either and j
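To make the gamut-mapping distinction above concrete, here is a minimal Python sketch of two ways to bring an out-of-gamut RGB value into range. The desaturation strategy is purely illustrative and is not how Resolve or RCM implements gamut mapping:

```python
def clip_to_gamut(rgb):
    """Naive per-channel clip of destination-space RGB; can shift hue."""
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

def desaturate_into_gamut(rgb):
    """Scale toward the achromatic axis until all channels fit [0, 1].

    Preserves channel ratios better than clipping. Purely illustrative
    gamut mapping, not what any particular application does.
    """
    grey = sum(rgb) / 3.0
    t = 1.0  # fraction of the original saturation we can keep
    for c in rgb:
        if c > 1.0:
            t = min(t, (1.0 - grey) / (c - grey))
        elif c < 0.0:
            t = min(t, (0.0 - grey) / (c - grey))
    return tuple(grey + t * (c - grey) for c in rgb)

out_of_gamut = (1.4, 0.2, -0.1)  # e.g. a saturated red after a gamut transform
print(clip_to_gamut(out_of_gamut))          # in range, but hue can shift
print(desaturate_into_gamut(out_of_gamut))  # in range, desaturated instead
```

The difference matters most on saturated highlights, where per-channel clipping is what produces the hue-shift artefacts mentioned later in this thread.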
  3. If you want "all the colour" with no transform, then with DNGs you can select "Blackmagic Design Film" for gamut/colour space (Colour Science Version/Gen 1), and as mentioned that is sensor space with no transform, only what you select for gamma. So it's the colour as the camera has captured it (I say camera rather than sensor, since there are no doubt corrections applied before encoding the raw data). As I also mentioned though, this is not suitable for display, and the expectation is you will transform it/grade it for monitoring purposes. I believe Digital Bolex recommended this workflow and t
  4. Resolve uses the matrices in the DNG. "Sensor space" isn't really defined like a normal gamut with 3 primaries and a white point. It's usually described by spectral sensitivities, and if you were to attempt to plot the response on an x/y (or u/v) plot it wouldn't be straight lines connecting 3 points. You can get an idea by calculating the primaries from the matrices in the DNG, since they describe XYZ-to-sensor-space transforms, and then plotting that. You would want to do it for both illuminants in the DNG metadata and take both into account. I haven't read every post as it's hard to keep up, bu
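The calculation described above can be sketched as follows. The matrix values here are hypothetical stand-ins, not taken from any real DNG; in practice you would read them from the file's ColorMatrix1/ColorMatrix2 tags and repeat the calculation for both illuminants:

```python
import numpy as np

# Hypothetical XYZ -> sensor matrix, standing in for a DNG ColorMatrix tag.
xyz_to_sensor = np.array([
    [ 0.90, -0.20, -0.10],
    [-0.40,  1.30,  0.10],
    [-0.05,  0.20,  0.70],
])

# Invert to get sensor -> XYZ, then see where each "pure" sensor channel
# lands as a CIE xy chromaticity.
sensor_to_xyz = np.linalg.inv(xyz_to_sensor)

def xy_chromaticity(xyz):
    X, Y, Z = xyz
    return X / (X + Y + Z), Y / (X + Y + Z)

for name, channel in zip("RGB", np.eye(3)):
    x, y = xy_chromaticity(sensor_to_xyz @ channel)
    # These "primaries" can easily land outside the spectral locus,
    # which is exactly why sensor space isn't a normal gamut.
    print(f"{name}: x={x:.4f} y={y:.4f}")
```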
  5. Blackmagic Design Film for colour space/gamut (Colour Science Gen 1) in a DNG workflow is 'sensor space'. So it's really how the sensor sees colour. This is not meant to be presentable on a monitor; you're meant to transform to a display-ready space, but to do so requires the proper transforms for that specific camera/sensor (so you can't use the CST plugin for non-BM cameras, for example), and the transform varies depending on the white balance setting you choose. If you're happy to grade from sensor space, that's fine, but I would also join in recommending Rec.709 or an ACES workflow i
  6. This is based on common colour science used by Adobe and many others. If you look at Adobe presets in Camera Raw, or the Resolve WB presets, you will notice the same thing. Daylight is typically green compared to a black body radiator (tungsten), and the daylight locus is used for temperatures above 4000K. Here's the daylight locus compared to the Planckian locus (tungsten/black body radiator). Notice it's "greener" and therefore would need a tint adjustment towards magenta to compensate. More info here: https://en.wikipedia.org/wiki/Standard_illuminant https://en.wikipedia.org/wiki/Pl
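The comparison can be computed directly: the standard CIE daylight-locus polynomial and the Kim et al. cubic approximation of the Planckian locus both give xy chromaticities as a function of correlated colour temperature, and the daylight y value sits above (greener than) the Planckian one at the same temperature:

```python
def daylight_xy(T):
    """CIE daylight-locus chromaticity for CCT T in kelvin (4000K-25000K)."""
    if T <= 7000:
        x = 0.244063 + 0.09911e3 / T + 2.9678e6 / T**2 - 4.6070e9 / T**3
    else:
        x = 0.237040 + 0.24748e3 / T + 1.9018e6 / T**2 - 2.0064e9 / T**3
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

def planckian_xy(T):
    """Kim et al. cubic approximation of the Planckian locus (4000K-25000K)."""
    x = 0.240390 + 0.2226347e3 / T + 2.1070379e6 / T**2 - 3.0258469e9 / T**3
    y = 3.0817580 * x**3 - 5.8733867 * x**2 + 3.75112997 * x - 0.37001483
    return x, y

for T in (4500, 5500, 6500):
    _, yd = daylight_xy(T)
    _, yp = planckian_xy(T)
    # Higher y means greener, hence the magenta tint compensation.
    print(f"{T}K: daylight y={yd:.4f} vs planckian y={yp:.4f}")
```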
  7. You'll also need Resolve 16.2+, released today, to open those files in Resolve.
  8. BMDFilm Gen 1 doesn't touch gamut, so for DNGs what you get is the sensor's native response to colour, which is not suitable for a typical display. You can't really map a sensor's spectral response to just 3 chromaticity points like a typical gamut and show it on a CIE plot either, but people do it anyway to give a rough idea. But most sensors will definitely have a response beyond visible colours, so this is expected. I don't work in the Resolve team, but the CIE scope in Resolve will be plotting our Blackmagic Cinema Camera spectral response as best it can, because it assumes if you're using th
  9. "BMD Film" (version 1, which will be applied to non-BMD DNGs) will just apply a log-type gamma curve and not touch colour (gamut). So you will be getting Sigma fp sensor RGB space in terms of gamut and no colour 'errors', except that a sensor response isn't meaningful on a display. To transform into a common display space would not be straightforward though, unless Sigma or someone provided the correct conversion, so unless you could get something you're happy with by manually correcting the colours, it might be easier and more straightforward to decode into a known defined space like 709 or som
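As an illustration of "a log-type gamma curve with no gamut change", here is a generic log-style encode applied per channel. This is NOT the actual BMD Film transfer function, just a hypothetical stand-in showing the shape such curves share:

```python
import math

def generic_log_encode(linear, a=8.0):
    """A generic log-style transfer curve mapping linear [0, 1] to [0, 1].

    Illustrative only -- not the actual "BMD Film" curve. Like any log
    encode, it lifts shadows and midtones, and it touches only the
    per-channel tone curve, never the gamut.
    """
    return math.log1p(a * linear) / math.log1p(a)

# An 18% grey linear value gets lifted well above 0.18 in the encoded image.
print(generic_log_encode(0.18))
```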
  10. Also head over to our gallery page for some footage downloads. https://www.blackmagicdesign.com/products/blackmagicpocketcinemacamera/gallery
  11. The issue shown at Slashcam is not the same crosshatching issue you link and has a different cause. The "solution" of adjusting pan/tilt in Resolve (I would call it a workaround, not a solution) might help hide the crosshatching issue (that was addressed in a firmware update) but doesn't change the underlying artefacts from the DNG debayer shown at Slashcam.
  12. Ah, gotcha. I'm not really directing this at you, but at anyone in general who isn't aware or wants to learn more, and I'm probably being a 'stickler for accuracy' here, but obviously a sensor/camera has no inherent highlight rolloff, as the sensor is linear (as close as practically possible with calibration), so highlight rolloff is mostly down to the dynamic range of the sensor, how the image is exposed, and how the colourist chooses to roll off the highlights. All typical CMOS sensors will have "hard clipping" though, being linear capture devices. I only say this because I see a fe
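The point about linear capture versus graded rolloff can be sketched like this; the shoulder curve is one hypothetical grading choice, not any camera's behaviour:

```python
import math

def hard_clip(v, white=1.0):
    """What a linear sensor does at full well: values simply clip."""
    return min(v, white)

def soft_rolloff(v, shoulder=0.8):
    """A grading-side highlight rolloff: linear below the shoulder,
    then an exponential approach to 1.0 (slope-continuous at the
    shoulder). A hypothetical curve, not any camera's."""
    if v <= shoulder:
        return v
    return 1.0 - (1.0 - shoulder) * math.exp(-(v - shoulder) / (1.0 - shoulder))

for v in (0.5, 0.9, 1.2, 2.0):
    print(f"{v:>4}: clip={hard_clip(v):.3f}  rolloff={soft_rolloff(v):.3f}")
```

Values past the clip point are gone either way; the rolloff only changes how the values below it are compressed toward white, which is why exposure and sensor dynamic range matter more than any "inherent" rolloff.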
  13. That's an oversight on the Resolve team's part; I alerted them to it last week when a user reported that some options are also missing from the ACES ResolveFX plugin. We are looking into it. It's not specific to the Pocket 4K; it happens with the G2 and other cameras. It's down to how gamut mapping is done - I've seen the same artefact on footage shot with ARRI/Red/Sony/etc. from test files with saturated red highlights clipping, and also seen it on publicly broadcast TV shows and movies with huge budgets shooting Alexa etc. where the gamut mapping is not handled by the colourist (it's very comm
  14. What do you mean by Magic Lantern's highlight rolloff then, sorry? I think I missed something.
  15. Incorrect. And we provide our colour science data to 3rd party apps, post houses, and studios like Netflix on request so they have the data to transform to ACES outside of Resolve if needed.