Sigma Fp review and interview / Cinema DNG RAW


Andrew Reid

12 minutes ago, Chris Whitten said:

In the ACES colour space and Cinema DNG Default settings I am not getting anything like the same pronounced purple/green blotchy pattern in the blacks after shooting with the IR Cut Filter. This is the 10bit clip with the exposure increased a lot using the wheels, and pulling up the 'shadows' value.

Screen Shot 2020-03-24 at 12.49.15.png

That's a good call.

I would guess that part of the ACES DNG IDT is desaturating the shadows, because I know the actual colour values in the source DNG, from the data before any debayering, are that colour. Of course I will now have to double-check AGAIN!!

cheers
Paul


1 hour ago, paulinventome said:

Now if Sigma are unable to output 12-bit at 25p, then the solution that works for everyone is to do 10-bit but use a curve to redistribute values, so that the last stop of recorded light doesn't take 2048 integer values to describe, and the saved values go to the shadows so they get every last drop of sensor data they can. Then we'd have 12-bit quality in a 10-bit container at all these frame rates and EVERYONE is happy. There is literally no downside to doing this. This is what the 8-bit mode is doing, and that can produce great images in a fraction of the space. The code is already in the camera!

If you look earlier in this thread I even show what that curve looks like, because each DNG file has a copy of it in 8-bit so Resolve can debayer properly. It's standard practice in cDNG.

We need a wave of people to say this to Sigma, because sometimes I think it's only me....

cheers
Paul
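The curve Paul describes in the quote above can be sketched in a few lines. This is a minimal illustration of the idea only; the curve shape and the `a` parameter are my assumptions, not Sigma's actual in-camera LUT:

```python
import math

def linear_to_log10bit(dn12, a=64.0):
    """Map a 12-bit linear sensor value (0..4095) to a 10-bit code (0..1023)
    using a simple log-style curve, so shadow stops get many more codes than
    a straight truncation (dn12 >> 2) would give them.
    The curve shape (parameter a) is illustrative, not Sigma's actual LUT."""
    return round(1023 * math.log2(1 + a * dn12 / 4095) / math.log2(1 + a))

def log10bit_to_linear(code, a=64.0):
    """Inverse curve: recover an approximate 12-bit linear value in post."""
    return (2 ** (code / 1023 * math.log2(1 + a)) - 1) * 4095 / a

# Straight truncation spends half of all 10-bit codes (512 of 1024) on the
# top stop (linear 2048..4095); the log curve hands most of those back down.
top_stop_codes = linear_to_log10bit(4095) - linear_to_log10bit(2048)
bottom_codes = linear_to_log10bit(64) - linear_to_log10bit(0)
print(top_stop_codes, bottom_codes)  # far fewer than 512 up top, far more than 16 down low
```

The round trip loses only a few DN in the midtones, which is the whole point: the shadows keep nearly all their 12-bit precision inside a 10-bit container.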

 

Could you share the email address that you use to contact Sigma? I've emailed support in Germany, but I'm a bit sceptical about whether and when these reports reach Japan.


Just double checking but yeah, the green/magenta stuff is in the actual DNG itself. No question.

What's happening is perhaps a truncation issue converting between the 12-bit source and the 10-bit container. The green values are rgb(0,1,1), so no red value at all. This is the very lowest stop of recorded light, where each channel is either 0 or 1.

I think this is quite common, because come to think of it the A7S always suffered green/magenta speckled noise - and it would make sense that the bottom two stops are truncating strongly, with noise on top of it.

What Sigma *ought* to do is balance the bottom two stops and make them equal grey, because the colour cast from a single-digit difference is quite strong. This could be what ACES and BMDFilm are doing.

I still say it's a bug, but if you go from having 16 values to describe a colour down to 1 value, then any kind of tint will come out stronger once it's truncated. At least I think I understand better why it happens.
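A toy illustration of that amplification (the input values here are hypothetical, not pulled from an actual DNG):

```python
def truncate_12_to_10(rgb12):
    """Drop the two low bits (12-bit -> 10-bit), as a naive conversion would.
    Hypothetical illustration of the tinting effect described above,
    not Sigma's actual code."""
    return tuple(v >> 2 for v in rgb12)

# A near-neutral deep shadow survives if it straddles the right boundary...
print(truncate_12_to_10((6, 7, 7)))  # (1, 1, 1): still neutral
# ...but one step lower, red falls to zero while green/blue round up:
print(truncate_12_to_10((3, 4, 4)))  # (0, 1, 1): a full-strength tint
# With only 0 or 1 left to describe the bottom stop, any small imbalance
# between channels becomes a 100% colour cast.
```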

Also @Chris Whitten, if you look at the image stats you can see that the highest recorded value in that DNG is 648, and this is a 10-bit container, so there are 400-ish more values that could have been recorded here. I know this is an underexposure test, so that's the reason in this case!! (Not teaching you to suck eggs, I hope!!! I certainly don't mean to come across like that.)
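The arithmetic on that unused headroom, using the peak value from the RAWDigger stats:

```python
import math

# Highest recorded DN in the clip (from RAWDigger) vs the 10-bit ceiling.
peak_dn, max_dn = 648, 1023
headroom_stops = math.log2(max_dn / peak_dn)  # each stop doubles the DN
print(f"{headroom_stops:.2f} stops of unused highlight headroom")  # ~0.66
```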

RAWDigger is an awesome tool if you want to poke around inside DNGs!!! Highly recommended!!

cheers
Paul

Screenshot 2020-03-24 at 15.19.05.png


7 hours ago, Scott_Warren said:

No offense to Crimson Engine, but it's a strange suggestion for people to use an incorrect colorspace as a starting point with a camera that has pretty great color in its own right as Paul has mentioned! (Something can look good with weird settings, but you always want to strive for base technical accuracy + aesthetics in my opinion.)

You might also consider using the ACES pipeline, since it has a film-print-like "look" built into the display chain already. It takes the color matrix built into the DNGs and scales it for output to your monitor without weird translations or conversions, except into ACES' own massive color space.

You'd be free to add in a Colorspace Transform node and try out different camera color sciences to see if something else is more to your liking (Alexa, RED, Sony, etc), but I've found it's a quick and repeatable way to get excellent color out of the camera quickly. 

The attached shot was achieved just by dropping in the sequence and pushing exposure and saturation a bit to match to what I saw in person. (Metered the window light with an external incident meter as I tend to do.)

livingroom_1.1.1.jpg

What do you set your DaVinci settings to?


@redepicguy Attached are my current settings. I try to make the initial capture as solid as possible as far as exposure goes to avoid a lot of pushing or pulling in post. In the node chain for the start of a grade, I'll also add a Color Space Transform node to convert the ACES AP1 space down to P3-D65. Allows for more color depth than Rec709, but gives a more natural response with saturation and is supported on more and more devices these days. Also the gamma is a little softer rolling down into shadows which lets me see into darker areas without needing to bend the gamma or use lifts to see it. I try to use a global approach to getting a look in place of tweaking knobs if at all possible. Probably leftover habits of working with film for a while :) 

You don't have to convert the colorspace from ACES AP1 to 709 or P3, but I find that AP1 needs to be saturated a LOT in order to feel normal, and pushing saturation can behave weirdly depending on the subject or scene. Using the conversion method adds what I think is a natural boost of color without needing to crank things crazily. Is any of this approach correct? I have no idea. But it seems to work well for the kind of images I like to capture.
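The saturation boost Scott describes falls out of the math: going from a wide gamut (like ACES AP1) to a smaller one (like P3-D65) spreads the RGB components apart, which reads as more saturation. A minimal sketch of the effect; the 3x3 matrix below is purely illustrative (rows sum to 1 so white is preserved), NOT the real AP1-to-P3 matrix:

```python
# Illustrative wide-gamut -> narrow-gamut matrix (each row sums to 1.0,
# so neutral greys pass through unchanged). Not a real colorimetric matrix.
WIDE_TO_NARROW = [
    [ 1.40, -0.30, -0.10],
    [-0.10,  1.25, -0.15],
    [-0.05, -0.15,  1.20],
]

def convert(rgb, m):
    """Apply a 3x3 colour matrix to an RGB triplet."""
    return tuple(sum(m[r][c] * rgb[c] for c in range(3)) for r in range(3))

def sat(rgb):
    """Crude saturation measure: channel spread relative to the max channel."""
    return (max(rgb) - min(rgb)) / max(rgb)

muted_wide = (0.6, 0.45, 0.4)  # reads as dull when shown as wide-gamut values
narrow = convert(muted_wide, WIDE_TO_NARROW)
print(sat(muted_wide) < sat(narrow))  # True: same colour, more apparent saturation
```

This is why displaying AP1 values directly looks washed out, and why a proper conversion "adds" colour without touching a saturation knob.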
 

resolve_ACES.png


Just now, Scott_Warren said:

@redepicguy Attached are my current settings. I try to make the initial capture as solid as possible as far as exposure goes to avoid a lot of pushing or pulling in post. In the node chain for the start of a grade, I'll also add a Colorspace Conversion node to convert the ACES AP1 space down to P3-D65. Allows for more color depth than Rec709, but gives a more natural response with saturation and is supported on more and more devices these days. Also the gamma is a little softer rolling down into shadows which lets me see into darker areas without needing to bend the gamma or use lifts to see it. I try to use a global approach to getting a look in place of tweaking knobs if at all possible. Probably leftover habits of working with film for a while :) 

You don't have to convert the colorspace from ACES AP1 to 709 or P3, but I find that AP1 needs to be saturated a LOT in order to feel normal, and pushing saturation can behave weirdly depending on the subject or scene. Using the conversion method adds what I think is a natural boost of color without needing to crank things crazily. Is any of this approach correct? I have no idea. But it seems to work well for the kind of images I like to capture.
 

resolve_ACES.png

Thanks! Will play around with those. 


16 hours ago, Chris Whitten said:

MOV looks to be very poor compared to the UHD modes. My rig is very small using the Wise portable SSD attached to Sigma's included hot shoe attachment.

6AA167B2-080F-4125-9C02-2454264AB4C0.JPG

Fair enough, but I'm using a Leica Elmarit-M 28/2.8 Asph, Voigtlander 35/2.5 and 40/1.2 VM as my main lenses, so - all due respect - your setup looks comparatively huge next to mine.

 

As for MOV, it is very much usable if you expose well and tweak a few settings. I'm liking Vivid minus 1 / Saturation minus 0.4 and contrast minus 0.2 with sharpness dropped all the way down and a +1 to the low end tone curve, if called for. When it doesn't flicker, it looks very natural and "cinematic".


Ps- I do plan on going the SSD route, or at least 8 bit in-camera DNG, but unfortunately the s*** kinda hit the fan here in the States while I was on holiday with my family, so I'm unsure of my employment situation. I'm thinking the fp will be my last gear purchase for a while, and I should probably buy food and such instead.

 

On the upside, I'll have plenty of free time to tinker and play with editing software - something full-time work + caring for a toddler didn't allow for, so hey. Making lemonade, as they say.


9 hours ago, paulinventome said:

Just double checking but yeah, the green/magenta stuff is in the actual DNG itself. No question.

What's happening is perhaps a truncation issue converting between the 12-bit source and the 10-bit container. The green values are rgb(0,1,1), so no red value at all. This is the very lowest stop of recorded light, where each channel is either 0 or 1.

@Chris Whitten I think this is quite common, because come to think of it the A7S always suffered green/magenta speckled noise - and it would make sense that the bottom two stops are truncating strongly, with noise on top.

 

There was evidence that the a7S was doing something with the blue channel to "game the system" with regard to noise, because blue is the most problematic color for noise. If the Sigma uses a Sony sensor, it could be wired to do the same thing, or Sigma may be using a similar trick to reduce the appearance of noise.


For the record I'm not arguing about any of this, or dismissing other people's issues.

We're all trying to get to the bottom of various issues.

I try to shoot with a good exposure balance, and with decent grading I'm not having a problem with purple/green. But yes, if it is visible in the shadows it's still an issue Sigma should address.

None of my footage at any ISO flickers. I am having the issue with a wrongly exposed first frame at ISOs below 320 (at least), but not at ISO 640.


8 hours ago, imagesfromobjects said:

Fair enough, but I'm using a Leica Elmarit-M 28/2.8 Asph, Voigtlander 35/2.5 and 40/1.2 VM as my main lenses, so - all due respect - your setup looks comparatively huge next to mine.

 

 

No doubt the EVF is massive and the biggest part of my setup. It is optional though, especially indoors.

The Wise SSD is very small and light. Apart from the Takumar lenses, which do require a large but light adapter, all my other lenses are very small vintage Leica M.

The UHD quality from the FP is outstanding. From what I've seen, the MOV files are worse than those from most decent cameras out there.


14 hours ago, Scott_Warren said:

You don't have to convert the colorspace from ACES AP1 to 709 or P3, but I find that AP1 needs to be saturated a LOT in order to feel normal, and pushing saturation can behave weirdly depending on the subject or scene. Using the conversion method adds what I think is a natural boost of color without needing to crank things crazily. Is any of this approach correct? I have no idea. But it seems to work well for the kind of images I like to capture.

What are you viewing on?

If you're viewing on a modern Apple device (MacBook, iMac, iDevices), then P3 is the correct output colourspace for your monitor. If you view 709 on a P3 monitor it will look oversaturated; conversely, if you view P3 on a 709 display it will look desaturated (the colours will also be off - reds will be more orange, and so on).

Colour Management is a complicated subject and it can get confusing fast.

In theory you shouldn't have to saturate images to make them look right. I have a feeling that something is perhaps not set up quite right in your workflow.

Resolve can manage different displays, so it can separate the timeline colour space from the monitor space, which is really important. On my main workstation I have a 5K GUI monitor which is 709, plus a properly calibrated P3 display going through a Decklink card. So I am managing two different monitor setups.

So first you choose which timeline space you are working in. 709 is best for most cases, but if you have footage in P3, motion graphics in P3, or are targeting Apple's ecosystem, then a P3 timeline can be helpful. P3 gives you access to teals and saturated colours that 709 cannot, but *only* a P3 display can show you those.

The way you monitor is independent of your timeline. You could have two monitors plugged in, one 709 and one P3. Each of those monitors needs a conversion out (in Resolve's settings). If your display is P3 and your timeline is 709, then a correct conversion to P3 is needed so you see the 709 colours properly saturated. If your timeline is P3 and you have a 709 broadcast monitor, then that conversion has to do some gamut mapping or clipping to show the best version of P3 it can in a limited colour space.
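That "gamut mapping or clipping" choice can be sketched on plain RGB values. This is a toy illustration of the two strategies, not Resolve's actual gamut-mapping math:

```python
def clip_to_gamut(rgb):
    """Hard clip: out-of-range components are simply limited to 0..1.
    Fast, but hue and saturation can shift abruptly near the gamut edge."""
    return tuple(min(1.0, max(0.0, c)) for c in rgb)

def compress_to_gamut(rgb):
    """Soft mapping: scale the colour toward its channel average just enough
    to fit, keeping the channel ratios (hue) at the cost of saturation.
    A toy sketch of the idea, not Resolve's actual algorithm."""
    avg = sum(rgb) / 3
    t = 1.0  # largest scale-back needed so every channel lands in 0..1
    for c in rgb:
        if c > 1.0:
            t = min(t, (1.0 - avg) / (c - avg))
        elif c < 0.0:
            t = min(t, (0.0 - avg) / (c - avg))
    return tuple(avg + t * (c - avg) for c in rgb)

# A saturated P3 teal expressed in 709 coordinates can land outside 0..1:
p3_teal_in_709 = (-0.2, 0.9, 0.8)
print(clip_to_gamut(p3_teal_in_709))      # red snaps to 0, hue shifts
print(compress_to_gamut(p3_teal_in_709))  # all channels in range, ratios kept
```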

This also applies to exporting images. The vast majority of JPEG stills are 709. The vast majority of video is 709. Apple will tag images and movies with Display P3 when it can, but universal support for playing these just isn't there yet.

So you kind of have to work backwards and ask: where are my images going to end up?

cheers
Paul


10 hours ago, imagesfromobjects said:

Fair enough, but I'm using a Leica Elmarit-M 28/2.8 Asph, Voigtlander 35/2.5 and 40/1.2 VM as my main lenses, so - all due respect - your setup looks comparatively huge next to mine.

Which version of the Elmarit - the newest one? Can you do me a favour and let me know what the corners are like at infinity on the fp?

cheers!
Paul


@paulinventome I'm viewing and grading on a Dell UP2718Q which does 97% of P3 :) 

I suppose what I'm reacting to is the AP1 color primaries versus P3's. P3 definitely reads as warmer with more pleasing hues out of the gate, but it's something I could dial in with AP1 with some knob tweaking if that would be more proper. Of course if I use the OneShot chart to start with, then even AP1 looks great since the colors have all been hue adjusted and saturated accordingly, so I may just need to embrace being a bit more hands on with the process. The joys of learning!


34 minutes ago, Scott_Warren said:

I suppose what I'm reacting to is the AP1 color primaries versus P3's. P3 definitely reads as warmer with more pleasing hues out of the gate, but it's something I could dial in with AP1 with some knob tweaking if that would be more proper. Of course if I use the OneShot chart to start with, then even AP1 looks great since the colors have all been hue adjusted and saturated accordingly, so I may just need to embrace being a bit more hands on with the process. The joys of learning!

In theory, if the camera sees a Macbeth chart under a light that it is white balanced for, at a decent exposure, and you have the pipeline set up correctly, you should see what you saw in real life.

If you then decide that the colours are wrong, that's a creative choice of yours, but the aim is really to get what you saw in real life as a starting point, and then change things to how you like.

Sounds like you may be on Windows? One of the problems with grading with a UI on a P3 monitor is that your UI and other colours are going to be super, super saturated, because the OS doesn't do colour management and doesn't know what monitor you have; so when it says Red, your monitor puts up NutterSaturatedRed rather than what is recognised as a normal Red. When you are dealing with colours like that, your eye will adapt and you will end up oversaturating your work. You probably know this, but it's one of my biggest bugbears with Adobe: with no colour management, it's easy to grade on a consumer monitor whose white point is skewed green/magenta, and then you make weird decisions. But I digress...

Have you got a single DNG with that chart in it? I'd like to take a look and see how it comes out.

cheers
Paul

 

 


@paulinventome Here's the link to the folder of DNGs of my colorchecker. Their white balance would be equivalent to Shade, so 7500K, +10 M. Incident exposed at speed with my Sekonic. 
https://drive.google.com/drive/u/0/folders/1YLY5ls433I-DgaLSbbzE3Lf6MtTKwjKT

I do have an X-Rite iDisplay calibrator to help wrangle any of the weirdness Windows might introduce when left to its own devices. I know it's not scientifically perfect like what could be achieved with the higher-end solutions used for film post, but it was close enough for our purposes when I was lighting Call of Duty: WWII that I adopted it for home use as well :) I've been meaning to check out DisplayCAL to see if it offers anything more precise than X-Rite's own software. 


23 minutes ago, Scott_Warren said:

I do have an X-Rite iDisplay calibrator to help wrangle any of the weirdness Windows might introduce without any help, however. I know it's not scientifically perfect like what could be achieved with higher-end solutions used for film post, but it was close enough for our purposes when I was lighting Call of Duty: WWII that I adopted it for home use as well :) I've been meaning to check out DisplayCAL to see if that offers anything more precise than X-Rite's own software. 

Good stuff on CoD - I don't play it, but it looks pretty.

The issue under Windows (and Mac, to a lesser degree) is that there is no OS support even if you can calibrate your monitor - there's nowhere in the chain to put a calibration LUT. So really you're looking at a monitor you can upload one to, or a separate LUT box you can run the signal through. When you scrape beneath the surface, you'll be amazed how little support there is for what is a basic necessity. For gaming, your targets are most likely the monitors you use, so that approach makes sense. But for projection or broadcast you really have to do your best to standardise, because after you there's no way of knowing how screwed up the display chain will be. Especially with multiple vendors and pipelines, it's essential to standardise.
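What a calibration LUT actually does is simple enough to sketch: each channel value is looked up and corrected by interpolating between measured points. An illustrative stand-in for the 1D correction a monitor or LUT box would apply in hardware (the table values here are made up):

```python
def apply_1d_lut(value, lut):
    """Apply a 1D calibration LUT to a normalised 0..1 channel value by
    linear interpolation between LUT entries."""
    n = len(lut) - 1
    x = min(max(value, 0.0), 1.0) * n   # position within the table
    i = min(int(x), n - 1)              # lower table index
    frac = x - i                        # distance to the next entry
    return lut[i] * (1 - frac) + lut[i + 1] * frac

# A toy 5-point LUT lifting a too-dark midtone response (gamma-style fix):
lut = [0.0, 0.30, 0.55, 0.78, 1.0]
print(apply_1d_lut(0.5, lut))    # mid-grey lands exactly on the 0.55 entry
print(apply_1d_lut(0.375, lut))  # interpolated halfway between 0.30 and 0.55
```

Real calibration uses much larger tables (often 1024 entries per channel, or a full 3D cube), but the lookup is the same idea.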

Look at LightSpace - IMHO it's much better than DisplayCAL. That's what I use to stay on top of displays and devices.

I will take a look at the DNG

This is really worth a watch:

 

cheers
Paul

