
paulinventome

Everything posted by paulinventome

  1. I think that's pretty unfair without knowing what we do. Most of my work is P3 based. Ironically the vast majority of Apple devices are P3 screens, there's lots of work targeting those screens. 2020 is taking hold in TVs and is becoming more valid, and of course HDR is not sRGB. VFX work is not done in sRGB. No professional workflow is sRGB in my experience. All we're doing is taking quarantine time and being geeks about our cameras. We just like to know, right? My main camera is a Red, not a sigma fp, but the sigma is really nice for what it is. It's just some fun and along the
  2. I'm with you on all this but i don't know what you need sigma to supply though... The two matrices inside each DNG handle sensor-to-XYZ conversion for two illuminants. The resulting XYZ colours are not really bound by a colourspace as such, but clearly only those colours the sensor sees will be recorded. So the edges will not be straight; there's no colour gamut as such, just a spectral response. But the issue (if there is one) is Resolve taking the XYZ and *then* turning it into a bound colourspace, 709. And really i still can't quite work out whether Resolve will only turn DNGs into
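To make the two-matrix point concrete, here's a rough sketch of how a raw developer might use those DNG tags. ColorMatrix1/ColorMatrix2 map XYZ to camera RGB for two calibration illuminants; the developer blends them for the shot's white point and inverts the result to carry camera RGB into (unbounded) XYZ. The matrix values below are invented for illustration, not the fp's real tags.

```python
# Sketch of the DNG two-matrix pipeline: blend ColorMatrix1/ColorMatrix2
# for the shot's illuminant, then invert to go camera RGB -> XYZ.
# The matrices below are made up for illustration only.

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def invert3(m):
    # adjugate inverse of a 3x3 matrix
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
           [f*g - d*i, a*i - c*g, c*d - a*f],
           [d*h - e*g, b*g - a*h, a*e - b*d]]
    return [[x / det for x in row] for row in adj]

def blend(m1, m2, w):
    # DNG-style interpolation between the two illuminant matrices
    return [[m1[r][c]*(1-w) + m2[r][c]*w for c in range(3)] for r in range(3)]

# Hypothetical XYZ -> camera RGB matrices for the two illuminants
CM1 = [[0.9, -0.2, 0.1], [-0.3, 1.1, 0.2], [0.0, 0.1, 0.8]]
CM2 = [[0.8, -0.1, 0.0], [-0.4, 1.2, 0.2], [0.0, 0.2, 0.9]]

xyz_to_cam = blend(CM1, CM2, 0.5)            # shot sits halfway between them
cam_to_xyz = invert3(xyz_to_cam)             # what the developer applies
xyz = mat_vec(cam_to_xyz, [0.5, 0.4, 0.3])   # one demosaiced camera pixel
```

Note the output XYZ has no gamut edges of its own — exactly the "just a spectral response" point above.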
  3. Hi Devon, i missed this sorry. Unbound RGB is a floating point version of the colours where they have no upper and lower limits (well infinity or max(float) really). If a colour is described in a finite set of numbers 0...255 then it is bound between those values but you also need the concept of being able to say, what Red is 255? That's where a colourspace comes in, each colourspace defines a different 'Redness'. So P3 has a more saturated Red than 709. There are many mathematical operations that need bounds otherwise the math fails - Divide for example, whereas addition can work on unbo
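A quick numeric sketch of the "different Redness" point, using the published Rec.709 and P3-D65 RGB-to-XYZ matrices: the same fully saturated P3 red, re-expressed in 709, needs values outside 0..1 — which is exactly what unbounded float RGB lets you keep without clipping.

```python
# The same fully saturated red expressed in two colourspaces. P3's red
# primary is outside the 709 gamut, so its 709 encoding needs
# out-of-range (unbounded) values.

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def invert3(m):
    # adjugate inverse of a 3x3 matrix
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
           [f*g - d*i, a*i - c*g, c*d - a*f],
           [d*h - e*g, b*g - a*h, a*e - b*d]]
    return [[x / det for x in row] for row in adj]

RGB709_TO_XYZ = [[0.4124564, 0.3575761, 0.1804375],
                 [0.2126729, 0.7151522, 0.0721750],
                 [0.0193339, 0.1191920, 0.9503041]]
P3D65_TO_XYZ = [[0.4865709, 0.2656677, 0.1982173],
                [0.2289746, 0.6917385, 0.0792869],
                [0.0000000, 0.0451134, 1.0439444]]

# P3 red brought into 709: channels leave the 0..1 range
p3_red_in_709 = mat_vec(invert3(RGB709_TO_XYZ),
                        mat_vec(P3D65_TO_XYZ, [1.0, 0.0, 0.0]))
# 709 red brought into P3: comfortably inside 0..1
r709_in_p3 = mat_vec(invert3(P3D65_TO_XYZ),
                     mat_vec(RGB709_TO_XYZ, [1.0, 0.0, 0.0]))
```

In a bound 8-bit encoding those negative channels are simply clipped away; in unbounded float they survive the round trip.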
  4. First just to say how much i appreciate you taking the time to answer these esoteric questions! So i have a DNG which is a macbeth shot with a reference P3 display behind showing full saturation RG and B. So in these scene i hope i have pushed the sensor beyond 709 as a test. Resolve is set to Davinci YRGB and not colour managed so i am now able to change the Camera RAW settings. I set the color space to Black Magic Design and Gamma is Black Magic Design Film. To start with my timeline is set to P3 and i am using the CIE scopes and the first image i enclosed shows what i se
  5. I don't know what the native space of the fp is but we do get RAW from it so hopefully there's no colourspace transform, and i want to know i am getting the best out of the camera. And i'm a geek and like to know. I *believe* that Resolve by default is debayering and delivering into 709 space and i *think* the sensor native space is greater than that - certainly with Sony sensors i've had in the past it is. There was a whole thing with RAW on the FS700 where 99% of people weren't doing it right and i worked with a colour scientist on a plug in for SpeedGrade that managed to do it ri
  6. I got a reply from them last week, has it shut down recently? We need to double check we're talking about the same thing. I see half way through that an exposure 'pulse' - not a single frame flash but more a pulse. I believe it's an exposure thing in linear space but when viewed like this it affects the shadows much more. Can you post a link to the video direct and maybe enable download so we can frame by frame through it? A mild pulsing is also very common in compression so we want to make sure we're not looking at that...? cheers Paul
  7. Actually you don't really want to work in a huge colourspace because colour math goes wrong and some things are more difficult to do. Extreme examples here: https://ninedegreesbelow.com/photography/unbounded-srgb-as-universal-working-space.html There were even issues with CGI rendering in P3 space as well. You want to work in a space large enough to contain your colour gamut. These days it's probably prudent to work in 2020 or one of the ACES spaces especially designed as a working space. What you say is essentially true. If you are mastering then you'd work in a large spa
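The kind of failure that article describes is easy to demonstrate with a couple of made-up values: once a saturated colour is held as an unbounded encoding with a negative channel, a multiply blend quietly flips the sign of that channel and the hue shifts without any error being raised.

```python
# Multiply blend on unbounded RGB: two negative channel values multiply
# to a positive one, silently changing the colour. Values are invented
# for illustration of the failure mode, not from any real image.
layer_a = [1.2, -0.1, 0.0]   # out-of-gamut colour in an unbounded encoding
layer_b = [0.5, -0.2, 0.1]
mult = [x * y for x, y in zip(layer_a, layer_b)]
# green channel: (-0.1) * (-0.2) = +0.02 -> a sign flip the math can't see
```

Which is why a working space large enough to keep your gamut's channels positive is safer than relying on unbounded values everywhere.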
  8. If you're talking about the shadow flash half way through then yes. At ISO400 only. And yes Sigma are aware of it and i'm awaiting some data as to whether we can guarantee it only happens at 400. This is the same for me. But it's only 400 for me, whereas others have seen it at other ISOs... cheers Paul
  9. That would be awesome! Yes, thank you. Sorry i missed this as i was scanning the forum earlier. Much appreciated! Paul
  10. And is that sensor space defined by the matrices in the DNG, or is it built into Resolve specifically for BMD cameras? I did try to shoot a P3 image off my reference monitor, actually an RGB in both P3 and 709 space. The idea is to try various methods to display it to see if the sensor space was larger than 709. Results so far inconclusive! Too many variables that could mess the results up. If you turn off Resolve colour management then you can choose the space to debayer into. But if you choose P3 then the image is scaled into P3 - correct? If you choose 709 it is scaled into there. So it
  11. Some small compensation for the lock down. I've been taking mine out on our daily exercise outside. It's pretty quiet around here outside as expected. Dogs are getting more walks than they've ever had in their life and must be wondering what's going on... I am probably heading towards the 11672, the latest summicron version. I like the bokeh of it. As you say, getting a used one with a lens this new is difficult. Have fun though - avoid checking for flicker in the shadows and enjoy the cam first!! cheers Paul
  12. Thanks for doing that! It confirms that the fp is probably similar to the a7s. The f4 is clearly smeared as expected. I'd hoped with no OLPF then it would be closer to an M camera. So it seems that it's the latest versions that i will be looking for... The 28 cron seems spectacular in most ways. 35 is too close to the 50 and i find the bokeh of the 35 a little harsher than the others (of course there are so many versions so difficult to tell sometimes!) Thanks again Paul
  13. Thank you! Yes, i had most of the Voigtländers, from 12mm upwards, on various Sonys. They were pretty good, even the 12. The sigma is a bit better with the 12 as well. My understanding was that the 28s in particular were problematic because of the exit pupil distance in the design. It was viewing the Slack article that made me obsess a bit more about it all. I was never happy with the Voigtländer 1.5 and finally decided to try the summicron. Fell in love with the look and the render. Found they matched my APO cine lenses better. So decided to sell most of the Voigtländers
  14. Even Calman isn't perfect. The guys at Light Illusion really, really know their stuff and some of the articles on their site can be really useful. I tend to profile a monitor with a 17-point cube and then use that as a basis to generate a LUT for different colourspaces. So even on a factory calibrated 709 display (Postium reference monitor) i found that i still needed a LUT to get the calibration the best i could. I even used two different probes. And sanity checked by calibrating an iPad Pro (remarkably good display), matching that, and then taking that to the cinema and projecting the test footage
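For anyone wondering what a "17-point cube" is: the profiling pass measures a 17×17×17 grid of RGB patches (4913 in total) and the corrections become a 3D LUT. A minimal sketch of generating that grid in the usual .cube ordering (red varying fastest) might look like this — the layout follows the common Resolve/.cube convention, not anything Light Illusion specific.

```python
# Generate the 17^3 grid of patch values used to profile a display.
# Ordering follows the common .cube convention: red varies fastest,
# then green, then blue.
def cube_grid(n=17):
    for b in range(n):
        for g in range(n):
            for r in range(n):
                yield (r / (n - 1), g / (n - 1), b / (n - 1))

patches = list(cube_grid())
# A measured/corrected version of each patch would go here; an identity
# LUT just writes the grid values back out.
cube_lines = ["LUT_3D_SIZE 17"] + ["%.6f %.6f %.6f" % p for p in patches]
```

The probe measures each patch on the display; the difference between target and measurement is what ends up baked into the correction LUT.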
  15. As far as i understand the 1st version of both the Elmarit and the Summicron had huge issues on non M cameras because the filter stacks are too thick and the rear of the lens produced very shallow ray angles. Jono Slack did a comparison with the V1 and V2 summicron on a Sony a7 and the results were very different. The performance of it on the a7 was abysmal before. Again, infinity focused. But the fp has no OLPF and there's no data as to how thick that stack is. I've been after a 28 cron for a while, but the V2 is expensive and elusive whereas there are plenty of V1 around - but there are
  16. That's really nice of you to say so, i appreciate it. I think in these times of quarantine we can all obsess a bit. Looked at the DNG and ACES. My observations (as i don't generally use ACES) - For DNGs the input transform does nothing and you have no choice over colourspace or gamma in the RAW settings which makes sense. But the question is what is Resolve doing with the data - is it limiting it to a colourspace? Is it P3? Is it 709? Is it native camera? - This ACES version uses a better timeline colourspace. But Resolve really works best (in terms of the secondaries and gradin
  17. Good stuff on CoD, don't play it but looks pretty. The issue under Windows (and Mac to a lesser degree) is that there is no OS support even if you can calibrate your monitor. There's nowhere in the chain to put a calibration LUT. So really you're looking at a monitor you can upload one to, or a separate LUT box which you can run a signal through. When you scrape beneath the surface you will be amazed at the lack of support for what is a basic necessity. For gaming your targets are most likely the monitors you use and so that makes sense. But for projection or broadcast you really
  18. In theory, if the camera sees a Macbeth chart under a light that it is white balanced for, at a decent exposure, and you have the pipeline set up correctly - you should see what you saw in real life. If you decide that the colours are wrong, then that's a creative choice of yours, but the aim is really to get what you saw in real life as a starting point and then change things to how you like. Sounds like you may be under Windows? One of the problems with grading with a UI on a P3 monitor is that your UI and other colours are going to be super super saturated because the OS doesn't do col
  19. which version of the Elmarit - the newest one? Can you do me a favour and let me know what the corners are like at infinity on the fp? cheers! Paul
  20. What are you viewing on? If you're viewing on a modern Apple device (MacBook, iMacs, iDevices) then P3 is the correct output colourspace for your monitors. If you view 709 on a P3 monitor then it will look overly saturated; conversely, if you view P3 on a 709 display then it will look desaturated (the colours will also be off, reds will be more orange and so on). Colour Management is a complicated subject and it can get confusing fast. In theory you shouldn't be having to saturate images to make them look right. I have a feeling that perhaps there's something not set up quite right
  21. Just double checking but yeah, the green/magenta stuff is in the actual DNG itself. No question. What's happening is perhaps a truncation issue converting between the 12-bit source and the 10-bit. The green values are rgb(0,1,1) so no red value. This is the very lowest stop of recorded light, either 0 or 1. I think this is quite common because, come to think of it, the A7s always suffered green/magenta speckled noise - and it would make sense that the bottom two stops are truncating strongly, and then with noise on top of it. What sigma *ought* to do is to balance the bottom two stop
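The truncation idea is easy to see with a couple of made-up near-black samples: dropping from 12 to 10 bits by a plain right-shift collapses the bottom four 12-bit codes to zero, so the channel balance of deep-shadow pixels is destroyed and you get exactly the rgb(0,1,1)-style green/cyan and magenta speckle described above.

```python
# Made-up near-black 12-bit RGB samples, truncated to 10 bits by a
# right shift (i.e. floor division by 4: no rounding, no dither).
samples_12bit = [(3, 5, 4), (2, 6, 5), (1, 4, 3)]
samples_10bit = [tuple(v >> 2 for v in px) for px in samples_12bit]
# (3, 5, 4) -> (0, 1, 1): red is gone entirely, the pixel reads green/cyan
```

With rounding or dither instead of truncation the relative channel balance would survive much better — which is presumably what balancing the bottom stops would fix.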
  22. Of course. Use info@sigma-photo.co.jp Kindest Paul
  23. That's a good call. I would guess that part of the ACES DNG IDT is desaturating the shadows. Because i know the actual colour values in the source DNG, from the data before any debayering, are that colour. Of course i will now have to double-check AGAIN!! cheers Paul
  24. Your 640ISO doesn't flicker. That's correct right? I need to redo mine to double check that 320 was doing it all the time but 400 was the trigger factor on my camera. I think there are ways you can get around purple/green and most of them, like BMD Film, involve crushing the shadows and possibly desaturating them, so they're not so obvious. @Lars Steenhoff suggested incorrect black levels but it's not that. I suspect ACES is crushing a bit and an IR Cut filter may leech some colour out overall, you could desaturate the shadows in Resolve. The blotchiness is a factor of 10 bit shadow
  25. Yes, ACES works well once set up correctly. I just use IPP2 because i'm mostly Red based and find it nicer than ACES. cheers Paul