paulinventome

1. I think that's pretty unfair without knowing what we do. Most of my work is P3 based. Ironically, the vast majority of Apple devices have P3 screens, and there's lots of work targeting those screens. 2020 is taking hold in TVs and is becoming more relevant, and of course HDR is not sRGB. No VFX work is done in sRGB, and no professional workflow is sRGB in my experience. All we're doing is taking quarantine time and being geeks about our cameras. We just like to know, right? My main camera is a Red, not a Sigma fp, but the Sigma is really nice for what it is. It's just some fun, and along the way we all learn some new things (I hope!)

cheers
Paul
2. I'm with you on all this, but I don't know what you need Sigma to supply though... The two matrices inside each DNG handle the sensor-to-XYZ conversion for two illuminants. The resulting XYZ colours are not really bound by a colourspace as such, but clearly only those colours the sensor sees will be recorded. So the edges will not be straight; there's no colour gamut as such, just a spectral response. But the issue (if there is one) is Resolve taking the XYZ and *then* turning it into a bound colourspace, 709. And really, I still can't quite work out whether Resolve will only turn DNGs into 709 or not. So, as you say, for film people P3 is way more important than 709, and I am not 100% convinced the Resolve workflow is opening that up. But I might be wrong, as it's proving quite difficult to test (those CIE scopes do appear to be a bit all over the place, and I don't think they're reliable). As I mentioned before, I would normally take this into Nuke, but at the moment the Sigma DNGs for some reason don't want to open. Now I could be missing gaps here, so I am wondering: what should Sigma supply?

cheers
Paul
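For anyone wanting to poke at this outside Resolve, here's a minimal sketch of what those two DNG matrices do. The matrix values below are made-up placeholders standing in for the real ColorMatrix1/ColorMatrix2 tags; the weighting-by-inverse-CCT scheme is from the DNG spec:

```python
import numpy as np

# DNG ColorMatrix tags map XYZ -> camera-native RGB, one per calibration
# illuminant (typically StdA ~2856K and D65 ~6504K). Values below are
# placeholders; real ones come from the DNG's ColorMatrix1/2 tags.
color_matrix_1 = np.array([[ 0.9, -0.2, -0.1],   # illuminant 1 (StdA)
                           [-0.4,  1.3,  0.1],
                           [-0.1,  0.2,  0.6]])
color_matrix_2 = np.array([[ 0.8, -0.1, -0.1],   # illuminant 2 (D65)
                           [-0.4,  1.2,  0.2],
                           [ 0.0,  0.1,  0.7]])

def camera_to_xyz(cct, cct1=2856.0, cct2=6504.0):
    """Interpolate the two matrices by inverse CCT (per the DNG spec)
    and invert to get the camera-native -> XYZ transform."""
    w = (1.0 / cct - 1.0 / cct2) / (1.0 / cct1 - 1.0 / cct2)
    w = np.clip(w, 0.0, 1.0)
    xyz_to_cam = w * color_matrix_1 + (1.0 - w) * color_matrix_2
    return np.linalg.inv(xyz_to_cam)

# A saturated camera-native red lands wherever the sensor's spectral
# response puts it in XYZ; nothing here clamps it to any display gamut.
print(camera_to_xyz(5000.0) @ np.array([1.0, 0.05, 0.02]))
```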
3. Hi Devon, I missed this, sorry. Unbound RGB is a floating-point version of the colours where they have no upper and lower limits (well, infinity or max(float) really). If a colour is described in a finite set of numbers 0...255 then it is bound between those values, but you also need the concept of being able to say: which red is 255? That's where a colourspace comes in; each colourspace defines a different 'redness'. So P3 has a more saturated red than 709. There are many mathematical operations that need bounds, otherwise the maths fails - division, for example - whereas addition can work unbound. There's a more in-depth explanation here, with pictures too! https://ninedegreesbelow.com/photography/unbounded-srgb-divide-blend-mode.html Hope that helps?

cheers
Paul
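To make the divide example concrete, here's a tiny numpy sketch of the failure mode the linked article describes (the values are invented purely for illustration):

```python
import numpy as np

# Divide blend: result = base / blend. With values bound to [0,1] this is
# well behaved (given a small epsilon), but with unbounded floats, near-zero
# or negative channel values send the result to extremes or flip its sign.
base = np.array([0.5, 0.5, 0.5])

bounded_blend   = np.array([0.8, 0.4,  0.2])    # in-gamut values
unbounded_blend = np.array([0.8, 1e-6, -0.3])   # out-of-gamut after a transform

print(base / bounded_blend)    # [0.625 1.25  2.5 ]  -- large but manageable
print(base / unbounded_blend)  # [6.25e-01 5.00e+05 -1.67e+00]  -- blows up / flips sign
```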
4. First, just to say how much I appreciate you taking the time to answer these esoteric questions!

So I have a DNG which is a Macbeth chart shot with a reference P3 display behind it showing fully saturated R, G and B. So in this scene I hope I have pushed the sensor beyond 709 as a test. Resolve is set to DaVinci YRGB, not colour managed, so I am now able to change the Camera RAW settings. I set the colour space to Blackmagic Design and the gamma to Blackmagic Design Film.

To start with, my timeline is set to P3 and I am using the CIE scopes, and the first image I enclosed shows what I see. Firstly, I can see the response of the camera going beyond a primary triangle. So this is good. As you say, the spectral response is not a well-defined gamut, and I think this display shows that. But my choice of timeline is P3, and it looks like that is working with the colours, as they're hovering around 709 - but this could be luck. Changing the timeline to 709 gives a reduced gamut, and likewise setting the timeline to BMD Film gives an exploded gamut view. So BMD Film is not clipping any colours. The 4th is when I set everything to 709, so those original colours beyond green and red appear to be clamped or gamut mapped into 709.

So I *think* I am seeing a native gamut beyond 709 in the DNG, but applying the normal DNG route seems to clamp the colours. Then again, I could just be reading these diagrams wrong. Also, with BMD Film in Camera RAW, what should the timeline be set to, and should I be converting BMD Film manually into a space?

I hope this makes sense; I've a feeling I might have lost the plot on the way...

cheers
Paul
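As a sanity check on what those CIE scopes should show, here's a small numpy sketch using the standard D65 matrices and a made-up XYZ value: a colour that is legal in P3 but lands outside 709, which is exactly the information a 709 clamp destroys:

```python
import numpy as np

# Standard XYZ -> linear RGB matrices (Rec.709/sRGB and Display P3, D65).
XYZ_TO_709 = np.array([[ 3.2406, -1.5372, -0.4986],
                       [-0.9689,  1.8758,  0.0415],
                       [ 0.0557, -0.2040,  1.0570]])
XYZ_TO_P3  = np.array([[ 2.4935, -0.9314, -0.4027],
                       [-0.8295,  1.7627,  0.0236],
                       [ 0.0358, -0.0762,  0.9569]])

# A saturated green, roughly between the 709 and P3 green primaries
# (made-up number, e.g. what the display behind the chart might emit).
xyz = np.array([0.215, 0.50, 0.054])

rgb709 = XYZ_TO_709 @ xyz
rgbp3  = XYZ_TO_P3  @ xyz
print("709:", rgb709)   # negative channel(s) => outside Rec.709
print("P3: ", rgbp3)    # all channels >= 0  => inside P3

# Clamping to the timeline gamut is where the information is lost:
print("709 clipped:", np.clip(rgb709, 0.0, 1.0))
```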
5. I don't know what the native space of the fp is, but we do get RAW from it, so hopefully there's no colourspace transform, and I want to know I am getting the best out of the camera. And I'm a geek and like to know.

I *believe* that Resolve by default appears to be debayering and delivering 709 space, and I *think* the sensor's native space is greater than that - certainly with Sony sensors I've had in the past it is. There was a whole thing with RAW on the FS700 where 99% of people weren't doing it right, and I worked with a colour scientist on a plug-in for SpeedGrade that managed to do it right and got stunning images from that combo (IMHO the FS700 + RAW is still one of the nicest 4K RAW cameras if you do the right thing). The problem with most workflows was that reds were out of gamut and being contaminated with negative green values in Resolve (this was v11/v12 maybe?). I have screenshots of the scopes with negative values showing, and so many comparisons!

I understand that the matrices transform from sensor to XYZ, that there's no guarantee whether the sensor can see any particular colour at any point, and that the sensor is a spectral response, not a defined gamut. I also know that if I were able to shine full-spectrum light on the sensor I could probably record the sensor response. I know someone in Australia who does this for multiple cameras.

But within the Resolve ecosystem, how do I get it to give me all the colour in a suitably large space so I can see? If I turn off colour management and manually use the gamut in the Camera RAW tab, Resolve appears to be *scaling* the colour to fit P3 or 709. What I want to see is the scene looking the same in 709 and P3 except where there is a super-saturated red (for example), and in P3 I want to see more tonality there. I'm battling Resolve a little because I don't know 100% what it is doing behind the scenes, and if I could get the DNGs into Nuke then I am familiar enough with that side to work this out, but for some reason the Sigma DNGs don't work!

thanks!
Paul
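One way to quantify that negative-green contamination without trusting any one app: export an unclipped linear frame (EXR, say) and count the offending pixels yourself. A rough numpy sketch, with random data standing in for a real decoded frame:

```python
import numpy as np

# Given a float image already in linear Rec.709 (e.g. exported from
# Resolve as EXR with no clipping), count out-of-gamut pixels: any
# negative channel means the source colour lies outside Rec.709.
def out_of_gamut_report(img):
    """img: float array of shape (H, W, 3), linear light."""
    neg = (img < 0.0).any(axis=-1)
    print(f"{neg.mean() * 100:.2f}% of pixels fall outside the gamut")
    # Negative green alongside strong red is the classic signature of
    # a saturated red beyond the Rec.709 red primary.
    red_oog = (img[..., 0] > 0.5) & (img[..., 1] < 0.0)
    print(f"{red_oog.mean() * 100:.2f}% look like out-of-gamut reds")

# Example with synthetic data standing in for a decoded frame:
frame = np.random.uniform(-0.1, 1.0, size=(1080, 1920, 3)).astype(np.float32)
out_of_gamut_report(frame)
```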
6. I got a reply from them last week - has it shut down recently? We need to double-check we're talking about the same thing. I see, half way through, an exposure 'pulse' - not a single-frame flash but more of a pulse. I believe it's an exposure thing in linear space, but when viewed like this it affects the shadows much more. Can you post a direct link to the video, and maybe enable download, so we can step through it frame by frame? Mild pulsing is also very common in compression, so we want to make sure we're not looking at that...?

cheers
Paul
7. Actually, you don't really want to work in a huge colourspace, because colour maths goes wrong and some things become more difficult to do. Extreme examples here: https://ninedegreesbelow.com/photography/unbounded-srgb-as-universal-working-space.html There were even issues with CGI rendering in P3 space as well. You want to work in a space large enough to contain your colour gamut. These days it's probably prudent to work in 2020 or one of the ACES spaces especially designed as a working space.

What you say is essentially true. If you are mastering, then you'd work in a large space and then do a trim pass for each deliverable - film, digital cinema, TV, YouTube, etc. You can transform from a large space into each deliverable, but if your source uses colours and values beyond your destination then a manual trim is the best approach. You see this more often now with HDR: a wide-colour-space HDR master is done, and then additional passes are done for SDR, etc. However, this is at the high end. Even in indie cinema the vast majority of grading and delivering is still done in 709 space. We are still a little way off that changing. Bear in mind that the vast majority of cameras are really seeing in 709. P3 is plenty big enough - 2020 is IMHO a bit over the top.

cheers
Paul
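To illustrate the trim-pass point with numbers (standard D65 matrices; the desaturation amount is an invented stand-in for a colourist's per-shot trim):

```python
import numpy as np

# Standard D65 matrices: Rec.2020 -> XYZ, then XYZ -> Rec.709.
RGB2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                           [0.2627, 0.6780, 0.0593],
                           [0.0000, 0.0281, 1.0610]])
XYZ_TO_709 = np.array([[ 3.2406, -1.5372, -0.4986],
                       [-0.9689,  1.8758,  0.0415],
                       [ 0.0557, -0.2040,  1.0570]])
M_2020_TO_709 = XYZ_TO_709 @ RGB2020_TO_XYZ

# A saturated Rec.2020 red: fine in the master, but it maps outside
# Rec.709, so the bare transform alone is not a deliverable.
red_2020 = np.array([1.0, 0.0, 0.0])
red_709 = M_2020_TO_709 @ red_2020
print(red_709)                 # negative G/B channels

# Hard clip (what you get if you do nothing) vs. a crude manual trim:
print(np.clip(red_709, 0, 1))  # hue/saturation shift baked in
desat = 0.7                    # hand-tuned per shot in a real trim pass
trimmed = desat * red_2020 + (1 - desat) * red_2020.mean()  # pull toward grey
print(M_2020_TO_709 @ trimmed) # channels now non-negative; only level clips
```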
8. If you're talking about the shadow flash half way through, then yes. At ISO 400 only. And yes, Sigma are aware of it, and I'm awaiting some data as to whether we can guarantee it only happens at 400. This is the same for me. But it's only 400 for me, whereas others have seen it at other ISOs...

cheers
Paul
9. That would be awesome! Yes, thank you. Sorry I missed this as I was scanning the forum earlier. Much appreciated!

Paul
10. And is that sensor space defined by the matrices in the DNG, or is it built into Resolve specifically for BMD cameras?

I did try to shoot a P3 image off my reference monitor - actually an RGB chart in both P3 and 709 space. The idea is to try various methods of displaying it to see if the sensor space is larger than 709. Results so far: inconclusive! Too many variables that could mess the results up.

If you turn off Resolve colour management then you can choose the space to debayer into. But if you choose P3, then the image is scaled into P3 - correct? If you choose 709, it is scaled into there. So it seems that all of the options scale to fit the selected space. Can you suggest a workflow that might reveal the native gamut? For some reason I cannot get the Sigma DNGs into Nuke, otherwise I'd be able to confirm there. In my experience of Sony sensors, usually the reds go way beyond 709.

So end to end there's a bunch of things to check. One example is whether, on the camera itself, choosing a colourspace alters the matrices in the DNG - i.e. how much pre-processing happens to the colour in the camera.

Cheers
Paul
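One workflow that might reveal the native gamut without Resolve in the loop: decode the DNG with rawpy/libraw straight into XYZ and inspect the values yourself. A sketch, assuming rawpy will open the fp's files (the filename is hypothetical):

```python
import rawpy
import numpy as np

# Sidestep Resolve entirely: decode the DNG into XYZ, linear gamma,
# no auto-brightening, and look at the numbers directly. (Nuke refusing
# these DNGs suggests checking them with more than one decoder anyway.)
raw = rawpy.imread("sigma_fp_frame.dng")        # hypothetical filename

print(raw.rgb_xyz_matrix)   # the camera RGB <-> XYZ matrix libraw derived

xyz = raw.postprocess(output_color=rawpy.ColorSpace.XYZ,
                      gamma=(1, 1),             # keep it linear
                      no_auto_bright=True,
                      output_bps=16).astype(np.float32) / 65535.0

# Project XYZ into Rec.709; any negative channel = a colour beyond 709.
XYZ_TO_709 = np.array([[ 3.2406, -1.5372, -0.4986],
                       [-0.9689,  1.8758,  0.0415],
                       [ 0.0557, -0.2040,  1.0570]])
rgb709 = xyz @ XYZ_TO_709.T
print((rgb709 < 0).any(axis=-1).mean())         # fraction outside 709
```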
11. Some small compensation for the lockdown. I've been taking mine out on our daily exercise outside. It's pretty quiet around here, as expected. Dogs are getting more walks than they've ever had in their lives and must be wondering what's going on...

I am probably heading towards the 11672, the latest Summicron version. I like the bokeh of it. As you say, getting a used one with a lens this new is difficult. Have fun though - avoid checking for flicker in the shadows and enjoy the cam first!!

cheers
Paul
12. Thanks for doing that! It confirms that the fp is probably similar to the a7S. The f4 is clearly smeared, as expected. I'd hoped that with no OLPF it would be closer to an M camera. So it seems that it's the latest versions I will be looking for... The 28 Cron seems spectacular in most ways. The 35 is too close to the 50, and I find the bokeh of the 35 a little harsher than the others (of course there are so many versions, so it's difficult to tell sometimes!)

Thanks again
Paul
13. Thank you! Yes, I had most of the Voigtländers, from 12mm upwards, on various Sonys. They were pretty good, even the 12 - the Sigma is a bit better with the 12 as well. My understanding was that the 28s in particular were problematic because of the exit pupil distance in the design. It was seeing the Jono Slack article that made me obsess a bit more about it all. I was never happy with the Voigtländer 1.5 and finally decided to try the Summicron. Fell in love with the look and the render. Found they matched my APO cine lenses better. So I decided to sell most of the Voigtländers and just focus on three Leica lenses: 28, 50 and 90. There's nowhere I've found to rent in the UK to test, hence asking people!

Love the photos. The only issue is that while most of my photography may be like that, I do also need f8 landscapes/architecture with as flat a field as possible. So I'm looking for samples like that too!

cheers
Paul
14. Even Calman isn't perfect. The guys at Light Illusion really, really know their stuff, and some of the articles on their site can be really useful. I tend to profile a monitor with a 17-point cube and then use that as a basis to generate a LUT for different colourspaces. So even on a factory-calibrated 709 display (a Postium reference monitor) I found that I still needed a LUT to get the calibration the best I could. I even used two different probes. And I sanity-checked by calibrating an iPad Pro (remarkably good display), matching that, and then taking the iPad to the cinema and projecting the test footage alongside it so I could eyeball the two. Once I know I have devices I can be pretty calm about, that's a good place to start. I work from home and I'm not a dedicated facility, so I do the best with what I have.

One issue with the iDevice displays, though, is that OLED is an odd creature. You will find that only OLED can create fully saturated colours in shadows. So if you watch a profile on, say, 100% red, then as the brightness decreases the red stays at the same chroma coordinates, whereas on pretty much all other display technologies the chroma will desaturate.

I mentioned the saturation as a way to eyeball the white balance: push it up and you see colour casts very easily. I found your DNG looked perfectly natural with default settings.

As to the question of what Resolve is doing with the fp colours, I still don't 100% know. I think it does the same in ACES as with DNG. In your case I think you were using an AP1 space, and if you saturate in that then you can push the colours to the edge of AP1, which is expected - especially via a node, because the Sigma colours start off as a small portion of AP1, but the Resolve node can push them anywhere within that colourspace. Working in too big a colourspace is also problematic. The original ACES space (AP0) was too big as a working space, as the grading controls would be too heavy-handed given the huge space they had to work within. AP0 was considered an archiving space, not a working one. Maths can also behave differently; there were some great examples of maths failing in unbound RGB and even P3. So going with the biggest is not always the best!

I think the way to see what the camera is doing would be to shoot some super-saturated colours, beyond Pointer's gamut (which is the gamut of natural surface colours), and then take that DNG into Resolve. Do it as 709, then P3, and compare whether a) it looks the same save for some parts, and b) the CIE diagrams clearly show colours beyond 709 without any tweaks. I wonder if I should shoot an RGB chart off my P3 reference monitor and see how that fares? If this is a Sony sensor then I would pretty much guarantee the red is beyond 709 - this was an issue I had with Sony sensors way back with the FS100 and 700, and with the a7 series.

cheers
Paul
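For anyone curious what 'a 17-point cube' means mechanically, here's a minimal sketch of applying one per pixel via trilinear interpolation. The cube values here are random stand-ins for a measured profile:

```python
import numpy as np

# A 17x17x17 calibration cube of the kind a profiling run produces:
# it maps device RGB -> corrected RGB at each lattice point.
N = 17
cube = np.random.rand(N, N, N, 3).astype(np.float32)  # stand-in for a measured LUT

def apply_lut(rgb, cube):
    """rgb: three floats in [0,1]; returns the trilinearly interpolated value."""
    n = cube.shape[0] - 1
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * n
    lo = np.floor(pos).astype(int)          # lower lattice corner
    hi = np.minimum(lo + 1, n)              # upper lattice corner
    f = pos - lo                            # fractional position in the cell
    out = np.zeros(3)
    for i in (0, 1):                        # walk the 8 surrounding lattice points
        for j in (0, 1):
            for k in (0, 1):
                corner = cube[(lo[0], hi[0])[i], (lo[1], hi[1])[j], (lo[2], hi[2])[k]]
                weight = ((1 - f[0], f[0])[i] *
                          (1 - f[1], f[1])[j] *
                          (1 - f[2], f[2])[k])
                out += weight * corner
    return out

print(apply_lut([0.25, 0.5, 0.75], cube))
```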
15. As far as I understand, the first version of both the Elmarit and the Summicron had huge issues on non-M cameras, because the filter stacks are too thick and the rear of the lens produces very shallow ray angles. Jono Slack did a comparison of the V1 and V2 Summicron on a Sony a7 and the results were very different - the V1's performance on the a7 was abysmal. Again, focused at infinity. But the fp has no OLPF, and there's no data on how thick its stack is. I've been after a 28 Cron for a while, but the V2 is expensive and elusive, whereas there are plenty of V1s around - yet there are no tests on an fp. The 50 and 90 are incredible on the fp, but I need a wide to go with them. So I would be super interested to see how that fares when you do get your fp. This is less FC and more smearing - more an issue with the camera than the lens.

thanks
Paul