
paulinventome

Members
  • Content Count: 75
  • Joined
  • Last visited
  • Days Won: 2

paulinventome last won the day on January 3

paulinventome had the most liked content!

About paulinventome
  • Rank: Member


  1. The only part of that I'd have issues doing is writing out a DNG; I can do the rest. Unless I can make my own debayer in Nuke, but ideally we'd want to match like for like. I suspect @cpc might be able to write out DNGs though... cheers Paul
  2. Beg to differ on this one! If you have a high-contrast edge and use various scaling algorithms (Lanczos, sinc, etc.), the scale often produces negative lobes (like a curve overshoot). It's quite easy to see this; it happens in compositing all the time. It's exacerbated by working linear, as we always do (I suspect because of the dynamic range). So it's a well-established trick to convert to log, scale, and convert back to linear (a small sketch is below). I suspect some apps do this automatically (maybe Resolve does, I don't know). https://www.sciencedirect.com/topics/computer-science/ringing-artifact It's a bit geeky, but there's a diagram of a square pulse and the lobes that extend beyond it. Sony had this issue in camera on the FS700: very bright edges would give black pixels as the scale went negative. My gut tells me that in the case of a Bayer sensor it could be even worse, partly because of the negative signal but also the tendency to have stronger edges because of the missing bits of image? Kindest Paul

Cool. But are they not just taking each of RGGB into a separate greyscale image, so you have four grey images roughly 3000x1250? Scale those to 1920x1080, then create Bayer data for 3840x2160 by just alternating pixels to build up the RGGB again? So maybe we are seeing artefacts in the scale process depending on the algorithm used. It might even be nearest neighbour, which is what I think you're demonstrating: taking an existing pixel and alternating to make up the UHD image. But if that scale is done properly, is it not going to improve the image, or does the fact that those four layers are already missing the pixels in between actually make it worse? The ideal would be to take each RGGB layer and interpolate, so that instead of going from 3K (one channel) you interpolate up to the full 6K by filling in the missing pixels and *then* scale down to 1920; the resulting Bayer image might actually look really good...? cheers Paul
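A minimal sketch of that log trick, using numpy and scipy (both assumptions about tooling; Nuke's own resamplers behave similarly). Cubic interpolation of a high-contrast linear-light edge overshoots and can go negative, while the same resample done through a log encoding stays bounded:

```python
import numpy as np
from scipy.ndimage import zoom

# A high-contrast linear-light edge: deep shadow against a bright highlight.
edge = np.concatenate([np.full(32, 0.001), np.full(32, 16.0)])

# Resampling directly in linear: the cubic kernel rings around the edge and
# the negative lobe can push values below zero (the "black pixel" case).
linear_scaled = zoom(edge, 0.5, order=3)
print("linear-domain min:", linear_scaled.min())      # typically < 0

# Same resample via a log encoding: offset, log, scale, expand back.
tiny = 1e-6                                           # avoids log2(0)
log_scaled = zoom(np.log2(edge + tiny), 0.5, order=3)
back = np.exp2(log_scaled) - tiny
print("log-domain min:   ", back.min())               # bounded near zero
```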
  3. Some interesting experiments @rawshooter and @cpc. Sigma has said that the 6K image is scaled to UHD, not binned or skipped. The reasoning is that scaling mitigates the lack of an OLPF a bit. I think what we might be seeing and saying is that the scaling algorithm used could be improved. There are all sorts of issues with sharpening in scaling: you get negative lobes around edges, which can result in black (negative) pixels (Sony had some issues with this). I'm wondering if the black pixels in the resulting image were just clipped, but actually in these images the raw data is doing odd things. The worst colourspace to scale in is linear, which is the sensor's native space. In Nuke we'd often change an image to log, scale, and then go back to linear. The question is: naively I am assuming each of the RGGB layers is scaled, then written out in Bayer format (a sketch of that pipeline is below). Does that sound likely? Does that work okay with Bayer reconstruction? One of my wish-list items for Sigma is more choice over image size: 2:1 at the same data rates would be useful. If they are taking the whole 6K image (and there's no reason to say they're not) then I wonder whether the camera can actually dump the data out to SSD fast enough. I believe Sigma are obviously keen to keep the SDXC card as a main source, but just getting *all* the raw data would be useful. Could you elaborate on how you think the scale is done? cheers Paul
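To make the question concrete, here is a toy version of that pipeline (numpy/scipy assumed; the plane layout, scale factors and the bilinear filter are all guesses rather than anything Sigma has confirmed):

```python
import numpy as np
from scipy.ndimage import zoom

def rescale_bayer(mosaic: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Scale an RGGB mosaic to out_h x out_w while keeping it a valid mosaic."""
    planes = {
        "R":  mosaic[0::2, 0::2],
        "G1": mosaic[0::2, 1::2],
        "G2": mosaic[1::2, 0::2],
        "B":  mosaic[1::2, 1::2],
    }
    ph, pw = planes["R"].shape
    factors = ((out_h // 2) / ph, (out_w // 2) / pw)
    # order=1 (bilinear) avoids the negative lobes discussed above; a sharper
    # kernel would want the log trick from the previous post.
    scaled = {k: zoom(v, factors, order=1) for k, v in planes.items()}
    out = np.empty((out_h, out_w), dtype=mosaic.dtype)
    out[0::2, 0::2] = scaled["R"]
    out[0::2, 1::2] = scaled["G1"]
    out[1::2, 0::2] = scaled["G2"]
    out[1::2, 1::2] = scaled["B"]
    return out

# Hypothetical ~6K mosaic down to UHD.
mosaic = np.random.rand(3376, 6000).astype(np.float32)
print(rescale_bayer(mosaic, 2160, 3840).shape)  # (2160, 3840)
```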
  4. I think the Rs are good for photography because they tend to show two different renders, wide open (glowing) and stopped down, especially the Mandler-designed ones (the Luxes you quote). I found the 90 Summicron very poor with flare, almost unusable. The 24 is not a Leica design and it shows. The 28mm f2.8 Mark II is excellent, and the last version of the 19 is pretty good too, much better than the Contax 18. Yes, I have focus gears for the M lenses. Studio AFS make some aluminium gears that you can twist on, like the Zeiss gears but with scalloped inserts that will hold around the smaller barrels of the Ms. The Zeiss versions don't go small enough. They're really good and easy to twist on and off when needed. The issue with the Ms is cost. I'm heading for a 90 APO and the latest 28 Summicron, but I may have to sell the firstborn. You do get what you pay for (up to a certain point). I don't find Resolve's debayer very good. I am trying to get some fixes into Nuke that will allow these DNGs to go through there. I don't see any reason why we should see coloured pixels if the algorithm understands what the content is, right? I mean, it's all recreated, so why false colours? In this case I believe these artefacts look like the AHD(?) algorithm (see the toy demosaic below). cheers Paul
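For what false colour from a naive debayer looks like, here's a toy bilinear demosaic (numpy/scipy assumed; this is illustrative only and is not Resolve's or Nuke's algorithm, let alone AHD). A neutral grey edge comes back with diverging channels right at the transition, which is exactly what gradient-aware algorithms like AHD try to suppress:

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(mosaic: np.ndarray) -> np.ndarray:
    """Naive bilinear interpolation of each RGGB plane of a mosaic."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    g_mask = np.zeros((h, w)); g_mask[0::2, 1::2] = 1; g_mask[1::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0  # quarter-sampled
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0  # checkerboard
    out = np.zeros((h, w, 3))
    for c, (mask, k) in enumerate([(r_mask, k_rb), (g_mask, k_g), (b_mask, k_rb)]):
        out[..., c] = convolve(mosaic * mask, k, mode="mirror")
    return out

# A grey vertical edge: both sides neutral, so ideally R=G=B everywhere.
scene = np.ones((8, 8)); scene[:, 4:] = 0.1
rgb = bilinear_demosaic(scene)   # a grey scene sampled through the CFA
print(rgb[4, 3:6])               # channels diverge sharply at the edge
```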
  5. I've had a bunch of Contax in the past; I think we all go through phases! I had that 28-70 Contax and it's a nice lens. I don't believe it was designed by Zeiss, I think it was a Minolta job. But ergonomically it was a lot better than the push-pull focus ones; the problem with those is that the zoom often creeps if you're not shooting horizontal! I was using it on APS-C, so full frame would actually suit it better. IMHO, after going through Zuiko, Leica R, Canon FD and Russian Lomos, the Contax are the best of the bunch in terms of older lenses. (Yes, they are better than Leica R, with the exception of one or two very special R lenses.) But what I did notice is that variation between samples was quite large, as perhaps can be expected from used lenses. I actually had four copies of the Contax 50mm f1.4 at one point, and one of them was stunning while the others were merely okay. (I still have it, actually.) I also have a lot of Voigtländers, which are a fantastic match for the Sigma fp. Even so, I had some issues with some of those and finally ended up with my first proper Leica M. I wasn't sure how much of the M-lens reputation is just raving and confirmation bias because it's a Leica (I really didn't gel with the R lenses). But actually the 50mm Summicron is a stunning lens. And now, finally, I'm getting round to selling as much as I can and just focusing on a few of the M lenses. Perhaps all paths lead there!

I think this is known about? I've reported it, and also flickering in the EVF at times, especially in low light. However, I've not noticed any in footage (yet). cheers Paul

Okay, I had a look at the video now. As we know, you aren't using an fp for anything other than RAW; there are much better choices on the market, so I think we can ignore any tests with the MOV formats. I may try the resolution stuff myself when I have a moment. My feeling is that these are debayering artefacts; I think it unlikely they are errors introduced by the camera. I do think that how you debayer makes a lot of difference: different algorithms for different uses. The flickering is not in his video, is it? But in yours? Do you mean flickering of exposure, or the flickering of fine detail as it moves from pixel to pixel? Up to a point I think this could be the lack of OLPF here... cheers Paul
  6. That's my thinking. AFAIK highlight reconstruction works best in XYZ space, and I'd read that the BRAW files are white balanced, but that was something I'd read and I have no idea whether it's true or not. It's difficult to know whether this is RAW or not; the ability to debayer with new future algorithms, with no white balance baked in, really seems RAW to me. I don't quite understand the benefit of raw processing in camera vs a log file out of a camera... (a crude sketch of the reconstruction idea is below) cheers Paul
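For reference, the simplest form of the idea, heavily hedged: real implementations work in XYZ with neighbourhood statistics, and this is not what Resolve or Blackmagic actually do. The sketch estimates a clipped channel from an unclipped neighbour's channel ratios; all values are invented:

```python
import numpy as np

def reconstruct_pixel(pixel: np.ndarray, reference: np.ndarray,
                      clip: float = 1.0) -> np.ndarray:
    """Estimate clipped channels of one pixel from an unclipped neighbour's
    channel ratios (the crude core of many reconstruction schemes)."""
    out = pixel.copy()
    is_clipped = pixel >= clip
    # Brightness scale between the two pixels, judged on surviving channels.
    scale = np.mean(pixel[~is_clipped] / reference[~is_clipped])
    out[is_clipped] = reference[is_clipped] * scale
    return np.maximum(out, pixel)        # never below the recorded value

neighbour = np.array([0.55, 0.45, 0.35])   # unclipped, same material
pixel     = np.array([1.00, 0.90, 0.70])   # red hit the sensor clip
print(reconstruct_pixel(pixel, neighbour)) # red re-estimated above 1.0
```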
  7. Thanks for clarifying officially! If you actually do this, the settings are Color Space: Blackmagic Design, Gamma: Blackmagic Design Film. The gamma curve brings all the range in and the colourspace is BMD. What does that actually mean, though, if you say it won't touch gamut? To try to understand, I use the CIE diagram scope. Setting the DNG to P3 vs BMD Film 1 (in a P3 timeline) I see the first two diagrams: the first is BMD Film into a P3 timeline, and the second is debayering to P3 into a P3 timeline. Now this is basically a point of confusion in Resolve for me: understanding what Resolve is actually doing in this case. A saturated red in sensor space in the DNG is transformed into 709/P3/BMD Film, but is that saturated red actually mapped into the target space, or is it placed at the right point? I cannot work it out. For example, if that sensor red is outside of 709 space, Resolve could either clip the values or remap them; but if you switched to P3 and that red lay within it, then no mapping would have to happen. On the CIE diagram, though, I do not see this. I would expect to switch between 709 and P3 and see the colours inside the triangles not move. In other words, in a sufficiently large colourspace on that CIE diagram I ought to be able to see what the camera space is (ish). When I do this and choose BMD Film, with the timeline set to that as well, I get the third diagram, which is beyond visible colours. So I have to assume I just don't understand what Resolve is doing here, or the CIE diagram is not working? Any light shed (pun intended) would be super nice! cheers Paul
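On the clip-or-remap question, the geometry at least is easy to check outside Resolve. A small sketch (the primaries are the standard published 709/P3 xy values; the "sensor red" sample is invented) showing a chromaticity that 709 must clip or compress but P3 contains as-is:

```python
import numpy as np

REC709 = np.array([[0.640, 0.330], [0.300, 0.600], [0.150, 0.060]])  # R, G, B
P3     = np.array([[0.680, 0.320], [0.265, 0.690], [0.150, 0.060]])

def inside(xy, tri):
    """Point-in-triangle test via the signs of the edge cross products."""
    def cross(a, b, p):
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    s = [cross(tri[i], tri[(i + 1) % 3], xy) for i in range(3)]
    return all(v >= 0 for v in s) or all(v <= 0 for v in s)

saturated_red = (0.66, 0.33)   # hypothetical: just outside 709, inside P3
print("in 709:", inside(saturated_red, REC709))  # False -> clip or remap
print("in P3: ", inside(saturated_red, P3))      # True  -> fits as-is
```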
  8. @rawshooter Excellent work, thank you. I don't know very much about BRAW and its workflows. One thing I am aware of is that quite a few cameras (usually ones not shooting RAW) will do their own highlight processing in camera before spitting out an image. It's nigh impossible to get an R3D processed via IPP2 without it, because it's also a valid method of reconstructing an image (just like filling in missing pixel values). So the check would be: are those BRAW files actually just raw data from the camera, or do they have camera processing in them? An important part of range is colour quality, so it's quite usual for people to shoot colourful scenes or charts, under- and over-expose them, bring them back to a standard exposure and see how the colours hold up; in day-to-day shooting this can be important. To do this I personally tend to use shutter speed, not aperture, as lenses aren't that accurate (not sure shutter speed is either, but I think it's more likely to be; a sketch of the sweep is below). I do have a step wedge here somewhere that I would like to use to shoot some range charts, but I have mislaid it! I reckon the fp is doing around 12 stops, and it's fairly clean in the shadows, so it's quite a usable range. As I said before, there is a sensor mode that does 14-bit out, but I don't think that's being used anywhere. One thing with the fp is that it's easy to compare stills to cine. cheers Paul
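The shutter-based sweep is easy to script out in advance. A small sketch (the 1/50 s base is just an example value):

```python
# Bracketing in whole stops by doubling/halving shutter time around a base
# exposure; the grade then normalises each clip back by the opposite amount.
base_shutter = 1 / 50                      # seconds, example base exposure

for s in range(-4, 5):                     # -4 .. +4 stops
    shutter = base_shutter * (2 ** s)      # +1 stop = double the time
    print(f"{s:+d} stop: shutter {shutter * 1000:7.2f} ms, "
          f"normalise by {-s:+d} stops in post")
```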
  9. Personally I don't tend to use gimbals. A lot of what I do can be done from a monopod; I have a carbon fibre one with what's like a gel cushion on top that lets me level the camera easily. Of course, moving is a different case, and I have some Edelkrone sliders/heads which work fairly well. I have some bigger ones too, but for the fp they're overkill. One day I'll get a gimbal though, if for no other reason than it'd make a nice motorised head, if the software works well. cheers Paul insta: paul.inventome
  10. IMHO you generally shouldn't run a gimbal anywhere near its payload capacity. I'd have thought the S would be perfect? Would you be loading up the fp: wireless follow focus, cage, power? cheers Paul insta: paul.inventome
  11. That's very kind of you, thank you. Post is minimal at the moment. However, I think one 'trick' is that I have this as a Resolve project which is YRGB Color Managed: my timeline is 709, the output colourspace is 709, but my Timeline to Output Gamut Mapping is RED IPP2, with both settings on medium. (The input colourspace is RedWideGamut, but AFAIK when dealing with RAW and DNGs this is ignored.) This is because most of the project is Red based. BUT the Sigma fp footage is debayered into 709 (so in Camera RAW for the DNGs it is Colour Space 709 and Gamma 709, with highlight recovery, by default). What happens is that the DNGs are debayered correctly, with full data, but the IPP2 mapping handles the contrast and highlight rolloff for my project as a whole, including the DNGs. I do this all the time with various footage, not least because it's easier to match different cameras, but mostly because that IPP2 mapping is really nice. Whilst I'm sure you can massage your highlights to roll off softly, it makes more sense for me to push all footage through the same pipeline. Take some footage and try it: when you push the exposure underneath the IPP2 mapping, the results look natural and the colours and saturation expose 'properly'. Turn it off and you're in the land of saturated highlights and all sorts of oddness that you have to deal with manually. This is not a fault of the footage but of the workflow. Running any baked codec makes this more difficult; the success of this approach rests on the source being linear and natural. As I say, the fp is like an old cine camera: the bells and whistles are minimal, but if you're happy to manual everything, I think it can produce lovely images, and it's so quick to pull out of a bag. If the above doesn't make sense, let me know and I'll try to put together a sample. Cheers Paul insta: paul.inventome
  12. So I said I'd post some stills; these are basically ungraded. This frame is in a sequence with car lights; I like the tonality of this very subdued moment. Shot 12-bit to manage shadow tonality. From a different point above. All shot on a 50mm M Summicron, probably wide open. I think I hit the saturation slider here in Resolve. But this had a car rolling over the camera. It's a 21mm CV lens and I see some CA from the lens that I would deal with in post. But I'd never let a car run over a Red! Shot on an 85mm APO off a monopod. Nice tonality again; it's daylight from windows with some small panel lights bouncing and filling in. A reverse of the above. Some fun shots. I think the true benefit of something like the fp is the speed at which you can see something and grab it. Using it just with an SSD plugged in and manual M lenses gives a more spontaneous feel. Now most of the film will be shot on Red, in controlled conditions with a crew, and that's the right approach for multiple dialogue scenes and careful blocking. But the fp has its place, and I may hand it to someone and just say grab stuff. cheers Paul
  13. The BMD Film setting is specifically designed for BMD sensor response; I believe someone from BMD also confirmed this. The fp sensor is not the same, and making assumptions about that response will lead to colour errors, albeit perhaps minor ones. Depending on the project (709 vs P3) I have been debayering into the target colourspace directly, which AFAIK uses the matrices inside the DNG files to handle the colour according to Sigma's own settings. So if I am on a P3 project, I debayer directly into P3. There is also an argument that you should *always* debayer into P3 because it is a larger space, and then you have the flexibility to massage the image into 709 the way you want to. This is really about bright saturated lights and out-of-709-gamut colours, which the sensor can see and record, especially reds. In Resolve my timeline is set to 709/P3, as the grading controls are really designed for 709. In the Camera RAW settings for each clip I massage the exposure/shadows/highlights to taste. I see no purpose in going to a log image when the RAW is linear. If you need to apply a LUT that expects a log image, you can use a single node to do that conversion (a sketch below). IMHO of course; your own needs could be different! Grading RAW is dead easy. Cheers Paul

There is something going on here for sure: I sometimes see the screen vary in front of me without doing anything. There have been stills that were more underexposed than I expected, and I reported a bug to them about the stills preview being brighter than cine for the same settings, but I couldn't reproduce it. So yeah, there is a bug somewhere! cheers Paul
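That single node amounts to one transfer function. A generic sketch (the curve constants here are made up for illustration; in practice you'd use the exact encode the LUT expects, e.g. Cineon or LogC):

```python
import numpy as np

def lin_to_log(x, lin_ref=0.18, log_ref=0.435, density=0.08):
    """Map linear values to a generic log encoding around middle grey."""
    x = np.maximum(x, 1e-6)            # avoid log of zero
    return log_ref + density * np.log2(x / lin_ref)

def log_to_lin(y, lin_ref=0.18, log_ref=0.435, density=0.08):
    """Inverse transform back to linear light."""
    return lin_ref * np.exp2((y - log_ref) / density)

linear = np.array([0.0, 0.045, 0.18, 0.72, 2.88])  # ~ -2, 0, +2, +4 stops
encoded = lin_to_log(linear)
print(encoded)               # grey lands at 0.435; stops are evenly spaced
print(log_to_lin(encoded))   # round-trips the input (clamped zero aside)
```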
  14. Me too. My left eye doesn't need glasses and the diopter needs to be close to -2 for it to work properly, and I'm short-sighted in the other eye but it's still at -2, so I really don't know what that diopter is correcting for... Also yes, the first time you view a DNG file it takes time to preview it. It's as if there's a very low-res thumbnail and then it gets debayered on demand. cheers Paul
  15. Been away over the break and also shot a bunch of scenes for a longer-form thing whilst at it; I'll post some frames later, but by now I've spent quite a bit of time with the Sigma. I'm liking it more as a stills camera now. I left a Summicron 50mm on it mostly and that combination is lovely; the Leica is better on it than on the A7s, and perhaps the lack of OLPF and filter stack has something to do with that. It's the perfect camera for M-mount lenses for sure. I brought a limited lens set with me, just what I needed for shots: 21mm Voigtländer, 50mm Summicron and 85mm APO HyperPrime. The 85 is an insanely good lens, one of the best I've used, and the Sigma fp works well with it, resolving everything the lens sees no problem.

In terms of footage, I shot some MOS at 23.98 to get 12-bit; sped up to 25 in Resolve it's usually not noticeable, and I'm working with snow at times, so range with sun and everything is tough. The rest was 10-bit with sync audio. I shot some plates for VFX elements as well. I even ran the camera with a car going over it. Overall it performed really well. The 12-bit cine vs 14-bit stills difference is nigh unnoticeable, and 10-bit is great so long as you don't push the shadows up too much. I want to do a dynamic range test; I have a step wedge around here somewhere (a sketch of the test is below). I feel the range is the same as the A7S II, which is also a 14-bit RAW container. I suspect the range is pretty much the same as all the current sensors like this, and in fact 12-bit in cine mode seems very, very common too.

Issues: well, I find myself hanging the SSD off a cable sometimes; when moving fast and rigging, it's easier to leave it dangling, and that's probably bad. My screen exposure seems to change and flicker (in terms of the brightness of the LCD itself). A RAW exposure tool is really vital IMHO. But shooting RAW is so damn simple, as filming should be: just nail exposure and you're good to go. I read in Film & Digital Times that Sigma themselves recommend interpreting this as Blackmagic Film. I think that's nuts: what does everyone else feel? cheers Paul
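For when the step wedge turns up, a sketch of the counting method (every patch number below is invented; real means and stddevs would be measured off the linear DNG data, one patch per stop, brightest first):

```python
# Count the stops between clipping and the patch where signal drops below
# the noise floor (an SNR >= 1 criterion; other thresholds are equally valid).
patches = [(0.95, 0.010), (0.48, 0.006), (0.24, 0.004), (0.12, 0.003),
           (0.060, 0.0025), (0.030, 0.0022), (0.015, 0.0021),
           (0.0075, 0.0020), (0.0037, 0.0020), (0.0019, 0.0020),
           (0.0009, 0.0020)]               # hypothetical (mean, stddev) pairs

usable = 0
for mean, std in patches:
    if mean >= 0.99:          # clipped patch: doesn't count towards range
        continue
    if mean / std < 1.0:      # below the noise floor: stop counting
        break
    usable += 1

print(f"~{usable} usable stops (SNR >= 1 criterion)")
```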