
cpc

Members
  • Posts: 204
  • Joined

  • Last visited

Reputation Activity

  1. Like
    cpc got a reaction from Lars Steenhoff in Sigma Fp review and interview / Cinema DNG RAW   
    Binning is also scaling. Hardware binning will normally just group per-channel pixels together without further spatial considerations, but a weighted binning technique is basically bilinear interpolation (when halving resolution).
    Mathematically, scaling should be done in linear space, assuming the samples are approximately linear, which may or may not be the case. Digital sensors generally have good linearity of light intensity levels (certainly far more consistent than film), but the native sensor gamut is not a clean linear tri-color space. If you recall the rules of proper compositing, scaling is very similar -- you do it in linear to preserve the way light behaves. You may sometimes get better results with non-linear data, but this is likely related to idiosyncrasies of the specific case and is not the norm.
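    To make those two points concrete, here is a minimal numpy sketch (my own illustration, not Sigma's pipeline): averaging non-overlapping 2x2 blocks of a single raw channel -- weighted binning -- is exactly a bilinear half-size downscale, and the scaling is applied to linearized data. The gamma value and image size are assumptions for the example.
    ```python
    # Sketch: 2x2 weighted binning == bilinear half-size downscale, done in
    # linear light. Gamma and image shape are illustrative assumptions.
    import numpy as np

    def bin2x2(channel: np.ndarray) -> np.ndarray:
        """Average non-overlapping 2x2 blocks: bilinear interpolation at half size."""
        h, w = channel.shape
        return channel.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    def scale_in_linear(encoded: np.ndarray, gamma: float = 2.2) -> np.ndarray:
        """Decode to linear light, downscale, re-encode."""
        linear = encoded ** gamma          # undo the transfer curve
        small = bin2x2(linear)             # scale while values behave like light
        return small ** (1.0 / gamma)      # back to the encoded domain

    rng = np.random.default_rng(0)
    img = rng.random((8, 8))
    print(scale_in_linear(img).shape)      # (4, 4)
    ```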
     
    re: Sigma's downscale
    I assume so, yes: they simply downsample per channel and arrange the result into a Bayer mosaic.
    Bayer reconstruction itself is a process of interpolation: you need to conjure samples out of thin air. No matter how advanced the method, and there are some really involved methods, it is really just that, divination of sample values. So anything that loses information beforehand, including a per-channel downsample, will hinder reconstruction. Depending on the way the downscale is done, you can obstruct reconstruction of some shapes more than others, so you might need to prioritize this or that. A simple example of the tradeoffs: binning may have better SNR than some interpolation methods but will result in worse diagonal detail.
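    As a hedged illustration of that "conjuring": the simplest possible reconstruction below estimates a missing green value from its four green neighbours (bilinear demosaicing). Real methods are far more involved, but all of them interpolate; none recover information lost beforehand.
    ```python
    # Minimal demosaic step: estimate green at a red/blue site from its four
    # 4-neighbours, which are always green sites in an RGGB mosaic.
    import numpy as np

    def interpolate_green_at(mosaic: np.ndarray, y: int, x: int) -> float:
        """Bilinear estimate of the missing green sample at (y, x)."""
        return np.mean([mosaic[y - 1, x], mosaic[y + 1, x],
                        mosaic[y, x - 1], mosaic[y, x + 1]])
    ```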
  2. Like
    cpc got a reaction from JJHLH in Sigma Fp review and interview / Cinema DNG RAW   
    As promised, the Sigma fp-centered release of slimRAW is now out, so make sure to update. slimRAW now works around Resolve's lack of affection for 8-bit compressed CinemaDNG, and slimRAW-compressed Sigma fp CinemaDNG will work in Premiere even though the uncompressed originals don't.
    There is also another peculiar use: even though Sigma fp raw stills are compressed, you can still (losslessly) shrink them significantly through slimRAW. It discards the huge embedded previews and re-compresses the raw data, shaving off around 30% of the original size. (Of course, don't do this if you want the embedded previews.)
  3. Like
    cpc reacted to Lars Steenhoff in Sigma Fp review and interview / Cinema DNG RAW   
    Thanks for the slimRAW update, @cpc 
  4. Thanks
    cpc got a reaction from Emanuel in Sigma Fp review and interview / Cinema DNG RAW   
    As promised, the Sigma fp-centered release of slimRAW is now out, so make sure to update. slimRAW now works around Resolve's lack of affection for 8-bit compressed CinemaDNG, and slimRAW-compressed Sigma fp CinemaDNG will work in Premiere even though the uncompressed originals don't.
    There is also another peculiar use: even though Sigma fp raw stills are compressed, you can still (losslessly) shrink them significantly through slimRAW. It discards the huge embedded previews and re-compresses the raw data, shaving off around 30% of the original size. (Of course, don't do this if you want the embedded previews.)
  5. Like
    cpc got a reaction from paulinventome in Sigma Fp review and interview / Cinema DNG RAW   
    As promised, the Sigma fp-centered release of slimRAW is now out, so make sure to update. slimRAW now works around Resolve's lack of affection for 8-bit compressed CinemaDNG, and slimRAW-compressed Sigma fp CinemaDNG will work in Premiere even though the uncompressed originals don't.
    There is also another peculiar use: even though Sigma fp raw stills are compressed, you can still (losslessly) shrink them significantly through slimRAW. It discards the huge embedded previews and re-compresses the raw data, shaving off around 30% of the original size. (Of course, don't do this if you want the embedded previews.)
  6. Like
    cpc reacted to paulinventome in Sigma Fp review and interview / Cinema DNG RAW   
    So I said I'd post some stills; these are basically ungraded.
    This frame is in a sequence with car lights; I like the tonality of this very subdued moment. Shot 12-bit to manage shadow tonality.

    From a different point above. All shot on a 50mm M Summicron, probably wide open.

    I think I hit the saturation slider here in Resolve. But this had a car rolling over the camera. It's a 21mm CV lens, and I see some chromatic aberration from the lens that I would deal with in post. But I'd never let a car run over a Red!

    Shot on an 85mm APO off a monopod. Nice tonality again; it's daylight from windows, with some small panel lights bouncing and filling in.

    A reverse of the above.

    Some fun shots.
    I think the true benefit of something like the fp is the speed at which you can see something and grab it. Using it with just an SSD plugged in and manual M lenses gives a more spontaneous feel. Now, most of the film will be shot on Red, in controlled conditions with a crew, and that's the right approach for multiple dialogue scenes and careful blocking. But the fp has its place, and I may hand it to someone and just say: grab stuff.

     
    cheers
    Paul
  7. Like
    cpc got a reaction from Lars Steenhoff in Sigma Fp review and interview / Cinema DNG RAW   
    It does honor settings, and it is most useful when pointed at various parts of the scene to get a reading off different zones without changing exposure, pretty much the same way you'd use a traditional spot meter. The main difference is that the digital meter doesn't have (or need) a notion of mid grey: you get the average (raw) value of the spot region directly, while a traditional spot meter is always mid-grey referenced. You can certainly use a light meter very successfully while shooting raw (I always have one on me), but the digital spotmeter gives you a spot reading directly off the sensor, which is very convenient because you see what is being recorded. Since you'd normally aim to overexpose for dense skin when shooting raw, seeing the actual values is even more useful.
    Originally, the ML spotmeter showed only tone-mapped values, but they could still be used for raw exposure once you knew the (approximate) mapping. Of course, showing either the linear raw values or EV below the clip point is optimal for raw.
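    For a sense of what such a meter computes, here is a rough sketch (my own, not Magic Lantern's code): the average raw value inside a small spot, and how many stops that average sits below the clip point. The black level, white level, and spot size are illustrative assumptions.
    ```python
    # Digital spotmeter sketch: mean raw value in a spot, plus EV below clip.
    # black_level, white_level and spot size are assumed example values.
    import numpy as np

    def spot_reading(raw: np.ndarray, cy: int, cx: int, size: int = 16,
                     black_level: int = 256, white_level: int = 4095):
        half = size // 2
        spot = raw[cy - half:cy + half, cx - half:cx + half].astype(np.float64)
        mean = spot.mean() - black_level                  # signal above black
        headroom = white_level - black_level
        ev_below_clip = np.log2(headroom / max(mean, 1))  # stops below clipping
        return mean, ev_below_clip
    ```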
  8. Like
    cpc got a reaction from mercer in Sigma Fp review and interview / Cinema DNG RAW   
    Well, it should be quite obvious that this camera is at its best (for video) in 12-bit. The 8-bit image is probably derived from the 12-bit image anyway, so it can't be better than that.
    I think any raw camera should include a digital spotmeter similar to Magic Lantern's. This is really the simplest exposure tool to implement, and possibly the only thing one needs for consistent exposure. I don't need zebras, I don't need raw histograms, I don't need false color. It baffles me that ML had it 7+ years ago and it isn't ubiquitous yet. I mean, just steal the damn thing.
  9. Like
    cpc got a reaction from Lars Steenhoff in Sigma Fp review and interview / Cinema DNG RAW   
    Well, it should be quite obvious that this camera is at its best (for video) in 12-bit. The 8-bit image is probably derived from the 12-bit image anyway, so it can't be better than that.
    I think any raw camera should include a digital spotmeter similar to Magic Lantern's. This is really the simplest exposure tool to implement, and possibly the only thing one needs for consistent exposure. I don't need zebras, I don't need raw histograms, I don't need false color. It baffles me that ML had it 7+ years ago and it isn't ubiquitous yet. I mean, just steal the damn thing.
  10. Like
    cpc reacted to Lars Steenhoff in Sigma Fp review and interview / Cinema DNG RAW   
    No, I did not have any problems with dynamic range.
    And make sure you set the bit depth to 14 for stills and 12 for cinema if you want to get the max out of the camera.
    This is a frame from a 12-bit CinemaDNG graded in Photoshop. It's shot through a window and with a 30-year-old lens.
     

  11. Like
    cpc got a reaction from paulinventome in Sigma Fp review and interview / Cinema DNG RAW   
    Both Resolve and Premiere have problems with 8-bit DNG. I believe Resolve 14 broke support for both compressed and uncompressed 8-bit. Then at some later point uncompressed 8-bit was working again, but compressed 8-bit was still sketchy. This wasn't much of an issue since no major camera was recording 8-bit anyway, but now that the Sigma fp is out, it is worked around in the upcoming release of slimRAW.
  12. Like
    cpc got a reaction from Lars Steenhoff in Sigma Fp review and interview / Cinema DNG RAW   
    Both Resolve and Premiere have problems with 8-bit DNG. I believe Resolve 14 broke support for both compressed and uncompressed 8-bit. Then at some later point uncompressed 8-bit was working again, but compressed 8-bit was still sketchy. This wasn't much of an issue since no major camera was recording 8-bit anyway, but now that the Sigma fp is out, it is worked around in the upcoming release of slimRAW.
  13. Like
    cpc got a reaction from Emanuel in Making the jump to the big screen   
    I've done a few things that ended up both in theaters and online. 1.85:1 is a good ratio to go for if you are targeting both online and festivals, and you can always crop a 1920x1080 video from a 1998x1080 flat DCP if you happen to need to send a video file somewhere. Going much wider may compromise the online version; contrary to popular belief, a cinemascope ratio on a tablet or computer display is not particularly cinematic, what with those huge black bars.
    Is there a reason you'd want to avoid making a DCP for festivals, or am I misunderstanding? Don't bother with a 4K release unless you are really going to benefit from the resolution; many festivals don't like 4K anyway. Master and grade in a common color gamut (rec709/sRGB). DCP creation software will fix gamma for the DCP if you grade to an sRGB gamma for online. Also, most (all?) media servers in current cinemas handle 23.976 (and other frame rates like 25, 29.97, 30) fine, but if you can shoot 24 fps you might as well.
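    The arithmetic behind that crop, spelled out (a trivial sketch using the standard flat-DCP container size):
    ```python
    # A flat DCP container is 1998x1080 (1.85:1); centre-cropping 39 px off
    # each side yields an exact 1920x1080 (16:9) online master, no scaling.
    flat_w, flat_h = 1998, 1080
    print(round(flat_w / flat_h, 2))   # 1.85
    crop = (flat_w - 1920) // 2        # 39 px per side
    print(flat_w - 2 * crop, flat_h)   # 1920 1080
    ```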
  14. Like
    cpc got a reaction from Lars Steenhoff in Sigma Fp review and interview / Cinema DNG RAW   
    The linearisation table is used by the raw processing software to invert the non-linear pixel values back to linear space. This is why you can have any non-linear curve applied to the raw values (with the purpose of sticking higher dynamic range into limited coding space), and your raw processor still won't get confused and will show the image properly. The actual raw processing happens after this linearisation.
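    A minimal sketch of that mechanism, assuming an 8-bit stored value expanding to 12-bit linear, with a made-up square-root companding curve (the real table holds whatever curve the camera actually applied):
    ```python
    # DNG-style linearization: the stored (companded) pixel value indexes a
    # lookup table that returns the linear value, so any monotonic curve the
    # camera applied is undone before demosaicing. Sizes are illustrative.
    import numpy as np

    stored_bits, linear_max = 8, 4095
    # Example table: invert a square-root style companding curve (assumption).
    table = ((np.arange(2 ** stored_bits) / 255.0) ** 2 * linear_max).astype(np.uint16)

    def linearize(raw_stored: np.ndarray) -> np.ndarray:
        return table[raw_stored]       # one lookup per pixel, then demosaic
    ```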
  15. Like
    cpc got a reaction from Lars Steenhoff in Sigma Fp review and interview / Cinema DNG RAW   
    12-bit is linear. 10-bit is linear. 8-bit is non-linear. No idea why Sigma didn't do 10-bit non-linear, seeing as they already do it for 8-bit.
    Here is how 10-bit non-linear can look (made from your 12-bit linear sample with slimRAW). In particular, note how the darks are indistinguishable from the 12-bit original.
    10-bit non linear (made from the 12-bit).DNG
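    For the curious, here is roughly how such a 12-bit-to-10-bit companding step can work. The square-root law below is an assumption for illustration (slimRAW's actual curve isn't public); the point is that the darks keep nearly all their code values while the oversampled linear highlights give up precision.
    ```python
    # Illustrative companding: squeeze 12-bit linear into 10 bits with a
    # square-root law (assumed curve, not slimRAW's actual one).
    import numpy as np

    def compand_12_to_10(linear12: np.ndarray) -> np.ndarray:
        x = linear12 / 4095.0
        return np.round(np.sqrt(x) * 1023).astype(np.uint16)

    def expand_10_to_12(nonlin10: np.ndarray) -> np.ndarray:
        """The inverse mapping a DNG linearization table would encode."""
        x = nonlin10 / 1023.0
        return np.round(x * x * 4095).astype(np.uint16)
    ```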
  16. Like
    cpc got a reaction from paulinventome in Sigma Fp review and interview / Cinema DNG RAW   
    12-bit is linear. 10-bit is linear. 8-bit is non-linear. No idea why Sigma didn't do 10-bit non-linear, seeing as they already do it for 8-bit.
    Here is how 10-bit non-linear can look (made from your 12-bit linear sample with slimRAW). In particular, note how the darks are indistinguishable from the 12-bit original.
    10-bit non linear (made from the 12-bit).DNG
  17. Like
    cpc got a reaction from paulinventome in Sigma Fp review and interview / Cinema DNG RAW   
    The linearisation table is used by the raw processing software to invert the non-linear pixel values back to linear space. This is why you can have any non-linear curve applied to the raw values (with the purpose of sticking higher dynamic range into limited coding space), and your raw processor still won't get confused and will show the image properly. The actual raw processing happens after this linearisation.
  18. Thanks
    cpc got a reaction from Brian Williams in Sigma Fp review and interview / Cinema DNG RAW   
    The linearisation table is used by the raw processing software to invert the non-linear pixel values back to linear space. This is why you can have any non-linear curve applied to the raw values (with the purpose of sticking higher dynamic range into limited coding space), and your raw processor still won't get confused and will show the image properly. The actual raw processing happens after this linearisation.
  19. Thanks
    cpc got a reaction from rawshooter in Sigma Fp review and interview / Cinema DNG RAW   
    You are comparing 6K Bayer-to-4K Bayer downscale + 4K debayer to 6K debayer + 6K RGB-to-4K RGB downscale. The first will never look as good as the second.
    The 10-bit file is linear and the 8-bit file is non-linear. That's why 10-bit looks suspicious to you: it has lost a lot of precision in the darks.
    Yeah, well, the main difference with offsetting in log is that you are moving your "zero" around (a "log" curve is never a direct log conversion in the blacks), so you'd need to readjust the black point, whereas with multiplication (gain) in linear there is no such problem. Still, offsetting is handy with log footage or film scans.
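    The identity behind that last point, as a one-liner: a gain (multiplication) in linear light is an offset (addition) in any true log encoding.
    ```python
    # Gain in linear == offset in log: log2(k*x) = log2(x) + log2(k).
    import numpy as np

    x = np.array([0.01, 0.18, 0.9])   # linear scene values
    gain = 2.0                        # +1 stop
    print(np.allclose(np.log2(x * gain), np.log2(x) + 1.0))  # True
    ```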
  20. Like
    cpc got a reaction from EthanAlexander in Large Format Cameras Are Changing Film Language, From ‘Joker’ to ‘Midsommar’   
    Focal lengths have no "perspective". There is no such thing as "50mm perspective", so you can't maintain it with a 50mm lens on a bigger sensor. Relative sizes of objects depend entirely on the position of the camera: closer viewpoints exaggerate perspective and more distant viewpoints flatten it. Hence some people say that wider lenses have stronger perspective, which is incorrect. What they actually mean is that with a wide lens you move forward for a similar object size in the frame (compared to a longer lens), and this movement forward decreases the camera-subject distance and exaggerates perspective distortion.
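    A worked pinhole example of the point (illustrative numbers): the near/far size ratio is set entirely by the distances; focal length just scales both projections equally.
    ```python
    # Perspective is position, not focal length: relative image sizes depend
    # only on subject distances under the pinhole model.
    def image_size(real_size_m, distance_m, focal_mm):
        return focal_mm * real_size_m / distance_m   # pinhole projection

    # 50mm lens, subject at 2 m, background object at 10 m:
    near = image_size(1.8, 2.0, 50)
    far = image_size(1.8, 10.0, 50)
    print(near / far)                  # 5.0 -- set by the distances only

    # Step back to 4 m with a 100mm lens: subject framed the same size,
    # but the near/far ratio drops, flattening the perspective.
    print(image_size(1.8, 4.0, 100) / image_size(1.8, 12.0, 100))  # 3.0
    ```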
    Surely everyone has seen one of these:
     
     
  21. Like
    cpc reacted to CaptainHook in Sigma Fp review and interview / Cinema DNG RAW   
    Oh, I see what you're trying to say now. Again, there are reasons for the decisions we make where theory and practice in hardware diverge, and you have to make trade-offs to balance one thing against another -- more bigger-picture stuff again. This is already an area I can't discuss publicly, but I guess what I'll say is: if we could have implemented things that way or differently back then, we would have.

    And it's not that we didn't know some ways we could improve what we had initially done with DNG (much of it informed by the hardware problems we were solving back then); it just didn't make sense to spend more time on it when we already knew we could do something else that would fit our needs better. Like I said, the problems you describe were solved for us with Blackmagic RAW, where we were able to achieve the image quality we wanted with small file sizes and very fast performance on desktop with highly optimized GPU and CPU decode, the ability to embed multiple 3D LUTs, etc. THAT is a no-brainer to me.

    I do understand your point of view, especially as someone who developed a desktop app around DNG, but there are so many more considerations we have that I can't even begin to discuss. Something I've learned being at a company like this is how often other people can't understand some of the decisions some companies make, but I find it much easier now to have an idea of what other considerations likely led them to choose the path they did. It's hard to explain until you've experienced it, but even when I was just beta testing Resolve and then the cameras, I had no idea what actually goes on and the types of decisions and challenges faced.

    I see people online almost daily berate other camera manufacturers about things "that should be so obvious, why don't they do it" and I just have to shake my head and shrug because I have a very good idea why the company HASN'T done it or why they DID choose to do something else. I'm sure other companies have a very similar insight into Blackmagic as well, because for the most part we all have similar goals and face similar challenges.
  22. Like
    cpc reacted to CaptainHook in Sigma Fp review and interview / Cinema DNG RAW   
    I would offer that for matching shots (the majority of most grading work), adjusting white balance in sensor space (or even XYZ as a fallback) and exposure in linear makes a huge difference to how well shots match and flow. I see many other colourists claim they can do just as good white balancing with the normal primaries controls, but I think if they actually spent considerable time with both approaches instead of just one, they would develop a sensitivity to it that would make them rethink just how 'good' the results with primaries are. It's one area where I think photographers experienced with dialing in white balance on RAW files develop that sensitivity and eye for how it looks when white balance is transformed more accurately -- more so than those in the motion image world, who still aren't used to it.
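    A hedged sketch of the distinction being drawn (the matrix and gains below are made-up numbers, not Resolve's or Blackmagic's actual math): white balance applied as per-channel gains on linear sensor data, before any colour matrix or tone curve, rather than as primaries-style adjustments in a display space.
    ```python
    # White balance in sensor space: diagonal per-channel gains on linear raw
    # RGB, applied before the camera-to-display colour matrix.
    import numpy as np

    raw = np.array([0.20, 0.40, 0.30])            # linear sensor RGB
    wb_gains = np.array([1.9, 1.0, 1.6])          # neutral gains, sensor space
    cam_to_rec709 = np.array([[ 1.6, -0.4, -0.2],  # made-up example matrix
                              [-0.3,  1.5, -0.2],
                              [ 0.0, -0.5,  1.5]])

    balanced = cam_to_rec709 @ (raw * wb_gains)   # WB first, in sensor space
    ```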

    I've been a fan of Ian Vertovec from Light Iron for quite a few years, and I was not surprised to learn recently that he likes to do basic adjustments in linear because there was something in his work that stood out to me (including his eye/talent/skill/experience of course).
  23. Like
    cpc got a reaction from deezid in Sigma Fp review and interview / Cinema DNG RAW   
    It is not linear. I haven't looked at the exact curve, but it does non-linear companding for the 8-bit raw. The 12-bit image is linear.
    The DNG spec allows for color tables to be applied on the developed image. The Sigma sample images do include such tables. No idea if they replicate the internal picture profiles though. AFAIK, only Adobe Camera Raw-based software (e.g. Photoshop) honors these. Unless Resolve has gained support for these tables (I am on an old Resolve version), it is very likely that the cinema5d review is mistaken on this point.
  24. Like
    cpc got a reaction from Emanuel in RED respond to Apple in compressed RAW patent battle   
    Without detracting from Graeme's work, it should be made clear that none of the algorithmic REDCODE specifics described in the text are non-trivial for "skilled artisans". I don't think any of this will hold up in court as a significant innovation.
     
    A few notes:
    Re: "pre-emphasis curve" used to discard excessive whites and preserve blacks.
    Everyone here knows it very well, because every log curve does this: Panalog, S-Log, Log-C, you name it. In fact, non-linear curves are (and were) so widely used as a pre-compression step that some camera companies manage to shoot themselves in the foot by applying them indiscriminately even before entropy coding (where a pure log/power curve can be non-optimal).
    JPEG has been used since the early '90s to compress images, and practically all images compressed with JPEG were gamma encoded. Gamma encoding is a "simple power law curve". Anyone who has ever compressed a linear image knows what happens (not a pretty picture) to linear signal after a DCT or wavelet transform, followed by quantization. And there is nothing special, technically speaking, about raw -- it is linear signal in native camera space. But you don't need to look far for encoding alternatives: film has been around since the 19th century, and it does a non-linear transform (more precisely, log with toe and shoulder) on the captured light.
    In an even more relevant connection, Cineform RAW was developed in 2005 and presented at NAB 2006. It uses a "pre-emphasis" non-linear curve (more precisely, a tunable log curve) to discard excessive whites and preserve blacks. You may also want to consult this blog post from David@Cineform from 2007 about REDCODE and Cineform: http://cineform.blogspot.com/2007/09/10-bit-log-vs-12-bit-linear.html
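    A toy version of the "pre-emphasis" idea, to show how generic it is (the curve and bit depths below are illustrative assumptions, not REDCODE's or Cineform's actual parameters):
    ```python
    # Log-like pre-emphasis before quantization: scarce output codes are not
    # wasted on highlights, and blacks keep far more codes than linear would.
    import numpy as np

    def pre_emphasize(linear, a=64.0):
        """Tunable log curve: compresses highlights, protects blacks."""
        return np.log1p(a * linear) / np.log1p(a)

    x = np.linspace(0, 1, 4096)                    # 12-bit linear range
    codes = np.round(pre_emphasize(x) * 1023)      # quantize to 10 bits
    # The bottom 1% of the linear range now spans ~120 output codes
    # instead of the ~10 it would get with straight linear quantization:
    print(codes[int(0.01 * 4096)])
    ```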
     
    Re: "green average subtraction":
    Using nearby pixels for prediction/entropy reduction goes at least as far back as lossless JPEG, which specifies 7 such predictors. In a Bayer mosaic, red and blue pixels always neighbor green pixels, hence using the brightness-correlated green channel to predict the red and blue channels is a tiny step. A minimal sketch of that kind of prediction follows below.
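    (Generic illustration, RGGB positions assumed; this is not RED's actual scheme.)
    ```python
    # Green-predicted residual coding: a red (or blue) sample correlates
    # strongly with its green neighbours, so storing the difference from
    # their average leaves small, highly compressible residuals.
    import numpy as np

    def red_residual(mosaic: np.ndarray, y: int, x: int) -> float:
        """Residual for the red sample at (y, x): value minus green average."""
        green_avg = (mosaic[y, x - 1] + mosaic[y, x + 1] +
                     mosaic[y - 1, x] + mosaic[y + 1, x]) / 4.0
        return mosaic[y, x] - green_avg   # small residual -> fewer entropy bits
    ```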
     
    Re: using a Bayer sensor as an "unconventional avenue":
    The Dalsa Origin, presented at NAB 2003, and available for renting since 2006, was producing Bayer raw (uncompressed). The Arri Arriflex D-20, introduced in November 2005, was doing Bayer raw (uncompressed). Can't recall the SI-2K release year, but it was doing Bayer compressed raw (Cineform RAW, externally) in 2006.
     
  25. Thanks
    cpc got a reaction from Andrew Reid in RED respond to Apple in compressed RAW patent battle   
    Without detracting from Graeme's work, it should be made clear that none of the algorithmic REDCODE specifics described in the text are non-trivial for "skilled artisans". I don't think any of this will hold up in court as a significant innovation.
     
    A few notes:
    Re: "pre-emphasis curve" used to discard excessive whites and preserve blacks.
    Everyone here knows it very well, because every log curve does this: Panalog, S-Log, Log-C, you name it. In fact, non-linear curves are (and were) so widely used as a pre-compression step that some camera companies manage to shoot themselves in the foot by applying them indiscriminately even before entropy coding (where a pure log/power curve can be non-optimal).
    JPEG has been used since the early '90s to compress images, and practically all images compressed with JPEG were gamma encoded. Gamma encoding is a "simple power law curve". Anyone who has ever compressed a linear image knows what happens (not a pretty picture) to linear signal after a DCT or wavelet transform, followed by quantization. And there is nothing special, technically speaking, about raw -- it is linear signal in native camera space. But you don't need to look far for encoding alternatives: film has been around since the 19th century, and it does a non-linear transform (more precisely, log with toe and shoulder) on the captured light.
    In an even more relevant connection, Cineform RAW was developed in 2005 and presented at NAB 2006. It uses a "pre-emphasis" non-linear curve (more precisely, a tunable log curve) to discard excessive whites and preserve blacks. You may also want to consult this blog post from David@Cineform from 2007 about REDCODE and Cineform: http://cineform.blogspot.com/2007/09/10-bit-log-vs-12-bit-linear.html
     
    Re: "green average subtraction":
    Using nearby pixels for prediction/entropy reduction goes at least as far back as lossless JPEG, which specifies 7 such predictors. In a Bayer mosaic, red and blue pixels always neighbor green pixels, hence using the brightness-correlated green channel to predict the red and blue channels is a tiny step.
     
    Re: using a Bayer sensor as an "unconventional avenue":
    The Dalsa Origin, presented at NAB 2003, and available for renting since 2006, was producing Bayer raw (uncompressed). The Arri Arriflex D-20, introduced in November 2005, was doing Bayer raw (uncompressed). Can't recall the SI-2K release year, but it was doing Bayer compressed raw (Cineform RAW, externally) in 2006.
     