Everything posted by paulinventome

  1. I'd really recommend looking at RawDigger; it will let you open the files and see what is really inside, more so than going through Adobe/Resolve. It would be awesome to be able to download some samples! cheers Paul
  2. Tempted, but not yet... I ought to wait for Komodo. But what i would love to know is... 1. Some 4K 10bit DNGs at 25p, nice full range with skies and everything in them. 2. The kind of shutter speed we can get in stills mode. It dawned on me that with an electronic shutter it might be pretty useless as a stills camera in fast situations, which would diminish its appeal as a replacement for a DSLR. Looking forward to some feedback Paul
  3. If you're digging then look at these aspects:
     - Highlight reconstruction. If you're in Resolve or similar, then from a native sensor colourspace it is easier to recreate missing channels at clipping. If you are in a white-balance-baked format then the clipping has already happened. It's quite common for the sensitivities of R, G and B on the sensor to be quite different from each other. In a baked format these have been white balanced, so some of the channels will have been clipped to do that. There is no headroom in YUV. In some situations there is up to a stop of detail to be recovered. It helps especially with skies, clouds and avoiding that horrible too-cyan look.
     - White balancing again, creatively or otherwise: it's a massive boon to balance in the grade. White balance both YUV and RAW, tweak the settings a little bit, and see that you get smoother results from RAW.
     - Different debayering approaches for different uses.
     - Retention of noise and the ability to reduce it without dealing with compression, especially in the dark areas.
     Now, the one thing is that Sigma are dumping out DNGs with all the matrices embedded, which is great, but they could actually offer more in terms of specific transforms, like Blackmagic do. As mentioned above, sensor response is not defined as neatly as a colourspace: the edges of the gamut can be all over the shop, and when dealing with outlying colours the method used to bring them into gamut can be challenging. It took Red quite a few years to get to IPP2, which provides some excellent gamut mapping, specifically designed for each of their sensors. It could be that Sigma are doing some colour work from the sensor before dumping into RAW. These issues are usually seen mostly in fluoro and LED lights, car brake lights and so on. Typically these are outside of 709 space, so mapping them is very important. It's too early to tell with the Sigma DNGs how this will pan out - we've only seen a couple... cheers Paul
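The headroom point above can be sketched in a few lines of Python. The sensor codes and white balance gains here are made-up illustrative numbers, not any real camera's values:

```python
import numpy as np

# Made-up sensor codes near saturation (12-bit container, max = 4095).
raw = np.array([3000.0, 4095.0, 2500.0])   # R, G, B photosite codes
wb_gains = np.array([1.9, 1.0, 1.75])      # illustrative daylight WB gains

# Baked pipeline: gains applied in-camera, then clipped into the container.
baked = np.clip(raw * wb_gains, 0, 4095)

# Raw pipeline: keep the un-balanced codes and apply the gains later,
# in a float working space, where nothing has to clip.
raw_float = raw * wb_gains
headroom_lost = raw_float - baked   # detail the baked file threw away

print(baked)          # R and B are pinned to 4095 in the baked file
print(headroom_lost)  # ...but a raw workflow still has this to work with
```

In the baked path, whatever was above the container maximum is gone for good; in the raw path it is still sitting in the file, which is exactly what highlight reconstruction feeds on.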
  4. That is the very point of bayer sensors, no? I don't know whether you're a developer or have had a chance to experiment with different debayer techniques, but that is precisely what is happening for the two colour values per pixel that are not recorded. The reconstruction algorithms are really clever, taking into account edges and gradients. A simplistic example: red for pixel A is code value 11, pixel B is missing, and C has 14. The debayer can look at the surrounding red pixels, and also the colour value the pixel does have, and determine using a gradient that the missing pixel is 12.5. Now you have code values that are not 8 bit anymore.

In addition, these source values are not in a colourspace that we see; they have to go through a matrixing process which also includes white balancing AND highlight reconstruction. These are further steps that fill in more detail. You cannot do decent highlight reconstruction from a baked 709 YUV image, believe me, i've had to try. You need pre-white-balance raw data to do that properly. You could argue that you're scientifically right, but we have been using reconstructed images for decades, and YUV is also a reconstruction of the original scene too. I assumed you were talking 12bpp and i see where you're coming from. But from another perspective, the U and V are described using an 8 bit range, even though they are packed differently. At the end of the day the proof is in the images. That 8 bit RAW from Sigma is much better than it has any right to be, and i spend lots of my days deep in pixels - from Red to Arri and the other end of the spectrum as well. I hate YUV with a passion because it's caused me so many post headaches.

I love RAW for its simplicity and flexibility. There are techniques for working on the bayered data before reconstruction, and the ability to white balance after the fact in grading means that you can use white balance as a secondary and protect skin whilst changing colours around - it can be way more effective than pulling a key in a restricted colourspace. What i really hope is that Sigma have done the same thing with the 10 bit files, and not made them linear - if that's the case then we're golden. I assume this is the IMX410? So it may be doing 6K full frame internally. If you switch to crop mode i wonder if it changes the sensor to a 4K crop too - in which case you can pull more depth off the sensor in that mode? cheers Paul
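The gradient example above, as a toy sketch. Real debayer algorithms (AHD, VNG, DCB and friends) also weight by edge direction and borrow detail from the other colour planes; this is just the minimal idea:

```python
# Toy gradient interpolation for a missing red sample: pixel A recorded
# code 11, pixel C recorded 14, and pixel B between them was not recorded.

def interpolate_missing(a, c):
    """Estimate the red value at B from its recorded neighbours A and C."""
    return (a + c) / 2.0  # gradient midpoint, computed in float

estimate = interpolate_missing(11, 14)
print(estimate)  # 12.5 - a value an 8-bit integer container cannot hold
```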
  5. Just to follow up on this. Most of these formats encode in YUV, which is where the subsampling comes from. An 8 bit YUV file is awful for tonality. The colour difference channels are so lacking in detail that it's really just the luma part that 'holds' the image. But, a bit like raw, we don't see YUV or RAW files; both are decoded into the RGB space we see on our monitors. Whenever a 'decode' operation takes place at a higher depth, the resulting bit depth can be higher. If you have any application that lets you see inside a YUV H264 file then you really should take a good explore. When you see the data that is used to store images - be it YUV or RAW - you can see where the quality differences come from. The nice thing about DNGs is no compression; it makes a wonderful difference to the image! You can get a 12 bit YUV file but most are 8 bit. That's 8 bit for luma and 8 bit for each of the colour difference channels, but because of the nature of how colour difference works, most of those 8 bit containers are barely used. The colour difference channels are also subsampled: if the luma is 1920x1080, then in 4:2:0 (typical for 8 bit H264) each of those colour difference planes is 960x540, and in 4:2:2 they are 960x1080. But you can decode an 8 bit YUV into 12 bit space if you want... cheers Paul
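A rough sketch of what those planes look like and why the decode can land in a deeper space. The BT.709 inverse-matrix coefficients are standard; the nearest-neighbour chroma upsample and the random "pixel" values are simplifications for illustration:

```python
import numpy as np

W, H = 1920, 1080

# 8-bit 4:2:0 (typical for H264): full-res luma, quarter-res chroma planes.
rng = np.random.default_rng(0)
luma = rng.integers(16, 236, size=(H, W), dtype=np.uint8)
cb = rng.integers(16, 241, size=(H // 2, W // 2), dtype=np.uint8)
cr = rng.integers(16, 241, size=(H // 2, W // 2), dtype=np.uint8)

# Decode in float: upsample chroma back to full resolution, scale the
# video-range codes to 0..1, then apply the BT.709 matrix back to RGB.
cb_full = np.repeat(np.repeat(cb, 2, axis=0), 2, axis=1).astype(np.float64)
cr_full = np.repeat(np.repeat(cr, 2, axis=0), 2, axis=1).astype(np.float64)
y  = (luma.astype(np.float64) - 16.0) / 219.0
pb = (cb_full - 128.0) / 224.0
pr = (cr_full - 128.0) / 224.0

r = y + 1.5748 * pr                    # BT.709 inverse matrix
g = y - 0.1873 * pb - 0.4681 * pr
b = y + 1.8556 * pb
# r, g, b are now floats: each one mixes three 8-bit codes, so the
# results land on values no 8-bit container could represent.
```

The point is simply that the matrixing step mixes channels in float, so the decoded RGB has finer gradations than the 8 bit codes it was stored in.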
  6. Sorry, no, what i mean is that when the debayer happens, the missing pixels are reconstructed, usually in a theoretical colourspace like XYZ, and then there's a matrix which brings that back into a white balanced space. All that maths happens in floating point. So, as we say, in each 2x2 block there are two G and one each of R and B. Some sensors use different green filters as well, so one of those greens might be more sensitive to light than the other. The point of the debayer is to make up, for each pixel, the two missing colour values. This is not a simple extrapolation; it can get quite complicated. So when that 8bit RAW source is debayered, it will be debayered into a higher bit depth container or, in most grading apps, a floating point linear colourspace. Those 'basic' 8 bit values, once gone through the process, end up tonally in a different place entirely, especially the reconstructed channels. This is the point of debayering - we're making up what's not there. Does that make sense? It's a very, very different beast to working with an 8 bit 422 baked movie source. If you grab the 8bit UHD sample and push it around, you'll see that it's way more robust than it should be, and in fact if you look on a waveform (assuming the workflow is correct) you can tell that the resulting image is not 8 bit.

As for the movie recording: this is the IMX410 Sony sensor. It can read 6K at 30fps, so perhaps Sigma are reading the whole sensor and making a RAW from that (a bit odd, but it can be done). Or that sensor does support a 4K crop at higher bit depths and rates. I do wonder whether Sigma, when set to full frame, reads the whole sensor, but when set to crop, puts the sensor in that mode and can achieve better rates.

Finally, what we don't know is whether the 10 bit version of the DNG is linear or not. If they've done the same thing as they did with 8 bit, then the 10 bit DNGs would be all you need. No need for that extreme 12 bit mode, because i doubt you'd see any difference. So fingers crossed Sigma have not done linear 10 bit. AFAIK the Blackmagic DNGs are all 10 bit log internally. I don't know if that's still the case as i'm not BMD-based (i have a Red) cheers Paul
  7. Yes, i delved a little deeper and the 8bit DNGs have a linearisation table attached to them. I've been opening these up in RawDigger, which gets me the values before any debayering or colour work. I need to see if i can get the table out, but it's a good sign. Having said that, by the time the image is debayered into a working colourspace, that 8 bit source of the RGGB channels is tonally more spread out, meaning that effectively the end result will have more tonality than 8 bit implies, say compared to an 8 bit movie. I hope that makes sense? Paul
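To illustrate what a linearisation table does to those 8 bit codes, here's a sketch with a made-up log-style curve. The shape is illustrative only; Sigma's real table will differ:

```python
import numpy as np

# Made-up log-style linearisation table: 256 entries mapping each stored
# 8 bit code to a linear 16 bit value, the way a DNG LinearizationTable
# does. The exponential curve here is an assumption for illustration.
codes = np.arange(256)
table = ((np.expm1(codes / 255.0 * 5.0) / np.expm1(5.0)) * 65535).astype(np.uint16)

shadows = table[np.array([10, 11, 12])]        # neighbouring codes, low end
highlights = table[np.array([250, 251, 252])]  # neighbouring codes, top end
print(np.diff(shadows), np.diff(highlights))
# Adjacent 8 bit codes sit around ten linear counts apart in the shadows
# and over a thousand apart in the highlights: the tonal spread that makes
# the debayered result behave like more than 8 bits.
```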
  8. You can't really downsample RAW; the reconstruction methods don't work as well, as you'd get aliasing in your downsampled channels and lose the ability to interpolate the missing pixels, surely? 6K -> 4K binning: not sure how that would work without introducing issues. Keep all the greens and dump some other colours? It seems like a lot of work when that processing effort could be better spent dumping the data out. During the design they knew the sensor size, so why not ensure the rest of the camera can handle the data rates, even if it was a little bigger? Sigma have no video camera line to eat into; they're perfectly positioned to disrupt. I can't believe that a few weeks away from supposed delivery the most basic question of whether all the movie modes are crops isn't answered! cheers Paul
  9. I see nothing to suggest that this is recording movies off anything other than a centre crop. I don't see pixel binning working on a 3:2 6K sensor in any way that would make sense. I'm 99.9% sure that the movies are all centre crop. I didn't read that about stills; i'd have assumed they were full RAW, 14bit. Also there's no mention of whether the data in the RAW is linear or log based. 8bit linear would be utterly useless, and others have pointed out banding in the movies, which originally i would have thought was compression based, but maybe not. So 8bit linear is pointless; in fact only 12bit linear starts getting into the realm of normal. Some clarification from Sigma would be useful, but i guess we don't have to wait that long. I really want to like this! cheers Paul
  10. I've not seen any real confirmation, but looking at the specs it appears that all the movie modes are a 1.5x crop. Certainly UHD RAW has to be a crop, otherwise it isn't RAW, and the impression i get from all the specs is that all the movie modes are crops, with HD i presume being a crop of the centre. You can't get RAW after the camera has processed it, so there's no way it's full frame RAW. Also the UHD RAW is 24p, yet the data rate for 30p 10bit is a bit higher, so i think it would be possible to do 12bit 25p RAW. It's a shame it cannot dump the whole RAW sensor out at 6K though. A full frame RAW movie at that price would be super attractive. The nice thing about RAW is that it shouldn't be processor intensive - you're just dumping data out. I wonder if it's feasible for a firmware update to allow a 'dump the whole sensor' mode... cheers Paul
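Back-of-envelope arithmetic on the uncompressed raw payload backs up the 12bit 25p idea (assuming the current UHD RAW is 12bit 24p, one sample per photosite, container overheads ignored - my numbers, not Sigma's spec sheet):

```python
# Uncompressed raw payload: pixel count x bit depth x frame rate.
W, H = 3840, 2160  # UHD frame assumed

def raw_mbps(bit_depth, fps):
    return W * H * bit_depth * fps / 1e6   # megabits per second

print(raw_mbps(12, 24))  # assumed current UHD RAW mode
print(raw_mbps(10, 30))  # the 10bit 30p mode
print(raw_mbps(12, 25))  # the hoped-for mode: same payload as 10bit 30p
```

12 bits x 25 frames and 10 bits x 30 frames both come to 300 bits per pixel per second, so if the camera can sustain 10bit 30p it can sustain 12bit 25p at the same raw data rate.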
  11. My 2 pence. Andrew, that is a good post and comment after the others. Reading previous threads is like watching an internet lynching mob. The key here is innovation *at that time*, because it's obvious now but it wasn't then. There were numerous projects flying around, and mods to cameras like Andromeda, and it was muddy back then too. I used to mess with industrial cameras, like the SI2K and its history, taking GigE data off them and debayering. But what i think the article highlights is that Graeme was innovative, and he is a person, not a faceless legal entity. He's also a really nice guy in any kind of interaction, who cares about what they're doing. Jim and Red assembled a team of out-of-the-box thinkers to do something tangential to where the industry was back then. I know how obvious it all seems now, but we wouldn't be where we are today without their work and contribution. And humanising the work that went into it is fair. And a company that is heavily invested in those people has no choice but to protect their work. Apple is being sued right now for allegedly ripping off the company that did the camera systems in the iPhone. Apple is no white knight either. No company in corporate America is, because of the need to protect, test and do deals on patents behind the scenes. IMHO i think Red should license REDCODE to anyone that wants to use it now. cheers Paul
  12. Hi Andrew, on the Vimeo clip with the blue sky banding, could you confirm your workflow? I ask because that piece of video doesn't look like the video i get off my A7sII (in terms of pixel formats). I use NukeX to go through these files and the formats are different; because i've never seen banding like that, i'm wondering whether there was some transcoding going on in your workflow that has caused these problems? Is it possible to see the file straight from the camera, untouched? It's certainly true that Slog3 is not needed for this camera; this camera will never fill the Slog3 bucket, so you will never make good use of the limited 8 bits. Slog2 was tweaked for the A7s, so it fills the entire curve and is by far and away the best choice, combined with SGamut3.cine, which is a smaller colourspace than SGamut and therefore less wasteful of the limited bits. It's also worth pointing out that it's 8 bit YUV, not RGB, so when converting to RGB the end result is of greater depth than you might first assume, especially if the conversion is done properly. Again, i've seen the black spot occasionally (PAL 25fps) but not as crazy as you seem to have it! The A7sII certainly manages noise better than the A7s, but it loses the ability to output a 2.7K APS-C upscale to 4K over HDMI. cheers Paul