Posts posted by Falk Lumo

  1. 5 hours ago, Ilkka Nissila said:

    There is no stupidity involved here, only different compromises to choose from. 

@Ilkka Nissila We know about the options involved. However, looking into all the data rates involved, there was no technical reason for Nikon not to output 6K RAW - just as Atomos with their Ninja V is going to offer for Panasonic too. It's also described in my article  ;)
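For what it's worth, the data-rate arithmetic is easy to sketch. A minimal check in Python - note the 16:9 6K active area of ~6048×3402 is my assumption, not a published Nikon figure:

```python
# Rough uncompressed RAW data rates (assumed 16:9 6K region of ~6048x3402 px;
# the exact active area is an assumption, not a Nikon specification).
def raw_rate_gbps(width, height, bits, fps):
    """Uncompressed RAW data rate in Gbit/s (one sample per photosite)."""
    return width * height * bits * fps / 1e9

rate_6k = raw_rate_gbps(6048, 3402, 12, 30)   # ~7.4 Gbit/s
rate_4k = raw_rate_gbps(3840, 2160, 12, 30)   # ~3.0 Gbit/s

# HDMI 2.0 carries roughly 14.4 Gbit/s of payload, so even an uncompressed
# 6K 12-bit 30p RAW stream would fit with headroom.
```

So on pure bandwidth grounds, a 6K RAW output over HDMI 2.0 looks feasible.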

  2. 59 minutes ago, nickname said:

    there´s a video up now on youtube that has a brick wall.

    before this i also hadn´t seen any aliasing...

This video, at the very beginning, shows aliasing color artifacts. It is claimed to be ProRes RAW. But the aliasing color artifacts could be from debayering color moiré due to a lack of an AA filter, or from pixel skipping in the RAW codec. It is hard to tell unless the same footage was also captured in N-Log.

    BTW ... and to avoid any misunderstanding ...

I do NOT say "aliasing is an issue". I say Nikon applies pixel skipping in their RAW data pipeline. As with debayering color moiré, I expect plenty of footage where its effects can't be noticed. I even remember the early days of the D800E (stills photography) when people claimed debayering color moiré would never be seen. It took a while. But today, it is pretty well understood when to expect, and not to expect, color moiré. Mostly in very fine fashion fabrics at a single critical distance, actually.
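The difference between skipping and binning can be illustrated with a 1D toy signal: both decimate, so both fold a frequency above the new Nyquist limit back into the pass band, but skipping preserves the alias at full amplitude while binning (averaging) attenuates it. A minimal numpy sketch of the principle - not Nikon's actual pipeline:

```python
import numpy as np

n = np.arange(4000)
f = 0.4                                    # cycles/pixel, above Nyquist of a 2:1 grid
signal = np.cos(2 * np.pi * f * n)

skipped = signal[::2]                      # pixel skipping: keep every 2nd sample
binned = signal.reshape(-1, 2).mean(1)     # pixel binning: average sample pairs

def amplitude(x):
    """Amplitude of the strongest spectral component."""
    return 2 * np.abs(np.fft.rfft(x)).max() / len(x)

# Both decimations alias f=0.4 down to 0.2 cycles/pixel, but:
amplitude(skipped)   # ~1.0  -> full-strength alias
amplitude(binned)    # ~0.31 -> alias attenuated by the averaging (cos(0.4*pi))
```

This is why skipped footage shows stronger, harder-edged artifacts than binned footage from the same sensor.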

  3. 1 hour ago, gingercat said:

I still contend that using a test chart on a computer monitor is not a reliable method to measure aliasing and moiré.

I do understand the reservations against using a computer monitor as a test chart. Many things can go wrong then. But if done right, it is actually a superior method, as it is far easier to control the effect of the printing pattern or monitor pixel grid, respectively. I have done both methods in the past and I can assure you that - the way *we* did it - a professional test chart would render identical results. Everybody is free to professionally print the chart (Super A3 would be the minimum size required) and reproduce (or falsify) the results - all required information is public and linked in the blog article.

  4. 14 hours ago, Falk Lumo said:

    Andrew posted in this thread and I replied already that it is pixel skipping, not binning. And that bit depth has nothing to do with it, as video readouts are all 12 bit. His post and my reply to him got removed, unfortunately.

Erratum: it wasn't removed (as Super8 pointed out correctly); it is on the previous page.

     

@Eno, you are absolutely right. My article says skipping, not binning.

    Andrew posted in this thread and I replied already that it is pixel skipping, not binning. And that bit depth has nothing to do with it, as video readouts are all 12 bit. His post and my reply to him got removed, unfortunately.

Hi, as I wrote in my blog article, field tests are still missing and I am interested in learning about the practical impact myself.

OTOH, external 10-bit N-Log may be just as good and may carry less risk of flickering (between frames) at lines, edges or fine regular structures like roof tiles or fashion texture. That would be the real-world test I am most interested in.

    @gingercat

    Quote

    Shooting an aliasing test chart on a computer monitor is likely to create aliasing and moire

    That's absolutely not the case here. The topic has been dealt with in replies above.

  7. 13 minutes ago, Super8 said:

    Thanks for the reply. 

    Based on your filming the test chart on a monitor you were able to verify the 6k to 4k down sample internal recording that showed full sensor readout and no line skipping?

Yes. For internal and external N-Log. But it's really all in the article ...

P.S. If you try to replicate the results, I recommend using an even larger distance or a wider focal length. It would help to understand the math of zone plates (to estimate such things) - but you can use trial and error too.
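The zone plate math mentioned above is simple: a chart with intensity 0.5·(1 + cos(k·r²)) has a local spatial frequency that grows linearly with radius, f(r) = k·r/π. A small numpy sketch - the chart size and constant are illustrative only, not those of the actual 8400px chart:

```python
import numpy as np

size = 512                 # chart width in pixels (illustrative only)
k = np.pi / size           # chosen so the edge reaches Nyquist (0.5 cycles/px)

y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2].astype(float)
chart = 0.5 * (1 + np.cos(k * (x**2 + y**2)))   # zone plate, values in [0, 1]

def local_freq(radius_px):
    """Local spatial frequency in cycles per chart pixel at radius r:
    f(r) = (1 / 2*pi) * d/dr (k * r^2) = k * r / pi."""
    return k * radius_px / np.pi

local_freq(size / 2)       # 0.5 -> Nyquist is reached at the chart edge
```

Scaling this relation by the reproduction magnification tells you which radius on the filmed chart corresponds to the camera's Nyquist frequency - which is exactly what distance and focal length control.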

  8. 2 hours ago, Super8 said:

    My question is this normal to record test chart from a monitor screen?  I would think you have refresh rate and scan issues that would interfere  with the test.

    My regards if I'm way off base and am asking to learn more about the process.

This looks like a sincere question to me. I withdraw my claim that you're trolling and apologize.

I myself have a professionally printed A2 version of my test chart. However, for a zone plate chart, even that shows signs of remaining printing artifacts. And Marc doesn't have any usable printed version.

However, and because we only needed a small fraction of the test pattern, there is an established alternative in the testing scene: monitors! As odd as it may seem at first glance, they can be put to good use if a number of rules are respected. I know of at least one professional lens calibration company that uses monitors to display test charts. The most important rule is that monitor subpixels (their projection when photographed or filmed) must be MUCH smaller than a sensor pixel. Other rules are that the test pattern must be resolved with no aliasing, and that there is no flicker. We obeyed these rules, which is also why we know we have no extra moiré effects from the pixel grid.

    It is also the reason why:

1. the monitor shows only a small fraction of the test chart, such that the remaining part is fully resolved by the 3840px wide monitor. The original test chart is 8400px wide and 9.6 MB in size. Its pixels are displayed at 100% or 1:1. There is no moiré in the screen display to the naked eye. The test chart file was carefully created by myself to avoid aliasing as much as possible - which is no easy task for a zone plate chart.

2. the monitor appears so small in the video because it is far away. This makes the monitor pixels and subpixels disappear completely; there are more than 10 monitor pixels per camera pixel ... hard to beat with any printed chart! It also ensures that we have spatial frequencies beyond 4K to test for. This is crucial for the test and for any attempt to reproduce our results!

3. minor sources of blur (lens, focus, motion) destroy the results, as we depend on high spatial frequencies being resolved.
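The "monitor pixels much smaller than camera pixels" rule from point 2 is easy to check with thin-lens magnification. A sketch with hypothetical numbers - these are NOT the actual test geometry from the article; only the ~5.9 µm Z6 pixel pitch is a real figure:

```python
# All numbers hypothetical/illustrative except the ~5.9 micron Z6 pixel pitch.
monitor_pitch_mm = 0.16      # pixel pitch of a ~27" 4K monitor
sensor_pitch_mm = 0.0059     # Nikon Z6 photosite pitch, ~5.9 micron
focal_mm = 50.0
distance_mm = 14000.0        # camera 14 m from the monitor

# Thin-lens magnification for a subject at distance d: m = f / (d - f)
mag = focal_mm / (distance_mm - focal_mm)
projected_mm = monitor_pitch_mm * mag          # one monitor pixel on the sensor
ratio = sensor_pitch_mm / projected_mm         # monitor pixels per camera pixel
# ratio comes out slightly above 10 -> the monitor's own pixel structure
# is far below the camera's resolution limit and cannot cause moiré.
```

Varying `focal_mm` and `distance_mm` shows directly why a larger distance or wider focal length makes the test more robust.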

    Btw, the article DOES contain a link to a test video before cropping.

  9. 1 hour ago, Andrew Reid said:

    They probably do need to go the pixel-binning route, due to the increased bitrate of 12bit

    A sensor typically has a slower rolling shutter scan, the higher the bit-depth.

    It could be that the 6K full pixel readout is done in a lower bit-depth mode such as 10bit on chip.

    Interesting findings, but would like to see a real-world test between Flat, My Z-LOG internal, N-LOG external and ProRes RAW

    Atomos have not sent me a Ninja V to try it out on yet.

    Hi Andrew,

in my article, I linked to a PDF where Sony lists all 23 readout modes of the chip. There are two 14-bit modes (full frame and cropped) and several 12-bit modes. Video uses 12 bit, both internal and external (as you can tell from the frame rates). There are no 10- or 8-bit readout modes. There are 4K readout modes, but as I explain in my article, they are most likely cropped modes, not binning or skipping modes.

My guess (or speculation) is that Nikon uses a 6K 12-bit readout mode and drops pixels in the data pipeline between the sensor and the HDMI port. My finding only shows that pixels are dropped, not how and where.
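To make the magnitude concrete: 6048 horizontal photosites reduce to 4032 if every third one is dropped, which lands close to a 4K-wide output (a further crop/scale to 3840 would follow). One plausible pattern - my assumption for illustration, not Nikon's documented pipeline:

```python
import numpy as np

row_6k = np.arange(6048)           # photosite indices of one 6K row (Z6 width)

# Hypothetical skip pattern - an assumption, not Nikon's documented pipeline:
# drop every third photosite, reducing 6048 -> 4032 samples. (On a real Bayer
# sensor, column *pairs* would be dropped to preserve the color pattern -
# a detail omitted here for clarity.)
keep = np.arange(6048) % 3 != 2
row_4k = row_6k[keep]

len(row_4k)        # 4032
row_4k[:6]         # [0, 1, 3, 4, 6, 7] - irregular spacing, hence aliasing
```

The irregular sample spacing is exactly what produces the characteristic skipping artifacts, independent of where in the pipeline the dropping happens.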

The real-world comparison tests are what I am looking forward to :)

  10. Dear fellow readers. The above posting by Super8 is defamatory. Therefore, I can't ignore the troll. I hope moderators take notice.

    Let's put some facts straight for everybody to see:

    • All information is in my blog article. It contains all the necessary details to reproduce, or falsify, my findings. It follows a scientific protocol.
    • Super8 decided to ignore all this information and instead cited an early attempt - quickly dismissed - visible in a public YouTube comment section. My cooperation with Marc mostly happened in private email communication. Of course, in the very beginning, Marc had to learn a thing or two. Rather normal. I can only assume that Super8 is deliberately trying to run a FUD attack against my findings. That's sad. If you don't like it, just run your own test. Then we'll talk.
    • The printed test chart was never used to produce my result. Actually, I dismissed it for obvious reasons. Showing it here is nothing but trolling.

Nevertheless, thanks for spotting the error in Marc's YouTube link (he runs two channels). It is now fixed in the article.

    For easier reference, here is a copy of test footage (cropped in, scaled, graded) from Nikon Z6 FX ProRes RAW:

    frame-000200.v950.jpg

    The article provides help to interpret what you see. In short, green/magenta is from debayering moiré, blue/yellow from pixel skipping.

    I'll try hard to ignore Super8 from now on. Thanks.

  11. @Super8, sorry but I will not respond to you anymore beyond this point. You are not reading carefully enough before posting.

First, Jungbluth did not use a printout of the test chart for the tests I used for analysis. I said that the process was iterative, and I wrote in the article how the chart was used.

Second, the Z6 uses 6K sensor readout for both stills and video - except for ProRes RAW and DX modes, that is. You can actually learn this from my article too. That the Z6 internally supersamples a 6K readout to produce 4K video is official information from Nikon. And that the Z7 doesn't.

  12. I have now concluded my analysis and published the results:

    -> http://blog.falklumo.com/2020/01/the-conundrum-of-nikon-z6-prores-raw.html

As an answer to the above: I didn't ask Jungbluth to do the same test I had already done. I asked him to do the tests for me. The test footage is from him; the instructions, cross-checking, test charting and analysis are from me. I am not determined to show "line skipping", I am determined to show its absence. But that's possible only with proper testing.

  13. On 1/3/2020 at 8:01 AM, Super8 said:

    I downloaded the sample file from the link provided.  Are you really putting weight in the chart and text Jungbluth did?  Who is "our" in your preliminary test? I'll wait for better test from reliable sources. 

Hi, you may not have been able to read the German part. I am currently trying to get Jungbluth to redo the test with better quality. As it is now, the quality is indeed too low to draw conclusions - as I wrote. However, done right, the test documents line skipping rather well. I have done it before.

We have a long-running conversation here about how the Z6 can deliver 4K ProRes RAW. Now the first footage has emerged. I am currently collaborating with a German YouTuber to shed some light on this, cf. https://youtu.be/FEGxTXC6trU and the comments by falconeye.

The point is that the Z6 does full sensor readout for full frame internal 4K (6K readout) and (as we now know) the Z6 outputs full frame 4K raw. There are 12-bit 30p full frame readout modes for 6K and probably(*) 4K (from a Sony document; (*) whether by crop or line skipping is left unspecified). However, an uncropped 4K readout mode should create line skipping or binning artefacts.

Our preliminary test shows such artefacts - but only at a rather low level, cf. the video provided by Jungbluth on Dropbox. However, the quality isn't yet at a level to draw conclusions; there may be more visible artefacts hidden by the insufficiently sharp test image quality.

    The Z7 doesn't support full frame 4k raw. And it is easy to see why.

But how on earth can the Z6 support full frame 4K raw if it isn't masqueraded 6K or line skipping? Assuming further testing indicates that there are no line skipping artefacts (too early to tell).

  15. Hi Andrew, nice article.

     

I looked a bit at the 70D's dual pixel AF, less from a practical point of view (I don't have the camera), but I did a bit of background research. I found the original patent on the Japanese web and an algorithmic JP & US patent.

     

    I wrote a blog article where I looked at the patents and tried to understand what it will eventually mean for the industry:

     

    http://falklumo.blogspot.de/2013/07/comment-why-new-dual-pixel-af-will.html

     

I think it will eventually transform the industry. The 70D won't. I found too many compromises already in the patent (not using 2D image correlation analysis, etc.), so I believe the 70D may provide a smooth focus experience in video (which is great enough) but will not outperform traditional phase-detect autofocus.

     

But multi pixel auto focus (MPAF, as I named the approach in a more general way) will eventually blow PDAF out of the water. It is the only technology known today which both uses all available light for focus and, at the same time, uses the parallax error information (phase shift) to compute the amount of defocus. Therefore, PDAF will not remain able to compete.

     

And this in turn will turn high-end cameras into mirrorless ones. Think of what great things could be done with an electronic viewfinder at the price point of, e.g., a D4 - things which couldn't be developed as long as a mirror was needed. This may enable a new high-end camera generation (which should help the video crowd a lot) where innovation can trickle down from the top again. This is what I mean by transformation.

     

    Kind regards,

    Falk
