IronFilm

NETFLIX: Which 4K Cameras Can You Use to Shoot Original Content? (missing F5! WTH?!?)


30 minutes ago, mat33 said:

I guess that's the case - it's easy to advertise and promote resolution to consumers compared to better colour/motion/black levels. If it wasn't, we wouldn't have the situation where a Pioneer Kuro from 2006 or 2008 has better motion and colour than the majority of TVs sold today.

I remember the Kuro from 2008, it was very nice (ended up getting one of these, and am still using it as a live monitor for the C300 II in the studio): https://www.amazon.com/Sony-Bravia-KDL-52XBR5-52-Inch-1080p/dp/B000WDW6G6. I think the OLEDs have finally caught up (and passed) the top plasmas, but yeah it did take a while!

6 hours ago, jcs said:

Maybe the FS700 isn't on the list because of aliasing and chroma artifacts, and perhaps skintone color science isn't up to the other cameras? Might also be related to having to use 3rd party devices to get 4K. While we were able to get decent skintones and color from the FS700 (and the slomo was great), it was hit or miss, and so we went back to Canon when we acquired our next professional-level camera: C300 II instead of FS7. While the FS7 has improved color science and overall image quality compared to the FS700, the C300 II still has nicer skin tones with less effort. Even if the FS7 had matching image/color quality, C300 II still wins for native EF lens support and usable AF. With the C300 II set up with ARRI settings, ARRI LUTs can be used and it's like using an Alexa with AF! Is the Alexa still superior in DR and total image quality- absolutely, even though the C300 II is "4K" (which it really isn't- it's less than 4K and aliases!).

If Netflix really wanted to make this about 4K quality, they'd base the decision on shooting a 4K test chart. How many professional cameras actually produce a real, measurable, alias-free 4K image?

  1. Sony F65 ("8K")
  2. Red ("6/7/8K")?
  3. Alexa 65 (6.6K)?

Per Nyquist sampling theory, we need 2x oversampling to fully capture a signal without aliasing. Along with an optical low-pass filter, it might be possible to get 'close enough' so aliasing is minimal/not visible with less than 2x oversampling, so a test chart is helpful.
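To make the 2x figure concrete, here's a tiny sketch (my own illustration, not from the thread) of how a frequency above half the sampling rate folds back and shows up as a false lower frequency:

```python
def aliased_frequency(f, fs):
    """Apparent frequency of a pure tone f when sampled at rate fs (spectral folding)."""
    f_mod = f % fs
    return min(f_mod, fs - f_mod)

# Treat a 4K-wide sensor as sampling at 4096 cycles per picture width:
# detail below 2048 cycles is captured as-is, detail above folds back.
fs = 4096
print(aliased_frequency(1000, fs))  # 1000: below Nyquist, reproduced faithfully
print(aliased_frequency(3000, fs))  # 1096: above Nyquist, folds to 4096 - 3000
```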

Here the F65 rules (and maybe Red at 7K; only a 1080p output was provided):

So how does the Alexa 65 compare to the F65 and Red shooting a 4K test chart? The only camera I would put in the same league as the Alexa for color and overall look for skin is the F65, based on high-end Hollywood movies. I suspect it may be much easier to achieve great skintones with the Alexa, and that's why the F65 isn't used as often for high-end productions (that, and the insane data rate of the F65).

If Netflix didn't have this 4K requirement, it would probably be dominated by Alexas, which isn't good for competition, so that's a plus in a sense (Alexa 65 is very expensive and rental only).

The C200's 4:2:0 8-bit 4K won't qualify since it's not 10-bit. The 4K raw should qualify if it's log-encoded (otherwise it must be 16-bit linear to qualify), though it's not true 4K either - not enough photosites, and both it and the C700 alias in 4K. What about the GH5's 4K 10-bit? Would be interesting to see how it does on the same 4K test chart / conditions Geoff Boyle is shooting.

There's a wide-ranging misunderstanding of Nyquist sampling theory when it comes to images, and I think you might be following that widely-publicized misunderstanding. If not, my apologies, but I think it's such an interesting topic (which I was totally wrong about for years) that I will butt in:

Nyquist does not apply to image sensors the way many people think it does. A 4k sensor actually CAN fully resolve a full 4K signal (well, technically anything less than 4k, so 99.99%). It's only Bayer interpolation and the presence of anti-aliasing filters that reduce this number in a meaningful way.

What it boils down to is that a line pair represents a full signal wave. Yes, you can only fully capture less than 2048 line pairs in 4k without aliasing, as per Nyquist. But that's still 4096 lines. So... with a Foveon or monochrome sensor you can capture full 4k with no aliasing on a 4k sensor. Really! You can! (Assuming you also have a high pass filter with 100% mtf below 4k and 0% mtf above 4k. Which... doesn't exist... but still.)

The other point of confusion is the idea that a line pair on a normal resolution chart represents a sine wave. It doesn't. And THAT is 99% of the reason why there's aliasing on all these test charts. It represents a sawtooth wave, which has infinitely high overtones. So mtf should be measured with a sinusoidal zone plate only, as the Nyquist theorem applies to sine waves specifically (well, it applies to anything, but sawtooth waves are effectively of infinite frequency because they contain infinite high odd order harmonics). Since most resolution charts are lines (sawtooth waves) rather than sinusoidal gradients, even the lowest resolution lines are actually of effectively infinite frequency. Which might be another reason why you see such poorly reconstructed lines and false colors around the very high contrast areas of the window in Yedlin's test in the other thread.
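As a quick illustration of the overtone point (a sketch of mine, not from the thread; the 1000 Hz / 90 Hz numbers are arbitrary): sample a sine and a square wave with the same sub-Nyquist fundamental, and only the square wave shows energy folded to a non-harmonic frequency, because its higher odd harmonics exceed the Nyquist limit.

```python
import numpy as np

fs, f0, n = 1000, 90, 1000   # 1 s of samples; all frequencies land on integer FFT bins
t = np.arange(n) / fs
sine = np.sin(2 * np.pi * f0 * t)
square = np.sign(sine)       # same 90 Hz fundamental, infinite odd harmonics

def mag(x, k):
    """Normalized magnitude of FFT bin k."""
    return np.abs(np.fft.rfft(x))[k] / (n / 2)

# The sine is band-limited below Nyquist (500 Hz), so nothing folds back.
# The square wave's 7th harmonic (630 Hz) exceeds Nyquist and folds to 370 Hz.
print(mag(sine, 370))    # ~0: no aliased energy
print(mag(square, 370))  # ~0.18: mostly the folded 7th harmonic, amplitude 4/(7*pi)
```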

To that extent, the use of anti-aliasing filters is more just "whatever works" for a given camera to split the difference between sharpness and aliasing, and not correlated with Nyquist in any specific way. Bayer patterns I believe remove a little less than 30% of linear resolution, but in practice it looks a lot sharper than 70% sharpness due to advanced algorithms and due to aliasing providing the illusion of resolution... 

So the resolution issue requiring over-sampling is due to anti-aliasing filters and Bayer pattern sensors and balancing things out between them so you get a sharp enough image with low enough aliasing. It's not Nyquist eating half your spatial resolution. I'm no engineer by any means and I have made this mistake in the past and now feel guilty for spreading misinformation online. :(

Also, I'm normally an 8-bit-is-fine-for-me-and-probably-for-everyone type person, but for next generation HDR wide gamut content you need 10 bit color and a wide gamut sensor. I think Netflix is going for a future proof thing and perhaps it is due to legal. That is a very astute comment. It's not an aesthetic choice, but a legal one. Otherwise, anything could be called "true 4k." (Fwiw you can include small amounts of b cam footage shot on other cameras or even stock footage.)


I doubt it's for legal reasons, as they could easily add to their fine print that some of the 4K content may be upscaled to 4K from a lower resolution. I assume it has to do with future proofing to 4K broadcast standards.

With the assimilation of Netflix as a channel with Comcast, I assume Netflix has plans that exceed their current subscription service.

Even though this list isn't new, the public release of specific cameras that meet their standards is probably a way for them to encourage indie filmmakers to produce content on those specific cameras.

Although we know they won't refuse to include films like Tangerine or other indie films shot on lesser resolutions or specs, if they can encourage filmmakers to shoot on a Varicam, a Red, or others, it is only a benefit for Netflix's marketing and delivery strategy.

It will be interesting to see if the GH5 gets added to the list because it does meet all of the qualifications mentioned... I believe? And if it does get added, going forward, there really is no reason not to shoot a no or micro budget narrative or documentary on a GH5.

1 hour ago, HockeyFan12 said:

There's a wide-ranging misunderstanding of Nyquist sampling theory when it comes to images, and I think you might be following that widely-publicized misunderstanding. If not, my apologizes but I think it's such an interesting topic (which I was totally wrong about for years) that I will butt in:

(...)

I am an engineer and Nyquist Sampling Theory applies to all forms of signals, including images (2 dimensional pixel sequences): https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem

Undersampling and Moiré (a symptom of aliasing):

[Image: Moire_pattern_of_bricks_small.jpg]

Properly sampled (for some reason this image on Wikipedia is larger than the Moiré example):

[Image: Moire_pattern_of_bricks.jpg]

"(well, it applies to anything, but sawtooth waves are effectively of infinite frequency because they contain infinite high odd order harmonics)" - I always thought the simple explanation was sawtooth and square waves are discontinuous functions and require an infinite number of summed sinusoids (Fourier, related to the discrete cosine transform and sampling/reconstruction theory) to exactly reconstruct the discontinuous edges. However in practice it doesn't need to be perfect, as it works fine for graphics and sound.

What led you to believe that Nyquist doesn't apply to image sensor sampling? Do you have a link to a math or science paper explaining the mathematics along with example images? We usually agree, especially on audio gear!

From https://www.edmundoptics.com/resources/application-notes/imaging/camera-resolution-for-improved-imaging-system-performance/:


The absolute limiting resolution of a sensor is determined by its Nyquist limit. This is defined as being one half of the sampling frequency, a.k.a the number of pixels/mm (Equation 3).

I do acknowledge that in practice it may be possible to under sample (<= the Nyquist frequency) when using a well-tuned optical low pass filter, such that the aliasing is so minimal in practice it's not worth worrying about. In terms of actual examples I'm aware of, the most alias-free, detailed 4K examples I have seen have all been from well-over 4K sensor resolution, the most detailed seen so far with the Sony F65 ("8K" (kinda)). I would expect 6+ K Red and Alexa 65 to also do very well.

I hadn't studied this before, however the so-called Kell Factor is said to reduce the displayable resolution by a factor of .9 for HDTVs: https://en.wikipedia.org/wiki/Kell_factor. A 1080p CCD camera can be displayed at only 1728×972 resolution in order to avoid beat frequency patterns. I'm not sure if I've ever observed this effect on computer monitors (or HDTVs). Didn't find any examples via the google?
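The quoted Kell-factor numbers check out (trivial arithmetic, my own sketch):

```python
# Kell factor: practical displayable resolution is ~0.9x the pixel grid
kell = 0.9
width, height = 1920, 1080
print(round(width * kell), round(height * kell))  # 1728 972, matching the Wikipedia figure
```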

Have you noticed how detailed, alias-free, and natural 1080p shot at "4k" looks, then downsampled in post to 1080p? The same applies to 4K: "8K" sampled down to 4K looks superior to lower resolution examples. Recall how the F65's "8K" sensor looks way more detailed (and has no apparent aliasing!) vs. all the other cameras in this test:

From real-world examples alone I'm calling Shenanigans on statements that Nyquist doesn't apply to images (or that 2K doesn't really matter vs 4K, e.g. Steve Yedlin's recent videos on resolution). Happy to butter up some popcorn and learn something new if someone has math and example images which show otherwise :glasses:

 

16 hours ago, dhessel said:

It does matter to Netflix for original content; they will not allow any camera that doesn't shoot native 4K, period. The reasoning I have heard for this is that they advertise that all of their original content is 4K. They don't want to deal with any issues that may arise from using material upscaled to 4K. Technically they could get hit with a class action lawsuit if they say all of their original content is 4K but some of it was actually not 4K and upscaled. While I find this to be unlikely, it is not outside the realm of possibility. Again, this is only for their original content; for anything else it doesn't matter.

I wonder what would happen if someone used one of their "approved cameras" for a Netflix Original, but in post punched in to 85% (for whatever reason: maybe a boom was in shot, or they just wanted a different frame). As I bet that happens a lot.

But would they get into trouble because it is "not 4K"??

12 hours ago, cantsin said:

The F5 and FS700 don't meet the 10bit requirement (that is mentioned below the camera list).

Huh? They can both do 4K raw...

12 hours ago, TheRenaissanceMan said:

One of the primary differences between the F5 and F55 is the color filter array, which is "wide gamut" on the F55 and "optimised for r.709" on the F5. Thus the F5's disqualification.

I don't buy that explanation, as the FS7 and other cameras are on that list already. 

12 hours ago, HockeyFan12 said:

The 4.6k is pretty good, though! I would expect they'd include it.

They do include it :)

4 hours ago, mercer said:

It will be interesting to see if the GH5 gets added to the list because it does meet all of the qualifications mentioned... I believe? And if it does get added, going forward, there really is no reason not to shoot a no or micro budget narrative or documentary on a GH5.

* Bitrate of at least 240 Mbps (at 23.98/24 fps) recording

So after the '400Mbps' firmware update, 4K 10-bit log GH5 should be good to go!

4 hours ago, jcs said:

I am an engineer and Nyquist Sampling Theory applies to all forms of signals, including images (2 dimensional pixel sequences): https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem

(...)

You're right and I'm wrong. I misread your post. Again, I'm not an engineer, just interested in how things work. I thought what you were implying is what I used to (incorrectly) think: a 4k sensor can only resolve 2k before it begins aliasing. Because Nyquist says you can resolve 50% of the frequency before aliasing. And... a lot of great 2k images come from 4k downconversion... so... it makes sense...

The truth, however (I think) is that a 4k sensor can resolve all the way up to 4k without aliasing... because there are two pixels in every line pair and 2 * 1/2 = 1. 

But... this is only so long as that 4k image represents, for instance, a 4k zone plate with resolution equivalent to fewer than 2048 sinusoidal gradients between black and white (line pairs, but in sinusoidal gradients rather than lines):

[Image: 210px-Zonenplatte_Cosinus.png]

But the zone plates in the links above are black and white line pairs. Hence infinite odd order harmonics (or actually I think square waves would be right, not sawtooth waves, which would mean every odd harmonic is represented? I don't know the equation for a square wave):

[Image: 420px-Zone_plate.svg.png]

So yes, the F65 looks just great, but those resolution charts will theoretically cause problems for any sensor so long as the resolution isn't being reduced by the Airy disc and/or lens flaws. Because black and white zone plates like the one above are of theoretically infinite resolution, you need a high pass filter elsewhere in the acquisition chain or anything will alias.

Anyhow, I just think it's interesting that non-sinusoidal resolution charts represent, in theory, infinite resolution at any given frequency because of the higher order harmonics, and that anti-aliasing filters are, ultimately, fairly arbitrary in strength, tuned more to taste than to any given algorithm or equation, and only one small part of a big mtf chain. Nature abhors a straight line, let alone a straight line with infinite harmonics, so in most real world use the aliasing problems seen above will not present themselves (though fabrics, some manmade materials, and SUPER high contrast edges will remain problematic; I see a lot of aliasing in fabrics with the Alexa and C300, the cameras I work with most, as well as weird artifacts around specular highlights).

In conclusion, you're right and I am definitely wrong as regards my first post; I misread what you wrote. And if I'm also wrong about any of what I've written in this one I'm curious to learn where I have it wrong because ultimately my goal is to learn, and in doing so avoid misinforming others. 

This also explains why the F3 and Alexa share a similar sensor resolution less than twice the (linear) resolution of their capture format. Bayer sensors seem to be about 70% efficient in terms of linear resolution and 2880 is slightly greater than 1920/0.7. Canon's choice to oversample by a factor of 2 with the C300 was just due to their lack of a processor at that time that could debayer efficiently enough I think, nothing to do with Nyquist. 
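The arithmetic behind that last point, as a quick sketch (the 70% figure is the rule of thumb from this discussion, not an exact constant):

```python
bayer_linear_efficiency = 0.7   # rough rule of thumb for Bayer sensors, per the thread

# Photosites needed across the width for ~1920 lines of real luma resolution:
needed = 1920 / bayer_linear_efficiency
print(round(needed))  # 2743, so a 2880-wide sensor (F3/Alexa) just clears it

# Effective linear resolution actually delivered by a 2880-wide Bayer sensor:
print(round(2880 * bayer_linear_efficiency))  # 2016
```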


@HockeyFan12 I think you mean low pass filter (vs. high pass)? Anti-aliasing filter, aka OLPF = optical low pass filter. A low pass filter lets low frequencies 'pass' and cuts higher frequencies.

I would code a square wave using the equation with the sgn() function here (or something not using a transcendental function if speed is critical): http://mathworld.wolfram.com/SquareWave.html. You can see how summed sinusoids can be used to recreate a square wave in the last equation in that link (Fourier Synthesis). One would not code it this way as it would be very slow!
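A minimal sketch of both approaches from that MathWorld link (the 200-term count is my arbitrary choice): the direct sgn() form, and Fourier synthesis as a sum of odd harmonics that converges to it away from the discontinuities.

```python
import math

def square_sgn(x):
    """Square wave directly as sgn(sin x)."""
    s = math.sin(x)
    return 0.0 if s == 0 else math.copysign(1.0, s)

def square_fourier(x, n_terms):
    """Fourier synthesis: (4/pi) * sum over odd k of sin(k*x)/k."""
    return (4 / math.pi) * sum(math.sin((2 * j - 1) * x) / (2 * j - 1)
                               for j in range(1, n_terms + 1))

x = 1.0  # a point on the flat top of the wave, away from the jumps
print(square_sgn(x))            # 1.0
print(square_fourier(x, 200))   # converges toward 1.0 (Gibbs ripple aside)
```

As jcs notes, you would never synthesize a square wave this way in production code; the summed-sinusoid form is only useful for seeing how the odd harmonics build up the hard edges.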

Bayer reconstruction is pretty neat math; however, it doesn't 'overcome Nyquist' and is actually especially prone to color aliasing. This is a pretty cool read: https://www.dpreview.com/articles/3560214217/resolution-aliasing-and-light-loss-why-we-love-bryce-bayers-baby-anyway. Simple link summary: more resolution = less aliasing, which means more actual resolution captured (vs. false detail from aliasing).

The C300 only oversamples in green, and barely so at that (you really need at least four pixels => one pixel, vs. two pixels => one pixel). Actually we need more than 2x the pixels per Nyquist; exactly 2x is problematic at the max frequency (thus an OLPF can help here). The Alexa is undersampled for 2K/1080p at 2.8K and even 3.4K. The F65 (and perhaps Red) make clear what it takes to get alias-free, true 4K resolution. How much you can cheat in practice with less sensor resolution and tuned OLPFs can be tested shooting those fun charts :)

Interestingly, Super-Resolution uses the information that is aliasing to construct real detail from multiple frames. 


I decided that poring over the list of what cameras I could use for a Netflix film was distracting me from my other concerns such as what amp Paul McCartney would prefer me to use on his next album and the approved list of colognes to wear on a date with Sofia Vergara.

2 minutes ago, BTM_Pix said:

I decided that poring over the list of what cameras I could use for a Netflix film was distracting me from my other concerns such as what amp Paul McCartney would prefer me to use on his next album and the approved list of colognes to wear on a date with Sofia Vergara.

So you've got story and sound sorted and your cologne research can help when they release smell-o-vision.

1 hour ago, BTM_Pix said:

I decided that poring over the list of what cameras I could use for a Netflix film was distracting me from my other concerns such as what amp Paul McCartney would prefer me to use on his next album and the approved list of colognes to wear on a date with Sofia Vergara.

Are we going on a double date?

Anyway, I agree such discussions are not terribly practical! But they are FUN :-D

Plus sometimes interesting and practical points do arise naturally from such discussions. 

5 hours ago, IronFilm said:

I wonder what would happen if someone used ones of their "approved cameras" for a Netflix Original, but in post punched in to 85% (for whatever reason, maybe boom was in shot, or they just wanted a different frame). As I bet that happens a lot.

They have flexibility in cases like that... for instance, GoPro footage will be accepted in small bits, or some other action cam where it's impossible to fit another camera... it's the main camera where they probably demand a legal percentage (I'm just speculating on the legal end here). But let's say, for example, your film/series had a plot-driven 2 min Super 8 film clip that was found in an attic by a family member and viewed in the film: it would fly... Netflix has after all become one of the most successful studios in the world, and not through stupidity... they are also a source for indie filmmakers to get projects greenlighted who would never have gotten past the 1st assistant of a third tier producer in the studio system of old... they are planning to spend some 5-6 billion on original content this fiscal year, so it's all good for the filmmaker

8 hours ago, IronFilm said:

I wonder what would happen if someone used ones of their "approved cameras" for a Netflix Original, but in post punched in to 85% (for whatever reason, maybe boom was in shot, or they just wanted a different frame). As I bet that happens a lot.

But would they get into trouble because it is "not 4K"??

True, but they could still claim that the piece was captured and mastered in 4K, which is probably enough.

15 hours ago, mercer said:

(...)

It will be interesting to see if the GH5 gets added to the list because it does meet all of the qualifications mentioned... I believe? And if it does get added, going forward, there really is no reason not to shoot a no or micro budget narrative or documentary on a GH5.

Got it Glenn? Told you... GH5 rocks man, gonna buy one ; )

3 hours ago, dhessel said:

True, but they could still claim that the piece was captured and mastered in 4K, which is probably enough.

Or simply go fake an orgasm and sell your content's workflow as politically acquired... : D They're asking for it, though ;-)

 

Nice guide anyway. That 240 Mbps rate is actually outstanding. Really. It means technology is finally democratized.

10 hours ago, jcs said:

@HockeyFan12 I think you mean low pass filter (vs. high pass)? Anti-aliasing filter, aka OLPF = optical low pass filter. A low pass filter lets low frequencies 'pass' and cuts higher frequencies.

(...)

Low pass filter, yes. My bad. 

I'm not saying that Nyquist doesn't apply to sensors. Not at all.

Let's forget about Bayer patterns for now and focus on monochrome sensors and Nyquist.

What I'm saying is that the idea, which I've read online a few times and you seem to be referencing in the paragraph about the C300, that a 4k monochrome sensor means you can resolve up to 2k resolution without aliasing (because we can resolve up to half the frequency) is misguided. A 4k monochrome sensor can in fact resolve up to full 4k resolution without aliasing on a sinusoidal zone plate (1/2 frequency per line pair * 2 lines per line pair = 1 frequency in lines). So you can resolve up to 4096 lines of horizontal resolution from a 4k sensor without aliasing (up to 2048 line pairs).

On a monochrome sensor, if you framed up and shot up to 4096 alternating black and white lines, they should not alias on that sensor so long as they were in a sinusoidal gradient pattern rather than strictly hard-edged black and white, and so long as they filled the entire frame. But that same sensor will exhibit aliasing under otherwise identical conditions on a black and white (square wave) zone plate of that same resolution/frequency, because square wave lines contain overtones of effectively infinite frequency. It's the overtones, not the fundamental frequencies, that cause aliasing in this case.

So a 4k black and white square wave zone plate will cause a 4k sensor (monochrome or Bayer) to alias, but not because of Nyquist dictating that you need less than half that resolution or the camera will alias, but because the sensor can resolve up to that full resolution in sine waves (1/2*2=1), it's just that black and white line pairs (square waves) contain infinite higher order overtones and of course those overtones will far surpass the frequency limit of the sensor.
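To illustrate the 'full 4k from a 4k sensor' claim (a sketch of mine under idealized assumptions: a 1D monochrome pixel row, a perfectly sinusoidal pattern, and an integer number of cycles across the frame): a sinusoid just under 2048 cycles across 4096 pixels is captured with its frequency intact.

```python
import numpy as np

n = 4096        # pixels across a hypothetical 4K monochrome sensor row
cycles = 2040   # just under the 2048 line-pair (Nyquist) limit
x = np.arange(n) / n
row = 0.5 + 0.5 * np.cos(2 * np.pi * cycles * x)  # sinusoidal "zone plate" row

# The spectrum has a single peak exactly at the pattern frequency: no aliasing.
spectrum = np.abs(np.fft.rfft(row - row.mean()))
print(int(np.argmax(spectrum)))  # 2040
```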

I'm not saying Nyquist doesn't apply to sensors, just that its application is often misunderstood. The 2x scaling factor is not a magic number, and there are reasons Arri went with a 1.5x scaling factor instead (which coincides closely with Bayer's 70% efficiency; 1.5*0.7 is just over 1).

That's all I'm arguing. I think we just misunderstand each other, though. Everything you've written seems right except for the focus on the 2X scaling factor.

1 hour ago, Emanuel said:

Got it Glenn? Told you... GH5 rocks man, gonna buy one ; )

Or simply go fake an orgasm and sell your content's workflow as politically acquired... : D They're asking for it, though ;-)

 

Nice guide anyway. That 240 Mbps rate is actually outstanding. Really. Means technology is finally democratized. 

I am warming to the idea of a GH5, but I'm not ready to pull the trigger on one yet. There's definitely some better material popping up from it now.

I still think the 5D3 with ML Raw looks more cinematic and filmic, but as a run and gun cinema camera, the GH5 is a great tool.

Now if it gets approved by Netflix, and I decide to make a feature, I guess it would make sense to shoot it on a GH5 as a sales tool for the film...

I'll just have my people call Mr. Netflix and tell him I made a not horrible film but at least it meets his tech requirements...

That's a pretty good pitch, right?


@jcs and @HockeyFan12, you guys confuse me. Maybe it's my lacking English, or maybe it's the English you guys use to express all this knowledge. I will try to understand that interesting stuff some day. Astonishing! Wow! :)

@mercer Glenn, Mr. Netflix might reply: If your GH5 only had Olympus colors, I'd give you the green light for your not horrible film, which you pitched oh so well :)

8 minutes ago, PannySVHS said:

 

@mercer Glenn, Mr. Netflix might reply: If your GH5 only had Olympus colors, I'd give you the green light for your not horrible film, which you pitched oh so well :)

Idk, I think Mr. Netflix may tell me, "Hey Kid, if only your not horrible movie had pretty dames like Marty's movies, you may have something, see." Think bad Edward G. Robinson impression.


No disrespect to the GH5 wonder cam, but the fact that the GH5 will likely be a Netflix-approved camera while the Alexa isn't makes a bit of a mockery of the whole thing. Maybe the Netflix execs got a good sales pitch from Sony/Red that they took for a proper comparison, the kind Steve Yedlin alludes to in his ASC interview. I'm not sure I buy the future-proofing argument either: going by current trends, any good film will have had at least one remake by the time an Alexa-shot film looks bad.

58 minutes ago, mat33 said:

No disrespect to the GH5 wonder cam, but the fact that the GH5 will likely be a Netflix-approved camera while the Alexa isn't makes a bit of a mockery of the whole thing. Maybe the Netflix execs got a good sales pitch from Sony/Red that they took for a proper comparison, the kind Steve Yedlin alludes to in his ASC interview. I'm not sure I buy the future-proofing argument either: going by current trends, any good film will have had at least one remake by the time an Alexa-shot film looks bad.

Imagine a meeting with Netflix executives, marketing, and lawyers, along with reps from ARRI, Red, Sony, and Panasonic, regarding the new 4K subscriptions and 4K content. Red, Sony, and Panny say "we have cameras that actually shoot 4K" and ARRI says "but, but...". The Netflix exec replies: we're selling 4K, not the obviously superior image quality that ARRI offers. Sorry, ARRI; when you produce an actual 4K camera like Red, Sony, and Panasonic have, let's talk. Netflix marketing and lawyers nod to the exec in the background. A while later ARRI releases the Alexa 65 with a 6.6K sensor and it's accepted (actually three Alev III sensors rotated 90 degrees and placed together, AKA the A3X sensor).

2 hours ago, HockeyFan12 said:

What I'm saying is that the idea, which I've read online a few times and which you seem to be referencing in the paragraph about the C300, that a 4k monochrome sensor can only resolve up to 2k resolution without aliasing (because we can resolve up to half the sampling frequency) is misguided. A 4k monochrome sensor can in fact resolve up to full 4k resolution without aliasing on a sinusoidal zone plate (1/2 cycle per pixel * 2 pixels per line pair = 1 cycle per line pair). So you can resolve up to 4096 lines of horizontal resolution (2048 line pairs) from a 4k sensor without aliasing. On a monochrome sensor, if you framed up and shot up to 4096 alternating black and white lines, they should not alias so long as they followed a sinusoidal gradient rather than strictly hard-edged black and white, and so long as they filled the entire frame. But that same sensor, under otherwise identical conditions, will exhibit aliasing on a black and white (square wave) zone plate of the same resolution/frequency, because square wave lines contain overtones of effectively infinite frequency. It's the overtones, not the fundamental frequencies, that cause aliasing in this case.

So a 4k black and white square wave zone plate will cause a 4k sensor (monochrome or Bayer) to alias, but not because Nyquist dictates that you need less than half that resolution to avoid aliasing. The sensor can resolve up to that full resolution in sine waves (1/2*2=1); it's just that black and white line pairs (square waves) contain infinitely many higher-order overtones, and of course those overtones far surpass the frequency limit of the sensor.

I'm not saying Nyquist doesn't apply to sensors, just that its application is often misunderstood. The 2X scaling factor is not a magic number, and there are reasons Arri went with a 1.5x scaling factor instead (which coincides closely with a Bayer sensor's ~70% efficiency: 1.5 * 0.7 is just over 1).

That's all I'm arguing. I think we just misunderstand each other, though. Everything you've written seems right except for the focus on the 2X scaling factor.

Nyquist requires >2x sampling to capture without aliasing, e.g. sample >4K to get 2K and >8K to get 4K (along with the appropriate OLPF). 4K pixels can hold at most 2K line pairs: black pixel, white pixel, and so on. ARRI doesn't oversample anywhere near 2x, and they alias because of it. That's the math & science. In Geoff Boyle's recent chart tests, the only cameras that showed little or no aliasing at 4K were the "8K" Sony F65 and the 7K Red (the footage was only 1080p on Vimeo, however, so small amounts of aliasing might be hidden in the 4K-to-1080p render). Inspecting current cameras shooting those lovely test charts, we can see that as sampling resolution approaches the Nyquist requirement, aliasing goes down, and as we go the other way, aliasing goes up. As predicted by the math & science, right?
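Where above-Nyquist detail actually lands is simple arithmetic. A little sketch of my own (frequencies in cycles, i.e. line pairs, per picture width):

```python
def aliased_frequency(f, fs):
    """Where a detail frequency lands after sampling.

    f  : scene detail, in line pairs per picture width
    fs : sensor photosites across the picture width
    Frequencies at or below fs/2 (Nyquist) pass through faithfully;
    anything above folds back into the 0..fs/2 band as a false,
    lower-frequency pattern.
    """
    f = f % fs             # sampling can't distinguish f from f + k*fs...
    return min(f, fs - f)  # ...or from its mirror image about Nyquist

# A 4096-photosite ("4K") sensor row, Nyquist limit 2048 line pairs:
print(aliased_frequency(1500, 4096))  # 1500 -> 1500, resolved faithfully
print(aliased_frequency(2500, 4096))  # 2500 -> 1596, a false coarser pattern
```

This is why the fine-fabric shots alias: 2500 line pairs of real detail come back as a bogus 1596-line-pair moire pattern instead of disappearing gracefully, which is the OLPF's job to prevent.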

Since the 5D3 had eliminated aliasing, I was surprised to see it from Canon when I purchased the C300 II (especially given the massive price difference)! The C300 II has a 4206x2340 sensor, which is barely oversampled, and uses a fairly strong OLPF, producing somewhat soft 4K. The 1DX II's 1080p produces fatter aliasing due to its lower resolution; the C300 II's 4K is better, but fine fabrics are still challenging. Shooting slightly out of focus can work in a pinch (and then bumping sharpening back up in post), but it will be cool when cameras have sufficiently oversampled sensors to eliminate aliasing.
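To put a number on "barely oversampled", here's a rough comparison of horizontal oversampling ratios. The C300 II figure is the one above; the F65 and Alexa figures are the commonly published ones, so treat all of these as approximate:

```python
# Horizontal oversampling ratio: sensor photosites per delivered pixel.
# Sensor/delivery widths are approximate, commonly published figures.
cameras = {
    # name: (horizontal photosites, delivery width)
    "Sony F65 (8K -> 4K)": (8192, 4096),
    "C300 II  (4K -> 4K)": (4206, 4096),
    "Alexa  (2.8K -> 2K)": (2880, 1920),
}

for name, (photosites, delivery) in cameras.items():
    ratio = photosites / delivery
    verdict = "at 2x Nyquist" if ratio >= 2 else "under 2x, aliasing risk"
    print(f"{name}: {ratio:.2f}x oversampled ({verdict})")
```

By this rough check the F65 sits right at the 2x mark, the Alexa at its deliberate 1.5x, and the C300 II at only about 1.03x, which is why it leans on a strong OLPF instead.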

Given how powerful and low-energy mobile GPUs are today, there's no reason in 2017 for cameras to have aliasing, other than a planned, slow upgrade path as cameras gradually approach Nyquist. Once there, what's the resolution upgrade path? Can't have that before there are 8K displays ;) 

