NETFLIX: Which 4K Cameras Can You Use to Shoot Original Content? (missing F5! WTH?!?)

Recommended Posts

47 minutes ago, jcs said:

Imagine a meeting with Netflix executives, marketing, and lawyers, along with reps from ARRI, Red, Sony, and Panasonic, regarding the new 4K subscriptions and 4K content. Red, Sony, and Panny say "we have cameras that actually shoot 4K" and ARRI says "but but...". Netflix exec replies, we're selling 4K, not the obviously superior image quality that ARRI offers, sorry ARRI, when you produce an actual 4K camera like Red, Sony, and Panasonic, let's talk. Netflix marketing and lawyers in the background nod to exec. A while later ARRI releases the Alexa 65 with a 6.6K sensor and it's accepted (actually 3 Alev III sensors rotated 90 degrees and placed together, AKA the A3X sensor).

I suspect you are right and I guess it's a testament to the role of execs versus DoPs in the industry. If you believe that sensors/cameras are the digital equivalent to film stocks, then this is the equivalent of an exec saying that a film has to be shot on Kodak film stocks and not Fuji because they think it has slightly higher visible resolution (according to the Kodak sales rep).

10 hours ago, jcs said:

Nyquist is > 2x sampling to capture without aliasing, e.g. sample >4K to get 2K and >8K to get 4K (along with the appropriate OLPF). 4K pixels can have max 2K line pairs- black pixel, white pixel, and so on. ARRI doesn't oversample anywhere near 2x and they alias because of it. That's the math & science. From Geoff Boyle's recent chart tests, the only cameras that showed little or no aliasing for 4K were the "8K" Sony F65 and the 7K Red (only showed 1080p on Vimeo, however small amounts of aliasing might be hidden when rendering 4K to 1080p). We can see from inspection of current cameras shooting those lovely test charts, that as sampling resolution approaches Nyquist, aliasing goes down, and as we go the other way, aliasing goes up. As predicted by the math & science, right?

You'd think so. So did I. But we're dealing with more complicated systems than "divide by two."

Frequency in the Nyquist-relevant sense is a sine wave. With sound, the appearance and sound of a lone fundamental sine wave are well-known to anyone with a background in subtractive synthesis. In terms of images, however, we usually think in square waves. Compare the images I posted of a sinusoidal zone plate and a standard square wave plate. Resolution charts generally represent square waves, which contain infinite higher order harmonics and thus are theoretically of infinitely high frequency at any given high contrast edge. In theory, a square wave resolution chart is liable to alias at any given frequency, and in practice it might alias at low fundamental frequencies if the contrast is great enough (see Yedlin's windows on the F55). Yes, the higher the acquisition resolution the less aliasing there is in general, but the "oversample by two" dictum works so well more because it lets the B and R in the Bayer grid reach full resolution than because it fulfills the specific terms of the Nyquist theorem in a monochrome sense.
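To make the square-wave point concrete, here's a quick sketch of my own (plain NumPy, not anything from the thread): a square wave whose fundamental sits far below Nyquist still carries strong energy in its odd harmonics, following the 4/(πn) Fourier series, so a "low frequency" hard edge is never really low frequency.

```python
import numpy as np

# A square wave is a sum of odd harmonics with amplitudes 4/(pi*n), so even
# a low-fundamental square wave has energy far above its fundamental. That
# is why hard black/white chart edges can alias at "low" chart frequencies.
fs = 1000                        # sample rate
f0 = 25                          # fundamental, far below Nyquist (500)
t = (np.arange(fs) + 0.5) / fs   # half-sample offset avoids exact zeros
square = np.sign(np.sin(2 * np.pi * f0 * t))

spec = 2 * np.abs(np.fft.rfft(square)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

fund = spec[freqs == f0][0]        # ~4/pi ~= 1.27
third = spec[freqs == 3 * f0][0]   # ~1/3 of the fundamental
```

The odd harmonics march upward indefinitely; the sampler sees all of them unless a low-pass filter removes the ones beyond Nyquist first.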

As you correctly state, 4k pixels can represent a maximum of 2k line pairs: one black pixel and one white pixel per line pair. But 2k line pairs times two lines per pair equals 4k lines. And I think we can agree that one black pixel and one white pixel are two distinct pixels... so 4k linear pixels can represent 4k linear pixels... which still only represents 2k cycles (line pairs). Nyquist holds true.

And in practice, a 4k monochrome sensor can indeed capture up to 4k resolution with no aliasing... of a sinusoidal zone plate. When you turn that zone plate into square waves (black and white lines rather than gradients) you can induce aliasing sooner because you're introducing higher order harmonics. NOT because the fundamental is surpassing the 2k cycle threshold. Again, 4k image resolution measures pixels. 2k line pairs measures pairs of pixels. 

Nyquist holds true. But the reasons why you see aliasing on those test charts with the C300 Mk II and not the F65 are not because Nyquist demands you oversample by two (it's really more like Nyquist converts pixel pairs to signal cycles), but because those test charts' lines represent infinite order harmonics and that oversampling in general helps prevent aliasing at increasingly high frequencies.

As predicted by math and science. :)


Been doing some extensive research myself overnight.

It's called Terre d'Hermès apparently and according to the manufacturer it's "a symbolic narrative revolving around a raw material and its metamorphosis. A novel that expresses the alchemical power of the elements. A water somewhere between the earth and the sky. A journey imbued with strength and poetry. Woody, vegetal, mineral."

I'll be dousing myself in this and inviting Sofia over for what I believe the youngsters call "Netflix and chill".

Or "Netflix from an approved list of 4K capable originating cameras and chill" as it's known on here.

20 hours ago, mercer said:

Idk, I think Mr. Netflix may tell me, "Hey Kid, if only your not horrible movie had pretty dames like Marty's movies, you may have something, see." Think bad Edward G. Robinson impression.

Glenn, you made me check out Edward G. Robinson. Nice find! That Netflix mogul in an Edward G. Robinson impression is a great idea for a little scene itself! :)

14 hours ago, HockeyFan12 said:

You'd think so. So did I. But we're dealing with more complicated systems than "divide by two."

Frequency in the Nyquist-relevant sense is a sine wave. With sound, the appearance and sound of a lone fundamental sine wave are well-known to anyone with a background in subtractive synthesis. In terms of images, however, we usually think in square waves. Compare the images I posted of a sinusoidal zone plate and a standard square wave plate. Resolution charts generally represent square waves, which contain infinite higher order harmonics and thus are theoretically of infinitely high frequency at any given high contrast edge. In theory, a square wave resolution chart is liable to alias at any given frequency, and in practice it might alias at low fundamental frequencies if the contrast is great enough (see Yedlin's windows on the F55). Yes, the higher the acquisition resolution the less aliasing there is in general, but the "oversample by two" dictum works so well more because it lets the B and R in the Bayer grid reach full resolution than because it fulfills the specific terms of the Nyquist theorem in a monochrome sense.

As you correctly state, 4k pixels can represent a maximum of 2k line pairs: one black pixel and one white pixel per line pair. But 2k line pairs times two lines per pair equals 4k lines. And I think we can agree that one black pixel and one white pixel are two distinct pixels... so 4k linear pixels can represent 4k linear pixels... which still only represents 2k cycles (line pairs). Nyquist holds true.

And in practice, a 4k monochrome sensor can indeed capture up to 4k resolution with no aliasing... of a sinusoidal zone plate. When you turn that zone plate into square waves (black and white lines rather than gradients) you can induce aliasing sooner because you're introducing higher order harmonics. NOT because the fundamental is surpassing the 2k cycle threshold. Again, 4k image resolution measures pixels. 2k line pairs measures pairs of pixels. 

Nyquist holds true. But the reasons why you see aliasing on those test charts with the C300 Mk II and not the F65 are not because Nyquist demands you oversample by two (it's really more like Nyquist converts pixel pairs to signal cycles), but because those test charts' lines represent infinite order harmonics and that oversampling in general helps prevent aliasing at increasingly high frequencies.

As predicted by math and science. :)

You're right it's not about 2x or divide by 2. It's about > 2x, exactly 2x will alias. I think what's bugging you is the hard edges and infinite harmonics requirement. Since it sounds like you have a background in sound and music synthesis, let's talk about it from the point of view of sound and specifically resampling sound. In order to resample sound, we must have a signal > 2x the desired target frequency. Let's say we have a 48kHz sample and we need it at 44.1kHz. Can we just drop samples? Sure, and you know what that sounds like- new high frequency harmonics that sound bad. First we have to upsample the 48kHz sample just over 88.2kHz (add zeroes and low pass filter, use a sinc interpolator). Note that the resulting 88.2kHz signal has also been low pass filtered. Now we decimate by dropping every other sample and we have alias-free 44.1kHz.
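As a minimal numeric illustration of that folding (my own NumPy sketch, not the resampler itself): if we just drop samples from a 10 kHz tone recorded at 48 kHz to get a 16 kHz rate, the tone has nowhere to go and folds around the new 8 kHz Nyquist limit down to 6 kHz.

```python
import numpy as np

fs = 48_000
t = np.arange(4800) / fs              # 0.1 s of signal
x = np.sin(2 * np.pi * 10_000 * t)    # 10 kHz tone

# Naive decimation: keep every 3rd sample -> 16 kHz rate, Nyquist = 8 kHz.
# No low-pass filter first, so the 10 kHz tone must alias into the band.
y = x[::3]
fs2 = fs // 3

spec = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), 1 / fs2)
peak = freqs[np.argmax(spec)]          # folds to 16 kHz - 10 kHz = 6 kHz
```

A proper resampler low-passes below 8 kHz before decimating, so the 10 kHz tone is removed rather than folded into a spurious 6 kHz tone.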

The same > 2x factor applies to camera photosite sampling images as well, and we can deal with that optically, digitally, or both. The way we get around the square wave from black/white edges is via the low-pass filter, or OLPF. So the sensor will never see a sharp discontinuous infinite harmonic edge :) 

Here are all the possible harmonics (frequencies from low to high) for an 8x8 group of pixels, from: https://en.wikipedia.org/wiki/Discrete_cosine_transform

DCT-8x8.png

Aliasing typically occurs in the higher frequencies.

The trick with the OLPF is that instead of only black and white, we'll get some gray too, which helps with anti-aliasing. It helps to look at aliasing in general:

https://www.howtogeek.com/73704/what-is-anti-aliasing-and-how-does-it-affect-my-photos-and-images/

Aliasing_a.png

Aliased.pngAntialiased-lanczos.png

The right A is anti-aliased by adding gray pixels in the right places on the edges. The right checkerboard is softer and alias-free. You can see that if we blur an image before resizing smaller, we'll get a better looking image. So, theoretically, a monochrome sensor with a suitable OLPF can capture full resolution without aliasing. I think the problem with Bayer sensors is the 1/4-res color photosites. By using a >2x sensor we then have enough color information to prevent aliasing. Also, for a monochrome sensor it might be hard to create a perfect OLPF; so, with relaxed requirements for the OLPF, capturing at 2x and then blurring before downsampling to 1x for storage might look as alias-free as is possible.

So while I see your point, in the real world it does appear that as we approach 2x, aliasing goes down (F65 vs. all lower res examples). Are there any counterexamples?

23 hours ago, mat33 said:

I suspect you are right and I guess it's a testament to the role of execs versus DoPs in the industry. If you believe that sensors/cameras are the digital equivalent to film stocks, then this is the equivalent of an exec saying that a film has to be shot on Kodak film stocks and not Fuji because they think it has slightly higher visible resolution (according to the Kodak sales rep).

I think it might be as simple as "4K" or not in the simplest definition (camera sensor resolution).

21 hours ago, jcs said:

So while I see your point, in the real world it does appear that as we approach 2x, aliasing goes down (F65 vs. all lower res examples). Are there any counterexamples?

It does appear that way and it makes sense that it would. The higher the sampling frequency the less aliasing. The Nyquist theorem does apply to images and every time you increase the resolution of the sensor by a linear factor of two you increase the amount of resolution before apparent aliasing by about that much too (other aspects of the acquisition chain notwithstanding). I'm not debating that.

It's just that 2X oversampling for sensors is based on what works practically, not on the 2X number in the Nyquist theorem. Most resolution charts are printed in high contrast black and white lines (4k means 4k lines), which represent square waves, whereas Nyquist concerns line pairs (4k means 2k line pairs or 2k cycles), which are represented in sine waves, which look like gradients.

If you have a set of <2048 line pairs (fewer than 4k lines) in a sinusoidal gradient rather than in black and white alone, a 4k monochrome sensor can resolve them all. So a 4k monochrome sensor can resolve up to, but not quite, full 4k resolution in sinusoidal cycles without aliasing (<1/2 frequency in cycles * 2 pixels per cycle, so long as those cycles are true sine waves, = <1 frequency in pixels). So under very specific circumstances, a 4k monochrome sensor can resolve UP TO 4k resolution without aliasing... but what a 4k sensor can resolve in square wave cycles is another question entirely... a more complicated one. Because square waves are of effectively infinite frequency along their edges.

All I'm really getting at is that Nyquist concerns sine waves, not square waves, but pixels are measured without respect to overtones. If we're strictly talking sine waves, then a monochrome 4k sensor can resolve up to 2k sinusoidal cycles, which equals up to 4k half cycles or pixels, which equals up to 4k pixels. If we're strictly talking square waves, then any sensor should theoretically be prone to aliasing along the high contrast edges at any frequency because black-and-white lines have infinite order harmonics throughout... but that's where OLPF etc. come in.

1 hour ago, HockeyFan12 said:

All I'm really getting at is that Nyquist concerns sine waves, not square waves, but pixels are measured without respect to overtones. If we're strictly talking sine waves, then a monochrome 4k sensor can resolve up to 2k sinusoidal cycles, which equals up to 4k half cycles or pixels, which equals up to 4k pixels. If we're strictly talking square waves, then any sensor should theoretically be prone to aliasing along the high contrast edges at any frequency because black-and-white lines have infinite order harmonics throughout... but that's where OLPF etc. come in.

If we line up the test chart just right to the camera sensor, we can get full resolution:

http://www.xdcam-user.com/tech-notes/aliasing-and-moire-what-is-it-and-what-can-be-done-about-it/

f55-no-aliase.jpg?w=777

And if we shift it off center we can get this:

f55-aliasin-all-grey.jpg?w=777
 

If we look at Chapman's images above, we see that while we can resolve lines at the resolution of the sensor in one limited case, in all other cases we'll get aliasing.

More info including optics: http://www.vision-doctor.com/en/optic-quality/limiting-resolution-and-mtf.html. The OLPF cuts the infinite frequency problem.

With the proper OLPF to limit frequencies beyond Nyquist, there should be no aliasing whatsoever. When not shooting test charts, the OLPF version will look softer vs. a non-OLPF version as false detail from aliasing will be gone. Again, I think the >2x factor and Bayer sensor camera's ability to resolve at-the-limit detail without aliasing is related to 1/4 resolution R-B photosites, which is a different problem. If the OLPF was tuned to prevent color aliasing, the image would look very soft and this is a tradeoff in most cameras. Whatever Sony is doing with the F65, it provides clean, detailed 4K without apparent aliasing from an "8K" sensor (rotated Bayer).


F65 details: https://www.abelcine.com/store/pdfs/F65_Camera_CameraPDF.pdf.

Quote

20 million photosites versus “4K”

Some camera manufacturers measure their sensor resolution on the basis of “K,” a unit that equates to 1024 horizontal photosites. So a 4K sensor might have 4096 photosites on every row. Unfortunately, the actual resolution is less because these sensors use Bayer color filter arrays. This design leverages two facts. First, the human eye is more sensitive to black & white (luminance) detail than to color detail. Second, the largest component of luminance is Green. In the Bayer array, 50% of the photosites detect Green light only, 25% detect Blue only and the remaining 25% detect Red only. The 4K Bayer array runs into challenges when you try to construct a ‘true’ 4K RGB output image from the compromised RGB Bayer components. This requires a de-Bayering process, an exercise in guesswork that must estimate two out of three color values for each pixel. For example, on a Red photosite, the de-Bayering algorithm must guess at Green and Blue values. However sophisticated the de-Bayering algorithms may be, they still cannot fully recreate image information that was never captured to begin with. For this reason, the final resolution falls, to various degrees, short of true 4K.

The test chart validates Sony's claims:

Still need to see 4K charts for the Alexa 65 and Red 8K. It does appear that so far only the F65 delivers true 4K.
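The "guesswork" the Sony PDF describes can be sketched as the simplest possible de-Bayer: bilinear interpolation, where each missing colour value is estimated by averaging the nearest photosites that did record it. This is my own toy NumPy illustration (an RGGB layout is assumed), not Sony's or anyone's actual pipeline:

```python
import numpy as np

def conv_same(img, kernel):
    # Small 'same'-size 2-D convolution helper (no SciPy dependency).
    kh, kw = kernel.shape
    pad = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(img)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def debayer_bilinear(raw):
    # raw: single-channel mosaic, assumed RGGB tiling.
    h, w = raw.shape
    masks = np.zeros((h, w, 3), bool)
    masks[0::2, 0::2, 0] = True   # R photosites
    masks[0::2, 1::2, 1] = True   # G photosites (R rows)
    masks[1::2, 0::2, 1] = True   # G photosites (B rows)
    masks[1::2, 1::2, 2] = True   # B photosites
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])
    rgb = np.zeros((h, w, 3))
    for c in range(3):
        sampled = np.where(masks[..., c], raw, 0.0)
        weight = masks[..., c].astype(float)
        # Two out of three values per pixel are estimates -- exactly the
        # "guess" the quoted passage refers to.
        rgb[..., c] = conv_same(sampled, kernel) / conv_same(weight, kernel)
    return rgb
```

A flat gray mosaic reconstructs perfectly, but any detail near the photosite pitch does not, which is why delivered resolution falls short of the photosite count.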

Regarding sine waves, square waves, and harmonics, I thought you might find it interesting that you can create any possible pixel pattern for the 8x8 square by adding up all 64 of these squares (basis functions) with the appropriate weights:

DCT-8x8.pngDct-table.png

Idct-animation.gif 

"On the left is the final image. In the middle is the weighted function (multiplied by a coefficient) which is added to the final image. On the right is the current function and corresponding coefficient. Images are scaled (using bilinear interpolation) by factor 10×."

That's how you do Fourier Synthesis with pixels (equivalent to sine waves with audio).
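A quick way to verify that claim (my own NumPy sketch of the orthonormal 2-D DCT-II, not code from the thread): build the 64 basis images, project a random 8x8 block onto them to get the weights, then rebuild the block exactly from the weighted sum.

```python
import numpy as np

N = 8

def dct_1d(u, x):
    # Orthonormal 1-D DCT-II basis term.
    c = np.sqrt(1.0 / N) if u == 0 else np.sqrt(2.0 / N)
    return c * np.cos((2 * x + 1) * u * np.pi / (2 * N))

xs = np.arange(N)
# The 64 two-dimensional basis images shown in the figure above.
basis = np.array([[np.outer(dct_1d(u, xs), dct_1d(v, xs))
                   for v in range(N)] for u in range(N)])

block = np.random.default_rng(0).random((N, N))
coeffs = np.einsum('uvxy,xy->uv', basis, block)   # one weight per basis image
recon = np.einsum('uv,uvxy->xy', coeffs, basis)   # weighted sum rebuilds block
```

Because the basis is orthonormal, the reconstruction is exact (to floating-point precision) for any 8x8 block, which is the pixel analogue of rebuilding a sound from sine waves.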

On 7/27/2017 at 10:17 AM, PannySVHS said:

Glenn, you made me check out Edward G. Robinson. Nice find! That Netflix mogul in an Edward G. Robinson impression is a great idea for a little scene itself! :)

two things: (1) this thread is amazing

and (2) @mercer you inspired me to watch little caesar last night and i loved it

what a performer

 

8 hours ago, jcs said:

If we line up the test chart just right to the camera sensor, we can get full resolution:

http://www.xdcam-user.com/tech-notes/aliasing-and-moire-what-is-it-and-what-can-be-done-about-it/

f55-no-aliase.jpg?w=777

And if we shift it off center we can get this:

f55-aliasin-all-grey.jpg?w=777
 

If we look at Chapman's images above, we see that while we can resolve lines at the resolution of the sensor in one limited case, in all other cases we'll get aliasing.

More info including optics: http://www.vision-doctor.com/en/optic-quality/limiting-resolution-and-mtf.html. The OLPF cuts the infinite frequency problem.

With the proper OLPF to limit frequencies beyond Nyquist, there should be no aliasing whatsoever. When not shooting test charts, the OLPF version will look softer vs. a non-OLPF version as false detail from aliasing will be gone. Again, I think the >2x factor and Bayer sensor camera's ability to resolve at-the-limit detail without aliasing is related to 1/4 resolution R-B photosites, which is a different problem. If the OLPF was tuned to prevent color aliasing, the image would look very soft and this is a tradeoff in most cameras. Whatever Sony is doing with the F65, it provides clean, detailed 4K without apparent aliasing from an "8K" sensor (rotated Bayer).

While that's true, it would be a different story if there were fewer in the input than in the sensor and if the input were sinusoidal, not square.

I fully agree that the 1/4 resolution R and B photo sites are the big issue. And the F65 sure looks good.

1 hour ago, HockeyFan12 said:

While that's true, it would be a different story if there were fewer in the input than in the sensor and if the input were sinusoidal, not square.

I fully agree that the 1/4 resolution R and B photo sites are the big issue. And the F65 sure looks good.

When an OLPF is used there will be no hard or square edges; instead, a gradient closer to a sinusoid: rounded squares... You can simulate an OLPF and camera sensor (hard quantization) by Gaussian blurring the test image, then resizing the image down to say 12% using Nearest Neighbor. Testing this ZoneHardHigh.png from here: http://www.bealecorner.org/red/test-patterns/ (really brutal alias test!), you'll see that with no blurring before resizing it aliases like crazy, and as blur goes up, aliasing goes down; however, the final image gets softer. In order to get rid of most of the aliasing, the blur must be very strong (12% target):

Resized Nearest Neighbor (effectively what the sensor does with no OLPF):

ZoneResize.png.951aae0064616e36dacf838100622ea4.png

Gaussian Blur before resize (10 pixels), simulating an OLPF:

ZoneBlur.png.80e822e047fab8d704f6a0ec737e4251.png

Then adding post-sharpening (as many cameras do), though some aliasing starts to become more visible:

ZoneBlurSharp.png.3da22138bedd6a88a5acdf02b74f90d5.png

Then adding local contrast enhancement (LCE, a form of unsharp masking, performed before the final sharpen operation):

ZoneBlurLCESharp.png.db2879b0a4ba9cf8d8a125a9a4f079f9.png

Thus there are definite benefits to doing post sharpening and LCE after a sufficient OLPF gets rid of most of the aliasing (vs. using a weaker OLPF which results in a lot more aliasing).
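The blur-then-decimate pipeline above can be sketched in a few lines of NumPy (my own sketch: a crude box blur stands in for Photoshop's Gaussian, and 1/8 for the 12% resize):

```python
import numpy as np

# Square-wave zone plate: ring frequency rises with radius, so the outer
# rings exceed the decimated grid's Nyquist limit and alias badly.
n = 512
yy, xx = np.mgrid[0:n, 0:n]
r2 = (xx - n / 2) ** 2 + (yy - n / 2) ** 2
plate = (np.sin(r2 * np.pi / n) > 0).astype(float)  # hard black/white rings

def box_blur(img, k):
    # Crude separable box filter standing in for the OLPF / Gaussian blur.
    kern = np.ones(k) / k
    img = np.apply_along_axis(np.convolve, 0, img, kern, mode='same')
    return np.apply_along_axis(np.convolve, 1, img, kern, mode='same')

step = 8
naive = plate[::step, ::step]                     # no OLPF: aliases hard
olpf = box_blur(plate, 2 * step)[::step, ::step]  # blur first: softer, cleaner

def edge_energy(img):
    # Mean squared gradient: a rough proxy for hard-edge/alias content.
    gy, gx = np.gradient(img)
    return float(np.mean(gx ** 2 + gy ** 2))
```

The naive decimation keeps far more spurious edge energy than the pre-blurred version, mirroring the first two images above.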

Yeah the F65 is pretty excellent.

5 minutes ago, kaylee said:

@jcs what is this called, this image? this test pattern/whatever? i like it. where did it come from?

Does that remind you of something after taking something? :innocent:

It's called a zone plate test chart, available here: http://www.bealecorner.org/red/test-patterns/, ZoneHardHigh.png. Resized to 12% with Nearest Neighbor resampling.

9 hours ago, jcs said:

When an OLPF is used there will be no hard or square edges; instead, a gradient closer to a sinusoid: rounded squares... You can simulate an OLPF and camera sensor (hard quantization) by Gaussian blurring the test image, then resizing the image down to say 12% using Nearest Neighbor. Testing this ZoneHardHigh.png from here: http://www.bealecorner.org/red/test-patterns/ (really brutal alias test!), you'll see that with no blurring before resizing it aliases like crazy, and as blur goes up, aliasing goes down; however, the final image gets softer. In order to get rid of most of the aliasing, the blur must be very strong (12% target):

Resized Nearest Neighbor (effectively what the sensor does with no OLPF):

ZoneResize.png.951aae0064616e36dacf838100622ea4.png

Gaussian Blur before resize (10 pixels), simulating an OLPF:

ZoneBlur.png.80e822e047fab8d704f6a0ec737e4251.png

Then adding post-sharpening (as many cameras do), though some aliasing starts to become more visible:

ZoneBlurSharp.png.3da22138bedd6a88a5acdf02b74f90d5.png

Then adding local contrast enhancement (LCE, a form of unsharp masking, performed before the final sharpen operation):

ZoneBlurLCESharp.png.db2879b0a4ba9cf8d8a125a9a4f079f9.png

Thus there are definite benefits to doing post sharpening and LCE after a sufficient OLPF gets rid of most of the aliasing (vs. using a weaker OLPF which results in a lot more aliasing).

Yeah the F65 is pretty excellent.

For further comparison, the square wave zone plate downressed with nearest neighbor:

IMG_3392.jpg.99e4e3d72c8c50011c96c5ece448a38e.jpg

Vs. the sine wave zone plate downressed with nearest neighbor:

IMG_3392v2.jpg.34097d71b2257c0856cf2db97079b09b.jpg

As expected, the square wave plate appears to alias at every edge due to the infinite harmonics (even if the interference patterns aren't really visible in the lowest frequency cycles). The sine wave zone plate begins to alias much later.

So yes, the Gaussian blur (OLPF) does fix things when applied aggressively, but much of the aliasing present in the downscaled square wave test chart is due to square wave (black and white with no gradient) lines being of effectively infinite frequency (before they are blurred), not because the frequency of those lines' fundamentals are below a certain threshold dictated by Nyquist.

My only point is that observing aliasing in square wave zone plates and correlating it directly with Nyquist's <2X rule is misguided. Nyquist applies to square wave test charts, but it predicts aliasing at every edge (except in the presence of a low pass filter, as seen above) as every edge is of infinite frequency. Whereas on a sine wave chart, a 4k monochrome sensor can resolve up to 2k sinusoidal line pairs or up to 4k lines, or 4k pixels. Of course, the actual system is far more complicated on most cameras and every aspect of that system is tuned in relation to every other. But I would still argue that the 2x oversampling figure works so well for Bayer because it provides full resolution R and B channels and furthermore simply because any arbitrary amount of oversampling will generally result in a better image.

But not because 2X coincidentally lines up with the <2X rule in the Nyquist theorem, which applies to sine waves, not square pixels.

(Fwiw, I haven't worked much with the C300 Mk II but I do remember that the C500 had false detail problems worse than on the Alexa and Epic and F55, but its RAW files also appear considerably sharper per-pixel. Canon may have erred on the side of a too-weak OLPF. The C200 test chart looks really sharp.)

11 hours ago, kaylee said:

two things: (1) this thread is amazing

and (2) @mercer you inspired me to watch little caesar last night and i loved it

what a performer

 

Yeah, he was great. I never saw Little Caesar, but that clip was excellent. It's insane the amount of dialogue those classic actors had to memorize before a cut. It really allowed them to get into their character. Nowadays we have prima donnas complaining about line of sight instead of just playing their part. 

3 hours ago, HockeyFan12 said:

So yes, the Gaussian blur (OLPF) does fix things when applied aggressively, but much of the aliasing present in the downscaled square wave test chart is due to square wave (black and white with no gradient) lines being of effectively infinite frequency (before they are blurred), not because the frequency of those lines' fundamentals are below a certain threshold dictated by Nyquist.

My only point is that observing aliasing in square wave zone plates and correlating it directly with Nyquist's <2X rule is misguided. Nyquist applies to square wave test charts, but it predicts aliasing at every edge (except in the presence of a low pass filter, as seen above) as every edge is of infinite frequency. Whereas on a sine wave chart, a 4k monochrome sensor can resolve up to 2k sinusoidal line pairs or up to 4k lines, or 4k pixels. Of course, the actual system is far more complicated on most cameras and every aspect of that system is tuned in relation to every other. But I would still argue that the 2x oversampling figure works so well for Bayer because it provides full resolution R and B channels and furthermore simply because any arbitrary amount of oversampling will generally result in a better image.

But not because 2X coincidentally lines up with the <2X rule in the Nyquist theorem, which applies to sine waves, not square pixels.

(Fwiw, I haven't worked much with the C300 Mk II but I do remember that the C500 had false detail problems worse than on the Alexa and Epic and F55, but its RAW files also appear considerably sharper per-pixel. Canon may have erred on the side of a too-weak OLPF. The C200 test chart looks really sharp.)

(C300 II & C500 vs. Alexa, Epic, F55 and false detail: the C300 II & C500 have lower resolution Bayer sensors, so they compensate with weaker OLPFs to gain apparent sharpness vs. the much higher resolution competitors. The tradeoff is more aliasing (I wasn't happy when I saw it upgrading from an alias-free 5D3!). When you think about limit theory, as sensor resolution increases, the need for an OLPF decreases, and at the limit we don't need an OLPF at all (e.g. A7R II, 5DSR). So it makes sense that the higher resolution cameras have less false detail, especially in the case of the Alexa capturing 2.8K and delivering 2K. Capture ultra high resolution without an OLPF, then provide a filtered/blurred downsample to the target resolution.)

Two ideas:

  1. Maximum resolution capture possible from a camera sensor
  2. Aliasing

I agree a 1920x1080 monochrome sensor can indeed capture a maximum resolution of 1920x1080 pixels. In terms of frequency, we need an up and a down so we get zero crossings to form cycles (2 pixels). So that's 960x540 lines pairs in terms of frequency. Nyquist says if we want to eliminate aliasing, we must sample at > 2x the desired frequency. If we line up a test chart and 1920x1080 camera sensor perfectly, we can capture 960x540 LP without aliasing (1920x1080 pixels). However, as soon as we start moving around, it will alias like crazy. We fix it by applying an optical low pass filter (OLPF), so that the 'input' pixels are now effectively >2x wider and taller. Starting with the initial case of lining the sensor up perfectly with the test image, we've got a sharp, 1920x1080 pixel capture. Now we apply the OLPF and the image now appears less sharp and somewhat blurry, now capturing 480x270 LP in terms of frequency, or 960x540 pixels (it's still 1920x1080 pixels, slightly blurry). However, if we move the sensor around, the non-OLPF capture will alias like crazy, and the OLPF version will look nice, without aliasing (to the limits of the OLPF). This is for any image with high-frequency information beyond Nyquist, not just test charts (brick walls and fabrics are common problems).

Which means:

  1. Maximum capture resolution possible for a monochrome sensor is W x H pixels, or W/2 x H/2 line pairs in terms of frequency
  2. Maximum possible capture resolution for (1) without aliasing is: (W/2)/2 x (H/2)/2 line pairs in terms of frequency, or W/2 x H/2 in terms of pixels

An HD (1920x1080) monochrome sensor can capture up to 480x270 line pairs (960x540 pixels) without aliasing when using a proper OLPF, a 4K (3840x2160) monochrome sensor can capture up to 960x540 line pairs (1920x1080 pixels) in the same way.
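Those numbers, exactly as the post defines them, reduce to a one-liner (just restating the arithmetic above, nothing new):

```python
def alias_free_capture(w, h):
    # Per the reasoning above: with a proper OLPF, alias-free detail tops
    # out at half the photosite count per axis in pixels, or a quarter of
    # it in line pairs.
    return {'pixels': (w // 2, h // 2), 'line_pairs': (w // 4, h // 4)}
```

So `alias_free_capture(1920, 1080)` gives 960x540 pixels (480x270 line pairs), and `alias_free_capture(3840, 2160)` gives 1920x1080 pixels (960x540 line pairs), matching the HD and 4K cases above.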

Another way to think about it: can you draw a black 1-pixel-wide vertical line in Photoshop without aliasing? Now draw the line again with a slight angle (not perfectly diagonal). It's going to have discontinuous breaks now, which is aliasing. How can we fix it? We need to add grayscale pixels in the right places, so now the line must be at least 2 pixels wide in places to provide anti-aliasing. If we create an alternating black & white 1-pixel-wide grid, which is the max possible frequency, we can't rotate it and still have the original pattern. If we don't antialias, it will be a random-looking jumble of pixels; if we antialias, it will be a gray rectangle: the original pattern will be gone. As we make the grid lower resolution, so that we can antialias it as we rotate it, the pattern can still be visible.
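The tilted-line experiment can be mimicked numerically (my own sketch): rasterise a slightly angled one-pixel-wide line with a hard in/out test versus 4x supersampling, and only the supersampled version gains the gray coverage pixels that hide the breaks.

```python
import numpy as np

n = 32

def raster_line(s=1):
    # Rasterise a nearly-vertical 1-pixel-wide line at s*s sub-samples per
    # pixel; s=1 is a hard binary test, s>1 averages sub-samples to gray.
    yy, xx = (np.mgrid[0:n * s, 0:n * s] + 0.5) / s
    inside = np.abs(xx - (n / 2 + 0.05 * yy)) < 0.5  # slight 0.05 px/row tilt
    img = inside.astype(float)
    if s > 1:
        img = img.reshape(n, s, n, s).mean(axis=(1, 3))
    return img

hard = raster_line(1)   # only 0s and 1s: jagged, with discontinuous breaks
soft = raster_line(4)   # fractional grays appear along the edge
```

The hard raster contains nothing but pure black and white, while the supersampled one contains the in-between grays that anti-aliasing requires.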

The problem with Bayer sensor cameras (except apparently the F65), is the OLPF is tuned such that effectively luminance is antialiased (mostly from green), but since red and blue are 1/4 resolution vs. green, we end up with color aliasing. Some cameras of course alias in both luminance and chrominance- a weaker OLPF or no OLPF is being used. If we want alias-free 4K, we need an 8K sensor and tuned OLPF. It's 2x again, now due to the undersampling from the Bayer sensor in R & B.

This is one of the factors in the 'film look': there must be zero aliasing, and the image can even look a bit soft. However the noise grain can be much higher frequency, providing an illusion of more texture and detail.

Right now it appears only Sony's F65 and maybe Red's 8K sensors can provide True 4K, so Netflix ought to get busy and update their camera list! :)  I believe this could stand up in a court of law, so excluding the Alexa because it's only (max) 3.4K is BS, since only the F65 and maybe 8K Red provide real, actual 4K. All others are undersampling in R & B (including the Alexa 65). The marketing angle should be that any camera which provides over HD resolution and detail requires a 4K display, and will thus look better than HD when streamed on Netflix's 4K subscription plan and viewed sufficiently closely on a 4K display.


+ D800E as another no-OLPF still camera. Also, the OLPF in the prior post is likely tuned to the Bayer photosite size (not green), and red and blue are 1/4 resolution with green at 1/2 resolution (not red & blue at 1/4 relative to green). So all 3 colors are undersampled in a typical Bayer sensor (not the F65).

