DreamVision

Moire Removal / Lessening


1) Avoid shooting subjects that produce it

2) Get a  Super 16mm Arriflex

3) Get a CCD Camera

4) Try some mist/softening filters

5) Appreciate that the naked human eye sees moire, and learn to accept it as "realistic" filmmaking?

 


There are some techniques but they have never worked for me - see here: https://forums.creativecow.net/thread/2/986373

Basically, try to avoid it either by getting another camera or working out how to avoid it with the camera you have.

Also, I find that a lot of people don't actually notice moire at all on videos/TV! So it might not be as distracting to the average viewer as you think it is - depends on the case I guess.


There are some techniques but they have never worked for me - see here: https://forums.creativecow.net/thread/2/986373


Cool, thanks for the link. I want to try this! :lol: http://www.noiseproofpro.com/#!demo/component_9350


Basically, try to avoid it either by getting another camera or working out how to avoid it with the camera you have.

Working out how to avoid it. How best?

1) Avoid shooting subjects that produce it

By doing the contrary first: learning how to produce moire. A lot of factors contribute to it, but in the end it all boils down to the resolution of patterns. So the obvious answer is to look for patterns. The lens has to be capable (sharp enough) of depicting picture elements smaller than the pixel grid on the sensor (google "circle of confusion"), never the other way around. This is what drags the usable resolution down so drastically. If you additionally allow any in-camera sharpening, the aliasing stays the same, but the moire will pop out more (software solutions as shown above affect only this; they can only soften the existing aliasing, imo not worth the trouble/money). There are filters you can put in front of the sensor (google "OLPF") that will tune the spatial frequency. They will eliminate moire. But they are expensive, and they have to be made specifically for your camera.

However, during your deliberate chase of moire patterns you will notice that even with a lens that's too sharp, you will only be able to produce moire under certain conditions:

1. The most important is the spatial frequency of the lens, which changes with the aperture you choose. For just about every lens there are test charts that show this correlation in a graph. As a rule of thumb:

[image: Sony50mmZeissResolutie.jpg - resolution vs. aperture chart]

At open aperture, the lens usually performs 'worse'. You should learn to find the aperture range that allows moire with your combo!

NOTE: With some lenses, choosing an inappropriate aperture alone makes moire disappear (which is not what we intend).

2. The pattern needs a certain contrast. As we all know, contrast is highest on a sunny day at noon and lowest in a dimly lit cellar. So you need to avoid wide apertures in bright light. If you are after moire, stay away from ND filters!

By understanding those two conditions, you will capture 90% of all moire that's achievable.

The third condition is the relative size of the pattern itself. Whereas the first two factors are easy to manage, this one really restricts your options regarding framing, focal lengths or choice of subjects in the first place. A pain in the ass. But in reality it's actually pretty easy to instantly recognize promising patterns. And then, all together now:

1. Turn up in-camera sharpening (this embosses otherwise barely noticeable moire and bakes it into the image, best advice!)

2. Find the aperture at which your lens performs best (also depends on focal length with zooms)

3. Change your position during shooting ever so slightly. Doesn't need much practice, you'll know intuitively pretty soon, it actually doesn't limit your choices.

4. Don't be afraid of moire-free images! Cameras with very high moire efficiency have been around for almost a decade now. It's lunacy, but many filmmakers have by now lost their skills to capture moire. Cinema features have been released shot with those cameras, and nobody complained about the missing rainbows. People got used to the clean images, so don't worry.
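Fwiw, the "it all boils down to the resolution of patterns" story above can be checked in a few lines of numpy. This is a toy 1-D sketch with made-up numbers, not any real camera: a stripe pattern finer than the sampling grid folds into a false coarse stripe, and a crude pre-blur (standing in for an OLPF) suppresses it.

```python
import numpy as np

# A "scene" with a stripe pattern at 0.45 cycles/pixel, point-sampled by a
# sensor that keeps only every 2nd pixel. The sampled grid's Nyquist limit is
# 0.25 cycles/scene-pixel, so the pattern must alias.
n = 1024
x = np.arange(n)
scene = np.sin(2 * np.pi * 0.45 * x)           # fine pattern, 0.45 cyc/px

sampled = scene[::2]                           # naive sensor: no OLPF

def dominant_freq(sig):
    """Frequency (cycles per original scene pixel) of the strongest FFT bin."""
    spec = np.abs(np.fft.rfft(sig))
    spec[0] = 0.0                              # ignore DC
    return np.argmax(spec) / len(sig) / 2      # /2: each sample spans 2 scene px

print(dominant_freq(sampled))                  # ~0.05: a false, coarse stripe
# 0.45 cyc/px sampled at 0.5 samples/px folds to |0.5 - 0.45| = 0.05 cyc/px.

# Poor man's OLPF: average neighbouring pixels before sampling.
blurred = np.convolve(scene, np.ones(4) / 4, mode="same")
print(np.abs(np.fft.rfft(blurred[::2])).max()
      < np.abs(np.fft.rfft(sampled)).max())    # True: pre-blur tames the alias
```

Same logic as the OLPF point above: the false low frequency is exactly the fold-over around the sampling rate, and killing the detail before the grid sees it is the only clean fix.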


If at all possible, control wardrobe. I find moire more distracting on people than on inanimate objects. Meshy, fine patterns should be avoided. Solid earthtones!

Try and get a good monitor so you can spot the moire and maybe take some of the steps Axel suggests. Trying to fix it at home at your desktop is difficult - fix it before you record to your SD card.

 


1) Avoid shooting subjects that produce it

2) Get a  Super 16mm Arriflex

3) Get a CCD Camera

4) Try some mist/softening filters

5) Appreciate that the naked human eye sees moire, and learn to accept it as "realistic" filmmaking?

 

1) Certainly good advice, but not always possible. I agree, though, that's the best approach. I've even seen the Alexa alias pretty severely in the right circumstances (fine-patterned red fabrics at certain distances), and Betacam did terribly. Being smart when addressing set design is always a good choice.

2) Uhh... sure. 

3) No. CCDs alias just as much as CMOS sensors... it's what's in front of the sensor and how it's processed that matters.

4) Worth a try, so is pushing the image just slightly out of focus if you really need to. If it's a brick wall in the distance, you won't lose that much resolution throwing it just a little out of focus to lessen aliasing. But kind of iffy.

5) The human eye does not see any more moire than you put in front of it. If you put an image of something that has moire in front of it, it will see it, same as with film, which also doesn't otherwise alias. 

I've had really, really good luck with some post techniques, assuming the aliasing isn't horrible. I sample the aliasing in Neat Video and use a 0-radius temporal filter and extremely strong low-frequency noise reduction settings. Then I mask this in After Effects where needed, maybe add a little blur, then re-grain. For less severe (color-only) aliasing you can copy the footage onto a layer above the original shot, add a 12-pixel radius blur (or thereabouts) and superimpose that blurred footage using the Color transfer mode. I also mask this layer, as it fuzzes out the richness of the color. These techniques are a pain, but very effective at at least reducing aliasing.
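The chroma half of that trick can be approximated outside AE too: keep the original luma and swap in blurred chroma, which is roughly what the blurred copy in Color transfer mode does. A numpy sketch, assuming BT.601-style YCbCr (the function names and the box blur are mine, not AE's):

```python
import numpy as np

def box_blur(img, radius):
    """Separable box blur on a 2-D array; stands in for AE's gaussian blur."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    pad = np.pad(img, ((radius, radius), (radius, radius)), mode="edge")
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, tmp)

def soften_chroma(rgb, radius=12):
    """Blur only Cb/Cr; luma (and so apparent sharpness) is untouched."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b          # BT.601 luma
    cb, cr = (b - y) * 0.564, (r - y) * 0.713      # chroma differences
    cb, cr = box_blur(cb, radius), box_blur(cr, radius)
    r2 = y + cr / 0.713                            # rebuild RGB from Y + blurred C
    b2 = y + cb / 0.564
    g2 = (y - 0.299 * r2 - 0.114 * b2) / 0.587
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 1.0)
```

On a clip you'd run `soften_chroma` per frame and then mask the result back in where the rainbows are, as described above; it can't touch luma aliasing, same as the AE version.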

That said, the fabric thing is your best bet. I've had decent luck tackling fabric in post, but only if it's pretty minor in the first place. So shoot smart... choose appropriate fabrics for your camera (as I said above, the Alexa aliases with some red fabrics, and the Canon C series even more so, despite both being pretty alias-free normally), throw your worst aliasing out of focus, or avoid shooting brick walls in establishing shots (or shoot a still, blur it and composite it in later: seriously).

Suddenly I don't miss my t2i as much. :)

 


 

5) The human eye does not see any more moire than you put in front of it. If you put an image of something that has moire in front of it, it will see it, same as with film, which also doesn't otherwise alias. 

 

That's the thing, though: like a lot of people I used to think moire was a purely electronic phenomenon. But the naked eye CAN see some moire, and it very much looks like some video-related artifact - but obviously it isn't, otherwise why would the naked eye be seeing it? Once you realize that moire is part of real life, it makes it easier to accept something that shows up in your NLE the next day. I worry much less, generally, than most of you guys here about resolution and even DR; my biggest concern with a motion picture image is the presence of artifacts that look electronic and thus break the 4th wall. Moire perhaps shouldn't be the cringe-inducing apparition that it's made out to be.

so in other words, I'm suggesting a purely psychological remedy here. 


Moire perhaps shouldn't be the cringe-inducing apparition that it's made out to be.

Know your foe!

Make friends with your biggest fears or they may destroy you (Bill to Kiddo, sitting at the campfire playing the flute)

If I had the perfect camera, with no quirks or limitations but with incomparable strengths - 30x optical zoom, perfect AF, AE and AWB, low-light capability down to 10 lux to shoot night for day, whatever - I would never be able to make an informed decision.

so in other words, I'm suggesting a purely psychological remedy here. 

After the physical facts behind a camera's cardinal weaknesses are understood, I suggest one explores them deeper. Eventually dealing with them becomes intuition, almost psychosomatic. Sounds pompous, but it's rather trivial. Take a good pan or a smooth focus transition: could you really teach someone how to do that by describing it verbally? It's all about intuition paired with fine motor skills.


I sample the aliasing in Neat Video and use a 0-radius temporal filter and extremely strong low frequency noise reduction settings.

Nice suggestions. How do you 'sample' the aliasing in Neat Video?


That's the thing, though: like a lot of people I used to think moire was a purely electronic phenomenon. But the naked eye CAN see some moire [...]

so in other words, I'm suggesting a purely psychological remedy here. 

It's not an electronic phenomenon (what does that even mean in this case?), it's a mathematical one. Read up on the Nyquist-Shannon sampling theorem. When one system samples another system there's potential for aliasing at frequencies above half the resolution of the system that's doing the sampling. This means music recorded at 44.1 kHz might alias at frequencies above 22.05 kHz (which is just above the cutoff for human hearing). It means any sensor that's resolving more than 25% (50% in either dimension) of its megapixel count is also aliasing, except when something optically reduces the resolution of the information going into the system below that threshold. Usually no one notices just a little, though, and most dSLRs alias when shooting stills under the worst circumstances... but just a little... and resolve roughly 70% resolution in either dimension at best.
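That fold-around-Nyquist behaviour is easy to verify numerically. A one-second toy clip (the 30 kHz tone is just an example frequency): sampled at 44.1 kHz, anything above 22.05 kHz shows up as a frequency that was never in the signal.

```python
import numpy as np

# A 30 kHz tone sampled at 44.1 kHz (Nyquist = 22.05 kHz) cannot be
# represented; it folds down to |44100 - 30000| = 14100 Hz, an alias.
fs, f_in = 44100, 30000
t = np.arange(fs) / fs                       # one second of sample times
samples = np.sin(2 * np.pi * f_in * t)

spec = np.abs(np.fft.rfft(samples))
f_seen = np.argmax(spec)                     # bin width is 1 Hz for a 1 s clip
print(f_seen)                                # -> 14100, not 30000
```

A sensor does the same thing in two dimensions: detail above half its pitch comes back as a false, lower-frequency pattern, i.e. moire.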

That said, when have you seen aliasing induced by your eyes? I would argue no one ever has. When have you seen aliasing with your eyes? Only when there is a system offering a signal and another filtering it. You see it in images recorded with a sensor that is prone to aliasing because it is sampling information beyond the Shannon-Nyquist limit. You see it when you see one screen door through another at the right frequency (one screen door is the signal... the other is the system filtering it...). But the system that induces aliasing is never your eyes, because your eyes as sensors are so high-resolution that it's effectively irrelevant (and beyond diffraction-limited) and, more importantly, the rod-and-cone pattern is too random to have any statistically significant chance of aligning with another system. (Whereas sensors are grids... and there are plenty of grids and lines out there.)

So no, the eyes do not cause aliasing at any frequency, no matter how high. They can see aliasing when it occurs as a result of another system sampling something above the Shannon-Nyquist limit, however. So... stacks of screen doors, fences, etc. can cause a moire pattern.

But what you're suggesting (that the eyes can induce aliasing) is mathematically and objectively wrong, but maybe psychologically right, sure, in that it's not a foreign phenomenon experientially. We've seen aliasing before in real life when screen doors or wire meshes overlap, and we've seen it in plenty of pictures. So when we see it on camera, it's not too foreign. But when we perceive it in life it's always a product of two systems interacting, and neither of those systems is ever your eye (whereas it could be a sensor... or meshes... or whatever). You will never see moire induced by your eyes alone. It's always one system interfering with another before getting to your eye. Whereas a dSLR sensor (CCD or CMOS) can induce aliasing whenever it captures detail above the Shannon-Nyquist limit; thankfully most OLPFs knock out the majority of detail that causes really bad aliasing... except that line skipping decreases the Shannon-Nyquist limit by a factor of three (if you're reading only every third line) and the OLPF obviously isn't blurring things that much!

Same goes for film, actually. Random dot pattern doesn't induce aliasing. Can't. Won't.

The same idea that explains why line skipping causes aliasing explains why the Alexa has bad aliasing with red fabrics (the C300 even worse!). The OLPF is designed to knock down resolution for green/white light, but the pixel count for red pixels is pretty small on any Bayer sensor. So when your signal is mostly red, it's getting through the OLPF with more than 50% of the frequency of the red pixel array. Thus... aliasing.
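The red-channel argument can be illustrated with the same sampling math, reduced to 1-D (the numbers are illustrative, not any real camera's): red photosites sit on every other row and column of a Bayer grid, so the red channel samples at half the pitch of the full sensor, and a stripe frequency that's legal for the sensor as a whole can still exceed red's Nyquist limit.

```python
import numpy as np

# A stripe frequency below the full grid's Nyquist (0.5 cycles/photosite)
# but above the red subgrid's Nyquist (0.25 cycles/photosite).
f = 0.35
x = np.arange(4096)
pattern = np.sin(2 * np.pi * f * x)

def peak_freq(sig, pitch):
    """Strongest frequency, expressed in cycles per full-sensor photosite."""
    spec = np.abs(np.fft.rfft(sig))
    spec[0] = 0.0
    return np.argmax(spec) / len(sig) / pitch

print(peak_freq(pattern, 1))        # ~0.35: the full grid resolves it fine
print(peak_freq(pattern[::2], 2))   # ~0.15: the half-pitch "red" grid aliases it
```

So a mostly-red signal that sails through an OLPF tuned for the full pitch still folds over on the sparser red array, which matches the red-fabric behaviour described above.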

Can't imagine red fabrics through the 7D. :P;)


Nice suggestions. How do you 'sample' the aliasing in Neat Video?

Good question, lol. I actually posted this technique about four years ago and everyone forgot about it until someone made a plugin that automates a similar process (but just for chroma noise... the chroma blur I mentioned above). It works great. If the aliasing is so bad it's obvious on the screen, it's hard to get rid of, but every time I've seen a bit of it in a suit or something I've been able to fix it. I just find the frame that has the worst and most of it and drag the sampling window around it. It's very low-frequency, so I crank up the low-frequency NR setting in the advanced settings in Neat Video. Works awesome, although obviously it can get so bad nothing can fix it.


It's not an electronic phenomenon (what does that even mean in this case?), it's a mathematical one. Read up on the Nyquist-Shannon sampling theorem. [...] You will never see moire induced by your eyes alone. It's always one system interfering with another before getting to your eye. [...]

I've never seen aliasing with my naked eye. And I don't know about inducing moire with the naked eye in the sense you mean; the takeaway, for video, should be that moire is something you can perceive with your naked eye. Moire is part of real life, not just digital life. That's the key. You don't even have to be an android to see it -- though I'm told it helps.


Fair enough. There are circumstances in which you'll see it (chain-link fences back to back, because one is working as a system sampling the frequency of the other), but when looking at one chain-link fence, or straight at fabric... or one part of a system... you're never going to see it.

So while it's not completely alien to nature, it's still nice to reduce it. No matter how much you stare at that brick wall it won't moire without something else in front of it.


Fair enough. There are circumstances in which you'll see it (chain-link fences back to back, because one is working as a system sampling the frequency of the other), but when looking at one chain-link fence, or straight at fabric... or one part of a system... you're never going to see it.

So while it's not completely alien to nature, it's still nice to reduce it. No matter how much you stare at that brick wall it won't moire without something else in front of it.

Yep, I still hate it. I have a continuing, haunting fantasy of selling everything digital and getting a sexy Arri S16mm.


They're dirt cheap. Film isn't, though, but you can buy short ends. SR3s are selling for peanuts now and the images look as good as ever.

I just do this for fun (and the occasional paycheck), so it's still a little too expensive for me. :)

If you've never tried a Bolex, even that is a lot of fun. Just the lab fees and scans are brutal. :(


If you've never tried a Bolex, even that is a lot of fun. Just the lab fees and scans are brutal. :(

Fun. Yes. I had a mechanical Bolex and developed the B&W films myself. Wouldn't you agree you had to get many more parameters right intuitively (w/o proper preview, w/o proper exposure assistance, w/o sound) than with any of the digital cameras of today? I also had a Kiev 6x6, which had a light meter built into the (detachable) reflex viewer. I found out for myself that guessing the right aperture and shutter (relative to the stock's speed) by simply looking at the ground glass got me better results. If we have to fight some moire now, and if that's our major pita, there's no need to despair.


Wouldn't you agree you had to get many more parameters right intuitively (w/o proper preview, w/o proper exposure assistance, w/o sound) than with any of the digital cameras of today? [...]

But at the same time, film is more forgiving. Get a light meter and you're good to go. The film cams I've used all had them built in -- but a handheld was good to have. But sure, instant replay in cam is tough to beat.


[...] I also had a Kiev 6x6, which had a light meter built into the (detachable) reflex viewer. I found out for myself that guessing the right aperture and shutter (relative to the stock's speed) by simply looking at the ground glass got me better results. [...]

With both film and digital I spot meter and incident meter pretty religiously. I rarely trust internal meters. You're really ill-equipped to expose without the right equipment, and a miscalibrated internal meter on an old Russian camera isn't up to snuff.

But for casual projects, absolutely, I just use the waveform monitor or something and leave it as is. Digital is much easier and my preference for sure - I mean, it's everyone's. I liked having to storyboard every shot and only shooting the portions I knew I needed from each angle to conserve shooting ratio (under extreme circumstances). Fun, but kind of a needless obstacle.


The only moire I can recall seeing in the real world is polarized-pattern stuff - two window screens overlapping, or a black net with a wave or fold in it.

Beyond all the math, moire is an interference issue between the "pixels" of a physical pattern and how they get mapped to the pixels of a sensor. Film being random (grain, vs. pixels in a grid), it's much harder to get moire with film.

In post, I attack moire the way I often attack other issues that need masking. Try a color-based keying effect, and if that doesn't work, experiment with color channels - R, G, or B. Find the one with the most contrast in the area of trouble. Use levels or curves to make a luma mask from that channel (when you want to isolate a sky, cranking the contrast in the blue channel will often make the sky area white and trees/etc. black). Then roto a soft mask to black out stuff you don't want masked, and use that precomp as a luma matte to control blur or noise reduction. This is crazy-useful in Photoshop, but can be a lifesaver in AE as well.
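The channel-pick / levels-crank / masked-blur steps above can be sketched in numpy. This is a rough sketch of the idea, not Photoshop's or AE's actual pipeline; the function names, the std-dev channel pick, and the box blur are all mine.

```python
import numpy as np

def best_channel(img, region):
    """Index of the R/G/B channel with the most contrast in the trouble region."""
    y0, y1, x0, x1 = region
    return int(np.argmax([img[y0:y1, x0:x1, c].std() for c in range(3)]))

def levels_mask(channel, lo, hi):
    """Levels-style contrast crank: lo -> 0, hi -> 1, hard clip outside."""
    return np.clip((channel - lo) / (hi - lo), 0.0, 1.0)

def masked_blur(img, mask, radius=3):
    """Blend a box-blurred copy back in, weighted per pixel by the luma mask."""
    k = 2 * radius + 1
    pad = np.pad(img, ((radius, radius), (radius, radius), (0, 0)), mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(pad, (k, k), axis=(0, 1))
    blurred = win.mean(axis=(-2, -1))
    return img * (1 - mask[..., None]) + blurred * mask[..., None]
```

The roto step (blacking out areas you don't want touched) is just multiplying the levels mask by a hand-drawn soft matte before feeding it to `masked_blur`.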

 

[image: channels.jpg - the same frame split into R, G and B channels]

