
Posts posted by tupp

  1. 8 minutes ago, Ben J. said:

    Yeah but would that mess up the balance of the gimbal? 

    When you said "Glidecam," I got the impression that you were using a Steadicam type rig -- not an electronic gimbal.

     

    Furthermore, a solid underwater housing might be too much for a gimbal.

  2. 24 minutes ago, Ben J. said:

    I'm thinking of using this, but trying to see if there is a more affordable option. https://www.bhphotovideo.com/c/product/1608185-REG/ikelite_71475_200dl_underwater_housing_for.html

    An underwater housing seems like overkill for your purpose.

     

    A rain cover and a strong lens filter might be better and easier.  If you are concerned about damage from the impact force of the paint balls, you could try an insulated rain cover.

  3. 39 minutes ago, kye said:

    Ah, now we've changed the game.  You're saying that the resulting downscaled image will have the same reduced colour depth as the original image.  This is not what you have been saying up until this point.

    Nope.  I said "A conversion can be made so that the resulting Full HD 10-bit image has essentially the equivalent color depth as the original 4K 8-bit image."

     

    I didn't say anything about the original image having "reduced" color depth.  You came up with that.

     

    However, the original image does have lower bit depth than the down-converted image -- even though both images have the same color depth.

     

     

    46 minutes ago, kye said:

    You said that "4K has 4 times the color depth (and 4 times the bit rate) of full HD" which implies that I can film a 4K 8-bit image and get greater colour depth than FHD 8-bit,...

    Yes.  That is a fact -- all other variables being the same in both instances.

     

     

    48 minutes ago, kye said:

    ... but now you're saying that the resulting downscale to FHD will have the same limitations to colour depth, which completely disagrees with your original statement.

    No. It doesn't disagree with anything that I have said.  You are just confusing bit depth with color depth.

     

    With such a down-conversion, resolution is being swapped for greater bit depth -- but the color depth remains the same in both images.

     

    It really is simple, and Andrew Reid's article linked above verifies and explains the down-conversion process.
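    The swap can be sketched numerically.  A minimal toy example (hypothetical values; real converters use more sophisticated filters than a plain 2x2 merge):

```python
# Toy single-channel "4K" patch: a 4x4 grid of 8-bit values (0-255).
img_8bit = [
    [200, 201, 10, 12],
    [202, 203, 11, 13],
    [ 50,  51, 90, 91],
    [ 52,  53, 92, 93],
]

def downscale_2x2_sum(img):
    """Merge each 2x2 block by summing its four 8-bit samples.
    The sum spans 0..1020, which takes 10 bits to store exactly --
    that is where the extra bit depth comes from."""
    h, w = len(img), len(img[0])
    return [
        [img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

out = downscale_2x2_sum(img_8bit)
print(out)  # [[806, 46], [206, 366]] -- half the resolution, 10-bit range
```

    Note that no color information from the toy patch is discarded; it is only re-packed into fewer, deeper samples.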

     

     

    1 hour ago, tupp said:

    The banding remains because it is an artifact that is inherent in the original image.

    55 minutes ago, kye said:

    Correct.  Which is why "4K has 4 times the color depth (and 4 times the bit rate) of full HD" is a fundamentally incorrect statement.

    No. The banding artifact is caused by the bit depth -- not by the color depth.  Posterization, bit depth and color depth are all different properties.

     

     

    58 minutes ago, kye said:

    I shoot with 8-bit, I get colour banding.
    I shoot with 10-bit, I don't get colour banding.

    Seems like it has everything to do with the colour depth of the resulting image.

    Well, it would be so if bit depth and color depth were the same thing, but they're not.

     

    Stop confusing bit depth with color depth.

     

     

    1 hour ago, kye said:

    Please provide links to any articles or definitions (or anything at all) that talks about how colour depth is different to bit depth, because I have looked and I can't find a single reference where someone has made the distinction except you

    I linked an article by the founder of this forum that verifies from a prominent image processing expert that one can swap resolution for higher bit depth when down-converting images -- while retaining the same amount of color information (color depth):

    20 hours ago, tupp said:

    In addition, experts have more recently shown that higher resolutions give more color information (color depth), allowing for conversions from 4k, 4:2:0, 8-bit to Full HD, 4:4:4, 10-bit -- using the full, true 10-bit gamut of tones.  Here is Andrew Reid's article on the conversion and here is the corresponding EOSHD thread.

     

    Also, that very article gives a link to the expert's Twitter feed, should you care to verify that the down-conversion process is actually a fact.

     

    Furthermore, I have given an example proving that resolution affects color depth, and you even agreed that it was correct: 

    On 3/18/2021 at 5:27 PM, kye said:

    Your newspaper example is technically correct...

     

    Additionally, I gave a lead-pipe cinch demonstration on how increasing pixel sites within a given area creates more possible shades, to which you did not directly respond.  Do you deny the facts of that demonstration?

     

     

    1 hour ago, kye said:

    it seems suspiciously like you're changing the definition just to avoid being called out for posting BS online.

    I have repeated essentially the same thing over and over.  It is you who keeps lapsing into confusion between color depth and bit depth.

     

     

    1 hour ago, kye said:

    Then explain it simply.

    I have asked you lots of times to do so.

    I have paraphrased the concept repeatedly in different forms, and I have given examples that clearly demonstrate that resolution is integral to color/tone depth.  In addition, I have linked an article from the forum founder that further explains the concept and how it can work in a practical application.

     

    If you don't understand the simple concept by now, then I am not sure there is much more that I can do.

     

     

    1 hour ago, kye said:

    The really sad thing is that there is some basis to this (and thus why Andrew and others have reported on it)

    Wait, I thought that you said:

    1 hour ago, kye said:

    I have looked and I can't find a single reference where someone has made the distinction except you

     

     

    1 hour ago, kye said:

    Making assertions that resolution can increase bit-depth but then saying that banding will still occur is simply disagreeing with yourself.

    I never asserted that "resolution can increase bit-depth."  Please quit putting words into my mouth.  Changing the resolution will have no effect on the bit depth nor vice versa -- resolution and bit depth are two independent properties.

     

    On the other hand, resolution and bit depth are the two non-perceptual factors of color depth.  So, if the resolution is increased, the color depth is increased (as exemplified in halftone printing and in my pixel site example).  Likewise, if the bit depth is increased, the color depth is increased.

     

    Banding/posterization is something completely different.  It is a phenomenon that can occur with lower bit depths in some situations, but it also can occur in analog imaging systems that possess neither bit depth nor pixels.  The color depth of an image is not affected by whether or not the image exhibits posterization.

     

    Let's say that you shoot two 4k, 8-bit images of the sky within the same few seconds:  one aimed at a part of the sky that exhibits no banding and one aimed at a portion of the sky that produces banding.  Then, you down-convert both images to Full HD, 10-bit using the method described in Andrew Reid's article, in which no color information is lost from the original images.  Would you say that the Full HD, 10-bit image of the banded sky has less color depth than the Full HD 10-bit image of the smooth sky?

     

     

    1 hour ago, kye said:

    For those having to read this, firstly, I'm sorry that discussions like this happen and that it is so difficult to call someone out on them posting BS misleading generic statements.  The reason I do this is because as I've learned more about film-making and the tech behind it, the more I've realised that so many of the things people say on forums like these is just factually incorrect.  This would be fine, and I'm not someone who is fact-checking 4chan or anything, but people make decisions and spend their limited funds on the basis of BS like this, so I feel that we should do our best to call it out when we see it, so that people are better off, rather than worse off after reading these things.

    Well, it's good that you are looking out for us.  Now, if you could only avoid confusing bit depth and color depth...

  4. 5 minutes ago, kye said:

    You're really not getting this....

    I sense irony here.

     

     

    6 minutes ago, kye said:

    So, if 4K has 4 times the colour depth, then downscaled to FHD it should be equivalent to FHD 10-bit.

    A conversion can be made so that the resulting Full HD 10-bit image has essentially the equivalent color depth as the original 4K 8-bit image.  Of course, there are slight conversion losses/discrepancies.

     

     

    11 minutes ago, kye said:

    When I shoot a 4K 8-bit image and get banding in it, and downscale it to FHD, why does the banding remain?  If I took the same shot in FHD 10-bit, there is no banding,...

    The banding remains because it is an artifact that is inherent in the original image.

     

    That artifact has nothing to do with the color depth of the resulting image -- the banding artifact in this case is caused merely by a lower bit depth failing to properly render a subtle transition.  However, do not forget that bit depth is not color depth -- bit depth is just one factor of color depth.

     

    It's actually very simple.
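    A quick sketch of why the artifact tracks bit depth: quantizing the same smooth ramp at two bit depths shows how much coarser the 8-bit steps are (toy values, not any particular camera's processing):

```python
def quantize(value, bits):
    """Snap a 0.0-1.0 value to the nearest representable level."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

ramp = [i / 1000 for i in range(1001)]  # a smooth toy gradient

distinct_8 = len({quantize(v, 8) for v in ramp})
distinct_10 = len({quantize(v, 10) for v in ramp})

print(distinct_8, distinct_10)  # the 8-bit ramp has far fewer steps
```

    Wider flat steps in a subtle transition are exactly what reads as banding on screen.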

     

     

    14 minutes ago, kye said:

    ... so why doesn't the banding get eliminated like you've claimed in your original statement?

    I never claimed that the banding would get eliminated in a straight down-conversion.  In fact, I made this statement regarding your scenario of a straight down-conversion from an 8K banded image to an SD image: 

    22 hours ago, tupp said:

    it is very possible that the 8K camera will exhibit a banding/posterization artifact just like the SD camera.

     

    Where did you get the idea that I said banding would get eliminated in a straight conversion?

     

    You have to grasp the difference between banding/posterization artifacts and the color depth of an image.

  5. 1 hour ago, Yannick Willox said:

    To summarize: I shoot in 4k dci, because of the aspect ratio and also because my lenses actually get a little wider. But would it be better practice to shoot in normal 4k, to downscale to 1080p ? Performance and or quality wise ?

    If you (and/or your client) like the aspect ratio and like the fact that you are using a wider portion of the image circle of your lenses, then, to me, those are the most important considerations.

     

    So, you are probably best shooting at 4096x2160 (DCI 4K) and down-converting cleanly to 2048x1080 (DCI 2K) or less cleanly to 1920x1013.  Any extra rendering time for the odd height pixel in the "less clean" resolution would likely be minimal, but it would probably be a good idea to test it, just to make sure.

  6. 6 hours ago, kye said:

    You're making progress, but haven't gotten there yet.

    Glad to know that I am making progress.  You have not directly addressed most of my points, which suggests that you agree with them.

     

     

    6 hours ago, kye said:

    Please explain how, in an 8K image with banding, an area with dozens/hundreds of pixels that are all the same colour, somehow in the downsampling process you will get something other than simply a lower resolution version of that flat band of colour?

    Firstly, the banding doesn't have to be eliminated in the down conversion to retain the full color depth of the original image.  Banding/posterization is merely an artifact that does not reduce the color depth of an image.  One can shoot a film with a hair in the gate or shoot a video with a dust speck on the sensor, yet the hair or dust speck does not reduce the image's color depth.

     

    Secondly, broad patches of uniformly colored pale sky tend to exhibit shallow colors that do not utilize a lot of color depth bandwidth.  So, it's not as if there is much color depth lost in the areas of banding.

     

    Thirdly, having no experience with 8K cameras, I am not sure if the posterization threshold of such a high resolution behaves identically to those of lower resolutions.  Is the line in the same place?  Is it smooth or crooked or dappled?

     

    In regards to eliminating banding during a down conversion, there are many ways to do so.  One common technique is selective dithering.  I have read that diffusion dithering is favored over other dithering methods.
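    A minimal sketch of the error-diffusion idea (a 1-D simplification of Floyd-Steinberg, with made-up values):

```python
def quantize_with_dither(signal, levels):
    """Quantize a 0.0-1.0 signal to `levels` steps, diffusing each
    sample's rounding error into the next sample."""
    out, error = [], 0.0
    for v in signal:
        target = v + error
        q = round(target * (levels - 1)) / (levels - 1)
        error = target - q            # carry the error forward
        out.append(q)
    return out

ramp = [i / 99 for i in range(100)]
dithered = quantize_with_dither(ramp, 4)  # only 4 output levels

# Hard quantization would leave long flat bands; dithering alternates
# between neighbouring levels so the local average tracks the ramp.
print(sum(dithered) / len(dithered), sum(ramp) / len(ramp))
```

    Real diffusion dithering spreads the error over a 2-D neighbourhood, but the principle is the same: trade a little high-frequency noise for smooth average tones.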

  7. 2 hours ago, kye said:

    No.  Colour depth is bit-depth.

    Nope.  Color depth is the number of different colors that can be produced in a given area.  A given area has to be considered, because imaging necessarily involves area, and area necessarily involves resolution.

     

    Obviously, if a 1-bit imaging system produces more differing colors as the resolution is increased, then resolution is an important factor to color depth -- it is not just bit depth that determines color depth.

     

    The screen printing example above is just such an imaging system -- one that produces a greater number of differing colors as the resolution increases, while the bit depth remains at 1-bit.

     

     

    2 hours ago, kye said:

    The wikipedia entry begins with:

    Quote

    Color depth or colour depth (see spelling differences), also known as bit depth, is either the number of bits used to indicate the color of a single pixel, in a bitmapped image or video framebuffer, or the number of bits used for each color component of a single pixel.

    The Wikipedia definition of color depth is severely flawed in at least two ways:

    1. it doesn't account for resolution;
    2. and it doesn't account for color depth in analog imaging systems -- which possess absolutely no bit depth nor pixels.

     

    Now, let us consider the wording of the Wikipedia definition of color depth that you quoted.  This definition actually gives two image areas for consideration:

    1. "a single pixel" -- meaning an RGB pixel group;
    2. and "the number of bits used for each color component of a single pixel" -- meaning a single pixel site of one of the color channels.

     

    For simplicity's sake, let's just work with Wikipedia's area #2 -- a single-channel pixel site that can produce "N" distinct tones (i.e., 2 raised to its bit depth).  We will call the area of that pixel site "A."

     

    If we double the resolution, the number of pixel sites in "A" increases to two.  Suddenly, we can produce more tones inside "A."  In fact, area "A" can now produce "N²" combinations of tones -- many more than "N" tones.

     

    Likewise, if we quadruple the resolution, "A" suddenly contains four times the pixel sites that it did originally, with the number of possible tone combinations within "A" now increasing to "N⁴."
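    This can be sanity-checked by brute force.  A toy count, under the assumption that the viewer simply averages the patch (the distinct average shades grow more slowly than the raw per-site combinations, but they clearly grow with resolution at a fixed bit depth):

```python
from itertools import product

def distinct_area_tones(levels, sites):
    """Count the distinct average tones an area of `sites` pixel
    sites can display when each site offers `levels` tones."""
    return len({sum(combo) / sites
                for combo in product(range(levels), repeat=sites)})

# 1-bit sites (2 levels each): the area's shade count rises with sites.
for sites in (1, 2, 4):
    print(sites, distinct_area_tones(levels=2, sites=sites))  # 2, 3, 5
```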

     

    Now, one might say, "that's not how it actually works in digital images -- two or four adjacent pixels are not designed to render a single tone."  Well, the fact is that there are some sensors and monitors that use more pixels within a pixel group than those found within the typical Bayer pixel group or within a striped RGB pixel group.  Furthermore (and probably most importantly), image detail can feather off within one or two or three pixel groups, and such tiny transitions might be where higher tone/color depth is most utilized.

     

    By the way, I didn't come up with the idea that resolution is "half" of color depth.  It is a fact that I learned when I studied color depth in analog photography in school -- back when there was no such thing as bit depth in imaging.

     

    In addition, experts have more recently shown that higher resolutions give more color information (color depth), allowing for conversions from 4k, 4:2:0, 8-bit to Full HD, 4:4:4, 10-bit -- using the full, true 10-bit gamut of tones.  Here is Andrew Reid's article on the conversion and here is the corresponding EOSHD thread.

  8. 2 hours ago, kye said:

    Answer me this - if I take a camera and shoot it in 8-bit 800x600 and I shoot the sky and get banding, then I set the same camera to 8K 8-bit and shoot the same sky, why do I still get banding?

    Well, this scenario is somewhat problematic because one is using the same camera with the same sensor.  So, automatically there is a binning and/or line-skipping variable.

     

    However, barring such issues and given that all other variables are identical in both instances, it is very possible that the 8K camera will exhibit a banding/posterization artifact just like the SD camera.

     

    Nevertheless, the 8K camera will have a ton more color depth than the SD camera, and, likewise, the 8K camera will have a lot more color depth than a 10-bit, 800x600 camera that doesn't exhibit the banding artifact.

     

     

    2 hours ago, kye said:

    Your newspaper example is technically correct, but completely irrelevant to digital sensors as the method of rendering shades of colour is completely different.

    Of course, it is not practical to have 1-bit camera sensors (but it certainly is possible).

     

    Nonetheless, resolution and bit depth are equally weighted factors in regards to color depth in digital imaging, and, again, a 4k sensor has 4 times the color depth of an otherwise equivalent Full HD sensor.

  9. 37 minutes ago, kye said:

    Actually, the entire point of my post is that there are complexities to the equation that your statement does not acknowledge.  Sure, you added a "all other variables" disclaimer...

    I acknowledged your single "complexity" (bit rate), and even other variables, including compression and unnamed natural and "artificial" influences such as A/D conversion methods, resolution/codec conversion methods, post image processing effects, etc.

     

    By the way, greater bit rate doesn't always mean superior images, even with all other variables (including compression) being the same.  A file can have greater bit rate with a lot of the bandwidth unused and/or empty.

     

     

     

    49 minutes ago, kye said:

    ... but the point is that those "other variables" are actually the ones that matter and the resolution makes almost no difference at all, so much so that I'd say it's sufficient to render your statement so overly simplified that it is basically wrong,...

    One is entitled to one's opinion, but the fact is that resolution is integral to digital color depth.  Furthermore, resolution has equal weighting to bit depth when one considers a single color channel -- that is a fundamental fact of digital imaging.  Here is the formula:   COLOR DEPTH = RESOLUTION X BITDEPTH^n   (where "n" is the number of color channels and where all pixel groups can be discerned individually).
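    Taking that formula at face value (the numbers below are purely illustrative), the claimed 4x ratio between 4K and Full HD falls straight out of the arithmetic, since only the resolution term differs:

```python
def color_depth(width, height, levels_per_channel, channels=3):
    # COLOR DEPTH = RESOLUTION x BITDEPTH^n, reading "BITDEPTH" as
    # the number of levels per channel (256 for 8-bit).
    return (width * height) * levels_per_channel ** channels

depth_4k = color_depth(3840, 2160, 256)
depth_fhd = color_depth(1920, 1080, 256)

print(depth_4k / depth_fhd)  # 4.0 -- only the resolution term differs
```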

     

    Most don't realize it, but a 1-bit image can produce zillions of colors.  We witness this fact whenever we see images screen printed in a magazine, on a poster or on a billboard.  Almost all screen printed photos are 1-bit images made up of dots of ink.  The ink dot is either there or it is not there (showing the white base) -- there are no "in-between" shades.  To increase the color depth in such 1-bit images, one must increase the resolution by using a finer printing screen.
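    The halftone effect is easy to simulate.  A toy halftone cell (hypothetical sizes), showing that a finer screen approximates a given grey more closely using nothing but on/off dots:

```python
def render_halftone_cell(target, cell_size):
    """Approximate a grey level in [0, 1] with 1-bit on/off dots in a
    cell_size x cell_size cell; return the achieved ink coverage."""
    dots_total = cell_size * cell_size
    dots_on = round(target * dots_total)
    return dots_on / dots_total

# A finer "screen" gets closer to the target shade of 0.3:
for s in (1, 2, 4, 8):
    print(s, render_halftone_cell(0.3, s))
```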

     

    That resolution/color-depth relationship of screen printing also applies to digital imaging (and also to analog imaging), even if the image has greater bit depth.

     

     

    58 minutes ago, kye said:

    ... but regardless, it's misleading and distinctly unhelpful.

    Unless you're just trolling?

    I simply state fact, and the fact is that 4k has 4 times the color depth and 4 times the bit rate of full HD (all other variables being equal and barring compression, of course).

  10. 4 hours ago, kye said:

    It doesn't matter how many pixels there are, if you're projecting / viewing the total image on a screen the same size regardless of the resolution, then having a FHD mode with dramatically less bitrate than a 4K mode will just look inferior.
    ...

    If anyone is considering trying to get the best results from cheaper cameras that can shoot 4K then the strategy is probably better to shoot in 4K, even if you're putting that footage onto a 2K / FHD timeline.  Plus, with the computers and storage we have in recent years, most cameras that only shoot FHD probably have such a low bitrate that a more recent 4K image downscaled would be a better bet anyway.

    4K has 4 times the color depth (and 4 times the bit rate) of full HD, all other variables being equal and barring compression or any artificial effects.

  11. 7 hours ago, Andrew Reid said:

    Yeah but the treasure hunt, scavenging for cheap but fun stuff is all part of the fun isn't it?

    Finding a cheap, fun camera certainly can be part of the fun for those who can afford to buy one.  Another part of the fun is using inexpensive gear to shoot something compelling, which can be done with a camera that one already owns.

     

    Why exclude those who can't buy a camera, merely because they can't afford to experience one part of the fun?

  12. On 3/10/2021 at 12:05 AM, Tim Sewell said:

    Well yeah, that really shows that I'm wrong when I say that films reflect the social mores of the societies that make them. Doesn't trade on sexist or heteronormative tropes in the slightest!

    I wasn't suggesting anything about your points regarding social mores.  I was merely showing what is to my knowledge the only feature film prior to 1980 that primarily addresses issues of having a female US president.  It is not just a film that happens to have a female US president as a secondary character.

     

    Of course, the mores have changed dramatically since 1964, so much so that the ending (and title) of "Kisses For My President" would have to be different.

     

    On the other hand, I don't think that changing social mores or politics is at the heart of the mediocrity of our age.  Certainly, shoehorning diversity into content doesn't help, but there are larger reasons for the shallow, uninspired material that we encounter today.

  13. 7 hours ago, Andrew Reid said:

    - Must be new camera you don't already own and under $150 maximum (have fun searching)

    The general idea for this contest is great, but forcing folks to buy a camera might be a deal-breaker for some.  Perhaps it should be stipulated that the camera merely has to be "trending" on Ebay for no more than US$150.

  14. 42 minutes ago, kye said:

    Those Harrison and Harrison filters are fascinating - a couple even look like they have bubbles in the glass.  I'm not familiar with them,

    Although the company is gone, Harrison & Harrison was a dominant filter maker for cinema "back in the day."  They invented black dot diffusion, which is the basis of Black Pro Mist filters and of other derivative filter technology.

     

    47 minutes ago, kye said:

    but for $500 US I can buy half a P2K, so that's where my money would be better going!

    Well, the set of 5 filters that I linked was listed at US$200, but, as mentioned, H&H filters can sometimes be found individually.

     

    What is a P2K?  Definitely interested in that.

     

    56 minutes ago, kye said:

    I looked at the tests and found that the black levels typically get lifted almost the same amount from a 1/8

    Keep in mind that although the black levels can be lifted with diffusion filters, that doesn't mean that one will see more detail in the shadows.

     

    1 hour ago, kye said:

    I've seen DIY videos of people making their own with UV filters and black spray paint,

    To approximate the black dot effect, the black spray paint specks should be "embedded" within a diffusion layer (hair spray or something similar).

     

    1 hour ago, kye said:

    My concern with the BPM was that it would be too strong near the hot spot and too weak further away,

    Not sure what you seek here nor if any existing lens filters can yield such results.

     

    1 hour ago, kye said:

    I'm sure that I can make a diffusion filter, but I'm definitely not going to be able to control the distribution of that diffusion across the frame.

    On the contrary, if you DIY, you are in complete control of the distribution of the diffusion medium.  In the videos that I watched, it didn't seem too difficult.

     

    1 hour ago, tupp said:

    It always puts a smile on one's face when a YouTuber conducts a test with just a frontal light source, and the subject turns their head from left to right.

    1 hour ago, kye said:

    Why?

    I don't know, guess it's just me...

     

    1 hour ago, kye said:

    I frequently follow a subject and am panning and pan from almost flat-light to back-light and the sun in frame.

    If you have a good lens hood or matte box (or a solid French flag), the flare will be reduced when the Sun is out of frame.

     

    1 hour ago, kye said:

    It's all well and good if you're shooting something where you get 20 minutes to set up for 5s of footage, but some of us struggle to get 5s to setup before a 2 minute shot.  Sometimes I feel like the pros would recommend a C500, wide cine prime and Ninja recorder to a skydiver who wants to shoot POV video mounted to their helmet on the way down!

    It shouldn't take 20 minutes to "set-up" a lens hood.

     

    1 hour ago, tupp said:

    One can always add ambient fog in post.

    1 hour ago, kye said:

    True, but lifting the blacks in post also lifts the noise, which is great if you're trying to create an alien fog full of angry nano-bots who randomly self assemble in squares, but it's not an aesthetic I'm really looking for.

    I am not suggesting lifting the blacks.  To add ambient fog in post, one basically slaps a smooth white, slightly diffusing layer/track over the image, and then adjusts the opacity of that white layer/track as desired.  Doing so is very similar to an out-of-frame light source hitting a lens diffusion filter.
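    In other words (a toy sketch with made-up pixel values, not any particular NLE's node graph): the shadows are scaled down and offset toward white rather than gained up, so noise is compressed instead of amplified.

```python
def add_fog(pixels, opacity):
    """Blend a flat white layer over the image at the given opacity
    (normalised 0.0-1.0 values)."""
    return [p * (1 - opacity) + 1.0 * opacity for p in pixels]

row = [0.0, 0.5, 1.0]  # black, mid-grey, white
foggy = add_fog(row, 0.1)
print(foggy)  # blacks rise toward the fog level; pure white stays white
```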

     

    1 hour ago, kye said:

    Doing it physically raises the black levels optically, which then enables you to lower the exposure to put them back to black and get more highlights, increasing the DR, or for you to have higher black levels in the file, which gives you a softer look without having visible noise.

    If one wants the look of ambient flare on a lens diffusion filter, one can similarly lower the camera exposure and then use the post method stated directly above.  The results will closely simulate doing it all in-camera with the higher black levels and no extra noise, plus one will have more control over the level of "ambient flare."

  15. 1 hour ago, kye said:

    I searched and couldn't find Low Contrast ones for sale in the size I needed anywhere on earth.

    What size do you need?  Here is an 82mm Tiffen Low Contrast filter for sale.  If your lens is smaller, you could just use a step-up ring.

     

    By the way, there are plenty of YouTube videos on making DIY black-promist filters.  One can even make a smaller increment than 1/8.  To approximate the black dot process, one needs to apply the black spray paint before the hair spray (or other diffusion spray).

     

    Also, the Harrison & Harrison black dot originals can still be found for sale in sets or individually.

     

     

    2 hours ago, kye said:

    These two videos are interesting and useful:

    It always puts a smile on one's face when a YouTuber conducts a test with just a frontal light source, and the subject turns their head from left to right.

     

     

    2 hours ago, kye said:

    In the second one, the guy mentioned how he's a pro cinematographer and has the BPM 1/8 on his lens almost all the time.

    As he suggests, it's generally best to use a lenser (a flag that keeps an out-of-frame light source from hitting the lens/filter) or a hood/matte box.  One can always add ambient fog in post.

  16. 1 hour ago, EduPortas said:

    Every single movie I can think of that can be called art pushed normative boundaries in some way or another.

    Anyone can call almost anything "art."  Art mostly defies definition.

     

    Art doesn't have to push boundaries -- art can be something that is merely pretty.  It can also be something that is stimulating, funny or entertaining in some way.

     

    To me, the big problem with movies and television today is that there aren't a lot of good, original stories being generated.  Similarly, there just isn't a lot of inspired originality anymore in the other performing arts, such as music, dance and theatre.

     

    We find ourselves deep in the age of mediocrity.

     

    Some will put the blame on the conglomeration of entertainment companies along with the onset of digital technology.  Huge corporations (and talentless board members) making most of the big decisions in the arts has got to water things down.  Also, before digital, one had to be more deliberate, thoroughly flesh out ideas, and be extensively prepared, talented and/or experienced.  With digital, one can shoot things more "on the fly," without prep or originality and with minimal artistic ability and little know-how.

  17. On 2/20/2021 at 2:21 AM, odie said:

    If you’re in Europe filming on 16mm or 35mm here is an Athens lab we used for a series of commercials

    anmar.gr

    they include everything and they’re great!

    Good to know for anyone working in that area.  They have three scanners.  Thanks!

  18. 51 minutes ago, mkabi said:

    Have you tried the 6K Anamorphic (open gate) with (Cheap) Anamorphic lenses?

    See if that helps you on your quest of moving from the video look to the cinematic look. 

    Agreed.  A good lens choice should reduce the video look more readily than diffusion filters.

     

    Vintage lenses are ideal.  If you can't get Xtal Express, use a vintage spherical lens.

  19. 3 hours ago, kye said:

    Or accurate diffusion of any source that is clipped, or diffusion of any source that is out of frame...

    Yes, of course, but if one exposes properly and/or uses HDR features, then it might be possible to match "blown-out" areas in the frame.

     

    Additionally, lens diffusion scattering from "out-of-frame" sources is also influenced by lens hoods and matte boxes.

     

    In the 1970's, David Hamilton was the king of using lens diffusion while blowing-out highlights and light sources.  As I recall, black-dot lens diffusion didn't appear until the early 1980's, and Hamilton would push Ektachrome which increased contrast, countering the softness/flatness produced by the lens diffusion.  In addition, pushing gave coarser grain, which worked well for Hamilton's soft aesthetic.

  20. On 2/21/2021 at 3:51 AM, androidlad said:

    Other than polarisation filter, all filter effects can be precisely emulated digitally in post. This allows for much greater and finessed control and guarantees glare-free.

    Certainly there are many diffusion effects that can be emulated accurately in post.  Furthermore, there are also diffusion effects that are exclusive to post which can't be done with optical filters.

     

    However, there are some optical filters which can't be duplicated digitally, such as IR cut/pass filters, UV/ haze filters, split-diopters, pre-flashing filters, etc.

  21. 40 minutes ago, BenEricson said:

    The BMP4K weighs less than the mag on an SR3! A 105mm with no shoulder rig and multiple points of contact won’t end well. 

    The balance, weight and ergonomics of the rig are huge factors to how smooth the shot looks.

    Well, the 16S, the Bolex, the Krasnogorsk, etc. all had their eyepieces at the rear of the camera, so they weren't shoulder mounted.  There were a few tricks that one could practice to keep them stable.  There were also other brackets (such as belt pole rigs) that could help.  Of course,  weight could always be added for more stability.

     

    I am with you on shoulder rigs.  A balanced shoulder rig is always fairly stable regardless of weight.

  22. 17 hours ago, SteveV4D said:

    I was planning to buy a Pocket 6K to complement my Pocket 4K for my work.

    Your P4K should closely match your P6K if you use a speedbooster with your EF lenses on your P4K.  As you are likely aware, a speedbooster (or focal reducer) is just an adapter with optics that condense the image circle and character of a lens to a smaller size.  Most M4/3 speedboosters will yield a Super35/APS-C frame and look, plus give an extra stop of exposure to boot.
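    Rough arithmetic behind that claim (the 0.71x ratio is an assumption based on common reducers -- check your specific model's spec):

```python
import math

M43_CROP = 2.0   # approximate Micro Four Thirds crop factor
REDUCER = 0.71   # common speedbooster ratio -- an assumption; check yours

effective_crop = M43_CROP * REDUCER          # lands near Super35's ~1.5
stops_gained = math.log2(1 / REDUCER ** 2)   # light concentrated ~2x

print(round(effective_crop, 2), round(stops_gained, 2))
```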

     

    Here is a video comparing a Metabones speedbooster with a recent Viltrox focal reducer on the P4K, cued to the section comparing autofocus speed in lower light.  To me, the Viltrox is good and the Metabones is better.  Neither seems to have any prohibitive problem with their electronics.

     

    Was the AF performance of your adapters as good as these speedboosters?

     

     
