tupp

Everything posted by tupp

  1. Fresnels don't focus all of the output in one place -- such fixtures can only focus the light that hits the lens. The light that hits the inside of the housing is wasted. In this sense, many open-faced fixtures are more efficient than Fresnel fixtures as almost all of the light from an open-faced fixture comes out the front of the unit. No. The Fresnel will be dimmer than using the exposed LED with a reflector. By the way, softboxes are generally a lot more efficient and controllable than naked diffusers. Naked diffusers also generate a lot of spill light. Folks, Fresnel lights on set generally have a continuous "focus" range of beam angles from "spot" to "flood." The range of those beam angles varies with each fixture. I can't recall all the times I've seen someone illuminate a diffuser with a Fresnel light, but almost always they focused the light to "full flood" to completely illuminate the diffuser. Fresnels are used all the time on film sets. I use Fresnels and open-faced focusable fixtures directly on people and sets. Keep in mind that the "spotlight" effect from a naked Fresnel will usually give a soft edge to the spot. If one wants a harder edge on a spot, use a snoot and focus the light to "full flood" (or use a good ellipsoidal/followspot or projection fixture that has minimal fringing). A lot depends on what you are trying to do. I could use that panel fixture naked in a lot of shoots. Softness in lighting is a matter of degree between a point source and completely surrounding your subject with a smooth light source. There is no definitive "soft light" and "hard light." By the way, you can use a panel light in a soft box. A lot depends on the optics in front of the LED, but, again, open-faced fixtures are almost always more efficient than fixtures using Fresnel/plano-convex lenses.
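The efficiency and throw points above can be put into rough numbers. This is a minimal sketch with made-up figures -- the 10,000 lm source, the 90% vs 60% optical efficiencies, and the idealized even-cone beam are all assumptions for illustration, not measurements of any real fixture:

```python
import math

def center_lux(fixture_lumens, beam_angle_deg, distance_m, optical_efficiency):
    """Rough center-beam illuminance for an idealized fixture.

    Assumes all transmitted light spreads evenly over a cone of the
    given beam angle (real fixtures fall off toward the beam edges).
    optical_efficiency = fraction of the source's lumens that actually
    leaves the front of the unit.
    """
    half_angle = math.radians(beam_angle_deg / 2)
    solid_angle = 2 * math.pi * (1 - math.cos(half_angle))  # steradians
    intensity = fixture_lumens * optical_efficiency / solid_angle  # candela
    return intensity / distance_m ** 2  # inverse-square law

# Hypothetical numbers: the same 10,000 lm LED source at 3 m with a 30-degree
# beam, open-face (90% of the light out the front) vs lensed (60%).
print(round(center_lux(10_000, 30, 3.0, 0.9)), "lux open-face")
print(round(center_lux(10_000, 30, 3.0, 0.6)), "lux lensed")
```

With everything else equal, the fixture that wastes less light inside the housing simply delivers proportionally more lux on the subject.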
  2. The stills look great!
  3. Keep in mind that although signal-to-noise ratio and dynamic range are similar properties, SNR will always be smaller than DR. Both the upper limits and noise thresholds differ in the way that they are obtained, especially in regards to capturing photons with image sensors. This article breaks down the differences. A downscale with binning usually reduces noise, which increases the effective DR/SNR. Perhaps the in-camera downscale is not binning properly, or perhaps it might be too difficult to do in the camera because of the sensor's complex filter array.
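The binning effect is easy to simulate: averaging each 2x2 block of photosites cuts uncorrelated read noise roughly in half, which raises SNR at the dark end. A minimal sketch, assuming a flat signal with Gaussian read noise (the signal level and noise sigma are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor read: flat gray signal plus Gaussian read noise.
signal = 100.0
noisy = signal + rng.normal(0.0, 4.0, size=(1024, 1024))

# 2x2 binning: average each 2x2 block, halving resolution in each dimension.
binned = noisy.reshape(512, 2, 512, 2).mean(axis=(1, 3))

# Averaging 4 uncorrelated samples cuts the noise sigma by ~2x,
# i.e. roughly one extra stop of usable range at the low end.
print(noisy.std(), binned.std())
```

If an in-camera downscale instead line-skips or resamples without true binning, this noise reduction largely doesn't happen, which would fit the behavior described above.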
  4. I have a fair bit of experience creating patterns and slashes on backgrounds, and unless the desired pattern is complex, you might be better off just cutting what you need out of foamcor and just casting a shadow. Slashes are easy to make with zero spill using bardoors and/or blackwrap. With foamcor patterns the spill is not that big of a problem, if you are careful with your doors/snoot and if you leave a big border around the pattern. If you really need to go with a projection fixture, consider a used Source 4 ellipsoidal. They have a nice punch and go for as low as US$175 on Ebay (sometimes with the pattern holder included). In addition, they take standard theatrical patterns -- there are zillions of them. If you anticipate working mostly in close quarters, get the 36-degree lens (or the 50-degree lens).
  5. Your video is easily better than 95% of the work that is out there -- great eye and a nicely coordinated edit. The narrator did a great job, as well. Did someone give her line readings? I noticed that the interior shot had slight noise, but I was pixel peeping. I would guess that you had to stop down due to the high scene contrast. It certainly was not enough noise to warrant using an unwieldy cinema camera, although it might have been interesting to try a minor adjustment to the E-M10's "Highlights & Shadows" setting. Your video is a superb and inspiring piece that brings life to a mundane subject.
  6. The red line didn't help, but a black Sharpie could quickly fix that.
  7. Film grain is "organized" to actually form/create the image, while noise is random (except for FPN) and obscures the image. Noise doesn't "destroy" resolution -- it just obscures pixels. On the other hand, with too much noise the image forming pixels are not visible, so there's no discernible image and, thus, no resolution. Grain actually forms the image on film. Noise obscures the image. Yep. That notion is subjective, but not uncommon. Actual electronic noise is random, whether digital or analog. It could be argued that FPN and extraneous signal interference are not "noise." This is another subjective but common notion. Again, with the medium of film, grain is organized to actually form the image -- noise is random and noise obscures the image. To me, it doesn't make much sense to remove random noise and then add an overlay of random grain -- grain that does absolutely nothing to form the image.
  8. Yes. Fringing can sometimes be an issue with some ellipsoidal fixtures. Instruments that use lenses made for slide/film projectors usually exhibit minimal fringing. Use an open-face tungsten or HMI source with a snoot. Add a tube of blackwrap if the snoot is too short. That should eliminate most of your spill, and the open-face filament/arc will give you sharp shadows if the fixture is set to full flood. Both open-face and Fresnel fixtures always give their sharpest shadows on full flood.
  9. I know what it is like to suffer attacks and abuse for merely telling the truth and stating fact. All I can say is:
  10. Okay. If you can keep your set-up and shoot time down to an hour total and if you are only using two or three fixtures, then it is probably better to use batteries. Bring spares. LED fixtures might need to be fairly close to the subject when shooting in the daytime.
  11. You have plenty of work light? Are you shooting nighttime exteriors 1000km from the nearest town? What is "IV/PTC?"
  12. If your LED lights actually draw a total of 600W, you might suffer continual battery management/charge anxiety. Also, when you kill your battery-powered set lights in between takes and in between set-ups, will you also have a separate battery-powered work-light running? On the other hand, a genny with a 600W constant capacity is not that big (but it's good practice to use a genny that is rated at twice the anticipated power draw). You can keep it and a gas can in a large plastic bin(s) to protect your car's interior from gasoline/oil. By the way, gennys last longer than batteries because their tanks have much larger power capacities than typical batteries -- not because they can be "topped-up." A battery system can be "topped-up" with a parallel, rectified circuit and/or switches. If you are recording sound, make sure to have at least 150 feet of 12 gauge stingers (extension cords) just to run power from the genny to the set, and hide the genny behind a distant building or thick bushes. You can also build a sound shield with stands, heavy vinyl and a furniture pad (or a thick blanket). If you can arrange in advance to run power from a nearby building, that might be an even better solution.
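For the stinger run described above, a quick voltage-drop check confirms that 12 gauge is comfortable at this load. A sketch, assuming roughly 1.588 ohms per 1000 ft for 12 AWG copper (an approximate handbook figure) and a 120 V supply:

```python
# Sanity check on running 600 W over 150 ft of 12 AWG stingers.
# 12 AWG copper is roughly 1.588 ohms per 1000 ft (approximate figure).
OHMS_PER_1000FT_12AWG = 1.588

def voltage_drop(watts, volts, run_ft, ohms_per_1000ft):
    amps = watts / volts
    loop_ohms = ohms_per_1000ft * (2 * run_ft) / 1000  # out and back
    return amps * loop_ohms

drop = voltage_drop(600, 120, 150, OHMS_PER_1000FT_12AWG)
print(f"{drop:.1f} V drop ({drop / 120:.1%})")  # well under the common ~5% rule of thumb
```

At 600 W the draw is only 5 A, so the drop over 150 ft is around 2% -- the 12 gauge recommendation leaves plenty of headroom for a larger package later.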
  13. There might be a way to attach a speedbooster. This is a really interesting thread!
  14. Yes! The EOSM with ML raw is amazing, and @ZEEK has a great eye! EOSM ML raw videos are constantly appearing on YouTube. This guy also does nice work with the EOSM.
  15. Perhaps it was interlaced (60i), and then deinterlaced to 30p. There are probably plug-ins/filters in those programs that might work, but I am not familiar with those apps.
  16. Look again. I am not trolling. I have presented facts, often with detailed explanations and with supporting images and links. These facts show that Yedlin's test is faulty, and that we cannot draw any sound conclusions from his comparison. In addition, I have pointed out contradictions and/or falsehoods (or lies) of others, also using facts/links. If my posting methods look like trolling to you, please explain how it is so. It doesn't sound like you are actually interested in the topic of this thread. No need to apologize for your shallowness and frivolity. Those are two real funny (and original) jokes. Another real funny joke. I am not sure that I believe you. So, you don't really have anything to add to the topic of the discussion with this solitary post late in the thread. You have made similar posts near the end of another extended thread, posts which likewise had no relation to that discussion. What is the purpose of these late, irrelevant posts? I don't wonder about that. Please... If you have something worthwhile to offer to the discussion, I'd like to hear it, but don't talk down to someone who is actually contributing facts, explanations and supporting evidence that relate directly to the topic of this thread.
  17. Ffmpeg has a "pullup" filter, but removing pulldowns can be done in ffmpeg without that special filter. Mencoder and AviSynth can also remove pulldowns. Several NLEs and post programs have plugins that do the same. However, the pulled-down 30fps footage is usually interlaced. Can you post a few seconds of the 30fps footage?
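For intuition, here is a toy model of what pulldown removal does: 2:3 pulldown spreads 4 film frames across 10 fields (5 interlaced video frames), and inverse telecine drops the duplicate fields and re-pairs the rest. The sketch below uses letters in place of actual pixel fields; real filters such as ffmpeg's pullup (or fieldmatch plus decimate) do the equivalent matching on image content:

```python
# Toy model of 2:3 pulldown and its removal, using letters for film
# frames and (top, bottom) tuples for interlaced video frames.

def pulldown_23(film):
    """Telecine: 4 film frames -> 5 video frames (10 fields, 2-3-2-3 cadence)."""
    fields = []
    for i, frame in enumerate(film):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return list(zip(fields[0::2], fields[1::2]))

def inverse_telecine(video):
    """Drop duplicate fields and re-pair the rest to recover the film frames."""
    seen, film = set(), []
    for top, bottom in video:
        for f in (top, bottom):
            if f not in seen:
                seen.add(f)
                film.append(f)
    return film

video = pulldown_23(["A", "B", "C", "D"])
print(video)                    # mixed frames like ('B', 'C') are the "combed" ones
print(inverse_telecine(video))  # back to ['A', 'B', 'C', 'D']
```

The mixed-field frames are why pulled-down 30fps footage looks interlaced on motion, and why a plain deinterlace is the wrong fix when a proper pulldown removal is available.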
  18. What were you hoping to achieve with your personal insults of me below: I didn't mind any of these blatant insults nor the numerous sarcastic insinuations that you have made about me (I have made one or two sarcastic innuendos about you -- but not as many as you have about me). I don't mind it that you constantly contradict yourself and project those contradictions on me, nor do I mind when you inadvertently disprove your own points, nor do I care when you just make stuff up to suit your position. However, when you lied about me making a fictitious claim comparing myself to Yedlin, you went too far. Here is your lie: I never made any such claim. You need to take back that lie. Classic projection... You should ask yourself the same question -- how are your personal insults (listed above), contradictions and falsehoods helping anyone? I have given many reasons in great detail on why Yedlin's test is not valid, and I even linked two straightforward resolution demos that disagree with the results of Yedlin's more convoluted test. Additionally, you unwittingly presented results from a thorough empirical study that directly contradict the results of Yedlin's test. Those results showed a "pretty obvious" (your own words) distinguishability between two resolutions that differ by 2x, while Yedlin's results show no distinguishability between two resolutions with a more disparate resolution difference of 3x -- and the study that you presented was also made with an Alexa! I cannot conceive of any additional argument against the validity of Yedlin's comparison that is more convincing than those obviously damning results of the empirical study that you yourself inadvertently presented in this thread. Indeed, it's a question you should ask of yourself.
  19. I thought that you were just being trollish, but now it seems that you are truly delusional. Somehow in your mind you get the notion that I am "criticizing Yedlin for using a 6K camera on a 4K timeline" from this passage: Nowhere in that passage do I mention Yedlin, nor do I mention a camera, nor do I ever refer to anyone "using a 6K image on a 4K timeline." Most importantly, I was not criticizing anyone in that passage. Anybody can go to that post and see for themselves that I was simply making a direct response to your quoted statement: Even YOU did not refer to Yedlin, nor to a camera nor to using 6K on a 4K timeline. Making up things in your mind is harmless, but posting lies about someone is too much. You need to take back your lie that I claimed that my intellect was elevated in comparison to Yedlin's. Also, if you are on meds, keep taking them regularly.
  20. No I didn't. What is the matter with you -- why do you always make up false realities? Also, I already corrected you when you stated this very same falsehood before. I never criticized Yedlin for using ANY particular camera -- YOU are the one who is particular about cameras. The fact is that I have repeatedly stated that the particular camera used in a resolution test doesn't really matter: Once again, here are the two primary points on which I criticized Yedlin's test (please read these two points carefully and try to retain them so that I don't have to repeat them again): The particular camera used for a resolution test doesn't matter! Actually, I linked two tests. I am not familiar with the camera in the other test. It is truly saddening to witness your continued desperate attempts to twist my statements in an attempt to create a contradiction in my points. I have repeatedly stated that the camera doesn't matter as long as its effective resolution is high enough and its image is sharp enough. So, camera interpolation is irrelevant, as I have also already specifically explained. The "interpolation" that Yedlin's test suffered was pixel blending that happened accidentally in post (as I have stated repeatedly) -- it had nothing to do with sensor interpolation. Yes. *Both* tests that I linked are not perfect. The first tester has a confirmation bias, and he doesn't give a lot of details on his settings, and the second comparison just doesn't give a lot of settings details, and the chosen subject is not that great. Nevertheless, both comparisons are implemented in a much cleaner and more straightforward manner than Yedlin's video, and both tests clearly show a discernible distinction between resolutions having only a 2x difference. Unless the testers somehow skewed the lower resolution images to look softer than normal, that clear resolution difference cannot be reconciled with the lack of discernibility between 6K and 2K in Yedlin's comparison.
Again, it's sad that you have to grasp at straws by using insults, instead of reasonably arguing the points. Your declaration regarding Yedlin's demo doesn't change the fact that it is not valid. So far, the most thorough comparison presented in this thread is the Alexa test you linked that shows a "pretty obvious" (your words) distinguishability between resolutions having a mere 2x difference. The test that you linked is the most empirical, because it: This: I never made such a claim, and you've crossed the line with your falsehoods here. Unless you can find and link any post of mine in which I claimed that my intellect was elevated in comparison to Yedlin's, you are a liar.
  21. As I answered that question before, I have already demonstrated that it is easy to achieve a 1-to-1 pixel match. However, I will add that there is no sense/logic to your notion that I should perform such a test myself, just because I have shown that Yedlin's comparison is invalid. I have demonstrated that we can draw no conclusions regarding the discernibility of differing resolutions from Yedlin's flawed demo, and that fact is all that matters to this discussion. In addition, there are several comparisons which already exist that don't suffer the same blunders inherent in Yedlin's test. So, why would I need to bother making yet another one? The execution in these demos is not perfect, but they are implemented in a much cleaner and more straightforward manner than Yedlin's video. Here is one resolution comparison shot with a GH5s at 10bit and evidently captured in both HD and UHD. Does a GH5s use the correct type of sensor and is it also common/uncommon enough and does it additionally have enough quality -- to meet your approval for a resolution test? The tester emphasizes the 200% zoom, but at 100% zoom I can see the resolution difference between UHD and HD on my HD monitor. However, this test should really be viewed on a 4K monitor at minimum viewing distance, giving priority to the 100% zoom. This comparison is more straightforward than Yedlin's test. There is no upscaling/downscaling, and the images are not screen-captures of a software viewer, and there is no evidence of accidental pixel blending. However, it should be noted that the tester performs the comparison with a confirmation bias. Here is a similar resolution comparison showing faster moving subjects. Likewise, I can see a difference in resolution on my HD monitor with 100% zoom, but the video should really be viewed on a 4K monitor at minimum viewing distance. Now, I do not claim that higher resolutions are better than lower resolutions.
I simply state the fact that I can discern a difference in resolution between HD and 4K/UHD at a 100% zoom, when viewing these two comparisons on my HD monitor. One more thing... if the Nuke viewer behaves the same as the Natron viewer, I suspect that peculiarities in the way the viewer renders pixels contributed significantly to Yedlin's "6K = 2K" results. Combining this potential discrepancy generated by the Nuke viewer with Yedlin's upscaling/downscaling and with the accidental pixel blending, it is easy to imagine how a "6K" image would look almost exactly like a "2K" image, when viewed on a "4K" timeline.
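The pixel-blending concern can be illustrated in a few lines. This is my own toy example, not a reproduction of anyone's actual pipeline: an aligned nearest-neighbor round trip preserves every pixel, while a one-pixel misalignment followed by a box-filter downscale blends neighbors and, in the worst case, erases the finest checkerboard detail entirely:

```python
import numpy as np

# 1-bit checkerboard standing in for the finest detail a frame can hold.
src = np.indices((8, 8)).sum(axis=0) % 2

# Nearest-neighbor 2x upscale, then 2x decimation: a true 1-to-1 round trip.
up = src.repeat(2, axis=0).repeat(2, axis=1)
back_nearest = up[::2, ::2]
print(np.array_equal(back_nearest, src))  # True: every pixel survives

# The same upscale, but sampled one pixel off and box-averaged back down,
# as can happen in a viewer/crop that is not pixel-aligned.
shifted = up[1:-1, 1:-1]
back_blend = shifted.reshape(7, 2, 7, 2).mean(axis=(1, 3))
print(np.unique(back_blend))  # [0.5] -- the checkerboard collapses to flat gray
```

In other words, the pixel-level detail a resolution comparison is supposed to reveal is exactly what a non-1-to-1 resample destroys first.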
  22. Yedlin's test isn't really applicable to any "world," because his method is flawed, and because he botched the required 1-to-1 pixel match. Again, the type of camera/sensor doesn't really matter to a resolution test, as long as the camera has enough resolution for the test. Resolution tests should be (and always are) camera agnostic -- there is no reason for them not to be so. What does that notion have to do with testing resolution? There's not much to a resolution test other than using a camera with a high enough resolution and properly controlling the variables. There are no "special" resolution testing criteria for "world class" filmmakers that would not also apply to those shooting home movies -- nor vice versa. Furthermore, I don't recall Yedlin declaring any such special criteria in his video, but I do recall him talking about standard viewing angles/distances in home living rooms. By the way, if you think that there are special criteria in resolution testing for "world class" filmmakers, please list them. The fact is that Yedlin's resolution test is flawed. Not sure how to take this. By "more productive discussions," I hope that you mean "additional productive discussions" and not "discussions that are more productive." Your example involves color channel resolution resulting from a crude Bayer sensor interpolation, which has nothing to do with chroma subsampling. Chroma subsampling occurs after sensor interpolation (if a sensor needs interpolation). In addition, chroma subsampling also applies to imaging generated by other means -- not just camera images. By posting those resolution charts, along with stating, "Pretty obvious that the red has significantly less resolution than the green," you have unwittingly demonstrated that there is a discernible distinction between different resolutions and/or that there is a large problem with Yedlin's test.
Of course, we all know that the green photosites in a Bayer matrix have twice (2x) the resolution of the red photosites. So, you are actually admitting that there is obvious discernibility between two resolutions that differ by 2x -- in a test shot with an Alexa camera. Furthermore, Yedlin's Alexa65 test shows no discernible difference between two even more disparate resolutions -- 6K:2K is a 3x difference. So, how is it that we see an *obvious* distinction between resolutions that differ by only 2x in a well controlled, empirical study, while we see no distinction between resolutions that differ by 3x in Yedlin's uncontrolled test? It certainly appears that something is wrong with Yedlin's comparison. As I have suggested, the problem lies with his convoluted, nonsensical methods and in his failure to achieve a 1-to-1 pixel match. Again, contrary to your relatively recent stance, it doesn't matter to a resolution test whether or not the test camera uses chroma subsampling or uses a Bayer sensor or is common or uncommon with a certain level of quality. All of your constantly changing conditions on what particular camera can work in resolution tests are irrelevant. Resolution tests should be -- and always are -- camera agnostic, as long as the camera has enough resolution and sharpness for the resolution test. Earlier in the thread, you said yourself: So, you are saying that ANY raw camera can be used to duplicate the zoomed-in moments in Yedlin's test. In the very same post, you also stated: So, you additionally say that we can use ANY 4K camera that can shoot raw video (or even a still camera) to duplicate Yedlin's entire test. The question is: which stance of yours is true? For resolution tests, can we use ANY raw 4k camera or are we now required to use the particular common/uncommon camera with the particular sensor or quality that happens to suit your inclination at the moment?
Of course, even if there is a prominent difference in the resolution between green and red/blue photosites it doesn't matter -- as long as the resolution of the red/blue photosites is high enough for the resolution test. Furthermore, the study that you linked showed the results of a better debayering algorithm in which the chart clearly shows the red channel having the same resolution as the green channel. So, the difference in resolution between green and red/blue photosites doesn't really matter to a resolution test -- as long as the camera's effective resolution (after debayering) is high enough and as long as the image is sharp enough. Additionally, there is a huge difference in execution between your linked empirical camera study and Yedlin's sloppy comparison. Unlike Yedlin's demo, the empirical study: is conceived and performed with no confirmation bias; is made in accordance with empirical guidelines set by TECH 3335 and EBU R 118; directly shows us the actual results, with no screenshots of results displayed within a compositor viewer; doesn't upscale/downscale test images to other resolutions; uses a precision resolution test chart that clearly shows what is occurring. Can you imagine how muddled those fine resolution charts would look if the tester had accidentally blurred the spatial resolution as Yedlin did?
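The 2x green-vs-red sampling figure quoted above follows directly from the RGGB tile layout, and can be checked by counting photosites in a small mosaic:

```python
import numpy as np

# Build the RGGB mosaic pattern for a tiny hypothetical sensor.
h, w = 4, 6
rows, cols = np.indices((h, w))
mosaic = np.where((rows % 2 == 0) & (cols % 2 == 0), "R",
         np.where((rows % 2 == 1) & (cols % 2 == 1), "B", "G"))

counts = {c: int((mosaic == c).sum()) for c in "RGB"}
print(counts)  # green photosites outnumber red (and blue) 2-to-1
```

That 2:1 sampling ratio is what debayering interpolates over, which is why a good algorithm can render the red channel at effectively the same chart resolution as the green.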
  23. Well, you seem to live in a world in which you make up your own "realities" about what I am saying and that's okay, but posting insinuations based on those fantasies is a whole other thing. Regardless, such fantasies have no relation to the fatal problems with Yedlin's test. However, just to make it clear, I never suggested that chroma subsampling should be avoided in resolution tests. In fact, I implied the opposite by pointing out that 4:2:0 cameras are more common than the Alexa65, and thus, according to your logic, we shouldn't use an exceedingly uncommon camera such as the Alexa65: Of course, chroma subsampling essentially is a reduction in color resolution (and, hence, color depth), but it doesn't really affect a resolution test, as long as the resulting images have enough resolution for the given resolution comparison. Again, none of this discussion on cameras/sensors has any bearing on the fatal problems with Yedlin's test. Classic projection and irony. Slipshod tests like Yedlin's are for those who need confirmation of their bias.
  24. What I get is that there are some hare-brained notions floating around this thread on what cameras can and cannot be used in a resolution test, notions which have absolutely no bearing on the fatal flaws suffered in Yedlin's test. No. I rejected the test primarily because: And again, the test doesn't "involve" interpolation: Incidentally, Yedlin's test is titled "Camera Resolutions," because it is intended to test resolution -- not post "interpolation." A 1-to-1 pixel match is required for a resolution test, and Yedlin spent over four minutes explaining the importance of a 1-to-1 pixel match at the very beginning of his video. In addition, you even explained and defended Yedlin's supposed 1-to-1 match... that is, until I showed that Yedlin failed to achieve it. Yedlin's haphazard test accidentally blurred the "spatial" resolution of the images, which makes it impossible to discern any possible difference between resolutions (Yedlin's downscaling/upscaling convolutions notwithstanding). What is it that you do not understand about this simple, basic fact? Not that this matters to the fact that Yedlin's test is fatally flawed, but you seem stuck on the notion that the results of using a camera with an interpolated sensor vs. using a camera with a non-interpolated sensor will somehow differ resolution-wise -- even if both cameras possess the very same effective resolution. However, the reality is that the particular camera that is used is irrelevant, as long as the captured images meet the resolution requirements for the comparison. If a non-interpolated camera sensor has the same effective resolution as a sensor that is interpolated, then both cameras with both such sensors are equally qualified to shoot images for the same resolution test. No, I didn't. You are making that up. In addition, how do you make sense of your notion that a 6K camera necessarily involves interpolation while a 4K camera doesn't involve interpolation?
Well, then who cares about Yedlin's test that used an exceedingly uncommon camera? You said: No, it doesn't. The most common digital video cameras shoot with a 4:2:0 color (chroma) subsample. However, it is unlikely that anyone who would go to the trouble and expense to rent an Alexa65 package (along with all of the effort to achieve an appropriate high-end production value) would do any kind of subsampling at all. So, no -- the Alexa65 doesn't "share the same chroma subsampling properties of most cameras." The actual number on how many projects are down-sampled is likely unknown, but, in regards to down-sampling, again: So, Yedlin's results are further misleading in the sense that his "2K" image down-sampled from 6K is not the same as the more common down-sample of 4K to 2K, nor is it the same as 2K shot with a 2K camera. Using your logic from earlier in the thread, the results from an uncommon, high-end Alexa65 cannot possibly be applicable to those from the cameras that most of us use, because our cameras are lower quality. On the other hand, the Ursa 12K -- with a non-Bayer sensor -- is also a high-quality imaging device with twice the resolution of the Alexa65. Are you saying that the Ursa 12K -- with a non-Bayer sensor -- isn't good enough for a resolution test? Regardless, such notions of what cameras can and cannot be used in a resolution test have absolutely no bearing on the fatal flaws suffered in Yedlin's test. Really? Well, it seems like the red herring is actually your hypothetical test with a Foveon sensor that "does not share the same colour subsampling properties of most cameras," because the title of Yedlin's video is "Camera Resolutions," which indicates that his test compares differences in "perceptible" resolution -- not differences in color nor color "subsampling."
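To make the 4:2:0 point concrete: the scheme keeps luma at full resolution and stores chroma at half resolution in each dimension, i.e. one chroma sample per 2x2 block. A minimal sketch with random planes (box-averaging stands in for whatever filter a real encoder actually uses):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 8x8 frame in a luma/chroma space (Y plus one chroma plane).
luma = rng.integers(0, 256, (8, 8)).astype(float)
chroma = rng.integers(0, 256, (8, 8)).astype(float)

# 4:2:0: luma kept at full resolution, chroma reduced to one sample
# per 2x2 block -- half the color resolution in each dimension.
chroma_420 = chroma.reshape(4, 2, 4, 2).mean(axis=(1, 3))

print(luma.shape, chroma_420.shape)    # (8, 8) (4, 4)
print(chroma.size // chroma_420.size)  # 4x fewer chroma samples
```

This is why chroma subsampling reduces color resolution while leaving the luma detail, the part a resolution chart measures, untouched.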
  25. If we follow your reasoning, then who cares about Yedlin's comparison? He used an exceedingly uncommon camera for his test. You have to rent an Alexa65, and doing so is extremely expensive. Furthermore, on my entire continent there are only five locations where the Alexa65 is available for rent. I am not sure what cameras in your mind are uncommon, but I would bet that there are more Foveon cameras in use than the Alexa65. Likewise, with X-Trans cameras, scanning back cameras and Ursa 12K's. You said yourself, "I'm not watching that much TV shot with a medium format camera," and the Alexa65 is a medium format camera! So, why should anyone care about Yedlin's test when he used such an uncommon camera? You seem stuck on the notion that the results of using an uncommon camera vs. using a common camera will somehow differ resolution-wise -- even if both cameras possess the very same effective resolution. However, the reality is that the particular camera that is used is irrelevant, as long as the captured images meet the resolution requirements for the comparison. If an uncommon camera has the same effective resolution as a common camera, then both cameras are equally qualified to shoot images for the same resolution test. It's a simple concept that's easy to grasp. Well, please explain how your notion of an SD resolution comparison is applicable to our discussion on a comparison of higher resolutions.