
Everything posted by tupp

  1. So what? Again, does that impress you? Furthermore, you suggested above that Yedlin's test applies to 99.9999% of content -- do you think that 99.9999% of content is shot with an Alexa65? Well, you left out a few steps of upscale/downscale acrobatics that negate the comparison as a resolution test. Most importantly, you forgot to mention that he emphatically stated that he had achieved a 1-to-1 pixel match on anyone's HD screen, but that he actually failed to do so, thus invalidating all of his resolution demos. You are obviously an exper
  2. Yedlin's resolution comparison doesn't apply to anything, because it is not valid. He is not even testing resolution. Perhaps you should try to convince Yedlin that such demonstrations don't require 1-to-1 pixel matches, because you think that 99.9999% of content has scaling and compression, and, somehow, that validates resolution tests with blended pixels and other wild, uncontrolled variables.
  3. Yes, and I have repeated those same points many times prior in this discussion. I was the one who linked the section in Yedlin's video that mentions viewing distances and viewing angles, and I repeatedly noted that he dismissed wider viewing angles and larger screens. How do you figure that I missed Yedlin's point in that regard? Not sure what you mean here nor why anyone would ever need to test "actual" resolution. The "actual" resolution is automatically the "actual" resolution, so there is no need to test it to determine if it is "actual". Regardless
  4. Keep in mind that resolution is important to color depth. When we chroma subsample to 4:2:0 (as is likely with your A7SII example), we throw away chroma resolution and thus reduce color depth. Of course, compression also kills a lot of the image quality. Yedlin also used the term "resolute" in his video. I am not sure that it means what you and Yedlin think it means. It is impossible for you (the viewer of Yedlin's video) to see 1:1 pixels (as I will demonstrate), and it is very possible that Yedlin is not viewing the pixels 1:1 in his viewer.
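To put rough numbers on the chroma loss mentioned above, here is a quick Python sketch (my own framing and function name, not from Yedlin's video -- the subsampling ratios themselves are standard):

```python
# Counting samples discarded by chroma subsampling for a UHD frame.
# 4:2:0 keeps only one chroma sample per 2x2 block of luma samples.

def sample_counts(width, height, scheme):
    """Return (luma_samples, chroma_samples_per_channel) for one frame."""
    luma = width * height
    if scheme == "4:4:4":
        chroma = luma          # full chroma resolution
    elif scheme == "4:2:2":
        chroma = luma // 2     # half horizontal chroma resolution
    elif scheme == "4:2:0":
        chroma = luma // 4     # half horizontal and half vertical
    else:
        raise ValueError(scheme)
    return luma, chroma

full = sample_counts(3840, 2160, "4:4:4")
sub = sample_counts(3840, 2160, "4:2:0")
print(full)  # (8294400, 8294400)
print(sub)   # (8294400, 2073600) -- chroma resolution down to one quarter
```

So three quarters of the chroma information is gone before compression even starts.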
  5. Oh, that is such a profound story. I am sorry to hear that you lost your respect for Oprah. Certainly, there are some lengthy videos that cannot be criticized after merely knowing the premise, such as this 3-hour video that proves that the Earth is flat. It doesn't make any sense at the outset and it is rambling, but you have to watch the entire 2 hours and 55 minutes, because (as you described the Yedlin video in an earlier post) "the logic of it builds over the course of the video." Let me know what you think after you have watched the entire flat Earth video. Now,
  6. The video's setup is flawed, and there are no parts in the video which explain how that setup could possibly work to show differences in actual resolution. If you disagree and if you think that you "understand" the video more than I, then you should have no trouble countering the three specific flaws of the video that I numbered above. However, I doubt that you actually understand the video or the topic, as you can't even link to the points in the video that might explain how it works. I see. So, you have actually no clue about the topic, and you are just t
  7. Good for you! Nope. The three points that I made prove that Yedlin's comparisons are invalid in regards to comparing the discernability of different resolutions. If you can explain exactly how he gets around those three problems, I will take back my criticism of Yedlin. So far, no one has given any explanation of how his setup could possibly work. Yes. I mentioned that section and linked it in a previous post. There is no way his setup can provide anything conclusive in regards to the lack of any discernability between different res
  8. You are incorrect. Regardless, I have watched enough of the Yedlin videos to know that they are significantly flawed. I know for a fact that: Yedlin's setup cannot prove anything conclusive in regards to perceptible differences between various higher resolutions, even if we assume that a cinema audience always views a projected 2K or 4K screen. Much of the required discernability for such a comparison is destroyed by his downscaling a 6K file to 4K (and also to 2K and then back up to 4K) within a node editor, while additionally rendering the viewer window to an HD video f
  9. Agreed, but what is the point of all the downscaling and upscaling? What does it prove in regards to comparing the discernability of different resolutions? How can we compare different resolutions, if the differences in resolution are nullified at the outset of the comparison? In addition, is this downscaling and then upscaling something that is done in practice regularly? I have never heard of anyone intentionally doing so. Also, keep in mind that what Yedlin actually did was to downscale from 6K/4K to 2K, then upscale to 4K... and then downscale back to 1080. The
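A toy 1-D illustration of the point above (my own construction, not Yedlin's actual pipeline): once a downscale discards detail, a later upscale cannot restore it, so the "4K" branch of the comparison no longer carries 4K detail.

```python
# Downscale by averaging neighbouring pairs, then upscale by
# nearest-neighbour doubling -- the round trip loses the fine detail.

def downscale2(xs):
    """Average each pair of samples (2x downscale)."""
    return [(a + b) / 2 for a, b in zip(xs[::2], xs[1::2])]

def upscale2(xs):
    """Repeat each sample (2x nearest-neighbour upscale)."""
    return [v for v in xs for _ in range(2)]

signal = [0, 255, 0, 255, 0, 255, 0, 255]  # finest detail at this scale
round_trip = upscale2(downscale2(signal))
print(round_trip)  # [127.5] * 8 -- the alternating detail is gone
```

The round-tripped "high-res" signal is flat grey: there is nothing left for a resolution comparison to compare.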
  10. Then Tupp said that he didn't watch it, criticised it for doing things that it didn't actually do, then suggests that the testing methodology is false. Your interpretation of my interaction with your post here is certainly interesting. However, there is no need for interpretation, as one can merely scroll up to see my comments. Nevertheless, I would like to clarify a few things. Firstly, as I mentioned, I merely scanned the Yedlin videos for two reasons: I immediately encountered what are likely flaws in his comparisons (to which you refer as "things that it did
  11. Thank you for the recommendation! I don't see any flaw in his strategy, but it also seems rather generic. Not sure if I should spend an hour watching a video that can be summed up in a single sentence (as you just did). Also, I hesitate to watch a lengthy video that has fundamental problems inherent in its demonstrations (such as showing resolution differences without changing the pixel size nor pixel number).
  12. Very cool! Your rig reminds me of @ZEEK's EOSM Super 16 setup. With ML, it shoots 2.5K or 2.8K, 10-bit, continuously, at around 16mm and Super 16 frame sizes.
  13. We've certainly talked about resolution, and other Yedlin videos have been linked in this forum. I merely scanned the videos (that second video is over an hour in length), so I don't know all the points that he covered. Resolution and sharpness are not the same thing. There is a contrast element to sharpness, and it involves different levels (macro contrast, micro contrast, etc.). One can see the effects of different levels of contrast when doing frequency separation work in images. Not sure if Yedlin specifically covers contrast's relation to sharpness in these video
  14. This is a good suggestion. A snorkel bag is usually very thick plastic/vinyl. Just substitute taped foam pads for socks!
  15. If you wrap your rain cover properly, you shouldn't have to worry about paint getting in. In regards to the impact, I don't know anything about the force of paint ball guns. If you don't use an insulated (padded) rain cover, you could tape pieces of foam sheet onto vulnerable areas on the camera and then put on the rain cover/bag. You could also run some impact tests with a paint gun, rain cover, foam pads and a wine glass.
  16. Use the Glidecam with a rain cover and keep all flaps on the rain cover tucked in (or use tape). Should be okay. Another option is to use a heavy, clear freezer bag around the camera, with the opening taped to the lens hood.
  17. When you said "Glidecam," I got the impression that you were using a Steadicam type rig -- not an electronic gimbal. Furthermore, a solid underwater housing might be too much for a gimbal.
  18. An underwater housing seems like overkill for your purpose. A rain cover and a strong lens filter might be better and easier. If you are concerned about damage from the impact force of the paint balls, you could try an insulated rain cover.
  19. Nope. I said "A conversion can be made so that the resulting Full HD 10-bit image has essentially the equivalent color depth as the original 4K 8-bit image." I didn't say anything about the original image having "reduced" color depth. You came up with that. However, the original image does have lower bit depth than the down-converted image -- even though both images have the same color depth. Yes. That is a fact -- all other variables being the same in both instances. No. It doesn't disagree with anything that I have said.
  20. I sense irony here. A conversion can be made so that the resulting Full HD 10-bit image has essentially the equivalent color depth as the original 4K 8-bit image. Of course, there are slight conversion losses/discrepancies. The banding remains because it is an artifact that is inherent in the original image. That artifact has nothing to do with the color depth of the resulting image -- the banding artifact in this case is caused merely by a lower bit depth failing to properly render a subtle transition. However, do not forget that bit d
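The conversion described above can be sketched in a few lines of Python (my own toy illustration, not a claim about any particular software): summing each 2x2 block of 8-bit samples produces a value on a 0-1020 scale, which fits a 10-bit container without discarding anything, which is why the Full HD 10-bit result can carry essentially the full color depth of the 4K 8-bit original.

```python
# Down-convert one 2x2 block of 8-bit samples (0-255) into a single
# 10-bit sample (0-1020).  The sum of four 8-bit values fits within
# 10 bits, so no precision from the original block is thrown away.

def downsample_2x2_to_10bit(block):
    """block: four 8-bit values; returns one 10-bit value."""
    assert len(block) == 4 and all(0 <= v <= 255 for v in block)
    return sum(block)  # max 4 * 255 = 1020 <= 1023, so it fits in 10 bits

print(downsample_2x2_to_10bit([100, 101, 100, 102]))  # 403
print(downsample_2x2_to_10bit([255, 255, 255, 255]))  # 1020
```

Note that this preserves the information that was in the original image; it cannot invent gradation (and so cannot remove banding) that the 8-bit source never captured.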
  21. If you (and/or your client) like the aspect ratio and like the fact that you are using a wider portion of the image circle of your lenses, then, to me, those are the most important considerations. So, you are probably best shooting at 4096x2160 (DCI 4K) and down-converting cleanly to 2048x1080 (DCI 2K) or less cleanly to 1920x1013. Any extra rendering time for the odd pixel height in the "less clean" resolution would likely be minimal, but it would probably be a good idea to test it, just to make sure.
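The arithmetic behind those two targets (my own quick check):

```python
# DCI 4K source
src_w, src_h = 4096, 2160

# Clean conversion: exact integer 2x downscale to DCI 2K
print(src_w // 2, src_h // 2)  # 2048 1080

# Less clean conversion: fit the width to 1920 and scale the height
# to preserve the aspect ratio (rounding half up)
scale = 1920 / src_w                      # 0.46875
height = int(src_h * scale + 0.5)         # 2160 * 0.46875 = 1012.5 -> 1013
print(1920, height)                       # 1920 1013 -- the odd pixel height
```

The 2x case needs no resampling filter decisions at all, which is why it is the "clean" option.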
  22. Glad to know that I am making progress. You have not directly addressed most of my points, which suggests that you agree with them. Firstly, the banding doesn't have to be eliminated in the down conversion to retain the full color depth of the original image. Banding/posterization is merely an artifact that does not reduce the color depth of an image. One can shoot a film with a hair in the gate or shoot a video with a dust speck on the sensor, yet the hair or dust speck does not reduce the image's color depth. Secondly, broad patches of uniformly colored p
  23. Nope. Color depth is the number of different colors that can be produced in a given area. A given area has to be considered, because imaging necessarily involves area... which area necessarily involves resolution. Obviously, if a 1-bit imaging system produces more differing colors as the resolution is increased, then resolution is an important factor to color depth -- it is not just bit depth that determines color depth. The above example of a common screen printing is just such an imaging system that produces a greater number of differing colors as the resolution inc
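The screen-printing argument above can be reduced to a tiny sketch (function name and framing are mine): with a 1-bit "ink" (dot or no dot), an n x n cell can render n*n + 1 distinct average tones, so the number of producible colors in a given area grows with resolution even though the bit depth never changes.

```python
# Distinct average tones available from a 1-bit halftone cell.
# An n x n cell can have 0, 1, 2, ..., n*n dots filled in,
# giving n*n + 1 distinguishable tones when viewed from a distance.

def distinct_tones(cell_size):
    """Number of average tones a 1-bit cell of the given size can show."""
    dots = cell_size * cell_size
    return dots + 1

for n in (1, 2, 4, 8):
    print(n, distinct_tones(n))  # 1 -> 2, 2 -> 5, 4 -> 17, 8 -> 65
```

Bit depth stays fixed at 1 throughout; only the resolution of the cell changes, yet the count of producible tones climbs from 2 to 65.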
  24. Well, this scenario is somewhat problematic because one is using the same camera with the same sensor. So, automatically there is a binning and/or line-skipping variable. However, barring such issues and given that all other variables are identical in both instances, it is very possible that the 8K camera will exhibit a banding/posterization artifact just like the SD camera. Nevertheless, the 8K camera will have a ton more color depth than the SD camera, and, likewise, the 8K camera will have a lot more color depth than a 10-bit, 800x600 camera that doesn't exhibit the