tupp

Members
  • Content Count: 756
  • Joined
  • Last visited
  • Days Won: 1

Everything posted by tupp

  1. @User, it appears that you need a lens support -- but not a big one.
  2. Agreed. If Panasonic uses an EF-mount once again (in spite of the fact that they already utilize the shallow and more versatile L-mount), that is truly something insane!
  3. You are making a large format DOF adapter and using anamorphic optics? Our own @Gonzalo Ezcurra made the largest format DOF adapters that I have seen. I don't think that he ever tried anamorphic optics on them, but you should probably be aware of them, nonetheless. He made a 20"x20" version, called the "E-Cyclops." Then, he made a smaller, 14"x14" version, called the "MiniCyclops." Here is the construction of the E-Cyclops. Here are some of the results, evidently from the MiniCyclops. Unfortunately, he took down all of his amazing videos shot with these devices. He also made a motorized focus mechanism and a motorized stand for the cameras.
  4. Again, the pink dots in the highlights of your video appear to be very different from the pink focus pixels that are discussed (and mapped out) on the ML forum. Upon a closer look, I see that focus pixels sometimes appear in your video in the same frames as the pink highlight "fixed pattern" dots. Here is what pink focus pixel dots look like: Notice the distinctive, orderly pattern of the pink focus pixels. These faint, orderly dots are easy to map out. The pink highlight dots in your video seem to be some other problem, likely related to the pink highlights phenomenon (completely different from the pink focus pixel phenomenon) and possibly also related to fixed pattern noise. Again, you additionally have a "black hole Sun" problem. I think that the free, open source MLV App will solve most of your problems, but I am not familiar with it.
  5. Those pink dots don't look like the typical focus pixel dots. The problem is something else. It appears that you are also experiencing the black (pink?) hole Sun effect. I seem to recall reading in the Magic Lantern EOSM thread and watching one or two recent @ZEEK videos on how to remedy pink highlights and black hole Sun with EOSM raw. As I recall, one just lowers the white level in MLV App to eliminate the pink highlights. Perhaps @ZEEK and/or @Alpicat will chime in with suggestions. Nice! Thanks for sharing!
  6. I am talking about everything that involves converting an analog signal to a digital signal, including the signal going into a camera sensor's ADC and the signal coming out of that ADC. By the way, there are zillions of machine vision cameras that offer selectable bit depths. There is no encoding, compression nor codec -- the bit depth changes, but the dynamic range doesn't change. No. It doesn't. Barring any artificial signal processing, the maximum dynamic range is dictated by the analog stage of the sensor. Dynamic range is essentially (originally) a property of analog signals, notated in decibels, regarding the maximum signal amplitude relative to the noise level. An ADC merely maps some number of digital increments to an analog signal's amplitude range (not to the signal's dynamic range). Regardless of how many digital increments the ADC maps, the relationship between maximum amplitude and noise level remains the same. Analog-to-Digital Converter. Most ADCs for camera sensors are linear. Certainly, other ADCs exist that don't make a linear conversion. First of all, a lot of folks who have tested Alexas would disagree with you and say that its dynamic range (in stops) is greater than its bit depth -- 15+ stops of DR. However, Blackmagic could integrate an 8-bit ADC with the same analog stage as their 12-bit sensor, and also make another sensor pairing a 16-bit ADC with that same analog stage, and the dynamic range would not differ one iota between the 8-bit, 12-bit and 16-bit versions. The reason why the dynamic range (in stops) of CMOS sensors often approximates the bit depth of the ADC is that this is usually the most efficient balance between bandwidth and color depth. Mapping 16 bits to a sensor with 12 stops of dynamic range probably wouldn't improve the look much, but it would significantly increase bandwidth.
Similarly, mapping only 8 bits to a sensor with 12 stops of dynamic range would severely limit the potential color depth and might make the images susceptible to banding. There are also camera sensors with outboard ADCs (not built into the sensor), and changing the bit depth of the ADC has no effect on the DR.
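The decibel and stop notations mentioned above are just two scales for the same amplitude ratio. A minimal Python sketch (the 4096:1 ratio is an assumed example, not a measurement of any real sensor):

```python
import math

def dr_db(vmax, vnoise):
    """Dynamic range of an analog signal in decibels."""
    return 20 * math.log10(vmax / vnoise)

def dr_stops(vmax, vnoise):
    """The same amplitude ratio expressed in stops (doublings)."""
    return math.log2(vmax / vnoise)

# Example: an analog stage whose max amplitude is 4096x its noise floor.
print(dr_db(4096, 1))     # ~72.2 dB
print(dr_stops(4096, 1))  # 12.0 stops
# A linear ADC maps its increments onto the amplitude range; whether it
# uses 8, 12 or 16 bits, this ratio -- and hence the DR -- is unchanged.
```

Note the conversion factor that falls out: one stop is 20·log10(2) ≈ 6.02 dB.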
  7. Ha, ha! Likewise, I've explained many times that dynamic range and bit depth are two different and independent properties. I have also given practical, existing examples of cameras that offer variable bit depth while maintaining the same dynamic range -- the bit depth varies independently of the dynamic range. In addition, there exist cameras in which one can change the effective dynamic range while maintaining the same bit depth. It is a misguided notion that CMOS sensors (or any other type of digital sensor) have some sort of absolute linear relationship between dynamic range and bit depth. 12-bit ≠ 12 EV and 12-bit ≠ 12 stops DR. The mapping of bit depth increments is independent of the bit depth and also independent of the DR. You can map 8-bit logarithmically, linearly, rec-whatever or any other way -- regardless of the DR. Me too!
  8. Dynamic range and bit depth are two different and independent properties. You can have a 30-stop dynamic range mapped to 8-bit. Likewise, you can have a 3-stop dynamic range mapped to 32-bit.
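The independence argued above can be simulated numerically. This toy NumPy sketch (all values are illustrative assumptions, not measurements of any real camera) models a "12-stop" analog stage with a fixed noise floor, then runs it through linear ADCs of different bit depths and measures the resulting signal-to-error ratio:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "analog" stage: a clean ramp plus an analog noise floor of 1/4096
# of full scale -- roughly a 12-stop sensor.
clean = np.linspace(0.0, 1.0, 200_000)
analog = np.clip(clean + rng.normal(0.0, 1.0 / 4096.0, clean.shape), 0.0, 1.0)

def linear_adc(x, bits):
    """Quantize [0, 1] onto 2**bits - 1 linear steps and scale back."""
    levels = 2**bits - 1
    return np.round(x * levels) / levels

snr_db = {}
for bits in (8, 12, 16):
    err = linear_adc(analog, bits) - clean  # analog noise + quantization error
    snr_db[bits] = 20 * np.log10(1.0 / err.std())
    print(f"{bits:2d}-bit ADC: {snr_db[bits]:5.1f} dB (~{snr_db[bits] / 6.02:.1f} stops)")
```

The 12-bit and 16-bit results land within a fraction of a dB of each other -- the analog noise floor, not the ADC, sets the ceiling. Only the 8-bit case, where the step size is well above the noise floor, loses measurable range to quantization, which is the banding risk the earlier post describes.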
  9. Is one of those Canon cameras an HV20? Why didn't you just go with the X-T3? Welcome to the forum!
  10. I don't get the reference, but "Brian Jonestown Massacre" is an interesting pun on two tragedies.
  11. There might be some settings posted in the ML 5D II thread. Or, you could post a question in that thread asking what ML settings folks tend to use.
  12. Yep. That's been going on for years -- even before Red existed. One can safely assume that the above-the-line folks are clueless and cheap when, at the outset, they stipulate that a DP must own a certain brand of camera. One exception to this assumption is that they might be trying to match previous footage. However, in that case, they are just cheap, and one wonders what happened to the original DP. The trailer (thanks, @PannySVHS!) is reminiscent of the inexpensively made, slipshod "Now" films of the mid/late 1960s. Very cool! Not sure why he didn't just use an off-the-shelf developer, stop bath and fixer.
  13. I wouldn't mount it on an EOSM without a lens support.
  14. I just realized that ffplay (the ffmpeg player) will show histograms, video waveforms and vectorscopes. Ffplay has to be one of the smaller applications that provides such capability. These commands worked on my Linux terminal. Not sure if the syntax changes slightly for the Windows command line. Note that the word "video" in the commands should be replaced with the name of your video file (for example, my_video.mp4). I saw both simpler and more complex versions of these commands when I did a web search. If you try ffplay and have problems, please post your commands, and I will see if I can help.
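The ffplay commands referred to above would look something like the following (a sketch based on the standard ffmpeg `histogram`, `waveform` and `vectorscope` video filters; exact filter options vary between ffmpeg versions, and "video" stands in for your file name):

```shell
# Replace "video" with the name of your video file (e.g. my_video.mp4).

# Luma/chroma histogram:
ffplay -vf histogram video

# Video waveform monitor:
ffplay -vf waveform video

# Vectorscope:
ffplay -vf vectorscope video
```

On Windows, the same invocations should work in cmd or PowerShell, provided ffplay is on the PATH.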
  15. Shotcut is open source, fairly lightweight, and has a histogram and a waveform monitor (apparently no RGB parade and no vectorscope). Kdenlive is also open source and seems to offer all of the important scopes. It is probably lightweight, but not as much as Shotcut. Both of these NLEs have Windows versions. There are other open source NLEs and post-production applications, but I am not sure what scopes they offer nor whether they have Windows versions.
  16. Disagree. I have seen several new, heavily promoted features in Photoshop declared as breakthroughs that actually appeared years earlier in GIMP and other open source imaging software. I also don't see much difference between Lightroom and Darktable (or RawTherapee). If Lightroom has an advantage, please let me know. Regarding NLEs, I don't usually use a lot of fancy plugins, but there are certain features that I find in open source NLEs that I can't find in Premiere or other proprietary software (and I would avoid using proprietary software regardless).
  17. I use open source software. Never any worries.
  18. I've always admired the Norms Travel Dolly, because it can be "underslung." You just need to get sections of 1 1/4" tube/pipe for tracks. So, with the two stands, you can adjust the dolly height from the floor and up, with no need for camera risers to do so. Norms offers a less expensive starter kit, without the special stands and special under-slung arms (you can just use two standard C-stands with grip arms for under-slinging).
  19. That camera used 127 film, so the frame on the negative was probably 4cm x 6.5cm -- medium format. It would be great to see that lens mounted to a Kipon medium format speed booster, attached to a full frame camera!
  20. Is it something like this?
  21. I'll take it off of your hands for $250.
  22. It is the nationality of the physicist who found the solution to a long-standing optical problem. I'm not from Mexico, but no doubt some Mexicans are proud of this significant accomplishment.
  23. Mexican physicist Rafael Gonzalez has found the solution to spherical aberration in optical lenses, solving the 2,000-year-old Wasserman-Wolf problem that Isaac Newton himself could not solve. Now would be a good time to get a Fujian 35mm, f1.7 -- before they all become as clinically sharp as Summicrons!
  24. Gilles Deleuze

      The reclusive American philosopher Joseph Sixpac had this thought on deterritorialization: