Everything posted by tupp

  1. Micro Four Nerds video:
  2. With a dark teal subject, it should be possible to remove a key-green or key-blue background. Many digital keyers come with parametric filters that allow selection of a precise keying range in terms of hue, saturation, and luminance. Of course, the same parametric controls can be used to key out colors other than key green or key blue. If your camera can be locked down, you could use a difference mask. In this method, one shoots the background without the subject, and then, with the camera still locked down, shoots the subject in front of the background. The difference mask then keys out any part of the image that is identical in both videos, so the unobstructed background becomes transparent. Here is a video that shows a few other possible keying techniques using After Effects:
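For those who want to experiment outside After Effects, the difference-mask idea can be sketched in a few lines of NumPy (the threshold value and the toy frames here are hypothetical, just to show the mechanics):

```python
import numpy as np

def difference_mask(clean_plate, frame, threshold=12):
    """Return an alpha mask that is 0 where the frame matches the
    clean background plate and 1 where the subject differs from it.

    clean_plate, frame: uint8 arrays of shape (H, W, 3)
    threshold: per-pixel color distance below which pixels count as
               'identical' (tolerates a little sensor noise).
    """
    diff = np.abs(frame.astype(np.int16) - clean_plate.astype(np.int16))
    distance = diff.sum(axis=2)          # simple L1 color distance
    alpha = (distance > threshold).astype(np.float32)
    return alpha

# Tiny demo: a 2x2 'background' plate and a frame where one pixel changed.
plate = np.full((2, 2, 3), 100, dtype=np.uint8)
frame = plate.copy()
frame[0, 0] = (30, 200, 60)              # the 'subject' pixel
mask = difference_mask(plate, frame)
print(mask)                              # only the [0, 0] pixel is opaque
```

A real implementation would blur or erode the mask edges and tune the threshold against the camera's noise floor, but the principle is the same.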
  3. Ricoh just announced a monochrome version of the Pentax K-3 Mark III DSLR. Evidently, it has a 25.73MP sensor and shoots up to 4K 30p video. Here is a video showing footage and stills from the camera.
  4. Here is a video showing ways to install and use AgX in Resolve: It's free and open source. The way it attenuates saturation in the highlights is similar to what Arri does in-camera (and similar to film emulsion). I think that AgX additionally reduces saturation in the shadows. If you have suffered "rat piss yellow" and "nuclear cyan" in the brighter/highlight areas, it might be good to try AgX. After using the related Filmic RGB module in Darktable, I want to go back and regrade all my images with AgX/Filmic.
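To be clear, the snippet below is not AgX itself, only a toy illustration, under my own assumptions, of the general idea of attenuating saturation as luminance approaches white (using Rec.709 luma weights and a made-up linear ramp):

```python
import numpy as np

def desaturate_highlights(rgb, start=0.7):
    """Blend each pixel toward its own luminance as it approaches white.

    rgb: float array in [0, 1], shape (..., 3).
    start: luminance above which desaturation begins ramping in.
    Toy curve only -- real AgX is a full scene-referred transform.
    """
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])   # Rec.709 weights
    # amount is 0 below 'start', ramping linearly to 1 at luminance 1.0
    amount = np.clip((luma - start) / (1.0 - start), 0.0, 1.0)
    return rgb + amount[..., None] * (luma[..., None] - rgb)

# A near-clipping saturated yellow gets pulled toward neutral,
# while a dark pixel passes through untouched.
bright_yellow = np.array([[1.0, 0.95, 0.1]])
print(desaturate_highlights(bright_yellow))
```

This is roughly why "rat piss yellow" and "nuclear cyan" disappear: the closer a pixel gets to clipping, the more its chroma is rolled off toward white, as film emulsion does.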
  5. Cinemartin achieved first light from an 8K cinema camera in a relatively short time, and within ten months of that milestone they had self-contained, working prototypes. Some would say that such accelerated steps ("strides") were remarkable ("impressive"). One can almost hear the echoes of snickering coming from a junior-high-school classroom. Glad that we can all laugh at jokes that avoid personal digs. Keep 'em coming! Actually, I would say that "impressive strides" sounds less like an Austin Powers 1960s Carnaby Street shop and more like the extraordinary accomplishments of someone who creates wireless camera accessories and camera apps (in addition to those who create cinema cameras).
  6. Not sure how one could come to such a notion from my post. I was responding to the latest episode in a long line of ridicule directed at a few intrepid individuals who took on the challenge of creating an 8K cinema camera -- and who actually achieved working prototypes. It's one thing to criticize the indomitable, powerful giants such as Canon, Sony, Fuji, Blackmagic, etc., but it seems distasteful to pick on a couple of little guys with scarce resources making bold, impressive strides (regardless of their eccentricities).
  7. I was referring to the look of the D16 -- I wasn't referring to film emulation. When's dessert? Actually, "caked-on" color is a good way to describe the look from some CCDs (but not from all CCDs -- consider the look of the striped CCD of the Panavision Genesis/Sony F35). That's a really funny joke. However, Cinemartin rapidly achieved working prototypes of a cinema camera. They preceded the Octopus camera (still in development, but discussed with interest in this very thread) by demonstrating the use of a similar (but earlier) Ximea sensor module in a cinema camera. What camera developments have you achieved?
  8. Global shutter and S16 optics are significant to the look of the D16.
  9. I don't know which method is more "accurate" in matching the cameras to the reading on your meter or in matching the cameras to each other. I would guess that the difference is minor, but it would be interesting to see a comparison of the two methods. I know that merely using the custom white balance on each camera, without a color meter, will get a close enough color-temperature match, and it is quick, easy, and consistent. Either way, you will still be tweaking sliders (or clicking eyedroppers) in post to get the final correction. By the way, it is generally a good idea to avoid mixing light sources of different colors on the same side of the subject, especially on a person's face. Also, you might try leaving a slight difference between the color temperature of the exterior light and that of the interior lights. I usually keep the interior key light neutral. If a window is visible in frame and only skylight is streaming through it, I tend to keep it around 1/4 CTB to 1/2 CTB cooler than neutral. If direct sunlight is visible in the background, I sometimes keep it between neutral and 1/4 CTO.
  10. If you are doing a multi-camera shoot, first put the grey/white card and color chart together, lit by the dominant full-spectrum light source of the scene/event. Flag all other light sources so they don't hit the card and chart. To avoid glare, the light source should be at about 45 degrees (or less) to the plane formed by the card and chart. Position all the cameras side by side, as close as possible to each other, about 6-8 feet directly in front of the card and chart, and perform a custom white balance on each camera. Then, record about five seconds of footage of the card and chart with each camera. If the dominant light source of the scene/event is intentionally not full-spectrum, perform the above procedure using a separate full-spectrum source with a color balance that is close to that of the dominant source of the event/scene. If you are shooting at separate times/locations with each camera (not a multi-camera session), do the same procedure at each time/location with the one camera. Of course, if you are shooting raw, the white balance will not affect the captured image, but having a custom white balance should provide a decent starting point for your NLE and for color grading software. The footage of the charts will help in more finely "dialing in" the color correction in post.
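In post, the recorded card footage can also be used to compute per-channel white-balance gains directly. A minimal sketch in NumPy, assuming linear RGB values and a hypothetical patch region covering only the grey card:

```python
import numpy as np

def wb_gains_from_grey(frame, box):
    """Compute R/G/B gains that neutralize a grey-card patch.

    frame: float RGB array, shape (H, W, 3), ideally linear light.
    box: (y0, y1, x0, x1) region covering only the grey card.
    Gains are normalized so the green channel stays at 1.0,
    the usual convention in camera raw processing.
    """
    y0, y1, x0, x1 = box
    patch_mean = frame[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    gains = patch_mean[1] / patch_mean          # green-referenced
    return gains

# Demo: a 'grey' card recorded with a warm cast (red too high, blue too low).
frame = np.full((10, 10, 3), (0.55, 0.50, 0.42))
gains = wb_gains_from_grey(frame, (2, 8, 2, 8))
balanced = frame * gains
print(gains, balanced[0, 0])   # all three channels now equal 0.50
```

This is roughly what the eyedropper in an NLE does when you click on the card; the five-second clips give it a clean, noise-averaged target.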
  11. [Pocket56K] ... with a CCD sensor!: https://spectrum.ieee.org/the-world-s-largest-camera-is-nearly-completefd https://lsst.slac.stanford.edu/
  12. There are countless ways to solve this problem. The cables just need strain relief to the cage, and the connectors need to be protected from flexing. I have done this before on a caged OG BMPCC simply with two long stainless-steel 1/4"-20 bolts and cable ties. Nothing would happen if you pulled on the cables. We were able to keep the camera rigged in a bag with a monitor, external battery, and lens attached. @BTM_Pix has made an excellent suggestion. I can also envision a solution using the aforementioned 1/4"-20 bolts plus a couple of washers and a small dollop of PC-7 epoxy (or plumber's epoxy -- but you have to work fast!).
  13. Considering the abusive way that he treats his cameras, it's good that he has more than one: "DON'T DO IT!"
  14. [Lenses] It looks good! Thanks for the test! I bought the Canon 10-18mm EF-S lens on the strength of this video by @ZEEK: @ZEEK says that the lens works on full frame down to about 14mm, but he mainly uses it with a speedbooster on the EOSM.
  15. I know that there are a lot of normal open-source keyers, but as for AI keyers, I don't know what's available in open source other than what you have linked in this thread.
  16. Blurring definitely affects color: note that none of the more saturated tones remain in the blurred version. Likewise, lowering resolution (within the same bit depth) reduces color depth.
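A quick way to see the effect on saturation is to box-blur a row of alternating fully saturated pixels and compare a crude saturation measure before and after (toy numbers, not real footage):

```python
import numpy as np

def box_blur_1d(channel, radius=1):
    """Simple 1-D box blur with edge clamping."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    padded = np.pad(channel, radius, mode='edge')
    return np.convolve(padded, kernel, mode='valid')

# Alternating fully saturated red and green pixels (one channel each).
red   = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
green = 1.0 - red

# Crude saturation proxy: spread between the two channels per pixel.
sat_before = np.abs(red - green)
sat_after = np.abs(box_blur_1d(red) - box_blur_1d(green))
print(sat_before.mean(), sat_after.mean())  # blur lowers mean saturation
```

Averaging neighbors pulls every channel toward the local mean, so the extreme values that produce saturated tones simply cannot survive the blur.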
  17. OP said that the cameras will be used in a "talk show" setting. So, they will likely be on sticks and require a lens that can get fairly tight, with smooth zoom-ins and zoom-outs.
  18. Sounds like a small camcorder with a decent rocker zoom and manual capability would be ideal. Markus Pix recently touted the Sony CX405, but it would be smart to look at offerings from other brands: Tell your friend to put all the cameras side by side before shooting, and then to white balance them simultaneously off of the same white/gray card. Additionally, your friend should shoot a short clip of the white/gray card with each camera -- just in case!
  19. Robust Video Matting appears to be open source, licensed under the GPL-3.0. If that is so, there isn't too much to worry about -- the source code is open for all to scrutinize.
  20. Thanks for the comparison! The Fuji cameras definitely introduce a significant blotchiness that is not inherent in the Canon footage. It would be interesting to see unaltered footage without the added contrast. I wish that the Canon position/framing were aligned more closely with that of the Fujis.
  21. Yet another Blackmagic Super35/APS-C camera with an EF mount...
  22. Compiled ML builds are only around 2-3 MB. It appears that one of those files is the git repository (possibly many MB). To install ML on your EOSM, follow the instructions at 03:14 in this video by @ZEEK: A lot of the ML documentation is way out of date. It's best to find recent online tutorials and/or post questions at the ML forums. I don't know where one can find the older builds. Perhaps post the question on the ML forums?
  23. It sounds like you are experiencing a known issue inherent in the first few EOSM models (and in some other Canon models) in which the exposure simulation feature is disabled in still photo mode when manual lenses are mounted. Without using Magic Lantern, there are two hardware hacks that will let the LCD screen show a usable image (though it might not be accurate in regard to exposure). First, mount a smart lens (Canon or another brand) and open the aperture as wide/bright as it will go, then swap out the smart lens for your manual lens. Second, get a "preset aperture" lens chip (as shown in the below video) and touch it to the lens-mount contacts of your EOSM (or to the contacts on your smart adapter), then mount the manual lens: It appears that the Magic Lantern "stable" build has an exposure simulation setting within the "Exposure" tab under the "LV Display" title. I'm using a nightly build, and the exposure simulation setting is in a different place within the "Exposure" tab. I can't get the exposure simulation setting to change from "Movie" mode. Also, I can't get the ML menu to appear when the top dial on the EOSM is set to manual photo mode. The ML menu does appear when that dial is set to the green full-auto mode, and I see the Canon "Exp. Sim." symbol appear on the screen. However, even in that mode, I still cannot change the exposure simulation setting in the ML menu. The Magic Lantern "stable" build also offers an "LV Display Gain" setting under the "Display" tab, which evidently only appears or works in photo mode. It may not provide an accurate representation of exposure on the LCD screen, but it should allow framing and focusing. One can then check the histogram on the recorded images to progressively dial in the exposure. Of course, one could use a light meter to arrive at the proper exposure more quickly. By the way, a few days ago, @ZEEK released another Super16-oriented video on using Soviet/Russian lenses on the EOSM: