Everything posted by eatstoomuchjam

  1. AFX range is better than some others. PD Movie, for instance, advertises 4 meters. Their entire system with motor and handwheel is only like $500, though. DJI, on the other hand, claims 14. I'd love to evaluate the DJI because I don't like the silly proprietary(?) batteries that the PD Movie uses, but they seem to be producing them really slowly, judging by the lack of "in stock" notifications I've received after signing up within a day or two of them announcing it.

As far as sunlight, yes, I'd expect that the sensitivity of the sensor and the strength of the IR laser are major factors in how well that works. At least with the PD Movie, I'm pretty sure some reviewers complained that it lost accuracy on a sunny day. 🤩 (insert meme image of Fry from Futurama saying "take my money already")

My willingness to calibrate is correlated with how many profiles the thing can store. I find the calibration step upsetting and annoying if I can only store like 5 lenses in the thing. It means every time I want to use different lenses on a shoot, I'm going to have to spend 15-45 minutes recalibrating lenses ahead of time. If I had enough profiles to store a decent subsection of my lens collection, on the other hand...

I think that it's a fantastic tool to keep in the toolbox and, like any of the other tools, it's great to understand the strengths and weaknesses. The AFX sounds great, but I don't really see any place where I could purchase it. Assuming that you are cda-tek as well as btm_pix, there are references to firmware and documentation on the home page, and otherwise there seems to be a BlackMagic-focused focus unit which also looks nice, but wouldn't be useful for me.
As far as the first bit of that, my point was only really that you need to move the unit for certain lenses, and that the move shifts either the center of the frame (if you move it horizontally or vertically) or the measured focus distance (if you move it in or out); since it's not in the lens/sensor pipeline, either shift throws the readings off. Your offset parameter sounds nice, though I'd be guessing at the value since I don't usually carry a ruler on me. 😃
  2. A lot of the rules are different if we're talking photography vs videography. I've heard generally good things about (and had a good experience with, even on Fuji) animal eye detect in photo mode. I've had almost nothing but heartbreak with Canon and Fuji using animal eye AF in video mode. I just switched to a centered focus point in those cases and didn't spend a lot of time messing with it. It's hard to say what others have done when having a good or bad experience doing it. 🙂
  3. This seemed timely and relevant to me because of the discussion of documentary stuff in this thread. Some talk about the A cameras used for the documentaries that made it into Sundance this year. The list is topped by the FX9, the C300 II, and the FS7.
  4. Oh yeah - lidar also suffers with lenses with large front elements, as the sensor/laser unit needs to be moved further from the camera's sensor. Otherwise, part of the sensor's FOV and some of the points illuminated by the laser will see the lens instead of the subject.
  5. You don't seem to understand the limitations of CDAF. A more correct description of why it's slower and why they pulse is this:

Scenario 1: Subject walking toward camera.

PDAF camera: CDAF system determines accurately and quickly that the subject has moved out of focus. Uses PDAF to determine the difference in location between the current focus point and the new desired focus point. Moves the lens to approximately the correct location and uses CDAF for micro-adjustments.

CDAF camera: CDAF system determines accurately and quickly that the subject is out of focus. Does not know which way. Does not know how far. Guesses one. May need to use a relatively small step to avoid overshooting. If the guess is right and the amount of OOF decreases, continues in that direction. If wrong and the amount of OOF increases, goes the other way. If indeterminate, keeps going the same way. Wrong way/multiple steps the wrong way? Pulsing time. DFD helps this by improving the accuracy of the estimates of distance + direction. Could also potentially optimize by guessing that a subject will keep moving in the same direction. However, this optimization is potentially difficult due to...

Scenario 2: Head and shoulders video, subject cannot sit perfectly still.

PDAF camera: CDAF system determines that the subject has moved out of focus. Uses PDAF to determine where the subject went. Jumps to about the right place. Uses CDAF for micro-adjustments after that.

CDAF camera: CDAF system determines the subject has moved out of focus. Has to guess which way. Same as above, but now there's a bigger chance of overshooting, as the subject's movements are not in any way smooth or guessable. In my experience, this frequently results in a mess. Better to stop down the lens a bit and give up on shallow DOF aesthetics in favor of an image that's vaguely in focus.

Both systems use CDAF processing to some extent. However, courtesy of parallax, one system is able to cheat by knowing the direction and amount of movement.
As far as lidar, it seems fantastic for certain use cases. It suffers from range limitations and can work very poorly in bright sunlight or other areas with a lot of IR pollution. It also doesn't operate through the lens, so it requires calibration for sensor location as well as profiling of the lenses. It also requires understanding its coverage vs the lens's (very wide lenses may only focus on things in the center, for example). Every existing consumer lidar system I am aware of can store only a few lens profiles, so choose carefully. As mentioned, though, it seems incredible in very low light. I got the cheapest version of the PD Movie Live motor and (thus far) I've kind of failed at testing it in any real way.
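The two scenarios above can be sketched in a few lines of toy code. This is purely illustrative (real AF systems are far more sophisticated, and the contrast metric here is made up for the example): CDAF is a hill-climb on a contrast score with no direction information, while PDAF measures the focus error directly from parallax and jumps.

```python
def contrast(lens_pos, subject_pos):
    """Hypothetical contrast metric: peaks when lens focus matches the subject."""
    return 1.0 / (1.0 + abs(lens_pos - subject_pos))

def cdaf_step(lens_pos, subject_pos, direction, step=1.0):
    """One CDAF iteration: step in the guessed direction, reverse if contrast drops."""
    before = contrast(lens_pos, subject_pos)
    trial = lens_pos + direction * step
    if contrast(trial, subject_pos) >= before:
        return trial, direction                      # contrast improved: keep going
    return lens_pos - direction * step, -direction   # got worse: back up and reverse

def pdaf_step(lens_pos, subject_pos):
    """PDAF reads direction AND distance from the phase offset, so it can jump
    straight to (approximately) the right position, then let CDAF fine-tune."""
    return subject_pos

# Subject at 10.0, lens at 8.0, and CDAF guesses the wrong direction first.
pos, direction, trace = 8.0, -1.0, []
for _ in range(5):
    pos, direction = cdaf_step(pos, 10.0, direction)
    trace.append(pos)
# The back-and-forth around the peak in `trace` is the visible "pulsing";
# pdaf_step(8.0, 10.0) lands on the subject in a single move.
```

Note how even after reaching the peak, the blind hill-climb has to keep probing past it to confirm it is still at maximum contrast, which is exactly the micro-pulsing you see on a CDAF camera with a nearly still subject.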
  6. I mentioned that (briefly). Any sort of animal eye focus seems pretty unreliable on almost every camera right now. I haven't paid much attention to trains or motorbikes, etc, but I'm not even remotely surprised that they aren't very good. How fortunate that you had better luck than the half dozen people I know who have Panasonic cameras with CDAF/DFD and who complain that the AF on their cameras pulses! It's weird how much that matches my own personal experience with the GH5, which would have a box drawn around the face of a person and yet, every so often... pulsing. It's almost like it chose a subject and then, when that subject moved slightly, it wasn't sure which direction to go and had to rack focus a little bit to figure out whether they were more or less in focus than before. It's also weird how that fits the description of how CDAF/DFD works almost perfectly. What a pity, then, that the dummies at Panasonic finally gave up on their DFD/CDAF system and switched to PDAF when the DFD/CDAF on their previous cameras was so excellent and well-loved industry-wide. Guess their management and engineering orgs just don't understand how the two things work.
  7. For continuous AF, it absolutely does. Panasonic spent a lot of years on CDAF/DFD and AF on the GH6/S5/S1/S1H/etc was still not even nearly as reliable as Canon/Sony. There's a reason that they finally took whatever steps were needed to enable PDAF on their modern bodies (and from everything I've heard, AF on the S5 II and the GH7 is fantastic). Some people say that with a bunch of tweaking, they could get the CDAF to be acceptable. That's all fine and good, but Canon and Sony users take the camera out of the box, enable continuous AF and human eye detection, and watch the camera instantly lock on to a subject and stay locked on (some caveats around terrible lighting and multiple subjects in the frame apply). (And yes, of course PDAF = PDAF+CDAF, but it should be understood that use of the phrase PDAF is usually intended to be inclusive of the two technologies) (And yes, Canon isn't PDAF, but DPAF, but DPAF is for practical purposes very similar to PDAF, as both are based on parallax) One's experience with this, though, is likely to vary based at least somewhat on the working aperture. If you're consistently shooting deeper DOF (like an 50mm at f/8 FF equivalent), the pulsing and occasional refocusing will be less noticeable than if you're shooting with shallower (like a 50mm at f/2). Simply implementing PDAF doesn't guarantee parity (see: Red and Fuji), but it certainly puts companies on the right path to it. And as BTM's photos demonstrate, a lot of the subject detection automatic modes on modern cameras need work, even for those vendors whose PDAF is solid.
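To put numbers on the 50mm f/2 vs f/8 comparison, here's a quick sketch using the standard thin-lens depth-of-field formulas (these are textbook formulas, not anything from this thread; the 0.03 mm circle of confusion is the usual full-frame assumption):

```python
def dof_mm(focal_mm, f_number, distance_mm, coc_mm=0.03):
    """Total depth of field in mm for a subject at distance_mm (thin-lens approx)."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    if distance_mm >= hyperfocal:
        return float("inf")  # acceptably sharp all the way to infinity
    near = distance_mm * (hyperfocal - focal_mm) / (hyperfocal + distance_mm - 2 * focal_mm)
    far = distance_mm * (hyperfocal - focal_mm) / (hyperfocal - distance_mm)
    return far - near

# 50mm lens, subject at 3 m on full frame:
shallow = dof_mm(50, 2, 3000)  # roughly 0.43 m of total DOF at f/2
deep = dof_mm(50, 8, 3000)     # roughly 1.85 m at f/8
```

With over four times the depth of field at f/8, a focus system that lands "close enough" stays invisible, while at f/2 the same error reads as a miss on screen.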
  8. If I want a softer look with lower contrast, I prefer to use a vintage lens. Want really low contrast? Go single-coated or uncoated.
  9. Sorry, I wasn't suggesting that the BBC was doing run & gun that would use a GH2. I was only responding to the implication that the GH2 would be a poor choice for a documentary. 😄
  10. If you're doing a run and gun documentary, the GH2 has a number of advantages over a V-Raptor VV, including that it's smaller/lighter, requires less rigging to be functional, and because of those things will draw less attention.
  11. One of my all-time favorite lenses is a single coated collapsible Summicron 50/2. The only bummer is that short of rehousing it (which I won't do), it's impossible to use it with a follow focus. I also have a collapsible 90/4 which I've wanted to try for a shoot, but never get around to bringing with me.
  12. Yeah, that's what I do - I'm just saying that I glued the LTM lenses into the LTM to M "ring adapter," making it effectively permanent (so the lenses won't loosen). Then I have M mount adapters with and without the helicoid.
  13. Unless you have a camera body that uses LTM, I would suggest using a little bit of loctite and permanently mounting the LTM lenses in M mount adapters. That's what I've done. On some of mine, the focus is a little bit stiff which meant that the lens would start to unscrew slightly and wobble when I turned the focus (maybe yours are a little better). Either way, there are lots of LTM to MFT adapters. Just find one that promises to let you focus slightly past infinity and you'll be fine. There's no need to buy anything even a little bit expensive.
  14. The maker got some more StarlightEye boards in stock and I ordered one. It showed up today and I've been... sort of successful, at best, at actually getting it working. I just asked on the Discord for some pointers on what I might have done wrong. I was at least able to get a picture from my C and D mount lenses that was cut off on the edge of the screen, but that was still kind of exciting. Assuming the people on Discord can tell me what terrible mistake I made, the next step will be to design some sort of minimal body around a V-mount plate with a USB-PD output (conveniently a small V mount plate is close to the same size as the Pi 5). Using a little 50Wh mini-v-mount, I could have a tiny little camera with teensy lenses that will last all day! Could be a lot of fun!
  15. Judging by "show results," everybody has chosen not to vote so far. 🙂 You'll get no objection from me that there's a happy medium between "coke bottle" and "counting the atoms in bricks 100 meters away." With this poll, the problem is at least partly that the example photos are a weird angle and of an uninteresting subject. Whatever lens was used, I don't care much for them.
  16. I've considered getting some of the module8 tuners before, but I think that they're very much in the category of rent before considering buying. It's pretty hard to judge them from whatever footage people put online. There are just too many combinations of lens + tuner settings. Most of the stuff I've seen online looked a little bit like hot garbage to me, but maybe I just don't like their combinations of lens + settings. They're only so interesting anyway.
  17. No, it is outrageous, but it's not an easy change for a lot of people to move from Adobe products to others. I've tried several times to use Capture One instead of Lightroom and I'm just too accustomed to Lightroom. Capture One feels clumsy and awkward. To make it worse, when working with scans from very large negatives, neither Lightroom nor Capture One can handle them so I'm stuck with Photoshop - and no other product I've found is able to handle such enormous files (and Photoshop does it badly/slowly). Anyway, this is already old news and Adobe backed off on it after so much user outrage. Whether you believe Adobe when they say they won't do it, that's another story. https://www.wired.com/story/adobe-says-it-wont-train-ai-using-artists-work-creatives-arent-convinced/#:~:text=Late on Tuesday%2C Adobe issued,opt out of content analytics.
  18. That's also assuming that the color temperature settings in the camera are a match for your color meter - and that's definitely not guaranteed.
  19. As far as I remember, Panasonic don't develop sensors. They integrate sensors from Sony. But to answer the question, no. Unless technology has improved, going to 8K on M43 is likely to make the image noisier and I couldn't care much less about 8K vs 6K.
  20. Well, it's not efficient for photos and it won't be efficient for video either. 🙂 As already mentioned, there is an eyedropper tool in Resolve that you can use in one of the earlier nodes to click on a grey card. Otherwise, at the risk of sounding like a broken record, get a color checker and use the color checker tool in Resolve if you want a quick way to get nice/accurate colors as a starting point.
  21. You can shoot two clips. The second can be of any length as long as the card is in the right place. I don't know what everybody else does! There are plenty of options like doing a custom white balance in camera using a grey card or using the eye dropper (or a color checker) in post. Your mileage may vary, best to experiment to figure out a workflow that works for you before trying it on a shoot that matters.
  22. The second clip would need to be at least one frame long. For the most part, I just do them in the same clip. I taped a color chart to the back of my slate and I ask the person who is using the slate to flip it around before clapping.
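Conceptually, what the gray-card eyedropper does with that frame is simple. This is an illustration of the general idea, not Resolve's actual implementation: sample the card's RGB, then compute per-channel gains that make the card neutral.

```python
def gray_card_gains(r, g, b):
    """Per-channel gains that neutralize a sampled gray patch,
    normalized to the green channel (a common convention)."""
    return (g / r, 1.0, g / b)

def apply_gains(pixel, gains):
    """Apply white-balance gains to one RGB pixel, clipping at 1.0."""
    return tuple(min(1.0, c * k) for c, k in zip(pixel, gains))

# A warm color cast: the gray card reads (0.5, 0.4, 0.3) instead of neutral.
gains = gray_card_gains(0.5, 0.4, 0.3)
neutralized = apply_gains((0.5, 0.4, 0.3), gains)
# After the gains, all three channels of the card patch match.
```

The same gains applied to every pixel in the frame give a neutral starting balance, which is why one frame of the chart in the clip is all you need.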
  23. I've been over "shallowest DOF" for a while now. For quite a while, I've been more interested in "proper DOF." The right DOF is the one that fits the story that you're trying to tell with the right amount of detail in background areas. Want to express the subject's extreme isolation? You might actually be wide open with a very fast lens. Want to draw attention to the subject? Probably better to stop down a little bit so that the background is identifiable, but still OOF. Want to present the subject occupying a place, such as in a wide establishing shot? Stop down a bit more. Similarly, lens choice should be driven by desired look/impact. Shooting futuristic sci-fi? Maybe you want some really sharp glass. Shooting something a bit more romantic with lots of high-key lighting and close-ups? Probably better to choose something a little softer. Making a short film starring a can of beans? Maybe don't overthink it.