Everything posted by kye

  1. All else being equal, I'd suggest the RP + 24 1.4. This is because the 24 1.4 can be stopped down a bit to reach a 1.8 equivalent, and therefore be slightly sharper at a given DoF. This would also mean you can match CS to the C100 much more easily down the track. Of course, all else isn't equal, so work out what you care about more and prioritise the setups accordingly... I'm also assuming that the RP doesn't have a crop, but that might be risky with a Canon?
  2. Hybrid Mono Tripod

    There might be a middle ground where you get a <person> to make you some longer legs for the little tripod thingy on the bottom of your monopod, but legs that use the same mechanism. I'm not sure what the current term is, but the <person> above would be a "maker" in YouTuber parlance; there are now a ton of people on YT who can design/prototype/print/carve/lasercut custom stuff, and they seem to work with any material known to man.
  3. Hybrid Mono Tripod

    You could always buy a couple of solid quick-release plate/sockets and put them in-between the head and the tripod/monopod, then you could quick-change the head from one to the other, even with the camera attached. It's some cost and some hassle and some weight, but if time is that important in how you shoot (which it might be) then you might find it worthwhile.
  4. One thing about the eGPU is that there's kind of a limit to how much use it is, depending on your CPU. I bought one hoping that I would be able to decode the 5K 10-bit h265 from the GH5 in real-time, so I could edit without rendering proxies, but it's nowhere close. It can't even do the 4K 10-bit h264 either, and the challenge is that my CPU is the bottleneck, not the GPU.

    I went through Resolve support (which is excellent, BTW) and they said that Resolve 16 had GPU-accelerated decoding of h264/h265 on AMD, but that it was the first version so there might be optimisations in future versions; 16.1 didn't change anything though. I was poised to buy the RX 5700 as an upgrade (it's way faster than my 470) but I'm not sure I'd get any real benefit. It would run OFX plugins faster and maybe export faster, but those aren't things I really care about. If I have to render proxies anyway, then I can render 720p proxies, and the laptop is powerful enough to apply filters and grading in real-time without the eGPU, so in that sense it doesn't really help that much.

    In a sense I'd rather that everything becomes GPU-based, as you can upgrade an eGPU really easily but not a CPU, especially in a laptop. I'm anticipating that eGPU support and utilisation will improve over time, as GPUs are now being pushed for AI and we'll be piggy-backing off the back of that. I remember at university I did a unit on programming parallel computers, and we had access to a cluster with 2048 CPUs, but you had to program it explicitly to use that many CPUs and had direct control over which ones did what, rather than it sorting out the allocation by itself. They've largely worked those things out now (programming multi-core CPUs isn't done explicitly like it used to be; the software allocates tasks automatically), so I'm hoping the same will happen with using multiple GPUs. That would mean you could have a modest computer setup but connect it to a bank of however many eGPUs, and everything would just organise the work amongst the GPUs; it would be like plugging in more horsepower to get the job done. We're not there yet, but we're getting closer, with eGPU support now completely native in OSX.
  5. I am running an MBP with a SAPPHIRE AMD Radeon RX 470 in an external enclosure, eGPU style. It connects via a Thunderbolt 3 cable that charges the laptop as well as connecting to the eGPU. It is plug and play, except that you have to eject it before pulling it out (I did that by accident once and the graphics were all glitchy until I restarted - oops!) but it's pretty nice after that. Resolve auto-detects the GPU in the MBP and the eGPU and splits the load between both (not that the MBP one is much help), and I have been gaming with the AMD eGPU as well on my external 32" panel. I was a bit sad that the Nvidia cards weren't supported but went AMD because I just wanted it to work, and it does. The bang-for-buck ratio of the Nvidia cards was better than AMD, but the new Radeon RX 5700 card is about on par (IIRC), so I think that performance gap has closed. I'm not a fan of manufacturers forcing customers into closed ecosystems, but overall my eGPU experience hasn't been too bad.
  6. Well, as an MFT user I can happily report that I own quite a number of non-SB adapters... and that they're all significantly cheaper than those!! But not to lessen the importance of these options - the Komodo sure looks like an interesting and capable beast, so adapters are where it's at. Personally I'd be looking to adapt older fully manual glass, as I find the full manual controls really immediate and the value for money on FD, MD, and M42 glass much higher than with the EF options, but flexibility is great to have.
  7. If vignetting is a concern then maybe there's a non-SB adapter available?
  8. I'd suggest trying to get a material that has roughly the same properties as what the case is made of. It should be easy to google, and you can likely buy it in a hobby shop or on eBay. The point of that is partly that Pelican would have chosen a well-suited material, but also that you want something as similar as possible so that it expands and contracts the same when heating up and cooling down, and is a bit flexible if dropped etc. I'm not sure about expanding foam or just using epoxy, but if they've got the right properties then go for it. I'd imagine there'd be lots online about it.
  9. Skydio 2

    You and everyone else....... Whoever gets it right will do well.
  10. Definitely something that is 2-part, so that it cures to form a strong bond. You could also try doing a double layer where you cut out pieces that fit in flush (glued with epoxy), then sand them flat and epoxy patches over the top of that. Then it's two glue-ups protecting it rather than one.
  11. Great stuff. It's really useful to see how other people work. You're absolutely nailing it with these videos. Well done!!
  12. Lenses

    Yeah, as strange as it looks, it sure seems like it can make some awesome footage.
  13. Lenses

    Another video on the probe lens with lots of BTS, but this time for tech reviewing. Interesting fact: it's FF and F14-40. Apparently at F14 you still need a ton of light, so that's why it has the ring light built in.
  14. I'd weigh in, but now it's been revealed that I'm actually an actor paid by the UK to play an Australian as part of a great cover-up, it's hard for me to convince people of much at all really... https://www.nzherald.co.nz/world/news/article.cfm?c_id=2&objectid=12043583
  15. If only film-making equipment was accessible to extreme couponing!!
  16. Maybe the Rokinon/Samyang 35mm T1.5? Not quite as fast and not quite the same focal length, but not too far off. And perhaps the Meike 12mm T2.2 or Rokinon/Samyang 12mm T2.2 for the wider angle?
  17. Ouch. Is this a common problem? Sounds like they haven't gotten their focus adjustments correct. I've read online that the hard stops are often adjustments made after the lens is assembled, to calibrate the focus markings, and that sometimes you can open them up and re-adjust this. Not suggesting that you do this, but maybe it's a QC problem in the factory rather than a design flaw. I mean, how could a lens company design and manufacture a lens that can't focus to infinity?!?! Are the Mitakon f0.95 lenses an option for you? I know they're not cine style, so no threads for follow-focus etc.
  18. This kind of thing can be very useful if you have long takes with a high shooting ratio. One example is sports, if you're recording in slow-motion and waiting for something cool to happen; you can end up with minutes of footage that are completely useless. For the purposes of editing I tend to wait until the moment has definitely ended, then stop the take and start a new one. Then in editing I don't have to scan a long clip (120p takes forever to watch), as I can just look at the end of the clip and see if something good happened.

    I haven't hit this issue, but if file sizes are too large (maybe with the P4K / P6K this would be more relevant) then you can split the source clip into multiple pieces and delete the minutes of unused footage. Hard-drive space might be cheap these days, but recording RAW adds up, and it might be worthwhile to trim long clips. P6K 6K Blackmagic RAW 3:1 is 323 MB/s, which is ~1TB per hour. 6TB internal HDDs are $225, and archival should be on redundant HDDs, which puts that 323MB/s at roughly $75 per hour of footage, or ~$1/minute. So if trimming and deleting un-needed source footage costs you less time (at your hourly rate) than the value of the space it frees up, you come out ahead; the rough maths is sketched below.
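    For anyone who wants to plug in their own numbers, here's a minimal back-of-envelope sketch of that storage-cost calculation in Python. The data rate, drive price, and 2x redundancy are just the figures assumed above, not measured values; swap in your own codec and drive prices.

    # Back-of-envelope cost of archiving BRAW 3:1 footage, using the rough
    # figures quoted above (assumptions from the post, not measurements).
    DATA_RATE_MB_S = 323       # P6K 6K Blackmagic RAW 3:1, megabytes per second
    DRIVE_TB = 6               # 6TB internal HDD
    DRIVE_PRICE_USD = 225      # price per drive
    REDUNDANT_COPIES = 2       # archive on two drives for redundancy

    tb_per_hour = DATA_RATE_MB_S * 3600 / 1_000_000              # MB -> TB
    cost_per_tb = DRIVE_PRICE_USD * REDUNDANT_COPIES / DRIVE_TB  # $ per TB stored redundantly
    cost_per_hour = tb_per_hour * cost_per_tb
    cost_per_minute = cost_per_hour / 60

    print(f"{tb_per_hour:.2f} TB per hour of footage")      # ~1.16 TB/h
    print(f"${cost_per_hour:.0f} per hour of footage")      # ~$87/h (the post rounds to ~1 TB/h and ~$75/h)
    print(f"${cost_per_minute:.2f} per minute of footage")  # ~$1.45/min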
  19. Lenses

    There's a huge resource that @Tito Ferradans has put together about faking the anamorphic look: http://www.tferradans.com/blog/?page_id=15535 There's a quiz to test if you can tell the difference between fake and real anamorphic images; I'm rubbish at it and could spot some of them but not others. His guide is USD$30, but you can get a taste of it via this link: http://www.tferradans.com/anamorfake/TFerradans-AnamorfakeDemo.pdf I don't lust after the anamorphic look so it's not that enticing to me, but for those interested, going fake seems to offer a much simpler approach, so it's worth a look.
  20. Absolutely... I'll take any chance to re-post that video!
  21. FX for cheap?

    And the fact that things like Fusion come with the free version of Resolve... really, now it's just down to how much work you put in to learn the skillset; the tools are basically free.
  22. I'm not sure that every aspect ratio is compatible with every screen..... Here's a refresher in case you've forgotten some of the more exotic ratios:
  23. I blame the director... looks like they took people who were nervous and who are behind the camera by choice, filled their heads with all the "talk slowly but not too slowly", "stand up straight", "be funny but not too funny", "remember to do these 17 things.........." pieces of helpful advice their heads could hold, then probably finished it up by adding ".......and just relax and enjoy it!" *talent has spontaneous internal freak-outs*
  24. bmp4k adventures

    Cool... I can't remember where I read it, but there are some fairly standard approaches to doing profiles of people as mini-documentaries: interview someone on camera, shoot a bunch of B-roll and then cut it together, finding the story in post. Obviously that's a drastic oversimplification of what is a deep art form, but once you're comfortable with the tech, shooting things like this allows a lot of room for error, and if they're historic / human-interest pieces that are unpaid then expectations will be low and you have lots of flexibility in post. What are your goals?