All Activity

  1. Past hour
  2. Take a look at Amaran and SmallRig
  3. Never done any VR stuff, but a few thoughts/ideas: You could try using a stereo mic with a 120-degree (instead of the more usual 90-degree) angle between the capsules. That will give you a wider, more diffuse soundfield. Some stereo mics that use mid plus side capsules internally have a switchable 90/120 option (and/or an option to output the raw mid and side signals instead of L and R, enabling you to choose the virtual capsule angle in post using a mid-side to stereo conversion plugin). Stereo field manipulation/enhancement plug-ins might also be useful to widen the soundfield and create a more spacious effect, e.g. I've used iZotope Ozone Imager (free), Nugen Stereoizer (in full and cut-down 'Elements' versions), and various MeldaProduction plugins to do this and/or create pseudo-stereo from mono sources.
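To make the mid-side decode mentioned above concrete, here's a minimal sketch (my own illustrative code, not from any plugin; it assumes mono NumPy sample arrays, and the `width` knob is a rough stand-in for the virtual capsule angle, whose exact mapping depends on the mid capsule's polar pattern):

```python
import numpy as np

def ms_to_stereo(mid, side, width=1.0):
    """Decode coincident mid/side signals into L/R stereo.

    width = 1.0 is the plain M/S decode; raising it boosts the side
    signal, widening the image (roughly what switching a mic from a
    90- to a 120-degree virtual capsule angle does). Treat the exact
    width-to-angle mapping as an assumption, not a standard.
    """
    left = mid + width * side
    right = mid - width * side
    return left, right
```

Note that summing L and R cancels the side signal and returns twice the mid, which is why M/S recordings fold down to mono so cleanly.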
  4. Back in the day I did a few shoots with 5K and 10K HMIs that you could blast through a window and that would compete with daylight. I don't really use lighting much anymore and I'm very out of the loop. I have a couple of cheap and cheerful Godox 150 W LEDs. Is there anything current that has higher output for $1,000 USD or below? I've seen a few DIY things that look absurdly bright, but no idea what the output actually is. They don't need to run for hours on end either.
  5. Today
  6. And not only RAW - Quick Sync (Intel's hardware decoding of H.264/HEVC media) takes the lead there too, AFAIK :- )
  7. I'm really a noob in this. But for VR 360 you are right, you need ambisonics: as your head may turn 180°, the audio would come from the opposite side, which would be very confusing. Many VR players support some form of ambisonic audio so they can rotate the audio to follow your head. From my limited research and understanding, there are multiple standards, so you may end up doing a different edit + metadata for YouTube VR than for local players (Meta Quest TV, Skybox, etc.)... Apple Vision Pro? Also, I think Resolve 19 now supports ambisonics, but I'm not sure you can do it all in there - for sure not metadata injection. Now for VR180, your head may turn 45° max, as beyond that half of your video will be black, so it seems you can get away with binaural or even stereo. I would prefer a stereo mic like the Sennheiser 440 or the Rode Stereo Mic, as they are not as tall as the H3 so I can use them on a gimbal.
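The "audio follows your head" behaviour described above boils down to rotating the ambisonic sound field before decoding. A minimal sketch for first-order B-format (my own illustration; W/X/Y/Z as NumPy sample arrays, and the sign convention is one common choice - real decoders may differ):

```python
import numpy as np

def rotate_bformat_yaw(w, x, y, z, yaw_rad):
    """Rotate a first-order B-format scene about the vertical axis.

    Conceptually what a head-tracked player does: counter-rotate the
    sound field by the head's yaw before decoding to speakers or
    binaural. W (omni) and Z (height) are unaffected by a pure yaw.
    """
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_rot = c * x - s * y
    y_rot = s * x + c * y
    return w, x_rot, y_rot, z
```

Plain stereo or binaural has no such channels to rotate, which is why it stays glued to your head when you look around.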
  8. My guess is that the difference in actual user experience of a system receiving a 923 and one receiving 893 will be negligible. Between the two, I'd choose the one where I get a better deal.
  9. By the time you add the size and weight of a converter box, you should probably just get the Z Cam EVF or the Portkeys OEYE and save money.
  10. Yesterday
  11. Just watching this video: after 2m55s the guy from BM is asked if the viewfinder could be used on other manufacturers' cameras, and he says that there's nothing preventing that. If another camera could output DisplayPort information and power through USB-C, it would work. So this is very interesting. For this to work on another camera, there might need to be a little converter box that brings in power and converts the HDMI or SDI to DisplayPort. What I'm also curious about is whether things like focus assist and frame guides are done in the viewfinder or in the camera.
  12. After posting the previous post I went back and compared the looks a few times and realised I was a bit harsh on the ARRI LUT, considering that it was very flattering on my battered skin tone but basically didn't screw up the strong colours too much, whereas the film look is much stronger without being that much more flattering. Inspired by the ARRI LUT, I created this custom grade from scratch. SOOC (for reference): New Custom Look: ARRI LUT (for reference): I'm actually really happy with that look - I went a bit further in evening out the skin tones and brightening them up a bit and it didn't seem to come at the expense of anything else. I think I could easily build a look around this, and will experiment further.
  13. Last week
  14. Well, not exactly the fastest... https://cpu.userbenchmark.com/Compare/Intel-Core-i9-14900KS-vs-AMD-Ryzen-9-7950X3D/m2295306vsm2052977 https://www.pugetsystems.com/labs/articles/intel-core-i9-14900ks-content-creation-review/ Except for RAW though:
  15. https://www.fxguide.com/fxfeatured/actually-using-sora/ Cool to see some transparency. Pretty interesting read. Having worked on video demos in the tech space, I am not at all surprised. Absolutely mind-blowing tech, but stuff like "300:1 ratio" for editing and "10-20 min for 480p" was not mentioned in the demo notes. $$$$$$. 🙂
  16. Speaking of the devil... «The primary area where we found a notable difference between the 3D V-Cache processors was in thermals and power draw, where the X3D CPUs used around 30% less power and remained around 5-10 degrees Celsius cooler on our test benches.» source «(...) 7950X can actually support 192GB of RAM at the moment, with the possibility of supporting 256GB RAM soon (BIOS updates are already out)» source So, any doubts on the best processor out there? ;- )
  17. It's possible to synthesize all sorts of microphone types/polar patterns, including things like perfectly co-incident stereo pair mics (which are impossible to physically build), from a B-format Ambisonics stream/recording - https://en.wikipedia.org/wiki/Ambisonics#Virtual_microphones - and also produce a binaural stream - https://en.wikipedia.org/wiki/Ambisonics#Decoding I think that post-processing flexibility is the real strength of using a Soundfield microphone for ambient sound recording.
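The virtual-microphone idea from the linked Wikipedia section can be sketched for the horizontal plane like this (my own illustration of the standard first-order pattern blend; W/X/Y as NumPy sample arrays, `p` blends the polar pattern):

```python
import numpy as np

def virtual_mic(w, x, y, azimuth_rad, p):
    """Synthesize a first-order virtual mic from horizontal B-format.

    p = 1.0 -> omni, p = 0.5 -> cardioid, p = 0.0 -> figure-8
    (first-order pattern blend, as in the Ambisonics article above).
    """
    return (p * np.sqrt(2.0) * w
            + (1.0 - p) * (np.cos(azimuth_rad) * x
                           + np.sin(azimuth_rad) * y))

def virtual_xy_pair(w, x, y, half_angle_rad, p=0.5):
    """A 'perfectly coincident' stereo pair: two virtual mics at
    mirrored azimuths, chosen freely in post."""
    left = virtual_mic(w, x, y, +half_angle_rad, p)
    right = virtual_mic(w, x, y, -half_angle_rad, p)
    return left, right
```

Since the pattern and angle are just parameters here, you can re-aim or re-widen the "mics" after the shoot, which is exactly the post-processing flexibility being described.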
  18. Apparently this is very popular among people who are into paranormal activities and are looking to record audio in the field.
  19. I'm still lost down this rabbit hole, but these are an interesting reference. This is what happens if you put the GX85 through a "look". I put the GX85 test image through a bunch of output LUTs to see which (if any) I liked the flavour of. In order to compare them equally, I adjusted after the LUT to match the black and white points, exposure, and contrast. This way we're not just comparing the different contrast curves, but the other aspects of the LUTs like colour rendering etc. The node structure was this:
- Slightly lower gain to bring the GX85 image into range (it records super-whites)
- CST from 709/2.4 to DaVinci Intermediate (my preferred working colour space)
- (my grade would go here, but for this test no adjustments were made)
- CST to whatever colour space the LUT expects
- The LUT (these all convert to 709/2.4)
- A curve to adjust the overall levels from the LUT to roughly approximate the GX85 image
The round-trip from 709/2.4 to DWG to 709/2.4 is almost transparent if you compensate for the gamut and saturation compression in the conversion at the end, so I didn't bother to grab it. Results:
The famous ARRI K1S1 LUT (the ARRI factory LUT):
One of the 5000 BMD LUTs that come with Resolve, which I tried just for fun:
The Kodak 2383 PFE (Print Film Emulation) LUT. The D55 one seemed the closest match to the WB of the image for some reason, but everyone always uses the D65 ones, so I've included both here for comparison. The D65 one:
The Kodak 2393 PFE. It doesn't come with Resolve, but it's free online from a bunch of places. I like it because it doesn't tint the shadows as blue, so the image isn't as muddy/drab.
The FujiFilm 3513 PFE:
I find the ARRI LUT a bit weak - it helps, but not as much as I'd like. The comparison above is flattering to the LUT because it has a bit more contrast compared to the SOOC, so it looks a bit better. The skintones are a little more flattering on it though, which might be enough if you want a more neutral look.
All the PFE looks are very strong, and aren't really meant to be used on their own. The film manufacturers designed the colour science to look good when used with a negative film like the Kodak 250D or 500T stocks, so it's "wrong" to use it unless you're grading a film scan, but people use it like this anyway 🙂 Some time ago I purchased a powergrade from Juan Melara that emulated both the 250D and the 2393 PFE, which looks like this: To me it looks much more normal than just the 2393 PFE on its own, but it's definitely a stronger look. The powergrade is split into nodes that emulate the 250D separately to the 2393, and the 2393 nodes are almost indistinguishable from the LUT, so I'd imagine this is probably a good emulation. Anyway, lots of flexibility in these 8-bit files!
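For anyone curious what a LUT node is actually doing to those pixel values, here's a rough sketch of 3D LUT application with trilinear interpolation (my own NumPy illustration, a stand-in for Resolve's internals, which I haven't seen; real implementations also handle the LUT's input domain, shaper curves, and tetrahedral interpolation):

```python
import numpy as np

def apply_lut_3d(rgb, lut):
    """Apply a 3D LUT to RGB values in [0, 1] with trilinear interpolation.

    rgb: float array of shape (..., 3)
    lut: float array of shape (n, n, n, 3), indexed [r, g, b]
    """
    n = lut.shape[0]
    idx = np.clip(rgb, 0.0, 1.0) * (n - 1)   # position on the LUT lattice
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = idx - lo                              # fractional distance from 'lo'
    out = np.zeros(np.shape(rgb), dtype=float)
    for corner in range(8):                   # the 8 surrounding lattice points
        w = np.ones(np.shape(rgb)[:-1], dtype=float)
        ix = []
        for axis in range(3):
            if (corner >> axis) & 1:
                ix.append(hi[..., axis])
                w = w * f[..., axis]
            else:
                ix.append(lo[..., axis])
                w = w * (1.0 - f[..., axis])
        out += w[..., None] * lut[ix[0], ix[1], ix[2]]
    return out
```

The "8-bit files" point follows from this: a .cube file is just that (n, n, n, 3) table of target colours, so a 33-point LUT bakes an entire look into a few hundred kilobytes of numbers.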
  20. I'd agree with you, except that about half the movies in the box office aren't much better.. add in all the religious propaganda movies that thrive in more religious areas of the world and sadly, I think there's a huge market 😞
  21. One question that stood out to me immediately was how much you want the audio to move around when the person in VR turns their head. For example, if you record stereo and the VR viewer turns their head, the visuals will all move but the audio won't change at all - I would imagine that to be unnerving and potentially ruin the immersion, wouldn't it? Does VR have a standard you can record to, so that the headset then decodes the audio to match where the viewer looks?
  22. The end of taste? Who wants to watch this garbage?
  23. Hollywood shoots ProRes for the most part, except for VFX plates.
  24. I did a first impressions thing here when I got my H3-VR. It has examples of the auto-downmixed binaural output for general city ambience, and you can download the original ambisonic files to see the extent of the steering that you can do with them in post if you ever wanted to. I think it obviously punches above its weight, and even in binaural mode it does add more pickup width outside of the visible frame, as well as a certain degree of rear and height cues, so for 180 video I think it might do the trick for what you are after without having to do anything in post. The step up would be the Rode SoundField mic paired with something like the Zoom F6 or even the new H6 Essential but, obviously, you will then need a more elaborate rig versus the diminutive H3-VR and, of course, with no downmix you'd have to do that in post.
  25. I think I found the lens! It's the Angenieux Zoom 15x10B 10-150mm f/2-2.8 lens. This lens has "TYPE 15 x 10B" written on it and a red pinstripe around it.
  26. My guess is that Sigma and Tamron paid Canon to license the mount. They have enough profit margin that they can afford to give either a single pretty big upfront payment to Canon or to pay them a royalty/percentage on each lens sold. Canon makes a lot of money selling lenses and know that over time, they'll fill in any existing gaps in their lens lineup - but if they can also make money through licensing (with no need to even build/ship a product), why throw away the chance?
  27. If you're in the PC world, you're definitely better off editing on a desktop vs a laptop if you're working with high-res footage or doing effects, etc. Most laptops throttle down a lot when not plugged in and (in my experience) make a jet-engine noise when dealing with a prolonged load on the CPU/GPU. And as ac6000cw says, the mobile GPUs are almost always lesser versions of their desktop counterparts. Also, battery life tends to be very short because the CPU/GPU pull a lot of power. My M2 Max, on the other hand, can handle 8K Canon raw acceptably - and I got the weaker variant of it. Performance is almost the same whether plugged in or on battery. Fans do ramp up when working it hard (now that I've added denoising to most of the scenes that need it, the 14-minute short I'm currently working on/grading definitely has the fans running full blast when I run an export). Basically, in absolute performance numbers a high-end PC desktop will beat any Mac currently on the market, and at a fraction of the cost of a Mac Studio ($3k for a decent Ryzen + RTX 4090!). A top-of-the-line PC laptop plugged into the wall will also outperform the MBP in absolute numbers (except whoooooosh fan noise)... but if you want to actually be mobile, the Mac is the hands-down winner.
  28. I had my aha moment while trying the Vision Pro and watching Adventure a couple of months ago. I then rented a Canon dual fisheye to use on my R5c, borrowed a Quest 3 and started experimenting with some sports that I cover, and so far people are impressed and surprised. Will it take off this time? I think we are still too early, but the experience is getting better. I wish the Vision Pro were priced lower and more open to loading content onto it, something that with the Quest 3 is super easy to do. I now kind of grasp the filming, editing and delivery, other than sound. I need the ambient sound of the sports event, no dialogue etc... I would like to create a bit of the feeling of being there, BUT I don't want to spend a lot of time managing the audio. KISS principle - for now this is just a side thing. I see two options: simply using a stereo mic, or using a Zoom H3-VR and its "straight out of the device" binaural (not sure how it can be binaural if the mics are so close together). I'm not too interested in dealing with the various ambisonic formats, editing, metadata etc... it seems a huge pain and time-consuming. Would people perceive a big difference between a stereo mic and the H3 binaural? Any other ideas/opinions?
  29. Haha, I'd imagine some folks who own Sonys also love Canon skin tones. I'll do some digging on the interwebs if nobody here has done it.