
Llaasseerr

Members
  • Posts: 345
  • Joined
  • Last visited

2 Followers

About Llaasseerr


Llaasseerr's Achievements

Frequent member (4/5)

Reputation: 110

  1. I meant shooting full frame without a dedicated anamorphic mode, where you mainly crop the sides in post but also take about a 12% crop vertically. So although you can't do a sensor crop in camera, you can frame with guides, crop in post, and still get the better readout of the cropped window. I did misunderstand how cropping the sides affects rolling shutter speed, though. Apparently the improvement comes mostly from vertical cropping; horizontal cropping gives a slight but less noticeable gain. So it's mainly about reducing the total number of lines read, and to a lesser extent how long it takes to read each line, but you still get a benefit in both directions. The 2x anamorphic stretch also reduces the perceptual effect of rolling shutter because the frame is so wide horizontally, especially on a 2x setup.
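The line-count point above can be sketched with some rough arithmetic - a hypothetical illustration, not real FX3 timing: total readout is roughly (line time) × (lines read), so a vertical crop shortens it proportionally.

```python
# Rough sketch: rolling-shutter readout time scales with the number of
# sensor lines read. LINE_TIME_US is a made-up illustrative value, not a
# real FX3 specification.
LINE_TIME_US = 3.7  # hypothetical microseconds to read one line

def readout_ms(lines, line_time_us=LINE_TIME_US):
    """Top-to-bottom readout time in milliseconds."""
    return lines * line_time_us / 1000.0

full_height = 2160                  # full-height 16:9 capture, in lines
cropped = int(full_height * 0.88)   # ~12% vertical crop, as in the post

print(readout_ms(full_height))  # ~7.99 ms at full height
print(readout_ms(cropped))      # ~7.03 ms after the crop
```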
  2. Right, basically the grade is never baked in for vfx delivery, but yes, the lensing, grain etc. are baked in if needed. The way to look at it is to not think of vfx shots as any different from non-vfx ("real life") shots. The exception is fully cg shots on bigger-budget films, but even then they ideally have some real-world reference or photographed guide they are following. If you look at what Yedlin does, he is indeed bypassing what happens in DI to a degree, and he's using Nuke the same way the vfx houses do. He is basically doing what the color science departments at places like ILM have been developing for years. I believe he applies a pass on all the shots on his films where he adds his lens emulation and film emulation if vfx have not already integrated it. I imagine he still does this before DI, so DI itself would be comparatively simple. I know he also works with color scientists/developers at places like EFilm or Fotokem, though, to port some of his Nuke tools over to Resolve as OFX or DCTL plugins so that some of it can be done in DI.
  3. This vfx approach has been used for over 15 years. That level of in-camera "photographed" realism was pioneered by ILM on films like War of the Worlds, Pirates 2 and Transformers. What the director is really doing is thinking back to the methodology of his first film and wanting to get back to that immediacy and creative spirit, but with a bigger vfx budget. As has been described, the rig was obviously fundamental to that for him. And I now understand how much they would have saved on shipping lights, cranes, hotel budgets etc. due to needing fewer crew and less gear - also crediting the high ISO there.
  4. I worked at ILM just at the crossover from film to digital, and they had a phenomenal color science department that was able to take all their knowledge of film scanning and recording and apply it to the digital world. All the media is ingested or rendered as linear-light wide gamut, and the show LUT is based on a film print and applies the highlight rolloff - i.e., it "shapes the light". I worked on a show with anamorphic film and flat digital, and the final target look was an anamorphic film print. So things like anamorphic flares, aberration and of course film grain were added, but not as accurately or as tightly pipelined as they would be today.
  5. The Kowas are pretty small, and of course the FX3 was small (and light and the right shape, as we have found out), so they didn't need a smaller lens. It was a 75mm lens, so you could argue that a spherical equivalent would be only a 37mm, but you can't say people don't notice the anamorphic look. I think many of us have that visceral emotional connection to all the films shot with the C-series anamorphics that the director cites from his youth. To be honest, what they did on The Killer was not full-blown anamorphic, just "partial". I watched the Variety video. There is still some spherical bokeh happening, and considering the shallow DOF in The Creator on quite a lot of shots, it would have been difficult and expensive to get the more waterfall-like bokeh happening in vfx. You have to get decent depth maps from live action, which is difficult, and then the edges where the falloff happens can be a real pain. Pointless, basically. Shooting spherical does make vfx much easier, but if you can blend the vfx right into the optical signature of an anamorphic lens, it really sells it and looks great. ILM would have only had to characterize basically one 75mm lens, and they did a great job. Did I like the film? Not really, but credit where due.
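The 75mm-vs-37mm point above is just horizontal FOV arithmetic: a 2x squeeze doubles the horizontal capture, so it matches a spherical lens of half the focal length. A quick sketch, where the gate width is a nominal Super 35 value I'm assuming:

```python
import math

# Horizontal FOV for a lens on a given gate width; an anamorphic squeeze
# effectively widens the captured width. The gate width is an assumed
# nominal Super 35 value, not The Creator's actual capture area.
def hfov_deg(focal_mm, width_mm, squeeze=1.0):
    return 2 * math.degrees(math.atan((width_mm * squeeze) / (2 * focal_mm)))

GATE_W = 24.9  # assumed Super 35 4-perf gate width, mm

print(hfov_deg(75, GATE_W, squeeze=2.0))  # 75mm with a 2x anamorphic
print(hfov_deg(37.5, GATE_W))             # a ~37mm spherical matches it
```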
  6. I gotta admit I have almost zero experience with gimbals, but a friend with an FX6 and I think an RS2 also commented that it was a little hard to balance it. So yeah, the weight, the size, the shape, all of the above...
  7. Nah, I got it the first time you pointed it out, and it all sounds very valid. Appreciate you sharing this, by the way. I was curious why an FX6 wasn't chosen given it's the same sensor but the camera is "more pro". My takeaway was that although it's lightweight, the rig needed something even lighter (the FX3).
  8. Agreed, forgot about the ISO. Obviously a major factor in choosing this sensor.
  9. Yes, the film could have gotten the same 4-perf anamorphic coverage and FOV as the final 2.66:1 crop from the FX3 if they had used an Alexa Mini with the 4:3 option. Really, any older Alexa that shoots 4:3 is designed for traditional 35mm anamorphic lenses made for 4-perf film, but a full-frame sensor that only shoots 16:9, which is so ubiquitous now, will give you the same kind of coverage once you crop the sides. With the asterisk that they did in fact create anamorphic 135 masters for some venues. In the end, the sensor in the FX3 is the only affordable full-frame sensor with a rolling shutter very close to the speed of an Alexa that can also cover the full 4-perf anamorphic height, housed in a compact body. So then it comes down to the weight of the rig. And I get it: if the director is also operating a lot on long shoot days, he would have been extra exhausted at the end of the day with an Alexa Mini rig, considering he also mentioned he felt like a bit of a wimp. I will say that the fact they didn't crop the sides on ingest and instead released a super-wide 32:9 version in some venues blows my mind, because the first thing producers will do to save money on vfx is crop into the image to cut down on the "world building". The Kowa anamorphics are famously compact. Back in the '60s, Japanese new wave directors were doing extensive handheld anamorphic work while Hollywood only had its huge rigs. So I get why the Kowa lens was chosen. Besides that, they have a beautiful look.
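As a side note on the aspect-ratio arithmetic in that post - a rough sketch using nominal full-frame dimensions I'm assuming, not Sony's exact active area: desqueezing the full 16:9 width at 2x lands near 3.56:1 (roughly 32:9), while a 2.66:1 finish only needs part of the sensor width.

```python
# Side-crop arithmetic for a 2x anamorphic on a 16:9 full-frame sensor.
# Dimensions are nominal full frame (36mm wide), not exact Sony specs.
SENSOR_W, SENSOR_H = 36.0, 20.25   # 16:9 capture area in mm

def desqueezed_aspect(width_mm, height_mm, squeeze=2.0):
    return (width_mm * squeeze) / height_mm

print(round(desqueezed_aspect(SENSOR_W, SENSOR_H), 2))  # ~3.56:1 uncropped

# Sensor width actually needed for a 2.66:1 finish at full height:
used_w = SENSOR_H * 2.66 / 2.0
print(round(used_w, 2))  # ~26.93 mm of the 36 mm width
```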
  10. I wonder if you can shoot raw stills at 24p or even 48p indefinitely.
  11. There may be a bit of confusion, because I think kye said the reason the FX3 was used instead of the FX6 was marketing, which I disagreed with - or that's how I interpreted his comment. I did also notice at a premiere that Gareth looked a little cagey talking about the camera to some vlogger, given the niche subject, but he did it. Rightly, he wanted to focus on the film itself. I think overall we are just seeing a phenomenon like when DV cameras and then DSLRs broke through and got a lot of attention. I posited that the choice of camera could be down solely to Gareth's personal preference, and JulioD pretty much confirmed that - along with the additional context that, given the lens, the rig may well have required the FX3 to balance on the gimbal. So that explains why the FX6 might not have worked for them.
  12. That totally checks out with what I suspected. Thanks for the additional context!
  13. Also a general technical comment on using the FX3/FX6 instead of an Alexa: basically all the Sony cameras have the same dynamic range, except the Venice will clean up better in the shadows. So you could re-rate it if you underexposed and pushed it in post. What I saw on Top Gun: Maverick was shots where the Venice clipped the highlights on some sky/sun shots where an Alexa would not, because the Alexa has about 2 stops more above mid grey as rated by Arri. I did not see any clipped highlights in The Creator. They did an excellent job of controlling them, or maybe fixing them in post when needed, considering how much of the image was cg in most cases. I know that on The Matrix Resurrections, the vfx vendor used AI to add highlight detail into RED footage that clipped in explosions, for example.
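The "stops above mid grey" comparison is simple log2 arithmetic - a sketch with invented clip levels, not measured Arri or Sony data:

```python
import math

# Headroom above 18% mid grey in stops: log2(clip_level / 0.18).
# The clip levels below are invented for illustration, not measured data.
def stops_above_grey(clip_linear, mid_grey=0.18):
    return math.log2(clip_linear / mid_grey)

cam_a_clip = 0.18 * 2 ** 6.0  # hypothetical camera clipping 6 stops over grey
cam_b_clip = 0.18 * 2 ** 4.0  # hypothetical camera with 2 stops less headroom

print(stops_above_grey(cam_a_clip) - stops_above_grey(cam_b_clip))  # ~2.0 stops
```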
  14. Don't you work as a location sound guy, though? I mean, of course people are going to talk about it in the industry. For sure. It has bled over into mainstream media to a degree. In the first week of release, once the embargo on discussing the camera was lifted, it was definitely a topic of conversation. But by then it's too late, since these kinds of films live or die on their opening weekend. So there's no way it was part of a coherent marketing strategy for the film, though I agree it wouldn't hurt. I got a bit confused about whether you and kye were talking about marketing for Sony or for the film. I already mentioned that Sony had zero influence on the financing of the film, there's no way Gareth would choose the camera for marketing purposes over a different camera he would rather use, and the studio instructed the crew not to mention the camera until after release. So no, it doesn't seem to be marketing in the traditional sense, just good press after release.