
Will The Creator change how blockbusters get filmed?


ntblowz



I've seen it almost twice: a friend of mine works at a theater that was showing it, so I went to catch it. The script is nothing special and it loses its way at the end. Just another movie.

Let's face it: it was discussed so much for a very simple reason. All the independent filmmakers and kids (including white-haired ones like me) je***ng off with their little cameras in the backyard thought: if he made a blockbuster with a little camera like mine, so can I.

But picking up on the thread title, "Will The Creator change how blockbusters get filmed?", I venture to say no: this was the whim of a very particular director (his résumé proves it), but the film industry doesn't give a shit about which camera a film is shot on, and nothing will change.

As I said 8 pages ago, if there is one reason the film will be remembered, and may change the way these blockbusters are shot, it is the visuals.
The film is a visual treat, but not because of the camera or lenses used. Unlike 99% of other similar films, they chose not to build everything in the computer but to use real locations as much as possible, applying the necessary modifications in post-production (mainly for budget reasons, I suppose). There are scenes shot in real Buddhist temples in Nepal, ruins in Cambodia, hotels, train stations, and Bangkok's airport, with futuristic details added. The futuristic lab you see in the trailers is a real particle accelerator in Thailand. The result is mind-blowing: the locations are real, and fuck if you can tell.
The visuals have such scope that the Star Wars productions look like South American soap operas in comparison.
In short: watching the all-CGI reconstructions of entire landscapes in other, much more expensive and famous films, you get the impression that perfection has been achieved. Then you see this film, and knowing the technique used (I would almost call it augmented reality), you realize there is still a long way to go.

This, in my opinion, is what may change the approach of other productions.

 

 


22 hours ago, kye said:

In addition to the above, one of the earlier wipes they apply is often the show LUT, which reveals a lot about the colour grade too. Things like:

  • cooling shadows / warming highlights
  • highlight rolloff
  • subtractive saturation effects
  • shifts to skin tones
  • etc

Normally the VFX vendor will only have the show LUT, and the "final" shot in the VFX breakdown isn't the same as the final shot in the film: the final VFX shot is exported without the show LUT and is coloured by the colourist in the finishing process. But it will be in the same overall direction, and since colour breakdowns are often not available for those movies, it's good info to have.

I worked at ILM just at the crossover from film to digital, and they had a phenomenal color science department that was able to take all their knowledge of film scanning and recording and apply it to the digital world. All the media is ingested or rendered as linear-light wide gamut, and the show LUT is based on a film print and applies the highlight rolloff, i.e. it "shapes the light". I worked on a show that mixed anamorphic film and flat digital, where the final target look was an anamorphic film print. So things like anamorphic flares, aberration, and of course film grain were added, but not as accurately or as tightly pipelined as they would be today.
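That "shapes the light" idea, mapping linear scene values through a print-style highlight rolloff, can be sketched in a few lines. To be clear, this is a toy Reinhard-style curve invented for illustration, not ILM's actual show LUT (which would be a measured 3D LUT built from film print data):

```python
import numpy as np

def highlight_rolloff(linear):
    """Toy print-style rolloff: linear scene values in, display values out.

    Midtones pass through with gentle compression, while highlights
    approach 1.0 asymptotically instead of clipping. A real show LUT is
    a measured 3D LUT, not a one-line formula; this only illustrates the
    shape of the transform.
    """
    linear = np.asarray(linear, dtype=float)
    return linear / (1.0 + linear)

# 18% grey, diffuse white, and a value ~6 stops over: the hot value
# still lands below 1.0 rather than clipping.
vals = highlight_rolloff([0.18, 1.0, 11.5])
```

The point is that the transform runs on linear-light data; applying the same curve to already display-encoded footage would double-compress the highlights.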


15 hours ago, Davide DB said:

 

I saw it almost twice because I have a friend who works at a theater […] Here, in my opinion, this will be something that will change the approach of other productions.

 

 

This VFX approach has been used for over 15 years; that level of in-camera "photographed" realism was pioneered by ILM on films like War of the Worlds, Pirates 2, and Transformers. What the director is really doing is thinking back to the methodology of his first film and wanting to return to that immediacy and creative spirit, but with a bigger VFX budget. As has been described, for him the rig was obviously fundamental to that. And I now understand how much they would have saved on shipping lights and cranes, on hotel budgets, etc., thanks to needing less crew and gear; the high ISO deserves some credit there too.


46 minutes ago, Llaasseerr said:

I worked at ILM just at the crossover from film to digital […]

Interesting stuff, and it makes me think about the future of VFX integration.

My understanding of the current VFX workflows is this:

  1. Movie is shot
  2. Movie is partly edited, with VFX shots identified
  3. Footage for VFX shots is sent to VFX department with show LUT
  4. VFX department does VFX work, and applies the elements that are required to match the rest of the film (e.g. lens emulations, etc) but does not include elements not present in the real-life footage (e.g. colour grade)
  5. VFX department delivers footage and this is integrated into edit
  6. Picture lock
  7. Colour grading occurs on footage from real-life footage as well as VFX shots

In this workflow, things like colour grading etc get applied by the colourist to the whole film.  Things like film grain would only be applied by the VFX department if it was shot on film (and would only be applied to the VFX elements in the frame, rather than the whole finished VFX frames).

I wonder if in future, VFX will become so prevalent that the colour grade might become integrated within the VFX framework, rather than the VFX being considered a step that occurs prior to the colour grade.

The discussions that I have seen imply that advanced things like lens emulations / film emulations / etc (which are more image science / colour science, rather than colour grading) are beyond the scope and ability of most colourists.


19 hours ago, kye said:

Interesting stuff, and it makes me think about the future of VFX integration. […] The discussions that I have seen imply that advanced things like lens emulations / film emulations / etc are beyond the scope and ability of most colourists.

Right, basically the grade is never baked in for VFX delivery, but yes, the lensing, grain, etc. are baked in if needed. The way to look at it is to not think of VFX shots as any different from non-VFX ("real life") shots. The exception is fully CG shots on bigger-budget films, but even then, ideally they have some real-world reference or photographed guide that they are following.

If you look at what Yedlin does, he is indeed bypassing what happens in the DI to a degree, using Nuke the same way the VFX houses do; he is basically doing what the color science departments at places like ILM have been developing for years. I believe he applies a pass to all the shots on his films where he adds his lens emulation and film emulation, if VFX have not already integrated it. I imagine he still does this before the DI, but then the DI would be comparatively simple. I know he also works with color scientists/developers at places like EFILM or FotoKem to port some of his Nuke tools over to Resolve as OFX or DCTL plugins, so that some of it can be done in the DI.
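A crude sketch of what such a pre-DI emulation pass might look like: an S-shaped tone curve plus midtone-weighted grain applied per shot, leaving the DI to handle only the creative grade on top. To be clear, Yedlin's actual Nuke tools are proprietary and far more sophisticated; every curve and constant below is invented purely for illustration:

```python
import numpy as np

def film_emulation_pass(linear_rgb, grain_strength=0.03, seed=0):
    """Toy per-shot 'look pass' applied before the DI grade.

    Combines a filmic S-curve with midtone-weighted grain. This is NOT
    Yedlin's actual tooling; it only sketches the idea of lens/film
    emulation living as its own pipeline stage ahead of the grade.
    """
    rng = np.random.default_rng(seed)
    x = np.clip(np.asarray(linear_rgb, dtype=float), 0.0, None)
    t = x / (1.0 + x)                  # compress linear values to [0, 1)
    toned = t * t * (3.0 - 2.0 * t)    # smoothstep gives an S-shaped curve
    # Film grain reads most strongly in the midtones; weight it so pure
    # black and clipped white stay clean.
    weight = toned * (1.0 - toned)
    grain = rng.normal(0.0, grain_strength, size=toned.shape) * weight * 4.0
    return np.clip(toned + grain, 0.0, 1.0)
```

Because the pass is deterministic for a given seed, the same "look" can be re-rendered identically per shot, which is roughly what makes a pipelined pass different from an ad hoc grade.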


2 hours ago, Llaasseerr said:

Right, basically the grade is never baked in for vfx delivery […]

I don't really have a clear understanding of how Yedlin manages his pipeline - do you have a link to something I can look at?  In some senses I guess he's got one of the most "processed" image pipelines.

One thing I'm becoming much more aware of is the difference between colour grading and colour science / image science: the colourist works on individual projects, while the colour scientist develops the tools. Obviously some colourists do colour science work as well, so there is definitely some crossover.

After my previous post I was thinking about it more and realised that the colourist often acts as a sort-of central coordinator of visual processing, where they understand the needs of the client / project and then apply a variety of tools as is appropriate.  Some of these tools can be enormously sophisticated, with the well-known examples being the film emulation packages like Dehancer / FilmConvert / FilmBox / etc, but also lesser known things like BorisFX / NeatVideo / etc also getting heavy use.  

I'd suggest that with this increasing level of sophistication these tools are really now a different form of VFX, maybe a 2D VFX?  So in that sense the 3D VFX is mostly done pre-colourist, but then the colourist would apply a whole bunch of other VFX treatments after that.  I don't know if this is making sense, but it seems like the workflows and scope of the VFX / DIT / colourist are going to change in interesting ways in the future.  

The move to the Cloud, and the ability in Resolve to go back and forth between the Edit, Fusion, and Colour pages, certainly supports the idea that it's no longer a one-directional process but a set of interactions and iterations. As a one-person setup, the ability to edit and grade iteratively, going back and forth, always made sense to me, and a strictly linear workflow just seemed restrictive, although on a big production I can see why it would make sense.

Anyway, hope those thoughts were semi-coherent. It's a fascinating space. The film industry can be incredibly slow to innovate and change, especially with regard to how the different departments work with each other, but in some areas there is definitely innovation, and this seems to be one of them.


https://time.com/collection/best-inventions-2023/6324047/sony-fx3/

Quote

The high-end cameras used to film today’s blockbuster movies can cost upwards of $100,000. But director Gareth Edwards (Rogue One: A Star Wars Story) shot his recently released sci-fi epic, The Creator, primarily on Sony’s FX3. The camera can be purchased at electronics stores for under $4,000, and most viewers won’t see a difference in quality. 

https://www.xdcam-user.com/2023/11/sonys-fx3-wins-time-magazine-award/

Quote

For the film the FX3 was connected to an Atomos Ninja V and they recorded ProRes RAW. Of course – the film went through some extensive post production work and there is a lot that AI can now do to clean up an image or to rescale it. But, I think we are now at a stage where almost every cinema camera that is in the market today, from the FX30 to a Venice, could be used to make a feature film and the audience is unlikely to be aware of whether you used a $3K camera or a $75K one. At the same time I do feel that there is a lot to be said for picking the right camera. A studio based film might be quicker and easier to shoot on a Venice. A location based film may benefit from a smaller and lighter package.

Whichever camera you choose, great story telling remains the main goal. Good lenses, lighting (or the use of the available light in a pleasing way) and composition are key elements in telling that story. Your skills as a film maker are more important than the camera you choose to use, but choosing the right camera can make the job easier. It’s a wonderful time to be a film maker.

 


15 minutes ago, IronFilm said:

(From the XDCAM quote)
But, I think we are now at a stage where almost every cinema camera that is in the market today, from the FX30 to a Venice could be used to make a feature film and the audience is unlikely to be aware of whether you used a $3K camera or a $75K one.

Absolutely. The thing is, this has probably been the case for many years now.

It has certainly been the case that at least some of the affordable cinema cameras would be indistinguishable to audiences: the OG BMPCC, BMMCC, and 5D + ML RAW, for example.

To me the milestone of having at least one affordable cinema camera be good enough is a much more significant event than "every" cinema camera being of that standard - who gives a crap how long it takes for the worst models to catch up?


After watching The Creator, I feel the camera plays such a small part in the final product; I was drawn to the story and characters much more, and totally forgot the movie was shot on the tiny FX3.

I suspect the movie could just as well have been shot on thirty 5D3s running ML RAW.


  • 1 month later...

Saw the movie on pay-per-view the other day; a much better film than I was expecting. Think Blade Runner meets Terminator 2 meets Rogue One (also by Edwards/Fraser). Loved the visuals. Mostly filmed on location in remote parts of South Asia, and it shows. A very different approach from most sci-fi blockbusters like Rebel Moon that make extensive use of green screen, virtual production, the volume, etc.

I still think shooting on the FX3 was a buzz/challenge/marketing decision, but hey, they pulled it off. Goes to show that mirrorless sensors are so good these days that it's really everything else in the pipeline (lenses, grading, post, etc.) that determines "cinematic" image quality.


Finally just saw it myself today.

Had to watch it on my 16” MacBook because… well, that's a long and boring story, so I won't bother with it, but I really enjoyed the film.

Big fan of well-done sci-fi: Blade Runner (both), Dune, Rogue One. This can be added to that list.

It could actually have been a bit longer. The opposite of Rebel Moon, which we saw the other day and thought was fairly poor, though not complete trash.

I'm tempted to trade in my S1H and S5ii for a pair of FX3s, but somehow I don't think that would be enough, so I'm sticking with them.


