Everything posted by kye

  1. Interesting stuff, and it makes me think about the future of VFX integration. My understanding of current VFX workflows is this:
     - Movie is shot
     - Movie is partly edited, with VFX shots identified
     - Footage for VFX shots is sent to the VFX department with the show LUT
     - The VFX department does the VFX work and applies the elements required to match the rest of the film (e.g. lens emulations) but does not include elements not present in the real-life footage (e.g. the colour grade)
     - The VFX department delivers footage and it is integrated into the edit
     - Picture lock
     - Colour grading occurs on the real-life footage as well as the VFX shots
     In this workflow, things like the colour grade get applied by the colourist to the whole film. Things like film grain would only be applied by the VFX department if the movie was shot on film (and would only be applied to the VFX elements in the frame, rather than to the whole finished VFX frames). I wonder if, in future, VFX will become so prevalent that the colour grade becomes integrated within the VFX framework, rather than VFX being considered a step that occurs prior to the colour grade. The discussions I have seen imply that advanced things like lens emulations / film emulations / etc (which are more image science / colour science than colour grading) are beyond the scope and ability of most colourists.
  2. What do you mean? It looks flawless to me!!! 😂😂😂 Doing it in post in Resolve looks like it might be a mature enough solution now, but I can't imagine that devices will be good enough to do it real-time for quite a few years. I started a separate thread about this semi-recently: https://www.eoshd.com/comments/topic/78618-motion-blur-in-post-looks-like-it-might-be-feasible-now/
  3. The more you look at something, the more you notice. When I first started doing video, I couldn't tell the difference between 60p and 24p, now I can tell the difference between 30p and 24p! BUT, having said that, my wife is pretty good at telling very subtle differences in skin tones and has spent exactly zero time looking at colour grading etc, so we all start off seeing things differently as well.
  4. Panasonic G9 mk2

    Just un-sharpen more. Unless you like the digital look? TBH, if something hasn't been un-sharpened, even if it was shot in RAW and not processed at all, I find it tries to shout at me over the content of the video. I suspect the culprit is the post-sharpening done by YT / the streaming service. Comparing your upload to YT shows quite a substantial difference.
  5. Yeah, high resolution is great if you want high resolution. Not so good if you are interested in making a meaningful final film.
  6. The image quality side-by-sides aren't even close..
  7. 170Mbps... not terrible. Let's see what the implementation is like, especially the sharpening, NR, and auto-awesome AI. I like the idea that there are two of them, with the "Pro" one having a larger sensor and more resolution etc. I feel like most action products are just the generic model, and somehow they never release the higher-up models.
  8. Yes, I think the appropriate reaction is the 😂 face... we're happy for you, we are thankful for the idea, but we also know ourselves and thus, the tears!
  9. In addition to the above, one of the previous wipes they apply is often the show LUT, which also reveals a bunch about the colour grade too. Things like:
     - cooling shadows / warming highlights
     - highlight rolloff
     - subtractive saturation effects
     - shifts to skin tones
     - etc.
     Normally the VFX department will only have the show LUT, and the "final" shot in the VFX breakdown isn't the same as the final shot in the film, because the final VFX shot will be exported without the show LUT and coloured by the colourist in the final process. It will be in the same overall direction, though, and since colour breakdowns are often not available for those movies, it's good info to have.
  10. I've had some contact with ILM, discussing how they emulate lenses in post; they do very detailed and meticulous work and overall it's very impressive. So much so that I actually find the VFX breakdowns a very instructive tool in understanding the look development process. They show the scene starting with mesh, and then wipe after wipe shows each stage of the VFX process, often with the final wipe being the one that just adds that cinematic magic. In analysing what that final wipe does, I normally see:
     - vignetting
     - un-sharpening overall, and often more in the corners
     - bokeh and defocusing
     - glow / halation / mist
     If you take the average nice-but-digital looking shot from a modern camera and apply similar effects, the flavour of the image quickly goes from video to cinema..
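Effects along those lines are easy to prototype. Below is a minimal toy sketch (my own illustration, in no way ILM's actual pipeline) that applies a vignette, a mild softening, and a glow/halation pass to a float image using NumPy and SciPy; every parameter value is an illustrative guess, not a recommended setting:

```python
# Toy sketch of "cinematic finishing" effects: vignette, softening
# (un-sharpening), and glow. Assumes an image as a float array in [0, 1].
# All parameter defaults are illustrative guesses.
import numpy as np
from scipy.ndimage import gaussian_filter

def cinematic_wipe(img, vignette_strength=0.35, soften=1.2,
                   glow_sigma=8.0, glow_amount=0.15):
    """img: float array of shape (H, W, 3), values in [0, 1]."""
    h, w = img.shape[:2]

    # Vignette: darken toward the corners with a radial falloff.
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    vignette = 1.0 - vignette_strength * np.clip(r / np.sqrt(2), 0, 1) ** 2
    out = img * vignette[..., None]

    # Softening / un-sharpening: a mild Gaussian blur takes the digital edge off.
    out = gaussian_filter(out, sigma=(soften, soften, 0))

    # Glow / halation: add back a heavily blurred copy of the image.
    blurred = gaussian_filter(out, sigma=(glow_sigma, glow_sigma, 0))
    out = np.clip(out + glow_amount * blurred, 0.0, 1.0)
    return out
```

On a flat grey frame this darkens the corners relative to the centre and lifts the overall level slightly from the glow pass; real tools would of course do each of these steps with far more sophistication.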
  11. Yeah, this is one of the other hidden costs of higher resolutions - the inability to keep bitrates reasonable. Obviously RAW scales with the resolution of the image, but so do the bitrates of the ProRes codecs... and here's the issue - the screens don't get bigger! And your vision doesn't get better either! You: I'd like to buy a higher resolution camera please. Shopkeeper: Sure. Here is a stack of larger hard drives, here is a huge new computer, here are the blazing media cards, here is ........
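As a back-of-the-envelope illustration of how intraframe data rates scale with pixel count: the sketch below takes Apple's published ballpark of roughly 220 Mbit/s for ProRes 422 HQ at 1080p30 as a reference and simply scales it by pixel count and frame rate, so treat the outputs as rough estimates rather than spec-sheet figures:

```python
# Rough sketch: intraframe codec data rates scale approximately with
# pixel count and frame rate. Reference assumed: ProRes 422 HQ at
# ~220 Mbit/s for 1920x1080 @ 30 fps (Apple's published ballpark).
BASE_RATE_MBPS = 220
BASE_PIXELS = 1920 * 1080

def approx_rate_mbps(width, height, fps, base_fps=30):
    """Scale the reference rate by pixel count and frame rate."""
    return BASE_RATE_MBPS * (width * height) / BASE_PIXELS * fps / base_fps

for label, (w, h) in {"1080p": (1920, 1080),
                      "UHD 4K": (3840, 2160),
                      "8K": (7680, 4320)}.items():
    rate = approx_rate_mbps(w, h, 24)
    # Mbit/s -> GB per hour of footage: x 3600 s / 8 bits / 1000
    print(f"{label}: ~{rate:.0f} Mbit/s, ~{rate * 3600 / 8 / 1000:.0f} GB/hour")
```

The point of the exercise: going from 1080p to 8K multiplies the pixel count (and thus the approximate data rate) by 16, while the screen you watch it on stays the same size.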
  12. Quick - you've got just over a week to stock up!!! Seriously though, this is good. Anything that helps you focus on your goals more and be less distracted is a good move 🙂 While there are always things I'm curious about, sadly there isn't a huge list of things I'd like to buy - they just don't make the things I really want!
  13. Panasonic G9 mk2

    Just un-sharpen in post. I remember a great thread on a colourist forum some months ago asking about sharpening; the responses were that most people don't sharpen, and many reduce the sharpness of the image to avoid a digital look. I think un-sharpening might be one of those hidden things that camera fondlers would consider heresy but that is widely done by the pros for high-end work. They were talking about cameras that shoot RAW, too, not compressed codecs from consumer cameras.
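For what it's worth, the "un-sharpen" move can be thought of as an unsharp mask run in reverse. A minimal sketch of that idea (my own illustration, not any particular plugin's method):

```python
# Sketch: "un-sharpening" as an unsharp mask with a negative amount.
# A standard unsharp mask is out = img + amount * (img - blur(img));
# amount < 0 pulls the image toward its blurred copy instead, reducing
# the crispy digital edge without fully blurring the frame.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharpen(img, amount=-0.5, sigma=1.5):
    """amount < 0 softens; amount > 0 would sharpen. img: float (H, W, 3) in [0, 1]."""
    blur = gaussian_filter(img, sigma=(sigma, sigma, 0))
    return np.clip(img + amount * (img - blur), 0.0, 1.0)
```

Flat areas are left untouched (the mask is zero where the image equals its blur), which is why this reads as taking the edge off rather than as an overall blur.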
  14. This is the internet... such things are irrelevant when arguing about technical matters!
  15. Yep.. it's the darnedest thing - on a film set they're thimble-sized and in danger of being lost, but if you pick one up and then walk out of the film set it starts to grow... as you walk through crowded tourist hotspots it becomes quite large, perhaps the size of a toddler's head, but as you walk away from the crowds it rapidly inflates to the size of a watermelon, with passers-by stopping and staring at you.. by the time you leave the areas with moderate foot-traffic it has become the size of a dozen adult-themed helium balloons and gathers about the same amount of attention.
  16. Canon EOS R5C

    Hot damn! You mean we can shrink the camera bodies by just lopping bits off? Where's my hacksaw!!
  17. Most sensors do a full readout at the highest bit-depth at 24/25/30p, but at higher frame rates they typically reduce the bit-depth of the readout. Assuming this was to save a bit on data rates and processing, it means they have been making progress - they just spent it all on resolution instead of bit-depth. Yet another hidden cost of the preposterous resolution pissing contest the entire industry is running, with consumers cheering all the way down.
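The trade-off is easy to see with a back-of-the-envelope readout calculation: raw sensor throughput is roughly width × height × bit-depth × frame rate. The sensor dimensions and bit-depths below are hypothetical, not any specific camera's spec:

```python
# Back-of-the-envelope sketch of raw sensor readout throughput.
# Figures used in the example are hypothetical, not a real camera's spec.
def readout_gbps(width, height, bit_depth, fps):
    """Raw readout data rate in gigabits per second."""
    return width * height * bit_depth * fps / 1e9

# A hypothetical 8K-ish sensor: full 14-bit at 24p vs reduced 12-bit at 120p.
slow = readout_gbps(8192, 4320, 14, 24)
fast = readout_gbps(8192, 4320, 12, 120)
print(f"24p/14-bit: {slow:.1f} Gbit/s, 120p/12-bit: {fast:.1f} Gbit/s")
```

Even with the bit-depth dropped, the high-frame-rate readout in this example is still several times the data rate of the full-depth 24p readout, which is consistent with the idea that the processing budget went to resolution and speed rather than bit-depth.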
  18. Availableism - nice! It reminds me of approaches like Dogme95, which integrate that sort of element and go a lot further with it as well. Sadly, I'm not surprised about the grant decision. I've been to enough film festivals to know that the thinking is often enormously traditional / blinkered, and also motivated by who-you-know and all that crap too. One student film festival I went to had a film in the documentary category that was old people talking about their sex lives - it was very entertaining, the old folks were all very cute, and it definitely deserved to win an award for concept / direction / producing, which it did. However, it was shot terribly: there were booms in shot on half-a-dozen occasions, and the camera wasn't held steady and was bumped significantly and obviously a couple of times - yet it also won best cinematography and best sound, which was completely ridiculous. One of the things that dominates the overall architecture of how traditional films are made is that many of the people involved are not critical thinkers; they learned how to perform their role but they don't understand the other roles, the overall process, or even how to make a film. There are often territorial disputes as people defend their patch, etc. The film-making process is a factory production line, and most workers in a factory don't understand that it's possible to redesign a factory and make it work better, let alone be OK with it when someone suggests it. Yeah, that's a big drawcard of the FX3 that makes it stand out. I don't understand why there aren't more cameras with the native ISOs further apart. Most cameras have lower base ISOs, and a much smaller interval between the lower native ISO and the higher native ISO, which combine to give the FX3 a huge advantage in low light.
  19. Never mind that colourists working on high-end material still regard noise reduction as a critical tool for every shoot, including the ones where all the material was exposed properly in-camera and recorded at native ISO! It's fascinating to download cinema camera footage for the first time and see that it has more noise in it than a mirrorless low-light test. I got a bit of a shock when I saw that for the first time.
  20. I agree, but would go further and say that not only will AI radically reduce the cost to make a film you could make now, it will also make films possible that really aren't possible (or aren't practically possible) now. So in that sense it doesn't just reduce costs, it expands the possibilities to be practically infinite.
  21. I agree - it wasn't that different. I've spoken at length with a few people in PMs about this and I find it a fascinating subject, but I see modern film-making as having three pivotal points. There are probably others, but these are the ones I'm aware of. 1) The French New Wave, which was (how I see it anyway) an exploration of new possibilities of 16mm film that weren't possible with 35mm film. In many ways they took the traditional "coverage" of Hollywood and radically expanded it to include almost all the techniques used in modern film-making. 2) The DSLR revolution. This is pretty much what this forum is for, and what we talk about. The tricky thing is that it didn't deliver what people thought it would. They thought it would mean anyone could film a movie with a camera and no money at all, and that there would be another revolution in cinema, like the FNW part 2, but this didn't happen. What we got instead, and this is my impression, was TikTok, YouTube, influencers, live streaming / Twitch, etc, which are all new forms of film-making (however creative or worthwhile you might think they are) because they all involve video recordings made into final products and distributed to an audience. The promise was real though, and people like Noam Kroll have mapped out a path for up-ending a lot of the traditional processes.
One particular process that he's put forward, which I really like and WOULD change film-making, is:
- Come up with a concept for the film, restricted to things you already have access to (cast, locations, etc)
- Cast the film
- Work out half-a-dozen or so sections with major plot points for the film, even in very high-level terms
- Workshop the characters with the actors, and develop a concept for the first section
- Shoot the first section, involving lots of improvisation from the cast, potentially without even writing a script beforehand
- Edit the first section, see what worked and what didn't, and concentrate on the performances
- Develop/update the concept for the second section
- Shoot and edit that one, once again with improvisation and a concentration on performances
IIRC Noam shot a film like this and ended up giving the two main actors writing credits too. He mentioned that he made a lot of adjustments to later sections based on what worked and what didn't in earlier ones, and I have a vague memory that, during some improvisational parts the actors did on location, he even ended up changing some major plot points to replace them with more interesting ones inspired by the process. This is the kind of thing that would change the future of film-making. Not having a new camera body. 3) AI. The third pivotal point I foresee is AI. Anyone who has watched any (decent) anime will know that writers in anime have been enjoying the freedom to create worlds without any practical limitation for over a century now. Up until the last few decades no-one could do that with realistic motion pictures, and right now it's mostly limited to those with huge budgets or with incredible skill and huge amounts of free time. AI will change that.
The ability to shoot anything you like, however flawed, and just have AI add and remove and bend and change what you shot into whatever your imagination can come up with will be groundbreaking - just like the freedom that anime artists have, until recently, enjoyed exclusively. When AI can create Inception from your iPhone footage, things will be unleashed. However, much as we saw with the DSLR revolution, other forms of film-making will be invented too, like deep-fakes, alternate histories, and who-knows-what else. No. I watch a lot of YT. Far more than streaming sites. I do try to include lots of film-making stuff, as well as a great many other interesting and niche things. The world is a fascinating place!!
  22. Sure - bigger is better. Easy. The real test would be how they evaluated a large, complicated-looking but crap setup vs a small one with a much better image. Also, in this imaginary test, you can't pick something old-looking vs something that looks much more modern, as that's another giveaway. Size is so correlated with image quality that it's difficult to think of counter-examples. Even the big but really old cinema / ENG cameras are either still actually really good (and 1080p), or they are horrifically dated and wouldn't fool many (old Betamax cameras, for example). I'm not saying there are no counter-examples, but they're a very, very small percentage of the possible comparisons. Also, people pretty much know that the longer the lens, the more zoomed in it is.
  23. Absolutely. I do all sorts of side-by-side tests and nit-pick minor differences in colour grading techniques too, but it's important to always remember to make final judgements on the final result, rather than on some artificial situation. This is why I encourage people to grade the footage, export it, and upload it to whatever final platform they'll use for delivery. It doesn't matter how the image looks in your NLE; it's how the viewers will see it that matters (and if it's a paid gig then obviously the producer/director/client need to be happy too). The only reasons I do these nit-picky tests are:
     - to keep familiar with shooting in between my trips (which is what I shoot)
     - to eke out the best results I can when shooting in difficult conditions with modest cameras
     - to eke out the nicest colour grading I can (spectacular colour is the culmination of a bunch of small tweaks, not a few big ones)
     - to test out new techniques and keep learning and improving my skill levels
     - to understand and optimise the many trade-offs that we're forced to make
     I also do all these things with my cinematography, editing, sound design, and delivery. The whole pipeline matters.