
kye

Members · 7,925 posts

Everything posted by kye

  1. You know it's a good thread when people start pulling out the Latin phrases!!
  2. Great post. As a fellow computer science person, I agree with your analysis, especially that it will get better and better, and will get so good that we will learn more about the human condition because of how good it gets. This is also not new: in the early days of computer graphics, someone wrote a simulation of how birds fly in formation, and it was so accurate that biologists and animal behaviour scientists studied the algorithms - this is how the 'rules' of birds flying in formation were initially discovered.

I just wanted to add to the above quote by saying that studios have already made large strides in this direction with the comic-book genre films, whose characters are the stars rather than the actors that play them. This is an extension of things like the James Bond films - films where the character was constant and the actor was replaceable. VFX films are the latest iteration, where the motion capture actors, voice actors and animators are far less well known. The next step will be AI replacing those creatives to make CGI characters, and then AI making realistic-looking characters.

For those reading who aren't aware of the potential success of completely virtual characters, and of how people can bond with a virtual person, I direct your attention to Hatsune Miku, a virtual pop star: Link: https://en.wikipedia.org/wiki/Hatsune_Miku

She was created in 2007, which in the software world is an incredibly long time ago, and in the pop star world is probably even longer! But did it work? That's a figure from over a decade ago and equates to just over USD$70,000,000, which is almost USD$100M in today's money. I couldn't find any more recent reliable estimates, but she is clearly a successful commercial brand when you review the below. What does this mean in reality, though? It's not like she topped the charts.

Here is a concert from 2016 - she is rear-projected onto a pane of glass mounted on the stage. She was also announced as a performer at the 2020 Coachella, which was cancelled due to COVID.

So, while Japan might be more suited to CGI characters than the West is, that is changing - take the Replika story for example. Replika is a female virtual AI companion who messages and sends pics to subscribers, including flirty, suggestive ones. The owners of Replika decided that the flirty stuff should be a separate paid feature and turned it off for the free version - and the users reacted strongly. So strongly, in fact, that it's now an active field of research for psychologists trying to figure out how to understand, manage and regulate these things. It's one thing for tech giants to 'curate' your online interactions, but it's another when the tech giants literally control your girlfriend. Background: https://theconversation.com/i-tried-the-replika-ai-companion-and-can-see-why-users-are-falling-hard-the-app-raises-serious-ethical-questions-200257

There are other things to take into consideration as well. Fans are very interested in knowing as much as possible about their idols, but idols are real people with human psychological needs and limitations - virtual idols will not be. Virtual idols that share their entire lives with their fans will be even more relatable than human stars, who need privacy, get frustrated, yell at paparazzi, etc. These virtual idols will be able to be PR-perfect in all the right ways (i.e. just human enough to be relatable but not so human that they accidentally offend people). There is already a huge market for personalised messages from stars; virtual idols will be able to create these in virtually infinite amounts. Virtual stars will be able to perform at simultaneous concerts, make public appearances wherever and whenever is optimal, etc.

And if you still need another example of how we underestimate technology...
"Computers in the future may weigh less than 1.5 tons." - Popular Mechanics magazine, 1949.
  3. Makes sense. I did think it would likely be quite a different rendering than the modern lenses you've been using, so I was wondering how that would go. I think vintage lenses suit a certain style of shooting where things are very controlled - you can manage the flaring, halation and other vintage lens characteristics because you can adjust lighting, matte boxes, etc. Unfortunately, for high-paced run-n-gun shooting it's nice to keep these things to a minimum so you're not so limited by them. Still, it's always an interesting experiment, and it sounds like you're wiser for it.
  4. Deep Fake Love... a reality show that basically uses AI Deep Fake tech to torture the contestants... https://www.euronews.com/culture/2023/07/25/the-cruellest-show-on-tv-deep-fake-love-goes-too-far-with-ai I just checked Netflix AU and I have 8 episodes ready to watch.
  5. kye

    DJI Pocket 3?

    I'm keen to see some frame grabs when you get them into post!
  6. The main quality issue will be that you're uploading SD files, which YT thinks should be delivered at an ultra-low bitrate. I suggest exporting at 1080p, with a healthy bitrate. The traditional wisdom is to upload at 50+Mbps, but as your content is SD, a lower bitrate would probably be visually similar. In terms of the colour space, it's a little trickier, and the TLDR is to try and export using Rec709-A if that's available to you, otherwise you could try Rec709 and the different gammas (e.g. Rec709/Gamma2.4 or Rec709/Gamma2.2 etc). Unfortunately, it is very common for platforms to either not properly support colour management or have bugs. I suggest a bit of googling to get specific instructions on FCPX and YT, there should be some good advice out there.
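To make the export advice above concrete, here's a minimal sketch of what an equivalent ffmpeg export would look like (assuming you wanted to do the upscale outside FCPX). The filenames and the 20 Mbps figure are illustrative choices of mine, not anything FCPX produces, and actually running it assumes ffmpeg is installed:

```python
# Sketch: build an ffmpeg command that upscales an SD master to 1080p
# with a healthy bitrate and explicit Rec.709 tagging before uploading.
# Filenames and the 20M bitrate are hypothetical examples.
import shlex

src = "my_sd_master.mov"   # hypothetical SD source file
out = "upload_1080p.mp4"

cmd = [
    "ffmpeg", "-i", src,
    # Upscale to 1080p so YT doesn't serve the video at its SD bitrate tier
    "-vf", "scale=1920:1080:flags=lanczos",
    # Healthy-but-modest bitrate; SD detail doesn't need the full 50M
    "-c:v", "libx264", "-b:v", "20M",
    # Tag the output as Rec.709 so players don't have to guess the colour space
    "-colorspace", "bt709", "-color_primaries", "bt709", "-color_trc", "bt709",
    "-c:a", "aac", "-b:a", "320k",
    out,
]
print(shlex.join(cmd))
# import subprocess; subprocess.run(cmd, check=True)  # uncomment to actually run
```

The explicit colour tags are the command-line analogue of picking the right Rec709 variant at export - they don't convert anything, they just label the file so playback isn't left to guesswork.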
  7. I don't think we should extrapolate that to decide what is best for the prosumer market. If we compare RAW with Prores (especially Prores 4:4:4, which is sadly completely lacking from the prosumer market), then we see that:
     - Prores is compressed, but so are most forms of RAW
     - RAW has to be de-bayered, and RAW is also frequently compressed in a lossy way as the bitrates are almost unmanageable otherwise - especially considering that most implementations of RAW are at the sensor's full resolution, or are a brutal crop into the sensor that completely revises your whole lens package
     - RAW is ALL-I, but so is Prores
     - Prores is constant-bitrate per pixel, but so is RAW
     - RAW is "professional" quality, but so is Prores
     The comparison even extends into licensing, where there has been frequent speculation that licensing fees are a barrier to manufacturers including Prores, and with RAW the patents are also a barrier. The more I think about this, the more I think cameras should just implement the full range of Prores codecs (LT, 422, HQ, and 444) and forget about RAW with all the BS that seems to go along with it... the image quality, bit-depths, bit-rates, performance in post, support across platforms, and licensing all seem to be similar to RAW or in Prores' favour.
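The "almost unmanageable bitrates" point is easy to check with back-of-envelope arithmetic. The Prores figures below are Apple's approximate targets at 1920x1080/29.97; the 6K sensor dimensions, 12-bit depth and 24fps are assumptions I've picked for illustration:

```python
# Back-of-envelope: Prores bitrates at 1080p vs uncompressed RAW from a
# hypothetical 6K sensor. Prores numbers are Apple's approximate targets
# at 1080p29.97; the 6K/12-bit/24fps RAW figures are illustrative assumptions.
prores_mbps = {"LT": 102, "422": 147, "HQ": 220, "4444": 330}

def raw_mbps(width, height, bit_depth, fps):
    """Uncompressed bayer RAW: one sample per photosite, no chroma subsampling."""
    return width * height * bit_depth * fps / 1e6

raw_6k = raw_mbps(6144, 3456, 12, 24)   # roughly 6,100 Mbps
print(f"Uncompressed 6K 12-bit RAW: {raw_6k:.0f} Mbps")
for name, mbps in prores_mbps.items():
    print(f"Prores {name} @ 1080p: {mbps} Mbps ({raw_6k / mbps:.0f}x smaller)")
```

Even the heaviest 1080p Prores flavour is more than an order of magnitude smaller than uncompressed full-sensor RAW, which is why lossy RAW compression is effectively mandatory.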
  8. kye

    THE Big Question

    Interesting.. I thought this was a good explainer: TLDR; Nolan only mixes for the best theatres, and doesn't care about shittier ones. I guess that arrogance has run its course, since you saw it on IMAX and still couldn't hear it!
  9. kye

    DJI Pocket 3?

I think the biggest weakness of these budget cameras is the lack of a good codec / bitrate. In practice it puts a ceiling on what you can do with the images in post; however, having multiple cameras might go some way to mitigating that. For example, if they created a setup with multiple focal lengths like a phone - a super-wide and a 'normal' (and maybe even a tele) - then this would mitigate the issue of having to crop heavily into a super-wide.

Also, if the sensor had >4K resolution, then a 2x digital crop (for example) would still capture more than 1080p and could be downsampled rather than upscaled. If that mode were also paired with processing that had dedicated levels / algorithms for it, and was saved as a 10-bit file, it would be quite flexible in post. (All else being equal, a downsampled 1080p file with a small amount of sharpening to compensate is practically indistinguishable from a 4K file.) This would also allow the two (or three) focal lengths to be spread further apart, increasing flexibility.

The lack of something wider than 24mm equivalent is a major limitation for the form-factor though, as something wider than 20mm not only enables vlogging, but also covers a great number of situations where a tiny pocketable camera is most desirable (viewing the incredible vista from lookouts, getting shots in narrow European streets that are wide enough to show the buildings as well as the street, getting shots inside venues like cafes and restaurants without having to put your camera so far away from you that people wonder WTF you're doing or think you've forgotten it, etc). There's a reason Apple puts an ultra-wide and a normal lens on the two-camera iPhones, rather than a normal/tele combo: the ultra-wide is more useful to the average person in normal life and on holiday.
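The 2x-crop arithmetic can be sketched in a few lines. The sensor width here is a hypothetical example - any sensor wider than 2 x 1920 photosites gives the same result:

```python
# The 2x digital crop arithmetic, with an assumed 5.7K-wide 16:9 sensor.
# The exact sensor dimensions are hypothetical; anything over 3840 wide works.
sensor_w, sensor_h = 5760, 3240

crop = 2                                               # 2x digital punch-in
crop_w, crop_h = sensor_w // crop, sensor_h // crop    # 2880 x 1620 region

# The cropped region still exceeds 1080p, so the camera can downsample
# (oversample) to 1920x1080 instead of upscaling
assert crop_w > 1920 and crop_h > 1080
oversample = crop_w / 1920
print(f"2x crop gives {crop_w}x{crop_h}: a {oversample:.1f}x oversampled 1080p")
```

So a ~5.7K sensor gives a 1.5x-oversampled 1080p even at a 2x punch-in, which is the headroom that makes the two virtual focal lengths usable.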
  10. kye

    Panasonic GH6

    I never thought about it that way but I think you're right...... Sony delivers enormously expensive firmware updates, they just come with a free camera!
  11. It's likely that most footage you've seen from the Helios shows it at its peak swirl settings, which is what it is famous for, but that requires the right combination of subject/focal distance and background distance/contrast. Lots of people buy a Helios and are disappointed because it's nowhere near as swirly as all the pictures. Here's a more general review of one of the models, focusing beyond the swirl and including a bunch of normal compositions.

Also, it's worth pointing out that while the Helios does swirl, so do lots of other lenses from this era, and they do so almost as much. Once again, the internet glorifies the one that is "the most" of something, and the ones that are a close second get no attention whatsoever (link with timestamp):

Also, and this is quite controversial I know(!), but it is possible to close the aperture of these lenses(!!), and this tends to increase contrast and sharpness and reduce flaring etc - all the things that happen when you do this to other lenses(!!!). Here's a range of compositions comparing the lens wide open and then stopped down (linked to timestamp):

Plus, all the swirls happen further out from the centre of the frame, so if you use it on a crop sensor then you're effectively cropping out the worst of those optical distortions. Plus, lots of well-known and highly prized cine lenses also swirl quite a bit, yet the films shot on them aren't a swirly mess. Here's a controlled test of a bunch of them - just skip through it looking at the string of lights in the background:

And finally, if you crop to a wide aspect ratio, the "swirl" will only be seen at the very sides of the image, which means the swirls are limited to being quite close to vertical - very similar to anamorphic bokeh!
Here is the Master Anamorphic 50/1.9 - potentially the most optically correct anamorphic lens ever made - and yet the bokeh is oddly shaped, with cat-eye rendering, and is also differently shaped towards the edges vs the middle:

Compare that with a swirly spherical lens like the Super Baltar 50/2.3: The character of the bokeh on the swirly lens changes from anamorphic-like at the edges to normal in the middle, which some might find distracting, but which you might also find less distracting because it limits the distractions to the edges of frame rather than putting them directly behind the subject.

The Zeiss CP.2 50/2.1 has very similar rendering to the Baltar above, and yet is known as a relatively neutral lens and is a workhorse of Hollywood:

What I find far more distracting in bokeh is the edge of the shape, rather than the geometry of the shape. Take this example of perfectly round bokeh balls and see how distracting the ones on the right are... and don't even get me started on "bubble bokeh"
  12. Which one(s) do you have your eye on?
  13. Looks like someone found the contrast knob! If only they made a video about contrast instead of making the internet's 100,000th video on the Helios and getting basically everything about it wrong.....
  14. You better not swap... I'm still waiting on your report on the 40-80mm lens!
  15. I'd forgotten that Matteo had a ZV-E1... here are the other videos he has done with it. And this is what he shot with the ZV1 (NOT the ZV-E1) - just as a comparison
  16. With creative ideas like this so abundant, it completely fails me why the studios only want to make endless movies about a guy with a flying mammal fetish and the tin man from The Wizard of Oz..... 🙂
  17. One thing I have learned as I get older and gradually learn more about psychology and neuroscience is that we are all very different from each other. In most cases, very very different from each other. We don't notice it because people work hard to fit in and behave like everyone around them, and because we tend to spend most of our time around people who have a lot more in common with us than average. As an office worker in a city I don't spend much time with manual labourers from the country, and far less time with uneducated villagers. When I do spend time around people who don't have as much in common with me - if I meet the husbands of my wife's friends, for example - I tend to find a topic we are both interested in and just stick with that. There are lots of people I am friends with who would drive me crazy if I had to live with them - they're that different to me, and these are my friends!

It is widely recognised in productivity research that we all have a very limited amount of self-discipline per day, and constantly using it all is a big predictor of burn-out. So I'd say you think it's a choice because it's a choice that isn't far from your natural behaviour, preferences, habits and values, and so for you it isn't a very difficult challenge to overcome with willpower - but that's not the case for everyone, or even for a great many people. There are likely things that I think are easy that you would find completely impossible, and vice versa. I mean heck, it's pretty obvious from these forums that a lot of the time we can't even stretch our thinking to how differently other people like to use their cameras!
  18. If you want to visit the past, I'll buy a G7 and re-sell it to you for USD$999!! 🙂
  19. I recommend the movie "The Congress" from 2013.
  20. Looks good - muted in a very appropriate way. How did you handle the post workflow, especially colour grading? Fashion is perhaps one of the genres that require the most "accurate" colour so that product colours reflect the actual products.
  21. Interestingly enough, they've used Zcam on previous MI films... https://ymcinema.com/2020/10/09/z-cam-e1-crash-cam-spotted-in-mission-impossible-7/ Agreed. My impression is that there are three types of productions:

High budget feature films / flagship TV series: These have the budget for high resolution RAW capture, high-end cameras and fancy lenses, and significant professional colour grading; to promote the film they get lots of media attention, interviews, etc. The process is overseen by professionals who know how to extract every ounce of quality.

Low-medium budget feature films / most TV shows: These don't have the budget for extravagances and shoot with only the level of equipment necessary for professional results - lower resolutions and Prores, solid but less remarkable cameras and lenses, a minimal colour grading budget - and they get far less media attention (and basically none for technical matters). The process is overseen by professionals who know how to do the basics, so the result is solid but delivered within budget.

Amateur features / short films / cat videos: These devote more person-hours to a short film than major Hollywood feature films, but spend that time obsessing over camera specifications and lens sharpness tests, poring over the latest $100M feature film's post-workflow and trying to implement every tool and technique, insisting on only the best. Most of the time their lack of basic understanding means the result is worse than even very low budget professional productions.

A quick search revealed that IMDB says Game of Thrones was shot on Prores and mastered in 2K....
https://www.imdb.com/title/tt0944947/technical/ and more searching reveals that the first three seasons were shot in 1080p, then 3.2K from Season 4 onwards: https://thedigitalbits.com/item/game-of-thrones-complete-series-4k-uhd I wasn't able to find an original source for the above, but it appears to be the consensus. The fact it was shot in 10-bit 4:4:4 suggests it was compressed, as RAW isn't 10-bit. The Prores 4444 XQ spec is 396Mbps, so this is roughly equivalent. Season 1 used the HDCAM SR format: https://en.wikipedia.org/wiki/HDCAM So yeah, Prores is good enough.
  22. The discussion seems to be about reproducing colours that are lifelike, or accurate, or real. I posted an interview where a world renowned expert discusses the subject. I thought that it would be of interest, considering that colour grading is one of the weakest areas of knowledge online. There is no "point", only information.
  23. kye

    Canon Highlights?

I haven't reviewed the current batch of cameras, so not really. Realistically, unless you have the cameras yourself, you need to find a source online where someone has done latitude tests. CineD.com does good ones, so that's a good place to start. For example, here are a few cameras underexposed and pushed back: But when reviewing these you also have to compare how far over each camera can go, as different cameras put middle grey in different places. You'll also note that the Sigma FP uses different notation ("ETTR" vs "stops under") because they didn't know where to put middle grey and so compared against a different scale. Like many things in cameras, it can't be reduced to a single number, so you have to do the analysis and comparison yourself - half the benefit of the information is the understanding you gain in figuring out how to compare them.
  24. Peter Doyle (colourist on Harry Potter, Lord of the Rings, etc) speaks about how closely we can reproduce the colours in the real world. Spoiler: no. (linked to the relevant timestamp)