Everything posted by kye
-
I'm seeing more and more FX30 videos on YT and they all seem to have the same overall 'look' as the other Sony cameras like the FX3 and A7S3 (and FX6 to some extent). The fact that YT people (who often can't colour grade to save their lives) are getting consistent results across a variety of conditions really speaks to the FX30 having similar capabilities to the rest of their 'cinema' line. That could be a good thing or a bad thing depending on what you think about the Sony look, but I firmly believe that how easy a camera is to use (both the camera itself and how easy it is to colour grade) plays a huge role in how good a camera is. If something is the best camera in the world, but only one person on earth is talented enough to grade it to get better results than other cameras, then in practical terms that camera isn't better than the others.
-
I guess the way I was looking at it was that there's the top end who can use whatever they want and will use the latest and greatest, then the mid-tier who hire, then the tier below who own, then those who can just afford to own an ARRI (there's no real low tier when it comes to owning an ARRI!), and functioning cameras essentially trickle down through the tiers. The fact that the lowest second-hand price was still high sort of indicates that there's still that much demand, even in the lowest tiers. Considering that ARRI have been making Alexas for over a decade now, there will be huge numbers out there in the wild, so if they make a year's worth of sales of the 35, it's unlikely to be more than 10-20% of the Alexas already out there (and assuming there aren't many LFs compared to normal Alexas), meaning that 80%+ of ARRIs will be S35 and still in high demand. To me, if I was in the market for a cinema camera and an Alexa Classic was in my budget and I wanted the name-brand recognition and the colour science etc, the fact that there's a new FF Alexa wouldn't really affect me that much. Yes, it would impact the top-end folks, but at some point in the 'tiers' people will still value that an Alexa Classic is an ARRI, with all the pedigree and track record the camera has, and none of that goes away just because there's a newer model. I think this is significant considering that I suspect the majority of ARRI owners (non-rental houses) probably don't care that much about the new model. Unless you're somehow required to shoot 4K, which very few are, an Alexa Classic is still a very desirable camera that meets probably all the requirements you'd have.
-
Interesting. If it was purely a supply-and-demand thing where people just needed more cameras, then you wouldn't expect any price changes at all (sensor size doesn't create new production houses!), so any drop must mean that they're less desirable, or that people were waiting to update their equipment. I guess if rental houses maintain a fleet of X cameras and they all decide to get more FF models at once (a sensible strategy, I'd imagine) then there would be a temporary glut in the market as they all offload their worst-condition models. That would suggest that prices will bounce back up again once the demand has absorbed the temporary supply blip. I suspect it's a combination of both, so prices will bounce back but probably not to the same level they were at previously. Good news for those looking for a bargain!
-
The example that stands out to me for shallow DoF was The Handmaid's Tale, which used shallow DoF to show the isolation of the handmaids from their surroundings and society in general. Such subtle use is far more unsettling than the average horror film that is little more than the output from a random motion / sound generator with credits at each end.
-
This would have been my guess - that equipment and tech 'trickles down' to the lower-budget productions over time. In keeping with that thought, OG Alexas and Amiras are still many, many thousands of dollars second-hand, so are still of considerable value and in demand. I would assume, then, that all cinema cameras ever made (except the ones that have died) are still in use somewhere, which would mean that there's a spectacular number of S35 cinema cameras out there in use. I saw in a welding YT channel video the other day that one of the cameras was an Amira (when one camera got another in shot), and it was fully rigged out as you'd expect. The channel was around 1M subs, and was definitely a commercial operation judging by how it was branded and the content. I'd imagine that there are a huge number of channels at that scale, so easily enough of a user base to absorb all of yesterday's cinema cameras.
-
The FX30, which, as a crop sensor, is mathematically incapable of shallow DoF, cinematic images, or going viral on TikTok. I know those things are true - I read them on the internet.
-
This would be my main concern. Obviously it offers lower-resolution downsamples, but they're limited and, as you say, 435Mbps is still pretty hefty. I find it strange that the entire industry seems to have forgotten that people's TVs don't scale with resolution, and that image quality is proportional to bitrate rather than resolution, but processing power does scale with resolution - regardless of bitrate. Does it offer 1080p ProRes HQ? That's just under 200Mbps.
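To put rough numbers on the bitrate-vs-resolution point, here's a quick back-of-the-envelope sketch in Python. The bitrates and frame rate are approximations based on the figures mentioned above, not official specs:

```python
# Rough comparison of decode load vs bit budget for two assumed formats.
# Bitrates and frame rate are approximate and only for illustration.

FORMATS = {
    # name: (width, height, fps, bitrate in Mbps)
    "4K 25p at ~435 Mbps": (3840, 2160, 25, 435),
    "1080p ProRes HQ 25p at ~184 Mbps": (1920, 1080, 25, 184),
}

for name, (w, h, fps, mbps) in FORMATS.items():
    pixels_per_sec = w * h * fps                            # proxy for decode/processing load
    bits_per_pixel = (mbps * 1_000_000) / pixels_per_sec    # proxy for quality headroom
    print(f"{name}: {pixels_per_sec / 1e6:,.0f} Mpx/s to decode, "
          f"{bits_per_pixel:.2f} bits per pixel")
```

On those assumed numbers, the 1080p file has roughly a quarter of the pixels per second to push around while actually getting a higher bit budget per pixel - which is the point about quality tracking bitrate while processing cost tracks resolution.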
-
I recently saw a video with a sponsorship that lets you license normal music - not sure if it's the same one but it seems like that's a thing now.
-
This video is a colourist putting an FX30 shot through a relatively strong grade, so it might be useful for those curious about how much the footage holds up in post with qualifiers etc. TL;DR: it held up, like almost all modern cameras that shoot 10-bit log. He was surprised that the WB and skin tones were dead on, which apparently other Sony cameras are bad at.
-
It's better, but it's still really scraping the bottom of the barrel... In that example it's mostly that there are slightly fewer colour shifts.
-
Yeah, and people often forget that on productions where everything and everyone costs money, equipment is likely rented and the cost of the camera is virtually inconsequential compared to the costs (and delays) associated with having issues on set or in post. It's pretty hard to justify anything that people aren't already familiar with, simply because it's another thing that can cause delays or go wrong and therefore cost the production money.
-
Sadly, these forums are one of the most enlightened places online. The current state of film-making discussion is at an all-time low due to the availability of equipment and the popularity of the outputs. Is this a good thing or a bad thing? Well, that's up to everyone to decide for themselves, but the average discussion online will very confidently tell you that:
- FF is better
- there are people who say sensor size doesn't matter, but they're basically triggered all the time and sound crazy
- you choose the colour science you want by buying the best camera (and half the people can't even spell LUT, let alone understand that you can colour grade to match cameras)
- the only measure of a camera is the resolution of the sensor
- the only measure of a lens is the size of the aperture
- the only measure of a lens is how sharp it is at its widest aperture
- the best lens if you're not on FF is the Sigma 18-35
- BM are the best cameras available
- BM owners don't know anything about anything and should be banned from owning cameras
- etc
If you think Hollywood is immune to these trends, then think again. It took a while for them to get FF fever, but they got it in the end, and when they talk about sensor size and lenses and shallow DoF it's just word-salad coming out of their mouths and they know about as little as the average social media bro with a FF Sony...
-
I often watch YT in various resolutions, from 4K down to 480p, and I've noticed that the 1080p quality can be hugely variable. Some content looks great and other stuff looks absolutely atrocious. The only thing that seems to predict it is the quality of the camera; however, we all know that better cameras are typically used by people who pay more attention to things like uploading at an increased bitrate etc, so it may not be the camera but something else in the image pipeline. If they went 1080p-limited it would be great to see a bump in the associated bitrate, although I suspect that wouldn't happen. For some reason everyone seems to allocate a constant bitrate per pixel, completely ignoring the fact that screen size is an independent variable.
-
That would be brilliant!! The vast majority of people wouldn't pay to watch 4K content, YT would take their 4K-preference out of the algorithm, without 4K watching ability most people would stop insisting on 4K content, creators wouldn't be ruthlessly pummelled for not shooting 4K, and the entire platform could switch focus from irrelevant pedantry to actual real content.
-
For me, if we're talking 'cinema' here, the biggest difference is in lenses and their coverage. If you're buying lenses then the S35 lenses from Sony are smaller/cheaper/lighter, and if you're renting then the zillions of S35 cine lenses out there must be cheaper to rent than those with FF coverage. I understand that the comments from others here also discuss the crop modes of FF cameras, which is also a logical comparison. Words in the English language are constantly evolving, being co-opted for good and bad, etc, but I'm wondering if a new definition is evolving? If you take an enormous step back, then you see that there are roughly the following categories of cameras:
- Smartphones
- Point-and-shoot (integrated lenses)
- Consumer video cameras (hand-held, integrated lens)
- Professional video cameras (ENG style)
- Hybrid mirrorless (often photo-first designs, typically quite ergonomic, no support for rigs)
- Cinema cameras (video-first or video-only, integrated fans, mounting points, etc)
Looking at things from this perspective, which is really the perspective of the 'content creator' who shoots with a variety of tools, the FX30 is more akin to a cinema camera. If you're someone who exists solely in the world of professional sets, the fact that Sony call something a "cinema" camera won't fool you at all and is a pretty inconsequential label to add to it. You could make the argument that cameras like the R5 and GH6 and FX30 might be best put into an additional category rather than the ones I listed above, but if you had to choose, then lots of cameras without these 'essential' features fit much better in the last category than any other.
-
The internet can be magic, but not that kind of magic!
-
The problem with digital sharpness is that it cannot be un-done. If you take a RAW frame, it will have the resolution of the camera/lens with a neutral amount of sharpness. If you then sharpen that image, compress it, and then blur that image, you don't get the original image, you get something that is simultaneously too blurred and also too sharp - the worst of both worlds. If you blur the image sufficiently to counteract the sharpening (and the associated compression artefacts) then you end up with an image that is waaaaay lower in resolution. If, however, the image isn't overly sharpened in-camera (a tiny amount is ok), then you can process it in post without those issues.
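To make the 'worst of both worlds' point concrete, here's a minimal Python/numpy sketch using a soft 1D edge as a stand-in for image detail. The unsharp-mask settings and the quantisation step are arbitrary choices for illustration:

```python
# Minimal sketch of why baked-in over-sharpening can't be cleanly undone.
# A 1D "soft edge" signal stands in for optical detail in an image.
import numpy as np
from scipy.ndimage import gaussian_filter1d

original = gaussian_filter1d(np.repeat([0.2, 0.8], 256), 3)  # a soft edge, like real optical detail

def unsharp(signal, sigma=2.0, amount=1.5):
    """Basic unsharp-mask sharpening: boost the difference from a blurred copy."""
    return signal + amount * (signal - gaussian_filter1d(signal, sigma))

sharpened = np.clip(unsharp(original), 0, 1)   # in-camera sharpening, clipped to legal range
quantised = np.round(sharpened * 255) / 255    # crude stand-in for encoding/quantisation losses

# Try to "undo" the sharpening with blurs of increasing strength
for sigma in (0.5, 1.0, 2.0, 4.0):
    recovered = gaussian_filter1d(quantised, sigma)
    err = np.sqrt(np.mean((recovered - original) ** 2))
    print(f"blur sigma {sigma:>3}: RMS error vs original = {err:.4f}")
# No blur strength brings the error to zero: light blurs leave the halos/overshoot,
# heavy blurs throw away real detail as well.
```

The numbers just confirm the point above: once sharpening (and the encoding that follows it) is baked in, a blur can only trade one artefact for another.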
-
Yes, this appears better, but it still shows small signs. However, it's nothing to get too excited about. This is a stress-test image, of course, so it would do much better than this in almost all real-life examples.
-
Here's a test I did some time ago, downscaling an 8K clip to various resolutions then putting them all on the same 4K timeline and uploading in 4K. The shot at the end puts them all side-by-side for direct comparison. There are differences, but even pixel peeping, they're pretty minimal. If you're watching something rather than talking tech or doing tests, then it doesn't matter, plus if you add a bit of sharpening to a lower-resolution file it can easily make up small differences. You're right that YT in 4K looks much higher quality than YT in 1080p, but it's a bitrate thing rather than a resolution thing.
-
Just watched this video from Crimson Engine about camera trends, and he dropped an absolute bombshell... "We shot and mastered The Devis Fortune [his recent feature film] in 4K, and then when it came time to deliver it to Amazon and iTunes, I was told they would only accept a 2K file, they would then up-res it for 4K delivery if that's what people wanted to watch". Link to the video at that point: I understand that this is a lower-budget film and not a blockbuster, and I don't believe it was through a major distributor or studio, so maybe those films are actually uploaded in 4K, but... How much actual 4K are people watching? Does anyone have more information on this? Wouldn't it be hilarious if all the people watching YT for free were demanding 4K at a minimum, when all/most/some paid content was actually 2K.
-
You're probably already doing this, but in case not, my preferred order of operations is to take two images and try and make a general match first:
- Make both images black and white and then match levels of white points, black points, grey levels, and then contrast (in that order)
- Go back to colour and match WB of white points, black points, and then grey levels (in that order)
- Then using hue curves, match the hue v hue, hue v sat, and hue v lum curves (in any order - sometimes it takes a few passes)
Then I apply that correction to a second pair of images and fine tune anything that doesn't look like it matches. Then I apply it to all the shots, and then just look at the whole lot together (or if you can't do that then just watch it on a fast speed) and stop to correct any shots that stand out.
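For anyone who prefers to see the first (black-and-white) pass as numbers rather than scopes, here's a rough Python sketch of matching black point, white point, and then grey level of one shot to another. The function names, percentile choices, and the Rec.709 luma weights are my own illustrative picks, and the WB and hue-curve passes are left to the grading tool:

```python
import numpy as np

def luminance(rgb):
    """Rec.709 luma approximation for a float RGB array shaped (H, W, 3)."""
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def match_levels(source_rgb, reference_rgb, low_pct=1, high_pct=99):
    """Match black point, white point, then grey level of source to reference (0-1 floats)."""
    src_y, ref_y = luminance(source_rgb), luminance(reference_rgb)

    # 1. Black and white points: map source luminance percentiles onto the reference's.
    s_lo, s_hi = np.percentile(src_y, [low_pct, high_pct])
    r_lo, r_hi = np.percentile(ref_y, [low_pct, high_pct])
    gain = (r_hi - r_lo) / max(s_hi - s_lo, 1e-6)
    adjusted = (source_rgb - s_lo) * gain + r_lo

    # 2. Grey level: apply a gamma so the median luminances line up.
    rng_ref = max(r_hi - r_lo, 1e-6)
    norm = np.clip((adjusted - r_lo) / rng_ref, 1e-6, 1.0)
    s_mid = np.clip((np.median(luminance(adjusted)) - r_lo) / rng_ref, 1e-3, 1 - 1e-3)
    r_mid = np.clip((np.median(ref_y) - r_lo) / rng_ref, 1e-3, 1 - 1e-3)
    gamma = np.log(r_mid) / np.log(s_mid)
    return norm ** gamma * rng_ref + r_lo

# Example: matched = match_levels(shot_b, shot_a)  # both arrays shaped (H, W, 3), float 0-1
```

In practice you'd do this by eye on the scopes, but the order is the same: range first, then midtones, then colour.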
-
A screen grab works as well, in a slightly different way. Here's some fringing, but it does seem to be well controlled. All lenses will exhibit some degree of CA - it's just a limitation of the physics involved, especially on wide-angle lenses like these. It's not a criticism or attack. Also, people add CA in post when emulating film, so it's not always even a negative thing. That grab shows slight over-sharpening, as does the one below. It also depends on what you think the ideal amount of sharpening is, which is subjective. Personally, I prefer a high-resolution but low-sharpness image presentation, but everyone is different, depending on the needs of their project.
-
Looks like you missed both points I was making.
-
The existing lens may well be soft compared to a high-spec alternative, but I'd suggest that the images from the GoPro are bordering on over-sharpened already, so if the lens is swapped for a sharper one then that is likely to be a net negative for overall image quality. It might, however, reduce chromatic aberrations and fringing etc, which would be an improvement. I'd be curious to see a comparison between the stock and third-party lenses. I'd also be curious to see a 1:1 crop of the 5.3K files you're recording - could you post a short clip to YT? Instead of downsampling the 5.3K file onto a 4K timeline, could you simply put the 5.3K file onto the timeline at 1:1 scale (which would crop part of the original image)? If you're able to, including a 2:1 version where it's zoomed in to 200% would also be nice.
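If it helps, here's a rough Python/Pillow sketch of the three versions described above, starting from a single frame exported from the 5.3K clip. The file names and the 5.3K frame are hypothetical, and this is just a quick stand-in for doing the same thing on the timeline:

```python
# Sketch of the three comparison exports from one assumed 5.3K frame:
# (a) a 1:1 UHD centre crop, (b) the usual full-frame downsample to UHD,
# (c) a 2:1 "200% zoom" of the centre. File names are placeholders.
from PIL import Image

FRAME = "gopro_5k3_frame.png"        # a single exported 5.3K frame (hypothetical name)
TARGET_W, TARGET_H = 3840, 2160      # UHD timeline

img = Image.open(FRAME)

# (a) 1:1 centre crop: every source pixel maps to one timeline pixel, no scaling
left = (img.width - TARGET_W) // 2
top = (img.height - TARGET_H) // 2
img.crop((left, top, left + TARGET_W, top + TARGET_H)).save("crop_1to1.png")

# (b) full frame downsampled to the timeline resolution
img.resize((TARGET_W, TARGET_H), Image.Resampling.LANCZOS).save("downsampled.png")

# (c) 2:1 "200% zoom": crop half the target size from the centre and upscale 2x
half = img.crop((left + TARGET_W // 4, top + TARGET_H // 4,
                 left + 3 * TARGET_W // 4, top + 3 * TARGET_H // 4))
half.resize((TARGET_W, TARGET_H), Image.Resampling.NEAREST).save("zoom_200pct.png")
```

The 1:1 crop is the interesting one, since it shows the native pixel-level sharpening without any rescaling hiding it.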