Everything posted by kye

  1. It has been for over a decade. No one who tries and fails to make a film did so because of equipment limitations, unless they're trying to shoot a film about deep diving in acid or some other extreme environment. Film-making is hard, but saying you couldn't make the film you wanted because of the camera is like saying you couldn't make it because you couldn't find the right shoes!
  2. I am perhaps the closest here to saying yes to this comment, given that I value size, simplicity, speed of working, and stabilisation, and am actively opposed to creatively-inappropriate shallow DoF, etc. I do, and will continue to, use my phone as my second camera. I use it when I don't have a dedicated camera with me, I use it for a wide-angle (to complement the normal-tele lens on my proper camera), and I use it to shoot the in-between shots like getting in and out of vehicles and travelling between locations. However, it is a long way from becoming my only camera, because:
    - They're crap in low-light, especially the wide-angle camera. I tested my iPhone 12 Mini vs my GX85 and GH5 and found that the normal iPhone camera had similar noise performance to the MFT cameras at F2.8, and the wide camera was equivalent to F8!
    - It's always your phone and your camera. You always want to use the latest phone as your camera because they keep getting better, but you also always want to use the latest phone as a phone because the battery is newer and they're faster. Therefore you are always in danger of getting messages and pop-ups on your camera while shooting, and if you put it into aeroplane mode you might miss important notifications. They're too expensive to buy two of.
    - They're a pain to shoot with. No handle, no wrist strap, etc, and if you rig it up then it's a pain to use as a phone and a pain to put in your pocket. Also, any external accessories like lenses or NDs require a case, but those cases are completely shit at being protective cases, so you're perpetually changing cases, which screws them up. Even if you don't use lenses you're going to want to use an SSD with it now because of Prores.
    - They lack the flexibility of lenses. Even a Panasonic GM5 that shoots low-bitrate 1080p will look better fitted with an F1.2 (or F0.95) lens than a phone in low-light situations. The GM5 will also look better fitted with a 100mm (or 200mm, or 400mm!) lens than a phone using heavy digital cropping on its longest lens. And the GM5 fitted with a fast-aperture lens will look nicer than Cinematic Mode, which blurs the image with about the same subtlety as a toddler smearing food on themselves at dinner time.
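The "equivalent to F8" comparison above can be reproduced with standard equivalent-aperture arithmetic: multiply the f-number by the crop factor between the two formats. A minimal sketch, assuming an f/2.4 ultra-wide module and an illustrative phone sensor diagonal (my assumption, not a measured spec):

```python
# Back-of-envelope "equivalent aperture" arithmetic behind the F8 claim.
# Equivalent aperture = f-number x crop factor: it roughly matches depth of
# field and total light gathered (hence noise) across sensor formats.
# The phone sensor diagonal below is an illustrative assumption, not a spec.

MFT_DIAGONAL_MM = 21.6          # Micro Four Thirds sensor diagonal
PHONE_WIDE_DIAGONAL_MM = 6.4    # assumed diagonal of a phone ultra-wide module

def equivalent_aperture(f_number: float, crop_factor: float) -> float:
    """f-number a lens would need on the reference format for a similar look."""
    return f_number * crop_factor

crop = MFT_DIAGONAL_MM / PHONE_WIDE_DIAGONAL_MM   # ~3.4x smaller than MFT
print(f"Phone ultra-wide f/2.4 ~ f/{equivalent_aperture(2.4, crop):.1f} in MFT terms")
```

With those assumed numbers, f/2.4 on the tiny wide module lands right around f/8 in MFT terms, which matches the noise comparison above.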
  3. Great post and well-made points. I'd suggest that MFT is more suitable for any form of film-making that is focused on story/plot/emotion/meaning, because it protects you from moronic ideas like shooting with a 50mm lens at maximum TONEH, which, beyond shooting a scene where someone is on serious drugs, is creatively inappropriate.
  4. I'm reminded of the comments from @John Brawley talking about how he used a super-minimal setup based on the BMMCC when shooting The Resident. Obviously those are all completely controlled sets with lighting etc all dialled in, so external shots out in the world would perhaps need a bit more rig. John spoke about how the size of the rig allowed him to get much closer to the actors - you can see him holding it out in front, literally between the actors - which would have been a different equation had it been bigger / heavier.
  5. Yes, but it's been the most popular camera for quite some years! Just look back at the Flickr camera stats from previous years...
  6. You and me both! They say the first step is admitting you have a problem, which we've both done, so that means we're probably ahead of half the pack already 🙂
  7. This isn't how the technology industry works. In order to cater for the long development cycle and also try to respond to market conditions, you want to use something like set-based design. This is a reasonable introduction to the topic: https://scaledagileframework.com/set-based-design/#:~:text=Set-Based Design (SBD),eliminating poorer choices over time.
  8. There is no good way to judge colour online - let's review all the possibilities:
    - Log images are shown: this is the purest experience of the camera, but you can't judge anything with this.
    - Log images + manufacturer's LUT are shown: this is the best way to judge images, but it's random chance how good this will look, and it doesn't really show the potential of the footage.
    - Graded images are shown, but they look crap: this tells you nothing, as you can't tell if the camera is bad, the LUT is bad, the colourist is bad, or all of the above.
    - Graded images are shown, and they look good: this tells you what is possible, but not what is possible for you. Great images could be because the camera is great and the colourist is OK, the camera is good but the colourist is very good, or the camera is mediocre but the colourist is world class.
    BTW, if camera forums and camera YT had the same level of knowledge about cameras as they do about colour grading, then every video would be trying to work out the exposure triangle and failing. Even rudimentary colour grading and colour science knowledge is rare online outside professional colourist circles - I know more about these things than most and I am at the very, very, very shallow end of the pool, so if I know more than you do then you're basically nowhere...
  9. You'd have to watch that around young kids - now the corners will be razor sharp.
  10. That setup caught my eye too - it's basically a way to add "internal" NDs 🙂
  11. I hear you, being one of the most "sizeist" users here, but I just don't see the strength of the reaction as proportional. We could make a list of all the things wrong with the G9ii release, or the release of any other model from Panasonic, and we could easily find parallel examples from other manufacturers of things they do that are just as egregious. But when Sony releases yet another camera that overheats, or GoPro releases another doppelgänger camera with microscopic spec improvements, or whoever the hell else waits years and then releases an uninspiring black box, we don't then spontaneously question the future of the brand or the entire format. If someone were to post in every Sony camera thread that FF was dead and Sony was going to exit the camera market because of the latest release, they'd be considered unhinged, but somehow with MFT every camera announcement is turned into a wake. What will you use it for?
  12. Right, now I get it. Throughout this thread I thought people were saying "Look how huge it is... by giving it the body of the S5 they made it huge", when actually they were saying "Look at how the mk2 is the same size as the mk1 - that's outrageous - every camera update should be drastically smaller!" TBH, if we're going to judge everyone who didn't give us everything they could have, we'd all better be saving up for an Alexa, because the people who worked on every other camera released in the history of the world are going in front of a firing squad tomorrow at dawn.
  13. I'm confused... WTF was everyone complaining about?
  14. I think I understand where @IronFilm is coming from - the advantage of a larger body is that you get dedicated buttons and other things that are useful on set. Think about it: if there were no use for something, they wouldn't add it to the camera, regardless of how large they were allowed to make it. On a controlled set you'd imagine they'd have a proper cinema lens with remote follow-focus attached, a matte box, v-mount power, a monitor, and the whole thing rigged appropriately. By the time you add all that, the difference between an FX3 and an FX6 might only add an extra 25% to the size of the whole rig.
  15. Comparison with the S5iiX... Connor shows that sometimes it looks over-sharpened, but that's still not as bad as I remember previous models being.
  16. It might be time to upgrade I think! That is a far cry from previous models.
  17. Just found this video outlining a bunch of new features in Baselight (the other of the world's top colour grading platforms) and it confirms exactly what my thoughts were about sharpening and blurring, starting at 31m. He sets up a frequency generator that starts at low frequencies on the left and goes higher towards the right, then uses the waveform monitor to show the amplitude, essentially showing a frequency response plot. The plot is flat before any transformations; adding sharpening increases the amplitude of the higher frequencies (the fine detail), and a simple Gaussian blur reduces them - which is what we want in order to counteract the effects of in-camera sharpening, as well as replicate the decreasing response of film. I'm HUGELY encouraged now that I've worked out that a blur is a reasonable approximation of a more analog look, and that it also reverses sharpening.
    Another thing he demonstrates is that by increasing contrast on the image, you amplify the contrast of both the fine detail and the lower frequencies, so you're changing the texture of the image in ways that may not have occurred had the scene been lit with a greater contrast ratio. At 46m he shows an example (from Mindhunter, which was graded heavily in Baselight - fun fact!): you might add an effect in post as if there were an additional light source, which amplifies the contrast on everything, including the fine details (ie, what sharpening does). But then he uses a special tool they've developed that only adds contrast to the lower frequencies and doesn't amplify the finer details, and voila - it looks natural, like that could have been how it was on set. I suppose you might be able to build this operation using frequency separation, but in Baselight it's just a slider. I guess that's one of the many reasons why instead of buying Resolve for $299 you'd buy Baselight, which is more of a "mortgage your house" sort of thing.
    The rest of the talk is excellent and worth watching, and a more recent one has some new stuff that is also fascinating.
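The three effects described above - sharpening boosting fine detail, blur attenuating it, and frequency separation adding contrast to the coarse structure only - are easy to sketch in code. A minimal NumPy sketch of my own (not Baselight's actual implementation), using a 1D signal as a stand-in for an image; the kernel sizes, gains, and test frequencies are arbitrary illustrative choices:

```python
# Illustrative sketch: why sharpening boosts fine detail, blur attenuates it,
# and frequency separation lets you add contrast to coarse structure only.
import numpy as np

def gaussian_kernel(sigma: float) -> np.ndarray:
    radius = int(4 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def blur(sig: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    return np.convolve(sig, gaussian_kernel(sigma), mode="same")

def sharpen(sig: np.ndarray, amount: float = 1.0) -> np.ndarray:
    # Unsharp mask: amplify the detail that a small blur removes.
    return sig + amount * (sig - blur(sig))

def low_freq_contrast(sig: np.ndarray, gain: float = 1.5) -> np.ndarray:
    # Frequency separation: split into base + detail, gain only the base.
    base = blur(sig, sigma=20.0)
    detail = sig - base
    return gain * base + detail

def band_amplitude(sig: np.ndarray, period: int) -> float:
    # Amplitude of the sine component with the given period (samples/cycle),
    # read straight off the corresponding DFT bin.
    spectrum = np.abs(np.fft.rfft(sig)) / (len(sig) / 2)
    return float(spectrum[len(sig) // period])

n = 2048
t = np.arange(n)
signal = np.sin(2 * np.pi * t / 512) + 0.5 * np.sin(2 * np.pi * t / 8)
#         coarse structure (amp 1.0)    fine detail (amp 0.5)

print("fine detail after blur:    ", band_amplitude(blur(signal), 8))     # shrinks
print("fine detail after sharpen: ", band_amplitude(sharpen(signal), 8))  # grows
print("coarse after lf contrast:  ",
      band_amplitude(low_freq_contrast(signal), 512))                     # grows
print("fine after lf contrast:    ",
      band_amplitude(low_freq_contrast(signal), 8))                       # unchanged
```

The same idea extends to 2D: blur the image to get the low-frequency "base", subtract to get the "detail", apply contrast only to the base, then add the detail back unchanged - which is presumably roughly what Baselight's slider is doing under the hood.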
  18. Ah, indeed I was! My bad. I re-read the article and it doesn't look like there's any concrete information in it at all about where they shot and how many people were present. A lot like most other journalism, then - you read it and feel like you've been told a lot, but when you look again there's just a stream of vague statements 😕 So, who knows why, I guess.
  19. Wait until the autonomous vehicles and autonomous other-things market really starts ramping up - those things will have dozens of cameras, each needing an entire camera module of its own. It is decades away, but the future is likely to have many times more autonomous devices than there are human beings, and they'll all need an array of various sensors.
  20. I suspect it could have been because it was small and they didn't want to get too much attention. "so his crew consisted of just his actors (Scoot McNairy and Whitney Able), a sound tech, a line producer, a translator and a driver. Edwards operated the camera, grabbing footage guerrilla style whenever they came upon a compelling location while traveling through Central America." I'd imagine that security might have been a concern in that region (maybe I'm wrong on that) but combine that with not wanting to draw attention from local officials who might hassle you for permits etc. Also, if you're bringing equipment into a country and are going to leave with it again you will need to declare it at the border (so you don't have to pay import taxes - I can't remember what this is called) and this process is a huge PITA for documentary crews etc, so getting around that would be a huge time advantage (I've seen docos where it takes the better part of a day for each border crossing because of this).
  21. I would tend to agree with your assessment that it's deliberate and they'll continue with it. If it was accidental then it's a pretty amazing thing to screw up. Also, while it seems to do enormous "damage" to the video from a technical point of view, the images all looked absolutely fine to me from an aesthetic point of view, so the probability they accidentally created a problem that large that didn't look worse is basically zero. It's yet another example of how you can make huge changes to the video signal and they can be invisible or even desirable - being "accurate" is the starting point of the creative process rather than the goal itself.
  22. Thanks! It is a tricky thing that basically every piece of technology faces - how do you cater to entry-level, intermediate, and expert users all with the same product?
    - If you make it too simple then the experts complain about lack of customisation (which is what Apple has done here - there aren't enough options).
    - If you make it too complex then the entry-level people can't work it out, or end up setting things wrong and then blame you for the bad results.
    - You can't even make a "pro" mode, because half of the entry-level people who don't know anything think they're "pro", and will screw it up and blame you for the results out of embarrassment alone.
  23. I got quite excited when they originally released the Prores support in their previous model as every camera I've seen with Prores had great images that had minimal processing, but unfortunately they just implemented it as a high-bitrate version of the same heavily processed mess that they sent to h265 compression. However, I'd be extraordinarily happy to be proven wrong! I use my phone as the second camera on my travels and it would be great if it could live up to its potential 🙂
  24. *shrug* It only looked marginally better than my own footage from my 12 Mini. Until they stop pummelling the footage basically to death internally, it won't produce anything other than a brittle, digital-looking image. I've just spent about 90 minutes trying every blur technique I can imagine on some sample footage from my 12 Mini, then jumped over to some similar shots from the GX85, and it was night-and-day different because the GX85 wasn't over-sharpened before the compression. The GX85 is 8-bit vs the iPhone's 10-bit, but the GX85 was still outright superior, despite both cameras being used outside during the day with absolute tonnes of light, so both sensors would have been at base ISO. The RAW footage from the Android phones looked great - even after being put through the YT compression, which is far more severe than the compression applied by any smartphone manufacturer. I don't know why Apple are so intent on ruining the video signal they're pulling off the sensor.