Everything posted by kye
-
SJCAM SJ20 Dual Lens Action Camera to be Released Soon...
kye replied to powerman668's topic in Cameras
The only remotely solid thing that Google has is this: https://competition.adesignaward.com/design.php?ID=145503 which appears to be a design competition entry...
-
So, is anyone reading it yet? I'm up to chapter 11 and enjoying it, although I'm only skimming the technical parts.
-
To be honest, the whole thing of this movie getting all this hype for being shot on an FX3 seems ridiculous. On paper it sounds incredible: a big-budget movie shot on a tiny, cheap camera, which is the kind of story that makes people who can only shoot on tiny, cheap cameras feel better. But it's not true. Even if you have a tiny, cheap camera, you don't have $72M to spend in post... and if you had $72M in post, you could have shot it on an EOS M using Magic Lantern and it would have looked almost as good (professional colourists and VFX artists are incredible - high-end work looks great mostly because of them, not the camera that was used).

The image isn't really that much better than other things. I mean, can you tell which of these below is the FX3, or the Venice 2, or the G9ii, or the R6ii, or the GH6? It should be obvious, right - the Venice 2 is something like $50K, the G9ii and GH6 are consumer MFT cameras, and the FX3 should be easily identifiable, right?

The FX3 isn't really that small. You might be thinking the FX3 is small because it's smaller than a 5D, for example: But that's not what it looks like - this is what it looks like:

It's not that cheap. So you want external RAW - sure. The Fujifilm X-S20 does ProRes RAW, and it's $1299 - a third of the cost! Oh, you want 16-bit RAW, not that crappy 12-bit RAW that the Fuji shoots. Let me ask you this - have you ever seen the difference? Have you graded it? I didn't think so. If I showed you a 12-bit RAW video graded by a professional colourist and a 16-bit RAW video you shot and graded yourself, which would look better? Maybe you should shoot your movie (or cat video) on the Fuji and pay a professional colourist to grade it - it would look better...

It's a marketing stunt.
-
Yeah, it sounds to me like marketing spin, where you say something that's technically correct (so if anyone asks you have an answer) but not relevant in any practical sense. For example, the FX6 gets a lot heavier once you mount a screen and the other things the FX3 already has built in - but they shot RAW on an Atomos, so the FX3 would never have been used without an external screen anyway...
-
I recently asked for book recommendations to learn about human vision and was given a link to a free PDF. It is incredible. I'm only a quarter of the way through, but I'm absolutely blown away. The human vision system looks like it was designed by committee and then re-imagined by Dali and Picasso, while on drugs. It is a wonder we can see anything at all! Did you know that the rods and cones (which detect light) are BEHIND a bunch of nerves, nerve cells, and blood vessels, so the light has to go through a bunch of crap before you even sense it?

The book is actually a mix of how the human vision system works and what we have done with technology to try and align with it, so it's a nice blend of biology and tech. It's also very readable and tries to be as non-technical as possible, which is a rare find compared to other books that are hugely tech-heavy.

Take the red pill with me... download it here: https://www.filmlight.ltd.uk/support/documents/colourbook/colourbook.php (download it by clicking on the box next to the file size).
-
Well, it is a telephone, after all.... Considering how bad the 14 was compared to this, if they keep going the 17 will be like a MILC, the 18 a cinema camera, and the 20 an Alexa!
-
Yeah. It also didn't look too bad with a Black Pro Mist filter either:
-
I'm surprised no-one has posted this yet... Gerald tested the DR, and the TL;DR is that below about ISO 1000 it measures 12.2 stops, but around ISO 1000 it starts rising, and at ISO 1480 he managed to get 13.8 / 13.3 stops. Gerald said he suspects the native ISO is around 1250 or so, and the DR dropping below that point is typical of sensors set below their native ISO. All in all, this is a seriously impressive result. It makes me wonder: if you set it to full-auto and use it in daylight, will it limit the DR by forcing a lower ISO? That's not quite so ideal.
-
More well-shot footage...
-
Well, if it goes from GX3: to FX3: ..the GX3 will be the size of a matchbox!! Then, in a massive twist no-one saw coming... Sony releases the next version of the GX85....!
-
I know of some VFX folks that use these in bulk to capture high-res plates for compositing, so they're at least known and in-use in professional circles. I don't know of examples where it was used as the main camera, but that's the thing about people that actually make content - mostly they're not online talking about the brand of paper-clips they use 🙂
-
8K is cancelled. Just use AI to make it better... IMAX does! https://ymcinema.com/2023/10/02/imax-ceo-we-use-ai-to-blowup-images/?expand_article=1

From IMAX CEO, Richard Gelfond: "we use it to blow up images. We use AI to make the images look better, we sharpen the edges, and we take the grain out. We have been using AI for supplements for a while."

And from the article: "However, the best reference for that utilization of the IMAX proprietary algorithm and AI tech, is the talked-about sci-fi project, The Creator. The movie was shot entirely on Sony FX3 which is not an IMAX-certified camera. Nevertheless, the ProRes RAW footage was undergone special treatment by IMAX AI technologies, in order to boost the imagery and make it capable enough to hold up against the huge canvas."

Other streaming services only store feature films in 2K and upscale to 4K for people who stream in 4K, and now IMAX, the supposed best-quality folks, upscale using AI. We all knew that streaming was a low-quality distribution, and now IMAX is too...

How can they get away with this???? Maybe... *gasp* ...because it's not visible? I mean, you could say that AI is good enough for IMAX, but you could also say that visual perception is so low that you can't even tell that IMAX is upscaling with AI! It works both ways!! 😂😂😂
-
I don't think it matters as long as it's not white or black - even if they just recorded a test clip where they point it at different stuff. If there was a pixel stuck on or off you'd notice it pretty easily, I'd imagine. "Film something bright and something dark" might be relatively simple advice to understand?
-
It might be worth trying to come up with an "identity" for your audience, so you can hit the right level of info. For example, if you imagined you were talking to your partner, or next-door neighbour, or grandmother, you would say things in different ways. You could even imagine there are two or even three audience members with different levels of knowledge. I get the impression that once you get going and start getting lots of comments on videos, you'll get a sense of who is out there watching. YouTubers often talk to their followers in ways that make me think they have a good sense of what their expertise is and what they like and don't like, etc. But to get you started you might have to make up your audience.

A wireless mic sounds like the best solution. Even if it fails occasionally, a voiceover in post is a good fall-back option and far from ruining the video.

Lights are a sensible solution, especially because large-aperture lenses have shallow focus planes, and I'm not sure if it's worse to have a noisy image or one where you're out of focus half the time.

For monitors, is it practical to have a wireless monitor? If you had a wireless monitor, a wireless mic, and a wireless trigger, then you could put the camera wherever you like and still be able to control it and check focus etc. Most solo shooters only record from close to the camera with wide lenses, and that's one aesthetic, but there are other aesthetics too, and using a longer focal length from further away gives a much more professional look. Martijn Doolaard is a self-shooter and films from far away as well as close/wide, which gives a higher production value I think. Here's an example in one of his videos: https://youtu.be/Ybgr8OUskcM?t=563

The alternative to using a monitor with power-out is just using a battery plate. These seem to be really useful as they just take a battery and often have many different power outputs. One of these might make your setup more flexible in future if you decide to add more accessories or change the monitor etc.
-
800 isn't really base ISO for the sensor, just for that mode, and as @Django mentioned they could do one at ISO 200 that is a lot cleaner. I'm fine with that level of noise, but that's because I'm a fan of cinema, which doesn't require anything even remotely close to 8K - it's really just for cropped modes and pixel-peeping folks.

I think people are broadly aware of oversampling and its advantages. However, the other thing to keep in mind is that you don't need a full 2x Nyquist-style oversampling margin - you only need a slight advantage, like audio being recorded at 48 kHz and then delivered at 44.1 kHz, or cameras like the GH5, which downsampled 5.2K to 4K. The more I learn about what is going on under the hood, the more I realise that statements like "8K Bayer is 8K" don't even make sense. To truly unpack that statement would require a whole textbook, and that's just the technical side, ignoring the perceptual aspects!
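To put some numbers on the audio analogy, here's a quick sketch in Python (assuming numpy and scipy are installed) - the 48 kHz to 44.1 kHz conversion is only about a 1.09x margin, and a 5.2K-to-UHD downsample is roughly 1.35x, yet both leave enough headroom for a decent anti-alias filter. The pixel-width figures are approximations for illustration, not measurements of any specific camera.

```python
# Quick sketch of the oversampling margins mentioned above, plus an actual
# 48 kHz -> 44.1 kHz resample using SciPy's polyphase resampler.
# (The 5184/3840 figure is an approximation of "5.2K" vs UHD width.)
import numpy as np
from scipy.signal import resample_poly

print(48000 / 44100)  # audio margin: ~1.09x
print(5184 / 3840)    # "5.2K" -> UHD width: ~1.35x

# Resample one second of a 1 kHz test tone from 48 kHz to 44.1 kHz.
# resample_poly applies an anti-alias filter as part of the conversion,
# which is where that modest oversampling headroom gets used.
t = np.arange(48000) / 48000
tone = np.sin(2 * np.pi * 1000 * t)
tone_441 = resample_poly(tone, up=147, down=160)  # 44100/48000 = 147/160
print(len(tone_441))  # 44100 samples
```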
-
Considering how long it takes to design electronics products, the GH8 is probably also in "very early stages" of development. But, as they say, talk is cheap! Until it's in your hands it's not real..
-
If true, that would definitely explain it. It sounds quite plausible too.
-
Record a shot of the blue sky, or a wall perhaps? Anything that isn't 0 or 100% in any colour should be visible in that test clip I would imagine.
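If anyone wants to go a step further than eyeballing it, here's a rough sketch of that idea in Python with OpenCV - flag any pixel that stays at 0 or 255 through every frame of the test clip. The filename and the exact 0/255 test are just placeholders for illustration; a real check might use a small tolerance.

```python
# Rough sketch: flag pixels that sit at 0 or 255 in *every* frame of a test clip.
# (Clip filename is a placeholder; a tolerance may work better than exact values.)
import cv2
import numpy as np

cap = cv2.VideoCapture("stuck_pixel_test.mp4")
always_black, always_white = None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    black, white = (gray == 0), (gray == 255)
    always_black = black if always_black is None else (always_black & black)
    always_white = white if always_white is None else (always_white & white)

cap.release()
if always_black is not None:
    print("stuck-off candidates:", np.argwhere(always_black))
    print("stuck-on candidates:", np.argwhere(always_white))
```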
-
There's a bit more noise in these images than I would have thought. Were you keeping to base ISO? It is 8K, so by the time it's used in the real world this will have mostly cleared up, but by that logic there's no point in having 8K - might as well have had better/bigger pixels.
-
That was my impression from what he said. I'd imagine that Filmic Pro will provide an update that uses the full capability of the new ProRes Log mode, but I don't think it will provide any advantages over the BM app. TBH, if I owned Filmic Pro, I'd be sitting in a closed room with the smartest people in the company scribbling on a whiteboard, furiously trying to work out how to stay relevant. My prediction: we'll see a torrent of YouTube videos sponsored by Filmic Pro as a last-ditch effort to keep revenue up (a huge advertising campaign is a standard sign a company is in trouble).

I think you make a good point. The way I think about it is this: your smartphone contains one or more small-sensor (just under 1"), fixed-lens (probably prime-lens) digital cameras, which record internally or externally to h264/h265/ProRes/RAW files. If you want to shoot deep-DoF videos with the available FOV in good light and the codec is sufficient for you, then it's a solid choice. If you want to shoot with a different lens, a larger sensor, in very low-light conditions, or you need better image quality than it can provide, then it's not a good choice. This is the same for any camera. Name a camera and there are things it's not good for. The ARRI Alexa 65 is a terrible choice for skydiving, a smartphone is terrible for shallow-DoF projects, the RED V-Raptor XL is not suited to home or cat videos, an IMAX 70mm camera would be miserable for on-location night-time safari shoots, etc.
-
Ah, I just realised I mis-read your comment in my reply above as "the quality that the manufacturers keep in the drawer (through limiting their potential with too heavy-handed image processing and compression)". You are right, of course, especially considering that the main reason people keep cameras in a drawer is their limited technical specifications, when realistically people have just gotten used to the latest technologies.

Most cameras we keep on shelves or in drawers are better than 16mm film, and that was what was used to shoot all but the highest-budget TV shows, as well as a number of serious feature films, like Black Swan (2010), Clerks (1994), El Mariachi (1992), The Hurt Locker (2008), Moonrise Kingdom (2012), The Wrestler (2008), etc.

I think the biggest problem is that people don't know how to colour grade, or don't know what is possible. I mean, anyone with a Blackmagic camera that shoots RAW has enough image quality to make a feature. Hell, if the movie Tangerine could be a success when shot on the iPhone 5S, then no-one has any excuse for not being able to write a movie that is within the creative limitations of their equipment. Even a shitty webcam could be used to shoot a found-footage horror movie set in the days of analog camcorders!
-
The Hawk "emulation" was simply two blur operations, each at a partial opacity. In Resolve: Node 1: Blur -> Blur tool at 0.53 with Key of 0.6 Node 2: Blur -> Blur tool at 1.0 with Key of 0.35 The first one (0.53) is the small radius blur that knocks the sharpening off the edges, and if you were just using this one on its own you might even want to make it closer to 90% opacity. The second one is a huge blur (1.0) that provides the huge halation over the whole image. I use the Resolve Blur tool because it's slightly faster than the Gaussian Blur OFX plugin on my laptop, but the OFX plugin allows much finer adjustments so that might be easier to play with. You can also adjust the size of the blur and the opacity in the same panel, so it might be easier to get the look you want using it. What are you grading? I'd be curious to see any examples if you're able to share 🙂
-
Yeah, if you can bypass whatever the camera is doing and get the RAW straight off the sensor then it should be a good image. Sony know how to make sensors, and the FX3 shouldn't overheat... From the Atomos page on the FX3: https://www.atomos.com/compatible-cameras/sony-fx3

I never hear people in the pro forums talking about Canon, only ARRI / RED / Venice. Gotta shoot fast and get that IMAX image!

Absolutely. It's one of the reasons I am so frustrated, especially as now they've "unlocked" this quality by bolting on an external recorder instead of just giving us better internal codecs. I mean, for goodness sake, just give us an internal downscale to 2.8K ProRes HQ with a LOG curve and no other processing! Even the tiny smartphone sensors look great in RAW. Scale that up to MFT or S35 and imagine the quality we'd be getting from every camera!

My favourite WanderingDP video explains everything...
-
I'm curious to hear how it holds up on an IMAX screen too - keep us informed.

The link that @ntblowz shared has lots of info: "the filmmakers use the Atomos Ninja V+ as an onboard ProRes Raw recorder" and a "75mm Kowa 2x anamorphic lens with a prototype of the Atlas Mercury 42mm as a backup for the small spaces where the 75mm was too tight".

TBH, the choice of the FX3 could have been as simple (and uninformed) as them being aware of ARRI, RED, and Sony (through the Venice), looking at those cine lineups for the smallest cinema camera, and never evaluating Panasonic or Fuji because they were simply unaware of them. Sometimes a lot of these industry heavyweights can be just as dogmatic about their favourite brand, and just as naive / hoodwinked by rumours / misunderstandings / marketing, as the worst camera fanboys/fangirls online.
-
I've developed a more sophisticated "false sharpness" powergrade, but it was super tricky to get it sensitive enough to tell the difference between soft and sharp lenses (when no sharpening has been applied). Here are some test shots of a resolution test pattern through two lenses - the Master Anamorphics, which are gloriously sharp, and the Hawk Vintage '74 lenses, which are modern versions of a vintage anamorphic.

Zeiss (Master Anamorphic) ungraded, with the false-sharpness powergrade: Note that I've added a sine-wave along the very bottom that gets smaller towards the right, and acts as a scale to show what the false-sharpness grade does. Here's the Hawk: and the Zeiss one with a couple of blur nodes to try and match the Hawk:

Here are the same three again, but without the false-sharpness powergrade. Zeiss ungraded: Hawk ungraded: Zeiss graded to match the Hawk (I also added some lens distortion too):

Interestingly, I had to add two different-sized blurs at different opacities - a single one was either wrong with the fine detail or wrong on the larger details. The combination of two blurs was better, but still not great. I was wondering if a single blur would replicate the right shape for how various optical systems attenuate detail, and it seems that it doesn't. This is why I was sort of wanting a more sophisticated analysis tool, but I haven't found one yet, and TBH this is probably a whole world unto itself. It's also probably too detailed to matter if I'm just trying to cure the chronic digital-itis of the iPhone and other digital cameras.

...and just for fun, here's the same iPhone shot from previously with the powergrade: If I apply the same blurs that I used to match the Zeiss to the Hawk, I get these: It's far too "dreamy" a look for my own work, but the Hawk lenses are pretty soft and diffused:
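For anyone curious what a "false sharpness" view can look like outside Resolve, here's a rough sketch of the general idea in Python with OpenCV - it just maps local high-frequency energy to a false-colour ramp. To be clear, this is not my powergrade, just an illustration of the concept, and the filenames and blur sigma are placeholders.

```python
# Rough false-colour "sharpness map" sketch: local high-frequency energy
# mapped to a colour ramp. Not the Resolve powergrade above, just the concept.
import cv2

def sharpness_map(path):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE).astype("float32")

    # High-frequency energy: absolute Laplacian response per pixel
    highpass = cv2.Laplacian(gray, cv2.CV_32F, ksize=3)
    energy = cv2.GaussianBlur(abs(highpass), (0, 0), 5.0)  # smooth into regions

    # Normalise to 0-255 and map to a false-colour ramp
    norm = cv2.normalize(energy, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
    return cv2.applyColorMap(norm, cv2.COLORMAP_JET)

cv2.imwrite("sharpness_map.png", sharpness_map("test_chart.png"))  # placeholder files
```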