Everything posted by kye
-
Canon EOS R5 / R6 overheating discussion all in one place
kye replied to Andrew Reid's topic in Cameras
That's my impression - we're still in the early days of h265 hardware support permeating through all the hardware, and then that hardware support permeating into all the software so it gets used. It makes sense that h265 would be the topic of the day and that we'd gradually get our act together, because... h266 has just been released! -
Those mirror attachments are a pretty ingenious device for getting a selfie screen when you need it. The alternative is to just go wide (anything under 20mm should be great) and stare down the barrel. I tried selfie shots (not vlogging, I promise!!) with an action camera and the width really made it work - not only because you don't have to get the framing completely right, but also because you don't have to hold the camera as far away as possible like it smells really bad or something. A 15mm-equivalent FOV can be held at about reading distance and you still get a nice frame that doesn't feel too cramped or too intimate.
-
Speaking of Voigtlander, I'm (finally) editing the trip we took to India in late 2018 and pulled out a few hero shots and did a basic grade to get a feel for the project and to get some inspiration. That was the first trip I took with the GH5 and Voigtlander 17.5mm and one of the concerns that I had was low light performance as the GH5 isn't known for it, but it wasn't until I got back to the hotel and looked at this footage that I realised how good the combination really is. Here's a sample grab of the tour guide when we went to Varanasi to watch the nightly religious celebrations and see the cremations on the shore. No idea if it was wide-open or not, but this is a basic grade, some NR, and some film grain over the top, no sharpening applied, and it's even got a small negative amount of mid-tone detail. GH5 was in 150Mbps 4K IPB and HLG. I can't recommend these lenses enough.
-
It ended up that I couldn't shoot in the ETC mode for some reason - it was disabled so not sure what that was about. Anyway, I just zoomed in post, which is pretty easy to do. I'd zoom in a bit more, or just adjust vignette in post. I should do more tests to see how much it sharpens up when stopped down. The Voigts are like that, two lenses for the price of one basically 🙂
-
Just did this Cosmicar 12.5mm C-mount lens test for a friend and thought I'd share.
GH5 shooting C4K in 400Mbps
Cosmicar 12.5mm F1.9, shot mostly wide open
Cine-D with Noam Kroll settings: Contrast 0, Sat -5, NR -5, Sat -5, Hue 0
No processing in post except zoom.
This combo seems like a winner!
-
That's really useful @Video Hummus. At one point the wife and I were talking about starting an online training platform where we'd sell online courses and content. One of the potential business partners (who is a high-end working pro and the source of one of the first courses) lives interstate, and I was contemplating how to set up a kit like this, ship it, and have them set it up themselves.
-
I don't have any real shoots planned for this setup - it was just part of getting to know the camera and being prepared, just in case.
-
Perhaps a basic question, but how much of what AP does is video vs stills? My impression was that AP was for print media - do they have a big video arm too, with journalists reporting from wherever? I only ever remember seeing the old "Credit: AP" / "Source: AP" type captions on photographs.
-
Played around and got this configuration which seems to meet all the criteria... It's a bit top heavy, but it's balanced (it literally stands up on the QR plate when placed on a flat surface). You can get to the handle, at least from the left, without touching anything but the handle, and the big hole in the vertical support even lines up very nicely with the coiled cable for the mic making a convenient cable tidy! It's not exactly compact or elegant though!
-
@Sage on the GH5 have you compared the 1080p 422 200Mbps All-I vs the 4K 422 400Mbps All-I vs the 3.3K 422 400Mbps All-I anamorphic modes? IIRC you said that the 1080p 200Mbps mode had better colours than the 4K 400Mbps mode, but I don't recall any comments around the 3.3K 422 All-I 400Mbps mode. It has fewer bits/pixel (2.7x the pixels and only 2x the bitrate), but that bitrate covers the whole image, which contains the same amount of tonal variation in the subject (you've got the same FOV and twice the bitrate to describe it), so when watching full-screen the pixels from the 3.3K mode aren't quite as good but they're much smaller, so maybe that offsets it? If you haven't tested it but are curious, I can share some sample stills with skin tones and a colour checker for you to look at.
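As a quick sanity check on that arithmetic, here's a small sketch; the frame dimensions and 25p frame rate are my assumptions for the GH5's modes, so treat the exact numbers as illustrative:

```python
def bits_per_pixel(width, height, bitrate_bps, fps=25):
    """Average bits available per pixel per frame for an All-I codec."""
    return bitrate_bps / fps / (width * height)

# Assumed GH5 recording dimensions: name -> (width, height, bitrate)
modes = {
    "1080p 200Mbps All-I": (1920, 1080, 200e6),
    "4K 400Mbps All-I":    (3840, 2160, 400e6),
    "3.3K 400Mbps All-I":  (3328, 2496, 400e6),  # 4:3 anamorphic
}

for name, (w, h, rate) in modes.items():
    print(f"{name}: {bits_per_pixel(w, h, rate):.2f} bits/pixel/frame")
```

On these assumptions the 1080p mode gets roughly twice the bits per pixel of the other two modes, which is the trade-off being weighed here.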
-
Haven't had a chance to look at the above links, but I found this recently which is an interesting comparison of two lenses, listing their pros and cons and with comparison footage:
-
Follow-up question. I have a grey card, so I can do a custom WB. Is there any way to reliably set levels with it on the GH5? I don't recall there being anything that will tell you what IRE something sits at.
-
Yes, that's exactly what I was thinking - to design my own view-assist LUT and just be able to turn it on and off. IIRC there are two zebras and you can set them to whatever you like, but I got the impression you could only choose one of them at a time, so I don't think you could have zebras showing 40-60IRE, for example. If I could design my own view-assist LUT, I was thinking of something I could use all the time instead of turning on and off: make the middle range colour (and maybe expand it a little), make the shadows and highlights compressed and B&W, and use bright yellow and bright red for clipped and black. Something technical enough to tell you about your exposure, but that wouldn't make me creatively blind to what I was shooting - considering that I shoot my holidays and family events and have basically no control of a situation whatsoever, I want to be able to react to new things going on, etc. Oh well.
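As a thought experiment, a view-assist of that kind can be expressed as an ordinary .cube 3D LUT (e.g. for a monitor or NLE); the thresholds and warning colours below are arbitrary choices of mine, not anything from Panasonic:

```python
# Sketch of a custom view-assist LUT: midtones stay in colour, shadows and
# highlights go B&W, crushed blacks turn red, clipped whites turn yellow.
# All thresholds are arbitrary placeholders.

def luma(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec.709 luma weights

def view_assist(r, g, b):
    y = luma(r, g, b)
    if y < 0.05:
        return (1.0, 0.0, 0.0)   # crushed blacks -> bright red
    if y > 0.95:
        return (1.0, 1.0, 0.0)   # clipped highlights -> bright yellow
    if y < 0.25 or y > 0.75:
        return (y, y, y)         # shadows/highlights -> B&W
    return (r, g, b)             # midtones pass through in colour

def write_cube(path, size=17):
    with open(path, "w") as f:
        f.write(f"LUT_3D_SIZE {size}\n")
        for b in range(size):          # blue varies slowest in .cube files
            for g in range(size):
                for r in range(size):  # red varies fastest
                    out = view_assist(r / (size - 1), g / (size - 1), b / (size - 1))
                    f.write("{:.6f} {:.6f} {:.6f}\n".format(*out))

write_cube("view_assist.cube")
```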
-
Is there any way other than the V-Log View Assist to get a LUT into the GH5? HLG View Assist doesn't seem to allow external LUTs.
-
Great job @hmcindie! It's refreshing to see an action sequence without cuts at a dizzying pace; this felt much more reasonable and didn't seem like the edit was trying to make dull choreography more exciting, as is often the case. Funny ending too.
-
Blackmagic casually announces 12K URSA Mini Pro Camera
kye replied to Andrew Reid's topic in Cameras
Computer vision is about the camera understanding what it is seeing. From Wikipedia: "Computer vision is an interdisciplinary scientific field that deals with how computers can gain high-level understanding from digital images or videos. From the perspective of engineering, it seeks to understand and automate tasks that the human visual system can do." It could be about AF, in the sense of choosing what to focus on. In a sense, the logic of PDAF is something like this:
Look at all the focus pixels on the screen and go towards the middle / closer objects
Once things are broadly in focus, employ face / animal recognition to identify faces
Identify faces and rank which ones are the most important, and where to focus (eg, could be one face or many)
Use PDAF to focus on that face / focus the best on all target faces
The problem with modern PDAF focus systems is that when they go wrong, it's because they're focusing on the wrong thing, not because they have problems with focusing itself. I suspect that Panasonic with its DFD system is hoping to 'out-smart' the PDAF systems by being able to intelligently work out the scene using some kind of AI. For example, if you see a blurry human-shaped blob that is sitting on top of the rest of the image, it's pretty easy to conclude that it's a person (by the shape), roughly how far away from the focal plane it is (how out of focus it is and where the focal plane is now), that it's closer than the focal point rather than behind it (if other things are more in focus but this isn't), and that it's the closest object (because it's over the top of everything else). If it's also the biggest and completely in frame then it would be reasonable to assume it might be the focus of the shot too. I suspect that this is how human vision works, in a way. These things are pretty straightforward. PDAF is only good for telling if a given pixel is out-of-focus to the front or back, and by how much.
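Translated into a caricature of code, the ranking step might look something like this; the Face fields and the scoring heuristic are entirely hypothetical, just to make the idea concrete:

```python
# Toy sketch of 'rank the detected faces, then hand the winner to PDAF'.
# The scoring (bigger and more central wins) is a made-up heuristic.
from dataclasses import dataclass

@dataclass
class Face:
    area: float         # fraction of the frame this face occupies
    centre_dist: float  # 0.0 = dead centre of frame, 1.0 = at the edge
    defocus: float      # signed PDAF-style defocus estimate (0 = in focus)

def importance(face):
    """Hypothetical importance score: big, central faces score highest."""
    return face.area * (1.0 - face.centre_dist)

def choose_focus_target(faces):
    """Return the most important face, or None if nothing was detected."""
    return max(faces, key=importance, default=None)

faces = [
    Face(area=0.10, centre_dist=0.2, defocus=-0.5),  # large, near centre
    Face(area=0.03, centre_dist=0.6, defocus=0.1),   # small, near edge
]
target = choose_focus_target(faces)
# The camera would then drive the lens using target.defocus via PDAF.
```

In this sketch PDAF only enters at the last step, which matches the point that modern AF systems fail by choosing the wrong subject, not by failing to focus.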
At the point when you have an AI engine that can look at an image and understand it, blurry things and all, there is no advantage to having PDAF. Our eyes don't have PDAF - we have killer AI, and our eyes are pretty good at focusing on objects and basically never get things wrong. I suspect that's what Panasonic is doing with its computer vision people. If they get it right and fulfil the promise of the technology, then it will out-perform other systems because it will understand a scene even when everything is out of focus. Of course, they're a long way from that being market-ready at the moment. -
A little late to this party, but good stuff in here @leslie! Firstly, great to get out and shoot. Those models you've got there don't look particularly bright - just hanging around each other like sheep. You're in good company with the FD lenses. I've got the 70-210/4 and a 2X TC, which I used yesterday to film my son playing football and again today to shoot a camera test. I was very confused about how to put the FD mounts together, and now I just keep the lens, TC and adapter as one piece. It works, as I don't have any other FD lenses. The mount sure is strong though. The good company I speak of for FD lenses isn't me, it's this guy: https://www.youtube.com/c/MatteobertoliMe/videos He's a professional DP, and a talented one at that, and you'll find a few videos he's shot with the P4K and FD lenses. He's since upgraded to a P6K and Leica lenses, but the fact he had FDs before that implies they were a second preference, which is high praise from him. Good score on those lights. I'd suggest the first job is to put out the fire that appears to be burning the entire earth from below in this photo though: In regards to covid, we're also watching from the west with concern and disapproval.
-
I have noticed this in a couple of comparisons between BM and Alexa. This is one: This is another: (The shots by the window show the window frame to be very green on the Alexa). Both tests were shot by professional DPs, so I can't imagine that they both screwed up in the same way. Of course you can fix these things in post, but it's interesting to note.
-
OK, I made a mistake. The 3.3K mode isn't h265, it's h264. Damn. That means it's very similar to the 4K 400Mbps mode, except it's 4:3 anamorphic:
4K 422 10-bit All-I 400Mbps h264
3.3K 422 10-bit All-I 400Mbps h264
I thought that all the modes in the anamorphic menu were h265, but it looks like that's not true. Double damn. More reading required. I also just shot the mode comparisons, but Resolve won't load the 5K h265 files for some reason, so I'm working on that.
-
Thanks. I also figure that the colour checker and skin tone tests will show colour macro-blocking if it exists. I guess a follow-up to my follow-up is to ask if anyone actually needs to see a comparison? I mean, when I'm watching something incredible on Netflix I don't need to see their attempts at shooting it on a different camera to appreciate the IQ of Dark, The Queen, Stranger Things, etc.
-
Follow-up. If I stress test this codec vs the 5K Long-GOP 420 codec, the 4K 400Mbps All-I codec and the 1080p 200Mbps All-I codec, what should I film? I'm thinking:
Colour checker plus my face under tungsten light
Colour checker plus my face under natural outdoor light
General external scenes
I also have access to a city where people walk around in outdoor malls, and a beach where there are often very few people. What would you like me to include?
-
I'm not saying that the US doesn't get hot - obviously it does. What I am saying is that camera overheating tests, by manufacturers as well as by bloggers / vloggers / news sites, are typically done in very mild conditions, e.g. Canon's official testing at 23 degrees. Here in Australia we keep our air-conditioning at 24 degrees, so their test isn't even indicative of using their cameras indoors - let alone outside! It's great to see someone who lives in a genuinely hot climate testing overheating - it's not a common thing to see.
-
Let us know how you go if you use it. Apart from the advantages of 422 and All-I, I also think the 3.3K resolution might be an advantage over 5K. IIRC many Alexas are 2.8K or 3.2K but upscale in post, and I believe that the softness this creates (along with no / low compression) is partially responsible for the filmic look they are prized for.
I did a bunch of googling last night to see if I could find signal-to-noise comparisons for Prores vs h264 or h265, but couldn't find much. I have found in the past that h265 is about twice as efficient as h264 for the same IQ, so I am comfortable taking that as a rule of thumb. I also found the article by @Andrew Reid and thread by @KnightsFan comparing h265 and Prores: https://www.eoshd.com/news/new-h-265-codec-test-prores-4444-quality-1-file-size/ which indicated that h265 is something like 50X as efficient as Prores 4444. That would make 4K h265 at 200Mbps equivalent to 4K Prores 4444 at 1061Mbps, which intuitively doesn't ring true to me, as no-one is seriously talking about consumer cameras having h265 codecs as good as Hollywood-grade intermediaries.
If I had an external recorder I could directly compare the two, but unfortunately I don't. Can you record with an external recorder at the same time as internal on the GH5? If so, someone could do direct testing with the same image stream. One thing I did see is that various encoders have varying levels of quality at the same bitrates, so we can't rely on the encoders in our NLEs to be a reliable proxy for what the GH5 is doing internally - we'll have to film real tests. However, one thing that I do find promising is that Prores is an older codec, and compression is something that gets steadily better with time, so in general a newer codec should be better than an older one given the same bit-depth, bit-rate, colour subsampling, and All-I mode.
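The back-of-envelope equivalence can be written down explicitly; the 2x figure is the rule of thumb mentioned above, not a measurement, and real encoders vary widely:

```python
# Rough codec-equivalence arithmetic. H265_OVER_H264 is an assumed rule of
# thumb for equal image quality, not a measured value.
H265_OVER_H264 = 2.0

def h264_equivalent_mbps(h265_mbps, factor=H265_OVER_H264):
    """Approximate h264 bitrate needed to match a given h265 bitrate."""
    return h265_mbps * factor

# e.g. a 200Mbps h265 stream would need roughly 400Mbps of h264
print(h264_equivalent_mbps(200))
```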
Incidentally @Video Hummus, I shot a few test clips of the 3.3K mode yesterday and my 2016 MBP + eGPU was almost able to play the files in real-time. In fact, with Resolve set to show every frame, it was slow at the start of each clip but came up to speed within a couple of seconds and was then able to play forwards and backwards at 25p. I think it was maybe the overhead of the SSD loading the file, not sure. In comparison to the Long-GOP 5K h265 or 4K h264 codecs, it was a night and day experience. I normally render 720p Prores Proxy files for editing, as they're small enough to fit onto the SSD in my MBP and they cut like butter even with complex colour grades, but my challenge was the fine-tuning stage of the colour grade, where I switch back to the original files, which I store on a spinning disk. Essentially I'm only using the originals for tweaking and for tracking and stabilisation (which benefit from the extra resolution), but the 5K and 4K modes are painful to work with. The 3.3K mode seems like it would be very workable, especially when I upgrade my MBP from a dual-core to a quad-core in a month or two. Pretty much the only downsides to this codec that I can see are the storage space (although it's similar if you shoot 4K 400Mbps) and the need for expensive UHS-II cards, but to get an 800Mbps-equivalent codec I consider the cards a camera upgrade rather than a nuisance.
-
@Kisaha I guess if we can get a side-benefit from global warming then that's great, but I think on balance I'd rather it be the other way!!
-
Blackmagic casually announces 12K URSA Mini Pro Camera
kye replied to Andrew Reid's topic in Cameras
Colour science is incredibly difficult. I'm the first to admit that I'm rubbish at colour grading, and this is why I am attracted to it and spend a lot of time doing experiments and trying to learn. We've previously seen that Sony has the most accurate colours when tested scientifically, but they are regarded by many as aesthetically displeasing, so the secret is in the sauce, as they say. I've been on a mission to understand what is in that sauce, and so far have attacked this in a few ways:
I've reverse-engineered a couple of the film-emulation LUTs in Resolve using standard grading tools
I've bought the GHAlex LUTs and reverse-engineered them using standard grading tools
I've bought a BMMCC and a colour chart and done indoor and outdoor comparisons trying to match the GH5 to the Micro
I've reconstructed most of the node graphs from the Juan Melara videos to understand what he is doing and why
I've done numerous side-by-side tests with my GH5, Canon 700D, Canon XC10, GoPro and iPhone, matching the colours in various combinations to each other
I've also graded real footage that I shot, and struggled through trying to repair the vast quantity and variety of cruel and unusual mistakes I made while shooting, effectively putting myself through the colour grading equivalent of special forces training (it's still uncertain if I'll complete the course alive - I'll let you know...)
The reason I say all this is as a prelude to saying this: what I have found is a Pandora's box of craziness. There are colour tweaks inside the cameras we talk about, inside the LUTs from manufacturers and highly skilled colourists, in the colour space transforms, and elsewhere - tweaks that are tiny, numerous, complex, and often make no sense. They take place in colour spaces that have probably never been mentioned on EOSHD; they do things that are not possible in FCPX or PP, and maybe not even possible in Resolve or Baselight.
I have developed a relatively solid ability to reverse-engineer a grade given side-by-side footage. Not perfect, but solid. But I am absolutely nowhere when it comes to making adjustments in the service of making a shot look great - let alone strange, parallel-universe-type adjustments. And even that isn't enough: manufacturers are in the business of making these parallel-universe, mind-bending transformations in service of making every shot look nice, even when filmed by people they've never met in locations they've never been. It's taken everything I have done over a period of years to get to the point of realising just how much there is I don't know about colour.