Everything posted by kye

  1. Interesting lens. Panasonic released one for MFT but I've never heard anyone talk about it. In theory you could use the 5K 4:3 mode to get some reasonable resolution too. I suspect that a significant application of this could be virtual presence at events like concerts or weddings.
In years past I was into high-end hifi and making and recording music, and the common approach to recording an acoustic concert (basically anything in a concert hall) is to put a pair of microphones directly above the second row and a pair high in the ceiling nearer the back, and to mix them together so that when you listen on a high-end stereo setup it replicates the experience of attending the concert. Obviously audio is a little different to video, so for video you'd want to put the 3D camera front dead-centre for the best view.
I can imagine wedding videographers offering a VR package where they put one or two of these in the good spot so that the whole event can be re-lived. Concerts might end up having a lot of them placed in various places, and maybe you can swap between them as you like, or even provide an experience where every so often you move to a different vantage point to keep it interesting.
One thing that's fascinating and that we haven't really worked out yet is how you "edit" VR footage. ie, I've seen people talk about how to transition from one place to another (basically you do it slowly, through a dissolve or a fade down/up, so that people don't get disoriented - we're not that used to teleporting!) but we haven't really worked out much more than that. It's an interesting space.
  2. They wouldn't be the first people in history to end a relationship and get closure by bringing something new into the world, but I'd imagine it's frowned upon by their future partners....
  3. kye

    Panasonic GH6

    Looks like they're taking design cues from the S1H... Let's hope that the GH6 isn't too big... I remember walking around Pompeii with my GH5/Prime/VMP+ combo in one hand and having a sore wrist just from carrying it for 3-4 hours. Maybe there's a set of new "physical preparedness" exercises we need to be doing, where we carry a brick around to build up strength. Like marathon training: 1 minute the first day, then 2, then 5, then 10... so that by the 6-week mark you can carry two bricks all day without straining any ligaments!
  4. kye

    Olympus OM-1

    I'm not an expert in sensor tech, but Sony did a pretty amazing job of improving the ISO performance of their A7S line over the normal FF sensors of the time, so if that can be done then maybe similar improvements are possible in MFT sensors. If the GH5 is a reference point for ISO performance on MFT sensors then I'd suggest that we aren't anywhere near what is possible! I'm not saying anything definitive here, just that Sony showed that the tech can really be pushed - presumably with serious investment - and maybe that effort could bring significantly better low-light performance to MFT cameras.
  5. kye

    Panasonic GH6

    That tech truly was incredible to see. It's nice to see that at least one company making money with product after product has decided to innovate - in anything other than their cripple hammers!
  6. kye

    Olympus OM-1

    It sounds like it might be a camera for adventure photographers, but also one that would now be "enough" if that person decided to do video or start a YT channel on the side. I can imagine someone new to video being told "if it has 4K60p 10-bit HLG then that's all you need". I see quite a few outdoor photographers on YT who started their channel to talk about their photography and began by using their main camera.
The HLG wikipedia page is worth a review: https://en.wikipedia.org/wiki/Hybrid_log–gamma. Maybe I mis-spoke about it not being a real standard? But I think "HLG" being used to market a product is more like saying something is "HDR". I shoot HLG on the GH5 and then convert from rec2100 to rec709. In Resolve you can do this via a CST in the node tree, or you can enable Resolve Colour Management (IIRC in the project settings) and then set the Input Colour Space of the clips to rec2100. The colour management behaviour of Resolve is complicated, so that's kind of a different subject.
I know it's not either rec2100 or rec2020 because I took a controlled test scene and shot three clips at -1, 0, and +1 stops. I brought them into Resolve, converted them with a CST as described above, and adjusted the exposure on the -1 and +1 clips to match the 0. If the colour space I was transferring them from was correct, the shots should all have matched, but they didn't - IIRC there were strange differences in the gamma curve. That's how I knew they weren't actually those standards.
However, when I set Resolve to Colour Managed with the Input Colour Space set to rec2100 (I think I use "Rec2100 (Scene)" - not completely sure), certain controls in Resolve become "colour space aware" and adjust their behaviour to act appropriately for that colour space. IIRC this includes things like WB and the LGG wheels, and the WB definitely does a pretty good job in that mode, so I'm actually really happy with how that works.
I have a bunch of "hero" shots copied to the SSD in my laptop from basically every camera I've ever owned, including the 700D, 700D + ML, Canon XC10, BM OG BMPCC, BM BMMCC, iPhone 6, iPhone 8, GoPro Hero 3 with Protune, Sony X3000, GH5, and probably others. I tried that mode by setting the input colour spaces on all of those and then grading all the clips, and it did a pretty good job across the various colour spaces and colour sciences, making it straightforward to match them, which is a pretty good test IMHO.
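The rec2100-to-rec709 conversion described above (a CST in Resolve) can be sketched numerically. This is a minimal sketch under heavy assumptions: it only handles the HLG transfer function per BT.2100 and a naive 2.4-gamma display encode, and ignores the gamut remap and highlight tone-mapping a real CST also performs.

```python
import numpy as np

# HLG constants from BT.2100
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e):
    """HLG OETF: scene-linear light [0,1] -> HLG signal [0,1].
    Note: np.where evaluates both branches, so values below B/12
    produce a harmless warning in the log branch."""
    e = np.asarray(e, dtype=float)
    return np.where(e <= 1/12, np.sqrt(3 * e), A * np.log(12 * e - B) + C)

def hlg_inverse_oetf(ep):
    """Inverse OETF: HLG signal -> scene-linear light."""
    ep = np.asarray(ep, dtype=float)
    return np.where(ep <= 0.5, ep**2 / 3, (np.exp((ep - C) / A) + B) / 12)

def hlg_to_gamma24(signal):
    """Naive HLG -> 2.4-gamma display encode (no gamut mapping)."""
    linear = hlg_inverse_oetf(np.clip(signal, 0, 1))
    return np.clip(linear, 0, 1) ** (1 / 2.4)
```

A real pipeline would also convert the rec2100 primaries to rec709, but the transfer-function step above is where exposure-matching tests like the one described would reveal a mismatched gamma curve.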
  7. kye

    Olympus OM-1

    The GH5 has "HLG" but it turns out that HLG isn't a standard, it's more like a concept. Rec2020 and rec2100 are standards, but unfortunately the GH5 HLG isn't either of them (I tested it myself). I use rec2100 in Resolve and it does a pretty good job, but having HLG isn't worth nearly as much as it actually being a standard.
  8. kye

    Olympus OM-1

    From the above...
    4K60 10-bit internal
    2K200 10-bit internal ("slight" crop)
    HLG
    H.265
    ProRes RAW externally
  9. kye

    Olympus OM-1

    Real-world review from a brand ambassador... No doubt more to come. There are chapter markers to get to the video bits - Chris does lots of stills, so they're featured prominently too.
  10. My take is fundamentally different, and I expect to get pushback on this, but I don't think it's getting worse at all. Not picking on you @Marcio Kabke Pinheiro - the "it's all going wrong" narrative is pretty common around here.
I think the fundamental issues are to do with humanity and the aggressive parts of our nature. That hasn't changed. Anyone who thinks the current period in history is bad should review history and see how things used to be. Life used to be brutal and short. In almost all cases "might made right" and the golden rule was the only rule - he that has the gold makes the rules.
Social media isn't bad at all, it's just an echo chamber. If you don't like the echoes you're hearing then find another chamber. Actually, find 27 other chambers. Find something that inspires you. My favourite YouTubers are metalworkers and woodworkers and are building their own homes or cabins in the woods or write music or cook or talk about town planning or explain philosophy or mathematics or physics or economics or travel or living in other cultures/places or.... <and I stopped here, but I was only up to G in my subscriber list>
This advice goes for me too, as I need to spend less time on social media. I'm trying to read more. I'm actually looking for more hobbies, as the pandemic has limited things for so long that I need some new things in the mix to break things up and inspire me again.
  11. The film curves that I have seen typically have about 5-6 stops in the "linear" range, then hit the roll-offs pretty hard after that, so your curve seems to be right around what I would expect. I recall a landscape photographer I watch on YT taking out a medium format camera with an older stock that had something like 5-6 stops in total. After carefully metering the scene he took a shot of some clouds. The shot was absolutely amazing - the moodiest clouds I have ever seen! Just incredible contrast, but looked great because nothing was clipping and the image was thick as hell. In colour grading I've found a consistent challenge is how to get the 10+ stops of camera DR into the 709 container and not make it look unnatural - the rolloffs on your LUT look pretty nice in that regard 🙂
  12. kye

    Lenses

    Yeah, I'm in some vintage lens groups and people are constantly buying lenses, matching, cinemodding / rehousing, and completing sets, which then get casually listed for 4-, 5-, or even 6-figure sums. There are also people new to it all, just buying their first Helios or CZ - it's a real mixture. A month or two ago someone posted asking for info on a prototype lens which he believed to be from Panavision. Apparently it was a development prototype of a lens that never made it to production; something happened and the person was told to throw it out, so they put it in the bin at their desk, and someone else immediately took it out and took it home lol.
  13. kye

    Panasonic GH6

    Rolling shutter is a problem in certain situations, mostly ones that come up in videography. One is when there's lots of motion and you stabilise in post; the other is when you're filming something moving - like filming out the window of a moving vehicle, which is something I do a bit. You're right that it's not horrible though - I remember the Sony A5xxx and A6xxx ranges being way worse. I also agree that DfD has huge potential, but people are just sick of waiting for it, because AF seems to have become something that people can't live without. I personally don't care, as I manually focus almost exclusively, but the wedding videographers who want to run around with a gimbal and keep the bride in focus at F0.5 seem to shout down anyone who suggests that there are other options. I watch lots of YouTube channels that use Canon and Sony PDAF and they have out-of-focus shots regularly - some vlogs have more than the GH5 people used to have back in the day. No-one seems to talk about that, or about when it focuses on the background instead of the vlogger in the middle of the frame.
  14. kye

    The Aesthetic

    "looks less and less like reality, HDR and high resolutions make for a fake look that one does not see with our eyes in real life" .......well said. I think this is the fundamental aspect that I dislike - the image looks unnatural because it's too sharp. Reality just isn't that sharp, so I think it gives a completely artificial look. Lots of images look to me like they've been slightly embossed.
Video and film are essentially creative fields, so everything is allowed: using this hyper-real look for Transformers or an action movie or a tech startup is appropriate, but using it for a romantic comedy or period piece is silly. People have taken "cinematic" and made it apply to something that never actually occurred in the cinema. So we all end up with specialist VFX cameras that fall short in other ways. No problem if you're creating a spectacle, but is it really the best situation for Canon and BM and Sony to be making cameras for Michael Bay and thinking that we all do the same type of work as he does? I don't think so! I'd much prefer the camera they'd make for Werner Herzog!
There will always be that element - trendy stuff sells. But in the background there will always be people making things of substance that stay relevant for 10 years instead of 10 days. Maybe you're watching the wrong stuff 🙂
Interesting thought about specialist cameras - I agree. Phantom have done well to carve out the high-speed niche. In a sense ARRI have too, with the best-colour, lower-resolution space (I hesitate to call this a niche as it's really all non-VFX films and TV). I wonder about the GH6 and whether it will try to do something super-well or be an all-rounder at launch.
"if you have a lot of talent and money you can get away with being more subtle".. totally agree, but I would rephrase it to be less subtle: I'd say that if you have talent and almost any budget at all then you can make a film worth watching without having to resort to party tricks from the tech to keep people watching.
  15. I suppose that it could happen - it's easily possible to build a LUT where multiple input values go to the same output value, in which case reversing that adjustment would be impossible, although I'm guessing it's not likely. I would suggest that the Panasonic colour science may be compressing a certain part of the colour space (pulling skintones closer together is flattering, so it's definitely a thing that happens) but I doubt they'd completely crush anything. That's why I was suggesting a higher-resolution colour chart - it might be that two patches are very similar, but if you knew what skewing was happening around them then you might be able to see what was going on and find a way around it.
I did a bunch of tests in Resolve by using two 10-step generators (one horizontal and one vertical) to create a 10x10 grid of things like Hue vs Saturation etc, and pointed the camera at the monitor in a dark room. You can "zoom in" to the hues and see what the GH5 is doing. That's relatively easy and might be a quick exercise to get a feel for those trouble spots. You could even set up the computer with one of those grids, film it with the GH5, put the GH5 feed into a monitor that applies the LUT and displays a vectorscope, and then in Resolve play with the Hue and Sat controls in the LGG panel to move the grid around and watch what happens on the vectorscope. If it pinches or ripples or something then it should be pretty obvious. I wouldn't suggest that a monitor is a good source of light, but it might be useful for getting a sense of what the colour profile of the GH5 is doing.
I'd either shoot HLG, which isn't quite rec2100 or rec2020 but is close enough, or I'd shoot CineD. The challenge is that I shoot in 24p, 60p, and 120p, and the HLG profile isn't available in 120p mode, so for a while I was shooting CineD on all of them so I'd have the same colour in post. Since then I've changed back to HLG, with CineD just in 120p, because I can match things easily enough in post and I appreciate the extra DR. I also shoot the same projects (trips) on the Sony X3000 action camera (prior to that it was a GoPro) and my current iPhone of the time. There aren't ACES profiles for either of them, so there was no advantage in the ACES compatibility of the V-Log upgrade for the GH5.
I kind of went down the colour grading rabbit hole and ended up buying the BMMCC in order to copy its colour science on the GH5 (as I couldn't afford an Alexa to compare with), but since then I've learned a lot about colour grading, how to get good colour, and what that means, so I'm now less concerned with that detail. Since then Resolve implemented the Colour Managed feature, which does a great job of adjusting WB etc in post with the rec2100 colour space. I also worked out how to get nice results when grading under a film emulation LUT (I like the 2393 LUT quite a bit). Now I'm more focused on cinematography and editing, as I can get colour that's good enough for my purposes. I'm not winning any awards, that's for sure, but I still wouldn't be even if I managed to perfectly replicate one camera with another.
On the colourist forums there is thread after thread of people getting super detailed about tiny little details to do with film emulation, and then in other threads there are casual comments mentioning that a PFE LUT is one of the 20+ adjustments they typically make for a look, or that on real projects they can get away with one or two nodes instead of a PFE to give a bit of a film vibe, because the rest of the look is obscured by the other adjustments. The number of times an authentic film look is really required seems to be pretty low, and most people find it a good-enough 8mm simulation if you just blur, add gate weave, apply a simple HueVsHue and HueVsSat, and apply a ton of grain. I've seen threads where people say they've never seen a film grain emulation that looked remotely real - having worked with film for 30 years they're especially attuned to it - but then you've got Walter Volpatto (about as senior a colourist as you get) saying he doesn't even add grain on most projects anymore because streaming compression kills it.
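The 10x10 Hue-vs-Sat test grid described above (built in Resolve with two 10-step generators) can also be generated directly. A minimal sketch with numpy and colorsys; the layout (hue across columns, saturation down rows) and the fixed brightness value are my own assumptions, not a Resolve recipe.

```python
import numpy as np
import colorsys

def hue_sat_grid(steps=10, value=0.75):
    """Build a steps x steps test chart: hue varies across the columns,
    saturation increases down the rows. Returns a (steps, steps, 3) RGB array."""
    grid = np.zeros((steps, steps, 3))
    for row in range(steps):
        for col in range(steps):
            h = col / steps            # hue: 0..(steps-1)/steps around the wheel
            s = (row + 1) / steps      # saturation: 1/steps at top, 1.0 at bottom
            grid[row, col] = colorsys.hsv_to_rgb(h, s, value)
    return grid
```

Displayed full-screen in a dark room and filmed, this kind of chart lets you zoom the vectorscope into the skin-tone region and watch whether the camera profile pinches or skews the nearby hues.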
  16. I think this is where the art and true deep knowledge come into this whole challenge. I don't know what tools or techniques you're using to make your adjustments, but there are a number of ways to accomplish something, and certain ways are more or less likely to break an image. I highly recommend watching Juan Melara's YouTube videos where he matches things (and if you can find it, he did a video replicating the Linny LUT but pulled it down - maybe it's available somewhere else). I suggest this because he does a bunch of really cool things using alternate colour spaces (HSL, YUV, Lab, and more) and using tools that won't break the image, such as the channel mixer or curves. It's obvious that Juan can look at the response of a LUT or look, see the big picture, and know what global tools can align to that look the easiest way without breaking it.
Have you read a lot of resources about film emulation? Happy to share my bookmarks on it if you're interested. What reference setup are you using? I would imagine that you'd want to use as high a quality of light as you can (definitely a black-body radiation source) - either the sun or a halogen lamp - and then just ignore all other lights as they will be inferior. The other challenge will be the shape of the RGB spectrum sensitivities. The way that Juan has matched these is with the RGB mixer, although there might be some situations where that won't be completely effective, not sure.
Yeah, huge variation exists between batches, processing methods and labs... Steve Yedlin had a lot to say about that in this article, which I'm assuming you're familiar with: http://www.yedlin.net/OnColorScience/
A phrase for colour matching that I really like is "in the same universe". It accounts for how closely you need to match individual shots in an edit, but it also accounts for what @MrSMW says about getting used to the look while watching a film, which is absolutely a factor too. We never watch the same scene through two different cameras / processes / grades, so differences have quite a degree of tolerance.
I have a theory that making a grade that allows for WB adjustments should be completely possible. I never got around to trying it, but my theory is this. The camera does things in a certain order:
1. light comes in and hits the sensor, with its spectral sensitivity
2. the camera applies the WB
3. the camera applies the colour profile
I think the secret is to organise your adjustment so that it peels the onion by reversing the order of operations, like:
1. undo the colour profile (GH5)
2. undo the WB (GH5)
3. adjust the spectral sensitivity from the source camera (GH5) to the target (film stock)
4. apply the WB adjustment of the target (film stock)
5. apply the colour properties of the target (film stock)
The challenge is to separate the three layers. In my case I was matching two digital cameras, so you could just take a RAW still image in each, which allows you to separate the colour profile from the sensor and WB. I think you could potentially still apply some of this logic by building the adjustment in a modular way and shooting the colour checkers in a range of WB situations (maybe using gels?). Then you might be able to make the WB and spectral sensitivity adjustments in their own nodes, and see if they are compatible with the same colour profile when matching the reference images. The order of operations isn't completely clear in my head, but I think you need to do controlled tests to work out the spectral sensitivities first using a proper WB, then the WB adjustment by shooting RAW stills at different WB settings, then the colour profile. Hopefully that made sense? It would require getting the order of operations completely down and executing in a meticulous way, but if I'm right then it should be doable. If you make it modular like I suggest, then once you've done the GH5 version the Canon RAW version would be super easy: it would only require reverse-engineering the spectral sensitivity and WB on the Canon and then applying the properties of the film stock, which you've already done.
Juan Melara's discussions of how he made his BMPCC 4K and 6K to Alexa conversions, and his posts in my GH5 to BMPCC thread, were very interesting; I think because he was going from RAW to RAW he didn't have to nullify the colour profile of either camera, just deal with the two lower levels.
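The peel-the-onion ordering above can be sketched as function composition. Everything here is a hypothetical stand-in: the spectral mixes are made-up 3x3 matrices and the colour profiles are simple gamma curves, not real GH5 or film-stock data. The point is only the order of operations and its reversal.

```python
import numpy as np

def make_pipeline(spectral, wb, gamma):
    """Toy camera model: linear light -> spectral mix -> WB gains -> profile curve.
    Returns (forward, inverse) functions; inverse undoes the stages in reverse order."""
    def forward(rgb):
        x = rgb @ spectral.T                       # 1. spectral sensitivity (3x3 mix)
        x = x * wb                                 # 2. per-channel WB gains
        return np.clip(x, 0, None) ** (1 / gamma)  # 3. colour profile curve
    def inverse(rgb):
        x = np.clip(rgb, 0, None) ** gamma         # undo the colour profile
        x = x / wb                                 # undo the WB
        return x @ np.linalg.inv(spectral).T       # undo the spectral mix
    return forward, inverse

# Hypothetical source ("GH5-like") and target ("film-stock-like") pipelines.
src_fwd, src_inv = make_pipeline(np.eye(3), np.array([1.1, 1.0, 0.9]), 2.2)
tgt_fwd, _ = make_pipeline(np.array([[0.90, 0.10, 0.00],
                                     [0.05, 0.90, 0.05],
                                     [0.00, 0.10, 0.90]]),
                           np.array([1.0, 1.0, 1.0]), 2.4)

def match(rgb):
    """Undo the source camera layer by layer, then apply the target's layers."""
    return tgt_fwd(src_inv(rgb))
```

The modularity pays off exactly as described in the post: swapping the source camera only means swapping `src_inv`, while the target (film stock) half stays untouched.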
  17. I completely agree. I'm a solo operator, so I do literally everything, and I'm very aware that editing is limited by the footage captured - in fact one of my primary goals in learning editing is to shoot better footage in the first place. My biggest challenge in editing is not having a clear understanding of what I'm actually doing, but the second biggest has been working with my own rather mediocre footage! Everything in film-making should have an effect on the end result, either directly or indirectly, but the way the industry is segmented makes it hard to get that overall understanding. There are lots of resources available for each step in the process, or even each aspect of each profession within each step, but they mostly focus on doing things a certain way "because that's how it works" rather than explaining the actual end result. Alternatively, there are people who are solo operators or work across many more departments and have a broader view, but they either don't know how to do things well, or do things really well but don't know how to explain what they're doing and why. I've decided to devote my efforts in 2022 to learning editing (and sound design), as I feel it's the part of film-making I know the least about, and I think it might be the part that has the most to teach me about the overall process. Welcome to the forums!
  18. I don't have V-Log so can't apply it to my own footage, but I have a few thoughts.
To me, colour charts can only get you so far with skintones, as they don't have enough hues to work with. Assuming you're shooting your own film and can point the camera at whatever you want, I'd recommend making your own colour checker with a much more detailed set of hues around the skin tones. You could do this the arts-and-crafts way and mix some paints: get a hue that is quite red and one that's quite yellow, then blend them together in various proportions to create a number of steps that smoothly go between the two. If you then mix each of those with white in greater proportions, you should get a version of each hue in gradually diminishing saturations. I'd let them dry, then compare them to your own skintones and see if they're covered. You might have to try this a few times to get the source colours and strengths right, but you should be left with a grid that covers the pie in the vectorscope around the skin-tone indicator quite well. The other way is to do it digitally and then have it printed; then compare it to real skintones and iterate if required. These tests won't be calibrated the way a colour checker is, but they'll give you the ability to really dial in the response of that critical region, which is useful as long as you can shoot test shots from both cameras under identical conditions. The other dimension is to shoot the charts at a number of exposures, to capture what happens to hues when they are exposed lighter and darker.
Another thought: if you're not doing it already, you should check with a LUT stress-test image to ensure you're not accidentally breaking the image. I use this one and find it very useful: https://truecolor.us/downloads/lut-stress-test-image/
Good luck with this - I've done a lot of camera matching over the last few years and it's a frustrating but interesting challenge, and very educational. Actually, doing this is a spectacular way to get a deeper understanding of colour science. I've matched cameras on multiple occasions; it's an amazingly difficult technical exercise if you want a good match, and I've learned a lot each time.
LUTs have a poor reputation which is mostly undeserved. Yes, there are lots of YouTube camera bros out there selling LUTs as the answer to getting good colour, but someone writing a bad book doesn't mean that literature is worthless. Apart from the fact that manufacturers supply technical LUTs to take their various LOG formats to a 709 colour space, the use of film-emulation LUTs is one of the best-kept secrets of the professional colour grading industry. I've heard that the majority of TV shows and movies involve a film-emulation LUT of some kind. It makes sense - if you're a professional colourist looking to get the best results in the shortest possible time, you'll use whatever tools can do that. It doesn't mean that those using a LUT aren't knowledgeable enough to have created it; some in fact did create their own film-emulation LUTs for their own use and their own secret sauce. Some even wrote their own plugins in a scripting language with all the mathematics involved. Colourists save their own PowerGrades to apply quickly, and a LUT is simply a type of PowerGrade. Using a LUT doesn't mean the skill of a colourist isn't required. It's the same as everything else: an Alexa and Master Primes don't make you a great cinematographer, SkyPanels and haze don't make your film look wonderful, and using an NLE doesn't mean you don't have to know how to edit.
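The digital version of the DIY skin-tone chart described above can be sketched as a simple blend grid: one axis blends a red paint toward a yellow paint, the other mixes in progressively more white to cut saturation. The two endpoint colours here are hypothetical picks, and linear RGB blending is a rough stand-in for how real paints mix.

```python
import numpy as np

def skin_tone_chart(red, yellow, hue_steps=8, sat_steps=5):
    """Grid of hues blended between `red` and `yellow` (columns),
    with progressively more white mixed in down the rows (lower saturation).
    Returns a (sat_steps, hue_steps, 3) RGB array."""
    red, yellow, white = map(np.asarray, (red, yellow, (1.0, 1.0, 1.0)))
    chart = np.zeros((sat_steps, hue_steps, 3))
    for col in range(hue_steps):
        t = col / (hue_steps - 1)
        hue = (1 - t) * red + t * yellow   # blend between the two "paints"
        for row in range(sat_steps):
            w = row / sat_steps            # row 0 = full strength, rows below add white
            chart[row, col] = (1 - w) * hue + w * white
    return chart
```

Printed and shot alongside real skin under identical light, a grid like this covers the vectorscope wedge around the skin-tone indicator far more densely than a standard 24-patch checker.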
  19. kye

    The Aesthetic

    I agree that the early Alexas were emulating a 2K film scan, that with more resolution they move further away from that look, and that the look of cameras is converging. What's interesting to me is that on one hand we have resolutions going up exponentially, taking us further and further away from the resolution / sharpness of film (specifically the resolution / sharpness of the film you would see in the theatre, which was a few generations removed from the original), but on the other hand we have a continued obsession with film emulation in colour science.
Now, don't get me wrong. When it comes to the pros, DPs are tempering the high-resolution sensors with vintage lenses / filters / haze, and in post the same people applying film emulation in the colour reproduction are also emulating the resolution / halation / gate-weave / grain, all of which also reduce the effective resolution of the image. The pros seem to be taking a more holistic view of the overall look. However, in the amateur / low-end space it seems to be only about high-resolution sensors and lenses and emulating the colour reproduction of film. It seems very strange.
Keen to hear your thoughts on whether you think the target aesthetic is changing, or whether people have just lost sight of the whole picture and got swept up in the hype of the camera market.
  20. I must admit that I have always been confused by their model numbering system. Something like the Panasonic lines (G, GX, GH) or Sony (Ax, AxR, AxS) or Canon (xD, xxD, xxxD) etc make it easy for people to navigate. If Olympus had a system in their model numbers, I never saw anything that explained it. That makes a lot of sense. I would have been surprised if I was choosing between an Olympus and the GH5 and the Olympus brand ambassador I was talking with just failed to mention the Oly had PDAF! I do remember that the Olympus was more about getting it right in-camera than the GH5 which had the log profiles and 10-bit, which would allow more flexibility in post. That definitely matters to me as I shoot travel in available light and really appreciate the flexibility in post that I can get. Yeah, it definitely smells like a rebrand from some marketing people. I'd imagine that such a thing would be a pretty standard process - failing brand comes on as a client and you do the normal brainstorming / focus group / market research / graphic design / blah blah blah stuff.
  21. I'd suggest it's possible they'll get into medium format - it's where ARRI have gone, so why not. In terms of lenses, I wonder how many of their RF lenses actually cover MF? Maybe that's something they've been working on in the background; it would certainly help to justify their enormous prices. Or maybe they'll make an adapter for medium format glass from others, eg the ARRI medium format glass. The other thing that strikes me is the 186x56 size, which would obviously require specialist glass, but might be for recording a huge-resolution panorama for VFX backgrounds? Obviously it would be a speciality set of electronics (sensor, processor, etc) but it seems plausible as something the industry would want, similar to having Phantom cameras as a slow-motion speciality. Do you have any links for this? I'm curious to read a bit more.
I view it as a micro version of going to court and being acquitted. It's embarrassing, interrupts what I'm doing, takes up my leisure time, and is basically a whole lot of hassle for zero benefit. I prefer not to have to "educate" anyone as, quite frankly, I'm skeptical of actually changing anything. My experience of security guards is that the ones who are likely to hassle someone who is obviously not shooting anything professional aren't likely to actually listen to what you say. Plus, if I'm in a museum or wherever, they have full control and what they say goes, even if it makes no sense. You can try to escalate to someone more sensible, but you're putting them in a position of choosing between being sensible and supporting their staff, and they normally side with their staff because they have to work with that person. Anyone who has actually been to court and witnessed what can happen knows that the best strategy is to never get put in that situation in the first place. I went once to support a friend of a friend; the judge was in a bad mood and basically gave everyone the maximum penalty, and highlights included "I hate people like you" and "you wouldn't be here if you hadn't done something wrong". Another reason I'd prefer not to stand out - and thus, smaller cameras are better.
  22. It's not about rules, it's about perception. We're at a funny time in history. Anyone can shoot in public with a phone and no-one bothers them. Governments and private residences can set up permanent security cameras that record people in public without their consent. It's legal in most public places (here in Australia anyway) to record video. Most private places such as museums, galleries, amusement parks and events allow photography and videography for private use. The only thing that really isn't allowed is professional shooting without a permit. Unfortunately, the way that people tell the difference between the two is by the size of the camera. If I went to a park with a bunch of other parents and we all stood in a line recording our kids, everyone else with an iPhone and me with an FS5, I'm going to be seen differently. If I go to a museum and film my kids running around with a BMPCC6K and a shotgun microphone, I'm going to get interrogated by security.
Guerrilla film-making is a phrase indicating that you are shooting without permission. It doesn't mean that you NEED permission, it just means you don't have it - just like all the mums with smartphones don't have it either. It means that fitting in and not getting noticed matters. It doesn't mean you're doing anything wrong. If someone calls security and I have to talk with them and convince them that I'm not doing anything wrong and they walk away, that's still a failure to me, because I don't want it to happen in the first place. It doesn't matter that I'm not doing the wrong thing.
To give you an idea of how poorly venues distinguish between pros and amateurs: I went to a temple in Bangkok and there was a sign at the entry - 8mm film cameras are OK, 16mm film cameras are not. That was 2019. I hoped that no-one would think my GH5 was a 16mm film camera.
  23. The patent system is broken. I remember years ago seeing enough examples that were beyond ridiculous that I now just think of it as yet another system that is dysfunctional and if you have enough money you can basically do anything. I just want lots of camera manufacturers to start making smaller cameras so that it gives us solo hand-held guerrilla-shooters more options 🙂
  24. kye

    Panasonic GH6

    This absolutely blows my mind and makes all kinds of sense. I've made a new thread because I think it needs to be highlighted more than in here, where it'll get buried quickly. So, the GH6 could have that sensor from the UMP 12K, retain the MFT mount, and just use the middle 8K portion, as it's MFT-sensor sized? That would be incredible. Could that be mounted on an IBIS mechanism? I don't really know how those things work. In theory then, assuming it kept the GH5 ethos, it could offer 8K downsampled to any resolution you want. The UMP 12K can do 8K120, and 8K160 in 2.4:1 aspect ratio, so with downsampling it could offer 120p at any resolution you wanted without cropping. That would be something to put the GH6 on the map!
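The "middle 8K portion is MFT-sized" idea above roughly checks out on the back of an envelope, assuming the commonly quoted sensor dimensions - which I may have wrong, so treat these numbers as assumptions rather than official specs.

```python
# Assumed figures (not official specs): the UMP 12K sensor is quoted as
# roughly 27.03mm wide at 12288 px; MFT active-area width is 17.3mm.
ursa12k_width_mm = 27.03
ursa12k_width_px = 12288
mft_width_mm = 17.3

# Pixel pitch in microns, and how many of those pixels span an MFT-width crop.
pitch_um = ursa12k_width_mm / ursa12k_width_px * 1000
mft_crop_px = mft_width_mm / ursa12k_width_mm * ursa12k_width_px

print(f"pitch ~{pitch_um:.2f} um, MFT-width crop ~{mft_crop_px:.0f} px")
```

Under those assumptions the MFT-width crop comes out a little under 8K horizontally, which is why the idea is at least plausible on paper.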
  25. I am completely blown away by this, but it's great news. In another thread, @sanveer shared a link to Imatest, who have shown that lens design might be the limiting factor in the DR of a setup, not the sensor. This is great news, especially for those of us who aren't shooting on the most pristine modern lenses and don't want to. It means that we don't need to keep buying cameras with higher and higher dynamic range specifications. I have actually been chasing lenses that have slightly less overall contrast, so that the veiling flare pulls up the shadows, giving the sensor a bit more light to digitise and lifting the signal a bit further above the noise floor. This also makes the image much more like film, which has a nice shadow roll-off and looks more organic. Lots to unpack in this one.
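The veiling-flare point above can be illustrated with a toy calculation: a uniform flare pedestal lifts shadow values further above a fixed noise floor. All the numbers are made up, and this deliberately ignores the contrast the flare costs you and any extra photon noise the flare itself contributes - a real analysis would be more subtle.

```python
def shadow_snr(scene_level, flare_level, noise_floor):
    """Toy metric: recorded shadow level relative to a fixed noise floor."""
    return (scene_level + flare_level) / noise_floor

# Made-up numbers: a deep shadow at 0.5% of full scale, noise floor at 0.2%.
without_flare = shadow_snr(0.005, 0.000, 0.002)
with_flare = shadow_snr(0.005, 0.004, 0.002)
```

The lifted shadow sits further above the floor before digitisation, which is the effect being chased with lower-contrast lenses.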