Everything posted by kye

  1. You're probably right in terms of marketing a better IBIS, because that would be something that people care about and would help to sell cameras. More generally, I still don't think you can rely on marketing. Firstly, there are potentially large numbers of individual changes from model to model of a camera line, not all of which will get attention. Marketing treads a delicate balance between saying the new camera is awesome because it is so much better than the other ones on offer, and implying that the other ones are so much worse than the new one, especially when an improvement can be seen as fixing some kind of deficiency instead of just adding on awesomeness. Someone recently started a thread about the IBIS noise being picked up by the internal microphones in the G/GX ranges (IIRC), and someone else pointed out that the GH5 added extra internal microphones to cancel that noise out of the internal mics, which was a feature some people hadn't heard about. My guess was that this kind of thing probably wouldn't be marketed because it's fixing a flaw in other models that they really don't want to highlight, despite the fix costing money and being of some benefit to customers.
In terms of why they might innovate without a need: it could be that they buy parts in huge volumes to get sustainable prices, and when they ran out of the IBIS motors or controllers or whatever and did the next bulk purchase, they could only buy the next model up because the old one was discontinued, so the decision to upgrade was made for them. Or the new features were available in a newer firmware for that chipset (and they hadn't allowed updating of that chipset in the previous models). Or they had to upgrade a part due to physical size, power consumption, pin layout (to enable different circuit-board layouts) or any number of other reasons.
People tend to underestimate how complicated technology actually is. If we were to take the drive-train of a car (without the computer chips) and draw it out in a kind of functional diagram, you could fit it on a large sheet of paper; if we included all the switches and controls for indicators and heated seats and power windows in a luxury car, it might be the size of a small wall in your house. If we did the same for something like a laptop computer or digital camera (they're going to look roughly the same), the writing would be so small that it probably wouldn't be readable even if you made the diagram the size of the field in a sports stadium.
How they build electronic devices is also a strange process. People think of it like they start designing a product, then they work out all the features, then they buy the bits and put them all together to do that. It's not like that. It's a massive shit fight between the marketing and design departments, who want to offer every new feature imaginable, requiring new parts and redesigns; the engineering department, who want to add no new features but make it robust and reliable, requiring better parts without a redesign; the accounting department, who want to make it the cheapest possible by buying older, cheaper parts in bulk; and senior management, who are concerned with how their product is or isn't compatible with competitors' products - all in an environment where the companies who make products are trying to do secret deals with the parts manufacturers to give them great deals on new parts but block the other companies from buying the best and latest chips.
Add onto that the football-pitch-sized complexity of what they're building and the ripple effect of how changing one part can mean that other parts also need to be changed, potentially then needing further changes in other parts. This includes the interconnectedness of other products which might already be on sale, like lenses and memory cards and flashes etc. And after all that, companies like Nikon are known for implementing support for features well ahead of launch. IIRC they released some kind of lens functionality which was previously unavailable but was supported on camera bodies that had been selling for years - so those bodies had features sitting there waiting for the lenses to be released. You not only have to ensure compatibility with current features but perhaps future features as well. It's a huge mess basically, so marketing isn't really a reliable source for all that stuff.
  2. IBIS and OIS exist in a realm that we have very little insight into, and may never have. From an engineering perspective, stabilisation is a very simple concept: vibrations enter the camera from the outside and the stabilisation system attempts to compensate for them perfectly, such that the optical operation is immune to their effects. This type of feedback circuit is like any other, and will have a percentage of vibration removal across a frequency range, across the 6 axes of potential vibration, up to certain limits of amplitude. Ideally, the manufacturers would publish these curves and let us as consumers understand which system performs best overall, or, where strengths vary between manufacturers, which is best for a given task. Instead we get a single number - "stops". It is so oversimplified a specification as to be both completely useless and, quite frankly, extremely insulting to the consumer.
Marketing is equally unreliable. "If this was the case they would market it" suggests that the job of marketing is to communicate everything about a product, regardless of what is trendy, what a particular market segment uses for purchasing decisions, etc. Obviously this is not the case, so you can't say that if they didn't market something, it doesn't exist. Instead we get YouTubers who do completely uncontrolled tests and then proclaim conclusions that not only violate the scientific method, but often contradict the test footage they publish alongside their judgements. I've seen more than one test where the person declared a winner and I compared the footage and couldn't see any real difference in performance at all. Our own experience of using a camera for a long time is perhaps the most reliable, but even then, our technique changes, our projects change, the weather changes, our blood sugar changes, etc etc etc. Unless we design a machine that we mount a camera to, which puts the camera through a completely controlled and repeatable set of vibrations, and then compare the results from different cameras with identical fields of view, shutter angles, and both centre and distribution of mass, we will never really know how different systems compare.
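To illustrate why a single "stops" figure hides so much, here's a toy sketch: the residual-vibration numbers below are completely invented, purely to show that a stabiliser's performance is a curve over frequency, and that each point on that curve translates to its own "stops" figure.

```python
import math

# Hypothetical residual vibration (fraction of input amplitude that leaks
# through the stabiliser) at a few frequencies -- invented numbers.
residual_by_freq = {
    0.5: 0.05,   # slow body sway: almost fully corrected
    2.0: 0.10,   # typical hand shake
    8.0: 0.30,   # walking vibration: partially corrected
    30.0: 0.80,  # motor buzz: barely corrected
}

for freq_hz, residual in residual_by_freq.items():
    # "Stops" of improvement, treated as log2 of the amplitude reduction
    stops = math.log2(1.0 / residual)
    print(f"{freq_hz:5.1f} Hz: residual {residual:.2f} -> ~{stops:.1f} stops")
```

A single quoted figure collapses that whole curve (and the other five axes, and the amplitude limits) into one number, which is why two systems with the same rating can behave very differently in practice.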
  3. I think making something cinematic occurs on every level within film-making. The purely creative elements such as narrative and story structure, layering, complexity, poetry, and world can captivate the viewer and take them to another place. The artistic elements of film-making such as lighting, composition, casting and dramatic performances, wardrobe, art department, sound design, music, editing and grading etc can create the texture of the world, matching the visual aesthetic with the conceptual elements previously mentioned. Then there are the technical elements such as camera movement, the 180-degree shutter, etc. The typical "how to be cinematic" YouTuber focuses on the technical elements because they're easy to talk about; unfortunately, the above categories are in descending order of importance. It's like how most YouTubers fill their vlogs with cinematic b-roll sequences, when they should fill them with interesting and useful content instead.
  4. Good video - I thought the soundtrack made it quite cinematic.
  5. Nice! There are some interesting projector lenses around, but the lack of a standard mount or adapters means they're a little too out-there for me, at least at the moment. Plus, having no ability to stop down does limit their use in a creative sense. Still, considering that adapting FF vintage lenses seems pretty mainstream now, they're probably where the real bargains are.
  6. Hang out on these forums more often... People here use native lenses, speed-boosters, non-speed-booster adapters, vintage stills lenses, vintage cinema lenses, and even free-lensing! I don't remember anyone using non-camera lenses like projector or enlarger lenses, but someone probably has. Welcome! Personally, my m43 kit has both native and vintage lenses in it. I do own an m42 to m43 0.7x focal reducer which I might use with a ~50mm to fill in the 80-100mm equivalent spot in my kit, but I might end up just using a 35-50mm lens without a focal reducer and therefore not end up using it at all. The crop factor makes m43 a spectacular choice for longer zooms too, for sports or for wildlife. A 135/150/200mm vintage FF lens will set you back less than dinner and a movie, and combined with a $10 dumb-adapter you get a fast, high quality and solidly built manual telephoto lens. Plus, because you're cropping into the lens with the smaller sensor, you don't get all the CA and soft corners of those lenses, so they're even better than when they were new!
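For anyone new to adapting, the equivalence arithmetic is simple multiplication. A minimal sketch, assuming a 2.0x crop factor for m43; the 0.71x reducer value and the 58mm focal length are just example numbers, not a recommendation:

```python
M43_CROP = 2.0  # assumed full-frame crop factor for Micro Four Thirds

def ff_equivalent(focal_mm, reducer=1.0, crop=M43_CROP):
    """Full-frame equivalent focal length for an adapted lens,
    optionally through a focal reducer."""
    return focal_mm * reducer * crop

# A vintage ~58mm FF lens through a ~0.71x reducer on m43:
print(ff_equivalent(58, reducer=0.71))   # ~82mm equivalent
# Cheap vintage telephotos on a dumb adapter (no reducer):
print(ff_equivalent(135))                # 270mm equivalent
print(ff_equivalent(200))                # 400mm equivalent
```

Multiply by the reducer first and then by the crop factor; that's all there is to filling a gap in the kit on paper.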
  7. Maybe that's an incentive not to downscale... assuming the compression is done in hardware, maybe it's easier to just output 8K?
  8. Thanks! I was reading through what it does thinking "this is probably another thing I don't need" until I remembered that one clip where I clipped the audio really badly but really wanted to use the shot...
  9. Philip Bloom released a ProRes file he shot and I played with it, and you could really push that file around. Am I right in thinking that was 10-bit? If so, 12-bit would be absolutely spectacular!
  10. Thanks for sharing that, what a wonderful piece. There are lots of great things to say about it; the thing I liked most is that it really had a strong and coherent style - fun and whimsical, but also tender and genuine. But perhaps the greatest compliment I can pay it is that despite it being shot on a significant camera, and being shared in the thread discussing that camera, when I watched it I stopped thinking about the camera and just enjoyed it as a creative work, and all thoughts of equipment dropped away. This is the point of art, to take us to another place, right?
  11. Just a small thing, but the GH5 isn't dual ISO - only the GH5S has that. I wish my GH5 did have it, but sadly, no.
  12. Yeah, after having the XC10 (with its 24-240mm equivalent fixed lens) and my iPhone (with its fixed lens around 24-28mm equivalent), I switched to the GH5 and bought 16mm and 35mm equivalent primes as my main lenses (as well as some longer ones). I thought about the 24-240 range and realised I was sick of the 24-28 range. I shoot travel and home videos, so mostly my shots are about people in a location, which is 35mm territory when you factor in the practical distance I am from people, and the other main shot is really wide for those grand landscapes or interior shots. With the ETC 1.4x crop the 35mm becomes a 50mm, so that adds a bit of reach, and the 16 becomes a 22mm, but I hardly ever use that length. I'm so much happier with 16mm as my wide end.
  13. Ah, Linear - of course. PM away, happy to help.
  14. That is a really nice shot... even working in the landscape photographer's S-curve from the waterline into it, and of course ML makes the image quality look effortless! Maybe you can process the footage in Resolve and get a LogC output. Do they give you the Colour Space Transform OFX plugin in the free version? If so, you could pull everything into the timeline, set whatever RAW settings on each shot you wanted, then in the Colour page go to the Timeline node editor (instead of the Clip one) and use the transform plugin to convert the shots to LogC before exporting all the shots as separate files. If you output as a 12-bit codec (I don't think there are 14-bit codec options, but maybe there are?) then you shouldn't lose too much colour info in the conversion. I know that LogC can be quite flat as it's designed to hold a lot of stops of DR, so higher bit depths are preferred. The only question in that workflow is what colour space to transform from, because Canon RAW isn't a colour space in that list. You seem to be pretty locked-on to the look you're going for, so maybe just cycle through the list and see what gets you closest? I've graded ML RAW footage in Resolve and couldn't find any definitive answer on what to use, but there were a few options that looked about right. In terms of having 'skillz', if someone can get the look they're after from any NLE with only a couple of changes, those are skills indeed! Edit: just worked out CST means Colour Space Transform lol. Sounds like you're on the path already.
  15. Nice!! Looking forward to hearing (and seeing?) your impressions when it arrives. I kind of like that feeling on eBay when you think you might end up with a bargain, but mostly I just pay too much and don't know what I'm doing!
  16. The biggest delay is from the top to the bottom of the frame (or at least, it is in every camera I've seen - maybe others are different?), so if you're panning left-right or right-left then it won't make much difference. Think about those RS tests where people just move the camera left-right-left-right over and over and the whole thing is wobble-vision. Technically each line is shifted a little horizontally relative to the one above when you move sideways, but the jello effect of the rolling shutter is mostly just vertical. If you had some magical way of eliminating the vertical effect (a plugin that lined up all your vertical lines, perhaps) then maybe you'd notice a compression or expansion when panning, but I suspect it's too small to really be of much importance.
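For a sense of scale, here's a back-of-envelope sketch of the skew from a horizontal pan; the readout time and pan speed are assumed, illustrative numbers, not measurements from any particular camera:

```python
# Rolling-shutter skew for a horizontal pan (toy numbers).
rows = 1080                  # rows in the frame
readout_s = 0.015            # assumed top-to-bottom readout time (15 ms)
pan_px_per_s = 2000          # assumed speed the scene moves across the frame

# The bottom row is exposed readout_s later than the top row, so it sees the
# scene shifted by this many pixels -- that's the lean on vertical lines:
total_skew_px = pan_px_per_s * readout_s
per_row_shift_px = pan_px_per_s * (readout_s / rows)

print(f"top-to-bottom skew: {total_skew_px:.0f} px")
print(f"shift per row:      {per_row_shift_px:.3f} px")
```

At these numbers, verticals lean by about 30 px over the full frame height, while the row-to-row increment is a tiny fraction of a pixel, which is why the wobble reads as a vertical effect rather than a horizontal squeeze.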
  17. Now THAT is how to shoot a camera test!! Bravo!
  18. When did you switch to the simpler workflow? To me, the colours look basically the same as previous shots you've posted, so if you've changed workflows during that time, I didn't notice any difference - which means you're matching relatively well.
  19. This board is a little more hip-and-shoulder than other places, so strong opinions are kind of the norm here, and they do spark disagreements sometimes too. However, this forum has been through a number of discussions about the merits of free speech vs political correctness and (IMHO) the general tone of those conversations was that open and honest opinions were more valuable than keeping nice with everyone - for better or worse! In terms of the style of that video, it is a very strong style of editing, one which the internet took up heavily and which is now a cliche; for many without much talent it's a crutch, and people churn out videos with impressive transitions instead of making useful content. You may also not be aware that many of the people on here are older and came from broadcast or ENG, where the bells and whistles weren't even available. I don't know how young you are, but there is a huge range of tastes here, much more so than among the 20-something "professionals" who think that The Matrix is classic cinema and haven't watched a foreign or B&W film in their lives. Besides, @webrunner5 is just grumpy sometimes.
  20. I meant that it looks really good. That's an expression my dad always used when something exceeded his standards or expectations - like if you hired a car and they swapped your Toyota for a Lexus and asked if that would be ok. I guess it doesn't translate well?
  21. In terms of what you can try to improve the clips you've already shot, here are some thoughts. Try using the stabiliser to smooth out the pan a little. The stabiliser will crop into the image, but if you're only wanting to stabilise a slightly jerky pan then it shouldn't have to crop much at all to really help. Here's a useful video: If you're really seeing compression artefacts then you can attempt to hide them by adding a small amount of blur to smooth them over, and then adding a small amount of sharpening after that to match the look with your other shots. This is a pretty nasty thing to do to your footage, and if you add too much of this treatment it will look like very low quality footage, but a tiny, tiny bit of it might improve things a small amount. I'd suggest using the OFX plugin called something like Soften and Sharpen, which allows you to soften/sharpen small/medium/large textures individually, so you can do it all in one place and just turn it on/off to check whether you're making things better or worse. Nice work on plowing through a paid gig - hopefully the first of many?
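If anyone wants to see the blur-then-sharpen idea outside of Resolve, here's a minimal sketch using OpenCV (a small Gaussian blur followed by an unsharp mask). The radii and amounts are guesses you'd tweak by eye, and it's only meant to show the principle, not how the Soften and Sharpen plugin works internally:

```python
import cv2

frame = cv2.imread("frame.png")  # one exported frame to experiment on

# A small blur to smear out blockiness/banding from heavy compression
softened = cv2.GaussianBlur(frame, (0, 0), sigmaX=1.0)

# Unsharp mask to bring back apparent detail: 1.5*softened - 0.5*blurred
blurred = cv2.GaussianBlur(softened, (0, 0), sigmaX=2.0)
sharpened = cv2.addWeighted(softened, 1.5, blurred, -0.5, 0)

cv2.imwrite("frame_cleaned.png", sharpened)
```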
  22. Looks like retro film, both in the good ways, with the bokeh and rendering, and the bad ones, with the highlight flaring. Not bad for $20! Nice work. What do you think of the image? Looks just fine to me.
  23. The output files are certainly starting to look like film, that's for sure!
  24. I asked the pro colourists if I should ETTR with the 10-bit footage on the GH5 and they said it wasn't necessary and that it would be better to expose as normal. I thought that people used ETTR to get better signal-to-noise in 8-bit, so anything 10-bit or more should be fine. Although, the exposure video from Filmmaker IQ wasn't talking about ETTR as much as adjusting where the DR sits around your middle grey. It would be interesting for someone to do some tests setting skin tone by exposure level at the various ISOs and seeing how the different shots grade in Resolve.
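As a rough illustration of why ETTR mattered more at 8-bit, here's a toy calculation of code values per stop under a naive linear encoding (real log/gamma curves redistribute values, so treat this as worst-case intuition rather than how the GH5 actually records):

```python
# Code values available per stop below clipping, assuming a purely linear
# encoding where each stop down halves the remaining values.
for bits in (8, 10):
    total = 2 ** bits
    per_stop = [total // (2 ** (s + 1)) for s in range(6)]
    print(f"{bits}-bit: {per_stop} values in stops 1..6 below clipping")

# 8-bit:  [128, 64, 32, 16, 8, 4]
# 10-bit: [512, 256, 128, 64, 32, 16]
```

With four times as many values at every stop, 10-bit keeps plenty of precision in the mid-tones without pushing exposure to the right; sensor noise is the other half of the ETTR story, though, which is where real tests at different ISOs would help.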