
Everything posted by kye

  1. This video that @heart0less shared in another thread is also interesting, as it compares a modern lens + BPM with a vintage FD lens. So just get some vintage lenses and forget your BPM filters! One of the things people talk a lot about with the Contax Zeiss lenses is the T* coating, saying that the coating is where the lovely look comes from - essentially performing the same function as a diffusion filter - so this isn't a new concept by any means.
  2. I'm a bit confused about your use of the phrase 'cine'... The things that separate cine lenses from other lenses are that cine lenses have things like:
    - the same weight across the whole set (so that you don't need to re-balance rigs when changing lenses)
    - the same front filter size across the whole set (so that you don't need to adjust filter setups when changing lenses)
    - aperture / focus gears so these can be connected to a follow-focus
    - the same location of the manual control rings across the whole set (so that you don't need to adjust follow-focus controls when changing lenses)
    - de-clicked apertures
    - parfocal design
    - the same coatings across the whole set (so that shots and flares aren't visually very different when changing lenses)
    - the same resolution across the whole set (so that shots aren't visually very different when changing lenses)
    - etc.

    Which of these are you interested in? You don't seem to be interested in all of them.
  3. I know it would work. My question is whether the motion tracking uses the extra frames for motion estimation, or whether they get thrown away before the tracker is applied. The reason I care about this is that I recall the demo from Lok where he did star jumps, and at 25p there was a frame where his arm was sloped down, the next frame it was about level, the next frame it was more than 45 degrees up, and the next it was vertical. In that example, would the motion estimation pick up that his arm is moving very quickly, or would it just think that there are several unrelated objects appearing and disappearing? If it was filmed at 120fps then there would be five times as many frames and the fact that it's one object moving would be obvious to the motion tracking (a rough illustration of this follows below).

    They all look quite similar to me, but the difference between 24p and 120p isn't that much, as both of them appear to have approximately a 45-90 degree shutter. If you compare the size of the blur with how far the object moves in the next frame then you'll see that the blur isn't that long in any of these modes. I know I'm not that sensitive to SS in general, but my impression of these kinds of tests is that if you have a bit of blur then it mostly looks fine. It's when you're filming in direct sunlight and the SS is something like 1/2000 that the motion breaks down and it looks more like a burst of unrelated images, or something moving through a medium that gives it strange sharp textures as it travels. I guess the whole point of this is to do away with NDs, which would mean that shots would range from indoors with a 360-degree shutter to direct sun at 1/4000.
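    Whether any given tracker actually works this way is an open question, but the underlying problem is easy to show with a dense optical-flow estimate: the smaller the per-frame displacement, the easier the match. A minimal sketch using OpenCV's Farneback flow - the frame size, displacements, and flow parameters here are invented purely for illustration:

        # Illustrative only: how per-frame displacement affects dense optical flow.
        # A bright square moves 40 px per frame (25p-like) vs 8 px per frame
        # (120p-like); the matching problem is much easier in the second case.
        import cv2
        import numpy as np

        def frame_with_square(x):
            img = np.zeros((240, 320), dtype=np.uint8)
            cv2.rectangle(img, (x, 100), (x + 40, 140), 255, -1)
            return img

        for label, step in [("25p-like step (40 px)", 40), ("120p-like step (8 px)", 8)]:
            prev, nxt = frame_with_square(60), frame_with_square(60 + step)
            flow = cv2.calcOpticalFlowFarneback(prev, nxt, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            mag = np.linalg.norm(flow, axis=2)
            # Compare the peak estimated motion against the true displacement:
            print(label, "-> peak flow magnitude:", round(float(mag.max()), 1))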
  4. I've taken stills from video and used them as photographs, but it's not really the same. It's pretty difficult to use video to sync a flash to the shutter, or do astrophotography, etc., plus IIRC the Pocket cameras don't do short shutter speeds? i.e. 1/500 or 1/4000 etc. I've also used a GoPro for taking photographs, but I also understand why it doesn't fit with this concept.
  5. Is there any objective method for measuring ISO? I always thought a standard measurement would be a good idea, but reviewers just seem to use their own judgement, which of course isn't comparable between different reviewers. IIRC the broadcast standards quote specs, but very few cameras ever get tested against them.
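    As far as I know, the closest thing to an objective definition for stills is ISO 12232, which (among other methods) defines a saturation-based speed as S_sat = 78 / H_sat, where H_sat is the focal-plane exposure in lux-seconds that just clips the sensor. A toy calculation, assuming a hypothetical sensor that saturates at 0.78 lux-seconds:

        # Saturation-based speed per ISO 12232 (as I understand it): S = 78 / H_sat.
        # The 0.78 lux-second saturation point below is a made-up example value.
        def saturation_speed(h_sat_lux_seconds):
            return 78.0 / h_sat_lux_seconds

        print(saturation_speed(0.78))  # -> 100.0, i.e. ISO 100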
  6. I think the point @Deadcode was making was that the extra frames can help the motion estimation work better, which makes sense, although I'm not sure it would actually happen that way given how they've likely implemented the video processing pipeline.
  7. These would require the blur to essentially simulate a 360 shutter, or in the case of 160fps -> 25fps it would involve simulating a 1152 degree shutter! I guess there's no reason why you couldn't do that - it would simply blur the motion significantly further than the object moves in each source frame. Of course, this depends on the order of operations: if the motion blur is applied after the speed change then the input to the effect might only be the frames that survived the speed change, so the extra frame rate wouldn't help the accuracy of the effect.
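    For anyone checking the arithmetic: shutter angle is just 360 * frame rate * exposure time, so the exposure you want at the delivery frame rate can be re-expressed as an angle at the capture frame rate. A quick sketch (the function name is mine, but the formula is the standard definition):

        # angle = 360 * frame_rate * exposure_time (standard shutter-angle definition)
        def simulated_angle(capture_fps, delivery_fps, delivery_angle=360.0):
            # Exposure time we want to simulate at the delivery frame rate:
            exposure = (delivery_angle / 360.0) / delivery_fps
            # The same exposure expressed as a shutter angle at the capture rate:
            return 360.0 * capture_fps * exposure

        print(simulated_angle(25, 25))        # 360.0  - a full 360 shutter at 25p
        print(simulated_angle(160, 25, 180))  # 1152.0 - a 180-degree look at 25p, the figure above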
  8. I'd be interested to see some stress-testing of it, like Lok did on DigitalRev when he did star jumps and it really emphasised the effect. Ultimately it will come down to the motion detection and estimation, so it knows what to blur. In Resolve it would be pretty easy to bake the effect in while converting - just drag all clips to the timeline, do an export, and set it so that all the clips are individually exported (a scripted version of this is sketched below). This workflow would also work when converting from RAW, and would handily be a place to apply LUTs and any other things that always get added to your grades.
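    If you want to script that rather than dragging clips by hand, Resolve also ships with a Python API. A rough sketch of the idea - note that I haven't verified a render-settings key for the 'individual clips' option, so set that on the Deliver page (or check the scripting README that ships with Resolve) before running anything like this:

        # Sketch: put every clip in the root bin on a new timeline and render.
        # The module and calls below are from Resolve's scripting API; the
        # output path is a placeholder.
        import DaVinciResolveScript as dvr

        resolve = dvr.scriptapp("Resolve")
        project = resolve.GetProjectManager().GetCurrentProject()
        media_pool = project.GetMediaPool()

        clips = media_pool.GetRootFolder().GetClipList()   # everything in the root bin
        media_pool.CreateEmptyTimeline("batch_export")     # becomes the current timeline
        media_pool.AppendToTimeline(clips)                 # 'drag all clips to the timeline'

        project.SetRenderSettings({"TargetDir": "/path/to/output"})
        project.AddRenderJob()
        project.StartRendering()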
  9. Since I found out about the Motion Blur feature in Resolve I think I'll probably ditch NDs as well. I had it on my list to do some A/B tests and see how well it worked, so I'm looking forward to seeing what Andrew finds and reading the blog post.
  10. Can't remember who, but I do recall someone on here saying that they liked to shoot with shutter angles longer than 180 degrees because it kind of compensated for the limitations of DSLR / MILC cameras in other ways, sort of balancing them out to get something cinematic. Conversely, films like Saving Private Ryan have used shutter angles shorter than 180 to give the opposite effect: https://cinemashock.org/2012/07/30/45-degree-shutter-in-saving-private-ryan/ 180 is a 'rule' in the sense that the rule of thirds is a rule - they're great to learn when you're new because they help you to avoid doing stupid things, but once you've passed a certain maturity in craft you can use them or break them depending on the situation...
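    The arithmetic behind those numbers, for reference: exposure time = (angle / 360) / frame rate, so at 24fps a 180-degree shutter is 1/48s and the 45-degree shutter from Saving Private Ryan works out to 1/192s:

        # exposure = (angle / 360) / fps - the standard angle-to-time conversion
        def shutter_time(angle_degrees, fps):
            return (angle_degrees / 360.0) / fps

        print(1 / shutter_time(180, 24))  # 48.0  -> 1/48 s
        print(1 / shutter_time(45, 24))   # 192.0 -> 1/192 s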
  11. The other thing I thought was strange in the video is when he says that people put stockings over the front of the lens, but I thought they put the mesh filters between the lens and the sensor, so at the rear end of the lens. I wonder if that has an impact on the effect. He also put a stocking on and then complained that the effect was too strong, but there are many meshes available that wouldn't have so much coverage and therefore wouldn't be as strong. There's lots of stuff online about different meshes and their effects.
  12. That is a very interesting look, and definitely usable. It would be quite a cheap exercise to make a 'set', with some having fewer passes of the paint, and perhaps trying a variation with a clear varnish instead of black paint, or perhaps other types of spray products.
  13. I suspect this would be the case. Probably the larger issue is that in many countries you now need to be certified and lodge flight plans etc before flying drones above some small weight limit (is it 250g?), so all drones except things like the Spark have a very steep learning curve and initial investment.
  14. I highly recommend for anyone watching this for the first time that you actually write down your impressions before the cameras are revealed, and do it properly, because once they're revealed it's very difficult to remember what you really thought, rather than what you think you thought after all your biases are plastered all over your blind impressions.

    Somewhat contrary to prevailing currents, I thought the cameras fell into three tiers - best / middle / worst - and I put the GH2 in the worst tier. I didn't like the colours or the noticeable compression issues. Interestingly, my three tiers were basically in price order, which surprised me as I had my doubts that I would be able to pick that out.

    From memory, the comments from the audience members who preferred the lower-budget cameras were around the colours and contrast, which is a creative choice. One reason I prefer cameras with higher DR is that you can take a high-DR scene and create either a high-contrast or a low-contrast grade from it, whereas a low-DR camera clips that scene and no grade will get a genuine low-contrast image back out, so it's a matter of flexibility for me (a toy demonstration follows below). Plus I shoot primarily outdoors in available light, so flexibility is more valuable to me than getting the look I want straight out of camera, and I do also prefer a less contrasty, more natural look, which also matters.
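    A toy model of the flexibility argument (not a grading recipe - all the numbers are arbitrary): once a low-DR capture clips the scene's highlights, no low-contrast grade can bring the lost separation back, while the high-DR capture can be graded either way.

        import numpy as np

        scene = np.linspace(0, 4.0, 9)       # scene luminance in arbitrary units
        high_dr = scene.copy()               # a camera that holds the whole range
        low_dr = np.clip(scene, 0, 2.0)      # a camera that clips two 'stops' earlier

        flatten = lambda x: 0.5 * x          # a crude low-contrast grade
        print(flatten(high_dr)[-3:])         # highlight separation preserved
        print(flatten(low_dr)[-3:])          # identical values: the detail is gone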
  15. I agree with @heart0less that sales is sales, but also with @hk908 that content is content, and Waqas certainly puts the content in his videos. Every time I see a YouTuber flogging some brand or other I do recoil slightly, but then I remember that TV was full of ads for things I had no interest in whatsoever, which ran for whole minutes at a time, every few minutes of content. I'd rather see a content creator promoting their own materials, or having one ad in their video, than have to watch clusters of ads where everything was turned up to 11 to try and get me to hear it from the next room when I went to get a drink!
  16. Just watched this grading video and found it really useful. There are a few things in here that I hadn't seen before, and I think the look (nice, bright and colourful) has more practical application than emulating some particular movie or other.
  17. USB 3.2 ?

    My first computer had 8kB of RAM. Luckily I had the VHS-sized expansion module that increased that by 16kB! It took a long time to type in 24kB of BASIC code, and even a considerable amount of time to load it from tape once you'd typed it in for the first time... and fixed the typos.
  18. Not true - one thing that cannot be done in real-time is predicting the future. External software can (and does) analyse the entire length of the clip and take that into account. You'd probably get most of the same benefits in-camera with a 2-3 second buffer on the stabilisation mechanism, but it's not the same. I also find that external software gives you the option to try different ways to stabilise and to optimise parameters, which is something you only get when stabilisation isn't baked into the footage. Of course, footage with stabilisation baked in makes better use of the codec and bitrate, as it's not having to compress an image that's jumping around all over the place, so that's a win for in-camera processing.

    I've also considered there to be three time-scales to stabilisation (a toy illustration follows below):
    - movement that occurs during an exposure - this is blur or RS that can only be prevented with OIS / IBIS or a huge camera rig; EIS might be able to help a bit with RS but can't help with blur during normal shutter times (ie, 1/50 not 1/4000)
    - movement that occurs due to hand shakiness etc - this is what light camera rigs suffer from and heavier ones don't; EIS can help a little with this
    - movement that occurs slowly - this is where the 'floaty' look can come from with gimbals, and it forms the aesthetic part of the equation

    My experience is that stabilisation of the first two is desirable and management of the third should be done to taste. Stabilisation is a topic that is almost always oversimplified - please resist the temptation...
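    Not any real stabiliser's code, but the three time-scales are easy to picture by splitting a synthetic 1-D camera path into frequency bands with moving averages (all the signal and window values below are invented):

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(0, 10, 1 / 50)  # 10 s of camera position samples at 50 Hz
        path = np.sin(0.4 * t) + 0.2 * np.sin(7 * t) + 0.05 * rng.standard_normal(t.size)

        def smooth(x, seconds, fs=50):
            k = max(1, int(seconds * fs))
            return np.convolve(x, np.ones(k) / k, mode="same")

        slow = smooth(path, 2.0)            # the slow, intentional 'floaty' component
        shake = smooth(path, 0.1) - slow    # hand shake that EIS / software can remove
        jitter = path - smooth(path, 0.1)   # intra-exposure scale: EIS can't fix this blur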
  19. USB 3.2 ?

    640kB of RAM should be enough for anyone.
  20. Lenses

    On the UMP the 20mm should be equivalent to a 28mm on FF? If so then that's a more useful wide. I prefer to have 15mm and 35mm equivalents as I love the 'wow' look of the 15mm, but it's really what you're shooting that dictates what you need. Plus I'm a bit over the 28mm FOV after shooting and watching a lot of stuff from smartphones over the years, but that's probably just me. How long are you expecting the shipping to take? I haven't bought anything since covid so I have no idea what shipping is like. Which ones didn't you like, and why? I'm not aware of other lenses that are 37mm other than the Mir, but there are multiple brands of 58mm and 85mm available, all with differences in coatings and sometimes optical recipe, so it matters which ones you have.
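    For what it's worth, the equivalence arithmetic, assuming a roughly 1.4-1.5x Super 35 crop for the UMP (the exact factor depends on the sensor window and mode, so these numbers are approximate):

        # FF-equivalent focal length = focal * crop factor; crop values assumed
        def ff_equivalent(focal_mm, crop_factor):
            return focal_mm * crop_factor

        for crop in (1.4, 1.5):
            print(f"20mm at {crop}x crop ~ {ff_equivalent(20, crop):.0f}mm FF")
        # -> roughly 28-30mm, so yes, close to a 28mm full-frame field of view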
  21. tamron lens 35mm

    @noone where would you say Tamron sits compared to other lens makers? I know that most lens companies started off optically worse than today's offerings, although there are notable exceptions, but were they more flawed? More vintage? More modern? More neutral? Things like Takumars have a real look to them, known for being sharp but not too sharp, and contrasty but not too contrasty, etc. Russian lenses are known for being soft wide open and then very sharp when stopped down, etc. I'm wondering how the Tamrons fit in.
  22. Lenses

    I thought the Mir was an interesting lens. Being apochromatic, it has some of the optical qualities of vintage lenses but not all, so it's an interesting mix. The 25, 37, and 58 are a good lineup, but with S35 you might consider a wider lens? Or are you using a SB?
  23. Lenses

    The Sigma is a stunning lens for the price, but it has that modern look where there's no character. It depends on what you're shooting and your tastes. I have two Helios 58/2 lenses and couldn't even think about using one as my only option, considering how distracting the background blur is, especially on video, where if you move the camera, the way the background blur moves makes it look like you're filming in a parallel universe where the rules of physics are broken. One trend I found when I binged the lenses forum at reduser.com was that people advocate using a modern set for projects that want a more neutral look and a vintage set for projects that can benefit from a look with more character. Often that meant the modern set was something like Zeiss CP.2s and the vintage set was the Russians, FD or Takumar glass, with the price difference meaning they would rent the Zeiss set and own the vintage set, so they could use it on personal projects and longer-form content like documentaries shot over months or years. Depending on your workflow it's pretty easy to make modern lenses look a lot more vintage in post (one common trick is sketched below), but some don't like doing much in post.
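    One common move for that, sketched with OpenCV: lift a highlight mask, blur it wide, and screen it back over the frame for a soft halation-style glow. The filename, threshold, and blur radius below are arbitrary starting points, not a recipe:

        import cv2
        import numpy as np

        img = cv2.imread("frame.png").astype(np.float32) / 255.0  # hypothetical frame
        luma = img.mean(axis=2)
        mask = np.clip((luma - 0.8) / 0.2, 0, 1)[..., None]       # highlights above ~80%
        glow = cv2.GaussianBlur(img * mask, (0, 0), sigmaX=25)    # soft, wide bloom
        out = 1 - (1 - img) * (1 - glow)                          # 'screen' blend
        cv2.imwrite("frame_halation.png", (out * 255).astype(np.uint8))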