
kye

Members
  • Posts

    8,072
  • Joined

  • Last visited

Everything posted by kye

  1. Adding weight is one thing (that people already mentioned) but what they didn't mention is adding weight at a distance. A steadicam works because the weights are at a distance from the camera. Here's the physics of the situation: https://en.wikipedia.org/wiki/Moment_of_inertia Something you can try immediately is to use a tripod. Mount your camera on a tripod and try the following tests: Use the tripod with legs compacted and folded together and hold it underneath the camera allowing it to work like a steadicam Same as above but extend the legs most of the way to the floor Same as above but fold the legs out You will find that the first test gives some smoothing in tilt and roll but little stabilisation in pan, the second will be more of the same, and the third will add in some stabilisation to the panning as well because the legs are not all on the same axis. With physics you need a combination of mass and distance, you can't get around it. If you don't want either then do what I did and invest in IBIS.
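The distance effect is easy to show numerically: a point mass contributes m·r² to the moment of inertia, so the same mass twice as far from the axis resists rotation four times as much. A minimal sketch (the masses, distances and torque are illustrative numbers, not real rig specs):

```python
# Moment of inertia of point masses: I = sum(m * r^2).
# For the same disturbance torque, angular acceleration alpha = torque / I,
# so more inertia at a distance means slower, smoother unwanted rotation.

def moment_of_inertia(masses):
    """masses: list of (mass_kg, distance_m) pairs measured from the rotation axis."""
    return sum(m * r ** 2 for m, r in masses)

camera_only = moment_of_inertia([(1.0, 0.05)])                     # compact body
with_close_weight = moment_of_inertia([(1.0, 0.05), (0.5, 0.05)])  # weight near the camera
with_far_weight = moment_of_inertia([(1.0, 0.05), (0.5, 0.40)])    # same weight on an arm

torque = 0.01  # N*m, a small hand wobble
for name, inertia in [("camera only", camera_only),
                      ("0.5 kg close", with_close_weight),
                      ("0.5 kg at 40 cm", with_far_weight)]:
    print(f"{name}: I = {inertia:.4f} kg*m^2, alpha = {torque / inertia:.2f} rad/s^2")
```

Note that adding the weight close to the body barely changes anything, while the same weight at 40 cm increases the inertia more than twentyfold — which is exactly why the extended tripod legs in the third test help.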
  2. It sounds to me like you guys are reacting to the title a lot more than the content? But even if you aren't, I actually happen to think he presented a reasonable argument that MFT will die. I think the sticking point might actually be timescales. We'd probably all agree that MFT will die in the next 100 years, and not in the next 6 months, so it's a question of degrees. The Northrups talk about long-term trends in a world that has the attention span of a squirrel, and I think that mismatch of contexts is often a factor in people disagreeing about this stuff. In terms of people "making mistakes", or in this example I'd say it's closer to "missing the mark", everyone does that. But then again, if you gauged things by internet comments then basically everyone should go kill themselves immediately and companies should give up on everything and hand out fantasies for free. Plus everyone seems to fall in love with what they own, which is a strange phenomenon. And if you want to get a sense of how level-headed the Northrups can be, check out their reply to Jared Polin around RAW vs JPG. Tony was so level-headed the video was almost boring to watch - not only did he not capitalise on the drama, but his reply was designed to put the whole thing to bed instead of creating more. I think most of their videos are akin to "The Pocket4K is quite decent really" and that's why they have such a following. They gave the GoPro 360 camera a pretty bad review (for many good reasons) but still presented its good points, and even more recently they still listed its pros and cons when comparing it to the more recent insta360 camera. They're definitely not perfect, but I think on average they are ahead of 99.99%, and criticising them based on the title of a video rather than the myriad points within the video isn't that helpful.
  3. That may be true now, but the way I understand it is that it's a shrinking market and that kind of split may not be the recipe for sustainability. Who knows of course, those with crystal balls (or steel balls) will be putting all their savings on the winning horse. Every successful YouTuber is clickbait royalty - that's just the reality of getting new subs and keeping ahead of the others - look at the UK tabloids (and sensationalist media the world over) and tell me that it isn't a successful strategy. What I like about Tony is that he isn't afraid to say things, isn't afraid to predict things, but in addition to those traits (that are shared with nearly all YouTubers) he's also level-headed, admits when he's wrong, and explains his thinking so that you can understand why he came to the conclusions he did. Even if he gets things wrong he definitely adds to the conversation rather than just taking up your time and contributing nothing.
  4. I'd like to be a better DOP, audio technician, editor, and colourist... Unfortunately, skills aren't on sale!
  5. I recall some issues that Dave Dugdale had with the a6xxx cameras that looked like IBIS and lens IS fighting each other. I don't recall if it ever got resolved, but I think the only way to work out what is going on is a logical evaluation through testing. Shoot a test with everything at plain default settings and verify that you can get a smooth pan like that. After you've established a baseline, start adding in the settings you use one by one and see which one 'creates' the problem. Then leave that setting on and start reverting the previously adjusted settings back to your baseline to see if it comes good again - if it does, then the problem is a combination of multiple settings.
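The add-one-setting-at-a-time procedure above can be sketched in code. Everything here is hypothetical - `shoot_test` stands in for "shoot a pan and judge whether it's smooth", and the setting names are made up for illustration:

```python
# Sketch of the isolation procedure: starting from a known-good baseline,
# re-apply your usual settings one at a time until the problem appears.

def find_culprit(baseline, changed_settings, shoot_test):
    """Return the first setting whose addition makes the test shot fail, or None."""
    current = dict(baseline)
    for name, value in changed_settings.items():
        current[name] = value
        if not shoot_test(current):
            return name
    return None  # no single setting reproduces it -> likely a combination

# Toy example: pretend digital zoom is what breaks smooth pans.
baseline = {"ibis": "on", "lens_is": "off", "digital_zoom": "off"}
changes = {"lens_is": "on", "digital_zoom": "on"}
culprit = find_culprit(baseline, changes, lambda s: s["digital_zoom"] == "off")
print(culprit)  # -> "digital_zoom"
```

If `find_culprit` returns None even though your full setup misbehaves, that's the "combination of multiple settings" case, and you move on to the second phase of reverting settings pairwise.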
  6. I'm yet to make the move, but I think I'll have to eventually. Slightly OT, but don't they make new OSes free for a certain period of time, and after that you have to pay for the upgrade? I think I had to pay for one when the version I was running didn't support the software I wanted to run, and the only free version was the one that had buggy wifi, so I paid to get the second-newest version.
  7. Absolutely! Think of all the awful products out there, and take a minute to ponder why no reviewer has ever looked at any of them. It's simply incredible that the people who make awful products never contact reviewers, or if they do then the reviewers aren't interested, or if they are then the product gets lost in the post. That postal system must be crazy good at assessing how good products are - it's the final gatekeeper in the entire system!!
  8. I'd have a look at what the YouTube community uses (especially people who do talking to camera) because they shoot a ton, are hard on their equipment and value reliability and durability very highly. Kai Wong and Lok for example.
  9. I've used it with a key to beautify skin a few times, but I'm not sure that it's the best way to go as I haven't compared it to other methods. Also worth checking out is the Sharpen and Soften OFX plugin, which gives you independent sliders for small, medium and large textures so you have more control, and it lets you sharpen some and soften others, so it's powerful for skin tones. If you want to use a dedicated sharpen or blur tool you can use the Edge Detection OFX plugin (I think that's what it's called) in the mode to display the edges, and then connect the main output from that to the key input of the node with your effect in it. This gives you the ability to apply any filter to edges (and presumably also the inverse) in a very controllable way. There are so many ways to get the job done in Resolve!
  10. @Sage I have shot a bunch of footage at night with HLG at various exposure levels (ETTR as well as ETTL to keep highlights) and I'm wondering how to pre-process the footage in post to get the right levels for the LUTs. At what step in the Resolve node graph would you suggest adjusting the exposure back so skin tones are in the recommended 40-60 IRE, and which controls would you suggest (perhaps either the Gamma wheel or the Offset wheel?). Thanks!
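For what it's worth, the reason Offset is usually the suggestion for exposure moves on log-style footage can be shown with a pure log curve. This is only an approximation here - HLG is log-like in its upper range, not a pure log curve - and the toy encoding below is an assumption for illustration, not the actual HLG transfer function:

```python
import math

def log_encode(x):
    """Toy pure-log encoding; real HLG only behaves roughly like this up top."""
    return math.log2(x)

x = 0.18  # mid grey in linear light
one_stop_up = log_encode(2 * x)   # open up one stop in linear light
offset_up = log_encode(x) + 1.0   # add a constant offset in log space
print(one_stop_up, offset_up)     # identical: a uniform offset == an exposure change
```

On a pure log signal the two are exactly equal, which is why a single Offset move shifts every tone by the same number of stops, whereas a Gamma move bends the mids relative to the shadows and highlights.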
  11. My timecode keeps resetting - does anyone know what settings to use so that it only counts up while recording but doesn't reset? It reset at 00:13:11:01 so it didn't "wrap-around".
  12. What is your impression of how transferable things like this are? I read the ML forums for a while and saw some instances where figuring something out on one camera helped to solve it on other models, but I'm not sure if these were isolated instances or if this was more the norm. If so, perhaps these findings might apply to the 5D3 and maybe the 5D4?
  13. Yes, I suppose that the 1DX isn't so bad if you think of it as having a battery grip included. Personally though I don't use a battery grip and I would expect to be able to get away with not having one, so it depends on your perspective.
  14. Sorry to hear you got two duds... I'd suggest buying a lottery ticket - luck that bad can only mean it's been redirected somewhere else! I was particularly interested in the dial system that Fuji has - being able to control every parameter in either auto or a given manual setting is great, and gives more flexibility than some PASM mode implementations do, with a much better interface. Are you back to using the 1DXii for video as well? It looks great but I hear it weighs as much as a bus!
  15. I haven't seen it discussed much for video (maybe it was and I missed it) but lots of people have looked at it for photos. It was the widest non-fisheye lens that I could find for a reasonable price. It looks hilarious on the GH5 because it's so small and thin! My strategy has been to get a 17.5mm lens for generic people / environment shots, and the 8mm for those wider "wow" scenery shots where you want the grand look of a wide angle, and it's been great for that. Today was my third day in India and it hasn't come off the camera yet as everything here seems to lend itself to that "wow" aesthetic - look at that building - look at how many people there are - look at this amazing chaos... Basic grade and crop... I'm still making friends with the GH5 so these might not be sharp, but they certainly suit the geometric nature of the architecture.
  16. I'd be sceptical - the bridge in the sample shot goes exactly through the middle of the frame, so it won't show many forms of distortion. 7artisans does have a range of lenses and a track record though, so who knows. I've been using the SLR Magic 8mm F4 on my GH5 and I'm really enjoying the image, but it's designed as a drone lens so the ergonomics aren't that great!
  17. So, if I understand correctly, the colour information is worse than in a theoretical 8-bit RAW codec then?
  18. Does Resolve have support for the NX1 colour and gamma profiles? If so, you could use the Colour Space Transform OFX plugin to convert both types of files to a common colour space. Although it's not perfect, I've had good results and it does most of the work. I just wish it had support for Protune so I could also use it to match GoPro footage.
  19. Good post. I think we're mostly agreeing, but there are aspects of what you said that I think are technically correct but maybe not in practice. @TheRenaissanceMan touched on two of the biggest issues - the limitations of 8-bit video and the ease of use. It is technically true that you can use software (like Resolve or Photoshop) to convert an image from basically any set of colours into any other set of colours, but with 8-bit files you may find the information needed to do a seamless job of it simply isn't there. Worse still, the closer a match you want, the more manipulations you must do, and the more complicated the processing becomes. In teaching myself to colour correct and grade I downloaded well-shot high-quality footage and found that good results were very easy and simple to achieve. But try inter-cutting and colour matching multiple low-quality consumer cameras and you'll realise that in practice it's either incredibly difficult or just not possible. Absolutely!
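The 8-bit limitation is easy to demonstrate: push values down, re-quantise to integers (as an 8-bit file must), then push them back up, and levels are permanently lost to rounding - which is what shows up as banding when a heavy correction stretches the result. A quick sketch:

```python
# Quantisation loss in 8-bit: darken by 2 stops, store as 8-bit integers,
# then brighten back up. The round trip cannot recover the original levels.
original = list(range(256))                         # all 256 possible 8-bit codes
darkened = [round(v / 4) for v in original]         # -2 stops, re-quantised to ints
restored = [min(255, v * 4) for v in darkened]      # +2 stops back
print(len(set(original)), "->", len(set(restored))) # 256 -> 65 distinct levels
```

Only 65 of the 256 levels survive, so any further grading on the restored image works with gaps between adjacent codes - exactly the missing information that makes a seamless match between mismatched 8-bit sources so hard.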
  20. Manufacturers design the bayer filter for their cameras, adjusting the mix and strength of various tints in order to choose what parts of the visible and invisible spectrum hit the photosites in the sensor. This is part of colour science too. Did you even watch the video?
  21. Use a tripod and fix it in post!!
  22. You should all hear how the folks over at liftgammagain.com turn themselves inside out over browser colour renderings... almost as much as the client watching the latest render on grandma's 15-year-old TV and then breathing fire down the phone about how the colours have all gone to hell... I'm sure many people on here are familiar with the issues of client management, but imagine if colour was your only job!!
  23. Dogs: the guard animal for people that don't care about protecting their property enough to keep geese.
  24. Good question. Here's some info: (Source) However, it seems some codecs are supported. V14 release notes: Unfortunately it doesn't seem like there's a complete list anywhere, the "what's new" feature lists are only incremental, and BM don't use consistent language in those posts so searching for phrases doesn't work either (sometimes they say "hardware accelerated", sometimes "GPU accelerated", etc). My guess is that you should just set up a dummy project with a high-resolution output and then export it in every codec you can think of (just queue them up, hit go and walk away), then pull all of them into a timeline and compare the FPS and CPU/GPU load on each type. It's a bit of admin you shouldn't have to do, but it will definitely answer your question.
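The comparison at the end of that test can be kept organised with something as simple as the sketch below. The codec names and numbers here are placeholders for whatever you actually measure on your machine, not real benchmark results:

```python
# Hypothetical measurements from playing each exported clip back in a timeline.
# Any codec that can't hold the project frame rate isn't being handled well
# (by hardware or otherwise) on this machine.
project_fps = 25.0
measured = {
    "H.264":      {"playback_fps": 25.0, "cpu_pct": 35},
    "H.265":      {"playback_fps": 18.0, "cpu_pct": 95},
    "ProRes 422": {"playback_fps": 25.0, "cpu_pct": 20},
}

smooth = sorted(name for name, m in measured.items()
                if m["playback_fps"] >= project_fps)
print("plays in real time:", smooth)  # -> ['H.264', 'ProRes 422']
```

Pairing the real-time list with the CPU/GPU load column then tells you which of the smooth codecs are actually being accelerated rather than brute-forced by the CPU.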