KnightsFan

Reputation Activity

  1. Thanks
    KnightsFan got a reaction from Thatguy in Stabilizer or gimbal?   
    I think a motorized gimbal will give the best results with the least attention paid to the rig, which seems to be the critical element here. Benefits of a motorized gimbal: it's significantly easier to keep level, which will be HUGE if you are walking on rough terrain or climbing over things, and it's significantly easier and faster to balance, which is crucial if you are in a hurry.
    The glidecam is not a bad option at all, and would be my pick if you can be sure of your footing. A few benefits of the glidecam: no batteries, no firmware, and it's very weather resistant. It's significantly easier to pan and tilt naturally with a glidecam: you just touch it and it moves. Much more intuitive and tactile than twiddling with joysticks or touchscreen apps. You can also carry it easily when not shooting. A motorized gimbal will flop around when it's off, so you need a way to secure it when not in use.
    If neither of those will work, I would go for a fig rig. It's essentially just two handles shoulder-width apart, and a top handle. I know people who have made their own out of old bike wheels. Using this will keep your horizon more level than just holding the camera. You can operate it one handed if you need to clamber up stuff. Again no batteries or firmware to worry about. A fig rig won't do MUCH stabilization, but it's better than nothing.
    My last pick would be the shoulder rig. I can't imagine doing a strenuous hike with one of those on me. Maybe other people have better technique, but for me to get decent results I have to keep my back and neck pretty straight, which is impossible hiking up or down any sort of incline or when scrambling up and over things. And if you need to swing your arm out suddenly to maintain balance... goodbye camera.
    Whatever you go with, I definitely recommend having a quick and secure way to put the rig away, and leave both hands free, just in case.
  2. Like
    KnightsFan reacted to hoodlum in Nikon Z6 features 4K N-LOG, 10bit HDMI output and 120fps 1080p   
    Nikon announced that ProRes Raw is coming in a future firmware update.  Nikon also announced a Nikon Z 6 Filmmaker’s Kit for $4k.
     
    https://***URL removed***/news/7011682215/nikon-to-add-eye-af-raw-video-support-and-cfexpress-support-to-z-series
     
  3. Like
    KnightsFan got a reaction from Rick_B in Best overall 5-6" on-camera monitor (no recording)   
    @Rick_B I appreciate you taking the time to ensure that I see that. Always happy to give my opinion on gear!
  4. Like
    KnightsFan reacted to kye in Canon EOS R First Impressions   
    Most camera tests are influenced more by how accurately the lenses focus, or by how similar the shots are, than by the cameras themselves (true A/B comparisons require you to point the cameras at the exact same thing under the exact same lighting - something most reviewers don't understand!).
    You want to do normal tests and also stress-tests.  Trees blowing in the wind is a good stress-test.  I designed a torture test by making a 4K video file with 3 frames of completely different photographs, exporting it as ProRes and playing it on repeat, then setting the camera to a very short shutter speed and framing up the monitor.  Codecs normally do well when frames are similar to the previous ones, but do badly when frames are very different.  This test absolutely destroys some cameras, while others hold up very nicely.  It's a total exaggeration, but it gives useful information if you don't have a tree handy.  It's also 100% repeatable between cameras, so it is a valid test.
    My advice is to test things properly, make your decisions, then move on to making real content.
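    For what it's worth, a minimal sketch of building such a torture clip with ffmpeg, driven from Python here (the still names, 25 fps rate, and UHD scale are placeholder assumptions, not a prescription):
    ```python
    import subprocess

    # Cycle three completely different stills back-to-back so no frame
    # resembles the previous one -- the worst case for inter-frame codecs.
    # The loop filter repeats the 3-frame cycle for ~30 s of footage.
    subprocess.run([
        "ffmpeg", "-y",
        "-framerate", "25",
        "-i", "frame%d.png",                      # frame1.png .. frame3.png
        "-vf", "loop=loop=249:size=3:start=0,scale=3840:2160",
        "-c:v", "prores_ks", "-profile:v", "3",   # ProRes 422 HQ
        "torture.mov",
    ], check=True)
    ```
    Play the result full-screen on repeat, set the camera to a fast shutter, and frame up the monitor.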
  5. Like
    KnightsFan got a reaction from Kisaha in NX1 panning super jittery   
    @DaveBerg
    In your footage I see virtually no motion blur, so I think that particular clip was shot with a fast shutter.
    To be honest, I don't see the jitter you are talking about. It's possible that you have a more critical eye than me, or even that I am so used to an NX1 that I am blind to it. If you get a chance, can you do a side by side with another camera that doesn't have the jitter? Preferably in 4K so it's super obvious. If you don't have the camera with you then no hurry, we can always keep investigating this later.
    Another possibility is that the player you are using has issues with its HEVC decoder, which would explain why you see it and I don't. Here on EOSHD we've already discovered problems using the built-in Windows 10 HEVC decoder inside DaVinci Resolve Free. (This seems unlikely to be the cause, but worth mentioning as we haven't ruled it out yet.) If that's the case, then transcoding your jittery files to, say, ProRes on my end should make the jitter disappear.
    If the jitter really is in the files, then it's either a problem with your settings or a genuine glitch/broken part in your camera. (Hopefully the former!) However, if you are viewing transcoded footage, then it could still be a decoder issue. What firmware do you have? Any hacks currently or previously installed? Does it occur with all lenses? To rule out lens issues, have you tested it with an adapted manual lens?
    If you tell me a specific set of settings (exposure, picture profile, white balance), I'm happy to shoot some tests on my end and send the files for you to look at.
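    In the meantime, here's the transcode test I have in mind, sketched with ffmpeg from Python ("clip.mp4" is a placeholder for one of your jittery files). If the jitter vanishes in the ProRes copy, the decoder is the culprit:
    ```python
    import subprocess

    # Re-encode a suspect clip to ProRes so playback no longer goes
    # through an HEVC decoder at all.
    subprocess.run([
        "ffmpeg", "-i", "clip.mp4",
        "-c:v", "prores_ks", "-profile:v", "2",   # ProRes 422
        "-c:a", "pcm_s16le",                      # uncompressed audio
        "clip_prores.mov",
    ], check=True)
    ```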
  6. Like
    KnightsFan reacted to Rick_B in Best overall 5-6" on-camera monitor (no recording)   
    @Kisaha It was actually your passionate advocacy for the Focus in a thread from a few months back that encouraged me to take another look at the reviews, and at the real benefits I'd be getting for spending the extra cash. When I factored in that the mount is good enough that people were looking to get it for non-SmallHD monitors, the general enthusiasm for the touch interface (which most inexpensive options don't have, or if they do, it has issues), the questions about support/documentation for the inexpensive options, and the fact that I am an "Apple guy" (we have 4 Macs, 3 iPads, and 2 iPhones in a household of 2 people) who likes setups that will work well out-of-the-box, it seemed like the best choice.
    Also, just realized that I didn't do the @ thing right in my earlier thanks, so thanks again to @KnightsFan for all the info on the A5!
     
  7. Like
    KnightsFan got a reaction from Rick_B in Best overall 5-6" on-camera monitor (no recording)   
    I have the Lilliput A5. For the price, it is excellent. I can use it in bright sunlight, it has peaking and false color, and it was cheap even brand new. It has mounting points on three sides, and runs off 9V power from the same brick that powers my camera. The color is NOT accurate. If you want to judge color on your external monitor, you will need to spend more money. But if you mainly want a monitor for flexibility and focusing, I highly recommend the A5. The main problem with it is that it only has one customizable button.
    I previously owned a 7" Feelworld and a 3.5" Feelworld EVF. Both were inferior to the Lilliput. The 7" was just too big and heavy; it really dwarfed my rig. The EVF was just bad in every way. My impression is that the new Feelworld monitors are better than the ones I had, probably on par with the A5 for the similarly priced ones.
    I have used some SmallHD monitors briefly. They are certainly better build quality. I can't say they would be worth 5x the cost to me, since I have never broken any of my cheap gear and I have no need of great color accuracy on set. Your needs may differ, of course.
  8. Like
    KnightsFan got a reaction from User in Resolve - Edit Clips on Timeline for Transcode Question?   
    It might not be very easy to do this if you've cut the files before generating proxies. When I do this for my proxy workflow in Resolve, I'm relying on the fact that each proxy file is exactly the same filename and length as its corresponding online media. How would Premiere know that Proxy A corresponds to time 0:05 - 0:15 of file A?
    If you have unique timecodes for each file, and have properly carried those over into the proxies, you may be able to do it painlessly, since timecode would indicate which proxies correspond to which segments of the original. I think Resolve's conform abilities might be able to handle this, no clue about Premiere.
    In my experience, there won't be much degradation. It might be visible if you pixel peep, but it certainly will not be immediately noticeable to the average viewer. It may be worth the cost, especially since at this point the H.264 files are really holding up your ability to finish editing. So I wouldn't worry too much about it--it's a non-issue at best, and a necessary evil at worst. But I'd keep those original files around anyway, just in case you find that you need them.
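    As an aside, the filename-and-length assumption is easy to sanity-check before relinking. A rough sketch using ffprobe (the folder names, extensions, and one-frame tolerance are all assumptions):
    ```python
    import json
    import subprocess
    from pathlib import Path

    def duration(path: Path) -> float:
        """Return a media file's duration in seconds, via ffprobe."""
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-show_entries", "format=duration",
             "-of", "json", str(path)],
            capture_output=True, text=True, check=True,
        )
        return float(json.loads(out.stdout)["format"]["duration"])

    # "online/" and "proxies/" are placeholder folders; each proxy is
    # assumed to share the stem of its source file, which is exactly
    # what name-based relinking relies on.
    for src in Path("online").glob("*.mp4"):
        prx = Path("proxies") / (src.stem + ".mov")
        if not prx.exists():
            print(f"missing proxy: {src.name}")
        elif abs(duration(src) - duration(prx)) > 0.05:   # ~1 frame at 24 fps
            print(f"length mismatch: {src.name}")
    ```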
  9. Thanks
    KnightsFan got a reaction from Ki Rin in Best overall 5-6" on-camera monitor (no recording)   
    Yes, it is big enough if you are close enough to be operating (i.e. not 10 feet away). I downsized from 7" to 5" because the 7" was so unwieldy on a DSLR rig. Much happier now.
  10. Like
    KnightsFan got a reaction from Meahmut in which are pros and cons of proxy video: editing, color correction, color grade and crop   
    A proxy workflow is what you make of it. Essentially, you create an exact copy of your entire project in a format that plays back smoother on your system. At any time, you can toggle between using the "online" media and the "offline" media. If you wish, you can do ALL your post production on the proxies (editing, color, sound), and only toggle back to online media for the final export. Or, you can do some editing, switch to online media, do some color, switch back to proxies, do more editing, switch back to online media, edit more--etc. Or you could use proxies to edit, switch back to online media for color correction and export, and never use the proxies again.
    At the other extreme, you can do a proxy workflow where you never switch back to the online media, but then it's called transcoding.
    You can make proxies FOR any format, and your proxies can BE any format. You can make HD ProRes proxies for 8K 10-bit H.265 footage, or 4K H.265 proxies for an HD ProRes project (not sure why you would want to, though...).
    That's all quite abstract, though. In practice, proxy capabilities depend on which software you are using, especially if you are using the built-in proxy generator. In Resolve, if you use Optimized Media (essentially, proxies) you can switch between optimized and original media at any time, like I described. However, you are limited in the formats you can make proxies in, and I've found that Resolve "loses" optimized media all the time. Personally, I make proxies in ffmpeg and manually switch out which files my timeline uses. That way I have maximum control over the proxy format, and can easily troubleshoot problems.
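    A stripped-down sketch of that ffmpeg pass, scripted from Python (the paths, the 1280-wide scale, and the ProRes Proxy codec are placeholder choices, not requirements):
    ```python
    import subprocess
    from pathlib import Path

    SRC = Path("footage")       # placeholder: original camera files
    OUT = Path("proxies")       # proxies keep the source filename
    OUT.mkdir(exist_ok=True)

    for clip in SRC.glob("*.mp4"):
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-vf", "scale=1280:-2",                   # small enough to play anywhere
            "-c:v", "prores_ks", "-profile:v", "0",   # ProRes Proxy
            "-c:a", "pcm_s16le",
            str(OUT / (clip.stem + ".mov")),
        ], check=True)
    ```
    Keeping the source filename (just a different folder and extension) is what makes pointing the timeline at either set of files painless.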
    A decent workflow should allow you to do crops/reframing and variable framerates without issue, but it depends on the software you use.
     
    In general, the only pro is smoother playback while editing. However, proxies are also a huge benefit if you have a remote editor and need to send files over slow internet connections. My 500 GB project is only 10 GB in proxy form. I can use Google Drive to send the entire project to an editor, and all they have to send me in return is an XML.
    Cons are a messier workflow, files becoming unlinked if your workflow is not perfect, and tons of headaches. But all of those problems can be avoided if you know what you are doing and rigorously test your workflow before using it on a project.
  11. Like
    KnightsFan reacted to kye in which are pros and cons of proxy video: editing, color correction, color grade and crop   
    In addition to the excellent comments from @KnightsFan above, proxies have cons for colour grading and VFX.
    The pros warn against doing colour correction and grading on proxies because they're not an exact colour match to the original footage.  Also, if you're doing any tracking then you'll want to do that on the original footage so that you get the best movement accuracy possible.  If you're tracking a grading window with a large soft edge then it might not be that important, but the harder the edge on a grading window, or the stronger the adjustment, the more chance it will be visible to the viewer.  For VFX, tracking accuracy is an absolute must: if your compositing doesn't track perfectly with the scene then it can be quite obvious - human perception is a lot better than you'd think.  This is why for VFX work and green screening it's best to shoot RAW, as it eliminates the pixel-level errors of compression.
    In a practical sense, if you're not doing huge-budget work or VFX stuff, you can use lower resolution proxies to edit and do rough colour work, switching to the source media for final grading and for tracking any windows.  For my own projects, I will render out the final project and watch it through for any tweaks I want to do, then tweak and re-export.  This works if you have time to do so, but it depends on your schedule and the level of attention to detail your budget covers.
  12. Like
    KnightsFan got a reaction from leslie in which are pros and cons of proxy video: editing, color correction, color grade and crop   
  13. Like
    KnightsFan got a reaction from Grimor in which are pros and cons of proxy video: editing, color correction, color grade and crop   
  14. Like
    KnightsFan reacted to webrunner5 in hi which interal HDD do you suggest me to buy?   
    You don't want any kind of HDD for editing, you want an SSD. A high-end Samsung one.
  15. Haha
    KnightsFan got a reaction from webrunner5 in Z Cam E2 will have ONE HUNDRED AND TWENTY FPS in 4K??   
    There you go! And you get to feel like a secret agent every time you use it.
  16. Like
    KnightsFan reacted to DBounce in Z Cam E2 will have ONE HUNDRED AND TWENTY FPS in 4K??   
    Hey, I have a clue... I've played with both of these cameras, and I can tell you this: I find the Z much more interesting than the P4K. Besides, do we really need another P4K review?
    That said, there is so much coming out next year that it might be better to sit on the sidelines if you already have usable cameras. I'm really curious to see what Panasonic comes out with to vanquish all the competition. And let's not count out Sony just yet... they have the tech, but do they have the will?
    I'll wait until the dust settles and then decide. For now I'm more interested in the other elements, such as camera movement and lighting. Let's face it, any one of these cameras can produce a cinematic image if you nail the other components.
  17. Like
    KnightsFan got a reaction from deezid in Z Cam E2 will have ONE HUNDRED AND TWENTY FPS in 4K??   
    Yeah, people have complained about that on the Facebook group. Apparently Z Cam is putting a no-sharpening mode into a firmware update, as well as an option for less noise reduction. I'm not sure whether that firmware update has been released yet, to be honest.
  18. Thanks
    KnightsFan got a reaction from IronFilm in Color Science Means Nothing With Raw... Really?   
    To anyone who says "color science is BS": I'm curious what your definition of color science is.
    From the CFA, to the amplifier, to the ADC, to the gamma curve and the mathematical algorithm behind it, to the digital denoising and sharpening, to the codec--someone has to design each of those with the end goal of making colors appear on a screen. Some of those components could be the same between cameras or manufacturers. Some are not. Some could be different and produce the same colors.
    Even if Canon and Nikon RAW files were bit-for-bit identical, that doesn't negate the fact that science and engineering went into designing exactly how those components work together to produce colors. As it turns out, there usually are differences. The very fact that you have to put effort into matching them shows that they weren't identical to begin with.
    And if color science is negated by being able to "match in post" with color correction, how about this: you can draw a movie in Microsoft Paint, pixel by pixel. There is no technical reason why you can't draw The Avengers by yourself, pixel for pixel, and come up with the exact same final product that was shot on an Arri Alexa. You can even draw it without compression artifacts! Compression is BS! Did you also know that if you give a million monkeys typewriters, they will eventually make Shakespeare? He wasn't a genius at all!
    The fact that it's technically possible to match in post does not imply equality, whether it's a two minute adjustment or a lifetime of pixel art. Color science is the process of using objective tools to create colors, usually with the goal of making the color subjectively "good." If you do color correction in post, then you are using the software's color science in tandem with the camera's.
    Of course, saying one camera's color science produces better results is a subjective claim...
    ...but subjectivity in evaluating results doesn't contradict science at all. If I subjectively want my image to be black and white, I can use a monochrome camera that objectively has no CFA, or apply a desaturation filter that objectively reduces saturation. If you subjectively want an image to look different, you objectively modify components to achieve that goal. The same applies to other scientific topics: If I subjectively want larger tomatoes, I can objectively use my knowledge of genetics to breed larger tomatoes.
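    To make that last point concrete, here's a toy desaturation filter: the amount is the subjective knob, while the luma weights are the objective, standardized part. (A NumPy sketch, not any particular software's implementation.)
    ```python
    import numpy as np

    def desaturate(rgb: np.ndarray, amount: float = 1.0) -> np.ndarray:
        """Blend an (H, W, 3) RGB image toward its Rec. 709 luma.

        `amount` is the subjective choice (0 = untouched, 1 = fully mono);
        the luma weights are the objective part, fixed by ITU-R BT.709.
        """
        luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
        return rgb * (1.0 - amount) + luma[..., None] * amount
    ```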
  19. Like
    KnightsFan got a reaction from webrunner5 in Color Science Means Nothing With Raw... Really?   
  20. Like
    KnightsFan reacted to Cliff Totten in Sony A7R IV / A7S III / A9 II to feature 8K video, as new 60MP and 36MP full frame sensor specs leak   
    Absolutely right. All the gain in the universe doesn't turn a half-full pixel well into a full pixel well. Gain has zero effect on the amount of light (photons) that was counted, and gain cannot improve the signal-to-noise ratio either. A 25% full pixel well will always be noisier than a 99% full well.
    Yes, you can take a 25% full pixel well's reading (a dim voltage) and amplify it to represent pure white, but it will never be the same as getting a pure white value from a 100% full pixel well (and the same goes for all the mid-tone mappings that come with it).
    "Gain" is not "exposure" in any way, shape, or form.
  21. Like
    KnightsFan reacted to Cliff Totten in Sony A7R IV / A7S III / A9 II to feature 8K video, as new 60MP and 36MP full frame sensor specs leak   
    That was me that posted that on Sony Alpha Rumors. I read that MIT is working on a sensor readout technique that virtually doesn't allow a photosite to saturate. It makes a photosite fill up, dump its value, and fill up again and again until the readout cycle is complete.
    If you were using a 1/30 second shutter setting, you could collect and "add up" the photons in your brightest highlight areas 2 or 3 or more times within that 1/30 second shutter opening.
    Your dark shadow areas won't get much help, because those photosites would never completely fill up, but your highlights theoretically have no limit (except for the readout and processing speed of the A/D conversion). How fast can the processor handle a photosite fill, read, dump, add, fill, read, dump, add in 1/30 or 1/60th of a second?
    That's a TON of adding work for each photosite. You would probably need a specialized processing chip to take advantage of this new way of handling A/D.....
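    In toy form, that fill/read/dump/add idea for a single photosite looks something like this (the well capacity and photon counts are invented numbers):
    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    FULL_WELL = 10_000                 # electrons; invented well capacity

    def exposure(photons_per_frame: int, n_dumps: int) -> int:
        """Split one exposure into n_dumps fill/read/dump/add cycles."""
        arrivals = rng.poisson(photons_per_frame / n_dumps, size=n_dumps)
        # Each cycle clips at the well, then the reads are summed.
        return int(np.minimum(arrivals, FULL_WELL).sum())

    BRIGHT = 25_000   # a highlight delivering more photons than one well holds

    print("single read:", exposure(BRIGHT, 1))   # clips at 10,000 -> blown out
    print("three dumps:", exposure(BRIGHT, 3))   # ~25,000 -> highlight preserved
    ```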
    Oh wait!!... isn't that what Sony is saying? You must also buy their specially paired image processor to go with this sensor? (Otherwise it can't work with any other?)
    Hmmmm.......
    P.S. I'm sure this is a gross oversimplification of what MIT and maybe Sony are doing. We will need to see how this develops to truly know. I think it's clear that Sony is doing something unusual to make these photon collection numbers so high using photosites that are only 4+ microns across.
    CT
  22. Like
    KnightsFan got a reaction from IronFilm in 8K is still 4 years out   
    One thing no one talks about is audio for UHD. I have a 4K TV and a 4K graphics card, so I should be able to watch 4K content, right? Nope. My receiver can only take 1080p HDMI, and that is the only way to get 5.1 audio. So to get 4K, I have to settle for stereo sound via the aux input.
    I have no desire to buy a whole new receiver just to get 4K, let alone 8K, and 5.1 is more important than high resolution to me.
    It's frustratingly ironic that wanting decent audio is the reason I can't watch 4K right now and have absolutely no interest in 8K content.
  23. Like
    KnightsFan got a reaction from IronFilm in Z Cam E2 will have ONE HUNDRED AND TWENTY FPS in 4K??   
  24. Like
    KnightsFan got a reaction from Ivko Pivko in 2019: Are we Finally There???   
    The real problem with IoT devices is internet security. This happened while I was in an internet protocols class and made for some good discussion: https://www.techtimes.com/articles/183339/20161024/massive-dyn-ddos-attack-experts-blame-smart-fridges-dvrs-and-other-iot-devices-why-your-internet-went-down.htm
    How good is the security on your coffee machine? Making coffee one day, being leveraged to attack Twitter tomorrow, and providing a backdoor into your home network the day after. It's worth being wary of cheap, internet-enabled devices with little to no security.
  25. Like
    KnightsFan got a reaction from kaylee in Help me record a rap   
    I use Reaper as a DAW. It has a free trial that never actually expires, and the full version is like $50. I've been using it for maybe 4 years and absolutely love it. I mainly use it for mixing for films, but have used it occasionally for recording simple stuff (which I do in a closet with heavy sleeping bags hanging about for isolation, if you are looking for budget ideas). I use my Zoom F4 as an audio interface, and monitor with the classic MDR-7506 headphones.