Everything posted by KnightsFan

  1. Nice analysis. I'm glad you actually delve into the numbers and explain the different types of encoding! However, it would be better if you included files straight out of camera and ran analysis on them, both qualitative and quantitative. Since ProRes and AVC-Intra use different algorithms, simply comparing data rates is not an accurate representation of their fidelity. I suggest encoding an uncompressed video into ProRes, and also into AVC-Intra, and then running a script to compare the compressed images to the original and get an exact number on how much scene data is lost (a sketch of one such script follows this post). Also, your title is meaningless unless you specify what the 10 bit internal is not good enough for. Is 10 bit theoretically as good as ProRes? Maybe--let's run some tests and see. Is the GH5's specific implementation inferior to external recording to ProRes HQ? We need real world tests again. Are the GH5's 10 bit codecs better than any other photo/video hybrid's codecs? Almost certainly. Is it good enough for anything we used the GH3 and GH4 for? Of course!
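A minimal sketch of the kind of comparison script I mean, in Python, assuming ffmpeg is on your PATH. The filenames and bitrate are placeholders, and libx264 in all-intra mode stands in for a true AVC-Intra encoder:

```python
# Encode an uncompressed master two ways, then score each result against
# the original with ffmpeg's psnr filter. Assumes ffmpeg is installed;
# filenames and the 100M bitrate are placeholders, not recommendations.
import subprocess

def encode(src, dst, codec_args):
    subprocess.run(["ffmpeg", "-y", "-i", src, *codec_args, dst], check=True)

def score(encoded, reference):
    # The psnr filter prints an average PSNR to stderr when the job ends.
    subprocess.run(
        ["ffmpeg", "-i", encoded, "-i", reference, "-lavfi", "psnr",
         "-f", "null", "-"],
        check=True,
    )

master = "uncompressed_master.mov"
encode(master, "prores_hq.mov", ["-c:v", "prores_ks", "-profile:v", "3"])
# All-intra H.264 (GOP of 1) as a rough stand-in for AVC-Intra:
encode(master, "avc_intra.mov", ["-c:v", "libx264", "-g", "1", "-b:v", "100M"])

for clip in ("prores_hq.mov", "avc_intra.mov"):
    score(clip, master)
```

Higher PSNR at the same data rate means the codec preserved more of the original scene data, which is the number the data-rate comparison can't give you.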
  2. I believe E shutter is only available with native Samsung lenses. I read through a DPR thread about it, and some people claim E shutter can introduce rolling shutter artifacts. https://***URL removed***/forums/thread/3784289#forum-post-55098816 However, I'm not sure that's the case since only the first curtain is electronic in the NX1's implementation. Usually rolling shutter is due to readout speed, and since the second shutter is mechanical you wouldn't necessarily have any issues there. I may try to dig out my only native lens and do some tests now that you mention it.
  3. This is amazing! Really fantastic. I'm considering building an electronic follow focus for just spherical lenses. Do you mind sharing more about how you made it, like what servos you ended up using?
  4. You'll have trouble saving space AND reducing export times, unless you are willing to sacrifice a lot of quality. The reason MJPEG and ProRes files are huge is that they are lightly compressed, i.e. they take less processing power to decode. So here are a few options: 1. Transcoding to ProRes, DNxHD, or Cineform seems like a bad idea, since you won't save any space at comparable image quality. 2. You could save a ton of space with HEVC--100 Mbps would be plenty--but pay for it in increased exporting time. 3. H.264 might be your best option for balancing size, quality, and encoding time. Maybe do some tests at 200 Mbps and see how it goes (a quick test harness follows this post).
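To make options 2 and 3 concrete, here's a rough test harness in Python, assuming ffmpeg is installed; the clip name and bitrates are placeholders:

```python
# Transcode one source clip at the two bitrates discussed above and time
# each encode. ffmpeg must be on PATH; filenames are placeholders.
import subprocess
import time

def transcode(src, dst, codec, bitrate):
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-b:v", bitrate,
         "-c:a", "copy", dst],
        check=True,
    )
    print(f"{dst}: {time.perf_counter() - start:.1f} s")

transcode("clip.mov", "clip_h264.mp4", "libx264", "200M")  # option 3
transcode("clip.mov", "clip_hevc.mp4", "libx265", "100M")  # option 2
```

Compare the file sizes, the printed encode times, and how each file scrubs in your editor, and you'll know which trade-off suits you.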
  5. @Allen Smith do you get the same jitter with other cameras shooting at 24p? If not, then what you're seeing is not frame conversion judder.
  6. I would not use dynamic linking (or whatever it's called with Resolve/Fusion) if you can avoid it. Just bring the original clip into the compositor of your choice, do the effects, and render out a new clip. Then replace the clip in Resolve. The key is to be organized, so you can easily remember which effects clips replace which raw clips. You'll have no performance hit, fewer chances of crashing, and more control over what gets rendered when, and it will make no difference whether you use Fusion, AE, both, or something else entirely. No point forcing yourself to use a specific tool just because it's the same brand as your editor! I do recommend learning Fusion at some point if you have a lot more VFX shots in the future. Fusion is much easier to composite with; AE is better for motion graphics animation.
  7. 1. Personally, I would composite first and grade afterwards. However, I don't see any reason you can't do it the other way around if that workflow is easier for you. The benefit of compositing first is that you composite on the original footage, and render that once before the final export. Going the other way, you transcode, then composite, then render, then do the final export. It takes more time, and arguably reduces quality. 2. You want the highest quality possible without overwhelming your hard drive. I always render effects as lossless image sequences, either PNG or TIFF. If you want high bit depth, 16-bit TIFF is the way to go, since PNG is 8 bit except in rare circumstances. The benefit of an image sequence is that you can re-render specific sections without re-rendering the entire thing, which, depending on the effects, could save you a LOT of time (conforming the sequence back into a clip is a single command; see the sketch below). Again, if you composite first, you're only rendering out the effects once instead of twice, which saves a lot of HDD space. 3. Depending on your effects and how much you care about image fidelity, you might be able to simply render the effects from AE, and composite them in Resolve. That's what I do whenever possible. It saves rendering time and means no transcoding of any kind before rendering.
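If you go the image-sequence route, wrapping the frames back into a clip for Resolve is one ffmpeg call. A minimal sketch, assuming a 24 fps sequence named shot_0001.tif, shot_0002.tif, and so on (the names and frame rate are placeholders for your project):

```python
# Wrap a 16-bit TIFF sequence into a ProRes 422 HQ clip for the edit.
# Assumes ffmpeg is on PATH; pattern and frame rate are assumptions.
import subprocess

subprocess.run(
    ["ffmpeg", "-framerate", "24", "-i", "shot_%04d.tif",
     "-c:v", "prores_ks", "-profile:v", "3", "shot_comp.mov"],
    check=True,
)
```

Re-render only the frames that changed, run this again, and the conformed clip updates without touching the rest of the shot.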
  8. It's not RAW the way photo cameras use the term: lossless linear camera data. It's the same story with REDCODE RAW, Canon's Cinema RAW Light, and Blackmagic's compressed raw. They realized that compressing before debayering lets them pack more meaningful data into smaller files, with the caveat that the user is required to debayer the files later. And, naturally, they call it raw because they want people to confuse it with actual RAW camera data.
  9. You could, but having a unique identifier in both the video and audio file is more foolproof. If you bulk rename to date and time, you rely on accurate date and time settings in both the camera and the recorder. If someone replaces batteries in the DR-60 and doesn't set the time correctly afterwards, you're in for a huge headache. Also, you will never start and stop the audio and video at the exact same moment, so the file names won't match perfectly anyway. It's very easy and reliable to read four numbers off the screen. An added benefit of reading the numbers off is that you immediately know if you didn't record audio for a take.
  10. You should be able to send audio from one of the Camera Outs on the DR-60 to the microphone jack on the NX1 via an aux cable. However, I tried this once and got horrible noise, so your mileage may vary. For syncing, I usually read the name of the audio file off the DR-60 screen. That way I know which audio file goes with which video file. You could also write down the filename on a whiteboard or clapperboard and hold it in front of the camera, but I've found it's easier to have that identifier in the camera's internal audio than in the video image. Whichever works for you. You should also make a loud clap so you have a clear point to sync on. In post, it's usually easy, but time consuming, to sync audio and video manually. Alternatively, there are several automatic tools. Premiere and Resolve both have built-in features where you select an audio file and a video file, and they sync automatically. I've found they work some of the time. PluralEyes also seems to be a popular tool, though I've never used it. (The sketch below shows the basic idea behind these tools.)
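For what it's worth, those automatic tools boil down to cross-correlating the two audio tracks. A minimal sketch with NumPy/SciPy, assuming both tracks have already been extracted to mono WAV files at the same sample rate (the filenames are placeholders):

```python
# Find the offset between the camera's scratch audio and the DR-60 track
# by locating the peak of their cross-correlation. Assumes both files are
# mono WAVs at the same sample rate.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

rate, cam = wavfile.read("camera_scratch.wav")
rate2, rec = wavfile.read("dr60_track.wav")
assert rate == rate2, "resample one track first"

corr = correlate(rec.astype(float), cam.astype(float), mode="full")
lag = int(corr.argmax()) - (len(cam) - 1)  # samples rec lags behind cam
print(f"event occurs {lag / rate:+.3f} s later in the recorder track")
```

Slide the recorder track by that offset in your timeline and the clap should line up; a clean, loud clap is exactly what gives the correlation a sharp peak.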
  11. I read that it's a limitation of the physical design of the E-mount. The mount diameter is very small, so lenses can't project a large enough image circle to allow sensor movement without vignetting. Source: https://petapixel.com/2016/04/04/sonys-full-frame-pro-mirrorless-fatal-mistake/
  12. I got one of these Talentcell power banks: https://www.amazon.com/Talentcell-Rechargeable-6000mAh-Battery-Portable/dp/B00MF70BPU/ref=sr_1_1?ie=UTF8&qid=1532111479&sr=8-1&keywords=talentcell+6000+mah It powers my F4 for several days of shooting on a single charge. It can simultaneously provide power and charge from an AC outlet, which saved me once when I forgot to charge it the night before. I have my battery mounted to the top of the F4 with a custom clamp, but you could probably use a cell phone tripod adapter to hold it on the F4. You could also use gaffer tape if all else fails.
  13. Exactly. I could just do away with the USB cable clamp, so that I can unplug the cable when necessary to avoid the glitch, but it's a lot less sturdy. @iamoui Thanks for the suggestion! I considered that early on, but I like powering both the monitor and camera (and any other accessories) from the same power bank. Furthermore, the battery grip would add extra height, which throws off the balance of my rig. If this glitch can't be solved, I'm satisfied with having a non-permanent USB cable that I can detach when necessary. And eventually I hope to figure out a way to make it "quick release." But it would certainly be ideal if there were a way to make the setup I shared in the photo work properly.
  14. I see, makes sense! Yeah, don't leave any long HDMI cords dangling about. I think if you're in a shop and don't need to move the monitor around very much, 7" won't be too large. I personally might even favor the suggestion of a PC monitor, or an HDTV that you can see from across the room. Is there any chance you can get audio meters and peaking out from your GH5? Speaking to the monitors you've noted, my experience with cheaper monitors is that pixel-to-pixel never works with 4k. The Bestview pretty clearly states that it DOES, but I assume that both Feelworld monitors do NOT, despite what any YouTubers say. The Feelworld monitors I've used will zoom in, but they zoom in on the downscaled image, not the original 4k one, so it's really pretty useless. It seems, then, that the Bestview is the best of these for checking focus, though pricey. I wonder if you could ask a seller whether UHD 23.976 and 29.97 are supported; it does seem very odd if not. Of the two Feelworlds, it seems the main difference is the number of custom buttons, which really doesn't make an $80 difference to me. I usually leave focus peaking on. I switch false color on and off regularly, but tbh I don't regularly toggle any other features.
  15. I, too, switched from 7" to 5" because the large screen was unwieldy. However, the size was not an issue for locked off tripod shots. It was just a problem for handheld or those occasional really tight spaces. I could see 7" being perfect for a one man show, where I assume you'll be using a tripod most of the time. However, if you ever do another project with a crew or handheld shots, I think the 5" will be more useful. Either way, what about using a long HDMI cable and bringing the monitor with you, rather than leaving it on the camera? You can just set it down out of frame when you're ready to roll.
  16. There are two USB modes, Mass Storage and Remote Access. In Mass Storage mode, the camera will lock up if a USB cable is plugged in (such as a permanent right angle adapter...) It's in Remote Access that I get the HDMI glitch. True, the batteries last a long time compared to many other cameras. But 90% of my shots are on a tripod with room for a massive battery, so it's a choice between changing batteries every two hours, or never having to worry about batteries.
  17. One of the biggest annoyances about the NX1 for me is that the HDMI output and USB charging barely work together. If anyone else has overcome this issue, please let me know. Essentially, if there is a USB cable attached to the NX1, but the cable is not attached to a power bank when the camera is powered on, the HDMI output will not work at all. My ideal setup is to power the camera from a TalentCell power bank, which also powers my monitor (fantastic solution, btw; it runs all day and can even be tethered to an AC outlet). To that end, I 3D printed a clamp that holds both an HDMI and a USB adapter. It's sturdy and great for cable management. However, if I power on the camera without attaching the USB cable to the power bank first, the HDMI out will not output any signal at all. For example, if I power on the camera as seen in the photo above, with right angle adapters in place that go nowhere, then there will be no way to get an HDMI signal out of the camera; I would have to power off, attach the USB cable to my power bank, and then power the camera back on. Pretty much any other combination works perfectly: if the camera is powered on without the right angle adapters in place, no issues. I can even disconnect the USB cable after powering on the camera, and HDMI will continue to work. It really just seems to be an issue of whether there is a USB cable-to-nowhere at the moment the NX1 is switched on. Has anyone else encountered this, or even better, overcome this?
  18. I'm not saying it looks like film grain, but digital noise seems fine to me before compression.
  19. Compression also hurts digital noise. Noise from uncompressed Raw files looks great; it's only after compression that it becomes video-y.
  20. @Timotheus No, it does not have anamorphic mode.
  21. Forgot to mention: + Almost no latency with a 1080p signal. It's only noticeable at all on very fast whip pans. I have not tried with a 4k signal yet.
  22. I got the A5 in a rush (after finding at the last minute that the Ninja 2 does not work with the NX1), and have been shooting with it every day for the past two weeks. I'm pretty happy with it. Regarding the discussion of brightness, it's adequate: I've had the brightness at 50% and it's viewable outside for framing and peaking, though it's no match for an EVF. All in all I would totally recommend it to anyone on a budget who doesn't want a massive 7" monitor. But make sure your batteries work before using! Here are my pros and cons, loosely in comparison with the Feelworld FW760 that I used to use.
      + Smaller but still 1080p. 5" vs 7" is a huge difference, and it also has thinner bezels and a thinner body, so it's actually much, much smaller.
      + MUCH lighter. Makes my setup more balanced.
      + HDMI out.
      + Mounting points on three sides.
      + Screen is fantastic. Focus peaking works very well.
      + It remembers settings through power on/off cycles. The FW760 would reset everything when it powered down.
      - The battery plate is finicky. It's not a problem for me since I power everything from a power bank (I didn't even own any batteries for my FW760). It will not work with a Wasabi LP-E6, but it will work with an Ipax LP-E6, even though BOTH of those batteries power a 5D3 perfectly.
      - Plasticky. I think the FW760 felt a little sturdier, but not by much. However, the mounting points seem secure and it's not going to fall apart in your hands. Sort of the build you expect from a screen at this price.
      - Only one customizable button.
      - False color does not have a color scale on the side, so you have to guess which IRE each color corresponds to.
      - The 1/4-20 threads are too shallow. I had to put a few washers on my friction arm's thread.
      - The hot shoe mount that comes with it is junk.
      - It would have been really nice to put one of these on the back: https://www.amazon.com/SmallRig-Camera-Threaded-Monitor-Accessories/dp/B01NCK79G2 . I had one on my FW760 and used a lock washer on the friction arm to keep it from accidentally rotating.
  23. I'm not 100% sure I'm following you. If you mean that the focus throw of the lens is too short to accurately pull focus, that is entirely due to the construction of the lens. You can make the same optics have a focus throw of 5 degrees or 300 degrees.
  24. No, we were talking about pixel vignetting, which is caused by pixels being recessed. Oblique rays of light get occluded by the "rim" of the pixel. The farther from the center of the image, the more oblique the rays that strike that pixel; thus, more light is occluded by the "rim" at the corners, which causes vignetting. My thought was that if you simply take a design and scale it down, none of the angles will change. So the light rays passing through a 50mm on full frame would have the same angles as they would through a 25mm on MFT, if measured at corresponding points on the sensors. Now, I just did a back-of-the-envelope calculation that implies I was wrong, since we are measuring at equivalent DOF, which interestingly makes the diameter of the aperture equal (the arithmetic is below; maybe that's obvious to all of you, but I found it interesting). But I don't know enough about optics to be sure that any of my thought process is correct! True! That is why Andrew compared a 17.5mm f0.95 to a 35mm f2.0. Halving the focal length requires an aperture two f-stops wider for equivalent depth of field.
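For anyone curious, here is the back-of-the-envelope arithmetic, using the entrance-pupil formula and the two lenses Andrew compared:

```latex
% Entrance-pupil (aperture) diameter for focal length f and f-number N:
%   D = f / N
% Matching framing and DOF across the 2x crop keeps D fixed:
\[
  D_{\text{FF}}  = \frac{35\,\text{mm}}{2.0}  = 17.5\,\text{mm},
  \qquad
  D_{\text{MFT}} = \frac{17.5\,\text{mm}}{1.0} = 17.5\,\text{mm}
\]
% (f/0.95 is marginally wider than the exact two-stop equivalent, f/1.0,
%  so its pupil works out to about 18.4 mm rather than exactly 17.5 mm.)
```

So at equivalent framing and DOF, the physical aperture diameter is the same on both formats, which is the part I found interesting.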
  25. I'm probably wrong; I'll need to read up more on lens design. It was an intuition based on the fact that scaling an object doesn't change any angles--but again, I am not sure it's right. I hoped someone could explain why it was right or wrong. @Brian Caldwell That's how I understood you before, thanks for clarifying!