Everything posted by KnightsFan

  1. @Emanuel ideally we could find someone to record on both an Atomos and internally on really any camera. In 1080p, ProRes Proxy is about 35 Mbps, so an F3 would be great. In 4K we could compare a GH5S in 150 Mbps mode with ProRes Proxy. I don't have an external recorder, so on my own I am limited to finding some raw or uncompressed footage and using ffmpeg. Obviously that's using an overpowered encoder, unfortunately, though I can lower the encoding speed. The benefit of doing it with ffmpeg, though, is that we can run PSNR tests if you are interested, though PSNR is a really bad metric for video quality. From Wikipedia: "Although a higher PSNR generally indicates that the reconstruction is of higher quality, in some cases it may not. One has to be extremely careful with the range of validity of this metric; it is only conclusively valid when it is used to compare results from the same codec (or codec type) and same content.[8][9] Generally, PSNR has been shown to perform poorly compared to other quality metrics when it comes to estimating the quality of images and particularly videos as perceived by humans." And you are right that we are mainly interested in a specific context. My original context was casual use: snapping a family video, or uploading to Facebook with little to no editing. I've actually seen both those videos before. The first test is flawed because the Canon is a completely different processing pipeline; the codec is one of many variables. Both tests are flawed because they were uploaded to YouTube, which uses a significantly lower bitrate than either of the two original clips. So unless you are specifically trying to decide which looks better on YouTube, it is pretty useless to compare codecs in a YouTube video.
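     For anyone who wants to run the numbers themselves, PSNR is simple to compute. A minimal sketch in Python, assuming 8-bit frames represented as flat lists of pixel values (the frame data below is made up for illustration; a real test would decode actual frames, e.g. with ffmpeg's psnr filter):

     ```python
     import math

     def psnr(frame_a, frame_b, max_value=255):
         """Peak signal-to-noise ratio between two equal-length frames, in dB."""
         mse = sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)) / len(frame_a)
         if mse == 0:
             return float("inf")  # identical frames
         return 10 * math.log10(max_value ** 2 / mse)

     # Toy example: a flat gray frame vs. the same frame shifted by 16 levels.
     original = [128] * 1024
     degraded = [144] * 1024
     print(round(psnr(original, degraded), 2))  # 24.05
     ```

     Note how a uniform 16-level shift, which a viewer might barely notice, scores the same as 16 levels of ugly blocking noise would — which is exactly why the Wikipedia quote above warns against leaning on PSNR across different codecs.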
  2. @Emanuel maybe I should do another blind codec test for everyone, comparing H.264 with ProRes Proxy?
  3. @androidlad Jeremy says in the video that Nikon hasn't decided whether to use 10 or 12 bit, or whether to use linear or log yet. According to the ProRes RAW PDF, it "decodes to linear," which means that it could be stored in log. In any case, the format doesn't know what the bits mean. Saying ProRes RAW is linear is like saying HDMI carries a YUV signal... Atomos and Nikon can put whatever they want into that signal. @Eric Calabros seems right. At the moment they are recording 12 bit linear, but the specs of what they will release to the public are not finalized. Not that I take the 10 bit as fact either; it's still up in the air.
  4. "Z CAM E2 is formally ProRes licensed!" -Kinson. The next firmware update will unlock it automatically; in the meantime they posted a method to unlock ProRes manually. Unlike the internal H.265, which is 10 bit 4:2:0, internal ProRes on the E2 has 4:2:2 subsampling.
  5. Yup. You'd have to use a very poor H.264 encoder for it to look as bad as ProRes Proxy at the same bitrate. Again, not talking about serious production here. I'm saying that for people who want to use their camera for casual purposes, the presence of lower bitrate video is a benefit, while obviously the Z6 with an Atomos has ProRes as well.
  6. ProRes Proxy looks pretty terrible compared to H.264 at a similar bitrate. LT is 3x the size. ProRes really just isn't a great family of codecs for casual videos.
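     The size difference is easy to put in concrete terms. A quick back-of-the-envelope sketch — the bitrates below are rough approximations for 1080p24 (check Apple's ProRes white paper for exact figures), and the H.264 figure is just a typical consumer camera setting:

     ```python
     def file_size_gb(bitrate_mbps, minutes):
         """Approximate file size: Mbps * seconds / 8 bits-per-byte / 1000 MB-per-GB."""
         return bitrate_mbps * minutes * 60 / 8 / 1000

     # Rough 1080p24 bitrates (assumptions, not exact spec values)
     for name, mbps in [("H.264 (camera)", 28), ("ProRes Proxy", 36), ("ProRes LT", 82)]:
         print(f"{name}: {file_size_gb(mbps, 10):.2f} GB per 10 minutes")
     ```

     Ten minutes of family vacation at ProRes LT rates fills a card several times faster than consumer H.264, which is the whole point about casual use.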
  7. Let's not forget that the Z6 can shoot in standard H.264 as well! That's a huge benefit for hobbyists or enthusiasts who want high fidelity for projects, but also use their camera for casual purposes like family vacations.
  8. I never said uncompressed, but Raw should be mathematically lossless to be a legitimate Raw format. It should also have a linear gamma, little to no digital processing, and no debayering if originating from a Bayer sensor. Any video format can be treated as a "digital negative" and store information that can be recovered later; most log curves are designed to do just that. If I make a video format that simply takes the Bayer data from a sensor, pretends that every group of 3 pixels is a single RGB pixel, then applies ordinary ProRes compression to the entire thing, that's certainly not RAW at all. I suspect that is very nearly what ProRes RAW does. You can essentially change the white balance on any video if you transform it into linear gamma before doing so. The only hindrance is that most cameras don't store the original white balance metadata in non-raw formats. As long as ProRes RAW retains enough information per pixel, it should be just as flexible as any of the other Raw-lite formats for white balance adjustments. It does seem silly that it is not built in, though.
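     The "fake raw" packing described above could be sketched like this — purely illustrative of the thought experiment, not a claim about what ProRes RAW actually does internally:

     ```python
     def pack_bayer_as_rgb(bayer):
         """Naively treat every 3 consecutive Bayer sensor values as one RGB
         pixel, with no demosaicing -- the thought experiment from the post."""
         usable = len(bayer) - len(bayer) % 3  # drop any trailing partial group
         return [tuple(bayer[i:i + 3]) for i in range(0, usable, 3)]

     mosaic = [10, 200, 30, 40, 180, 25]  # made-up raw sensor values
     print(pack_bayer_as_rgb(mosaic))  # [(10, 200, 30), (40, 180, 25)]
     ```

     The point of the thought experiment: the result is bit-identical sensor data wearing an RGB costume, so compressing it with an ordinary YUV/RGB codec afterwards would stretch the meaning of "RAW."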
  9. I agree with @GiM_6x. We should put some effort into using words correctly. It's the reason Netflix won't allow upscaled Alexa footage to be considered 4K. ProRes RAW, REDCODE RAW, and Blackmagic RAW are all fine formats. I don't have an issue with them as formats, but it is marketing BS to call them "RAW." It's not a matter of what's "needed" for a given production; it's quite simply a matter of using correct terminology. It's like putting an 8 bit stream into a 10 bit container and calling it 10 bit.
  10. @Mokara I will change camera settings sometimes for a specific creative look, but 95% of the time I keep them locked as above because I'm confident in how that will look in relation to the real-world lighting, and I'm comfortable grading that image. Being familiar with the specific image removes the need for expensive color-calibrated monitors on set, since you know in your head what it will look like. If I want the look to change, I adjust lighting.
  11. Yes, I'm aware of that. That's why it's surprising that a 32 GB card with a max read speed of 80 MB/s has a slower write speed than a 16 GB card with a max read of 45 MB/s, both from SanDisk. It's a glitch of some kind, yes. It's strange that it is only present in that one specific white balance mode, though.
  12. Amazing. So is this the first hybrid camera to natively offer RAW video? I would never have guessed that Nikon would be first.
  13. @DaveBerg Haha, I totally understand! I once tried gaming with a 4k monitor, but it was limited to 30 fps and I couldn't stand it! It really is just that we are used to "cinema" being really slow. I sometimes shoot whitewater kayaking at 120 as well. The motion is beautiful, really a night and day difference compared to 30 or 24. Although, I'm not sure I'm ready for that smoothness in a narrative film yet.

      I have issues like this too. I have an "80 MB/s" 32GB SanDisk that is too slow for 120 fps (though it works perfectly for 4k/24). Then I've got a pair of "45 MB/s" 16GB SanDisks that are somehow fast enough for 120. Doesn't really make sense to me.

      Actually, my NX1 is on its way out, to be honest. It freezes from time to time; it even did so while shooting tests yesterday. Sometimes the screen is half black, or the menu won't come up. So far, battery pulls have fixed every problem... but I don't think I'll use it on real projects anymore.

      No, but I've seen other mentions of this glitch. My NX1 sometimes has weird color in single frames when using a Kelvin white balance. It's like I'm shooting in Gamma DR and then one frame will be in Gamma Normal. It only happens with a sudden change in exposure, like if I quickly close the aperture while shooting video. And only in Kelvin white balance mode. I never had any color issues in any white balance presets, or using a grey card, or even in AWB.
  14. I think a motorized gimbal will give the best results with the least amount of attention paid to the rig, which seems to be the critical element here. Benefits of a motorized gimbal: it is significantly easier to keep level, which will be HUGE if you are walking on rough terrain or climbing over things, and it is significantly easier and faster to balance, which is crucial if you are in a hurry.

      The glidecam is not a bad option at all, and would be my pick if you can be sure of your footing. A few benefits of the glidecam: no batteries, no firmware, very weather resistant. It's significantly easier to pan and tilt naturally with a glidecam; you just touch it and it moves. Much more intuitive and tactile than twiddling with joysticks or touchscreen apps. You can also carry it easily when not shooting. A motorized gimbal will flop around if it is off, so you need a way to secure it when not in use.

      If neither of those will work, I would go for a fig rig. It's essentially just two handles shoulder-width apart, and a top handle. I know people who have made their own out of old bike wheels. Using one will keep your horizon more level than just holding the camera, and you can operate it one-handed if you need to clamber up stuff. Again, no batteries or firmware to worry about. A fig rig won't do MUCH stabilization, but it's better than nothing.

      My last pick would be the shoulder rig. I can't imagine doing a strenuous hike with one of those on me. Maybe other people have better technique, but for me to get decent results I have to keep my back and neck pretty straight, which is impossible hiking up or down any sort of incline or when scrambling up and over things. And if you need to swing your arm out suddenly to maintain balance... goodbye camera.

      Whatever you go with, I definitely recommend having a quick and secure way to put the rig away and leave both hands free, just in case.
  15. @DaveBerg No problem, I'm really curious to find out what exactly is going on. Earlier today I did a number of tests. I tried PAL 25, 50, NTSC 29.97, and 24. First I compared with an old Olympus camera I had lying around. I almost came back and said I saw what you meant about jitter, because the Olympus has a "smoother" look on pans. It may be due more to a softer image/more compression, though, because I also tested against my friend's XT3, and there I don't see a difference. I put this together as a quick example: https://drive.google.com/open?id=1zdEhB_CPPQyC612tR5aElDlxnMd9WXNd

      How does that video compare with what you observe in your footage? I used the settings you said, except that both the XT3 and NX1 are shooting 24.00 fps with a 1/50 shutter. The Fuji has an Flog/Fgamut to 709/Eterna LUT thrown on. I did notice that the Fuji looked smoother until I put the LUT on; the washed-out appearance falsely smoothed it over in my eyes. If I lower the contrast on the footage, it does appear to have smoother motion. In other words, lower dynamic range makes the sharp boundary between indoor and outdoor extra apparent at a low frame rate, almost "jittery." I don't know if this is the same "jitter" you are describing.

      50 fps with a 1/60 shutter (which is slowed down 2x) will have much more motion blur per frame than 25 fps with a 1/50 shutter.

      I know what you mean! I use Gamma DR, but I spent a lot of time figuring out how to color grade it to where I want. It goes from yuck to fantastic with some tweaking.
  16. @DaveBerg In your footage I see virtually no motion blur, so I think that particular clip used a fast shutter. To be honest, I don't see the jitter you are talking about. It's possible that you have a more critical eye than me, or even that I am so used to an NX1 that I am blind to it. If you get a chance, can you do a side by side with another camera that doesn't have the jitter? Preferably in 4k so it's super obvious. If you don't have the camera with you then no hurry, we can always keep investigating this later.

      Another possibility is that the player you are using has issues with its HEVC decoder, which would explain why you see it and I don't. Here on EOSHD we've already discovered problems using the built-in Windows 10 HEVC decoder inside DaVinci Resolve Free. (This seems unlikely to be the cause, but worth mentioning as we haven't ruled it out yet.) If that is the case, then when I transcode your jittery files on my end to, say, ProRes, the jitter should disappear. If the jitter survives the transcode, then it's either a problem with your settings or a genuine glitch/broken part in your camera (hopefully the former!). However, if you are viewing transcoded footage, then it could still be a decoder issue.

      What firmware do you have? Any hacks currently or previously installed? Does it occur with all lenses? To rule out lens issues, have you tested it with an adapted manual lens? If you tell me a specific set of settings (exposure, picture profile, white balance), I'm happy to shoot some tests on my end and send the files for you to look at.
  17. @Rick_B I appreciate you taking the time to ensure that I see that. Always happy to give my opinion on gear!
  18. Yes. I suppose it depends on the format you choose for your proxies, but for any reasonable proxy format, the answer is yes.
  19. What is your shutter speed? It looks like a lack of motion blur. If your shutter is significantly faster than 180 degrees, low framerate videos will appear "jumpy" as your eye sees individual frames rather than being fooled into seeing motion. The increased sharpness of 4K would exacerbate the problem. Check out: https://frames-per-second.appspot.com/ and check the presets to compare images with and without motion blur.
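     The 180-degree relationship is easy to check numerically. A small sketch, using the usual definition of shutter angle (exposure time as a fraction of the frame interval, times 360):

     ```python
     def shutter_angle(fps, shutter_seconds):
         """Shutter angle in degrees: 360 * exposure time / frame interval."""
         return 360.0 * fps * shutter_seconds

     print(shutter_angle(24, 1 / 48))   # 180 degrees: the classic film look
     print(shutter_angle(24, 1 / 200))  # ~43 degrees: fast shutter, choppy motion
     ```

     Anything far below 180 degrees at 24 or 25 fps means each frame is a crisp still, which is exactly the "jumpy" look described above.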
  20. What software are you using? Tbh, I would be surprised if a slow HDD were the root of your problems when it tells you the GPU is incapable of an effect. Can you tell which frame is causing issues? I.e., is it always crashing on a specific part of the render? If so, can you preview that particular frame, or even export it as a single image? If so, then my guess (and this is only a guess!) is that it's a bug: maybe Sapphire has a memory leak such that, when rendering, it eventually runs out of RAM and bogs down. If that's the case, then no amount of hardware changes would solve anything. However, you may be able to get around it.

      In my experience, effects-heavy sequences fail to export more often than not, so I usually export as an image sequence. That way, if it crashes on frame 30,000, you only need to debug one frame, and you won't have to render the first 29,999 frames again. And if it's a memory leak bogging down a long render, then you won't have any issues at all. At the end, you can stitch the image sequence together with minimal system strain. Of course, for a long project this would require a LOT of disk space, but if you're considering a new SSD anyway it may be worth it.
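     To give a sense of the disk space an image sequence needs, here's a rough estimate. The per-frame size varies enormously by format and content, so the 2 MB figure below is a loose placeholder; the right move is to export one frame from your own project and measure it:

     ```python
     def sequence_size_gb(fps, minutes, mb_per_frame):
         """Approximate disk usage of an image sequence (1 GB = 1024 MB)."""
         frames = fps * minutes * 60
         return frames * mb_per_frame / 1024

     # Hypothetical: a 90-minute project at 24 fps, ~2 MB per compressed 1080p frame
     print(f"{sequence_size_gb(24, 90, 2.0):.0f} GB")  # 253 GB
     ```

     So a feature-length export can easily eat a few hundred gigabytes, but a short effects-heavy sequence is usually no problem at all.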
  21. You have OIS and DIS off, right? Sometimes the DIS flips out on me, especially with ultrawide lenses. I've never had any issues when DIS is off. I'd be interested in seeing a clip that shows the problem if you have any.
  22. It might not be very easy to do this if you've cut the files before generating proxies. When I do this for my proxy workflow in Resolve, I'm relying on the fact that each proxy file has exactly the same filename and length as its corresponding online media. How would Premiere know that Proxy A corresponds to time 0:05 - 0:15 of file A? If you have unique timecodes for each file, and have properly carried those over into the proxies, you may be able to do it painlessly, since timecode would indicate which proxies correspond to which segments of the original. I think Resolve's conform abilities might be able to handle this; no clue about Premiere.

      In my experience, there won't be much degradation. It might be visible if you pixel peep, but it certainly will not be immediately noticeable to the average viewer. It may be worth the cost, especially since at this point the H.264 files are really holding up your ability to finish editing. So I wouldn't worry too much about it--it's a non-issue at best, and a necessary evil at worst. But I'd keep those original files around anyway, just in case you find that you need them.
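     The timecode idea could be sketched like this, assuming you can read each clip's start timecode and duration from its metadata. The filenames and times below are made up for illustration; real NLE conform logic is far more involved:

     ```python
     def find_source(proxy_start, originals):
         """Return the original clip whose timecode range contains proxy_start.

         originals: list of (filename, start_seconds, duration_seconds).
         Only works if clips have unique, non-overlapping timecodes.
         """
         for name, start, duration in originals:
             if start <= proxy_start < start + duration:
                 return name
         return None  # no original covers this timecode

     # Hypothetical clips: hour-1 and hour-2 start timecodes, in seconds
     originals = [("A001.mov", 3600, 120), ("A002.mov", 7200, 300)]
     print(find_source(3605, originals))  # A001.mov
     print(find_source(7450, originals))  # A002.mov
     ```

     This is also why free-run timecode (or at least distinct per-clip timecode) matters: if two clips share the same timecode range, the match is ambiguous and no conform tool can resolve it automatically.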
  23. Yes, it is big enough if you are close enough to be operating (i.e. not 10 feet away). I downsized from a 7" to a 5" because the 7" was so unwieldy on a DSLR rig. Much happier now.
  24. I have the Lilliput A5. For the price, it is excellent. I can use it in bright sunlight, it has peaking and false color, and it was cheap even brand new. It has mounting points on three sides, and runs off 9V power from the same brick that powers my camera. The color is NOT accurate. If you want to judge color on your external monitor, you will need to spend more money. But if you mainly want a monitor for flexibility and focusing, I highly recommend the A5. The main problem with it is that it only has one customizable button. I previously owned a 7" Feelworld and a 3.5" Feelworld EVF. Both were inferior to the Lilliput. The 7" was just too big and heavy; it really dwarfed my rig. The EVF was just bad in every way. My impression is that the new Feelworld monitors are better than the ones I had, probably on par with the A5 for the similarly priced ones. I have used some SmallHD monitors briefly. They are certainly better build quality. I can't say they would be worth 5x the cost to me, since I have never broken any of my cheap gear and I have no need of great color accuracy on set. Your needs may differ, of course.
  25. @fuzzynormal It was not working perfectly as of a month ago. It worked pretty well throughout editing and color, but seemed to lose optimized media when I started adding OFX plugins. Regenerating the media would work, but it would lose it again the next time I loaded the project.