Everything posted by KnightsFan

  1. Nah, I'm fairly sure the thread is about an Advertising Case Study. /spam
  2. I think it looks fantastic! @mercer I see what you mean but I think that's mainly due to the lighting and compression--though YT's compression was certainly much friendlier to this clip than the one with lots of moving foliage!
  3. I got the color shift in 15, but I did NOT get the color shift in 14 (though, as detailed before, this was on two separate computers, etc.) Perhaps the color shift only occurs on Windows 10? Can anyone offer any insight?
  4. Wait, hang on--are we seriously comparing footage uploaded to YouTube? Doesn't comparing compressed sRGB files sort of defeat the purpose of having higher data rates and dynamic range? I think we should wait for files straight out of the camera before making any real judgments.
  5. @noone That's fair, I agree once images are online there's no telling what is what. It's been processed by the user, and then compressed for web. Sometimes, comparing cameras is more a matter of "I can get the colors I want, but it'll take me twice as long in post," instead of "This camera simply CAN'T produce the image I want." And then once it's compressed, any subtle differences will be gone. Fine noise/grain preserved by shooting Raw? Flattened. 500 Mbps ProRes 4k? Squished into 40 Mbps. Deep rich colors? Oversaturated by an overzealous photoshopper, then smothered by an overzealous compression engine. I feel there is an important life lesson here but I'm too busy reading spec sheets to figure it out.
  6. It's only in playback. When I hit play, it's fine for about half a second and then suddenly it sort of looks like a very light fog machine is blowing haze across the frame--really bizarre. And then if you pause it snaps back to a normal clean image. I tried making a picture to show the effect, but it's barely noticeable without movement. I suppose it is probably just that the computer is hitting some sort of processing limit, and automatically reduces playback quality despite my settings. But it's an oddly specific effect that I've never seen before.
  7. Yes and no. "Color science" refers to the hardware and software that the manufacturer uses to define the output of their camera. For example, Arri's color science may include a dual gain architecture in order to make the highlights behave a certain way, whereas Sigma's Foveon color science involves a very specific type of sensor. The method of creating an image is certainly a science (specifically computer science) built out of objectively measurable variables. The subjective part is that one person may prefer the end result of a specific color science over another, so a preference for one color science over another is not itself a science. And my argument is that having a preference about the subjective part is a perfectly valid way to decide which camera to get, especially now that almost every modern camera has high technical quality.
  8. That's not entirely accurate. People have preferences on color science, and that is a valid reason to prefer one camera over another, even if there is no objective measure of which color/look is "better."
  9. Huh, that's odd. What operating system? On a related note, my NX1 files are ALSO glitching out in Premiere on this laptop; there's this weird wavy noise pattern on top of the image. Again, these same files work perfectly in both Premiere and Resolve on my desktop. So maybe Windows 10 is to blame? Or maybe it's a hardware issue related to GPU acceleration?
  10. It appears to be supported in Resolve 15 free, running on Windows 10. I know H.265 did not work in Resolve 14 free--in fact, H.265 support was the sole reason I bought Studio in the first place! Edit: For clarification... when I first read this topic, I assumed everyone was running 14 Studio. Yesterday when I tried 15 free, I noticed the issue, so I wondered if perhaps everyone here was using the 15 beta.
  11. @Allen Smith @SGPhotography Do you all have Resolve free or Studio? What version of Windows? I normally run Resolve Studio on my desktop, where I have no issues with native NX1 files. I've been using it from v14 up to beta 7 of v15. My desktop runs Windows 8.1. However, today I used Resolve free (v15) on a friend's laptop running Windows 10 and I saw the color bleeding. I wonder if the problem is exclusive to the free version? Or whether it's only on Windows 10?
  12. I have to say I'm skeptical like @webrunner5, since none of the test shots had much dynamic range to begin with. You've got to shoot scenes that have areas with no light and blown highlights in the same frame, and then see how much you can push or pull before losing detail. Though I would also like to point out that since everyone who measures dynamic range uses different criteria, the actual numbers are meaningless (i.e. Blackmagic claiming 13 stops and Sony claiming 14 doesn't give a valid point of comparison).
  13. Is it possible that Nikon doesn't have the processor tech to do 4k/60p with the necessary cooling in a mirrorless body? I wouldn't be surprised, since this is their first one. That's my feeling. It seems like an upgrade from the NX1, but not worth the upgrade cost for me. Maybe a good deal on a used Z6 in a year or two would make me bite. However, I have high hopes for the next generation of Nikon mirrorless. After all, the GH4 did 10 bit HDMI, and then the GH5 had it internally, AND had H.265 for some modes. If Nikon does a similar upgrade with 10 bit internal H.265, that would almost certainly be a top choice.
  14. So you are removing people from the background of a static shot? Can you take a single frame, paint out the people in Photoshop (or the photo editor of your choice), and then simply composite the edited area of the retouched image over the entire video? (There's a rough sketch of that kind of composite after this list.)
  15. I appreciate your ability to take criticism! The post is much stronger without straying from the primary topic of comparing internal codecs to each other. I think you make a solid argument, given your assumptions of no motion error and of Panasonic implementing their encoders perfectly. Both of those are big assumptions, but I can roll with them and follow your logic as it stands--and take the real world implications with a grain of salt because of those assumptions. Although, if I'm being honest, some of your language is biased, which costs you credibility. Saying "Generally there appears to be no benefit using the internal 422 10 Bit codec nor the 420 8 bit double frame rate due to the limitations of the GOP structure" is reasonable (again, given your assumptions), but the next part, "here Panasonic has created a few options that to be honest appear more a marketing effort than anything else," is unjustified, I believe. I know I'm editing without being asked to, which is probably annoying, but anyway, that's my feedback. Thanks for taking the time to look into this topic and write about it. I'm pretty sure this is all strictly technical haha. Fortunately, nothing anyone says actually changes what your footage looks like, so if you're happy with it then that's all that matters to you as a creative. As for the second part, I'd like to think I'm a creative person, but I do enjoy these technical discussions as well. You'll never hear me say I can't work on a project because technically ProRes is better than AVC-Intra, but I find it fascinating to learn about the differences.
  16. I think you misunderstand me. We all know from experience that the quality of interframe compression varies based on the amount of motion in the frame. You even showed that with the water fountain example. I think it is faulty logic to assume that you can come up with a generic file size for an I-frame out of a long GOP file. I'll try to explain myself more clearly. You wrote, "With an average video bit-rate of 94 Mbps each GOP has 45.3 Mbps". So far so good. We have 94 Mbits to describe a second of footage. Then, "which means an I Frame has around 13.1 Mbits or 1.57 MB per frame". Ok, that's true. I can see from your chart that the I-frame is 1648326 bytes, so that I-frame is indeed ~1.6 MB. Then, "and an equivalent All-Intra bit-rate of approximately 328 Mbps." This is where I disagree. You are apparently multiplying the 13.1 Mbits/frame by 25 frames/sec to get the "equivalent" data rate of 327.5 Mbps (the arithmetic is spelled out in a sketch after this list). You are assuming that the amount of data in a B-frame (247334 bytes, for example) retains the same fidelity for that frame as the full I-frame's data does for its frame. That is, you assume that 1648326 absolute bytes = 247334 delta bytes, where "equivalence" means that they retain the same amount of information relative to the original, real-world scene being encoded. And we know, based on your example of the water fountain, that interframe compression quality depends heavily on what is being shot. Therefore, using "equivalent" data rates as a measure of actual information retention is faulty unless you specify what type of content is encoded. I actually think you are right about ProRes retaining more information, but like I said, you don't have evidence for it. And beyond that, the GH5 could have such poor image processing that uncompressed HDMI has no benefit over the internal recording either, so even if ProRes is theoretically better, concluding that people need an external recorder to get 10 bit quality does not seem justified.
  17. Thank you for the clarification. I am fairly certain now that I've been following your argument correctly. I do understand interframe compression, by the way. However, you can't simply compare bitrates, even adjusted for motion artifacts. There are different algorithms being used. You need to show, with image analysis or an analysis of the algorithm itself, that your method of comparison is valid. Again, for your conclusion, you have not shown that ProRes is better. You are still just comparing the size of I-frames from codecs that use different algorithms. That's why I suggest doing an actual analysis on real images. Chroma subsampling applies to static images as well, and JPEGs do use it: "[Chroma subsampling] is used in many video encoding schemes – both analog and digital – and also in JPEG encoding." (https://en.wikipedia.org/wiki/Chroma_subsampling)
  18. You conclude with "If you want to produce genuine 10 bit colour high dynamic range footage you need to buy an external recorded capable of supporting ProRes 422 HQ or there is not game." That's why I'm suggesting an actual comparison with ProRes that uses real world examples. If you just want to compare between internal codecs, that's a misleading conclusion. Not all JPEGs are the same; you can actually specify the quality level and chroma subsampling of a JPEG when you encode it. Could you specify which type of JPEG you are referring to? Also--what does JPEG have to do with any of this? I'll admit I'm having a hard time understanding why you use it as an example. Is that a valid assumption to make, though? My understanding is that motion artifacts are one of the main reasons people use All-Intra codecs. To summarize my skepticism: it seems to me that you are judging codec quality purely in terms of adjusted bitrate. If I am misunderstanding, my apologies.
  19. You could be right. Do you have a source on that, though? Or a test showing that it's the case, one that hasn't gone through YouTube compression? In your post, you mainly seem to just compare the bitrates. I don't expect the tests to be exciting! However, it would give you more credibility if you could show that the numbers do translate to real world differences.
  20. Nice analysis. I'm glad you actually delve into the numbers and explain the different types of encoding! However, it would be better if you included files straight out of camera and ran analysis on them, both qualitative and quantitative. Since ProRes and AVC-Intra use different algorithms, simply comparing data rates is not an accurate representation of their fidelity. I suggest encoding an uncompressed video into ProRes, and also into AVC-Intra, and then running a script to compare the compressed images to the original and get an exact number on how much scene data is lost (a rough sketch of that kind of comparison is included after this list). Also, your title is meaningless unless you specify what the 10 bit internal is not good enough for. Is the 10 bit theoretically as good as ProRes? Maybe--let's run some tests and see. Is the GH5's specific implementation inferior to external recording to ProRes HQ? We need real world tests again. Are the GH5's 10 bit codecs better than any other photo/video hybrid's codecs? Almost certainly. Is it good enough for anything we used the GH3 and GH4 for? Of course!
  21. I believe E shutter is only available with native Samsung lenses. I read through a DPR thread about it, and some people claim E shutter can introduce rolling shutter artifacts. https://***URL removed***/forums/thread/3784289#forum-post-55098816 However, I'm not sure that's the case since only the first curtain is electronic in the NX1's implementation. Usually rolling shutter is due to readout speed, and since the second shutter is mechanical you wouldn't necessarily have any issues there. I may try to dig out my only native lens and do some tests now that you mention it.
  22. This is amazing! Really fantastic. I'm considering building an electronic follow focus for just spherical lenses. Do you mind sharing more about how you made it, like what servos you ended up using?
  23. You'll have trouble saving space AND reducing export times, unless you are willing to sacrifice a lot of quality. The reason MJPEG and ProRes files are huge is that they are lightly compressed, i.e. they take less processing power to decode. So here are a few options: 1. Transcoding to ProRes, DNxHD, or Cineform seems like a bad idea, since you won't save any space at comparable image quality. 2. You could save a ton of space with HEVC--100 Mbps would be plenty--but pay for it in increased exporting time. 3. H.264 might be your best option for balancing size, quality, and encoding time. Maybe do some tests at 200 Mbps and see how it goes. (A rough sketch of those two test encodes is included after this list.)
  24. @Allen Smith do you get the same jitter with other cameras shooting at 24p? If not, then what you're seeing is not frame conversion judder.
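
A few rough sketches to go with the more technical posts above. First, for the question about removing people from a static shot: a minimal sketch of the clean-plate composite idea in Python with OpenCV, assuming a retouched still and a hand-painted mask have already been made. The file names (input.mp4, clean_plate.png, mask.png, output.mp4) are hypothetical placeholders, and the plate and mask are assumed to match the video's resolution.

    # Composite a retouched still over a locked-off video wherever the mask is white.
    # Assumes: input.mp4 (hypothetical source clip), clean_plate.png (a frame retouched
    # in Photoshop), mask.png (white = use the clean plate, black = keep the video).
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("input.mp4")
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

    plate = cv2.imread("clean_plate.png")                 # retouched frame, same size as the video
    mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)   # hand-painted mask, same size
    mask3 = cv2.merge([mask, mask, mask]).astype(np.float32) / 255.0

    out = cv2.VideoWriter("output.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Blend: clean plate where the mask is white, original frame where it is black.
        comp = frame.astype(np.float32) * (1.0 - mask3) + plate.astype(np.float32) * mask3
        out.write(comp.astype(np.uint8))

    cap.release()
    out.release()

This only works for a truly static shot; any camera movement would need the plate to be tracked rather than overlaid in place.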
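
Next, the arithmetic disputed in the long-GOP post: this just spells out the multiplication being questioned, using the numbers quoted there (a 1,648,326-byte I-frame and 25 fps). The final step only gives a meaningful "equivalent" all-intra bit-rate if a delta-coded frame needs as many bits as a full I-frame to reach the same fidelity, which is exactly the assumption under dispute.

    # The disputed step: turning one I-frame's size into an "equivalent" all-intra bit-rate.
    i_frame_bytes = 1_648_326   # I-frame size taken from the quoted chart (~1.6 MB)
    fps = 25                    # frames per second

    i_frame_mbits = i_frame_bytes * 8 / 1_000_000
    print(f"One I-frame holds about {i_frame_mbits:.1f} Mbit")   # ~13.2 Mbit, the ~13.1 figure quoted above

    # Multiplying by the frame rate assumes EVERY frame would need an I-frame's worth
    # of data to hold the same fidelity -- that is the assumption being challenged.
    equivalent_all_intra_mbps = i_frame_mbits * fps
    print(f"'Equivalent' all-intra rate: {equivalent_all_intra_mbps:.0f} Mbps")  # ~330 Mbps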
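
Then, a sketch of the "run a script to compare the compressed images to the original" suggestion from the ProRes vs. AVC-Intra post, using PSNR as the metric. It assumes the original and each encode have already been decoded to matching PNG frame sequences; the folder names and frame count here are hypothetical, and SSIM or another perceptual metric would follow the same pattern.

    # Compare decoded frames from two encodes against the uncompressed original using
    # PSNR (higher = closer to the original). Frame sequences are assumed to exist at
    # original/0001.png ..., prores/0001.png ..., avcintra/0001.png ... (hypothetical paths).
    import numpy as np
    from PIL import Image

    def psnr(a: np.ndarray, b: np.ndarray) -> float:
        mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
        return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

    def load(path: str) -> np.ndarray:
        return np.asarray(Image.open(path).convert("RGB"))

    num_frames = 100  # hypothetical clip length
    for codec in ("prores", "avcintra"):
        scores = []
        for i in range(1, num_frames + 1):
            ref = load(f"original/{i:04d}.png")
            test = load(f"{codec}/{i:04d}.png")
            scores.append(psnr(ref, test))
        print(f"{codec}: mean PSNR {sum(scores) / len(scores):.2f} dB")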
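
Finally, a hedged sketch of the two lossy archiving options from the transcoding post, driving ffmpeg from Python via subprocess. The bit-rates are just the ballpark figures mentioned there, the file names are placeholders, and ffmpeg itself must be installed separately.

    # Rough transcode tests for the two lossy options discussed above:
    # HEVC around 100 Mbps and H.264 around 200 Mbps. Audio is copied untouched.
    import subprocess

    SOURCE = "source_mjpeg.mov"  # placeholder input file

    tests = [
        ("libx265", "100M", "test_hevc.mp4"),   # option 2: smallest files, slowest export
        ("libx264", "200M", "test_h264.mp4"),   # option 3: middle ground on size vs. speed
    ]

    for codec, bitrate, output in tests:
        subprocess.run(
            ["ffmpeg", "-y", "-i", SOURCE,
             "-c:v", codec, "-b:v", bitrate,
             "-c:a", "copy", output],
            check=True,
        )

Comparing the resulting file sizes and how the footage holds up in the grade should show whether the space savings are worth the extra export time.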