Everything posted by KnightsFan

  1. True! But at some point even the low-end specs are good enough, a point we hit a year or two ago, at least for me (and probably many others). Some people really need or want 4K 60p, but I'm happy with good-quality HD, so ergonomics and usability are a real deciding factor these days. I'm really looking forward to seeing what Canon does, and let's be honest, I'm just as excited to see what the A7S3 is like.
  2. I guess the popular dispute about whether Canon "couldn't" or "wouldn't" make something competitive might finally be put to rest. Yeah, I'm really curious about that part myself. It's striking that the 35mm prime has two control rings, almost like an EF zoom lens... would it be too wild for them to map a digital zoom to that control ring? With 30 megapixels you could turn a prime into a decent zoom, especially for video (a rough sketch of the math below). I'm probably just dreaming haha.
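Just for fun, a rough sketch of the crop-zoom math. The 6720-photosite width is an assumption for a ~30 MP sensor, and mapping it to the control ring is pure speculation on my part:

```python
# Rough estimate of how much lossless "digital zoom" a ~30 MP sensor could
# offer, assuming a 6720 x 4480 photosite grid (one plausible 30 MP layout)
# and a simple center crop that is then read out at the output resolution.

SENSOR_W = 6720  # assumed horizontal photosites for a ~30 MP sensor

for name, out_w in [("4K UHD", 3840), ("1080p", 1920)]:
    max_zoom = SENSOR_W / out_w  # crop factor before dropping below 1:1 sampling
    print(f"{name}: up to ~{max_zoom:.1f}x crop zoom without upscaling")

# 4K UHD: up to ~1.8x crop zoom without upscaling
# 1080p: up to ~3.5x crop zoom without upscaling
```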
  3. Ah well, it was a cool idea anyway. Though, this caught my eye: It would be ridiculously cool if they added an electronic declicked aperture ring for EF lenses, though I suspect they simply mean a control ring for some sort of variable ND filter.
  4. That's a really cool idea! I wonder how hard it would be to sell to the general public, though? I think people would balk at the idea of carefully inserting a ~20mm long lens rear element into their camera every time they change lenses. But it would certainly be innovative and would make Canon's mirrorless much more appealing for the vast swaths of people who own EF or EF-S glass. EF mount is so common that I adapted all of my Nikon lenses to Canon just to put them on an NX-EF adapter! It makes it so much easier to quickly use them on borrowed or rented cameras. On a more general note, while I'm excited to see what Canon has in store, I bet it will be another disappointment for me, as my main hope is for H.265. The NX1's codec quality is all I need, and I don't want to double or triple my hard drive consumption for marginal returns on that front. Once the GH5 implemented it, I hoped the floodgates would open for H.265 in mainstream cameras, but we may have to wait a little while longer. Canon would have to pull off something truly spectacular for me to buy into it without H.265.
  5. It was Studio and I was running Windows 8.1. (Afaik there was no native support for H.265 in Resolve 14 Free.)
  6. Nah, I'm fairly sure the thread is about an Advertising Case Study. /spam
  7. I think it looks fantastic! @mercer I see what you mean but I think that's mainly due to the lighting and compression--though YT's compression was certainly much friendlier to this clip than the one with lots of moving foliage!
  8. I got the color shift in 15, but I did NOT get the color shift in 14 (though, as detailed before, this was on two separate computers, etc.) Perhaps the color shift only occurs on Windows 10? Can anyone offer any insight?
  9. Wait, hang on--are we seriously comparing footage uploaded to YouTube? Doesn't comparing compressed sRGB files sort of defeat the purpose of having higher data rates and dynamic range? I think we should wait for files straight out of the camera before making any real judgments.
  10. @noone That's fair, I agree once images are online there's no telling what is what. It's been processed by the user, and then compressed for web. Sometimes, comparing cameras is more a matter of "I can get the colors I want, but it'll take me twice as long in post," instead of "This camera simply CAN'T produce the image I want." And then once it's compressed, any subtle differences will be gone. Fine noise/grain preserved by shooting Raw? Flattened. 500 mbps ProRes 4k? Squished into 40 mbps. Deep rich colors? Oversaturated by an overzealous photoshopper, then smothered by an overzealous compression engine. I feel there is an important life lesson here but I'm too busy reading spec sheets to figure it out.
  11. It's only in playback. When I hit play, it's fine for about half a second and then suddenly it sort of looks like a very light fog machine is blowing haze across the frame--really bizarre. And then if you pause it snaps back to a normal clean image. I tried making a picture to show the effect, but it's barely noticeable without movement. I suppose it is probably just that the computer is hitting some sort of processing limit, and automatically reduces playback quality despite my settings. But it's an oddly specific effect that I've never seen before.
  12. Yes and no. "Color science" refers to the hardware and software that the manufacturer uses to define the output of their camera. For example, Arri's color science may include a dual-gain architecture in order to make the highlights behave a certain way, whereas Sigma's Foveon color science involves a very specific type of sensor. The method of creating an image is certainly a science (specifically computer science) built out of objectively measurable variables. The subjective part is that one person may prefer the end result of a specific color science over another, and so a preference for one color science over another is not itself a science. And my argument is that having a preference about the subjective part is a perfectly valid way to decide which camera to get, especially now that almost every modern camera has high technical quality.
  13. That's not entirely accurate. People have preferences on color science, and that is a valid reason to prefer one camera over another, even if there is no objective measure of which color/look is "better."
  14. Huh, that's odd. What operating system? On a related note, my NX1 files are ALSO glitching out in Premiere on this laptop; there's this weird wavy noise pattern on top of the image. Again, these same files work perfectly in both Premiere and Resolve on my desktop. So maybe Windows 10 is to blame? Or maybe it's a hardware issue related to GPU acceleration?
  15. It appears to be supported in Resolve 15 Free, running on Windows 10. I know H.265 did not work in Resolve 14 Free--in fact, H.265 support was the sole reason I bought Studio in the first place! Edit: For clarification... when I first read this topic, I assumed everyone was running 14 Studio. Yesterday when I tried in 15 Free, I noticed the issue, so I wondered if perhaps everyone here was using the 15 beta.
  16. @Allen Smith @SGPhotography Do you all have Resolve Free or Studio? What version of Windows? I normally run Resolve Studio on my desktop, where I have no issues with native NX1 files. I've been using it with v14 and up to beta 7 of v15. My desktop runs Windows 8.1. However, today I used Resolve Free (v15) on a friend's laptop running Windows 10 and I saw the color bleeding. I wonder if the problem is exclusive to the free version? Or whether it's only with Windows 10?
  17. I have to say I'm skeptical, like @webrunner5, since none of the test shots had much dynamic range to begin with. You've got to shoot scenes with areas of no light and blown highlights in the same frame, and then see how much you can push or pull before losing detail. I would also point out that since everyone who measures dynamic range uses different criteria, the raw numbers aren't meaningful points of comparison (i.e. Blackmagic claiming 13 stops and Sony claiming 14 doesn't give a valid basis to compare them); the illustration below shows how much the criterion matters.
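A minimal illustration of why the criterion matters, with purely hypothetical numbers (the full-well and read-noise figures below are made up, not measurements of any camera mentioned here):

```python
import math

# Illustrative only: engineering dynamic range in stops is often quoted as
# log2(full-well capacity / noise floor), but the "noise floor" criterion
# (SNR = 1, SNR = 2, a stricter usable-image threshold, etc.) varies by
# whoever is doing the measuring.
full_well = 30000.0   # hypothetical full-well capacity in electrons
read_noise = 2.5      # hypothetical read noise in electrons

for label, floor in [("SNR = 1", read_noise),
                     ("SNR = 2", 2 * read_noise),
                     ("SNR = 10", 10 * read_noise)]:
    stops = math.log2(full_well / floor)
    print(f"{label} criterion: {stops:.1f} stops")

# The same hypothetical sensor comes out at ~13.6, ~12.6, or ~10.2 stops
# depending on the criterion, which is why cross-brand numbers aren't
# directly comparable.
```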
  18. Is it possible that Nikon doesn't have the processor tech to do 4k/60p with the necessary cooling in a mirrorless body? I wouldn't be surprised, since this is their first one. That's my feeling. It seems like an upgrade from the NX1, but not worth the upgrade cost for me. Maybe a good deal on a used Z6 in a year or two would make me bite. However, I have high hopes for the next generation of Nikon mirrorless. After all, the GH4 did 10 bit HDMI, and then the GH5 had it internally, AND had H.265 for some modes. If Nikon does a similar upgrade with 10 bit internal H.265, that would almost certainly be a top choice.
  19. So you are removing people from the background of a static shot? Could you take a single frame, paint out the people in Photoshop (or the photo editor of your choice), and then composite the edited region of that clean plate over the entire video? Something like the sketch below.
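A minimal sketch of that idea, assuming a locked-off shot, a painted clean plate, and a hand-drawn mask of the region to patch. The filenames are hypothetical and OpenCV is just one way to do the paste outside an NLE; you'd get the same result by stacking the painted still above the clip in a compositor and masking it.

```python
import cv2

# Clean-plate patch: paint the people out of one frame, then paste that
# patched region over every frame of the locked-off shot.
# The mask is white where the painted plate should replace the footage;
# the plate must match the video's resolution.
plate = cv2.imread("clean_plate.png")                        # frame with people painted out
mask = cv2.imread("patch_mask.png", cv2.IMREAD_GRAYSCALE) > 0

cap = cv2.VideoCapture("locked_off_shot.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("patched.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame[mask] = plate[mask]   # composite the painted region over each frame
    out.write(frame)

cap.release()
out.release()
```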
  20. I appreciate your ability to take criticism! The post is much stronger without straying from the primary topic of comparing internal codecs to each other. I think you make a solid argument, given your assumption of no motion error and Panasonic implementing their encoders perfectly. Both of those are big assumptions, but I can roll with them and follow your logic as it stands--and take the real world implications with a grain of salt because of those assumptions. Although, if I'm being honest, some of your language is biased, which gives you less credibility. Saying "Generally there appears to be no benefit using the internal 422 10 Bit codec nor the 420 8 bit double frame rate due to the limitations of the GOP structure" is reasonable (again, given your assumptions), but the next part "here Panasonic has created a few options that to be honest appear more a marketing effort than anything else." is unjustified, I believe. I know I'm editing without being asked to, which is probably annoying, but anyway that's my feedback. Thanks for taking the time to look into this topic and write about it. I'm pretty sure this is all strictly technical haha. Fortunately, nothing anyone says actually changes what your footage looks like, so if you're happy with it then that's all that matters to you as a creative. As for the second part, I'd like to think I'm a creative person, but I do enjoy these technical discussions as well. You'll never hear me say I can't work on a project because technically Prores is better than AVC Intra, but I find it fascinating to learn about the differences.
  21. I think you misunderstand me. We all know from experience that the quality of interframe compression varies based on the amount of motion in the frame. You even showed that with the water fountain example. I think it is faulty logic to assume you can come up with a generic file size for an I frame out of a long-GOP file. I'll try to explain myself more clearly: "With an average video bit-rate of 94 Mbps each GOP has 45.3 Mbps" So far so good. We have 94 Mbits to describe a second of footage. "which means an I Frame has around 13.1 Mbits or 1.57 MB per frame" OK, that's true. I can see from your chart that the I frame is 1648326 bytes, so that I frame is indeed ~1.6 MB. "and an equivalent All-Intra bit-rate of approximately 328 Mbps." This is where I disagree. You are apparently multiplying the 13.1 Mbits/frame by 25 frames/sec to get the "equivalent" data rate of 327.5 Mbps. That assumes the 247334 bytes of a B frame preserve the same fidelity for their frame as the I frame's full 1648326 bytes do for its frame. In other words, you assume that 1648326 absolute bytes = 247334 delta bytes, where "equivalence" means they retain the same amount of information relative to the original, real-world scene being encoded. And we know, based on your water fountain example, that interframe compression quality depends heavily on what is being shot. Therefore, using "equivalent" data rates as a measure of actual information retention is faulty unless you specify what kind of content is encoded (a quick arithmetic check of the numbers is below). I actually think you are right about ProRes retaining more information, but as I said, you don't have evidence for it. And beyond that, the GH5 could have such poor image processing that uncompressed HDMI has no benefit over the internal recording either, so even if ProRes is theoretically better, concluding that people need an external recorder to get 10 bit quality does not seem justified.
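To make the step I'm disputing concrete, here is a quick check of the arithmetic using the numbers from your chart (small differences from the quoted 13.1 Mbit / 1.57 MB figures come down to MB-vs-MiB rounding):

```python
# Reproducing the arithmetic under discussion, using the figures quoted above.
avg_bitrate_mbps = 94        # average stream bitrate (Mbps)
fps = 25
i_frame_bytes = 1_648_326    # sampled I frame size from the chart
b_frame_bytes = 247_334      # one sampled B frame size from the chart

i_frame_mbit = i_frame_bytes * 8 / 1e6
print(f"I frame: {i_frame_mbit:.1f} Mbit (~{i_frame_bytes / 1e6:.2f} MB)")
print(f"B frame: {b_frame_bytes * 8 / 1e6:.1f} Mbit (~{b_frame_bytes / 1e6:.2f} MB)")

# The disputed step: pricing every frame at the I frame's cost and calling
# the result an "equivalent All-Intra bitrate".
equivalent_all_intra = i_frame_mbit * fps
print(f"'Equivalent' All-Intra rate: ~{equivalent_all_intra:.0f} Mbps "
      f"(~{equivalent_all_intra / avg_bitrate_mbps:.1f}x the actual average bitrate)")

# The objection: a ~0.25 MB delta (B) frame and a ~1.6 MB absolute (I) frame
# are not guaranteed to carry the same picture information, so the ~330 Mbps
# figure is not a like-for-like comparison with a real All-Intra codec.
```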
  22. Thank you for the clarification. I am fairly certain now that I've been following your argument correctly. I do understand interframe compression, by the way. However, you can't simply compare bitrates, even adjusted for motion artifacts, because different algorithms are being used. You need to show, with image analysis or an analysis of the algorithms themselves, that your method of comparison is valid. Again, for your conclusion, you have not shown that ProRes is better; you are still just comparing the sizes of I frames from codecs that use different algorithms. That's why I suggest doing an actual analysis on real images. Also, chroma subsampling applies to still images as well, and JPEGs do use it: "[Chroma subsampling] is used in many video encoding schemes – both analog and digital – and also in JPEG encoding." (https://en.wikipedia.org/wiki/Chroma_subsampling) A quick illustration of what the subsampling ratios mean in raw data terms is below.
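For reference, a small illustration of what 4:4:4 / 4:2:2 / 4:2:0 mean in raw, uncompressed per-frame terms. This is only the sampling arithmetic for a hypothetical 8-bit UHD frame, not what any particular codec actually stores on disk:

```python
# How many samples each subsampling scheme keeps per frame, before any
# entropy coding. Not what H.264/H.265/ProRes write to disk; just the ratios.
width, height, bits = 3840, 2160, 8
luma = width * height  # one luma sample per pixel

schemes = {
    "4:4:4": 2 * luma,        # Cb and Cr at full resolution
    "4:2:2": 2 * luma // 2,   # Cb and Cr at half horizontal resolution
    "4:2:0": 2 * luma // 4,   # Cb and Cr at half resolution both ways
}

for name, chroma in schemes.items():
    megabytes = (luma + chroma) * bits / 8 / 1e6
    print(f"{name}: {megabytes:.1f} MB per uncompressed 8-bit UHD frame")

# 4:4:4: 24.9 MB, 4:2:2: 16.6 MB, 4:2:0: 12.4 MB -- the same ratios apply to
# JPEG stills, which commonly use 4:2:0 or 4:2:2.
```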
  23. You conclude with "If you want to produce genuine 10 bit colour high dynamic range footage you need to buy an external recorded capable of supporting ProRes 422 HQ or there is not game." That's why I'm suggesting an actual comparison with ProRes that uses real-world examples. If you just want to compare the internal codecs against each other, that's a misleading conclusion. Not all JPEGs are the same; you can actually specify the quality level and chroma subsampling of a JPEG when you encode it. Could you specify which type of JPEG you are referring to? Also--what does JPEG have to do with any of this? I'll admit I'm having a hard time understanding why you use it as an example. Is that a valid assumption to make, though? My understanding is that motion artifacts are one of the main reasons people use All-Intra codecs. To summarize my skepticism: it seems to me that you are judging codec quality purely in terms of adjusted bitrate. If I am misunderstanding, my apologies.
  24. You could be right. Do you have a source on that, though? Or a test showing that it is the case that hasn't gone through YouTube compression? In your post, you mainly seem to just compare the bitrates. I don't expect them to be exciting! However, it would give you more credibility if you could show that the numbers do translate to real world differences.