Everything posted by KnightsFan

  1. If you see it in YouTubers' shots but not in Hollywood shots with similar amounts of motion, then it's user error and not a problem of 24 fps per se. Certainly any time you put footage on a timeline with a mismatched frame rate you'll have problems unless you know what you're doing. I've shot a lot of projects at 24.00 fps, but they were shown in theaters on native projectors (nothing big, just screenings for friends, small festivals). On a proper projector it's pretty easy to see the difference between 24 and 30 fps, with the former feeling more like a classic movie to most people. So at this point, under proper conditions and with proper technique, there is certainly an argument to be made for 24 if you are trying to conjure a traditional movie impression. Once we move to 120 and 240 Hz monitors, though, the technical limitation will be gone, and then any problems you see will purely be user error with mismatched settings, or bad technique. Then it will just be a question of the impression you want to give: 24 vs 30 vs 60, or even 120. With 240 Hz you could even add 48 to that list of native formats. I do think we should move away from the fractional rates though... 29.97 instead of 30 is just annoying to think about. Probably because it's a lot easier to stare at a still image and painstakingly grade it than to let something play and evaluate the overall effect.
  2. I'm curious @herein2020 if what you are seeing is judder (from playing 24 fps on a 30 or 60 Hz screen) or something else. I've never been able to verify that I can see judder personally. Once 120 Hz monitors are the norm, of course judder won't be an issue. I watched that drone video you posted, and at the 28s mark there is definitely something wrong. It's jumping all over the place without that much motion, so I would guess at operator error somewhere. Certainly dropping a 60 fps clip on a 24 fps timeline will cause noticeable problems without good frame blending.
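The judder mentioned above comes down to arithmetic: a display can only hold each frame for a whole number of refreshes. Here's a minimal sketch (not tied to any particular player or display) showing why 24 fps divides unevenly into 60 Hz but cleanly into 120 Hz:

```python
# Toy model of frame cadence: 60 / 24 = 2.5 is not a whole number, so a naive
# player must alternate between holding a frame for 2 and 3 refreshes (the
# classic 3:2 pulldown cadence), which is what reads as judder in pans.

def pulldown_pattern(fps, refresh_hz, frames=8):
    """Refreshes allotted to each of the first `frames` source frames."""
    shown = []
    for i in range(frames):
        start = round(i * refresh_hz / fps)        # first refresh showing frame i
        end = round((i + 1) * refresh_hz / fps)    # first refresh showing frame i+1
        shown.append(end - start)
    return shown

print(pulldown_pattern(24, 60))   # uneven mix of 2s and 3s -> judder
print(pulldown_pattern(30, 60))   # every frame held 2 refreshes -> even motion
print(pulldown_pattern(24, 120))  # every frame held 5 refreshes -> even motion
```

On a 120 Hz panel every common rate (24, 30, 60, and even 40) divides evenly, which is why the problem disappears there.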
  3. I did do a quick test. My process was to film a white wall as a 4K Raw clip, which I processed into a 4:2:2 10 bit uncompressed file. I then used ffmpeg to process the uncompressed video into two different clips, with the only difference being the bit depth. I used 4:2:0 chroma subsampling and CRF 16 on both. The two files ended up roughly the same size (the 8 bit is 3,515 KB, the 10 bit is 3,127 KB). I applied a fairly extreme amount of gain and white balance adjustment equally to all clips. I've included a 100% crop of the uncompressed, 10 bit, and 8 bit files. As you can see, the 8 bit has nasty banding that is not present in the 10 bit version. This is of course an extreme example to show a relatively small difference, but it also gets perceptually worse in motion than in still frames. Also note that the PNG files themselves are 8 bit (which would match a typical delivery); the banding you see is from the color grading, as all 3 versions have been quantized down to 8 bit upon rendering. Moreover, the 10 bit is actually a 10% smaller file. I find 10 bit HEVC is consistently a smaller file than 8 bit, for better quality. The real benefit of more accurate sampling is that it allows more accurate processing throughout, from compression to coloring. On a related note, both HEVC clips have lost all the grain and detail compared to the uncompressed version, which is very unfortunate. However, they are 1% of the file size, so I can't complain too much! Edit: just look at the file names to see which pic is which.
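For anyone who wants to see the banding mechanism without shooting a wall, here is a toy sketch of the underlying quantization math. It models capture bit depth, a heavy gain in the grade, and 8 bit delivery; it is not the ffmpeg pipeline from the test above, and the function names are mine:

```python
# Simulate a dark, smooth gradient captured at a given bit depth, push it hard
# in the grade, and deliver at 8 bits. Fewer distinct delivered codes across
# the gradient means coarser steps, i.e. visible banding.

def quantize(x, bits):
    """Round a 0..1 value to the nearest code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(x * levels) / levels

def delivered_levels(capture_bits, gain=8.0, samples=4096):
    """Count distinct 8-bit output codes for a dark 0..1/gain gradient."""
    codes = set()
    for i in range(samples):
        x = i / (samples - 1) / gain      # dark, smooth source gradient
        x = quantize(x, capture_bits)     # quantization at capture
        x = min(x * gain, 1.0)            # extreme gain applied in the grade
        codes.add(round(x * 255))         # 8-bit delivery
    return len(codes)

print(delivered_levels(8))    # few distinct codes -> banding
print(delivered_levels(10))   # roughly 4x more codes -> smoother gradient
```

The 8 bit capture ends up with far fewer distinct output levels after the same grade, which is exactly the stair-stepping in the crops.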
  4. A decent test would be to shoot a scene in as high quality as you can, like uncompressed raw, then export a 10 bit and an 8 bit version with roughly matching codec and size from that Raw master, and compare those results. If you really want to isolate 10 vs 8 bit, export uncompressed videos at those bit depths. You will most likely see the biggest difference in scenes with smooth gradients in the shadows, particularly with big color grades such as an incorrect white balance, or underexposed scenes. Maybe I'll do some tests later today.
  5. My camera owning plan for 2021 is an RTX 3080... It's hard to justify live action movies with Covid, so I've been doing animation instead. Hopefully my ticket comes up in EVGA's waitlist soon.
  6. I'm glad you're bringing attention to this, @Andrew Reid, as everyone should at least be aware of who is tracking them and why. I have two Firefox addons, one is uBlock Origin for blocking ads, and the other is Blur for blocking trackers. I can confirm that the EOSHD main site has those two Google trackers (which Blur blocks), and the forum has 0. Blur blocks 9 trackers on SonyAlphaRumors even after opting out of cookies, and 67 ads are blocked. It's worth pointing out that even if you are okay with being tracked, the richest companies and people in the world make their money off analysis of your data. It's worth considering whether you want to freely donate your data (which YOU pay for with electricity and internet bills!) to the wealthiest people on earth.
  7. @Andrew Calvetti I think it's a special order camera that you can buy directly from Z Cam or maybe some smaller dealers. I don't know if the original E2g ever got listed on B&H or any of the main retailers either. I'm not 100% sure, so the best place to ask is the Z Cam Facebook group. Soltys had a "buy" link before but it's gone now.
  8. Oh yeah, I was mainly replying to the post above me
  9. It's continuous. Hold the shutter and it goes until you run out of card space--according to specs. I only ever used it for short bursts.
  10. @fuzzynormal That's a much better idea, shooting photos. Lots of cameras can get nice, high fps. The NX1 can get to 15; I never thought to use it that way.
  11. @kye isn't talking about undercranking though, right? He's talking about slowing down 24 fps. Speaking of Wong Kar-wai, didn't he use the effect kye is talking about in Chungking Express? It's a nice effect for some scenes. I feel like it's used more for "impact" in action scenes as opposed to the "cool factor" or "rhythm" of normal slow motion, if that makes any sense. Long ago on my GH3 I did that trick with the shutter speed, but a lot of digital cameras don't let you lower the shutter below the frame rate. The Z Cam E2 can shoot any integer frame rate from 1 up to the max, and as a bonus the shutter angle setting always behaves correctly. I did some "retro" tests with 16mm crop mode and a few C mount 16mm lenses, in 15 fps.
  12. The dongle must remain plugged in whenever the software is in use.
  13. If you have the license key, you can have at most 2 activations at a time. If you activate a 3rd computer, it will automatically deactivate on the other two computers. On an activated computer, if you go too long without it checking in with the server, it will not open. So for the license key, you need to have an internet connection at least periodically. With the license, if you have the key available then you can activate on any internet-connected computer at any time, no need to remember to bring a dongle if you are editing remotely. The dongle is perhaps slightly easier if you always have it on you, and it's necessary if you go off grid for a project and don't have internet, but it's also easier to lose, so there are tradeoffs in the edge cases either way. It is true that once you buy a dongle or license, you get access to all future upgrades for free. As far as crashing, yeah, Resolve crashes from time to time. I've never had it crash every 5 minutes, more like once a day, and usually only when I'm working with the Fusion tab. Pretty much all complex software crashes sometimes. In my experience, Resolve is much more stable than Premiere. I do think that v16 was worse than v15 for crashing. I'm using the v17 beta now and it has been very stable so far, so I'm hopeful that they are improving stability.
  14. Right, jam syncing. $200 to "plug in and forget" would be sort of reasonable to me, but $200 to also need to jam sync and remember to do that every time you shut down, and/or make sure devices are in free run, etc. leaves a lot of room for human error. Especially since it's such a simple problem to solve. So yeah I consider them to be way overpriced for the function they perform.
  15. Maybe I look at the wrong times, but I've looked every now and then and I have yet to see any on eBay below $200 in the US. You can sync without TC, it just takes more effort, as the OP and many of us did for years, especially since the major NLEs can sync from waveform. It's a lot of money for something that makes a small improvement to smaller projects. You are talking about jam syncing in that case, right? You can't use a single Tentacle to send constant TC between two devices--can you?
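For what it's worth, the waveform sync the NLEs do can be sketched as a cross-correlation search. This toy version (plain lists instead of real audio files; the names are mine) slides one clip against the reference and picks the best-matching offset:

```python
# Minimal waveform-sync sketch: try every candidate shift within a window and
# keep the one where the two signals overlap with the highest dot product.
# Real NLEs do this on downsampled audio envelopes, far more efficiently.

def best_offset(reference, clip, max_shift=100):
    """Return the shift of `clip` (in samples) that best matches `reference`."""
    best, best_score = 0, float("-inf")
    for shift in range(-max_shift, max_shift + 1):
        score = 0.0
        for i, s in enumerate(clip):
            j = i + shift
            if 0 <= j < len(reference):
                score += s * reference[j]   # correlation at this shift
        if score > best_score:
            best, best_score = shift, score
    return best

# A toy impulse "clap" that the second device recorded 30 samples earlier:
ref = [0.0] * 200
ref[50] = 1.0
clip = [0.0] * 200
clip[20] = 1.0
print(best_offset(ref, clip))  # -> 30
```

This is why a sharp transient like a clap syncs so reliably: it makes the correlation peak unambiguous.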
  16. @josdr I had a DR-60D II before, so I believe that. I can live without 32 bit, but it's at the top of my "nice to haves," particularly when I'm recording effects and already holding a mic and performing sounds all at once. I found my AKG CK93 for less than $200, just shy of a new Tentacle Sync E, and that's my primary mic. My first mic, the Rode NTG1, was about $130 if I recall. Both of these were, in my opinion, better investments than a timecode system, especially since you need a minimum of two Tentacles.
  17. Yes, exactly. It's very annoying, since it is such a small feature that is almost always missing from photo cameras, even ones marketed as hybrids. The GH5S is the only affordable hybrid I can think of with any real TC support. And like you say, dropping a few hundred on Tentacles is such a money drain. A nice used mic or lens costs less than a timecode system, which still only gets you audio LTC that you have to wrestle with in post! I'll never primarily use a camera that doesn't have an actual timecode input ever again, it's such a pain. Depending on how DIY you are, you could build something yourself. I ended up making my own system with Arduinos synced over Bluetooth to a cell phone app I made. It was cheap and works as well as a Tentacle or other gizmo, but I probably put a hundred hours into making it, so it was costly in that regard. I've been considering switching to a MixPre II myself. 32 bit would be useful when I'm recording by myself.
  18. You're welcome! You can send a mix from the MixPre to one channel on the XT3, and LTC to the other. Or potentially send a mix from the MixPre to the XT3 and HDMI TC the other direction. I wouldn't say the wrong way. It's better than nothing, but it is a workaround compared to pro cameras that have proper timecode functionality. I would explore the HDMI side and see if that gets you where you need to be, and otherwise take some time to work out the workflow sending TC via audio. Once you have it down, it's not difficult, but there's so little information about it that it is indeed difficult to figure out how to make it work. Most people seem to be either pros with cameras that have proper TC, or people who don't use it at all.
  19. Even if the MixPre can do this, it's not advisable. The purpose of timecode is getting that time stamp into metadata, and LTC on an audio track is just a workaround to do it. The only reason you'd want LTC in an audio track is if you A) can't put it in metadata, like on DSLRs, or B) need to stretch and warp the audio to compensate for drift during the takes, which I haven't ever personally seen anyone do, though it is technically possible. Wireless LTC is the simplest way in my opinion, but make sure you have scratch audio. On my last big project with an XT3, I used a 3.5mm splitter, ran LTC into one audio channel, and used a $6 mic for scratch audio in the other. I used wireless lav transmitters to carry LTC from my Zoom F4 to the XT3. I am not sure if they are compatible, but it seems worth a test to me. See the answer below: jamming the timecode via HDMI might be a good combination of cheap/easy if the drift is acceptable for your work. If you are writing LTC into an audio track, then yes, it needs to be constantly connected. If you are jamming the internal clock to an external source, then typically no; once you disconnect the timecode source, the internal clock takes over from that same place. That means it is subject to drift once they are disconnected, and you would have to be sure that both are in free run. Definitely do some tests with the devices you own to see how much drift you're dealing with, whether the clock stops when the device is turned off, or other model-specific "gotchas." I'm not sure, I use Resolve exclusively. However, it is possible to use 3rd party software to update the file's timecode metadata, which is what I do. I don't know of any ready-made solutions, as I wrote the one that I use, but it's a simple ffmpeg command batched across all my files. In my opinion this is a better solution than doing it in your NLE, since the metadata will be ready for use in any software.
  20. Netflix's requirements are not only about having a minimum fidelity to the image, or even making it strictly easier in post, but about having a standard workflow. They need to be able to send any one of their productions to any one of their post houses and have it be immediately workable, and if that team gets shifted to a different project midway through, they can send that half-done project to another post house and have them pick it up immediately, no questions. They've decided that TC is a mandatory element of their workflow, from capture through post. To be honest, if I were Netflix, I would specify a lot more of the technical and metadata requirements for my productions, just to ensure that every piece is compatible across their entire billion dollar post workflow. On a much smaller scale we do the same thing at my work (outside the film industry). If the designer for one project is out sick and the client needs a change, other people on the team have to be able to open it up and immediately know how to make the change. We adopt standards, some of them because it's the easiest way, some picked randomly just to make sure we're all on the same page.
  21. The Alexa SXT Studio had a rotating mirror shutter, just like a film camera. Technologically, this could be a feature on any digital camera, if there was demand for it. Rolling shutter isn't a sign that digital isn't caught up, per se, it's a sign that in most cases the benefit of film-like shutter artifacts are not worth the added cost.
  22. @Xavier Plagaro Mussard TC is for syncing with audio
  23. A few of the highest grossing films of their years: Lawrence of Arabia, 2001: A Space Odyssey, The Godfather, Star Wars, Terminator 2, Lord of the Rings 2 and 3, The Dark Knight. Obviously some lousy films make a lot of money, but by and large, prior to ~2005 I'd say that many of the highest grossing films are pretty good or better. I can't say the same for the movies, and particularly the shows, that rise to the top on Netflix. Every single time I have picked a movie from the top of the Netflix list, I wish I had done something else instead.
  24. I don't know if our thoughts diverge that much. I agree with what you said. I'm saying that streaming is supporting a cultural shift away from more thoughtful movies, towards lower quality (imo) content. Like I said, it's not that streaming is inherently making movies worse, but that the audience is less invested in their content since it's on in the background, or literally stays on after they fall asleep. Quantity over quality. Of course some people still critically watch their streamed content (myself included), but we're in the vast minority. (The other issue I brought up, of concentrating wealth in fewer people is a separate problem.)