KnightsFan

Members
  • Content Count

    696
  • Joined

  • Last visited

  • Days Won

    2

KnightsFan last won the day on December 8 2018

KnightsFan had the most liked content!

1 Follower

About KnightsFan

  • Rank
    Frequent member

Contact Methods

  • Website URL
    https://gobuildstuff.com/

  1. Thanks, but I sort of went on a DIY adventure and never got to the point of having a complete camera. But maybe I'll post a late "entry" in the future once it's all duct taped together.
  2. I got too busy to enter (but just managed to view all the entries in time!), so I hope there is another.
  3. A few months ago they mentioned working on ProRes RAW via HDMI. They are also working on import plugins for ZRAW in some NLEs. They have repeatedly said BRAW and ProRes RAW are impossible; I don't know if that's a hardware or a licensing limitation. ZRAW is a "partial debayer," and I suspect they can't do a compressed full-Bayer format because of patents. If BRAW and ProRes RAW are impossible, then the only options are: A) uncompressed, which would limit frame rates because of write speeds, or B) a new format. Z Cam always seems to make incremental updates and is cautious about overpromising, so I bet that within a few months the situation will be better for raw output from their cams, but until then, yeah, it isn't a great workflow.
  4. That is so true. Even on paper, you have to look at combinations and not individual pieces. Leica uses a very thin filter stack on their sensor, whereas the standard MFT filter stack is very thick. So a lens designed for maximum sharpness on a Leica digital camera will not be as sharp on a MFT sensor. Canon is somewhere in the middle. (LensRentals has a lot of nice articles about it https://wordpress.lensrentals.com/blog/2014/06/sensor-stack-thickness-when-does-it-matter/)
  5. Don't get me wrong, I love 4K. I always shoot 4K and religiously maintain fidelity throughout my post pipeline. I'm the guy who mixes in 5.1 despite the fact that virtually no one watches my content with five speakers. I am just saying that, given finite money, I don't think it is worth doubling the cost to get 6K over 4K, and I don't see any other tangible advantages of the P4K at the moment.
  6. KnightsFan

    Davinci Resolve 16

    Version 16.0 is out of beta, final release. Version 16.1 is now in public beta, with new features above and beyond what 16.0 brought (and probably bugs, since it is a beta).
  7. I understand why higher resolutions are better, and where to see them, but here's my experience: I know 0 people who own a 4K computer screen. I own a 4K TV but never use it in 4K because I have 0 4K movies, my Blu-ray player only supports 1080p, my internet won't handle a 4K stream, and for gaming it maxes out at 4K30, which is too low a frame rate. 100% of the Apple desktop computers that I have seen were at universities or studios--Mac as an operating system has <4% market share, and a lot of those are laptops. The actual number of 4K+ iMacs out there in use is minuscule. Very few of my friends actually own desktop computers, for that matter; they just use laptops and phones with HD screens. They would benefit from the bitrate increase of streaming 4K on their devices, but for vast swathes of rural America, the internet bandwidth simply doesn't support 4K. I get that a lot of pros need 4K or 6K to stay ahead of the competition. When making content, you make it high enough resolution that the top-tier viewers are satisfied, even if only 10% see it in 4K. But the reality for me is, 4K is not a display format yet. My work has only ever been shown in local theaters, festivals, and bars, and none of those venues had 4K display capabilities.
  8. KnightsFan

    Davinci Resolve 16

    @Shell64 also, they sell hardware: control consoles, keyboards, and playback cards. The consoles and keyboards are optional, but the card is required to get a 10-bit or HDR signal for monitoring, so it's pretty essential for a serious coloring setup.
  9. I have heard good things, yeah. The OG Pocket and 2.5K were also a joy to use from an interface perspective. Blackmagic's software is really their strong point, both in camera and for post. The menus on modern cameras are absolute nightmares, with a billion options that have little effect on the image, and no effect on the story.
  10. Wow, this thread spiraled quickly. Usually it takes at least 7 pages for the politics to hit full stride. I can't.

    It's still a while before 4K is truly mainstream. I know very few people with 4K screens, and I know no one who regularly watches 4K content. I know in select communities 4K is old news, but from what I see it's still a minority. You can make the argument that 6K is "future proof," but by the time I want to shoot in 6K, we'll have 3 more generations of cameras to choose from. From my perspective the P4K has no downsides and costs half as much. Unless, of course, the real-world reviews and footage are vastly different from my expectations--if it turns out the P6K inherits the minimal rolling shutter from the 4.6K G2, then that's a good place to start making a case--but it seems unlikely.

    What I dislike about the new Pocket series in general is that it does not seem very easy to rig up. The battery life seems poor, and an external battery would block the screen. And I don't like DSLR-style grips, which either get in the way of a follow focus or make it hard to balance on a gimbal. I'm still hoping for an update to the Micro, but it is more likely that I'll have to turn to Z Cam when/if I upgrade from the trusty NX1.

    What I love about both the P4K and P6K is that they really set the bar low for price by stripping away complex systems that I don't need for video, such as AF, IBIS, and EVF, while adding a lot of simple things that are crucial, like full-size HDMI and timecode.
  11. I create proxies with ffmpeg. I wrote a simple script to batch files. With hardware encoding and decoding, converting 10-bit HEVC to 1 Mbps H.264 proxies takes just a few minutes per hour of footage. When editing, I manually swap proxies and online media in Resolve using the "Reconform from Bins" option. Slow-motion footage usually doesn't reconform properly, so I leave conform lock enabled for slow-motion clips.
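A batch script like the one described might look something like this. This is a minimal sketch, not the author's actual script: it assumes ffmpeg is on PATH, uses `h264_nvenc` as one common hardware encoder (swap in `libx264` for software encoding), and the directory names are placeholders.

```python
# Sketch of a batch proxy-encoding script: transcode every clip in a
# source folder to a low-bitrate H.264 proxy using ffmpeg.
import subprocess
from pathlib import Path

def proxy_command(src: Path, dst: Path, bitrate: str = "1M") -> list:
    """Build the ffmpeg command for one clip."""
    return [
        "ffmpeg", "-y",
        "-hwaccel", "auto",        # let ffmpeg pick a hardware decoder
        "-i", str(src),
        "-c:v", "h264_nvenc",      # hardware H.264 encode (NVIDIA); libx264 as fallback
        "-b:v", bitrate,           # low bitrate keeps proxy files tiny
        "-c:a", "aac", "-b:a", "128k",
        str(dst),
    ]

def batch(src_dir: str, proxy_dir: str) -> None:
    """Transcode every .mp4 in src_dir into proxy_dir."""
    out = Path(proxy_dir)
    out.mkdir(parents=True, exist_ok=True)
    for clip in sorted(Path(src_dir).glob("*.mp4")):
        subprocess.run(proxy_command(clip, out / clip.name), check=True)
```

Keeping the proxy filenames identical to the originals is what lets Resolve's reconform option match them back up.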
  12. KnightsFan

    New use of 8K

    Really cool technology! Actual projections always have much better integration with lights and reflections than green screens. One of my favorite lighting scenes is the tower in Oblivion, where all of the lighting was "ambient" projected off of a 360 degree projector screen. Though of course that wasn't motion tracked or 3D. As awesome as this technology is, though, from an enjoyment standpoint, I prefer to shoot on location. It's just so much more fun than being stuck in a studio.
  13. @mnewxcv Sensors usually have a number of different readout modes with different resolutions, noise levels, and rolling shutter values. Samsung might be using a different readout method for HD on the NX1 and the NX500. 120fps requires a readout under 8.333ms, and Samsung might have just made all HD frame rates on the NX1 use that same 7.9ms mode, for whatever reason. The NX500 can't do 120, so there's no reason to use that readout mode at all. But, to be honest, I am not sure whether the DVXuser test measured the NX1 HD in all frame rates. We'd have to dig through the thread to find out. It's possible that the NX1 has 7.9ms in 1080p120, and 20ms in 1080p30.
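The 8.333 ms figure is just the frame period at 120 fps: the sensor must finish reading out before the next frame starts. The arithmetic, as a quick sanity check:

```python
# Longest readout time that still fits inside one frame at a given rate.
def max_readout_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(max_readout_ms(120), 3))  # 8.333 -> 120 fps needs a sub-8.333 ms readout mode
print(round(max_readout_ms(30), 3))   # 33.333 -> 30 fps could tolerate a much slower mode
```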
  14. I haven't seen anything specifically about rolling shutter on the NX500. So take this with a grain of salt... The NX500 4K is a 1:1 crop, unlike the NX1's full pixel readout. Rolling shutter is likely proportional to the number of lines read, since it's the same sensor. That gives an estimate of ~19.3ms for UHD (the NX1 reads 3650 vertical lines in 32.6ms, the NX500 reads 2160 vertical lines, and 2160/3650 * 32.6 = 19.3).
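The estimate above can be written out directly. This assumes, as the post does, that per-line readout time is unchanged between the two readout modes:

```python
# Proportional rolling-shutter estimate for a cropped readout on the
# same sensor: scale the measured RS by the fraction of lines read.
NX1_LINES = 3650   # vertical lines in the NX1's full readout
NX1_RS_MS = 32.6   # measured NX1 rolling shutter, ms

def estimated_rs_ms(lines: int) -> float:
    return lines / NX1_LINES * NX1_RS_MS

print(round(estimated_rs_ms(2160), 1))  # ~19.3 ms for the NX500's 2160-line UHD crop
```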
  15. The HDMI spec has no control over whether the data it carries will be perceived as a color image by human viewers. You put HDMI-formatted data in one end, and get HDMI-formatted data out the other. So you're right, you can take 3 UHD 12-bit raw frames and package them as the Y, U, V channels of a single 4:4:4 color frame. It wouldn't look right if it were plugged into a TV, but as long as the Atomos end knows what to expect, it simply interprets each HDMI frame as 3 raw frames and processes them accordingly. (Apertus has been doing just that for years to get higher frame rates out of their cameras.)
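The packing idea itself is simple enough to sketch. This is a toy illustration of the concept only, not Atomos's or Apertus's actual protocol: three same-sized raw frames ride in the Y, U, and V planes of one 4:4:4 frame, and a receiver that knows the convention pulls them back apart losslessly.

```python
# Toy illustration: carry three raw frames in the planes of one 4:4:4 frame.
def pack_as_yuv444(frame_a: bytes, frame_b: bytes, frame_c: bytes) -> dict:
    """Sender side: stuff three raw frames into the Y, U, V planes."""
    assert len(frame_a) == len(frame_b) == len(frame_c)
    return {"Y": frame_a, "U": frame_b, "V": frame_c}

def unpack_raw_frames(yuv: dict) -> tuple:
    """Receiver side: a device that knows the convention recovers the frames."""
    return yuv["Y"], yuv["U"], yuv["V"]

raws = (bytes([1, 2]), bytes([3, 4]), bytes([5, 6]))
assert unpack_raw_frames(pack_as_yuv444(*raws)) == raws  # lossless round trip
```

A TV interpreting the same signal as color video would show garbage, which is exactly the point: the link just moves bits, and meaning is agreed between sender and receiver.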