
KnightsFan

Everything posted by KnightsFan

  1. I wonder if Panasonic will use a <1mm sensor stack like Leica does? If so, some lenses designed for film might have slightly better performance here than on the new Sony/Canon/Nikon.
  2. Pixel shift would be amazing! I occasionally need 5000x5000 square photos, which I use to make seamlessly tiling 4K textures. If the lower-resolution model can get that 5000-line height via pixel shift, it would be the only system so far that solves my occasional need for high-resolution stills while providing good video.
  3. Glad to see that there is, in fact, a low resolution version. Hopefully it's priced competitively with the Z6/A73.
  4. Stabilizing in-camera allows GoPro to use hardware sensors (accelerometer, gyroscope) in combination with image analysis to determine which movements and rotations to compensate for. Doing it in post can only analyze the images, which, as @Yurolov pointed out, are already compressed. (There's a rough sketch of this sensor-fusion idea after the list.)
  5. That was my first thought, except I don't even need FF. I just want to be able to adapt my existing lenses to it.
  6. Because I usually clean tracks in Audition or Sound Forge and render those out to new files. For example, I might take a boom file and render out one version with noise reduction for dialog, and then another that gets some of the effects sounding right, etc. I can't do that cleaning stage before picture lock, and I don't want to mess around with replacing audio files in Resolve just to send updated XMLs over to Reaper every time I need to add a new file. It would be way easier to just drag that new file into Reaper and have it auto-align.
  7. @buggz I tried Fairlight when it was first included, and found it very unstable. I didn't have any actual crashes, but audio would regularly cut off or get something like a 24 dB attenuation in one or more channels, with absolutely no way to get it back to normal other than moving the contents of that track to a new track and then deleting the old one. This would happen in projects where I didn't touch Fairlight or any audio controls at all. I've been using Reaper for a while and really like it as well, so I have no plans to switch--yet! I love the level of customization you can apply to everything, from layout to hotkeys, etc. I also find it to have a very intuitive layout and menu structure. It's usually pretty easy for me to do things I've never done before because it just works the way I expect. However, I haven't been able to find much information about timecode in Reaper, other than syncing for live performances.
  8. Thanks for the explanation! I had a suspicion syncing was usually done before editing. However, on my projects, I AM sound post, which gives me a lot of flexibility in making my own workflow. My ideal workflow would be to get picture lock without ever importing audio files into Resolve, and then send my project to Reaper. It would be nice to simply import an audio file into Reaper, right click on an audio clip, and have some sort of "align to timecode" button. Afaik this feature doesn't exist, BUT Reaper has some extensive scripting capabilities, so I will look into creating this feature myself. I imagine I'd have to loop through all media items from my picture edit, read the timecode and duration, and then use that info to tell whether or not the audio in question should be aligned to that media item (there's a rough sketch of this logic after the list). The benefits of this would be: less synchronization required between Resolve and Reaper, no importing of audio into Resolve at all, and audio sync issues getting solved in Reaper instead of Resolve.
  9. So do you have to sync by appending audio to your video clips before editing? Is there a way to edit first, and then sync afterwards?
  10. I've never had the chance to use timecode on any of my projects. Does anyone have a good workflow to sync audio and video with timecode, either in Resolve or Reaper?
  11. Looks like the website and specs for the E2 have been updated: http://www.z-cam.com/e2/ The max bitrates listed are good for H.265! A tiny bit higher than the XT3's 200 Mbps. The test footage I downloaded a long time ago held up very well to color grading, despite being just 175 Mbps for 4K/120. It looks like they've got a high-resolution 4:3 mode (10.2 MP, higher resolution than DCI 4K at 8.8 MP) at up to 60 fps; there's a quick megapixel calculation after the list. I'm curious to know the exact dimensions of all these modes. I'm not sure if it's a 1:1 readout, or whether there is any downsampling/cropping going on in either mode.
  12. That's fair. I personally have no need for >24 MP stills or >4K video... even 2K would do, if it's a nice clean downsample.
  13. @jonpais agreed, though I wonder if they plan a cheaper, lower-megapixel version later on. 42 MP stills and 8K video are overkill. It would be amazing if they did a full sensor readout with an in-camera 4K/2K downsample, with good low-light and rolling shutter performance.
  14. If I got the XT3, I could sell my NX1, whereas with the P4K I'd need a stills camera as well. All my lenses are Nikon and Canon, so I just need some adapters for the Fuji, whereas I'd need a Speed Booster for the P4K. So in my case, the XT3 is actually considerably cheaper!
  15. Totally agree! If the P4K had come out a few months ago, it would have been a no-brainer. Now it's hard to justify the cost of a dedicated video camera when the hybrids have improved so much.
  16. I would like to point out that my fears were unfounded, and Blackmagic is indeed open to other camera manufacturers recording in BM Raw. https://www.youtube.com/watch?v=AYswt0WM7gU They talk about it around 5:30.
  17. Good to know! In the video Grant mentioned that all the features would be available to developers "should they choose to implement it" or something to that effect, so I assumed that there might be discrepancies between third party players depending on implementation.
  18. I'm still rooting for Panasonic to use the Canon EF mount, while Canon itself switches to RF. Ten years from now, new photographers will be like, "But why do Canon lenses only work with Panasonic cameras? What were they thinking?"
  19. You wouldn't "bake" it in, but you can put color profiles inside the metadata, which can be used by any software that fully implements the SDK. The metadata doesn't change the original data; it just gives information about how to display it. It's worth noting, however, that it's possible for a third party to make a player that doesn't implement all of the features from the SDK, i.e. a player that ignores some or all of the metadata. So there is less of a guarantee that your image will look the same everywhere, compared to a file that is actually baked into a universally standard color space (so it's not a safe distribution codec). (There's a toy illustration of this after the list.)
  20. I used standalone Fusion for a few projects over the last few years. This week I used the Resolve-integrated version for a very simple effect. I love how well it works with Resolve; it's a truly remarkable piece of software. Many years ago, when I used Blender for everything, I remember talking about how I wished you could put compositions from the node graph into the sequence editor. With Resolve, that is finally possible!
  21. What we really need is a universal system of plug-and-play parts, like we have with PCs. Imagine if every camera were as modular as a PC! Naturally, building a camera yourself would involve a lot of research and some compatibility headaches. But just like with PCs, you could buy an off-the-shelf, pre-built one with software already installed. What we need is for the Axiom project to work, not just within its own world of open hardware and software, but for its standards to become universal enough that big companies manufacture parts that conform to them.
  22. That's probably right and in keeping with his philosophy, but I haven't seen anything about it coming to other cameras yet. A free SDK for decoding is very different from the format being open source or freely implementable in any camera. Right now, it seems a lot like Redcode, which also has an SDK. Yeah, I agree. I remember Grant talking about how ProRes Raw didn't have enough metadata support back when they announced the Pocket 4K. Now I guess we all know what he was really thinking about! I also think that the camera-specific metadata (and possibly encoding as well?) is what would keep BM Raw from being properly implemented by Magic Lantern for the 5D4, even if they could make some kind of post-transcoder.
  23. @webrunner5 Yeah, the decoding part is all open, so it could easily end up in all sorts of post-production applications, but I didn't hear him say anything about it ending up in non-Blackmagic cameras. Did I just miss that part? It seems like a fantastic format; I'd love for it to become ubiquitous among cameras.
  24. @webrunner5 It would be amazing if that happened. I wonder if Blackmagic has any intention of allowing other camera manufacturers to use BM Raw? Decoding seems open with the SDK and all, but I'm unclear on whether encoding is open and/or free.
  25. @webrunner5 Canon isn't going to help ML in any way, shape, or form.
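Re: post 4 above — a minimal sketch of the sensor-fusion idea, assuming a simple complementary filter. This is only an illustration of the general concept (not GoPro's actual stabilization code): the gyro gives a high-rate but drifting rotation estimate, and the lower-rate image-analysis estimate pulls it back.

```python
# Illustration only: a generic complementary filter, not GoPro's algorithm.
# Gyro rates integrate to a smooth but drifting angle; per-frame image-based
# angle estimates (noisier, lower rate) correct that drift.
import numpy as np

def fuse_rotation(gyro_rates, image_angles, dt=1.0 / 240, alpha=0.98):
    """Fuse gyro rate samples (rad/s) with image-based angle estimates (rad)."""
    angle = 0.0
    fused = []
    for rate, img_angle in zip(gyro_rates, image_angles):
        # Integrate the gyro, then nudge the result toward the image estimate.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * img_angle
        fused.append(angle)
    return np.array(fused)

# Toy usage: 240 gyro samples and matching (made-up) image estimates.
gyro = np.random.randn(240) * 0.05             # rad/s
img = np.cumsum(np.random.randn(240)) * 1e-3   # rad, noisy
camera_rotation = fuse_rotation(gyro, img)
# A stabilizer would then crop/warp each frame to counter camera_rotation.
```

Software working purely in post only has the (already compressed) pixels to estimate motion from, which is the point of the original comment.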
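Re: post 8 — a rough sketch of the "align to timecode" logic described there, written in plain Python rather than actual ReaScript. The Clip fields and the way timecode values are obtained are assumptions; a real script would read them from REAPER's media items and from the audio file's embedded (BWF) metadata.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    name: str
    start_tc: float      # embedded start timecode of the source, in seconds
    duration: float      # length used in the edit, in seconds
    timeline_pos: float  # where the picture edit placed the clip, in seconds

def align_audio(audio_start_tc, audio_duration, picture_clips):
    """Return (clip, timeline_position) pairs telling where the audio file
    should be placed so its timecode lines up with each overlapping clip."""
    placements = []
    audio_end_tc = audio_start_tc + audio_duration
    for clip in picture_clips:
        clip_end_tc = clip.start_tc + clip.duration
        # Only consider clips whose timecode range overlaps the audio's range.
        if audio_start_tc < clip_end_tc and audio_end_tc > clip.start_tc:
            offset = audio_start_tc - clip.start_tc
            placements.append((clip, clip.timeline_pos + offset))
    return placements

# Toy usage: a boom file starting at TC 01:00:10 against one picture clip
# whose source TC starts at 01:00:00 and which sits at 120 s on the timeline.
clips = [Clip("A001_C003", start_tc=3600.0, duration=30.0, timeline_pos=120.0)]
print(align_audio(audio_start_tc=3610.0, audio_duration=60.0, picture_clips=clips))
# -> the file would be dropped at 130.0 s on the timeline.
```

The same loop-and-compare structure is what the post imagines scripting inside Reaper for a right-click "align to timecode" action.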
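Re: post 11 — the quick megapixel arithmetic behind the 8.8 MP / 10.2 MP comparison. The DCI 4K dimensions are standard; the 4:3 dimensions below are only hypothetical candidates, since the exact readout sizes aren't published.

```python
# Megapixels = width * height / 1e6. The 4:3 entries are guesses that land
# near 10.2 MP, not confirmed Z-Cam specs.
modes = {
    "DCI 4K":          (4096, 2160),  # standard DCI 4K
    "4:3 candidate A": (3696, 2772),  # hypothetical
    "4:3 candidate B": (3688, 2766),  # hypothetical
}
for name, (w, h) in modes.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} MP")
# DCI 4K comes out to ~8.8 MP, and both 4:3 candidates to ~10.2 MP.
```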
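Re: post 19 — a toy illustration (generic NumPy, nothing to do with the actual Blackmagic RAW SDK) of the difference between metadata-driven color and a baked-in grade: the pixel data never changes, so a player that ignores the metadata simply shows the unprocessed image.

```python
import numpy as np

# Stand-in "raw" frame; real BRAW data and its SDK are not modeled here.
raw = np.random.rand(1080, 1920, 3).astype(np.float32)
metadata = {"gain": 1.2, "gamma": 2.4}  # hypothetical embedded color profile

def display(image, meta=None):
    """A compliant player applies the profile; one that ignores metadata doesn't."""
    if meta is None:
        return image  # metadata ignored -> unprocessed image
    return np.clip(image * meta["gain"], 0.0, 1.0) ** (1.0 / meta["gamma"])

compliant_view = display(raw, metadata)  # honors the embedded profile
noncompliant_view = display(raw)         # same pixels, profile ignored
```

That gap between the two views is why metadata-based color gives less of a guarantee than an image baked into a standard color space.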