Everything posted by KnightsFan

  1. Your lens choices seem reasonable to me. You could grab a 50 to start with, and see whether you find yourself wishing for a longer or shorter lens. I personally would go for manual focus primes. I wouldn't say that it matters all that much in terms of how easy it is to start with. I still don't have an ND filter. It's only crucial if your camera has a high base ISO, like 800 or 1600 on some Sony cameras. You should get a polarizing filter though, which can more or less be used as an ND filter as well. I have a Nikon->Canon adapter for each lens. They're cheap, and a pain to get on and off. Yeah, a new speed booster costs almost as much as a used 5d2, and there are no mirrorless APS-C cameras that shoot raw. So it doesn't seem like a good fit for you. You'll be able to get a repeatable process with raw, though that sort of defeats the purpose of having all that flexibility. But like you say, it's always nice to have if you need it for a specific project.
  2. @mercer Not sure if you're referring to me, but I'm not opposed to Magic Lantern. I absolutely love it! But I do want to be honest about workflow challenges so deatrier can make an informed decision. Raw, and especially Magic Lantern, is certainly most appealing to those of us who love tinkering and have plenty of hard drives. Some people might not be into that.
  3. Given your interest in full frame Raw, it's really either the 5d2, or stretch and get the 5d3 if you can find an amazing deal. Obviously, many of us have different priorities, but if that's what's important to you, then it's basically down to those two and seems to be mainly an economic choice. Electronic viewfinders and screens are actually easier to focus with, because you can usually do a pixel-to-pixel magnification and/or focus peaking. Optical viewfinders can be more fun to use, but focusing with an EVF is easy, too. Shooting Raw on a Canon camera is not simple. The post workflow isn't bad, though it will take some experimentation. The main problem is that when shooting HD raw, you get something like 5 minutes from a 32GB card. Compressed raw and lower bit depths have been announced since the last time I used ML, so the situation is a little better now, but it's still very different from shooting all day on a 32GB SD card like you can with H.264. If you go raw for an all-day shoot, you will either need a LOT of cards, or you will need to dump cards to a hard drive as they fill up. Not to dissuade you. Like I said, the image is worth it. But you will need to budget for cards and hard drives. I have a set of Nikkor K lenses that I love. Great character (not terribly sharp, lots of distortion and CA, but wonderful vibrant colors), great handling, they can adapt to anything, and they're easy to declick (if you're into that). Would definitely recommend them for the price.
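     As a rough back-of-the-envelope check of that card-capacity figure, here is the math for uncompressed 14-bit 1080p at 24 fps (an assumption on my part; real Magic Lantern recordings vary with resolution, bit depth, and compression):

     # Rough data-rate estimate for uncompressed 14-bit 1080p raw at 24 fps.
     # Assumed numbers for illustration; actual ML files differ.
     width, height, bit_depth, fps = 1920, 1080, 14, 24
     bytes_per_frame = width * height * bit_depth / 8        # ~3.6 MB per frame
     bytes_per_second = bytes_per_frame * fps                 # ~87 MB/s
     minutes_on_card = (32 * 1000**3) / bytes_per_second / 60
     print(f"{bytes_per_second / 1e6:.0f} MB/s, ~{minutes_on_card:.0f} minutes on a 32 GB card")

     That lands at roughly 87 MB/s, or around six minutes per 32 GB card before overhead, which is in the same ballpark as the figure above.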
  4. What kinds of projects will you be doing with it: Narrative or documentary? Studio or run'n'gun? Will you be filming in low light? Do you plan to do a lot of color grading? Green screen or VFX? Do you also want to take photos? If you're set on ff, I don't really know of any options that would be better in the price range you've set. You can make great stuff on a 5d2--or any other camera from the past ~6 years. If it were me, though, I'd get a used APS-C camera, especially if you already plan to upgrade in a year or two. You could get a cheaper Canon for a few hundred dollars, or, if you can find a good deal, get an NX1. It's a fantastic all-rounder. I got mine used a year ago for $850 (700€) with lens and accessories included. If you plan on using Magic Lantern for raw video, be aware that it will be a mild PITA. Totally worth it, but not easy. Do you have an external microphone or any other audio gear? I'm not a fan of the H4 myself.
  5. I like that they're going towards the more open OTG protocol, but it worries me that it's unclear whether the LukiLink and the phone can be powered from a single USB cable--or even if they can be simultaneously powered off an external battery at all. But yeah, super excited to see the final product!
  6. The mewlips controller is unusably laggy for me also. After finding that the remote viewfinder app only worked with native lenses and the mewlips was unusable, I too searched for a way to monitor with a smartphone. I found nothing that was inexpensive. Apparently, you can use some sort of USB capture card and OTG cable for Android, but it would have been expensive and inelegant. Hopefully, the LukiLink project will finish soon because that seems like the solution you and I are hoping for. If you're on a tight budget and just need an external screen, you can get Raspberry Pi monitors for cheap.
  7. @Mokara I read your entire post. It sounded like you were saying HDMI video by nature is debayered and processed. Even the second part of your post seemed to imply that Atomos was just sort of hoping that someone could send a Raw signal for them to record. Sorry if I misunderstood. Raw photos aren't processed, so there is a way for the image to bypass the hardware processor. And besides, Atomos seems to think it can be done. Edit: If Magic Lantern can write Raw to disk using software only, I have no doubt a firmware update to output Raw over HDMI is technically possible for many other camera models.
  8. @cantsin Lack of white balance controls is really suspicious. Do we know if it's a limitation of ProRes Raw, or whether they just didn't implement that in FCPX? Considering the file size is similar to other compressed Raw formats, it seems unlikely that so much data has been thrown away that a white balance adjustment would be impossible. I mean, without white balance adjustment how can it even be called Raw? I would actually be glad there is no "ISO" control. You can still adjust exposure with curves and other tools, and calling it ISO confuses it with analog gain. Perhaps "ISO" is the more correct term, but we've been so lax about distinguishing between gain and ISO in digital cameras that it's confusing. I've had too many conversations about why the BMCC ISOs aren't "real", and why you can't set the ISO of a DSLR in post. Built-in proxies are actually a really good idea! If proxies were done at the file level instead of by the NLE, it could potentially standardize proxy workflows across different NLEs. Though it would be disappointing to not be able to control how they are generated and used.
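     For what it's worth, the reason a missing white balance control seems so odd: on true Raw data, white balance is basically just a per-channel gain applied before or during demosaic, which is why it can normally be changed freely in post. A minimal sketch with made-up gain values, assuming an RGGB Bayer layout:

     import numpy as np

     # Hypothetical 14-bit RGGB Bayer mosaic; the gain values are arbitrary examples.
     bayer = np.random.randint(0, 2**14, (4, 4)).astype(np.float64)
     gain_r, gain_g, gain_b = 2.1, 1.0, 1.6

     balanced = bayer.copy()
     balanced[0::2, 0::2] *= gain_r   # R photosites
     balanced[0::2, 1::2] *= gain_g   # G photosites (even rows)
     balanced[1::2, 0::2] *= gain_g   # G photosites (odd rows)
     balanced[1::2, 1::2] *= gain_b   # B photosites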
  9. @cisco150 As much as I love the NX1, I would not buy any NX lenses.
  10. @sanveer That was my impression, too, which makes me really curious to know what's going on under the hood, and how it's different from cDNG.
  11. That's not entirely true. HDMI doesn't "know" what information it is carrying. You can send whatever data you want over HDMI, as long as you can encode it in a format it likes, and then decode it on the other end. Back a few years ago the Axiom project was putting a raw signal over HDMI. If I remember correctly, they were taking three Raw frames and sending them as the "red" "green" and "blue" channels. All you've got to do at the end is split the "channels" back into Raw frames.
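     A sketch of that packing idea (my own illustration, not the Axiom project's actual code): three single-channel raw frames can ride in the R, G, and B planes of an ordinary video frame and be split apart again on the receiving end.

     import numpy as np

     # Pack three single-channel raw frames into the R/G/B planes of one video
     # frame, then unpack them on the receiving end. 8-bit values are used here
     # purely for illustration.
     def pack(raw_a, raw_b, raw_c):
         return np.stack([raw_a, raw_b, raw_c], axis=-1)   # looks like an RGB frame to HDMI

     def unpack(rgb_frame):
         return rgb_frame[..., 0], rgb_frame[..., 1], rgb_frame[..., 2]

     frames = [np.random.randint(0, 256, (1080, 1920), dtype=np.uint8) for _ in range(3)]
     recovered = unpack(pack(*frames))
     assert all((orig == rec).all() for orig, rec in zip(frames, recovered))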
  12. Does anyone have any info on ProRes Raw vs. cDNG yet? Grant Petty mentioned something about it moving all the color science to FCPX, instead of being a mix of in-camera and in-post. Does anyone have any insight on that?
  13. NAB 2018

    Exactly! Since you can now get a great camera for around $1k, it's great to see more lenses in that price range for people (like me) who have to bring their lenses between systems.
  14. NAB 2018

    I will be very interested if they do get around to making an EF mount version, as he mentioned. Rokinon doesn't really have any competition right now for non-mirrorless cine lenses.
  15. All that is true. I was responding to your first post, where you said: "The human eye is estimated to see about 10 million colors and most people can't flawlessly pass color tests online, even though most decent 8 bit or 6 bit FRC monitors can display well over ten million colors: 8 bit color is 16.7 million colors, more than 10 million." My point was that this reasoning does not apply to raw, and that Raw samples NEED a higher bit depth than each individual color sample in a video file. I was illustrating the point by showing that if you count the bits in a lossless 1920 x 1080 Raw image at 14 bit depth, it comes out considerably smaller than a 1920 x 1080 color image at 8 bit. True! I was simplifying. But aggregated over the entire image, you still end up with less overall data in the Raw file than in the color file. Two ways of saying the same thing, in terms of data rate. I think we're in agreement for the most part.
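     For anyone who wants to check that bit counting, the uncompressed sizes work out like this (ignoring headers and compression):

     pixels = 1920 * 1080
     raw_bits = pixels * 14          # one 14-bit sample per photosite
     color_bits = pixels * 3 * 8     # three 8-bit samples per pixel
     print(raw_bits / 8 / 1e6)       # ~3.6 MB for the Raw frame
     print(color_bits / 8 / 1e6)     # ~6.2 MB for the 8-bit color frame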
  16. I said "three separate 8 bit values, to make a 24 bit color pixel"
  17. @HockeyFan12 I re-read your original post. I agree wholeheartedly that there is no practical difference between 12 and 14 bit color video. However, Raw is different. Essentially, each 14 bit raw sample becomes three separate 8 bit values, to make a 24 bit color pixel. That means that each pixel in the final video has 1024 times as many possible colors as each corresponding raw pixel--and that's when you're outputting to just 8 bit! (I said "essentially" because color pixels are reconstructed based on surrounding pixels as well) I don't know if it makes a practical difference on a 5D with magic lantern, so you could be completely right about 12 bit being practically equivalent to 14. I haven't had access to a 5D since before the lower bit hacks became available, unfortunately.
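     The arithmetic behind that factor of 1024:

     print((2 ** 24) / (2 ** 14))    # 1024.0 possible RGB values per possible raw value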
  18. Oh okay, yeah, I completely misread it. I totally agree with you!
  19. But can you plug it into the USB-C port on the BMPCC4k and use it as an on-camera usb audio interface? Wishful thinking...
  20. I don't know. Even if it's unlikely that any cameras will in fact output ProRes Raw over HDMI, it may not have been a huge cost for Atomos to implement. I wonder if the hardware is actually any different? I mean, it's the same data rate, and the recorder is still just taking a signal and writing it to disk. Sure, there's debayering etc. for the screen, but literally every camera in existence does that same processing in real time, so the hardware can't be too expensive. I wouldn't be surprised if word-of-mouth advertising (like that of this topic!) boosted sales enough for it to pay off.
  21. @Deadcode That's correct. 10 and 12 bit simply truncate the lowest 4 or 2 bits in Magic Lantern. Removing each bit divides the number of possible values by two. (So, to answer the original question, lower bit depths will manifest themselves by crushing the blacks. Though, as has been mentioned, the "blacks" have a low signal to noise ratio anyway).
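     A tiny sketch of that truncation (my illustration, not Magic Lantern's actual code):

     sample_14 = 0b10110101100111    # a 14-bit sensor value (11623)
     sample_12 = sample_14 >> 2      # 2905; 2 bits dropped -> 1/4 as many possible values
     sample_10 = sample_14 >> 4      # 726; 4 bits dropped -> 1/16 as many possible values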
  22. @Deadcode each bit you add doubles the number of possible values. Adding two bits means there are 4 TIMES as many shades. 10 bit has 1/16 the shades vs 14 bit.
  23. You know what would be cool, is if there was a way to use the USB-C port as a multitrack digital audio input from a mixer. That's always been That One feature I've never seen.
  24. I understand what you're saying, but it doesn't match my observations. What I am doing is shooting high dynamic range scenes that include both pure black and pure white. I've shot it every way I can think of, including using a clickless aperture to slowly adjust exposure, and then picking the frames that have either the same black point or the same white point. Using the RGB boost consistently gives me slightly more dynamic range, and a noticeable increase in sharpness. Perhaps the difference we're experiencing is that I am changing the ISO: comparing RGB 1 at ISO 800 vs RGB 1.99 at ISO 400, adjusting the aperture by minuscule margins both ways to ensure equal white or black points. So my theory is that reducing the ISO is what's giving it the slight edge with whatever internal logic the camera has. Compensating with ISO makes the most sense to me, since I want to test camera settings independent of the lens. I might try compensating with shutter speed at some point. Also, just another note about my observations: it's not a full stop of dynamic range added. Nowhere close. Perhaps it's not even more dynamic range per se, but just better color response and sharpness (from a lower ISO?). Looking at individual color waveforms, it might not even be a uniform change between all three channels.
  25. I mean the RGB boost improved dynamic range in Gamma DR in my test. Yeah, I totally get why you closed the aperture. I was surprised that closing the aperture changed the dynamic range. I'd just never really considered it before, but it seems plausible that using different apertures could in fact affect the dynamic range of the scene that is captured.