
Caleb Genheimer

Members
  • Posts

    680
  • Joined

  • Last visited

Posts posted by Caleb Genheimer

  1. 8 hours ago, Llaasseerr said:

We'll have to see if it materializes. But based on a number of white papers for this method, there is no temporal aliasing, unlike Red's shitty kludge from a few years ago.

    As an aside, I checked out the DGO implementation in the Canon cameras and that is also a bit below expectations, so this Sony sensor is to me preferable even in its current incarnation, particularly because of the fast rolling shutter. It's the best DR and rolling shutter in an "affordable" camera right now. But in the end it's only an incremental improvement. It's a welcome, but modest half a stop more than the previous a7s models.

    The Alexa is capturing two stops more above middle grey than any of the Sony CineAlta cameras, and the difference is huge. Sony is actually gimping their CineAlta camera sensors based on lowering the voltage - I think in order to manage power draw - so that they are only capturing a subset of what the Slog3 curve spec is capable of. This WDR upgrade could address that, but of course it probably requires a higher power draw.

In linear light values, two more stops means it captures 4x the light information in the scene. You can't quite appreciate that when looking at a log-encoded curve. After working with Alexa footage for years, anything else is a bit of a disappointment, because it needs to be babysat, or it needs to be shot underexposed with an ND to get something approaching Alexa/film levels for bright sunlight, fire, explosions or hot specular highlights on cars etc. And then of course you possibly need to denoise the shadows. It's not ideal, but it's workable.

    Look at how Deakins shoots, he just exposes for a grey card like film and doesn't worry about clipping highlights because there's tons of range. I love the eND in the FX6 but it's just a convenience compared to actually capturing way more brightness information. Of course everyone has their own priorities, though. ✌️👍

    There is the option to just use the FX6 and get both features, if the WDR gets rolled out.

     

I’m curious whether the next implementation of the Ursa 12K sensor will implement DGO... there’s already some really different sauce in that thing with the RGBW Bayer pattern, and I’d think the Bayer groupings have enough individual photosites (4 total: 2 white and 2 colored) that half could operate at an entirely different gain for a serious DR increase. Even if it pans out to a little more noise, there’s just so much resolution to work with that even a moderate amount of noise reduction wouldn’t cripple it below DCI.
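As a quick sanity check on the stop arithmetic quoted above: each additional stop of highlight range doubles the captured linear light, so two more stops is a 4x difference. A minimal sketch (function name is my own, just for illustration):

```python
def stops_to_linear_ratio(stops: float) -> float:
    """Each additional stop doubles the linear light captured."""
    return 2.0 ** stops

# Two extra stops above middle grey = 4x the linear light information.
print(stops_to_linear_ratio(2))  # → 4.0
```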

2. Electronic ND, Dual Gain Output (simultaneous, ARRI-style), and full-frame open gate video would have me buying this camera almost for certain. I agree with many that it needs E-ND to set it apart from their other offerings, and IMO DGO is very quickly going to become the new flagship “flavor-of-the-month” (FINALLY we might get some dynamic range competition instead of this pointless hyper-resolution!). It’s high time for CMOS to break through that ~12-13 stop ceiling. 
     

And full frame open gate video? It’s just really aggravating to have a nice big sensor but be unable to utilize it fully. Some of us shoot scope! Some want re-frame options! Some shoot for squarer social media aspect ratios... Just put the dang mode in there, for crying out loud; it’s not like that forces anyone into using it.

  3. Just here to say, I wanted the new Fuji to have an open gate video mode enabled SO BADLY. That alone would’ve put me in “sell all my other bodies for a switch” high gear. 
     

    I’m still holding steady without a sweat... GH5S for anamorphic, Pocket 4K for easy Raw, C200 from work if good AF is needed. 
     

    I want LF video just because I find it interesting and really want to explore it. The S1H is tempting, but that darn AF keeps it from being the monstrous all-rounder I’m after. Hopefully the S2H fixes AF. If not, hey... S1H prices will probably drop at the same time and make that purchase more justifiable.

     

    I think the shaky market is still at a point where it may be a good thing, pushing the manufacturers to condense their lineups and get solid features into all cameras... but I guess we’ll see.

4. I’ve been working out my own methods on this stuff for quite some time now, and as someone who also shoots some actual film here and there, some of the extreme minutiae are maddening to nail down, and prompted me to move into Resolve’s Fusion tab for many operations. It has been like learning to ride a bike all over again, and I’m not there yet, but Fusion is absolutely powerful enough to get me where I want to go eventually. There is complex spatial and temporal interplay (especially with overscanned footage) that simply can’t be done easily with power grades and edit page compositing. Do 99% of people even notice? Probably not, but we do, right?

I’m interested to see how Filmbox is implemented, as well as Tom Bolles’ power grade (which for that approach looks to provide a lot of nuanced manipulation, resulting in the most convincing Color tab results I’ve seen so far). I will probably grab Cineprint 16 just to see if I can replicate the various nodes in Fusion alongside my own spatial machinations.

    Digging deep into this stuff at all kills render times like crazy, but that’s maybe half the fun, right?

I still think halation could be brought 80-90% of the way there in the physical world with a bespoke filter from some expert like Tiffen or Schneider. Film has a set tonal dynamic range, which honestly a number of digital cameras get close enough to these days as to not make much difference, but spatially, especially with regard to halation, the dynamic range of film’s reaction is much higher. Basically, something can be blown out, but two blown-out sources of different luminance will still show a difference in the amount and reach of halation, and that reaction is still way outside the dynamic ability of digital capture... but I believe a physical filter could be constructed to react with that nuance, and that reaction could subsequently be baked into digital capture. Tiffen’s Glimmerglass (of which there is a “bronze” variant already) looks to my eyes to be a prime candidate. If that could be tweaked to be red (or add red), it would go a long way toward achieving that nuance, and a bit of digital halation would be the added tweak to polish it off to taste.

    It’s super fun to build film effects, but at the end of the day, it’s also not all that difficult to just shoot film on certain projects, and in that case, the look is just... there. I’d encourage anyone who is spending significant time on emulation to also put some solid time into researching 16mm cameras, and doing a bit of hunting until you find a kit that fits your use-case. SR3s and Aatons are nice cameras for sure, but there are plenty of other very capable cameras out there, and you can still find good deals if you are patient. 
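The digital halation layering described above (isolate the blown highlights, diffuse them, tint toward red, composite back over the image) can be sketched roughly like this. This is a naive NumPy illustration with made-up parameter values, not any particular plugin’s method:

```python
import numpy as np

def add_halation(img, threshold=0.8, strength=0.3, radius=5):
    """Naive digital halation: mask bright areas, blur the mask,
    tint it reddish, and add it back over the image.
    img: float array of shape (H, W, 3) with values in [0, 1]."""
    # Rec.709 luma to find the "blown out" regions
    luma = img @ np.array([0.2126, 0.7152, 0.0722])
    mask = np.clip((luma - threshold) / (1.0 - threshold), 0.0, 1.0)

    # cheap separable box blur as a stand-in for a proper gaussian diffusion
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, mask)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)

    tint = np.array([1.0, 0.25, 0.1])  # reddish-orange glow color
    return np.clip(img + strength * blurred[..., None] * tint, 0.0, 1.0)
```

A real film response would scale the glow with how far past clipping each source is, which is exactly the information single-exposure digital capture throws away; this sketch can only work from the clipped values it has.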

  5. Yeah, if folks haven’t figured it out by now... Canon is gonna Canon. It’s an overtly abusive approach to product lineup/development from a customer perspective, but it’s just their chosen philosophy.

I follow all camera release news aggressively, because I find the technology advancements very engaging, but there’s no way I’d jump in on a multi-thousand dollar piece of kit before it has been well proven out in the wild, because all camera companies have their weak spots. They’re just tools, and good tools will prove themselves through use. Canon makes some good tools, even if the peripherals are less than honest. I’m not saying they shouldn’t be called out or pushed to change their approach, but it’s also important for consumers to step back and take an objective look at the available tools, ignoring somewhat all the noise and baggage around brands, especially new-release hype.

  6. 20 hours ago, majoraxis said:

    “Add a gimbal with subject tracking and you can have AI track your subject around the scene, then distribute the autofocus distance to the second camera, which can frame your subject/scene however you like, while your subject stays perfectly in focus.”

    Ok yeah, I’d take a V2 where the sensor is built into a little DJI Osmo kind of micro-gimbal camera that has object tracking to keep the sensor pointed at the subject... that would be straight up insane.

7. Great video, glad you opened it up. People are so precious with these things, but their construction is very simple, unlike most other lenses. 

My bet is on one of your optical groups having worked itself loose and rotated. There’s a front block and a rear block; that’s it. For what it’s worth, many folks don’t use the screws anyway, as they’re not always in exactly the right place. You could just glue that tab once you find the right position, or if you’re handy, you could drill and tap a couple of new holes... just clear out the chips before reassembly. But first things first: make sure the two optical groups are not rotating within their assembly. They have to maintain axial orientation.

Look up various videos on “tuning” an anamorphic lens, and you’ll get an idea of the general concept.

8. The front knurled ring “locks” the focus, so loosen that first. The wider knurling will then be able to rotate. The infinity mark may not be exact, but it shouldn’t be far off. Set both the scope and your 50mm to their infinity markings, then point at something detailed in the far distance. You should be able to get it sharp with micro adjustments.

Are you supporting the weight of the scope adequately? An overly heavy setup can sag enough to throw off focus.

  9. 7 hours ago, BTM_Pix said:

    N is the smaller (and cheaper) motor and can be powered by 5v but it has less torque, although that can be boosted using a larger power source.

    The three different cables for the M motor are

    1) Data only for connection to the AFX.

    2) Data connection to the AFX with USB outlet supplied from the M to power the AFX.

    3) Data AND power connection to the MMX with USB outlet to power the AFX.

    So for the AFX, the choice is with or without USB dependent on what the user wants.

Cable 3 is ONLY required for the FIRST port connected to the MMX (as it will be powering it), and any additional M motors that are connected to the other ports can use either of the two other ones.

No additional cables are required for the Nucleus N motors, as you are provided one with the AFX and an additional one if you buy the MMX.

    Yes, N is Nucleus Nano.

     

Let me know if I understand this correctly: one cable is included with the AFX, which can power it and carry data to an M motor.

Therefore, to run a second M motor via the MMX and power it as well, only one cable 3 is needed?

  10. 5 hours ago, BTM_Pix said:

    Funnily enough, we have some new firmware for the PBC that adds a couple of things, one of which is to be a wireless relay display for the AFX.

    So anyone who wants it will just be able to buy a PBC and then attach it to the wheel like it is in this example.

    It can be switched to display camera settings or distance (metric or imperial)

    The AFX is a multi client server so it will support more than one PBC relay in this mode.


     

Beautiful. One other obscure use-case to consider: variable diopters. They’re marked with focus scales, but the scales are calibrated to “distance from the variable diopter” instead of the standard “distance from the focal plane.” (Rather obviously, since the length of any particular lens setup is an unknown factor.)
     

    It would be nice to have the ability to semi-permanently mount the AFX and a focus motor on a variable diopter, calibrate a single profile to the “distance from the variable diopter” focus scale (which never changes from setup to setup), and then just set an offset in the reported/displayed focus distance based on the length of the current lens setup. The actual AFX-to-motor relationship doesn’t have to change in this instance, just the displayed distance. 

     

It’s minutiae, I know, but it solves a common variable diopter problem: most ACs are accustomed to focus distances measured from the focal plane. 
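The offset described above is just an additive correction on the displayed value; a hypothetical sketch (function and parameter names are my own, not anything in the AFX firmware):

```python
def displayed_focus_distance(diopter_scale_m: float, lens_offset_m: float) -> float:
    """Hypothetical readout correction: variable diopter scales measure from
    the diopter element itself, but ACs expect distance from the focal plane,
    so add the fixed physical offset of the current lens-plus-body setup.
    The motor-to-diopter calibration never changes; only this display offset does."""
    return diopter_scale_m + lens_offset_m

# e.g. diopter scale reads 3.0 m; lens and body add 0.25 m behind the diopter
print(displayed_focus_distance(3.0, 0.25))  # → 3.25
```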

11. I have a module request for down-the-road consideration: a wireless distance readout/display. Countless ACs/DOPs use the Tilta gear on mid-grade shoots, but despise its generic “0-100” readout at the controller.


    While it would require some pre-production profile calibration work, many would pick up the AFX for this feature alone: real foot/meter readings at the hand wheel.

    I know this can be read at the AFX, but often focus is wireless for the simple reason that it is being pulled from a distance.

  12. 46 minutes ago, BTM_Pix said:

    Yes, they will connect wirelessly.

    The AFX acts as the hub device controlling the camera (and lens motors for manual focus lenses) and receives/sends data to/from the additional components.

    An example of this that is built in already is the Tilta control wheel when using BMPCC4K/6K.

    It connects over BLE to the AFX and the AFX then translates its movements into controlling the native electronic lenses on the camera.

    This means that P4K/6K cameras can have the advantage of the more tactile control of a focus wheel without having to have focus motors and powering them etc.

    So in the same way, the additional module to offer AI based focus detection will process the data onboard and then pass that wirelessly to the AFX to action on the camera/lens motors.

     

     

    Just bought the AFX, this is awesome. Any word on the actual sensor tech for the AI-based add-on? Is it an additional chip-based system running contrast detect or something? Or phase detect? 

  13. 4/20! I currently rock a GH5S and a Pocket4K, and that’s only because my NX1 is out of commission. I also had a GH5 in the brief window before the S was a thing.

    The only thing keeping me from also having an S1H is the autofocus. I’m not upgrading digital bodies until a manufacturer provides full frame open gate with good AF and 13+ stops of DR. Until then, the GH5S and Pocket will do just fine.

If I need top-notch DR, I’ll just buy a couple loads of color negative stock and shoot 16mm.

  14. I think the Blackmagic stuff is enough out of left field to also qualify. My P4K is not miles ahead of my GH5S in IQ on a good day, but the simplicity in handling of the cine cam menus coupled with the save-your-butt tweakability of the BRAW files really sets it apart, especially for the price.

I LOVE the idea of the Sigma FP, it’s just my favorite, but I wish it had open gate RAW. Really, the S1H is the only truly full frame video camera right now; the rest restrict to 16:9, which is sad. If Fuji’s next GFX iteration has open gate video, I’ll be all over it, price be damned. That’s IMAX territory with the Metabones added.
