About Scott_Warren

  1. You learn something new every day! That makes a lot of sense, actually. I assumed that the Blackmagic setting was expecting specific colors from your own color filters and so would produce "wrong" results if you started that way. Thank you for the information :)
  2. @imagesfromobjects You'd want to browse to the root of where the DNG sequence lives on your drive in the media browser, then either drag it into the media pool or right-click it and add it to the media pool. From the media pool, you create a new timeline with it and off you go! It's easier to add DNG sequences through the media pool browser than to add them manually from a folder full of frames, where you'd need to select every frame and then add them... it gets tedious after several clips.
  3. @imagesfromobjects One exercise that helped me get started with Resolve was opening an image in Lightroom and tweaking it to taste, then opening that same image in Resolve and using the same controls (exposure, contrast, saturation, Hue vs Hue, Hue vs Lum curves, etc.) to try to recreate the result I saw in Lightroom. I know video adjustments tend to exist in their own universe once you dig in and try to make things more technically sound for video than for a still, but it might be worthwhile to do something similar. With that approach, I could still think in terms of editing a still while developing more of a video mindset for how the controls translate between stills and video. Curves still largely function the same, though they tend to be more sensitive in Resolve than in Lightroom, and instead of using one master stack of operations, you apply them to nodes, which can then be re-wired to create different results depending on the flow of the math. I also had years of experience with the Unreal Engine material editor, so node-based editing and order of operations wasn't a huge leap. I think @paulinventome's approach of injecting a DNG into the RED IPP2 workflow is one of the faster ways to get to a solid image that I've seen recently.
  4. More stills from the progression of the pan. I'd upload a video, but the motion was nothing to write home about. Really just sweeping around finding interesting compositions at random. There's something appealing about the behavior of blues and teals using IPP2. It has a certain iciness to it that's hard to describe.
  5. Definitely play around with it, Chris! The highlight rolloff control alone allows for range compression depending on what's going on. It kind of seems like a local highlight tonemap operator. For this shot, I used a Soft rolloff to help the roof lights transition more smoothly into the fog. It acts like an extra diffusion layer on top of the natural fog scatter.
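To make the hard-vs-soft rolloff distinction above concrete: RED's actual IPP2 curves aren't public, so the two functions below are only a toy sketch of the general behavior, a hard knee that preserves punch and then clips versus a soft Reinhard-style curve that eases highlights toward white asymptotically.

```python
# Illustrative only: NOT RED's IPP2 math, just the general shape of
# "hard" vs "soft" highlight rolloff above a chosen knee point.

def hard_rolloff(x, knee=0.8):
    """Steep compression above the knee, then a hard clip at 1.0."""
    if x <= knee:
        return x
    return min(knee + (x - knee) * 0.25, 1.0)

def soft_rolloff(x, knee=0.8):
    """Reinhard-style compression: highlights approach 1.0 but never clip."""
    if x <= knee:
        return x
    t = x - knee
    return knee + (1.0 - knee) * (t / (t + (1.0 - knee)))

for v in (0.5, 0.9, 1.5, 3.0):
    print(f"{v:4.1f} -> hard {hard_rolloff(v):.3f}, soft {soft_rolloff(v):.3f}")
```

The soft curve is what smooths a bright source into surrounding fog; the hard curve keeps more "snap" in the light itself at the cost of clipping sooner.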
  6. @paulinventome I tried a test using your IPP2 workflow, and I have to say I'm now a huge fan. The only thing different is that I'm using a Hard highlight rolloff, since I like the extra snap it gives to light sources, but the color behavior out of the gate + the OneShot chart is just fantastic. Skin is maybe a hair red compared to reality, and a Medium rolloff makes it look more correct, but at the expense of the hotness of the lamp. To my eye, this is what the corner of my TV console looks like in reality. It's actually a bit scary how quickly this approach gets to a solid image, haha. I feel like I'm cheating! IPP2 feels kind of like RED's own baby ACES in that you can control global aspects as you wish. It's super powerful. Huge fan of global operators in place of knob parties!
  7. @paulinventome It turns out the stock X-Rite software doesn't do a super accurate job of calibrating the toe of the display curve for sRGB. It makes it too crunchy. I calibrated with DisplayCAL and can now see into the toe more accurately, so I don't need to compensate in Resolve by using the flatter P3 ODT to offset what was looking too crunchy on both my P3 and sRGB monitors. I can only imagine what a dedicated LUT box would offer! After leaving the fp DNG as-is and then white balancing and pushing saturation right up to the vectorscope target boxes (Saturation set to 80-90 on a node, not the RAW saturation controls, which behave weirdly), the AP1 primaries look pretty true to life. I took a low-tech approach to adjusting the saturation by eyeballing the corner of the living room and comparing it to the monitor, and it looks pretty dead-on at this point. Is the need to push saturation dramatically due to how big the AP1 gamut is compared to sRGB/709? No normal monitor can display AP1, so I imagine there has to be some base pushing of saturation in order to get a good starting point within a smaller gamut, unless I have that totally backward.
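The gamut question in the post above can be sanity-checked numerically: the AP1 primaries land well outside what an sRGB/709 display can reproduce, which is why AP1-encoded values look muted until they're remapped. A rough sketch, using the commonly published matrices and ignoring the small D60 to D65 white point adaptation for simplicity:

```python
import numpy as np

# Push the pure AP1 red primary through AP1 -> XYZ -> linear sRGB.
# Components outside [0, 1] mean the color is out of sRGB gamut.
# Matrix values are the commonly published ones (D60/D65 adaptation ignored).

AP1_TO_XYZ = np.array([
    [ 0.6624542, 0.1340042, 0.1561877],
    [ 0.2722287, 0.6740818, 0.0536895],
    [-0.0055746, 0.0040607, 1.0103391],
])

XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

ap1_red = np.array([1.0, 0.0, 0.0])        # pure AP1 red primary
srgb = XYZ_TO_SRGB @ AP1_TO_XYZ @ ap1_red  # linear-light sRGB coordinates
print(srgb)
```

The red channel comes out well above 1.0 and the green channel goes negative, so the monitor can only show a clipped, less saturated stand-in, which matches the "AP1 needs a big saturation push" observation.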
  8. Thanks for being a fountain of information, Paul! The obsession for seeing the ground-truth image is very real, haha. I've dabbled in baby steps of color management with my work in games, so it's great to be made aware of how deep the rabbit hole goes.
  9. @paulinventome Here's the link to the folder of DNGs of my ColorChecker. Their white balance would be equivalent to Shade, so 7500K, +10 M. Incident exposed at speed with my Sekonic. https://drive.google.com/drive/u/0/folders/1YLY5ls433I-DgaLSbbzE3Lf6MtTKwjKT I do have an X-Rite i1Display calibrator to help wrangle any of the weirdness Windows might introduce, however. I know it's not scientifically perfect like what could be achieved with the higher-end solutions used for film post, but it was close enough for our purposes when I was lighting Call of Duty: WWII that I adopted it for home use as well. I've been meaning to check out DisplayCAL to see if it offers anything more precise than X-Rite's own software.
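The "crunchy toe" issue from the calibration discussion is easy to see in the numbers: the sRGB spec uses a linear segment near black rather than following a pure power curve all the way down, so a display calibrated to a plain gamma 2.2 target renders shadow codes darker than the spec intends. A small sketch of the two transfer functions:

```python
# Compare the piecewise sRGB decode (IEC 61966-2-1) against a pure 2.2
# gamma in the shadows. The sRGB linear toe yields brighter (less crushed)
# values for small inputs, which is the "crunchiness" difference described.

def srgb_to_linear(v):
    """Piecewise sRGB electro-optical decode."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    """Pure power-law 2.2 decode, a common calibration target."""
    return v ** 2.2

for v in (0.01, 0.02, 0.04):
    print(f"code {v:.2f}: sRGB {srgb_to_linear(v):.6f}, "
          f"gamma 2.2 {gamma22_to_linear(v):.6f}")
```

At a 2% input the pure-gamma result is several times darker than the sRGB toe, so shadow detail visible on a correctly calibrated sRGB display disappears on a pure 2.2 one.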
  10. @paulinventome I'm viewing and grading on a Dell UP2718Q, which does 97% of P3. I suppose what I'm reacting to is the AP1 color primaries versus P3's. P3 definitely reads as warmer, with more pleasing hues out of the gate, but it's something I could dial in with AP1 with some knob tweaking if that would be more proper. Of course, if I use the OneShot chart to start with, then even AP1 looks great, since the colors have all been hue-adjusted and saturated accordingly, so I may just need to embrace being a bit more hands-on with the process. The joys of learning!
  11. @redepicguy Attached are my current settings. I try to make the initial capture as solid as possible as far as exposure goes, to avoid a lot of pushing or pulling in post. In the node chain at the start of a grade, I'll also add a Color Space Transform node to convert the ACES AP1 space down to P3-D65. It allows for more color depth than Rec709 but gives a more natural response with saturation, and it's supported on more and more devices these days. The gamma is also a little softer rolling down into shadows, which lets me see into darker areas without needing to bend the gamma or use lifts. I try to use a global approach to getting a look in place of tweaking knobs if at all possible. Probably leftover habits from working with film for a while. You don't have to convert the colorspace from ACES AP1 to 709 or P3, but I find that AP1 needs to be saturated a LOT in order to feel normal, and pushing saturation can behave weirdly depending on the subject or scene. Using the conversion method adds what I think is a natural boost of color without needing to crank things crazily. Is any of this approach correct? I have no idea. But it seems to work well for the kind of images I like to capture.
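The "AP1 needs a LOT of saturation" observation above has a simple numeric explanation: the same physical color has much smaller chroma coordinates in the big AP1 gamut, so showing those numbers untransformed on an sRGB-class display reads as flat. A rough illustration under the same assumptions as before (commonly published matrices, D60/D65 adaptation ignored), computing the AP1 inverse at runtime rather than hard-coding it:

```python
import numpy as np

# Take a fully saturated sRGB red and express it in AP1 coordinates.
# Displayed raw (no Color Space Transform), those coordinates read as a
# much less pure red -- exactly what a CST node corrects for.

SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])
AP1_TO_XYZ = np.array([
    [ 0.6624542, 0.1340042, 0.1561877],
    [ 0.2722287, 0.6740818, 0.0536895],
    [-0.0055746, 0.0040607, 1.0103391],
])

srgb_red = np.array([1.0, 0.0, 0.0])
ap1 = np.linalg.inv(AP1_TO_XYZ) @ SRGB_TO_XYZ @ srgb_red
print(ap1)  # roughly [0.60, 0.07, 0.02]
```

A pure display red becomes roughly (0.60, 0.07, 0.02) in AP1, so without a proper transform it takes heavy manual saturation to make it look "normal" again, which is why the CST approach feels more natural than cranking knobs.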
  12. Now this is interesting. I wonder why a colorspace change would have such a dramatic effect on shadow colors? I can't imagine the colorimetry of the BM sensors is THAT different from the fp's when it comes down to numbers. Of course I'm also not an engineer!
  13. I'd be careful about assuming any of us are being judgemental or lecturing, Chris. Online forums, particularly technical ones, tend to have a lot of cut-and-dry information for information's sake. Suggestions from people with different experiences aren't necessarily judgemental. We're all just sharing what we know! I don't have a dog in this fight. I just like making images and sharing what I've learned.
  14. Quick aside about dialing in a better starting point for color, short of having an ACES IDT any time soon: the OneShot chart is one of the cheaper but fantastic options for getting to a good starting point, whether to balance shots together or to make sure everything in a cut is consistent, so a global look can be authored without shot-to-shot tweaking. The ink colors are calibrated to match 50% of the saturation of the standard video colors on the vectorscope for alignment. It's also glossy, which helps you get true white and black; a matte finish would only give you lifted black values. http://dsclabs.com/product/oneshot-plus-2/ For the clip above, I used the auto-match tool in Resolve followed by a quick refinement of the vector directions after that. Nothing else. ACEScc pipeline with a Rec709 output.
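The "50% of the saturation of the standard video colors" idea above falls out of the math: chroma (Cb/Cr) is linear in R'G'B', so blending a color 50% toward its own luma moves its vectorscope point exactly halfway toward center. A small sketch using BT.709 luma weights:

```python
# Show that desaturating a color 50% toward its luma halves its
# vectorscope (Cb/Cr) vector. Uses BT.709 weights and the standard
# Cb/Cr normalization factors.

def ycbcr_709(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return y, cb, cr

def desaturate(r, g, b, amount):
    """Lerp toward luma: amount=1.0 keeps full color, 0.0 gives gray."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(y + amount * (c - y) for c in (r, g, b))

_, cb_full, cr_full = ycbcr_709(0.75, 0.0, 0.0)              # 75% red bar
_, cb_half, cr_half = ycbcr_709(*desaturate(0.75, 0.0, 0.0, 0.5))
print(cb_half / cb_full, cr_half / cr_full)  # both ~0.5
```

That's why a chart printed at half saturation still lines up cleanly: its patches should sit exactly halfway out to the standard vectorscope target boxes, making misalignment easy to spot.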
  15. Fair enough! The joy of this stuff is there aren't any rules. If it looks right, it is right.