
Attila Bakos


Posts posted by Attila Bakos

  1. 56 minutes ago, androidlad said:

    Would this test and the accompanying RAW files help with your project?

    https://cinematography.net/CineRant/2018/07/30/personal-comments-on-the-2018-cml-camera-evaluations/

    https://www.myairbridge.com/en/#!/folder/XDn2Yx7xLBAaLh0Zbq415jcXW5hOa9SM

    It seems that only the Alexa can render pure red colour accurately, while the others all skew towards orange. Is this something that's specifically addressed in your colour engine?

    I think he posted a code screenshot in the past where he fixed hue twists. I'm 99% sure he fixes this issue, but I'll let the man talk :)

  2. On 12/20/2019 at 6:54 PM, Sage said:

    Its almost certain to be the case. Efficiency is vital. I have been deep in the code bunker this week, writing a special segment for the EC engine. This will greatly accelerate supporting new cameras, as it minimizes the amount of intervention necessary to conform data (with superior results). As of today, I have a new luma conform tool that is completely free of either hue or saturation distortion, and I am thrilled with it right now (not even Resolve's luma tools work this way). Now, I have to finish the second piece of the puzzle, and (accelerated) camera support will commence.

    That sounds like P in the HSP color model. To me it represents luma better than L in HSL or V in HSV.
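    For reference, this is the brightness term I mean, from Darel Rex Finley's HSP model; the channel weights below are the commonly quoted ones, so treat them as illustrative:

    ```python
    import math

    def hsp_brightness(r, g, b):
        """Perceived brightness P from the HSP color model.
        Inputs are RGB values in [0, 1]; green is weighted most heavily."""
        return math.sqrt(0.299 * r**2 + 0.587 * g**2 + 0.114 * b**2)

    # Pure white has P = 1.0, matching V in HSV and L in HSL:
    print(hsp_brightness(1.0, 1.0, 1.0))
    # Unlike V or L, pure blue reads much darker than pure green (~0.34 vs ~0.77),
    # which is closer to how we actually perceive those colors:
    print(hsp_brightness(0.0, 0.0, 1.0), hsp_brightness(0.0, 1.0, 0.0))
    ```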

  3. 5 hours ago, BTM_Pix said:

    There are plenty of cheap HDMI>USB3 capture devices around that people use for games console capture.

    This one is currently on offer from Aliexpress for £30

    https://a.aliexpress.com/o0rZGAIYT

    You need to do some research with these types of units though, to make sure they will accept the format you are using, as some of them will have issues with 4K or even 1080 if it's 24p.

    Panasonic cameras usually have no problem providing an output that capture devices accept, but it's worth checking.

    In this case, checking might actually involve buying one blind as you'll probably find lots of comments about using it with an Xbox but few about using it with a GH5 in the specific format that you use.

    As for quality and latency, these things are generally good enough for what you will be using them for. As a focus monitor on a locked-off camera, the task will only create very small differences between frames, so it won't tax the compression too much.

     

    Be careful with these cheap recorders. I wanted to do the same, monitoring on a laptop with some scopes. I wanted to do it as cheaply as possible, so I ordered this device: https://www.fasttech.com/p/9677213
    I also prepared some fairly complex ffplay scripts that capture the stream and add scopes in real time (RGB parade, vectorscope), because I couldn't afford ScopeBox.
    The device worked, but the quality was pretty bad. It has an MJPEG-compressed output and an uncompressed output. Even the uncompressed output suffered from resolution loss, it modified the gamma of the source (not a video range/full range issue), and there was a very noticeable color shift as well. And it took me 3 months of discussion with support (in which I had to make videos too) to get a refund! I even got a response from the manufacturer, and they told me my results are fine and the captured image is good. It was unbelievable, really. My point is that these cheap capture devices might be fine for gamers, but they are absolutely useless when it comes to monitoring. Of course, my opinion is based on this device alone.
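    For anyone who wants to try the same thing, a minimal sketch of that kind of ffplay command (not my exact script — the device path and input format are placeholders for your own capture device; `waveform` is a standard FFmpeg video filter):

    ```shell
    # Show the captured picture with a luma/RGB waveform stacked underneath.
    # "/dev/video0" is a placeholder (Linux/v4l2 shown; on Windows you would
    # use -f dshow -i video="...", on macOS -f avfoundation).
    # components=7 enables all three channels; swap waveform for
    # "vectorscope" if you want a chroma readout instead.
    ffplay -f v4l2 -i /dev/video0 \
      -vf "split[pic][wf];[wf]waveform=components=7[wfo];[pic][wfo]vstack"
    ```

    The default `waveform` mode keeps the graph the same width as the input, which is why the plain `vstack` at the end works without any scaling.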
     

  4. 35 minutes ago, Mako Sports said:

    You totally could do Arri Log C, I just did S-Log3 because I wanted to try AC's Rec.709 conversion LUT.

    Keep a bit more highlight range but reduce a bit more noise vs 75 IRE. I still think 106 is way too high?

    Why is it too high if it's not clipping? The values for the top six stops seem to be distributed equally in S-Log2.

  5. On 12/6/2019 at 2:16 PM, kye said:

    I'll have to see if I can dig one out..  I have a huge archive of projects I've shot but not edited yet!

    I found a clip where I tested the stabilisation of the GoPro, it has my face in it, so you can see how the DCTL corrects the Protune clip.

    Protune Flat/Native:

    test_1.1.3.thumb.jpg.3f2088d088543681d49910a932b2fe7e.jpg

    DCTL:

    test_1.1.1.thumb.jpg.8dd92dac63d80c28e05f41751f12dd4d.jpg

    Normal contrast/saturation adjustment in Davinci YRGB mode:

    test_1.1.2.thumb.jpg.fa9812690c87477595717207840e50fc.jpg

     

    As you can see, the DCTL corrects the yellow skin tones; also, the jacket of the lady in the background is supposed to be red, which is not what you get with normal contrast/saturation adjustments.
    I believe my DCTL/LUTs can really make your life easier when shooting GoPro Protune Flat/Native.

  6. 16 minutes ago, Mark Romero 2 said:

    Speaking of the offset wheel in resolve...

    I've heard from somewhere (maybe on liftgammagain) that if you are just going to make a luminance adjustment, better to go into the primary bars and use the Y bar to adjust (and not the wheel / dial at the bottom). I have no idea why it would be different than sliding the horizontal dial under the primary wheels / primary bars. Any thoughts on that?

    The value of Y (luminance) in the lift/gamma/gain bars is calculated from R, G, and B (each channel has a different weight in the calculation; green has the most). It's an additional control that lets you change luminance without changing chroma. So yes, if all you need is a luminance adjustment, try it and see if it works better than using the wheel, which adjusts all three color channels equally.
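    A minimal sketch of the difference, assuming Rec.709 luma weights (Resolve's actual math depends on the timeline color science, so treat this as illustrative, not as Resolve's implementation):

    ```python
    REC709 = (0.2126, 0.7152, 0.0722)  # luma weights; green dominates

    def luma(rgb):
        return sum(w * c for w, c in zip(REC709, rgb))

    def offset_wheel(rgb, amount):
        """Wheel-style move: add the same amount to all three channels.
        This shifts luma but also changes the channel ratios (chroma)."""
        return tuple(c + amount for c in rgb)

    def y_bar(rgb, amount):
        """Y-bar-style move: scale the pixel so its luma lands at the
        target while the R:G:B ratios (chromaticity) stay untouched."""
        y = luma(rgb)
        scale = (y + amount) / y if y != 0 else 1.0
        return tuple(c * scale for c in rgb)

    pixel = (0.2, 0.4, 0.1)
    print(offset_wheel(pixel, 0.1))  # ratios change -> desaturates
    print(y_bar(pixel, 0.1))         # ratios preserved, only luma moves
    ```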

  7. 4 minutes ago, Mark Romero 2 said:

    You should probably send a Thank You card to @Deadcode because he helped me decide to try your IDT.

    Do we still need to expose slog 2 at approximately 2 stops over? 

    I usually just set my zebras to 106 and open up until I just barely see zebras, and then close down the aperture 1/3 of a stop to play it safe since an individual color channel might clip but not trigger the zebras.

    I actually know @Deadcode, we help each other where we can :)
    ACES won't force you to change the way you expose; it will actually be easier to correct an overexposed image. The offset wheel's effect is very close to a true exposure control in ACES.
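    The reason the offset wheel behaves like exposure here is that ACEScct is (mostly) logarithmic, and adding a constant in log space is the same as multiplying in linear light, which is exactly what an exposure change is. A sketch of the idea, assuming a pure log encoding for simplicity:

    ```python
    import math

    def stops_to_gain(stops):
        """One stop of exposure is a 2x change in linear light."""
        return 2.0 ** stops

    def expose(linear_rgb, stops):
        """Exposure change: multiply every linear value by the same gain.
        Channel ratios (and therefore chromaticity) are preserved."""
        g = stops_to_gain(stops)
        return [c * g for c in linear_rgb]

    # In log space the multiply becomes a constant offset, which is why
    # a log-domain offset wheel acts like an exposure control:
    #   log2(x * 2**stops) == log2(x) + stops
    print(expose([0.18, 0.09, 0.36], 1.0))  # every value doubles
    ```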

  8. 23 minutes ago, Mark Romero 2 said:

    Does your custom IDT work for Acesscct as well as Acesscc??? 

    The IDT is the entry point to ACES: it linearizes the clips and converts the color space to ACES AP0. Resolve then converts this to a logarithmic gamma (ACEScc or ACEScct, whichever you choose) and to the AP1 color space. So yes, you can use the IDT with both.
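    For the curious, the ACEScct encoding Resolve applies after the IDT looks like this (constants from the ACES spec S-2016-001; ACEScc is the same log segment without the linear toe):

    ```python
    import math

    X_BRK = 0.0078125         # breakpoint between the toe and the log segment
    A = 10.5402377416545      # toe slope
    B = 0.0729055341958355    # toe offset

    def acescct_encode(x):
        """Linear AP1 -> ACEScct. The linear toe below the breakpoint is
        what distinguishes ACEScct from ACEScc, which stays logarithmic
        (and dives toward -infinity) near black."""
        if x <= X_BRK:
            return A * x + B
        return (math.log2(x) + 9.72) / 17.52

    # 18% gray lands at about 0.414 in ACEScct:
    print(round(acescct_encode(0.18), 3))  # -> 0.414
    ```

    The toe constants are chosen so the two segments meet continuously at the breakpoint, which keeps shadow grading controls from behaving erratically around black.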

  9. 10 minutes ago, kye said:

    Do you have any before / after Protune shots that include people?  It's easy to make Protune look nice on landscapes, just add contrast, saturation, and maybe adjust the WB a bit, but making it look 'right' is a whole other thing!

    Unfortunately I don't have a GoPro (I got the clips from a friend), but if you send me a clip with people, I can show you how it would look. Remember that the color mode must be Flat and the white balance must be Native.

  10. 1 minute ago, Dave Del Real said:

    Been using FCPX to edit then to Resolve for color. Yeah, I did notice the highlight saturation too when using Pro. Might have to try SLog2 with Sgamut3.Cine again with the workflows described here. Totally forgot about ACES, will try that too.

    Colorizer.net is my stuff, thanks @Deadcode for mentioning it. I also have Fuji film simulation LUTs specially designed for S-Log2/S-Gamut3.cine, if you're interested. (I used an A6300 for the conversions.)

  11. 8 hours ago, BTM_Pix said:

    It comes in over wifi and 3C and the Nucleus Nano talk to the camera over BLE.

    There's nothing particularly elaborate going on currently as I'm just splitting their app and scaling the UI of mine to make it work.

    It's a bit clunkier to set up initially than I would like, but it works perfectly once it is.

    The question is whether it's worth doing the extra work to make a single app that properly integrates their feed.

    My current view is that it might not be, as it may well work well enough for most people as is (plus I've got a fair bit on at the moment), but I'm going to study it a bit closer.

    If nothing else I'll just go with a startup option on 3C to put it in reduced UI mode and a tutorial on how to set everything up so people who've got a Cineye can benefit from it if they want.

    As I say, the way I have it set up in the initial post with the Nucleus Nano wheel, it's a very nice little setup.

    Oh, I thought there was some kind of SDK so you could build your own app, maybe your own monitoring tools for the stream coming in over Wi-Fi.
