Attila Bakos

Members
  • Posts: 519
  • Joined
  • Last visited

Reputation Activity

  1. Thanks
    Attila Bakos got a reaction from Stathman in Fuji X-T3 and X-T4 discussion   
A simple LUT smoothing fixes it; it's easy to do in Matlab, but I haven't created an online version yet. It's on the to-do list.
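The smoothing idea can be sketched in a few lines. This is a hypothetical Python equivalent of the Matlab approach mentioned above, not the author's actual tool: a real .cube LUT is a size³ lattice of RGB triples, so here a tiny 1D slice of one channel stands in for the full 3D filter.

```python
# Illustrative only: box-filter a LUT channel to smooth out an abrupt
# "weird transition". A real 3D LUT would be filtered along all three axes.

def smooth(values, radius=1):
    """Average each sample with its neighbours; clamps at the edges."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - radius)
        hi = min(len(values), i + radius + 1)
        window = values[lo:hi]
        out.append(sum(window) / len(window))
    return out

# A channel with an abrupt jump at index 3:
channel = [0.0, 0.1, 0.2, 0.6, 0.4, 0.5, 0.6]
print(smooth(channel))
```

A wider radius smooths harder but also flattens intentional contrast in the LUT, so in practice the radius stays small.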
  2. Like
    Attila Bakos got a reaction from heart0less in Fuji X-T3 and X-T4 discussion   
It seems this LUT is not up to Fuji's standards; it has a few weird transitions:

  3. Like
    Attila Bakos got a reaction from Zeng in Pocket 4K to Alexa Conversion   
You are right, and you are a gentleman, because you didn't mention that for ultimate precision your LUT is still the way to go. A matrix can get you close when you're lucky, but to capture all the color tweaks of a manufacturer, you still need a 3D LUT.
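The matrix-vs-LUT distinction is easy to see in code. A 3x3 matrix is a linear map, so every colour moves by the same rule, while a 3D LUT can assign an arbitrary output to each lattice point. The matrix values below are made up for illustration, not a real camera match:

```python
# A 3x3 matrix transform: one linear rule applied to every RGB triple.
# It can shift and scale channels, but it cannot apply a tweak to just
# one region of colour space the way a 3D LUT can.

def apply_matrix(m, rgb):
    """Apply a 3x3 matrix to an RGB triple (row-vector convention)."""
    return tuple(sum(m[r][c] * rgb[c] for c in range(3)) for r in range(3))

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# Hypothetical "warming" matrix: boosts red slightly, pulls down blue.
warm = [[1.05, 0.02, 0.0], [0.0, 1.0, 0.0], [0.0, 0.03, 0.9]]

print(apply_matrix(identity, (0.5, 0.5, 0.5)))  # grey passes through
print(apply_matrix(warm, (0.5, 0.5, 0.5)))
```

Manufacturer colour science typically includes hue-dependent, non-linear adjustments, which is exactly what a single matrix cannot represent and a 3D LUT can.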
  4. Like
    Attila Bakos reacted to Juan Melara in P6K to ARRI Alexa - Resolve PowerGrade   
Yeah you're correct, but I already had a library of those charts from various sensors and stocks, so it made sense to keep adding to it. Plus it's actually quite helpful to have so many samples within the colour space, rather than just a few right at the edges.
    Thanks mate! Much appreciated.
     
Another interesting thing I found in testing is how much dynamic range the P6K actually has. According to the charts published by BMD and ARRI, when both cameras are exposed at ISO800, the Alexa has 1 stop of extra range in the highlights, but approximately the same number of stops in the shadows. In testing I found this to be pretty much spot on. I found I could expose one of the charts exactly one stop higher on the Alexa before it touched the clip point.
    The interesting thing is how that actually translates in real world scenarios, and what that difference looks like.
    Here are some crops of a scene shot on both cameras. In LogC, so the differences are visible. I matched exposure on the day using false colour, they're as close as I could get them. Probably less than 1/20th of a stop of difference in exposure.

The left image is the P6K. When comparing it to the Alexa image in the middle, that's exactly what 1 stop of difference in the highlights looks like in a real world situation.
    It's interesting to see what happens to that 1 stop difference when highlight recovery is enabled – the P6K actually retains more highlight detail.
Obviously you would never actually place important image information in that range (not that you would ever do that with the Alexa either). But if you need to create a smoother roll-off in anything neutral in colour, like specular highlights, clouds, etc., it is possible to outperform the Alexa in highlight dynamic range... sometimes.
  5. Like
    Attila Bakos got a reaction from Anaconda_ in Fuji X-T3 and X-T4 discussion   
    Try exiftool. I believe that there's a FilmMode tag that has the info you need.
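For anyone who wants to script this: exiftool prints maker-note tags as `Tag Name : Value` lines, and FilmMode is the Fujifilm film-simulation tag. The sketch below is an assumed workflow, not from the post itself, and the sample output line is illustrative; it needs exiftool on your PATH.

```python
# Sketch: read the Fujifilm FilmMode maker-note tag via exiftool.
# Requires the exiftool binary to be installed and on PATH.
import subprocess

def film_mode(path):
    """Return the FilmMode value exiftool reports for a file, or None."""
    out = subprocess.run(
        ["exiftool", "-FilmMode", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_film_mode(out)

def parse_film_mode(text):
    # exiftool prints lines like "Film Mode : F2/Fujichrome (Velvia)"
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            if key.strip() == "Film Mode":
                return value.strip()
    return None

# Illustrative output line (a Velvia-simulation clip might report this):
print(parse_film_mode("Film Mode                       : F2/Fujichrome (Velvia)"))
```

Using `exiftool -j` (JSON output) and `json.loads` would be a more robust alternative to parsing the text lines.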
  6. Haha
    Attila Bakos got a reaction from wolf33d in Fuji X-T4   
    I can see why he needs raw, wb is totally off 😄
  7. Haha
    Attila Bakos got a reaction from Katrikura in Fuji X-T4   
    I can see why he needs raw, wb is totally off 😄
  8. Haha
    Attila Bakos got a reaction from Geoff CB in Fuji X-T4   
    I can see why he needs raw, wb is totally off 😄
  9. Haha
    Attila Bakos got a reaction from heart0less in Fuji X-T4   
    I can see why he needs raw, wb is totally off 😄
  10. Haha
    Attila Bakos got a reaction from Emanuel in Fuji X-T4   
    I can see why he needs raw, wb is totally off 😄
  11. Haha
    Attila Bakos got a reaction from keessie65 in Fuji X-T4   
    I can see why he needs raw, wb is totally off 😄
  12. Haha
    Attila Bakos got a reaction from Sharathc47 in Fuji X-T4   
    I can see why he needs raw, wb is totally off 😄
  13. Thanks
    Attila Bakos got a reaction from josdr in Fuji X-T3 and X-T4 discussion   
    Since 16.1 it's possible in the studio version. It's under Workspace -> Video Clean Feed.
  14. Thanks
    Attila Bakos got a reaction from IronFilm in Fuji X-T3 and X-T4 discussion   
    Since 16.1 it's possible in the studio version. It's under Workspace -> Video Clean Feed.
  15. Like
    Attila Bakos reacted to MrSMW in Fuji X-T3 and X-T4 discussion   
    I swapped XH1 for XT3 for video in order to get:
60p 4K 10-bit internal.
    The sacrifice was the superior build and handling of the XH1 and of course IBIS.
    Just give me the spec of the XT3 with IBIS in the XT4 and my spec quest ends for at least the next 3 years.
  16. Like
    Attila Bakos got a reaction from kye in The video that shows Blackmagic Pocket 4K RAW image quality is same as GH5S 400Mbit   
Changing gamma and color space with the right-click menu on the node only works if your timeline gamma and colorspace are set according to the clip you're viewing.
So if you're viewing an F-Log clip, and your timeline is set to Fujifilm F-Log, you can right-click the node, set gamma to linear, and adjust exposure with the curves the way I described before. It will be like a node between two colorspace transform nodes, but compressed into one node. But what do you do when you have clips from multiple cameras? I like the color space transform plugin better because I can see better what's happening. And sometimes you want to see what's happening between the transformations, in our case the linearized image. When you use the one-node method you don't get to see that, you only see the output of that node.
  17. Like
    Attila Bakos got a reaction from seku in The video that shows Blackmagic Pocket 4K RAW image quality is same as GH5S 400Mbit   
Changing gamma and color space with the right-click menu on the node only works if your timeline gamma and colorspace are set according to the clip you're viewing.
So if you're viewing an F-Log clip, and your timeline is set to Fujifilm F-Log, you can right-click the node, set gamma to linear, and adjust exposure with the curves the way I described before. It will be like a node between two colorspace transform nodes, but compressed into one node. But what do you do when you have clips from multiple cameras? I like the color space transform plugin better because I can see better what's happening. And sometimes you want to see what's happening between the transformations, in our case the linearized image. When you use the one-node method you don't get to see that, you only see the output of that node.
  18. Like
    Attila Bakos got a reaction from kye in The video that shows Blackmagic Pocket 4K RAW image quality is same as GH5S 400Mbit   
    To anyone wondering about exposure controllers in Resolve:
    You can use the offset wheel in ACES, it's very close to a true exposure controller, but only in ACES.
    In normal Davinci YRGB mode you can linearize the clip using the Color Space Transform plugin, and once in linear mode you just grab the top right part of the curves and grab it along the top or right border. Because you're in linear mode this will be simple multiplication, so it's an easy way to control exposure. Then you use the same plugin to convert back to log.
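The "linearize, multiply, re-encode" idea above can be shown numerically. The log curve below is a made-up generic log2 encoding used purely for illustration; it is not F-Log or any actual Resolve transform:

```python
# Toy model of exposure adjustment on log footage: decode to linear light,
# multiply by 2**stops, re-encode. The encoding here is invented (black at
# 2**-8, white 8 stops above it), standing in for a real camera log curve.
import math

def lin_to_log(x, black=2 ** -8):
    """Toy log encoding: 0.0 at `black`, 1.0 at black * 2**8."""
    return math.log2(x / black) / 8

def log_to_lin(v, black=2 ** -8):
    """Invert the toy log encoding above."""
    return black * 2 ** (8 * v)

def expose(log_value, stops):
    """Adjust exposure: in linear light this is a plain multiplication."""
    return lin_to_log(log_to_lin(log_value) * 2 ** stops)

v = lin_to_log(0.18)   # encode middle grey
print(expose(v, 1.0))  # one stop brighter
```

Because the curve is logarithmic, a one-stop change in linear light becomes a constant offset in the encoded value, which is why the curves move in such a controlled way once the clip is linearized.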
  19. Like
    Attila Bakos got a reaction from seku in The video that shows Blackmagic Pocket 4K RAW image quality is same as GH5S 400Mbit   
    To anyone wondering about exposure controllers in Resolve:
    You can use the offset wheel in ACES, it's very close to a true exposure controller, but only in ACES.
    In normal Davinci YRGB mode you can linearize the clip using the Color Space Transform plugin, and once in linear mode you just grab the top right part of the curves and grab it along the top or right border. Because you're in linear mode this will be simple multiplication, so it's an easy way to control exposure. Then you use the same plugin to convert back to log.
  20. Like
    Attila Bakos got a reaction from KnightsFan in The video that shows Blackmagic Pocket 4K RAW image quality is same as GH5S 400Mbit   
    To anyone wondering about exposure controllers in Resolve:
    You can use the offset wheel in ACES, it's very close to a true exposure controller, but only in ACES.
    In normal Davinci YRGB mode you can linearize the clip using the Color Space Transform plugin, and once in linear mode you just grab the top right part of the curves and grab it along the top or right border. Because you're in linear mode this will be simple multiplication, so it's an easy way to control exposure. Then you use the same plugin to convert back to log.
  21. Sad
    Attila Bakos got a reaction from IronFilm in The video that shows Blackmagic Pocket 4K RAW image quality is same as GH5S 400Mbit   
That's a fact. Panasonic did not release an IDT; they say the GH5 is not compatible with ACES. Bullshit.
  22. Thanks
    Attila Bakos got a reaction from heart0less in Sony A7III SLog2   
It's the compression; I never release LUTs with such problems.
  23. Thanks
    Attila Bakos reacted to Sage in Pocket 4K to Alexa Conversion   
    Attila does great work, he has an excellent grasp of the technical; highly recommended
    That 4K 60 sounds quite good (super-sampled 10 bit)
Ironically, I've still been working away at the EC engine, and fixing copious bugs (it's inevitable that when I think a thing is finished, the work has just begun). That being said, I think today I've fully tested it, and everything is working really, really well.
    This is a GHa Tungsten LogC render from this morning; the wheels on the far left and right are completely theoretical extreme gamut, beyond what the camera can record - the ones in the center are actual log gamut (and V3 for comparison):
    V4:

    V3:

  24. Haha
    Attila Bakos got a reaction from frontfocus in Fuji X-T3 and X-T4 discussion   
Or not? https://www.fujirumors.com/fujifilm-x-t3-successor-not-to-be-announced-in-january-but-in-february-march/
  25. Like
    Attila Bakos reacted to Sage in Pocket 4K to Alexa Conversion   
Another tricky thing is the hand-off from the measured space under a given light source (sunlight or halogen) to extreme gamut simulation. This is because extreme saturation is only achievable with coloured light sources (i.e. red LEDs, etc.). This amounts to two different 'perceived' color spaces from the same sensor.
The challenge is to preserve the hue lines of the measured light source as they move into extreme gamut, while simulating the behavior of extreme gamut. Hence, extreme gamut cannot be an exact match - note the difference in cyan and yellow especially (though it can be similar enough to be perceptually equivalent).