
kye

Members
  • Content Count: 4,275
  • Joined
  • Last visited

Everything posted by kye

  1. Sounds like you know what you want. I'm no expert on what the alternatives to the Irix are, but I know that when you're going as wide as this there aren't that many options to choose from.
  2. That link is banned, and also not useful because it links to the original source, which has since been taken down. Luckily, the internet never forgets:
     https://web.archive.org/web/20180407123236if_/http://juanmelara.com.au/blog/re-creating-the-600-brim-linny-lut-in-resolve
     https://static1.squarespace.com/static/580ef820e4fcb54017eba692/t/5a560498ec212d5f4a972d25/1515589920801/The_Brim_Linny_Rebuild.zip
     https://web.archive.org/web/20190227062207/http://juanmelara.com.au/s/The_Brim_Linny_Rebuild.zip
  3. What focal length do you actually need? ie, what precise focal length, rather than just 'wider than 18mm'. I'm a huge fan of super-wide-angle lenses, and my second most used lens is a 15mm FF equivalent, but you want to hit the sweet spot. If you go wider than your sweet spot then you will either crop and throw away pixels, or you will get closer and get more wide-angle distortion. If you go narrower than your sweet spot then you will have to step further back to get the framing you want, and if that's not possible then you won't get everything in frame that you want to. When I
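Since "FF equivalent" does the work in the post above, here is a quick sketch of the crop-factor arithmetic for translating a focal length between sensor sizes; the crop factors are the usual nominal values, so treat them as assumptions and check your camera's actual sensor dimensions.

```python
# Nominal crop factors (assumed round numbers for illustration).
CROP = {"FF": 1.0, "APS-C": 1.5, "MFT": 2.0}

def ff_equivalent(focal_mm: float, sensor: str) -> float:
    """Full-frame-equivalent focal length of a lens on the given sensor."""
    return focal_mm * CROP[sensor]

def native_for_ff_equiv(ff_equiv_mm: float, sensor: str) -> float:
    """Native focal length needed on `sensor` to match a FF field of view."""
    return ff_equiv_mm / CROP[sensor]

print(ff_equivalent(7.5, "MFT"))         # 15.0 - a 7.5mm MFT lens frames like 15mm FF
print(native_for_ff_equiv(18, "APS-C"))  # 12.0 - ~12mm on APS-C matches 18mm FF
```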
  4. What focal lengths and sensor size are you looking for? And what 'look' do you want, ie, the modern look with lots of contrast and sharpness, or a softer rendering?
  5. No, no makeup team for me! The people I'm filming have much more variability - more like in this image: That's also a useful image for testing LUTs and seeing what they do to skintones, BTW. Nice looking images. The DB certainly had a cult following. Interesting. There were also some great sample shots from the UMP with skintones; I should look through them for good examples. I agree that thickness can happen with low and high key images, and with saturated and not-so-saturated images too. A few things keep coming up, and I think I'm starting to fit
  6. I'd suggest using multiple layers of stabilisation. As has been said before, gimbals are great but lose the horizon over time and through corners, and the super-duper-stabilisation modes on the GoPro and Osmo Action etc will also lose the horizon if kept off level for some time (which is inevitable considering that roads are sloped to drain water away and corner nicely). Due to these factors, I'd suggest a very solid mount (to eliminate wind buffeting) combined with a very short shutter speed (to eliminate exposure blurs and RS wobbles from bumps) combined with in-camera floaty-smooth
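The "very short shutter speed" advice is easy to put numbers on. A minimal sketch of the standard shutter-angle arithmetic; the frame rates and angles in the example are just illustrative:

```python
def shutter_speed(fps: float, angle_deg: float) -> float:
    """Exposure time in seconds for a given frame rate and shutter angle."""
    return angle_deg / (360.0 * fps)

# 25 fps at a 'normal' 180 degrees gives 1/50s; dropping to 45 degrees gives
# 1/200s, which freezes bumps and RS wobble at the cost of choppier motion.
print(shutter_speed(25, 180))  # 0.02  (1/50)
print(shutter_speed(25, 45))   # 0.005 (1/200)
```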
  7. My images were all GH5, with the second one using the SLR Magic 8mm f4, and the others using the Voigtlander 17.5mm f0.95. I don't think that anyone would suggest the GH5 was designed for low light, and the boat image is actually significantly brighter than it was in real life. The floodlights in the background were the only lighting and we were perhaps 75-100m offshore, so the actual light levels were very low. I was gobsmacked at how good the images turned out considering the situation, but they're definitely not the controlled lighting situation that they're being compared to. The s
  8. Indeed. Reminds me of this article: https://www.provideocoalition.com/film-look-two/ Art Adams again. Long story short, film desaturates both the highlights and the shadows, because on negative film the shadows are the highlights! (pretty sure that's the right way around..) I definitely think it's in the processing. I view it as three factors: 1) things that simply aren't captured by a cheaper camera (eg, clipping); 2) things that are captured and can be used in post without degrading the image below a certain threshold (ie, what image standards you
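For the "film desaturates both the highlights and the shadows" idea, here is a minimal sketch of one way to express it in code. The Rec.709 luma weights are standard, but the parabolic falloff and its strength are illustrative choices, not a film model:

```python
import numpy as np

def filmic_desaturate(rgb: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Desaturate shadows and highlights, leaving mid-tones mostly alone.

    rgb: floats in [0, 1], shape (..., 3).
    """
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])   # Rec.709 luma
    # Saturation scale: 1.0 at mid-grey, falling off towards black and white.
    sat = 1.0 - strength * (2.0 * luma - 1.0) ** 2
    return luma[..., None] + sat[..., None] * (rgb - luma[..., None])
```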
  9. I'm still working through it, but I would imagine there is an infinite variety. Certainly, looking at film emulations, some are quite different to others in what they do to the vectorscope and waveform. What do you mean by this? OK, one last attempt. Here is a LUT stress test image from truecolour. It shows smooth gradations across the full colour space and is useful for seeing if there are any artefacts likely to be caused by a LUT or grade. This is it taken into Resolve and exported out without any effects applied. This is the LUT image with my plugin
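Not the truecolour chart itself, but a sketch of the same idea: generate smooth ramps across hue and brightness, push them through a LUT or grade, and look for banding or hue twists in what should remain smooth. The chart layout and size here are arbitrary choices, and the loop is slow but clear:

```python
import numpy as np
import colorsys  # stdlib HSV-to-RGB, fine for a test chart

W, H = 512, 256
hue = np.linspace(0.0, 1.0, W)   # hue sweeps left to right
val = np.linspace(0.0, 1.0, H)   # brightness sweeps top to bottom
img = np.empty((H, W, 3), dtype=np.float32)
for y in range(H):
    for x in range(W):
        img[y, x] = colorsys.hsv_to_rgb(hue[x], 1.0, val[y])
# Run `img` through the LUT under test and compare against the original.
```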
  10. Read the book. The title isn't an accident.
  11. and some time after that maybe they'll be able to take my photos of 2-3+ hours of vigorous whiteboard discussions and actually make people work together better! Now that would be the magic of cinema.... Everything has pros and cons. Typically the people who think something is bad are either unaware of what was gained, or don't value it as highly as the things that were lost. Most often things were better in the good old days, but what that doesn't take into account is that the good old days for the 60-year-old were when they were in their early 20s and was what the 80-year-old was
  12. Ok, now I understand what you were saying. When you said "a digital monitor 'adds' adjacent pixels" I thought you were talking about pixels in the image somehow being blended together, rather than just that monitors are arrays of R, G, and B lights. One of the upshots of subtractive vs additive colour is that with subtractive colour, saturation peaks at a lower luminance than it does in an additive model. To compensate for that, colourists and colour scientists and LUT creators often darken saturated colours. This is one of the things I said that
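A rough sketch of the "darken saturated colours" compensation described above; the HSV-style saturation measure and the linear scaling are assumptions for illustration, not anyone's actual LUT math:

```python
import numpy as np

def darken_saturated(rgb: np.ndarray, amount: float = 0.3) -> np.ndarray:
    """Pull brightness down in proportion to saturation.

    A crude nod to subtractive colour: more 'dye' (saturation) absorbs
    more light, so saturated colours come out darker. rgb in [0, 1].
    """
    mx = rgb.max(axis=-1, keepdims=True)
    mn = rgb.min(axis=-1, keepdims=True)
    sat = (mx - mn) / np.maximum(mx, 1e-8)   # HSV-style saturation
    return rgb * (1.0 - amount * sat)
```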
  13. @Andrew Reid considering how good these things are getting, is it now possible to start ranking phones against cheaper cameras? All the discussion is phones vs phones or cameras vs cameras, but I'm curious when we can start comparing them directly. Or maybe it's obvious and I'm out of the loop because I'm still using my iPhone 8.. It will be a great day when the image quality of a consumer device mimics Zeiss Master Primes. Cameras are tools, and currently the best tools are inaccessible to the vast majority of people who would like to create with them. Democratisation of i
  14. +1 for a normal picture profile, like @TomTheDP says. This will mean controlling your lighting and DR in the scene and white-balancing meticulously. It would also help the colourist immensely if you shot a colour chart, preferably in every setup if you can. 8-bit is fine if it's starting in a 709-like space, and the A73 isn't that bad when you consider that feature films have mixed GoPro footage in with cinema cameras!
  15. Perhaps these might provide some background to subtractive vs additive colour science:
      https://www.dvinfo.net/article/production/camgear/what-alexa-and-watercolors-have-in-common.html
      https://www.dvinfo.net/article/post/making-the-sony-f55-look-filmic-with-resolve-9.html
      Well, I would have, but I was at work. I will post them now, and maybe we can all relax a little. No bit-crunch: 4.5 bits: 4.0 bits: In terms of your analysis vs mine, my screenshots are all taken prior to the image being compressed to 8-bit jpg, whereas yours was taken after it
  16. It does hark back to Deezid's point. Lots more aspects to investigate here yet. Interesting about the saturation of shadows - my impression was that film desaturated both shadows and highlights compared to digital, but maybe when people desaturate digital shadows and highlights they always do it overzealously? We absolutely want our images to be better than reality - the image of the guy in the car doesn't look like reality at all! One of the things that I see making an image 'cinematic' vs realistic is resolution, and specifically the lack of it. If you're shooting with a com
  17. I've compared 14-bit vs 12-bit vs 10-bit RAW using ML, and based on the results of my tests I don't feel compelled to even watch a YT video comparing them, let alone do one for myself, even if I had the cameras just sitting there waiting for it. Have you played with various bit-depths? 12-bit and 14-bit are so similar that it takes some pretty hard pixel-peeping to be able to tell a difference. There is one, of course, but it's so far into diminishing returns that the ROI line is practically horizontal, unless you were doing some spectacularly vicious processing in post. I have not
  18. It might be, that's interesting. I'm still working on the logic of subtractive vs additive colour and I'm not quite there enough to replicate it in post. Agreed. In my bit-depth reductions I added grain to introduce noise to get the effects of dithering: "Dither is an intentionally applied form of noise used to randomize quantization error, preventing large-scale patterns such as color banding in images. Dither is routinely used in processing of both digital audio and video data, and is often one of the last stages of mastering audio to a CD." Thickness of an image might ha
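A minimal sketch of the grain-then-round process described in the post above: noise before quantisation acts as dither. The fractional bit depths match the 5.5-bit rounding mentioned elsewhere in this thread; the uniform noise amplitude of one quantisation step is a common choice but an assumption here:

```python
import numpy as np

def quantise(img: np.ndarray, bits: float, dither: bool = True) -> np.ndarray:
    """Round an image (floats in [0, 1]) to the given bit depth.

    Fractional depths like 5.5 just mean 2**bits levels. Adding noise
    first randomises the quantisation error, trading banding for grain.
    """
    levels = 2.0 ** bits - 1.0
    if dither:
        img = img + np.random.uniform(-0.5, 0.5, img.shape) / levels
    return np.clip(np.round(img * levels) / levels, 0.0, 1.0)
```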
  19. The question we're trying to work out here is what aspects of an image make up this subjective thing referred to by some as 'thickness'. We know that high-end cinema cameras typically have really thick looking images, and that cheap cameras typically do not (although there are exceptions). Therefore this quality of thickness is related to something that differs between these two scenarios. Images from cheap cameras typically have a range of attributes in common, such as 8-bit, 420, highly compressed, cheaper lenses, less attention paid to lighting, and a range of other things. Howe
  20. If I'm testing the resolution of a camera mode then I typically shoot something almost stationary, open the aperture right up, focus, then stop down at least 3 stops, normally 4, to get to the sweet spot of whatever lens I'm using, and to make sure that if I move slightly the focal plane is still deep enough. Doing that with dogs might be a challenge!
  21. Actually, the fact that an image can be reduced to 5 bits and not be visibly ruined means that the bits aren't as important as we all seem to think. A bit depth of 5 bits is equivalent to taking an 8-bit image and only using 1/8th of the DR, then expanding that out. Or shooting a 10-bit image, exposing using only 1/32 of that DR, and expanding that out. Obviously that's not something I'd recommend, and I did apply a lot of noise before doing the bit-depth reduction, but the idea that image thickness is related to bit-depth seems to be disproven. I'm now re-thin
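The level-count arithmetic behind that equivalence, spelled out:

```python
# 5 bits gives 32 levels; an 8-bit image restricted to 1/8th of its range
# also spans 256/8 = 32 levels, and a 10-bit image restricted to 1/32 of
# its range spans 1024/32 = 32 levels.
print(2 ** 5)         # 32
print(2 ** 8 // 8)    # 32
print(2 ** 10 // 32)  # 32
```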
  22. Which do you agree with - that film has poor DR or that Canon DSLRs do? I suspect you're talking about film, and this is something I learned about quite recently. In Colour and Mastering for Digital Cinema, Glenn Kennel shows density graphs for both negative and print films. The negative film graphs show the 2% black, 18% grey and 90% white points all along the linear segment of the graph, with huge amounts of leeway above the 90% white. He says "The latitude of a typical motion picture negative film is 3.0 log exposure, or a scene contrast of 1000 to 1. This corresponds
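The quoted figure converts to stops with a change of log base, from log10 exposure to log2:

```python
import math

contrast = 10 ** 3.0                   # 3.0 log10 exposure -> 1000:1
print(contrast)                        # 1000.0
print(round(math.log2(contrast), 2))   # 9.97 -> roughly 10 stops
```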
  23. That's what I thought - that there wasn't much visible difference between them and so no-one commented. What's interesting is that I crunched some of those images absolutely brutally. The source images were (clockwise starting top left) 5K 10-bit 420 HLG, 4K 10-bit 422 Cine-D, and 2x RAW 1080p images, which were then graded, put onto a 1080p timeline, and then degraded as below:
      • First image had film grain applied, then JPG export
      • Second had film grain, then RGB values rounded to 5.5-bit depth
      • Third had film grain, then RGB values rounded to 5.0-bit depth
      • Fourth ha
  24. Or how about these... are these super thick?
  25. No thoughts on how these images compare? It's not a trick..