kye
Members · 7,967 posts

Everything posted by kye

  1. Great thread BTW. Modern sterile images are incredibly boring.
  2. Here's a quick test from the Voigtlander 42.5mm F0.95 and Sirui 1.25x showing the overall quality and how it behaves at different apertures. The Voigt is sharp when stopped down, but not wide open, and the colour shifts are from the Voigt.

F2.0

F1.4

F0.95

Here's an image from the other night to give a bit of a flavour. I've sharpened it quite a bit in post.

I've shot with this combo on my current trip and really like it, but it's really heavy, so I've been thinking about alternatives for getting a similar look. I'm starting to think of this as a two-part challenge: the first part is things that can only be done optically, like the bokeh (size, shape, CA, etc), and the second part is things that can be influenced in post (especially the softness of the focal plane). In this sense, I'm looking for glass that will give me the right bokeh, and I can then degrade the image in post using softening, vignetting, distortion (barrel / pincushion), CA (of the whole image), etc.

I'm surprised at how much the bokeh swirls. Swirly bokeh is just anamorphic bokeh at the sides of the image, and the wider the aspect ratio you end up using in the final video, the more you're cropping off the top and bottom where the swirl goes from vertical to horizontal. That makes me think a very swirly spherical lens with a wide crop might be a passable alternative.

I'll be investigating my vintage fast ~50mm collection on SB when I get home. These seem the easiest way to get soft images with character without huge weight, complexity, and cost.
  3. How dirty are you looking for? My Sirui 1.25x can look pretty dirty if you put it on a softer taking lens.
  4. Not a clue, I just saw that there were some products available. A quick search reveals that the usual suspects offer them (Moment, Neewer, Smallrig, etc). Lots of them (or all of them?) will have their optics manufactured in the same factory, so they might all be similar optically. In which case, you're choosing form-factor (for example the Moment one is square and doesn't seem to have filter threads so doesn't look like it supports a vND) and perhaps the case / system that goes with that, and also price of course.
  5. Interesting idea, and it seems like it might work well for some situations. I personally always use my phone in a drop-proof case and never take it out (as that just loosens the fit of the case and lessens the protection), but the idea of a clip-on / clamp-on one seems sensible. I'd imagine we'll see a number of third-party accessories come out for the 17 over the coming months. When I got mine, only Apple cases were available for immediate delivery and the delivery dates for the rest were weeks away.

I took the GH7 >> Voigtlander 42.5mm F0.95 >> Sirui 1.25x anamorphic adapter out for a spin in the markets last night and the images were great. I'm not sure how much of that was due to the anamorphism of the lens, and how much was just due to it not being pristine, but maybe it's worth the extra fuss.
  6. The DOF adapter still seems like a crazy / borderline broken vintage lens emulation, but if that's the look you're going for then it's a good option to have. What seems much more useful though is using it as a B or C camera, fitted with an ND and shooting manually.
  7. The more I use and grade my 17 Pro, the more I think I'll work out a compact vND setup for it. The primary job for my phone is shooting ultra-fast with the default app in Prores Log, but the footage is so good that it would be great to be able to put on a vND, swap to the BM Camera app, and shoot manually with a 180° shutter.

In the default app, where it's exposing with the shutter, the only limitations I'm noticing are the short shutter speeds and the stabilisation in difficult situations (like when you're tired, hot and sweating, blood-sugar is up and down, etc). The stabilisation is actually really good at the longer focal lengths; it's the 13mm and 24mm ones that are the challenge, as you get the edges flopping around a bit.

I'm also wondering what the 1.5x anamorphic adapters might be like for this. To get a vND setup you have to get a dedicated case and a custom vND for that case (which aren't cheap), so if you're going to be spending decent money on these things anyway, then getting a case and an anamorphic lens and using a standard vND that fits the lens might not be that much more costly. I have no idea if the lenses are good or worth it, or how easily they work with the 1x / 2x camera versus the 4x / 8x camera, as the mount would have to move if you wanted to use it with the 4x / 8x camera.

I've seen a few 1.5x adapters, which would turn the 24 / 28 / 35 / 48mm normal camera into a 16 / 19 / 23 / 32mm camera. The other camera would go from being 100 / 200mm to 67 / 133mm. Perhaps the best combinations in that bunch would be the 32mm and 67mm ones, with the 23mm (or 19mm) as very wide options for specific situations.
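The focal-length arithmetic above is just division by the squeeze factor; a quick Python sketch (the focal lengths are the FF-equivalent figures quoted above, and the function name is mine):

```python
# With a 1.5x anamorphic adapter, the effective horizontal focal
# length is the taking lens divided by the squeeze factor.
SQUEEZE = 1.5

def anamorphic_equiv(focal_mm: float, squeeze: float = SQUEEZE) -> int:
    """Effective horizontal focal length (mm), rounded to the nearest mm."""
    return round(focal_mm / squeeze)

main_cameras = [24, 28, 35, 48]   # normal-camera framings (mm, FF equiv)
tele_cameras = [100, 200]         # 4x / 8x cameras

print([anamorphic_equiv(f) for f in main_cameras])  # [16, 19, 23, 32]
print([anamorphic_equiv(f) for f in tele_cameras])  # [67, 133]
```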
  8. Tianluokeng Tulou clusters, Fujian Province, China. These buildings have a very thick outer wall of earth and a 3-5 storey inner wooden structure that houses dozens of families. The structure is designed to be stable during earthquakes and secure against bandits. The oldest of the ones we visited was built in 1796.

These are just with a quick grade, mostly Resolve Film Look Creator. The DR in the scene is extreme, and while all the required info is in the files, I'm going to have to go heavy on the power-windows when I grade these properly.

Grabs from GH7 + 14-140mm zoom.

Grabs from iPhone 17 Pro shooting Prores Log with the default app. The Prores HQ Apple Log files grade really nicely, have heaps of DR, and are great to work with. The DR isn't quite as much as the GH7, but it's more than enough for these scenes. These were graded at a different time to the GH7 shots above, so they probably don't match.

All in all, the iPhone well and truly punches above its weight when you take into account its pocketability, the size of the sensor, and the incredible range of focal lengths. Imagine how much you'd have to pay for a lens that covers a 13-200mm FF-equivalent FOV with exposure between F1.78 and F2.8 across the whole range.
  9. Absolutely. Use a tripod or stabiliser when required and use an ND to shoot manually with the iPhone and you'll get top notch results with a very light setup. Great to hear people using modest tools and putting out work.
  10. Another frame grab showing a bit more DR. Same Resolve + FLC workflow, but no grain added, so what you see below is the noise and compression artefacts from the Prores.

Default exposure:

-2 stops to see where the clipping point is:

+2 stops to look at the shadow detail:

Very serviceable, especially considering this was shot from the window of a moving train (the OTHER reason why rolling shutter matters).
  11. I started this thread by talking about the GH7, but I think the iPhone 17 Pro has also come of age (for me at least). My goals for using it are to keep it in my pocket, be able to shoot super quickly using the default camera app, and focus on the compositions and capturing the events in front of the camera while it does all the auto-everything required for a good quality capture. First impressions and thoughts from a few weeks of using it:

  • 4K Prores HQ files in Apple Log 2 look great and are a joy to work with in Resolve (see examples below)
  • All the lenses seem to work well, and even the 8x 200mm is completely usable hand-held; if you lean your hand against something the 8x is almost locked-off
  • It records 6 channels of audio, and they appear to all be independent and available in the NLE (see image below), which might(?) be useful in difficult situations where there's wind noise in one channel, or one channel clips, etc.
  • While recording Prores Log, the default camera app shows you the log image and doesn't have an option to apply a LUT, so although it's a great way to be sure you're recording Log, it's hardly ideal. Hopefully they fix this in an update.

Audio channels in Resolve:

Some frame grabs from out the hotel window in HK. Bear in mind these were shot with the default camera app, through multiple layers of tinted glass, and have had a film emulation grade put on top of them.

1x 24mm camera:

8x 200mm camera:

with a bit of sharpening:

with too much sharpening (unless you're a "cinematic Youtuber"):

1x 24mm camera (ignore the reflections in the window):

8x camera:

8x camera with sharpening:

and in terms of DR / latitude, here's the 1x image brought up ~2.5 stops:

I haven't tested it in low-light yet, but to me, all this essentially means that the camera is sufficiently technically capable that I can shoot with it without feeling like the technical factors are overly restrictive.
This is incredibly impressive, given that even cameras like the GH5 needed you to pay attention to their limits in some situations. For the first time I feel like if something happened to my 'real' camera and all I had to capture a trip is my phone I wouldn't feel like I'd stuffed up. This is the first trip I haven't packed a backup camera body, which has given me the ability to pack a couple of extra lens options, thus 'upgrading' the GH7 rig as well.
  12. If no-one is able to assist here, might be worth asking in the relevant sub-forum on liftgammagain.com where the colourists all hang out. Of course, you might get an answer that's more technical than you'd like...
  13. The way I've used it in the past is via a DCTL that converts to/from OKLab. The pain is that it assumes you're in LogC3, so I have to do a CST before/after to get from/to DWG/DI, which I use as my timeline colour space. My next steps were to look at the licensing for the OKLab DCTL and work out how to go directly from DWG/DI to OKLab in my own DCTL. I'd assume I'd just need to combine the DWG->LogC3 and LogC3->OKLab conversions into one step, but I'm not sure if that's right.

The way I'd do it is to implement a function applied to the LAB variables that is stronger in the target areas than the others, or only applies in the target areas. A simple way is to say something like:

if (A < 0) { A = 0; }
if (B < 0) { B = 0; }
X = A * B;

Then use X as a mask, where the transformation is multiplied by X, so if X=1 the full effect is applied and if X=0 no effect is applied. What that will do is not affect the three quarters of the colour circle where either A or B is negative, and for the quarter where they're both positive it will start at zero on the edges and then ramp up. This ensures nothing in the image breaks, because every pair of adjacent colours before the operation remains adjacent afterwards.

That will create sudden transitions in the mask though, so you could (for example) square the mask before using it, so values of X that are very low become vanishingly low and the surface created is a smooth curve. A bit more sophistication can make the transitions at the top (near 1) smooth too.

The above is a transform that targets the A=B vector, but if you calculate Hue and Sat from the A and B values, then you can start to combine all sorts of values together, like targeting a hue within a certain range and a saturation within a certain range, then squaring that to target a location on the A-B plane.
You can localise the intensity further by cubing X, or better yet, applying a variable factor which shifts X closer to 1 before it gets squared or cubed, so you can adjust it to the strength you want. While developing these things I'd map the variables to sliders in the plugin so you can tune them, then hard-code the values once you find good ones.

People have been calling for many years to be able to change the colour space of lots of controls, the way you can with tools like the Colour Warper in the spiderweb view. Grading would be so much easier if you could just set the colour space of the Saturation knob to OKLab and be done with it.
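The masking approach described above can be sketched in Python (the function names, and the gain and shape defaults, are illustrative choices of mine, not from any actual DCTL):

```python
# Sketch of the A/B-plane mask described above, assuming OKLab-style
# a/b components with small magnitudes (roughly within [-0.4, 0.4]).
def quadrant_mask(a: float, b: float, gain: float = 25.0, shape: float = 2.0) -> float:
    """Mask that is zero wherever a or b is negative, and ramps up
    smoothly inside the positive quadrant. `gain` lifts X towards 1
    before shaping (the 'variable factor' above); `shape` >= 1 is the
    squaring/cubing that localises the effect."""
    a = max(a, 0.0)
    b = max(b, 0.0)
    x = min(a * b * gain, 1.0)
    return x ** shape

def masked_adjust(value: float, adjustment: float, a: float, b: float) -> float:
    """Blend an adjustment by the mask: full effect where the mask is 1,
    no effect where it is 0, so adjacent colours stay adjacent."""
    return value + adjustment * quadrant_mask(a, b)

print(quadrant_mask(-0.1, 0.2))    # 0.0  (outside the target quadrant)
print(quadrant_mask(0.2, 0.2))     # 1.0  (deep in the target quadrant)
print(quadrant_mask(0.05, 0.05))   # small: near the edges of the quadrant
```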
  14. I'm not really sure what you're saying. I pulled the clip into Resolve and brightened it by the number of stops I specified. If it wasn't actually darker, then it would have become too bright.
  15. I used to edit projects on the train to/from work using a base-model Intel Macbook Air and an external SSD, with proxies rendered to 720p Prores Proxy format. This was in Resolve 12.5, so a long time ago. They edited like butter. Rendering the proxies was a pain, and swapping between the proxies and full-resolution files was complicated and took a bit of work to get sorted out, but it made editing possible on very modest equipment. Now, with Apple Silicon, you can edit almost anything without needing proxies.
  16. We've actually had quite a bit of flexibility to colour grade our smartphone images for basically a decade now, but the issue is that no-one shooting with their phone could be bothered to get off their asses and actually learn how Colour Management worked.
  17. Some initial testing using the default Apple Camera app. I did some bitrate stress tests and got bitrates between 550-700Mbps, so it looks like the default camera app is just using Prores HQ. The bitrates for the stress-test and for static scenes were all within this range.

Latitude tests using the native Apple camera app, the 1x camera, and 4K Prores Log mode. To test, I started recording, tapped and held in the middle of the image, then dragged down all the way to the bottom to progressively under-expose. Then in post I found the frames that aligned nicely to -1, -2, and -3 stops of exposure.

Reference image - this is how the app decided to expose and WB.

These are the fully corrected images (-1 stop, -2 stops, and -3 stops):

These are the images without any corrections, so you can see the relative exposures:

and these are the images that have had exposure corrected but not WB corrected:

My impression is that these grade beautifully in post. I've graded a variety of codecs (everything from 8-bit rec709 footage to "HLG" footage to GH7 V-Log to downloaded files from RED / ARRI / Sony) and these felt completely neutral, responding just how you'd expect without any colour shifts or strange tints or anything else. I've seen enough tests from the iPhone 15 and 16 to know that Apple are doing all kinds of crazy HDR shenanigans, but realistically this footage seems really workable. If I can pull images back from -3 stops to be basically identical, then whatever colour grading I'll have to do on a semi-reasonably exposed image will be just fine.
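The stop arithmetic behind the latitude test is just powers of two: each stop halves the exposure, so recovering an n-stop under-exposure is a 2^n linear gain (function name is mine, for illustration):

```python
# Linear gain needed to recover an image under-exposed by `stops` stops.
def gain_for_stops(stops: float) -> float:
    """Each stop is a factor of two of exposure."""
    return 2.0 ** stops

print([gain_for_stops(s) for s in (1, 2, 3)])  # [2.0, 4.0, 8.0]
```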
  18. Well, that was fun. New phone has a new colour space, which requires updating Resolve, which wouldn't install and required me to update MacOS. Recording and then viewing the first clip took a lot longer than anticipated. Here are some initial observations as I find my bearings.

Resolve 20.2 only has Apple Log 2 in the Colour Space setting, and only has Apple Log in the Gamma setting. If we assume that BM and Apple both know what they're doing, that means Apple Log 2 gets a new colour space but keeps the same gamma curve, and iPhone 15 and 16 users get an upgrade to Apple Log 2.

In the default camera app you only get Log and HDR (whatever that is) in Prores mode. I'm not entirely sure which Prores flavour it is, but I'd guess HQ, because the fine print in the settings to enable it says 1 minute of 4K 30p is 6GB, which is 800Mbps (and would be 640Mbps for 24p). That's slightly lower than the ~700Mbps typically stated for 4K 24p Prores HQ, but definitely higher than the 471Mbps typically stated for 4K 24p Prores 422.

If you select Prores and Log, the image displayed is the log image.. no display conversion for you!

The BM camera app gives tonnes more options. I'll have to investigate that further, but last time I looked it seemed to only offer manual shooting, rather than the auto-everything that I need when shooting fast.
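The bitrate estimate above is quick arithmetic, assuming decimal gigabytes (10^9 bytes) as storage vendors and Apple's fine print typically use:

```python
# Convert a "GB per minute" figure from the camera settings into Mbps.
def mbps(gb_per_minute: float) -> float:
    """GB/min -> megabits per second (1 GB = 8000 megabits)."""
    return gb_per_minute * 8 * 1000 / 60

rate_30p = mbps(6)              # 6 GB/min at 4K 30p -> 800.0 Mbps
rate_24p = rate_30p * 24 / 30   # scaled to 24p -> 640.0 Mbps
print(rate_30p, rate_24p)
```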
  19. Interesting, and thanks for sharing. I hadn't heard that RAW was external-only; I guess every new camera has its 'gotchas' in terms of combinations of features that do/don't work together. Mine arrived yesterday (Friday), which was a nice surprise, as when I preordered it they predicted it wouldn't be delivered until Monday. I got all the notifications etc, so it didn't just rock up with no warning though 🙂 Odd that the sensor stabilisation isn't supported in RAW. Is it IBIS or is it EIS? If it's IBIS then I have no idea why it wouldn't be supported.
  20. 100%.. definitely what's happening. I heard it was a flop. One reason I didn't mind having a smaller phone is that I don't really do a lot of stuff on it, as I prefer using my laptop. If I used my phone as my main device I'd want the largest thing I could practically carry around. I'll be curious to see how it goes.

On my last trip I bought a Pop Socket, which I found worked really well. This is what it looks like if you're not familiar; it means you can hold the phone really securely, and it folds almost flat. I put it towards the bottom and on one side so I could pull the phone out of my pocket, open the camera app from the Home Screen, take some video, lock the phone, and put it back in my pocket again, all with one hand. The flaw was that because the socket wasn't in the middle of the phone, the weight of the phone would rotate it when walking. I got another one for the 17 and I'll put it at the centre of mass of the phone, so there's no rotation induced from unbalanced motion when I walk. I don't know what that means for how usable it will be one-handed.

I guess sacrifices have to be made to go from h265 HDR to Prores with Apple Log.
  21. I also really like the Mini size and form factor, and TBH it frustrates me quite significantly that they don't put the good cameras in the smaller models. It's like forcing you to buy a truck just to get power-steering. I also agree that the square selfie-camera is a cool addition. I'll be curious to see how wide it is, my recollection of previous versions was that they were a bit long of a focal length for what I wanted to shoot (me in a cool location) so I'd normally just use the wide-angle camera and turn it around and shoot blind.. hardly ideal. The holy grail for these cameras is to look the same as a nice camera with a stopped down aperture. I'd suggest that those iPhone 15 Apple Log vs Alexa videos showed that it was getting pretty close, although now with RAW it will be interesting to see how close it really is. When the first Android RAW samples came out there was something about the footage that still looked like a phone but at the time I couldn't work out what it was. There will be other things that might give the game away, but these will probably be how the camera is used rather than some innate property of the image itself. I'll be putting mine through its paces once I get back from my trip and upgrade to Resolve 20.2 to get the RAW support. I might have to do some side-by-side comparisons too.
  22. Just be sure that the apps you'll use will all still fit on the internal drive. Apps are getting more and more "featured" and bloated, or they do things like cache data and store it locally for performance. You might also benefit from some wiggle room, depending on how you're going to use it and how long you'll keep it. My iPhone 12 Mini from 2020 is still going strong, and if I didn't use the cameras so much I wouldn't feel the need to upgrade, so you might end up keeping it for quite some time.

Yeah, I've seen tests of this configuration and they looked pretty good. I suspect I'll end up just using the default camera app for speed of use, and I don't think it allows this combination (Log and h265), so I'll have to see what it allows me to do. Ideally I'd go for 4K and Prores 422, which is around 500Mbps, so large but still manageable.

I think there are a few very specific situations where you'd want to get the dock and use a phone rather than a proper camera with a (much) larger sensor, but not that many. I'm in chats with a lot of professional cinematographers who shoot a variety of client projects, and the consensus seems to be that when the client wants something to look like it was shot on a phone, it's far, far easier to just shoot it on a phone than to try and get that specific look in post. Of course, while shooting this "look" on a professional set you'd still want all the functionality like remote director's monitors and timecode sync for audio, so it's a very niche product, but if you're doing that then it can save an incredible amount of hassle. Besides, now you can get your phone, add the external box, get a proper battery setup, and add a HUGE monitor.. no more being forced to shoot video on your iPad!!
  23. Nice. I'm getting the Pro one to replace my 12 Mini, so it should be quite the upgrade. When I'm on trips I use the big camera for shooting the surroundings and environment and my phone for quickly shooting the people I know as we're out and about doing things. Ironically that means that it's my most important camera and the other stuff is my secondary setup. I'll be curious to see what the lower-bitrate files are like, as shooting at full res gives very little recording time.