All Activity
- Today
-
how to start security agen joined the community
-
billdoubleu reacted to a post in a topic: Optimising resolution & sharpness in post
-
foliovision reacted to a post in a topic: Optimising resolution & sharpness in post
-
https://blog.kasson.com/the-last-word/how-fast-is-the-gfx-100-ii-electronic-shutter/ Readout speed for the GFX 100 II is the same as the 100 and 100S??
-
8K ProRes Flog2 footage download:
-
kye reacted to a post in a topic: Coming Back to It.
-
Clearly. There is a special handshake and everything and a solemn promise that what happens on YouTube, stays on YouTube. They give it all up to get their $2 ad revenue return… Just 10 years ago, the common folks paid these rawk stars a small fortune to go on one of their secret revealing workshops. They have since mostly become charities for the short remaining time they have in the industry.
-
I've developed a more sophisticated "false sharpness" powergrade, but it was super tricky to get it to be sensitive enough to tell the difference between soft and sharp lenses (when no sharpening has been applied).

Here are some test shots of a resolution test pattern through two lenses - the Master Anamorphics, which are gloriously sharp, and the Hawk Vintage '74 lenses, which are modern versions of a vintage anamorphic.

ARRI ungraded with the false colour power-grade: Note that I've added a sine-wave along the very bottom that gets smaller towards the right, and acts as a scale to show what the false sharpness grade does.

Here's the Hawk:

and the Zeiss one with a couple of blur nodes to try and match the Hawk:

Here's the same three again but without the false sharpness powergrade.

Zeiss ungraded:

Hawk ungraded:

Zeiss graded to match the Hawk (I also added some lens distortion):

Interestingly, I had to add two different sized blurs at different opacities - a single one was either wrong with the fine detail or wrong on the larger details. The combination of two blurs was better, but still not great. I was wondering if a single blur would replicate the right shape for how various optical systems attenuate detail, and it seems that it doesn't.

This is why I was sort of wanting a more sophisticated analysis tool, but I haven't found one yet, and TBH this is probably a whole world unto itself, and also, it's probably too detailed to matter if I'm just trying to cure the chronic digitalis of the iPhone and other digital cameras.

....and just for fun, here's the same iPhone shot from previously with the power-grade:

If I apply the same blurs that I used to match the Zeiss to the Hawk, I get these:

It's far too "dreamy" a look for my own work, but the Hawk lenses are pretty soft and diffused:
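Purely as an illustration of that two-blur idea - not the actual powergrade, just a minimal sketch assuming Python with OpenCV, with made-up blur sizes and opacities:

```python
# Sketch of mixing a fine and a coarse Gaussian blur at different opacities
# to approximate how a soft lens rolls off detail. The sigma/mix values are
# hypothetical starting points; in practice they were tuned by eye against
# the resolution chart.
import cv2
import numpy as np

def soften(img, sigma_fine=1.0, sigma_coarse=4.0, mix_fine=0.4, mix_coarse=0.2):
    """Blend the original image with two differently sized Gaussian blurs."""
    img = img.astype(np.float32)
    fine = cv2.GaussianBlur(img, (0, 0), sigma_fine)      # attenuates fine detail
    coarse = cv2.GaussianBlur(img, (0, 0), sigma_coarse)  # attenuates larger detail
    keep = 1.0 - mix_fine - mix_coarse
    out = keep * img + mix_fine * fine + mix_coarse * coarse
    return np.clip(out, 0, 255).astype(np.uint8)

sharp = cv2.imread("zeiss_chart.png")  # hypothetical test-chart frame
cv2.imwrite("zeiss_matched_to_hawk.png", soften(sharp))
```

A single blur only gets one of the two scales right, which is exactly why the blend of two is needed here - though as noted above, even that isn't a perfect model of how real optics attenuate detail.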
-
LOL I've watched dozens of hours of "how to edit a wedding video" tutorials. They're very similar to my work in many ways, where footage is likely to be patchy with random technical and practical issues to solve and the target vibe is the same - happy fond memories. BUT... I've never shot a wedding video, so I haven't taken the oaths to keep all your secrets!!
-
Milo NoelClark joined the community
-
Emanuel reacted to a post in a topic: Coming Back to It.
-
Stop giving the world our secrets huh?! But that’s kind of the sum of it actually…
-
Emanuel reacted to a post in a topic: iPhone 15 Camera Update - Released
-
Learning to not give a damn... LOL Nice pick! @kye ;- )
-
I'm not sure what you're seeing, but there seem to be two things required.

The first is to get Resolve to not automatically do anything to the footage. IIRC you can do this by going to the clips in the Media page and right-clicking on them - there's an option called something like Bypass Colour Management. That should tell Resolve not to do anything automatically based on the metadata in the clip.

The second is the conversion itself, which should just be a CST from the source space to the destination one. IIRC the video suggested it was Rec.2020 / Rec.2100 HLG, so you should be able to do a CST from that to whatever LOG format you want to work in.

Keep in mind that you might want to do the CST at the start into a common working colour space for all your media and cameras, so that any grades or presets you create will work the same on footage from any camera. I use DI/DWG (DaVinci Intermediate / DaVinci Wide Gamut) for this purpose. Then if you have a LUT that wants a specific colour space, you just do a CST from DI/DWG to that log space, put the LUT after it, and you should be good.

For example, the iPhone shots above had the following pipeline:
1. Convert to DI/DWG - I manually adjust the clips to 709 with a few adjustments and then use a CST from 709/2.4 to DI/DWG
2. All my default nodes etc. are in DI/DWG
3. CST from DI/DWG to LogC/709
4. Resolve Film Look LUTs (mostly the Kodak 2383 one)
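As a rough illustration of what the transfer-function half of that HLG CST is doing under the hood - a sketch only, using the standard BT.2100 inverse OETF; the real CST also converts the Rec.2020 gamut to the working gamut with a matrix, and that part is omitted here:

```python
# Inverse HLG OETF from ITU-R BT.2100: HLG signal in [0, 1] -> scene-linear light
# (scene-referred, ignoring the display OOTF). Once the footage is scene-linear,
# any log encoding (DaVinci Intermediate, LogC, etc.) can be applied on top,
# which is effectively what chaining CSTs in Resolve does for you.
import numpy as np

A = 0.17883277
B = 1.0 - 4.0 * A              # 0.28466892
C = 0.5 - A * np.log(4.0 * A)  # 0.55991073

def hlg_to_scene_linear(e_prime):
    e_prime = np.asarray(e_prime, dtype=np.float64)
    low = (e_prime ** 2) / 3.0
    high = (np.exp((e_prime - C) / A) + B) / 12.0
    return np.where(e_prime <= 0.5, low, high)

print(hlg_to_scene_linear([0.0, 0.5, 1.0]))  # -> [0.0, ~0.0833, 1.0]
```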
-
The format of this video is a pretty common one, I think. My understanding of this style is this:
1. Go out and do something, film what you can
2. Review the footage and "find the story"
3. Write and record a "piece to camera" (PTC) shot (https://en.wikipedia.org/wiki/Piece_to_camera)
4. Edit the PTC into a coherent story, focusing on the audio
5. Put the shots you recorded from (1) over the top of the PTC to hide your cuts
6. If there are still gaps in the edit or it still doesn't work, record another PTC in front of the editing desk that explains or clarifies, and put that into the edit

I see these videos often, including the snippets from the person in the edit. Sometimes they have recorded a PTC so many times that the whole video is just a patchwork of clips from different times and locations and you're not even sure how it was shot anymore. Casey Neistat used to film his videos where each sentence, or even every few words, was recorded in a different location, so during the course of a sentence or two he'd have left his office, gone shopping, and returned home.

Here's a video I saw recently that has this find-it-in-the-edit format:

The above is an example of a video that was very challenging to make, which is why it required such a chaotic process, but it shows that if you are skilled enough in the edit you can pull almost anything together. Also, go subscribe to her channel - she's usually much more collected than in the above video! 🙂

Wedding videos often follow a similar pattern in the edit:
1. Find one or two nice things that got recorded (normally a speech from the reception, or perhaps the bride and groom wrote each other letters and read them out loud)
2. Edit these into a coherent audio edit (you literally ignore the visuals and edit for audio only)
3. Put a montage of great shots from the day over the top, showing just enough footage of the audio source so you know who is speaking
4. Put music in the background and in any gaps
5. Done!

I'd also suggest that when you say most other people film vlog-style with a phone and you want to take it up a notch, try to do that just by filming with your camera on a tripod, but otherwise copy their format at first. Innovation is an iterative process, and the way they shoot and edit their videos likely has a number of hidden reasons why things are done that way. Start by replicating their process (with a real camera on a tripod), see how that goes, and see what you can improve after you've made a few of them and gotten a feel for it.

The priority is the content and actually uploading, right? So focus on getting the videos out and then improve them once you get going. It's always tempting to think you can look in from the sidelines and improve things, but until you've actually done something you don't understand it. Real innovation comes from having a deep understanding of the process and solving problems, or approaching it in a different way.
-
I am really appreciating these replies! Something I have noticed in most of the videos being made about this same subject is that they are done in very much a vlog or handheld talking format and are often shot on phones. My aim is to do a more serious-"looking" doc-style shoot. I really like how in some BBC space documentaries they have an expert or scientist doing a talking head about their entire process, and then it intercuts with the actual "doing" of the project. I am not a fan of this guy's personality, but this is almost the exact presentation I'm trying to emulate.
- Yesterday
-
Greetings Everyone, New seamless texture images are ready on this page on my site: BRICK - ARTISTIC - (Tile-able) As always, you can access them, as well as my thousands of other images, from here: https://soundimage.org/images-home-page/ Enjoy and keep being creative! 🙂
-
I just watched this video comparing the G9 to G9II autofocus. Reading so many people in the comments saying the new AF is fantastic makes me wonder whether they should just be using 1080p cameras. I downloaded the video to zoom in a bit, and while the G9II focuses faster than the G9, accuracy is much worse. During the comparison with the 35-100mm, most of the time the focus is not on his face but on his sweater (like at 2:26 or 3:36), leaving his face with even less detail than a 1080p camera would. I saw the same issue in another G9II video using the 12-35mm, 35-100mm and 10-25mm. What's the point of a 4K camera with faster AF if the accuracy isn't good? I wanted to pre-order the camera but I will wait to see if Panasonic can solve the issue. I had too many issues with my GH6 using Open Gate (much more pulsing than my GH5) or when using frame rates <48fps with most of my lenses (defocusing issues), and Panasonic never corrected them.
-
I am kinda trying to do the opposite: I want it to display as LOG, not as HDR with tone mapping. It does look washed out in terms of colour space, but not in terms of gamma. Remember the Panasonic S1, which had Hybrid Log Gamma but no V-Log? You could simply grade the Hybrid Log Gamma as if it were C-Log or something. Now you can't, because all the NLEs pick up the metadata and handle it as HDR at 1000 nits! There must be a way to handle it as LOG instead?
-
I've seen this recommended elsewhere online. Personally I just shot a colour chart with the phone and made a curve to straighten out the greyscale patches, plus a bit of hue vs hue and hue vs sat curves to put the patches where they should be on the vectorscope. I've tried using a CST and didn't like the results from that as much as my own version. After I did my conversion, my other test images all straightened out nicely and the footage actually looked pretty straightforward to grade.
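As a rough sketch of the greyscale-straightening idea - not the Resolve curve itself, just the same concept in Python/NumPy, with hypothetical patch values:

```python
# Map measured grey-patch values onto their reference values with a
# piecewise-linear curve. A custom curve in Resolve does the same job,
# just with smooth splines and tuned by eye.
import numpy as np

measured = np.array([0.05, 0.18, 0.34, 0.52, 0.71, 0.92])  # hypothetical values sampled from the phone footage
target   = np.array([0.04, 0.20, 0.36, 0.55, 0.75, 0.90])  # hypothetical reference values for the chart's grey ramp

def straighten(channel):
    """Apply the measured->target curve to image data in the 0-1 range."""
    return np.interp(channel, measured, target)

frame = np.random.rand(1080, 1920, 3)  # stand-in for a video frame
corrected = straighten(frame)
```

The hue vs hue / hue vs sat corrections for the colour patches would sit on top of this, but those are harder to show outside of the vectorscope.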
-
EF-L lenses have their own look; I like the primes personally. Never liked the 24-105 - massive barrel distortion, and I'm not a fan of slow lenses at medium-wide focal lengths. A 24-70mm f2.8 is usually my jam when it comes to zooms, and the RF 24-70mm f2.8 does not disappoint, aside from being big and heavy. And that is a concern: lenses keep getting bigger and bodies smaller. At least Canon has the R3, which is super light despite its mini sports-DSLR-like body. And again, that's where the RF 70-200mm f4 comes in clutch. And yeah, some older EF lenses too when it comes to weight/size ratio. The nicest combo I've experienced was Nikkor AI-S glass on the Z8, and I can only imagine it on the new Zf. Also awesome to see that old glass resolve in glorious 8K when it didn't impress me so much in mushy FHD during the D750 days. The same can be said for EF on the R5/R5C...
-
Nice results. Does anyone know the right combination of settings in Resolve to get Dolby Vision HDR 4K footage from the older iPhones (12 through 14) to display as HLG in Resolve, without the Dolby Vision metadata or tone mapping applied? I just want to grade the HLG like S-Log. Everything I've tried so far doesn't work, including setting custom colour spaces in Resolve to Rec.2100 and Hybrid Log Gamma.
-
Another one to match the above with a slightly better matching scene (still not hard light though - the pollution in India is no joke!). Ref: iPhone grade: iPhone (ungraded): Ok, I'll shut up now.
-
Another one, this time a shot from one of the Jason Bourne films, I think this was the second one, which was shot on Kodak Vision 3 and printed on Kodak 2383. Reference image: iPhone grade: iPhone (ungraded): I'm not so happy with that one, but the subject matter differed a lot more, with the reference shot being in full sun and the iPhone image being overcast and also containing a lot of different hues. The road in the reference image is asphalt and is slightly blue in the image, whereas the "road" in the iPhone shot is actually tram tracks and concrete, not asphalt. Still, there was something in the green/magenta/yellow hues that I couldn't quite nail. Oh well. That's those HDR images from the iPhone - you have no creative control over them. If only Apple had given me a slider for saturation, sharpening, and other controls, those would have matched the look of S35mm film and Cooke lenses perfectly 😉
-
A bit more playing around with what is possible from my HDR iPhone 12 images... Reference image from the movie Ava (2020): iPhone grade: iPhone (ungraded): It's not perfect, but without having them next to each other it's not terrible. I couldn't find what camera Ava was shot on, but I did find that it was shot with Panavision anamorphics. No doubt that is a contributing factor to why my iPhone shot doesn't match the exact look of the movie lol.
-
Here are a set of four shots from my iPhone 12 Mini that I have graded very quickly in a few different ways to give a sense of what is possible, and what "film looks" might be created. The first row has no grading applied, the second has my standard default iPhone input transform, and the rest are more creative grades, just pushing it around to create various looks. All these shots were shot on full-auto with the default iPhone app. Every shot on the same row has the same grade, including exposure and WB and everything, despite being shot on three different continents. All of these use only effects that are available in Resolve - no third-party plugins or LUTs or other YT influencer bullshit. You tell me - do I look like I am currently experiencing a complete lack of creative control over my images?
-
Remember before when I said that colour grading knowledge is lacking? This is a far deeper subject than I think you're aware of, and the comments from the guy in the video are so oversimplified that they're closer to being factually incorrect than merely misleading.

I've been studying the "video look" because I hate it and want to eliminate it as much as possible in my own work, and you're right that over-processing images contributes to it, however that's not what the guy was saying. He said that shooting HDR gives you no "creative control over the image", which is patently false, because you can get it in post. When grading any footage, regardless of the camera, there are a number of standard operations you would apply, and you apply all the same treatments to LOG and HDR images.

I have a testing timeline where I am developing my own colour grading tools and techniques, and it contains everything from HDR iPhone footage, to 709-style GX85 footage, to HLG GH5 footage, to LOG XC10 footage, to RAW and ProRes footage from the OG BMPCC and BMMCC, to ARRI and RED RAW 8K footage. I apply a custom transform to each of them to convert them into my working colour space (which is DaVinci's LOG space), then I apply a single default node tree to all clips, regardless of which camera they came from, and then I grade all the clips in one sitting. I have done this process dozens of times. The HDR profile from the iPhone is approximated pretty well by a rec2020 conversion; in that sense, it's in a colour space just like any other footage.

Do I grade the iPhone footage "differently" to the other footage? Yes and no. I still apply the same adjustments to each clip, adjusting things like:
- White balance
- Exposure and contrast
- Saturation
- Black levels and white levels
- Specific adjustments to things like skin tones, colour of foliage and grass, etc.
- Power windows to provide emphasis to the subject, usually lightening and adding contrast
- Removing troubled areas in the frame, like anything that stands out and is distracting
- Texture adjustments like sharpening / softening / frequency separation
- Adding grain

That stuff all sits underneath an overall look, which will be based on a CST or a LUT, as well as up to a dozen or so specific adjustments which are too complex to explain in this post.

Does the iPhone footage "feel" different when being graded? Sure, but the XC10 and GH5 and OG BMPCC all feel more different to each other than the iPhone does. TBH it feels more middle-of-the-road than the other cameras, and similar to the GH5 and GX85. Is it harder to grade because I'm having to overcome all the baked-in stuff? It's probably not as difficult to grade as the XC10 footage, and that's shot in C-Log, which is a proper professional LOG profile. The RAW stuff is easiest to grade. What you might not be aware of is that all the different forms of RAW also feel different. Different scenes feel different too, even from the same camera.

Can you make iPhone footage look as nice as ARRI footage? No. Definitely not. But you won't be able to make Apple LOG footage look like that either. Can you have "creative control over the image"? Absolutely. You have no idea how much control you can have over the image. Apple LOG does give you more "creative control over the image", but compared to the creative control you gain by learning even the basics of colour grading, the difference is minimal. The only people who have no "creative control over the image" are people who have no colour grading ability.

The ironic thing is that by applying a LUT designed by someone else, you have less effective creative control than you had before you applied it.

Going back to the "video" look that over-processing the image gives: I have become very sensitive to it and see it online in almost all free content. It is present even in videos shot on high-end cinema cameras. The only places it is almost completely absent are high-end productions on streaming services and in the cinema.
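As a toy sketch of the structure described above - per-camera input transform into one working space, then a single shared grade on top - written as plain Python, where every function is just a placeholder rather than a real Resolve call:

```python
# Each clip gets one camera-specific "normalise into the working space" step,
# then the exact same ordered grade steps, mirroring the single default node
# tree applied to every camera in the timeline.
from functools import reduce

INPUT_TRANSFORMS = {                      # camera -> convert to working log space
    "iphone_hdr": lambda clip: clip,      # ~rec2020 HDR -> working log (placeholder)
    "gh5_hlg":    lambda clip: clip,      # HLG -> working log (placeholder)
    "xc10_clog":  lambda clip: clip,      # C-Log -> working log (placeholder)
}

SHARED_GRADE = [                          # same ordered steps for every clip
    lambda clip: clip,   # white balance
    lambda clip: clip,   # exposure / contrast
    lambda clip: clip,   # saturation, black & white levels
    lambda clip: clip,   # power window on the subject
    lambda clip: clip,   # texture: sharpen / soften / grain
    lambda clip: clip,   # overall look: CST or film LUT
]

def grade(clip, camera):
    clip = INPUT_TRANSFORMS[camera](clip)
    return reduce(lambda c, step: step(c), SHARED_GRADE, clip)
```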
-
Am liking this new blog from a camera operative's perspective - some good breakdowns of well-known films already, and each week there's new material: https://www.theop.io/the-breakdown
-
DNG files from Android RAW footage can be imported directly into Resolve. They can also be labeled in a way that makes Resolve see them as DNG files coming from a Blackmagic camera, so you have access to all the settings, like changing ISO in post and even Highlight Recovery! I'll share my workflow with some sample clips in a few days. It is easy if you have the knowledge.

Funny enough, to shoot 4K ProRes 60p on the new iPhone 15 Pro, Apple forces you to use external storage, while with Android RAW you can get 4K at 60p internally on the latest-generation flagships without any problem. External recording to SSD is available for Android RAW too. With a Xiaomi 13 Ultra or Vivo X90 Pro Plus with 1TB internal storage at 1000-1100 EUR, you may never need an external one.

Going back to the iPhone 15 Pro, here is one good example showing that the aggressive noise reduction at high ISOs is also gone with the new iPhone 15. Flares are still present, but much less than on the 14 Pro, where they simply ruin the image. https://www.youtube.com/watch?v=DHT2TEwJJz4

This guy claims that with the Blackmagic application you can have Apple Log with the HEVC (H.265) codec. Check the Apple Log part. Hm, interesting: https://www.youtube.com/watch?v=BOZUfWcGCxk
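The poster hasn't shared the workflow yet, so purely as an illustration of the kind of relabeling being described, here is a sketch that rewrites the UniqueCameraModel DNG tag with exiftool. Whether that is the tag Resolve actually keys on is an assumption here, and the camera name is just an example.

```python
# Rewrite the UniqueCameraModel tag on every DNG in a folder (requires exiftool
# on the PATH). This is NOT the poster's workflow, just one way to relabel DNGs.
import subprocess
from pathlib import Path

def relabel_dng(folder, camera_name="Blackmagic Pocket Cinema Camera"):
    for dng in Path(folder).glob("*.dng"):
        subprocess.run(
            ["exiftool", "-overwrite_original",
             f"-UniqueCameraModel={camera_name}", str(dng)],
            check=True,
        )

relabel_dng("android_raw_clip_001")  # hypothetical folder of per-frame DNGs
```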
-
ENGISOFT joined the community
-
By no means am I technical, but Apple HDR locks its settings like sharpening, saturation, etc. Log gives you the control, and if you look side by side it's night and day. You're making a conspiracy theory out of a lack of knowledge of what he's saying. Take a moment, step back and look into this. It's been shown by countless DPs that Apple's locked settings are what give HDR a video look, and now with Log you can get closer to a film look.
-
FYI - Nexto was acquired by TVLogic and this is their new backup storage product: https://www.newsshooter.com/2023/09/28/clouzen-tainer-all-in-one-portable-backup-storage-review/