Everything posted by kye

  1. Thanks! That comment means a lot coming from you. At the risk of providing too much information, the total workflow was:
     - ML RAW 1728 10-bit
     - MLV App (Mac) ---> Cinema DNG Lossless
     - Resolve Clip Node 1: WB, basic levels
     - Resolve Timeline Node 1: OpenFX Colour Space Transform (Input Gamma Canon Log ---> Output Gamma Rec.709)
     - Resolve Timeline Node 2: Noise reduction (chroma only - the luma noise in RAW is quite pleasant)
     - Resolve Timeline Node 3: Desaturate yellows (Hue vs Sat curve); desaturate shadows + highlights (Lum vs Sat curve)
     - Resolve Timeline Node 4: Slightly desaturate the higher-saturated areas (descending Sat vs Sat curve)
     - Resolve Timeline Node 5: OpenFX Colour Space Transform (Input Gamma Rec.709 ---> Output Gamma Arri LogC); OpenFX Colour Space Transform (Input Colour Space Canon Cinema Gamut ---> Output Colour Space Rec.709)
     - Resolve Timeline Node 6: 3D LUT - Rec709 Kodak 2383 D65
     - Resolve Timeline Node 7: Sharpen OFX plugin
     - Resolve Timeline Node 8: Film Grain OFX plugin (custom settings, but similar to 35mm grain with saturation at 0)
     - Resolve Timeline Node 9: Glow OFX plugin
     - Resolve Fairlight settings: compressor applied to the master channel to even out the whole mix
     - Render settings: 3840 x 1632 H.264 restricted to 40000Kb/s
     I credit the overall architecture of the grade to Juan Melara - I cannot recommend his YouTube channel enough. To those starting out, in case that looks like a stupid amount of work, it's fast as hell once you save the structure in a Powergrade. Once I'd converted to CinemaDNGs the whole edit process only took a couple of hours, including music selection.
  2. @jonpais Interesting thread. My (little) experience on film sets is more than enough to understand the benefits of being able to prepare a focus pull in advance. To add to the completeness of this thread for those lurking out there, beyond the benefits of cinema lenses already mentioned: IIRC they are also designed so that all the lenses in a set are the same weight (no re-balancing gimbals or adjusting steadicam counterweights when changing lenses), share the same filter size (one set of filters covers everything), and have the same gear spacing (remote focus and zoom controllers don't have to be adjusted between lenses). The total hourly rate of a film set is absolutely huge, and these lenses are designed to be quick to change and use.
  3. And fix some of the known weaknesses of the predecessor, like sound quality and battery life. I shortlisted the BMPCC but went with the XC10 because my work is run-and-gun and the BMPCC is more like a baby cinema camera by the time you kit it out with all the extra stuff you need to make it a practical setup.
  4. First publishable ML RAW test. 700D, ML RAW, Sigma 18-35. Not a great example of any aspect of film-making, but might still be of interest as an example of lesser film-makers with lesser equipment. One thing I found surprising was that the ISO noise from the 700D (not the quietest camera of all time, especially at 800-3200 in this video) was quite organic and natural looking. As the noise changed a lot between shots, I had a go at the NR+grain technique that @kidzrevil has mastered to make the video more consistent. Enjoy
  5. kye

    Cheap bokeh setup?

    First (publishable) test video shot with the 700D / ML RAW / Sigma 18-35. It was all hand-held in low light (challengingly low light), and I've added some grain in post to cover up the noise from the camera (different shots had different amounts of noise, so it was distracting). Notes:
    - It's very hard to tell what's in focus and what is slightly off while filming - this was all f/1.8, so shallow depth-of-field
    - The RAW video is limited to 1728 x 786, so not quite FullHD, but I uploaded in 4K to get a bit better quality out of YouTube
    - The lens is heavy! But that helps stabilise it a bit, which is a plus. Manual focus is very nice, and so is the rest of the design of the lens, actually
    - The camera has lots of video noise (which is why I bought the XC10), but in RAW the noise looks less objectionable and in a way is more like film. This had shots up to ISO 3200 or more in it, which are challenging conditions for a camera this old
    - The lens focuses really close, which is handy for controlling perspective
    - It's not likely to replace my XC10 any time soon, but it is very nice and has a different aesthetic to it
    - I need to shoot with it more to 'understand' the look and work out how I want my videos to look
    After we ate dinner there was dancing and singing, and I ended up trying the crop mode to extend the focal length, and that seems like a great feature. With the 18-35 without crop mode you get the equivalent of 29-56mm, and with crop mode you get 87-169mm. That really is excellent coverage considering you don't have to carry anything extra or change lenses, and the 87-169 is a lot faster than my other long lens, the 55-250 f4-5.6. The more I film with this combination (with the Rode VideoMicro) the more I like it.
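The equivalent-focal-length figures above are just multiplication by the crop factor. A minimal sketch, assuming the 700D's 1.6x APS-C crop and a nominal 3x extra crop for ML's crop mode (both assumptions - the post's 87-169mm suggests the effective crop is slightly more than a clean 3x):

```python
def equivalent_focal_length(focal_mm, crop_factor):
    """Approximate full-frame equivalent focal length, rounded to whole mm."""
    return round(focal_mm * crop_factor)

APSC_CROP = 1.6   # Canon APS-C crop factor
ML_CROP = 3.0     # assumed additional crop from Magic Lantern's crop mode

# Sigma 18-35 without crop mode: roughly the 29-56mm quoted above
print(equivalent_focal_length(18, APSC_CROP))            # 29
print(equivalent_focal_length(35, APSC_CROP))            # 56
# With crop mode (1.6 x 3.0 = 4.8 total): close to the quoted 87-169mm
print(equivalent_focal_length(18, APSC_CROP * ML_CROP))  # 86
print(equivalent_focal_length(35, APSC_CROP * ML_CROP))  # 168
```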
  6. This card-reader hack is at the bleeding edge of the bleeding edge. When something hasn't been included in the experimental build yet..... Still, lots of people getting >40MB/s readings sure is promising. When it's stable I'll be using it for (1728 / 16:9 / 10bit / 48MB/s) and (1792 / 16:9 / 10bit / 51.6MB/s)... but I'd be interested to see if they manage to figure out modes above 1728 without a sensor crop.
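Those data rates follow directly from resolution x bit depth x frame rate. A quick sanity check, assuming 23.976 fps, 16:9 frames and MB meaning MiB (all assumptions on my part):

```python
def ml_raw_data_rate(width, bit_depth, fps=23.976):
    """Approximate uncompressed RAW data rate in MiB/s for a 16:9 frame."""
    height = width * 9 // 16                          # 1728 -> 972, 1792 -> 1008
    bytes_per_second = width * height * bit_depth * fps / 8
    return bytes_per_second / 2**20

print(round(ml_raw_data_rate(1728, 10), 1))  # 48.0 - matches the 48MB/s above
print(round(ml_raw_data_rate(1792, 10), 1))  # 51.6 - matches the 51.6MB/s above
```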
  7. Not sure if you're on the CML? https://cinematography.net I've only dipped my toes in the water a little there, but it seems to have lots of people who casually talk about RAW workflows for Red and Arri cameras. A very different world than the one I live in!
  8. Be aware though that although Casey and a number of other YouTubers seem quite proficient and can make some nice footage, if you're serious about learning to colour correct and grade then you should pay for some online courses from the real experts. I spent ages watching YT tutorials and noticed that some people had a 'feel' for grading, but after a while I realised they were just playing with the controls and didn't actually know what was going on, or why you might use one set of controls over another. I'd recommend checking out Juan Melara's channel - it's obvious that he's a pro and that he works in a radically different way to those creating free YT tutorials: https://www.youtube.com/channel/UCqi6295cdFJI9VUPzIN4NXQ/videos
  9. kye

    Motion Cadence

    I don't know for sure, but I suspect it's about maintaining quality throughout the workflow. For example, if you were to have a long-GOP capture format then it will 'bake-in' motion issues, and then those issues may be worsened by intermediate processing steps (and of course each time you round-trip you're baking things in) and then the final delivery format gets put over the top of all of that. If the imperfections in the way that the long-GOP behaves are worsened by similar issues in the input then you might find that the motion issues compound, or even worse, are converted by some other limitation in the format into some other type of secondary issue.
  10. kye

    Motion Cadence

    I will do this and try as hard as I possibly can to see no difference. I'd take being 'motion cadence blind' and under 25 any day!!
  11. No idea on LUTs but this tutorial might help if you're just starting out and are unfamiliar with matching shots? If you don't use Resolve then it might be worth a watch anyway as the controls he uses are fairly standard ones.
  12. This depends on your situation. If you're editing 4K RAW files and the Producer and Director are sitting behind you in your commercial edit suite, then no. If you want to grade that footage with them watching then HELL NO. I suspect that this is not the situation you're in, so it's all about compromises. My sister studied film at university in the late 90s and I remember sitting in an edit suite all night helping her edit the documentary (and fix the terrible audio) she shot on a PD150 off a removable HDD, and we'd make the changes, hit RENDER, and then go for a walk while the computer re-rendered the changes we'd made, and then 30 minutes later it would be ready to watch and we'd review it and make more changes and then go for another walk... I edit 4K on a laptop but I use a 720p proxy workflow, which may or may not work for you.
  13. kye

    Motion Cadence

    I just took a quick look over that thread and saw a lot of discussion about how the codec can introduce jitter / stuttering into something that didn't have them before. If that's true (and assuming I read it right) then the signal path might matter just as much as the capture device. Technology is normally very good at things like this, with error rates in the realm of parts-per-million (ie, only varying by something like 0.0001%). The problems come in when there are secondary effects of such a small error that end up as something we're very good at perceiving. For example, in digital audio jitter can be quite audible despite being a very small variance. The reason it becomes audible isn't that you hear it directly (you don't hear the music speeding up and slowing down!) but that the timing is used to convert digital to analog, and small timing variations cause harmonic distortions, which even if they aren't audible by themselves then cause intermodulation distortion - both things we are quite attuned to hearing. So the problem is that these small timing errors can have audible knock-on impacts. In terms of frame rates though, I can't think of any knock-on impacts we'd be more sensitive to, or where the impact would be exacerbated.
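The audio-jitter mechanism described above is easy to demonstrate numerically. A sketch (every number here - sample rate, tone, jitter frequency and amplitude - is made up for illustration): sampling a 1 kHz tone with a clock that wobbles at 100 Hz pushes the error energy into sidebands at 900 Hz and 1.1 kHz rather than at the tone itself, which is exactly the kind of secondary distortion product described:

```python
import numpy as np

fs = 48_000                 # sample rate (Hz)
f0, fj = 1_000, 100         # test tone and jitter frequency (illustrative)
jitter_amp = 1e-9           # 1 ns of periodic clock jitter

n = np.arange(fs)           # one second of samples -> 1 Hz FFT bins
t_ideal = n / fs
t_jittered = t_ideal + jitter_amp * np.sin(2 * np.pi * fj * t_ideal)

# Error between sampling the same tone with an ideal vs a jittered clock
err = np.sin(2 * np.pi * f0 * t_jittered) - np.sin(2 * np.pi * f0 * t_ideal)
spectrum = np.abs(np.fft.rfft(err))

# The two strongest bins sit at the sidebands f0 - fj and f0 + fj
peaks = sorted(int(i) for i in np.argsort(spectrum)[-2:])
print(peaks)  # [900, 1100]
```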
  14. IIRC it was lowepost.com but I just had a quick look and couldn't find it. If that isn't it then maybe LGG?
  15. Yes - with these crop sensors it's often difficult to get a lens that's wide, fast and cheap. In the end I conceded and just paid the money. It makes it doubly difficult if you want to go wider than the normal 28mm equivalent - I know lots of vloggers use FF with the 16-35 @ 16mm because they take up less space in the frame, don't have to worry about pointing it as accurately, and don't have to stretch their arm out as far. Good luck getting a practical 16mm equivalent on a crop sensor!
  16. kye

    Motion Cadence

    Ah yes, Spielberg and Saving Private Ryan. A memory is a good thing when it works. Actually, one of the main challenges I have is that I don't really know it when I see it. I think my perception is a mixture of being desensitised to these elements and also untrained. A big part of learning for me is learning to see. I can often see a difference between two different finished products (eg, I know I prefer the production of Peaky Blinders and The Crown to most other shows), but when I try to isolate the individual variables I'm often just left with two things that look the same to me. In a way that's why I'm hanging out on forums like this so much - I'm trying to learn what other people see and what the technical thing is behind it. Yep - my current kit contains the XC10, 700D with ML, and iPhone 8 with ProCam (which provides full manual controls). I also have other much lesser cameras, but I suspect they're not high enough quality to be useful. What settings would you suggest I play with?
  17. If you haven't already come across them, I can highly recommend the videos by Juan Melara (https://www.youtube.com/channel/UCqi6295cdFJI9VUPzIN4NXQ/videos). He is obviously a very knowledgeable operator, but also seems to use Resolve in a way I haven't seen any other YouTubers even approach. I learned a heap from him. I haven't completely worked out my workflow for ML RAW, but he got me pretty close I think.
  18. kye

    Motion Cadence

    I thought that YouTubers used slow-motion because it looked 'better', which may be mostly that real life needs all the help it can get to look nicer, and the slow-motion smooths out the camera-shake. So, as contenders we currently have:
    - line-skipping when reading the sensor (this wouldn't be to do with motion, it would be to do with how each image was rendered, but might be something that looks 'nicer')
    - less rolling-shutter (shouldn't apply much to static tripod-shots)
    - jitter in frame rate (variation in how far apart the frames are exposed)
    - shutter angle
    I have a vague memory of reading that some cameras take a small amount of time to open or close the shutter, kind of like it faded in, and this meant that the edges of moving objects weren't as sharp. I'm not sure exactly how this would be accomplished (I know very little about shutter design), but it sounds simultaneously like something that could be true in expensive cameras, or could be 'alternative internet facts'. On a more philosophical note, my personal view is that every technical aspect of filming has an artistic implication, and behind every artistic impression there is something tangible that we can isolate and measure. One of my goals is to understand what technical / specific things are behind the artistic things, so I can explore them and then use them to create a finished product where more of the choices I make are coherent with the overall feel I'm going for. For example, if we want something to look happy we use brighter, more saturated grading because that supports a happy vibe, or if we want people to be on edge then we can use extreme close-ups with wide-angle lenses to distort the image, which supports the feeling of unease. I read that in a recent war movie the director used a 90 degree or 45 degree shutter on action sequences, as it showed explosions as being full of chunks of people instead of just being blurry, and also made it feel more real and less 'cinematic'.
    I'm hoping we can learn something tangible about motion cadence in this thread.
  19. Makes sense. I wasn't recommending the Sigma (it's lovely but really big and heavy!), I was more commenting on not trying to use the 50mm on APS-C for anything other than specialist shots. However, if it is to add to a collection of wider lenses then it does make sense, and the price is definitely right! IIRC the M50 has different crops in 1080 and 4K - if that's true and the OP is outputting 1080 then the 50mm would have extra versatility, as it can effectively become an 80mm with the DPAF, and a longer lens again in 4K mode with the sensor crop. Totally agree about 35mm or 50mm on FF. I shot a couple of videos at 80mm (50mm on crop) indoors and it was just a little too long for most things - 35mm and 50mm equivalents are just right.
  20. Thanks @Deadcode @Papiskokuji @HockeyFan12 that totally makes sense - I'd forgotten that it was RAW vs compressed codecs. @Deadcode I asked it separately as I thought the answer would go into what scenes benefited from extra bit depth or that it was me being blind. ML RAW just happened to be how I got the files, and I only included it in case I was screwing something up and not really getting the extra bits! This basically answers my question, which ultimately was about what bit-depth I should be shooting. This is especially relevant with ML because if I lower the bit depth I can raise the resolution, which I thought I would have to trade-off against the gains of the extra bit-depth. On another forum I saw a post talking about getting new client monitors while things were on sale - they said they needed about half-a-dozen of them, and I just about choked when I read their budget was $8000 ..... each! I think I only recognised every second word in the rest of the thread - between brand names and specifications etc. It's another world!
  21. I've been playing with Magic Lantern RAW and looking at different bit-depths, and after all the conversation in the BMPCC thread (and the thousands of 8-bit vs 10-bit videos) I got the impression that bit-depth was something quite important. Here's the thing: I can't tell the difference between 10-bit, 12-bit and 14-bit. I recorded three ML RAW files at identical resolutions and settings; only the bit-depth varied. I converted them in MLV App to Cinema DNG (lossless) and pulled them into Resolve. Resolve says one is 10-bit, one is 12-bit and one is 14-bit, so I think the RAW developing is correct. I just can't see a difference. I tried putting in huge amounts of contrast (the curve was basically vertical) and I couldn't see bands in the waveform. I don't know what software to use to check the number of unique colours (I have Photoshop, maybe that will help?). Am I doing something wrong? Does a well-lit scene just not show the benefits? Is 10-bit enough? Am I blind to the benefits (it's possible)? I've attached one of the frames for reference. Thanks! M02-1700_000000.dng
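On the 'number of unique colours' question: any numpy-capable tool can count the distinct code values per channel, which makes quantisation differences visible even when the waveform looks clean. A minimal sketch using a synthetic ramp (not the DNGs above - decoding those would need an extra library):

```python
import numpy as np

def unique_levels(channel):
    """Count the distinct code values present in one image channel."""
    return len(np.unique(channel))

# A smooth ramp quantised as if it had been captured at different bit depths
ramp = np.linspace(0.0, 1.0, 1_000_000)
as_8bit = np.round(ramp * 255) / 255
as_10bit = np.round(ramp * 1023) / 1023

print(unique_levels(as_8bit))   # 256
print(unique_levels(as_10bit))  # 1024
```

On a real frame you would load the decoded image into an array and run the same count per channel; if the 10-, 12- and 14-bit files all report similar counts after the same grade, the extra bits genuinely aren't surviving into what you see.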
  22. kye

    Motion Cadence

    Lol, looks like the cure for OT posts might be feeding us! Getting back to motion cadence, I'm curious about this as well, as it's one of the things people say is 'cinematic' (which I believe), but people also seem to imply that even if you set cameras to a 180 shutter there will still be some difference in the footage (which I don't believe, but would love to be proven wrong). @jonpais I have watched quite a few of those 'phone vs cinema camera' videos and found that there was either a huge difference in motion cadence between the two (they always shoot in bright light to use base ISO, so one would be 1/50 and the other 1/2000), or they had used an ND filter of some kind and then there was no difference in motion cadence. As for how someone might get a 180 degree shutter in bright light without an ND filter, I can't think of any way it's possible. I'd be happy to hear about an alternative to ND filters, but I'd say it's safe to assume that if they're using a 180 shutter then they've got an ND.
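The shutter-angle arithmetic behind the 1/50 vs 1/2000 comparison is simple to sanity-check: angle = 360 x frame rate x exposure time. A quick sketch (25 fps assumed for both cases):

```python
def shutter_angle(fps, exposure_s):
    """Shutter angle (degrees) implied by an exposure time at a frame rate."""
    return 360.0 * fps * exposure_s

print(shutter_angle(25, 1 / 50))    # 180.0 -> the classic 180-degree shutter
print(shutter_angle(25, 1 / 2000))  # 4.5   -> bright-light phone footage, no ND
```

So without an ND (or a much smaller aperture / lower ISO), a 180-degree shutter in bright daylight simply isn't reachable, which matches the observation above.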
  23. I've got an APS-C camera (Canon 700D) and for years the 50mm f1.8 was the only fast lens I had, so I tried to use it in many situations, but at an 80mm equivalent it's too long for much of what I shoot. I've just recently purchased the Sigma 18-35, which is 29-56mm equivalent, and I was surprised that there's a big difference between 56mm and 80mm. It depends on what you're shooting, but I wouldn't recommend the 50mm on an APS-C sensor as a general-purpose lens at all.
  24. You're right about people not being that critical. My family are already saying that my films look like videos from the tourist bureau, but I can see that they're not and I guess I am that critical. Early on I shot a video of a family holiday on a point-and-shoot and had it on the 50p mode, which I only later discovered didn't record sound (oops!). The result was a wonderful video half-full of slow-motion shots of the kids smiling and running around with a full music soundtrack - great stuff and still perhaps my best work. Now I have gone and spent thousands of dollars on 'real' camera equipment I am expecting to get a lot closer to the way that a feature film would render something like that, and I find that I'm still falling quite a bit short. One of the things that is letting me down is DR. I realise that my skill level is the number one thing letting me down, but that's not something I can throw money at and DR is!
  25. I'm arguing that having a camera with higher DR is worthwhile, and that the answer isn't always to just work around the limitations of your camera. I've been around enough narrative, documentary, ENG and other types of shooting to realise that what I'm trying to do is at the pointy end of making the best of situations you may have almost no control over. Most of the time on here I'll talk about something valuable to me, and someone will say that I don't need it - that all I have to do is change something they assume everyone has control over, but which I don't.