All Activity


  1. Past hour
  2. Agree! I've been watching hours of footage and notice a personal bias toward Canon images - and it seems to be the way they render color and skin regardless of user, consistently over hours of footage. That's us! Yes, I get this. After feeling a little dejected from the original question and response, I checked the "colorists" subreddit and there are tons of threads and comments noting that they prefer to receive images from some cameras and brands over others. It seems to come down to difficulty - given the same face, scene, lighting etc and 5 different cameras or brands, these colorists noted that getting the desirable look was much easier on some and frustrating and painstaking on others. Which is why I found it difficult to accept the notions of makeup or "just learn color grading" - the tools themselves can aid in that journey. I've learned through this that the way Canon renders color, specifically skin, is my favorite look. That doesn't mean I'll run out to grab an R5 or R8 - the feature set is far inferior to an a7S3 or a7IV, I think. I was curious what more knowledgeable folks preferred as their starting-point tool. Do you use a GX85 for travel and an FS7 for work? Or an R5 for travel and an FX6 for work, etc? That's all I was hoping to discuss here, and then follow with "why did you choose that tool?" Sometimes we bring our own gripes and biases to a question and don't answer the question. That said, my next question now is: can you achieve the same appeal of color on mid-level Sony bodies as you can on the Canons that produce the color I love? And if so, how easy is it to do? Great video - I saw this years back, I believe. That said, I loved the 1/8 look for general-purpose filming, but anything more was too stylized for my taste.
  3. Today
  4. It's definitely an Angenieux lens. Here are more shots and there's a Tiffen lens shade with writing on it. Only Angenieux has zooms machined like that. I found some better shots: I'm fairly sure it's a "version" of the Angenieux 12-120mm, but I cannot guarantee it.
  5. I think it's likely to be a real camera and lens (for the reasons you mention), but no idea which lens. TV news gathering was in a slow transition phase from 16mm film to ENG back then, so there would have been plenty of working 16mm film equipment around, used both for news and other TV production outside the studio environment.
  6. I very much agree with that - the opposite of someone filling an answer with the latest buzzwords, fashion statements and acronyms to gloss over the fact that they don't really understand the subject. I've been interested in science and engineering from quite young (the first book I ever bought was about electricity and magnetism). Favourite subject at secondary school was physics, helped a lot by an enthusiastic teacher who really understood the subject and could explain the fundamentals behind it very well. When I went on to study physics and electronics at university, in marked contrast some of the lecturers were terrible at explaining things in a simple fashion. One lecturer in particular kept pushing his own textbook, which was just as impenetrable as his lectures, so some of us students just gave up and found a book that explained the basics of the subject much better, just to get us through the exam at the end of the year... (and it was a subject that in my subsequent electronic design engineering career I've become much more familiar with - so now I know it's mostly much less complicated than it seemed at the time). "Simplicity is the essence of good design" I've found to be very true. If things start getting too complicated and messy in a project, it's usually a sign that I didn't set off in the right direction at the 'blank sheet of paper' stage.
  7. Excellent point about the compatibility - I'm so used to MFT and almost everything being interchangeable that I'm not used to even thinking about these things! In terms of it being a prop, I would have thought it would have been easier to grab whatever was the cheapest / most common / not-currently-rented item from their camera rental house. I mean, if you're shooting a feature film then you're renting a bunch of stuff anyway, so renting an extra 16mm setup to use as a prop wouldn't be hard at all. They could also have rented it from a production design rental house along with all the other props, but anything in a place like that would be non-working, having likely been turned into a prop when it stopped working. Either way, it's very unlikely to have been a camera / lens combination that wasn't compatible, as someone would have had to glue the lens onto the body or something, which is extra effort that wouldn't be needed - so many of those cameras and lenses wore out or got dropped into a river that worn-out ones would have been worthless and ubiquitous.
  8. If it was a real working 16mm film camera, I don't think it would be an ENG (Electronic News Gathering) lens, as they are designed for professional portable video cameras (which in the late 1970s would have been triple vacuum tube image sensor cameras using a dichroic colour splitting prism, thus having a long flange-to-sensor optical path). But of course in the movie it's basically a prop, so doesn't have to be a working camera.
  9. Tamron is also putting out an RF-S (APS-C) zoom lens (11-20mm F2.8). Makes sense, as Canon have barely developed any RF-S lenses (not a single prime or fixed-aperture lens), so this will surely help sell crop bodies. So yeah, while this is great news for APS-C Canon owners, I really don't see it as a solid indicator that Canon will open up third-party FF RF lenses, although one can hope!
  10. I don't think so... all the photos I found show the Angenieux has the writing on the outside, not visible from the front. Filters don't tend to have writing on them like that - that pattern looks like lens info anyway. None of the ones on here have writing that looks similar either: https://www.oldfastglass.com/cooke-10860mm-t3 It seems to have one of those boxes that controls the lens and provides a rocker switch for zooming etc - maybe that narrows it down? Maybe it's an ENG lens rather than a cinema lens?
  11. I've heard that the 12K files are very usable in terms of performance, but it will likely depend on what mode you're shooting in. Most people aren't using the 12K at 12K - they're using it at 4K or 8K. Regardless, Resolve has an incredible array of functionality to improve performance and enable real-time editing and even colour correction on lesser hardware. This is a good overview:
  12. Well that is pretty decent. In that case, I might look even harder at the system next year… The R3 is, for my needs, the best body currently on the market. A Nikon Z6iii with an additional battery grip is probably going to beat it… maybe… for me, and as someone now 50% invested in Nikon, that's almost certainly where I will be going. But at the end of every season I review my needs, and Canon could be an option, especially now that Sigma have joined the party, but they would need to have brought out a pretty extensive FF line by next spring. I think Canon really do need to lighten up on their lens stance though, or it will bite them in the arse. If it is not already nibbling…
  13. When you say "like they are emitting light themselves" you have absolutely nailed the main problem of the video look. I don't know if you are aware of this, so maybe you're already way ahead of the discussion here, but here's a link to something that explains it way better than I ever could (linked to timestamp):

This is why implementing subtractive saturation of some kind in post is a very effective way to reduce the "video look". I have been doing a lot of experimenting recently, and one experiment showed that reducing the brightness of the saturated areas, combined with reducing the saturation of the brighter areas (desaturating the highlights), really shifted the image towards a more natural look. For those of us who aren't chasing a strong look, you have to be careful with how much of these you apply, because it's very easy to go too far and it starts to seem like you're applying a "look" to the footage. I'm yet to complete my experiments, but I think this might be something I would adjust on a per-shot basis. (There's a rough sketch of the idea after this post.)

You'd have to see if you can adjust the Sony to be how you want it. I'd imagine it just does a gain adjustment on the linear reading off the sensor and then puts it through the same colour profile, so maybe you can compensate for it and maybe not.

TBH it's pretty much impossible to evaluate colour science online. This is because:
- If you look at a bunch of videos online and they all look the same, is this because the camera can only create this look? Or is it the default look and no-one knows how to change it? Or is it just the current trend?
- If you find a single video and you like it, you can't know if it was just that particular location, time and lighting where the colours were like this, or if the person is a very skilled colourist. If it involved great-looking skin tones, maybe the person had great skin or great skill in applying makeup, or they somehow screwed up the lighting and it worked out brilliantly by accident (in an infinite group of monkeys with typewriters, one will eventually type Shakespeare - and the internet is very, very much like an infinite group of monkeys with typewriters!).
- The camera might be being used on an incredible number of amazing-looking projects, but those people aren't posting to YT. Think about it - there could be 10,000 reality TV shows shot with whatever camera you're looking at and you'd never know, because those people aren't all over YT talking about their equipment - they're at work creating solid images and then going home to spend whatever spare time they have with family and friends. The only time we hear about what equipment is being used is if the person is a camera YouTuber, an amateur who is taking 5 years to shoot their film, a professional who doesn't have enough work on to keep them busy, or the project is so high-level that the crew get interviewed and these questions get asked. There are literally millions of moderately successful TV shows, movies and YouTube channels that look great, and there is no information available about what equipment they use.
- Even if you find a camera that is capable of great results, that doesn't tell you what kind of results YOU will get with it. Some cameras are incredibly forgiving and easy to get great images from, and others are absolute PIGS to work with, and only the world's best can really make the most of them. For the people in the middle (i.e. not a noob and not a god), the forgiving ones will create much nicer images than the pigs, but in the hands of the world's best, the pig camera might even have more potential.

It's hard to tell, but it looks like it might even be 1/2. You have to change the amount when you change the focal length, but I suspect Riza isn't doing that because of how she spoke about the gear. It's also possible to add diffusion in post. Also, lifting the shadows with a softer contrast curve can have a similar effect.
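Not the poster's actual grading method, but for anyone curious what "desaturating the highlights" and "darkening the saturated areas" mean in practice, here is a minimal numpy sketch of the two adjustments. The function names, curve shapes and strengths are purely illustrative assumptions - in Resolve you'd reach for the equivalent sat-vs-lum and lum-vs-sat style curves rather than code.

```python
import numpy as np

def desaturate_highlights(rgb, strength=0.5):
    """Pull saturated colours towards grey as pixel brightness rises.

    `rgb` is a float array of shape (H, W, 3) in the 0-1 range.
    """
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])        # Rec.709 luma weights
    grey = np.repeat(luma[..., np.newaxis], 3, axis=-1)    # fully desaturated version
    amount = strength * np.clip(luma, 0.0, 1.0) ** 2       # 0 in shadows, up to `strength` in highlights
    return rgb * (1.0 - amount[..., np.newaxis]) + grey * amount[..., np.newaxis]

def darken_saturated(rgb, strength=0.3):
    """Reduce the brightness of strongly saturated pixels."""
    maxc = rgb.max(axis=-1)
    minc = rgb.min(axis=-1)
    sat = np.where(maxc > 0, (maxc - minc) / np.maximum(maxc, 1e-6), 0.0)  # crude HSV-style saturation
    gain = 1.0 - strength * sat ** 2                        # more saturation -> more darkening
    return rgb * gain[..., np.newaxis]
```

Applied gently (and, as suggested above, probably tuned per shot), the combined effect is that bright, colourful areas stop looking like they're emitting light of their own.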
  14. I think that if you can possibly manage it, it's best to provide the simplification yourself rather than through external means. This gives you flexibility in the odd case where you need it, and doesn't lock you in over time.

The basic principle I recommend is to separate R&D activities from production. Specifically, I would recommend testing the various ways you can do something or tackle some problem, and the options for your workflow; evaluate the experience and results, then pick one and treat it like that's your limitation. I'm about to do one of those cycles again, where I've had a bunch of new information and now need to consolidate it into a workflow that I can just use and get on with it. Similarly, I also recommend doing that with the shooting modes, as has happened here:

I find that simple answers come when you understand a topic fully. If your answers to simple questions aren't simple, then you don't understand things well enough. I call it "the simplicity on the other side of complexity" because you have to work through the complexity to get to the simplicity.

In terms of my shooting modes, I shoot 8-bit 4K IPB 709 because that's the best mode the GX85 has, and camera size is more important to me than the codec or colour space. If I could choose any mode I wanted, I'd be shooting 10-bit (or 12-bit!) 3K ALL-I HLG 200Mbps h264, because:
- 10-bit or 12-bit gives lots of room in post for stretching things around etc, and it just "feels nice"
- 3K because I only edit on a 1080p timeline, but having 3K would downscale some of the compression artefacts in post rather than have all the downscaling happening in-camera (and if I zoom in post it gives a bit more extension - mind you, you can zoom to about 150% invisibly if you add appropriate levels of sharpening)
- ALL-I because I want the editing experience to be like butter
- HLG because I want a LOG profile that is (mostly) supported by colour management so I can easily change exposure and WB in post photometrically without strange tints appearing, and not just a straight LOG profile because I want the shadows and saturation to be stronger in the SOOC files so there is a stronger signal-to-compression-noise ratio
- 200Mbps h264 because ALL-I files need about double the bitrate compared to IPB, and I'd prefer h264 because it's easier on the hardware at the moment, although h265 would be fine too (remembering that 3K has about half the total pixels of 4K - see the rough numbers after this post)

The philosophy here is basically that capturing the best content comes first, the best editing experience comes next, then the easiest colour grading experience, then the best image quality after that. This is because the quality of the final edit is impacted by these factors in that order of importance.
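To put rough numbers on two of the claims above - that "3K" has about half the pixels of 4K, and that ALL-I needs roughly double the bitrate of IPB - here's a quick back-of-the-envelope check. The 2880x1620 frame size is an assumption on my part, since "3K" isn't a standardised resolution.

```python
# Pixel counts (2880x1620 assumed for "3K"; UHD used for "4K")
uhd_pixels = 3840 * 2160       # 8,294,400
threek_pixels = 2880 * 1620    # 4,665,600
print(f"3K / 4K pixel ratio: {threek_pixels / uhd_pixels:.2f}")  # ~0.56, i.e. roughly half

# If ~100 Mbps IPB were the reference, ALL-I at roughly 2x lands at ~200 Mbps.
# Storage appetite at that rate:
hours_per_tb = (1e12 * 8) / (200e6 * 3600)
print(f"Hours of 200 Mbps footage per TB: {hours_per_tb:.1f}")   # ~11 hours
```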
  15. Yesterday
  16. Canon RF 70-200 f2.8 is 1070g and 70-200 f4 is 695g
  17. Without having shot 12K, I'll say that system looks like more than enough (and if it's not, there won't be many systems that are). The RTX 4090 is the most powerful consumer GPU on the market and that's a really decent CPU. It might even be overkill (though overkill also means you probably won't be shopping for a better system in 1-2 years). If you aren't already familiar with PugetBench, Puget Systems have a nice database of results that people have achieved with various systems. https://benchmarks.pugetsystems.com/benchmarks/ What's not clear on that is how expandable the system is - which was a problem with the Alienware that I had for a while. There seems to be only one more slot for NVMe beyond the boot drive and the USB ports are only 3.2 and not 4. The Alienware that I had only had a single free PCIe slot as well. In my case, it was enough to add a 10g network card and that was about it. If you want to put 12K footage locally on the machine, 2TB is going to get cramped fast. Fast USB 3.2 storage will be able to keep up, though as of a year or so ago, flash-based USB 3.x storage arrays were not so common - at least at a reasonable price. If it were me, I'd look for something with 2-3 additional NVMe slots beyond the boot drive to be able to add more local storage and I'd look for something with USB 4 since it will be compatible with most/all Thunderbolt devices which gives a lot of better/more interesting options for external storage/devices.
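As a rough illustration of why 2TB gets cramped quickly with 12K material, here's the arithmetic with a placeholder data rate. The 300 MB/s figure is an assumption for the sake of the example, not a real spec - check Blackmagic's data rate calculator for the actual BRAW rate at your resolution, frame rate and compression ratio.

```python
# How long a 2 TB drive lasts at an assumed sustained data rate.
data_rate_mb_s = 300   # placeholder MB/s - substitute the real figure for your settings
capacity_tb = 2

seconds = capacity_tb * 1e12 / (data_rate_mb_s * 1e6)
print(f"~{seconds / 3600:.1f} hours of footage on {capacity_tb} TB")  # ~1.9 hours at 300 MB/s
```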
  18. https://www.dell.com/en-us/shop/desktop-computers/alienware-aurora-r15-gaming-desktop/spd/alienware-aurora-r15-amd-desktop/wdr15amd50h Any idea if it's possible to find a better option out there?
  19. He’s the only man who doesn’t hate the sound of his own voice 😉
  20. Also, I'm fairly sure that's Morgan Freeman doing the voiceover for the trailer.
  21. That's solid advice. I know that @kye has been recording at 4K 100Mbps 8-bit, and that's basically the same as 1080p 10-bit but with a little more detail, which makes a ton of sense. I would also like to put the 1080p 12Mbps 10-bit proxy on the S5ii up against my 4K 100Mbps 8-bit on my GX800. I'm not so sure the 4K would come out 10x ahead, especially since I can barely tell the difference at normal viewing distances with 6K (and don't get me wrong, the 6K looks amazing at 100%). Of course, nothing is "for free", but 1080p HEVC gives a little bit of that impression sometimes.
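Out of curiosity, here's the raw bits-per-pixel arithmetic behind that comparison. It deliberately ignores that HEVC is more efficient than AVC at a given bitrate, and says nothing about which file looks better - it only shows that the 4K recording isn't "10x ahead" in data terms.

```python
# Bitrate spread over the pixel count of each frame (frame rates assumed equal, so they cancel out).
gx800_4k = 100e6 / (3840 * 2160)    # ~12.1 (H.264, 8-bit)
s5ii_proxy = 12e6 / (1920 * 1080)   # ~5.8 (HEVC, 10-bit)

print(f"4K 100 Mbps:   {gx800_4k:.1f} bits/s per pixel of frame area")
print(f"1080p 12 Mbps: {s5ii_proxy:.1f} bits/s per pixel of frame area")
print(f"Ratio: {gx800_4k / s5ii_proxy:.1f}x - about twice the data per pixel, not 10x")
```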
  22. ...and is that a Nagra IV open-reel tape recorder the sound op is using?
  23. I did a similar 1080p to 4k comparison with 10-bit 50p HEVC files from my OM-1 very recently (as a check after I'd updated the FW to the latest 1.6 version). 1080p is nominally 40Mbps and 4k is 150Mbps. With the 1080p upscaled to 4k (using the FFMPEG zscale 'spline36' filter), at normal viewing distance on a 55" native 4k OLED TV I could tell them apart (as I know what to look for) but it's not easy. A normal viewer wouldn't notice. I've done the same comparisons in the past with files from my G9 with the same result. As a consequence of this, most often I record in 1080p 10-bit and save 75% of the storage space, unless there is a reason to want maximum resolution/quality e.g. it's an 'unrepeatable' major trip or event, to allow for re-framing or extraction of 4k stills. For the last one (which is handy for wildlife), I often record at 4k 24/25/30p 10-bit as that is sharper on the OM-1 than 4k 50/60p, but use 1/100 shutter speed to reduce motion blur while being reasonably usable as video footage as well.
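For anyone who wants to repeat that upscale test, this is roughly what the command looks like, wrapped in Python to keep the snippets in this thread in one language. It assumes an ffmpeg build compiled with libzimg (which provides the zscale filter); the file names and x264 settings are just examples, and for a strict 10-bit comparison you'd want a 10-bit output format as well.

```python
import subprocess

# Upscale a 1080p clip to UHD with the zimg 'spline36' resampler, as described above.
subprocess.run([
    "ffmpeg",
    "-i", "clip_1080p.mov",                           # example input name
    "-vf", "zscale=w=3840:h=2160:filter=spline36",    # requires ffmpeg built with libzimg
    "-c:v", "libx264", "-crf", "16", "-preset", "slow",
    "-c:a", "copy",
    "clip_2160p.mp4",                                 # example output name
], check=True)
```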
  24. Actually, what am I blathering about?! The only time I need the EIS is for tracking the couple, outdoors, once, maybe twice. Doh. Absolutely a non-issue. I'll simply put it on a function button! I'd normally be tracking at '42mm' and it will be '58mm', so I'll just increase the distance between myself and the subject by a couple of feet et voila. Move on, nothing to see here 🤪
  25. Yes, after seeing more photos, it looks like it. I'm guessing the lens is the Angenieux 9.5-57mm zoom lens.