
kye

Members
  • Posts

    7,483
  • Joined

  • Last visited

Everything posted by kye

  1. I agree with much of what @Oliver Daniel suggested, including IBIS and EIS taking over from gimbals, though I think the technical solution might look very different. Let me set some context from the other thread about how far we’ve come. In the last ten years:
     • the 5Dii accidentally started the DSLR revolution
     • the BMCC gave RAW 2.5K to the humble masses
     • we got 3D consumer cameras
     • we got 360 consumer cameras
     • we got face-recognition AF, then specific-face-recognition AF, then eye AF
     • we got computational photography that used multiple exposures from one camera
     • we got computational photography that used multiple dedicated cameras (and other devices like time-of-flight sensors) and did so completely invisibly
     • we got digital re-lighting
     • we got the above four points in the most popular camera on the planet
     • we got 6K50 RAW
     So, from before the DSLR revolution to 6K50 RAW and AI in the most popular and accessible camera on the planet, all in a decade. I think we should take some cues from The Expanse and look at the floating camera that appears in Season 4. I think consumer cameras will move towards being smart and simple to use for the non-expert as phones continue to eat the entire photography market from the bottom up. They’ve basically killed the point-and-shoot and will continue to eat the low-end DSLR/MILC range and up. So I think we’ll get more and more 360 3D setups by default, in an overcapture situation where they have cameras in the front/back/sides/top/bottom/corners, which will mean they can do EIS perfectly in post. That will eliminate all rotational instability, but not physical movement in any direction (the way gimbals bob up and down when people walk with them). This will be addressed via AI, as the device will take the image apart, process it in 3D, then put it back together again.
It won’t take much parallax adjustment in post to stabilise a few inches of travel when objects aren’t that close to the device. We won’t get floating cameras though - that would require some pretty significant advances in physics! This AI will enable computational DoF and other things, but I don’t think these will matter much, as I think shallow DoF is a trend that will decline gradually. If this is surprising to you then I’d suggest you watch the excellent Filmmaker IQ video about the history of DoF, and take note that there was a period in cinematic history when everyone wanted deep DoF: once it became available through the tech, the cinematic leaders of the time used it to tell stories where the plot advanced in the foreground, mid-ground and background simultaneously, and everyone wanted it. The normal folks of today view shallow DoF (and nice colours, lighting design, and composition, for that matter) as indicating something is fake, and it becomes associated with TV, movies, and large companies trying to PR you into loving them while they deforest the Amazon and poison your drinking water. The look of authenticity is the look of a smartphone, because that’s where unscripted and unedited stories come from. The last decade started with the smartphone barely existing; now the smartphone is ubiquitous, and so developed that they all basically look the same. Camera phones barely existed a decade ago; now we have had the first feature films shot on them by famous directors (publicity stunt or not). People overestimate what can happen in a year, but underestimate what can happen in a decade.
  2. kye

    Sirui anamorphic

    I'd suggest that it wouldn't work so well trying to make such a large change in hue. The problem is the same as greenscreening, except that this example has extremely soft edges, so getting the key right would be tricky and the edges would probably have strange halos. Take a few screenshots of one of the videos above and try it. I'd suggest grabbing a key, blurring it vertically in one node, blurring it horizontally very heavily in the next node, and then changing the colour based on that. Not sure if PP or FCPX will let you do that workflow, but it's pretty easy in Resolve Studio.
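For anyone who wants to experiment outside a grading app, the node chain described above (key a hue range, blur the key vertically, then heavily horizontally, then shift hue only inside the softened key) can be sketched in plain NumPy. This is a toy illustration under my own made-up function names and parameters, not Resolve's actual implementation:

```python
import numpy as np

def gaussian_blur_1d(img, sigma, axis):
    """Separable 1-D Gaussian blur along one axis (vertical or horizontal)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    return np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), axis, img)

def soft_hue_shift(hue, key_center, key_width, shift, v_sigma=2.0, h_sigma=8.0):
    """Qualify a hue range, soften the key (vertical blur, then a heavy
    horizontal blur), and shift hue only where the softened key is strong."""
    # node 0: build the key from hue distance to the target colour
    key = np.clip(1.0 - np.abs(hue - key_center) / key_width, 0.0, 1.0)
    key = gaussian_blur_1d(key, v_sigma, axis=0)   # node 1: vertical blur
    key = gaussian_blur_1d(key, h_sigma, axis=1)   # node 2: heavy horizontal blur
    return (hue + key * shift) % 360.0             # node 3: shift inside the key
```

The soft key is what avoids the hard halo edges mentioned above: the hue shift fades out gradually instead of cutting off.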
  3. well, ok, just a few.... You can adapt almost anything: it's flexible. Portable: this is my dual-camera setup for travel (pic taken on my towel, on my lap, in the tour bus doing 80km/h on the beach). And the files are crazy gradable: for example, if we take this image here and try to break it, we basically can't, which means you're free to make whatever images you want.
  4. For me it's the GH5. Not perfect, but a real workhorse, and really ahead of its time. IBIS means I can shoot hand-held without being limited to a hand-held look, 10-bit internal gives hugely flexible footage without needing an external recorder, 4K60 and VFR up to 1080p180 are great, the MFT mount makes adapting almost anything super easy, the flippy screen and EVF are really useful, and the list goes on... That's my personal choice, but if I had to pick the most significant cameras in general it would be the BMPCC and BMCC, as I think they moved the goalposts forward by an enormous margin. Many of the features we all enjoy today were inspired by that shift: camera manufacturers have historically been very traditional and reluctant to give anything except the smallest possible improvement on newer models, and this shook up the industry a bit and showed there is room for new players to be profitable and innovative.
  5. I was thinking that too. Even a stereo pair placed above the camera and mixed in would give some sense of space. The best solution would be to decode a 3D signal to binaural output based on your viewing angle, but do we know if the platforms support this kind of encoded sound, or if they just pump out a stereo audio track? I figure we're in the infancy of this stuff and it will get there, but I'm not really sure how far along the tech is right now w.r.t. audio. I do know that in terms of adjustability it's normally pretty rubbish. I have MobileVRStation for iOS and the number of options in it is ridiculous, while the number of options in most online streaming platforms is zero.
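The "decode a 3D signal based on your viewing angle" idea is roughly what first-order ambisonics does: the sound field is stored as a few directional channels that can be rotated to follow the head before decoding. Here's a minimal NumPy sketch of the principle, using a simplified horizontal-only convention of my own invention (real platforms use standards like ambiX/SN3D, and proper binaural decoding uses HRTFs rather than the crude two-cardioid decode below):

```python
import numpy as np

def encode_fo(azimuth_deg):
    """Encode a mono source into first-order horizontal ambisonics (W, X, Y).
    Toy convention: W is omnidirectional, X points front, Y points left."""
    a = np.radians(azimuth_deg)
    return np.array([1.0, np.cos(a), np.sin(a)])

def rotate_yaw(wxy, yaw_deg):
    """Rotate the sound field to compensate for a head turn of yaw_deg."""
    t = np.radians(yaw_deg)
    w, x, y = wxy
    return np.array([w, np.cos(t) * x + np.sin(t) * y,
                        -np.sin(t) * x + np.cos(t) * y])

def decode_stereo(wxy):
    """Decode to two virtual cardioid mics facing left and right."""
    w, x, y = wxy
    return 0.5 * (w + y), 0.5 * (w - y)   # (left, right)
```

A source encoded at 90 degrees (hard left) decodes entirely to the left channel; rotate the field by 90 degrees and it becomes a front source, identical to one encoded at 0 degrees.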
  6. Great intro video from Seven Dovey about 180 VR film-making. He talks about the gear, framing and compositions, sound, and touches on editing too. He's shot a few films in this format and they're coming soon apparently so that will be interesting.
  7. kye

    Extreme telephoto work

    Finished processing the two timelapses I took. Resolve doesn't read RW2 files from the GH5, so I had a couple of goes at processing them, but I discovered that the Adobe DNG Converter creates 16-bit DNG files, which keeps the bit-depth and DR of the RAW files - useful when you're clipping things this hard. Processing involved stabilisation, setting the black and white points, and a key of just the clipped areas, which I pushed towards orange so they weren't digital white. I'm thinking I'll do one where I don't clip the sun and see how it looks, although I suspect everything else will just be black - I'd be happy to be proven wrong. In both of them the edges of the sun appear blocky and horribly compressed, but it's actually due to atmospheric effects: the RAW files are, well, RAW, and the pixel size is much smaller than the rather square-looking edges. I suspect it would make more sense visually with a higher frame rate (the GH5 can't do a time-lapse interval smaller than 1s, which is what these are). I'd try video, but I've zoomed in a bit on these, so the extra resolution seems to be quite useful. Suggestions welcome..
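A quick toy illustration of why keeping the full bit-depth matters when you're stretching a narrow tonal range this hard: quantise a linear ramp to a given bit-depth, push the shadows with a big gain, and count how many distinct levels survive in the darkest slice. This is a pure NumPy thought experiment, nothing to do with Resolve's or the DNG Converter's actual internals:

```python
import numpy as np

def distinct_levels_after_push(bit_depth, gain=16.0, region=0.05):
    """Quantize a linear ramp to bit_depth, push the darkest `region` of the
    range with a hard gain, and count the distinct output levels left."""
    signal = np.linspace(0.0, 1.0, 100_000)
    levels = 2**bit_depth - 1
    quantized = np.round(signal * levels) / levels
    shadows = quantized[signal < region] * gain   # an aggressive shadow lift
    return len(np.unique(shadows))
```

At 10 bits, the bottom 5% of the range only holds a few dozen code values, so a hard lift shows banding; at 16 bits there are thousands, which is why the 16-bit DNGs hold up under this kind of grade.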
  8. Thanks all.. Looks like it's either a "real" video capture solution from BM or equivalent, a "real" monitor that I could also use for shooting, or a mains powered TV or monitor of some kind. I wouldn't use a monitor that much in my normal shooting but I would use it for shooting my kids sports games, although how many more seasons he sticks with that is uncertain.
  9. Thanks all. I was really hoping for a link to a $30 eBay USB dongle. I was kind of thinking of a laptop as a half-way measure between having an on-camera monitor and having a huge TV/monitor on set like is common on big productions. I wouldn't carry around a laptop for my run-n-gun shooting, but if you're shooting in a single location then I don't see why a larger screen wouldn't be useful. I've had heaps of things that I shot on the GH5 that looked fine on the monitor (or the viewfinder, which is higher resolution) but were completely fubar when I saw them on my 32" display, and I'm not sure that a 7" display is really large enough to be a perfect proxy for a 32" 4K display, let alone a 65" 4K TV or a projector setup. And before anyone tells me that big productions use 7" monitors just fine, let me say that I watch Netflix / Prime on my 32" 4K display and I see out-of-focus shots all the time.... There should be a chip that converts HDMI to USB, and there should be a factory in China just pumping out the standard implementation circuit from the spec sheet at a very small markup to the parts cost. Surely?
  10. kye

    Sirui anamorphic

    I think of it much less like marketing and more like market research. There's a principle in market research that you can ask someone if they would purchase something and get an answer, but it's only when you ask the person to actually buy it that you get their real answer. Lots of people will tell you your idea is good, or that they would buy, to make you feel better, or because they don't like saying no to people, or they're optimistic, or excited by the idea, etc, but it's not reality. Why design a product, fit out a factory with tooling, manufacture a bunch of products, market the shit out of it, and only then discover that the people in the focus group who said they were excited to buy it were just being nice... ? These crowdfunding campaigns are more valuable to manufacturers as market research than as anything else.
  11. kye

    Extreme telephoto work

    @leslie lots of ways.. You can either do the whole image, I’d suggest playing with the Gamma or Gain wheels and see if you can get the sun and the smoke the right colours at the same time. The other approach is to do a key of just the sun, and then you could try the same things, or try the Hue vs Hue and shift it towards red/orange and further away from pink/magenta.
  13. @heart0less - that is a fascinating video. Thanks! What I found interesting about it is:
      • They credit the guys at LiftGammaGain (I suspect most similar videos come from people who either don't know about LGG or plainly rip it off without saying)
      • They mention a LUT for all of 2 seconds before talking about everything except colour
      • They made a complete tutorial that didn't include possibly the most significant things that (IMHO) make something look like film
      • They also did something I think was great - they showed the LGG thread. A quick search and now I can go read it!
      The thread is only about halation and gate weave (which I had to look up - it's the jumping around that film does, like why the credits move around when being projected). It's also interesting that it was started by Jason Bowdach, whose name was familiar - he'd written one of the reference articles on film emulation that I refer back to on occasion. So, all that said, here are the other links I've found useful if you want to emulate film:
      • To start, this article is very good and talks about non-linearities and saturation behaviour: https://www.provideocoalition.com/film-look-two/ (it also links to a very interesting video showing how the Alexa handles things)
      • Noam Kroll has a decent write-up: https://noamkroll.com/how-to-make-your-digital-footage-look-like-film-in-post-production/
      • The article from Jason Bowdach that covers a lot of ground: https://blog.frame.io/2019/10/21/emulating-film-look/
      • and lastly, the article the video above references: https://liftgammagain.com/forum/index.php?threads/halation-and-gate-weave.13056/
      Of course, after all this, the film look really comes from lighting and composition in the first place, and remember that if you want the film look it's better to first get someone to look at your film (ha ha), so it should be entertaining in the first instance. Film is probably an infinitely deep rabbit hole....
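Since halation came up: it's the warm glow around bright highlights caused by light passing through the film and reflecting back off the base, and a toy version is easy to sketch - isolate the highlights, blur them, tint the glow red-orange, and screen it back over the image. This is a rough NumPy illustration with made-up parameter values, not any particular plugin's algorithm:

```python
import numpy as np

def box_blur(img, radius):
    """Crude separable box blur (stand-in for a soft optical bloom)."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blur = lambda m: np.convolve(m, kernel, mode="same")
    img = np.apply_along_axis(blur, 0, img)
    return np.apply_along_axis(blur, 1, img)

def add_halation(rgb, threshold=0.8, strength=0.3, radius=4):
    """Toy halation: isolate highlights, blur them into a glow, tint the
    glow red-orange, and screen it back over the image."""
    luma = rgb.mean(axis=-1)
    mask = np.clip((luma - threshold) / (1.0 - threshold), 0.0, 1.0)
    glow = box_blur(mask, radius)[..., None] * np.array([1.0, 0.35, 0.1])
    return 1.0 - (1.0 - rgb) * (1.0 - strength * glow)   # screen blend
```

The screen blend means fully white pixels stay white and the tint only shows where the glow spills past the highlight edges, which is the characteristic look the LGG thread discusses.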
  13. kye

    Extreme telephoto work

    Yeah, the focus would be hard - during football season I shoot highlights at my kids' games, so I spend a bit of time manually pulling focus at around 200-400mm equivalent, but you're in another whole league at 1500mm. One challenge I have with my rig is that it's on a TC and a dumb adapter, and the lens doesn't have a tripod foot, so it's being supported by the camera (and I also hold the weight of the lens with the hand that's focusing) - getting some additional lens support would be ideal in future. That is a really cool video and awesome zooms. Thanks for posting the article, some good tech talk in there, and it makes sense they would have needed a robot - those focal lengths are really difficult to manage, especially with those fast transitions. Luckily the sun doesn't require me to move the camera!! lol, let's hope! One of our neighbours has a two-storey house with a balcony and they're often out there, so I'm not the only one. Of course, I can't really see into other people's backyards from that angle either, so there's that. Well, not an optical viewfinder anyway! I have no idea at what point the sun is too much, but I'm shooting without filters, although at F4 the lens is hardly super fast. Thanks - interesting setup. I went with the oats solution as it keeps the camera low to the ground, so air only goes over the top of it rather than around it with lots of turbulence, and the oats also provide multiple points of contact rather than a single point subject to flex and twist. My tripod would have to be at eye-level (and I'm over 6 foot) to see the horizon - basically its least stable configuration.
  14. Interesting film from Blackmagic... Shot on the P6K, post in Resolve 16, and the credits say "Produced in 14 days", so presumably a short-timeframe project. What I think is interesting is that (although it doesn't say so) I suspect the whole production was done in Resolve, which likely means all the VFX are from the Fusion page. This would be pretty cool, as I think the Fusion page has huge untapped potential and we haven't seen it in use yet because the people who really know Fusion are off getting paid, not on YT. It would be interesting to see some BTS, especially of what happened in post for this.
  15. The other day I noticed the sunset and decided to have a go at filming it at super-telephoto, so that the sun fills a large section of the frame and you see all the atmospheric distortions and stuff. The first attempt was a write-off because I stuffed up the focus. The second attempt was better: I was able to focus properly, and then closed the aperture a few stops to make sure I got it. I stuffed up the settings as I still had it on my normal auto-exposure and auto-WB, which is why I'm only posting this still; however, I think I'm getting closer.. This was shot in 4K, but next time I think I'll do a normal time-lapse and get the benefits of RAW and a better ISO, as there is quite a bit of grain in the file. The setup seems to work though: GH5, Canon FD 70-210 F4 at 210mm F11, Canon FD 2x TC. Set up on a packet of oats for stability in the breeze, on top of a pillar on the fence. I suspect that at 840mm equivalent, stability is one of the key factors here, and the oats seemed to do the trick. Once I get it all set up I can probably put a bag of pasta or something on top to further damp it and deflect some of the wind. Yesterday was cloudy so no sunset, but hopefully tomorrow I'll get another go with fixed settings and in time-lapse mode, and I'll be able to post a video instead of just a still. Anyone else shooting longer than 600mm, or 800mm equivalents?
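For anyone checking the maths on that "840mm equivalent": it's the lens focal length, times the 2x teleconverter, times the GH5's 2x MFT crop factor:

```python
# Equivalent focal length of the sunset rig described above:
# Canon FD 70-210 at 210mm, a Canon FD 2x TC, and the GH5's 2x (MFT) crop.
lens_mm = 210
teleconverter = 2.0
crop_factor = 2.0  # Micro Four Thirds vs full frame

equivalent_mm = lens_mm * teleconverter * crop_factor
print(equivalent_mm)  # 840.0
```

The same arithmetic puts the football rig (200-400mm equivalent) a long way short of the 1500mm work discussed later in the thread.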
  16. Can I use a laptop as a large portable HDMI monitor, presumably via some kind of USB HDMI-in adapter/dongle and some software that allows viewing (and maybe some cool features like LUTs, false colour, focus peaking, etc.)? It seems ridiculous to suggest that I have to buy an HDMI monitor when that means re-buying the screen, buttons, battery, etc. that are already in a laptop. Surely there's an adapter that's cheaper than a monitor with focus peaking and various features? I'd be looking to use it with my GH5 for critical focus when I'm either too far from the camera to see the flippy screen, or when I can't check focus well enough using the (rather mediocre) focus peaking that it delivers.
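Focus peaking itself is simple enough that viewer software on a laptop could do it in real time: mark the pixels where local edge contrast is high, since in-focus detail produces strong gradients. A minimal NumPy sketch of the idea (the threshold value is arbitrary, and real implementations typically denoise first and draw the mask as a coloured overlay):

```python
import numpy as np

def focus_peaking_mask(gray, threshold=0.2):
    """Return a boolean mask of 'in focus' pixels: places where the image
    gradient magnitude exceeds a threshold. `gray` is 2-D, values in [0, 1]."""
    gy, gx = np.gradient(gray)          # per-axis finite differences
    magnitude = np.hypot(gx, gy)        # edge strength
    return magnitude > threshold
```

Applied to each frame from a capture dongle, the True pixels would be painted red over the preview, exactly like an on-camera peaking display.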
  17. Totally agree. Although, considering that your comment about film stock is very relevant here and that's now the sensor and CS which means it's the camera too, I'd say it's Lighting, Lenses, and Camera (in that order!). Of course, the full list is more like: Story, Acting, Sound, Lighting, Cinematography, Lenses, and then Camera (in that order!)
  18. kye

    Sirui anamorphic

    Naturally, I choose the cheapest single-focus lens that matches with my 16:9 camera... oh wait!
  19. Thanks to @noone for posting that link to lensrentals - that article completely changed the way I think about DoF and lenses. I hadn't previously made the link that field curvature is actually a map of DoF, so that's hugely useful. It also has a number of very significant consequences, and may help to join together DoF, field curvature, the FF look, how an image transitions between in-focus areas and out-of-focus areas, and 3D pop. I'm contemplating and will return with my thoughts once I've re-adjusted my head and worked out what this means.
  20. kye

    Sirui anamorphic

    @Julien416 I agree. Let's agree to disagree. It's probably a fine point, but your original post said "all the people" and "everyone". I was just trying to bring some balance into the equation. You might be right that the vast majority of people shooting anamorphic want a less clinical look, but in my cine lens thread I also quoted some people talking about the Master Anamorphics. It might just be a pet annoyance of mine, but so often people talk in absolutes now, and I think it divides people, creates friction, encourages polarised thinking, and discourages critical thinking. This is very prevalent in politics - if "everyone" (or even "all" the smart people) shares your opinion then there's no reason to talk about issues, exchange views, and heaven forbid(!) actually learn something new. Anyway, rant over. PS, if you want to shoot anamorphic on a budget then maybe buy the lens and dirty it up in post! Or buy a vintage spherical and crop it. Or whatever. This lens doesn't somehow make all the other lenses disappear. In fact, it makes them cheaper.
  21. kye

    Google Pixel 3

    and here's one from the Pixel 4. skill > tech
  22. kye

    Sirui anamorphic

    I disagree. It might be that the defects are a common reason that people shoot anamorphic, but it's not everyone. Here's a video that includes the Zeiss / ARRI Master Anamorphic and it renders an image almost as technically perfect wide open at T1.9 as other cine lenses at T4. I suspect that shooting anamorphic because you want the flaws is a low budget thing (and because at low budget you can't get sharp well-behaved optics so there's no choice). It's like saying that people only shoot FF to have shallow DoF. Or MFT because they want a small camera. You might make the point that Hollywood cinematographers often like vintage glass because of the softness and rendering, which is true, but there's also a large segment of Hollywood that have the attitude of capturing things in the highest quality possible (meaning highest resolution and most neutral rendering) so that they can push the image around in post later on. It's not a POV you hear a lot, but lots of big movies are shot with this principle.
  23. @heart0less that's very interesting, because out of all the mega-dollar cine lens tests I've seen so far I don't recall a single Panavision lens amongst them. Maybe one or two escaped my attention, but not enough to make me remember them. In the above: Panavision = 29, ARRI = 12, Cooke = 7, Angenieux = 4, and the rest only got 1... Maybe we should buy Panavision lenses and it will make us automatically famous.... On second thoughts, I can't find a single cine Panavision lens for sale. I guess it's beyond those things where the price says "if you have to ask it's too much" and all the way to "if you don't know where to ask then it's too much"!!!
  24. Well, that's definitely a credit to Sony, as I haven't spotted a single clip with colour issues. My understanding of Sony colours is that they are the most accurate but that people don't find them pleasing, and that their log profiles can be hard to grade. Having said that, I haven't worked with Sony footage (apart from my new X3000 action cam), so none of that is from personal experience. Sounds like the OP should just use the default settings...