Everything posted by BTM_Pix
-
Step 2 - Holy Kit

OK, so we've established that there is no way we can match the financial, physical and personnel resources available to The Mandalorian, but we do need to look at just how much kit we need and how much it will cost to mimic the concept. From the front of our eyeball to the end of the background, we are going to need the following:

1. Camera
2. Lens
3. Tracking system for both camera and lens
4. Video capture device
5. Computer
6. Software
7. Video output device
8. Screen

No problem with 1 and 2 obviously, but things start to get more challenging from item 3 onwards.

In terms of the tracking system, we need to be able to synchronise the position of our physical camera in the real world with the synthetic camera inside the virtual 3D set. There are numerous ways to do this but at our level we are looking at re-purposing Vive Tracker units mounted on top of the camera. With the aid of a couple of base station units, this will transmit the physical position of the camera to the computer and, whilst there may be some scope to look at different options, this is a proven solution that, initially at least, will be the one for me to look at getting. Price wise, it is €600 for a tracker and two base stations, so I won't be getting the credit card out for these until I've got the initial setup working. In the initial stages it will be a locked-off camera only !

For the lens control, there also needs to be synchronisation between the focus and zoom position of the physical lens on the physical camera and that of the virtual lens on the virtual camera. Again, the current norm is to use additional Vive Trackers attached to follow focus wheels, translating their movements to focus and zoom positions, so that will require an additional two of those at €200 each. Of course, these are also definitely in the "to be added at some point" category, as is the camera tracker.
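The camera tracking part described above boils down to a simple idea: each frame, take the tracker's pose and compose it with a fixed offset (measured once, for where the sensor sits relative to the tracker puck) to get the virtual camera's pose. Here's a minimal sketch of that idea in plain Python with invented pose values; a real setup would read poses from OpenVR/SteamVR and feed them to Unreal via Live Link rather than doing any of this by hand.

```python
# Sketch: map a tracked camera pose into the virtual set.
# The tracker sits on top of the camera, so a fixed "mount offset"
# transform maps tracker space to the camera sensor's position.
# All values below are invented for illustration.

def mat_mul(a, b):
    """4x4 matrix multiply (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Homogeneous translation matrix."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

def virtual_camera_pose(tracker_pose, mount_offset):
    """Virtual camera pose = tracker pose composed with the fixed
    tracker-to-sensor offset measured during calibration."""
    return mat_mul(tracker_pose, mount_offset)

# Hypothetical example: the tracker reports the rig 2m along X and
# 1.5m up, and the sensor sits 10cm below / 5cm behind the puck.
tracker_pose = translation(2.0, 1.5, 0.0)
mount_offset = translation(0.0, -0.10, -0.05)

cam = virtual_camera_pose(tracker_pose, mount_offset)
print([row[3] for row in cam[:3]])  # camera position in set space
```

In practice rotation matters just as much as position (the pose matrices carry a 3x3 rotation block too), and getting the mount offset right is exactly the fiddly calibration step the commercial systems charge for.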
The other option here is lens encoders, which attach to the lens itself and transmit its physical position to the computer, which then uses a calibration process to map the readings to distance or focal length. These are roughly €600 each so, well, not happening initially either.

Moving on to video capture, that is something that can range from €10 HDMI capture dongles up to BM's high end cards, but in the first instance, where we are just exploring the concepts, I'll use what I have around.

The computer is the thing that is giving me a fit of the vapours. My current late 2015 MacBook Pro isn't going to cut it in terms of running this stuff full bore, and I'm looking to get one of the new fancy dan M1 Max ones anyway (just not till the middle of the year), but I'm not sure if the software support is there yet. Again, just to get it going, I am going to see if I can run it with what I have as a proof of concept and then defer the decision about whether I'm going to take it far enough to hasten the upgrade. What I'm most worried about is the support to fully utilise the new MacBooks not being there and having to get a Windows machine. Eek !

At least the software part is sorted, as obviously I'll be using Unreal Engine to do everything and, with it being free (for this application at least), there is no way of going cheaper !

The video output and screen aspects are ones that, again, can be deferred until or if such time as I want to become more serious than just tinkering around. At the serious end, the output will need a card such as one of BM's and a display device such as a projector, so that live in-camera compositing can be done. At the conceptual end, the output can just be the computer screen for monitoring and a green screen, saving the compositing for later.
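Incidentally, the lens encoder calibration mentioned above is conceptually simple: you record raw encoder counts at a handful of known focus distances, then map live counts to distance by interpolating between those samples. A minimal sketch of the idea, with made-up encoder counts and distances (real systems calibrate against a chart and handle the infinity end of the scale as a special case):

```python
from bisect import bisect_right

# Sketch of encoder-to-distance calibration: record (count, distance)
# pairs at known marks, then piecewise-linearly interpolate live counts.
# The count/distance pairs below are invented for illustration.
CALIBRATION = [          # (encoder count, focus distance in metres)
    (0,    0.45),
    (1200, 1.0),
    (2600, 2.0),
    (4000, 5.0),
    (5000, 10.0),        # infinity would normally be a special case
]

def counts_to_distance(count):
    """Interpolate a raw encoder count to a focus distance."""
    counts = [c for c, _ in CALIBRATION]
    if count <= counts[0]:
        return CALIBRATION[0][1]    # clamp below the first sample
    if count >= counts[-1]:
        return CALIBRATION[-1][1]   # clamp above the last sample
    i = bisect_right(counts, count)           # first sample above count
    (c0, d0), (c1, d1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (count - c0) / (c1 - c0)              # fraction between samples
    return d0 + t * (d1 - d0)

print(counts_to_distance(1900))  # halfway between the 1.0m and 2.0m samples
```

The more calibration points you record, the better the map tracks the lens's non-linear focus throw, which is why the calibration step is so tedious on the real products.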
To be honest, even if I can get my addled old brain to advance things to the stage where I want to use it, it's unlikely that the budget would stretch to the full-on in-camera compositing, so it will be green screen all the way !

OK, so the next step is to install Unreal Engine and see what I can get going with what I have to hand before putting any money into what might be a dead end/five minute wonder for me.
-
Step 1 - The Pros And Not Pros

So, to look at what we want to end up with conceptually, this is the most well discussed example of VP at the moment. Fundamentally, it's about seamlessly blending real foreground people/objects with synthetic backgrounds, and it's a new-ish take on the (very) old Hollywood staple of rear projection, as used with varying degrees of success in the past. The difference now is that productions like The Mandalorian are using dynamically generated backgrounds created in real time to synchronise with the movement of the real camera that is filming the foreground content.

Even just for a simple background replacement, like the driving examples from the old films, the new technique has huge advantages in terms of being able to sell the effect through changeable lighting, optical effect simulation on the background etc. Beyond that, though, the scope for generating and re-configuring complex background sets in real time is pretty amazing for creating series such as The Mandalorian.

OK, so this is all well and good for a series that has a budget of $8.5m an episode, shot in an aircraft hangar sized building with more OLED screens than a Samsung production run and a small army of people working on it, but what's in it for the rest of us ? Well, we are now at the point where it scales down and, whilst not exactly free, it's becoming far more achievable at a lower budget level, as we can see in this example. And it is that sort of level that I initially want to head towards, possibly spotting some opportunities to add some enhancements of my own in terms of camera integration.

I'm pretty much going from a standing start with this stuff, so there won't be anything exactly revolutionary in this journey as I'll largely be leaning pretty heavily on the findings of the many people who are way further down the track.
In tomorrow's thrilling instalment, I'll be getting a rough kit list together and then having a long lie down after I've totted up the cost of it and have to re-calibrate my "at a budget level" definition.
-
Creating emulations of different lens profiles could well become a thing.
-
You and me both. Like most things these days, the amount of stuff and graft required to get systems set up with "real" cameras, just to reach the same starting point in terms of real time compositing etc as some phone applications, is quite mad !

The interesting thing is how it may impact the role/expense of the camera and lenses in such productions as, from a practical though simplistic point of view, the onus just becomes achieving a sharp capture of the live element, because all of the background elements, where the character/mojo/hype of real cameras and lenses kicks in, will be synthetically created. It offers the unwanted potential for some horribly synthetic "portrait mode" stuff like some smartphones, of course. But with much more expensive cameras and lenses. And tons more computing power required !
-
They say "A journey of a thousand miles begins with a single step" and that describes exactly where I am with Virtual Production.

The reason for setting off on this journey ? Curiosity about the process is one aspect - and is something I'm hoping will sustain me when the expected big boulders block the route - but I also think VP is going to be a process that trickles down to lower budget projects faster than people might expect, so it is something I'm keen to get involved in.

What aspects of VP am I looking to learn about and get involved with ? In the first instance, it will be about simple static background replacement compositing, but it will move on to more "synthetic set" 3D territory, both for narrative and for multi camera virtual studio applications. There will also be a side aspect of creating 3D assets for the sets, but that won't be happening until some comfort level with how to actually use them is achieved !

And the ultimate objective ? I'm not expecting to be producing The Mandalorian in my spare bedroom, but I am hoping to end up in a position where a very low-fi version of the techniques used in it can be seen to be viable. I'm also hopeful that I can get a two camera virtual broadcast studio setup working for my upcoming YouTube crossover channel Shockface, which is a vlog about a gangster who sells illicit thumbnails to haplessly addicted algorithm junkies.

I'm fully expecting this journey to make regular stops at "Bewilderment", "Fuck Up", "Disillusionment" and "Poverty", but if you'd like to be an observer then keep an eye on this thread as it unfolds. Or, more likely, hurtles off the rails.

Oh, and from my initial research about the current state of play, there may well be a product or two that I develop along the way to use with it too...
-
I think this is a neat idea. Anything that encourages more viewfinder gazing than navel gazing is a win for me at the moment. I'm in.
-
Kotaro became quite the celebrity after his sax playing exploits.
-
He was absolutely desperate to get a grip of that protein shake wasn't he ?
-
This is an interesting product that has dual slide outs for holding other equipment. It might be a bit heavy, but it's something that could be used as, erm, "inspiration" for something DIY. https://www.amazon.co.uk/AboveTEK-Portable-Retractable-Non-Slip-Notebook/dp/B074473Z6T

Tangent, who make hardware control surfaces for Resolve, have something that might be worth looking at. It's a virtual control surface that runs on iPad or Android tablets, so it would satisfy the compactness criteria as you can scale it by what size of tablet you run it on. You can even run it on different units at the same time, so you could have one tablet running the screen with the control wheels on it and another the transport page etc. It doesn't have any mad requirements on the tablet side, so it will run on any cheapo 7" no brand model, and a couple of those will still give you a sub €150 solution. There is a free version available to try out to see if it suits, and the full version is €8.99 (you only need one copy for multiscreen operation). https://www.tangentwave.co.uk/products/

If you don't fancy the idea of virtual surfaces, Tangent's compact Ripple controller might be worth a look if you can find one.
-
Canon rep at the launch whilst simultaneously neglecting to mention the 8 second delay when switching modes between stills and cine whilst also taking some of the AF capability from you.
-
I used to have one of these adapters that Cokin made for their A series filters to enable you to use them with compact cameras that, obviously, don't have a filter thread. The smart thing about it is that the grid of openings allows you to account for different mounting positions and distances from the lens, so you would be able to position it to be snug against the front of the WA adapter. It holds three filters so is quite flexible, and it's made of plastic so weighs barely anything and folds flat for transport. They are long since discontinued (and I'm fucked if I can remember where my one is now) but might be worth a scout on eBay if it suits your needs.
-
Contax-Zeiss Sonnar 135mm f2.8 on Sigma FP Shot about 25 mins apart on a very changeable weather wander!
-
In my experience, as the F to EF adapters are by necessity reasonably thin, they are so tight that the challenge of getting them off again is far bigger than the challenge of keeping them on and secure !
-
Yeah, my Nikon mount speedbooster sits in a drawer and remains as unused as Boris Johnson's sincerity.

Speaking of Pockets/Micros, an EF speedbooster used with a used Canon 10-18mm EF-S makes a really good alternative to the venerable Tokina 11-16mm for anyone looking for a wide option for those cameras. It's far cheaper, smaller and lighter than the Tokina and not only has a bigger range but also has IS, which is a big boon on those cameras. OK, it's slower than the Tokina and doesn't have its constant aperture either, but the speedbooster gets some speed back, obviously. I might do some frames with it to post here if the plague actually calms the fuck down again in my area.
-
I've got no interest in the camera to be honest, but I'd just like to say that it's always good to see the democracy manifest fella making an appearance in any video.
-
Sigma fp and EVF-11 for other cameras... Could be possible
BTM_Pix replied to paulinventome's topic in Cameras
Based on NOT having access to the service manual for the FP or the EVF11.......

The USB-C port is more than likely a simple pass through rather than being used for power/control of the EVF11. You can test this by putting a USB-C cable from the FP to the EVF11 and then another one from that into an external SSD. If the EVF11 doesn't power on and the camera can still see and use the SSD, then that will confirm it.

Based on it not being powered from the USB-C port, the presumption is that it is being powered from the two pogo pins, with one of them being power and the other one obviously being ground. From tracing the connection from the pins of the HU11 flash interface to its hotshoe just now, the ground pin (unless there is some internal re-mapping) is the middle pin on the bottom row of the FP, which would mean the one adjacent to it (as the pogo pins are horizontal on the EVF11) should be the power. However, I can't tell from the images of the EVF11 (as I don't have one, unfortunately) whether this is the one to the left or right, but you should be able to determine that when you offer it up to the FP for connection. Once that is established, you should be able to measure the voltage coming from the FP to determine what external power you will need to provide.

If there is no control information between the FP and the EVF11 through the USB port, then it will be being provided through the HDMI port's data control channel. If I was going to look at hacking the EVF (heaven forbid, obviously) then I would start with extending out the connections from the FP to it, in the same way as with the USB-C port, before looking at the power, as that offers the least chance of blowing up the FP and/or the EVF11. So, put some clips between the pogo pins of the FP and the EVF11 and an HDMI cable in between too.

You can then remove the HDMI cable and see what happens, both with it going to fresh air and also when you plug it into another HDMI device, when you plug it back into the FP etc. This should tell you if there is any specific data communication over HDMI from the FP to the EVF11 that prevents it working with other sources, because if there is then there isn't much point going on to the power aspect. If it does work with another HDMI source whilst being powered from the FP, then you might still not be out of the woods, as there could be an initial handshake happening prior to you changing source, so the next test would be to power it on with the other source already connected and see if it still works. If it does, then you are only looking at the power situation, but obviously proceed absolutely at your own risk with that, both on the initial measuring side on the FP and then in building something custom based on that to power the EVF11.

-
It might sound counter-intuitive but, even though you've got F mount lenses, I'd look at getting the EF version instead and using an F to EF adapter on it. The reason is that you get no electronic control of Nikon lenses with the Nikon version anyway, so you aren't losing anything there, but what you will gain is more flexibility by being able to put EF lenses on it as well, where you will get electronic control from the P4K. So, if you do get any EF or EF-S lenses, you will be able to use aperture control and one shot AF, but also take advantage of the IS on the lens if it has it, which can be a boon with the P4K. With the EF mount being shallower than the F mount, there is no lens you can adapt to F that you can't also adapt to EF, so you don't lose anything there either. On the contrary, you actually gain several mount options, including C/Y and M42, that can be adapted to EF with simple adapters, whereas you would need adapters with optical elements to use them with Nikon F and maintain infinity focus.
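The adaptability point above comes down to flange focal distance: a lens can be adapted with a simple glassless ring, and keep infinity focus, only if its native mount sits further from the sensor than the target mount does, with the difference being the adapter's thickness. A quick sketch using the commonly quoted flange figures (widely published numbers, but double-check for any specific mount):

```python
# Commonly quoted flange focal distances in mm.
FLANGE_MM = {
    "Nikon F": 46.5,
    "M42": 45.46,
    "Contax/Yashica": 45.5,
    "Canon EF": 44.0,
    "Micro Four Thirds": 19.25,
}

def simple_adapter_ok(lens_mount, body_mount):
    """True if a glassless adapter can keep infinity focus: the lens
    mount's flange distance must exceed the body mount's, leaving
    positive thickness for the adapter ring."""
    return FLANGE_MM[lens_mount] > FLANGE_MM[body_mount]

def adapter_thickness(lens_mount, body_mount):
    """Required adapter thickness in mm (only meaningful if positive)."""
    return FLANGE_MM[lens_mount] - FLANGE_MM[body_mount]

print(simple_adapter_ok("Nikon F", "Canon EF"))    # F lenses adapt to EF
print(simple_adapter_ok("Canon EF", "Nikon F"))    # EF to F needs optics
print(adapter_thickness("Nikon F", "Canon EF"))    # the adapter is 2.5mm
```

That 2.5mm figure is why F to EF adapters are so thin, and the big gap down to Micro Four Thirds is why the Pockets can take practically anything via an adapter or speedbooster.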
-
Some Voigtlander 40mm f1.4 M mount stuff on the Sigma fp from a recent trip to the frozen-ish north to see the in-laws. The vignette and slight softness when shooting wide open make it quite suitable for festive stuff ! Didn't shoot RAW so these are grabs from 4K All-I mov files.
-
I'm not sure what the issue is here ? There are support materials covering all aspects of calibration and operation, which have been updated over the past nine months. All of the following support materials have been publicly published on the project page, and all AFX owners have updates of these support materials emailed to them automatically at the address they used when they ordered the AFX.

User Guide - The full User Guide for the AFX is available at www.cdatek.com . Please note that the User Guide is subject to addition as new features are added, so please check that you have the most current version.

Focus Calibration Chart - The focus calibration chart is available for download at www.cdatek.com and should be printed out for use when calibrating your lenses.

Firmware Upgrades - All firmware upgrades are available at www.cdatek.com/afx-upgrade . On this page you will also find the firmware installation guide. The availability of new firmware is announced through the Indiegogo Project Update section and you should receive automatic notifications as they become available.

Video Walkthrough Of Calibration Procedure For Electronic Lenses - A live walkthrough of the calibration procedure for electronic lenses on the Pocket4K/Pocket6K is available here.

Video Walkthrough Of Lens Profiling Step For Pocket 6K - The Pocket6K lenses require an additional profiling step prior to the main calibration process, which is covered in a live video walkthrough here.

Video Walkthrough Of Calibration Procedure For Manual Focus Lenses - A live walkthrough of the calibration procedure for motor controlled manual focus lenses is available here.

Video Walkthrough Of Motor Calibration Procedure For Non Hard Stop Lenses - A live walkthrough of the motor calibration procedure for manual focus lenses that do not have hard stops is available here.

Video Walkthrough Of Zoom Lens Operations - A live walkthrough of the calibration procedure for zoom lenses and their operational considerations is available here.

Video Walkthrough Of QuadLock - A live walkthrough of the functionality and operation of the QuadLock system is available here.

Example Of Using Adapted EF Lens On Pocket4K - An example of AFX AF-C performance on the Pocket4K using a Metabones Speedbooster and Sigma 18-35mm EF lens is available here.

Comparison Of AFX AF Performance With Pocket 6K Internal AF - A comparison of the performance of the AFX in AF-C mode on a Pocket 6K with a Sigma 18-35mm EF lens versus the camera's own internal AF system is available here.

Focus Recorder - A video walkthrough of the new Focus Recorder function in AFX v1.7 is available here.

Firmware Update Process - A video walkthrough of the firmware update process is available here.
-
@Sammysammy I have responded to your PM. As an FYI mate, I prefer all tech support requests to be directed to cdatek.com to get a formal response, rather than PM on here, as it's not the correct way for us to track them etc. Thanks
-
I did a quick thing about it when I first picked one up. Nothing special in terms of testing, just putting it on top of the camera while recording some fixed position wallpaper stuff. The audio in the video in the thread is from its down mixed live binaural line output straight into the camera (it can produce this output simultaneously whilst it's recording the individual tracks of each capsule to the recorder). The original files are downloadable, so you can try it with their software to experiment with the re-positioning aspect. It is an interesting device to supplement other microphones rather than replace them, but I'm hoping at some point that Zoom produce a higher end one with better capsules.
-
For a reactionary pessimist such as myself though, the other way of looking at that is as a 100% failure rate 🙂