KnightsFan

Reputation Activity

  1. Like
    KnightsFan got a reaction from Andrew Reid in CanonRumors owner decides to quit   
    This. I love VR for entertainment, I have high hopes for it as a productivity tool, and it's a huge benefit in many training applications. But the Metaverse concept--particularly with Facebook behind it--is a terrible idea.
  2. Like
    KnightsFan reacted to Andrew Reid in CanonRumors owner decides to quit   
    Facebook's vision with the tech that started with Oculus is to virtualise our everyday lives so we do nothing but sit at home with a bucket on our head watching ads.
  3. Like
    KnightsFan got a reaction from majoraxis in My Journey To Virtual Production   
    The main conceptual fix was that I (stupidly) left the rotation of the XR controller on in the first one, whereas with camera projection the rotation should be ignored. I actually don't think it's that noticeable a change, since I didn't rotate the camera much in the first test; it's only noticeable with canted angles. The other thing I didn't do correctly in the first test was that I parented the camera to a tracked point but forgot to account for the offset, so the camera pivots around the wrong point. That is really the main thing that is noticeably wrong in the first one.
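    A minimal Unity-style sketch of those two fixes (the class, field names, and offset value are mine, not from the actual project): take position only from the tracked controller, and apply the measured controller-to-sensor offset before discarding rotation.
    ```csharp
    using UnityEngine;

    // Drives the virtual camera from a tracked controller for screen-projection
    // virtual backgrounds. Position only -- with camera projection the physical
    // camera's rotation is ignored, and a measured offset from the controller to
    // the camera's sensor keeps the pivot point correct.
    public class TrackedCameraRig : MonoBehaviour
    {
        public Transform trackedController;      // XR controller velcro'd to the camera
        public Vector3 controllerToSensorOffset; // measured by hand, in metres

        void LateUpdate()
        {
            // The offset is applied in the controller's local frame *before*
            // the rotation is thrown away, so the camera pivots around the sensor.
            transform.position = trackedController.TransformPoint(controllerToSensorOffset);
            // transform.rotation is left untouched: the projection stays locked
            // to the fixed screen plane.
        }
    }
    ```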
    Additionally, I measured out the size of the objects so the mountains are scaled correctly, and I measured the starting point of the camera's sensor as accurately as I could instead of just "well this looks about right." Similarly I actually aligned the virtual light source and my light. Nothing fancy, I just eyeballed it, but I didn't even do that for the first version. That's all just a matter of putting the effort in to make sure everything was aligned.
    The better tracking is because I used a Quest instead of a Rift headset. The Quest has cameras on the bottom, so it has an easier time tracking an object held at chest height. I have the headset tilted up on my head to record these, so the extra downward FOV helps considerably. Nothing really special there, just better hardware. There were a few blips where it lost tracking--if I was doing this on a shoot I'd use HTC Vive trackers, but they are honestly rather annoying to set up, despite being significantly better for this sort of thing.
    Also, this time I put some handles on my camera, and the added stability means that the delay between tracking and rendering is less noticeable. The delay is a result of the XR controller tracking position at only 30 Hz, plus HDMI delay from PC to TV, which is fairly large on my model. I can reduce this delay by using a Vive, which has 250 Hz tracking, and a low latency projector (or screen, but for live action I need more image area). I think in the best case I might actually get sub-frame delay at 24 fps. Some gaming projectors claim <5 ms latency.
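    For a rough sanity check on those delay figures (illustrative arithmetic, not measurements):
    ```
    frame time at 24 fps:       1000 ms / 24   ≈ 41.7 ms
    30 Hz controller sampling:  up to 1000 ms / 30  ≈ 33.3 ms of stale position
    250 Hz Vive tracking:       up to 1000 ms / 250 =  4.0 ms of stale position
    low-latency projector:      < 5 ms (claimed)
    ```
    With the Vive and a <5 ms display, tracking plus display latency comes to under 10 ms, leaving most of a 41.7 ms frame for rendering and HDMI transport, which is where the sub-frame estimate comes from.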
  4. Like
    KnightsFan got a reaction from majoraxis in My Journey To Virtual Production   
    ...and today I cleaned up the effect quite a bit. This is way too much fun! The background is a simple terrain with a solid color shader, lit by a single directional light and a solid color sky. With it this far out of focus, it looks shockingly good considering it's just a low poly mesh colored brown!
    https://s3.amazonaws.com/gobuildstuff.com/videos/virtualBGTest2.mp4
    It's a 65" TV behind this one. You can vaguely see my reflection in the screen haha.
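    For anyone curious what "simple" means here, a Unity sketch of those ingredients (colours and angles are placeholders I made up, not the values from the test):
    ```csharp
    using UnityEngine;

    // Attach to the low-poly terrain mesh: flat brown material, one directional
    // light, and a solid-colour sky via the camera clear colour.
    public class FlatBackgroundSetup : MonoBehaviour
    {
        void Start()
        {
            // Solid-colour shader on the terrain (a flat brown Standard material).
            var brown = new Material(Shader.Find("Standard")) { color = new Color(0.45f, 0.33f, 0.20f) };
            GetComponent<Renderer>().material = brown;

            // Single directional light, eyeballed to roughly match the practical light.
            var sun = new GameObject("Sun").AddComponent<Light>();
            sun.type = LightType.Directional;
            sun.transform.rotation = Quaternion.Euler(50f, -30f, 0f);

            // Solid-colour sky: just clear the camera to a flat colour.
            Camera.main.clearFlags = CameraClearFlags.SolidColor;
            Camera.main.backgroundColor = new Color(0.55f, 0.65f, 0.80f);
        }
    }
    ```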
  5. Like
    KnightsFan got a reaction from majoraxis in My Journey To Virtual Production   
    Thanks to this topic, I dug a little further into virtual backgrounds myself. I used this library from GitHub, and made this today. It's tracked using an Oculus controller velcro'd to the top of my camera. "Calibration" was a tape measure and some guesstimation, and the model isn't quite the right scale for the figure. No attempt to match focus, color, or lighting. So not terribly accurate, but imo it went shockingly well for a Sunday evening proof of concept.
    https://s3.amazonaws.com/gobuildstuff.com/videos/virtualBGTest.mp4
  6. Like
    KnightsFan got a reaction from BTM_Pix in My Journey To Virtual Production   
    Thanks to this topic, I dug a little further into virtual backgrounds myself. I used this library from GitHub, and made this today. It's tracked using an Oculus controller velcro'd to the top of my camera. "Calibration" was a tape measure and some guesstimation, and the model isn't quite the right scale for the figure. No attempt to match focus, color, or lighting. So not terribly accurate, but imo it went shockingly well for a Sunday evening proof of concept.
    https://s3.amazonaws.com/gobuildstuff.com/videos/virtualBGTest.mp4
  7. Like
    KnightsFan got a reaction from webrunner5 in My Journey To Virtual Production   
    Thanks to this topic, I dug a little further into virtual backgrounds myself. I used this library from GitHub, and made this today. It's tracked using an Oculus controller velcro'd to the top of my camera. "Calibration" was a tape measure and some guesstimation, and the model isn't quite the right scale for the figure. No attempt to match focus, color, or lighting. So not terribly accurate, but imo it went shockingly well for a Sunday evening proof of concept.
    https://s3.amazonaws.com/gobuildstuff.com/videos/virtualBGTest.mp4
  8. Like
    KnightsFan got a reaction from matthere in My Journey To Virtual Production   
    Last year my friends and I put together a show which we filmed in virtual reality. We tried to make it as much like actual filmmaking as possible, except we were 1000 miles apart. That didn't turn out to be as possible as we'd hoped, due to bugs and limited features. The engine is entirely built from scratch using Unity, so I was making patches every 10 minutes just to get it to work. Most of the 3D assets were made for the show; maybe 10% or so were found free on the web. Definitely one of the more difficult aspects was building a pipeline entirely remotely for scripts, storyboards, assets, and recordings, and then at the very end the relatively simple task of collaborative remote editing with video and audio files.
    If you're interested you can watch it here (mature audiences) https://www.youtube.com/playlist?list=PLocS26VJsm_2dYcuwrN36ZgrOVtx49urd
    I've been toying with the idea of going solo on my own virtual production as well.
  9. Like
    KnightsFan got a reaction from kye in My Journey To Virtual Production   
    It's still easy to spot an object sliding around in the frame, especially if it's supposed to be fixed to the ground. And motion capture usually means capturing the animation of a person, i.e. how their arms, legs, and face move, and we're especially adept at noticing when a fellow human moves incorrectly. So for foreground character animation, the accuracy also has to be really high. I believe that even with high end motion capture systems, an animator will clean up the data afterwards for high fidelity workflows like you would see in blockbusters or AAA video games.
    The repo I posted is a machine learning algorithm to get full body human animation from a video of a person, as opposed to the traditional method of a person wearing a suit with markers tracked by an array of cameras fed through an algorithm someone wrote manually. It has shockingly good results, and that method will only get better with time--as with every other machine learning application! Machine learning is the future of animation imo.
    For motion tracking, Vive is exceptional. I've used a lot of commercial VR headsets, and Vive is top of the pack for tracking. Much better than the inside out version (using builtin cameras instead of external sensors) of even Oculus/Meta's headsets. I don't know what the distance limit is, I've got my base stations about 10 ft apart. For me and my room, the limiting factor for a homemade virtual set is the size of the backdrop, not the tracking area.
  10. Like
    KnightsFan got a reaction from greenscreen in Ursa Mini 4.6k or Pocket 6k?   
    Seems like most people don't visit the sub forums, you might get more responses in the main forum. Without having used either camera, and assuming you're sure it doesn't have anything wrong with it at that price, here's my two cents.
    It would all come down to how controlled a setting you typically shoot in, and how big your crew is. That's a large camera, meaning support equipment also needs to be sturdier--tripod, gimbal, crane, car mounts, etc. It'll go through more batteries and storage. If in a studio, it may not be a problem, but on a 10 hour location shoot that's a lot of extra stuff to bring, compared to a BMMCC. Those 4.6k sensors aren't known for being great in low light, so you'll also perhaps need more/bigger lights.
    So if it already matches your production style, then that's a great deal to jump on. However, if you're one-man-banding it on location, consider how it might affect your workflow.
  11. Like
    KnightsFan got a reaction from majoraxis in My Journey To Virtual Production   
    Last year my friends and I put together a show which we filmed in virtual reality. We tried to make it as much like actual filmmaking as possible, except we were 1000 miles apart. That didn't turn out to be as possible as we'd hoped, due to bugs and limited features. The engine is entirely built from scratch using Unity, so I was making patches every 10 minutes just to get it to work. Most of the 3D assets were made for the show; maybe 10% or so were found free on the web. Definitely one of the more difficult aspects was building a pipeline entirely remotely for scripts, storyboards, assets, and recordings, and then at the very end the relatively simple task of collaborative remote editing with video and audio files.
    If you're interested you can watch it here (mature audiences) https://www.youtube.com/playlist?list=PLocS26VJsm_2dYcuwrN36ZgrOVtx49urd
    I've been toying with the idea of going solo on my own virtual production as well.
  12. Like
    KnightsFan reacted to BTM_Pix in My Journey To Virtual Production   
    Step 2 - Holy Kit
    OK, so we've established that there is no way we can match the financial, physical and personnel resources available to The Mandalorian, but we do need to look at just how much kit we need and how much it will cost to mimic the concept.
    From the front of our eyeball to the end of the background, we are going to need the following :
    1. Camera
    2. Lens
    3. Tracking system for both camera and lens
    4. Video capture device
    5. Computer
    6. Software
    7. Video output device
    8. Screen
    No problem with 1 and 2 obviously, but things start to get more challenging from step 3 onwards.
    In terms of the tracking system, we need to be able to synchronise the position of our physical camera in the real world to the synthetic camera inside the virtual 3D set.
    There are numerous ways to do this but at our level we are looking at re-purposing the Vive Tracker units mounted on top of the camera.

    With the aid of a couple of base station units, this will transmit the physical position of the camera to the computer. Whilst there may be some scope to look at different options, this is a proven solution, so initially at least it will be the one for me to look at getting.
    Price-wise, it is €600 for a tracker and two base stations, so I won't be getting the credit card out for these until I've got the initial setup working. In the initial stages it will be a locked-off camera only !
    For the lens control, there also needs to be synchronisation between the focus and zoom position of the physical lens on the physical camera with that of the virtual lens on the virtual camera.
    Again, the current norm is to use additional Vive trackers attached to follow focus wheels, translating the movements to focus and zoom positions, so that will require an additional two of those at €200 each.
    Of course, these are also definitely in the "to be added at some point" category, as is the camera tracker.
    The other option here is lens encoders, which attach to the lens itself and transmit the physical position of the lens to the computer, which then uses a calibration process to map these to distance or focal length. But these are roughly €600 each so, well, not happening initially either.
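    To make that calibration mapping concrete, here's a minimal sketch (plain C#; the table values are invented): measure a handful of encoder-position/focus-distance pairs by hand, then interpolate between them.
    ```csharp
    // Hypothetical calibration table: (normalised encoder position, focus
    // distance in metres) pairs measured by hand. Real rigs use denser tables
    // and better curve fits than straight lines.
    static class LensMap
    {
        static readonly (float pos, float metres)[] table =
        {
            (0.00f, 0.45f), (0.25f, 1.0f), (0.50f, 2.0f), (0.75f, 5.0f), (1.00f, 100f)
        };

        // Linear interpolation; e.g. FocusDistance(0.6f) returns 3.2 m.
        public static float FocusDistance(float pos)
        {
            if (pos <= table[0].pos) return table[0].metres;
            for (int i = 1; i < table.Length; i++)
            {
                if (pos <= table[i].pos)
                {
                    float t = (pos - table[i - 1].pos) / (table[i].pos - table[i - 1].pos);
                    return table[i - 1].metres + t * (table[i].metres - table[i - 1].metres);
                }
            }
            return table[table.Length - 1].metres;
        }
    }
    ```
    The same mapping works for zoom (encoder position to focal length); the expensive part is the calibration session, not the code.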
    Moving on to video capture, that is something that can range from €10 HDMI capture dongles up to BM's high end cards, but in the first instance, where we are just exploring the concepts, I'll use what I have around.
    The computer is the thing that is giving me a fit of the vapours.
    My current late 2015 MacBook Pro isn't going to cut it in terms of running this stuff full bore, and I'm looking to get one of the new fancy dan M1 Max ones anyway (just not till the mid part of the year), but I'm not sure if the support is there yet for the software.  Again, just to get it going, I am going to see if I can run it with what I have for proof of concept, and then defer the decision until later regarding whether I'm going to take it far enough to hasten the upgrade.
    What I'm most worried about is the support to fully utilise the new MacBooks not being there and having to get a Windows machine.
    Eek !
    At least the software part is sorted as obviously I'll be using Unreal Engine to do everything and with it being free (for this application at least) there is no way of going cheaper !
    The video output and screen aspect are ones that, again, can be deferred until or if such time as I want to become more serious than just tinkering around.
    At the serious end, the output will need an output card such as one of BM's and a display device such as a projector so that live in camera compositing can be done.
    At the conceptual end, the output can just be the computer screen for monitoring and a green screen and save the compositing for later.
    To be honest, even if I can get my addled old brain to advance things to the stage that I want to use it, it's unlikely that the budget would stretch to the full on in camera compositing, so it will be green screen all the way !
    OK, so the next step is to install Unreal Engine and see what I can get going with what I have to hand before putting any money into what might be a dead end/five minute wonder for me.
  13. Like
    KnightsFan reacted to BTM_Pix in My Journey To Virtual Production   
    Step 1 - The Pros And Not Pros
    So, to look at what we want to end up with conceptually, this is the most obvious and well discussed example of VP at the moment.
    Fundamentally, it's about seamlessly blending real foreground people/objects with synthetic backgrounds, and it's a new-ish take on the (very) old Hollywood staple of rear projection, as used with varying degrees of success in the past.
    The difference now is that productions like The Mandalorian are using dynamically generated backgrounds created in real time to synchronise with the camera movement of the real camera that is filming the foreground content.
    Even just for a simple background replacement like in the driving examples from the old films, the new technique has huge advantages in terms of being able to sell the effect through changeable lighting and optical effect simulation on the background etc.
    Beyond that, though, the scope for the generation and re-configuring of complex background sets in real time is pretty amazing for creating series such as The Mandalorian.
    OK, so this is all well and good for a series that has a budget of $8.5m an episode, shot in an aircraft hangar sized building with more OLED screens than a Samsung production run and a small army of people working on it but what's in it for the rest of us ?
    Well, we are now at the point where it scales down and, whilst not exactly free, it's becoming far more achievable at a lower budget level, as we can see in this example.
    And it is that sort of level that I initially want to head towards and possibly spot some opportunities to add some more enhancements of my own in terms of camera integration.
    I'm pretty much going from a standing start with this stuff so there won't be anything exactly revolutionary in this journey as I'll largely be leaning pretty heavily on the findings of the many people who are way further down the track.
    In tomorrow's thrilling instalment, I'll be getting a rough kit list together and then having a long lie down after I've totted up the cost of it and have to re-calibrate my "at a budget level" definition.
  14. Like
    KnightsFan got a reaction from kaylee in My Journey To Virtual Production   
    Last year my friends and I put together a show which we filmed in virtual reality. We tried to make it as much like actual filmmaking as possible, except we were 1000 miles apart. That didn't turn out to be as possible as we'd hoped, due to bugs and limited features. The engine is entirely built from scratch using Unity, so I was making patches every 10 minutes just to get it to work. Most of the 3D assets were made for the show; maybe 10% or so were found free on the web. Definitely one of the more difficult aspects was building a pipeline entirely remotely for scripts, storyboards, assets, and recordings, and then at the very end the relatively simple task of collaborative remote editing with video and audio files.
    If you're interested you can watch it here (mature audiences) https://www.youtube.com/playlist?list=PLocS26VJsm_2dYcuwrN36ZgrOVtx49urd
    I've been toying with the idea of going solo on my own virtual production as well.
  15. Like
    KnightsFan got a reaction from BTM_Pix in My Journey To Virtual Production   
    Last year my friends and I put together a show which we filmed in virtual reality. We tried to make it as much like actual filmmaking as possible, except we were 1000 miles apart. That didn't turn out to be as possible as we'd hoped, due to bugs and limited features. The engine is entirely built from scratch using Unity, so I was making patches every 10 minutes just to get it to work. Most of the 3D assets were made for the show; maybe 10% or so were found free on the web. Definitely one of the more difficult aspects was building a pipeline entirely remotely for scripts, storyboards, assets, and recordings, and then at the very end the relatively simple task of collaborative remote editing with video and audio files.
    If you're interested you can watch it here (mature audiences) https://www.youtube.com/playlist?list=PLocS26VJsm_2dYcuwrN36ZgrOVtx49urd
    I've been toying with the idea of going solo on my own virtual production as well.
  16. Thanks
    KnightsFan reacted to kye in Best Control Surface?   
    Yes, the layout absolutely matters.
    The Beatstep Resolve Edition comes in different resolutions and you have to use the right one.  When you start it up it does an initialisation routine that resets the layout and puts Resolve into Full Screen mode.  I don't find it that much of an imposition but if you'd moved the windows around (in the Colour page) a lot then that might be an issue for you perhaps.  Depending on your screen resolution you might not be able to adjust the Colour page that much though.
    I run Resolve in three ways:
    - on my laptop using only the laptop display, so viewing the clips/timeline in the GUI
    - on my laptop using only my UHD panel, so viewing the clips/timeline in the GUI
    - on my laptop using an external monitor as a "Clean Feed" and only using the GUI for controlling things
    I heartily recommend the latter "Clean Feed" option if you have an external monitor you can use.  It's how you set things up if you have a BM hardware device (which I have just ordered and would recommend) but you can also do it with just a normal dual monitor setup through the menus - I think it's called something like "Clean Feed" - but you just choose which monitor it goes to.  It's a great way of working.
    It's a tricky one as these things are absolutely worthwhile if you're billing your time, but hard to justify if it's only a hobby.  As there isn't one controller that does Editing and Colour it seems like the more controllers you get the more you're forced into the studio system way of working where each "page" in Resolve is done separately, rather than jumping around as I tend to do while I'm exploring and shaping the material.
  17. Like
    KnightsFan got a reaction from newfoundmass in Pro/Con of Buying 1. New; 2. Used; 3. Refurbished   
    I buy used whenever I can: cameras, lenses, microphones, cars, clothes. Mainly to do my part to reduce waste. I sell or give away anything I've upgraded away from, unless it's absolutely unsalvageable. So a nice perk is that often I don't lose much value. In fact, with 2 out of the 3 cameras I've sold, I actually profited! In both cases I bought and sold on eBay, which is what I almost always use.
  18. Like
    KnightsFan got a reaction from mkabi in Tascam is developing a hotshoe XLR adapter to Canon, Fujifilm and Nikon   
    The problem with USB audio for widespread use is that it's limited to short cables. There are workarounds, but it gets complicated once you get to 5+ meters. XLR cables can easily be 10x that length. So XLR is better unless you know that the DAC is always going to be right next to the recorder.
    As far as non-USB digital audio goes, it would be interesting if we saw some uptake of RJ45 Ethernet ports on cameras, along with the necessary standards to carry video and/or audio. Those have (flimsy) locking connectors and longer run distance, plus there's a PoE standard in place. But as of right now, the only universal standard for digital audio in the consumer world is USB. AES would help pro audio, but would still be useless for most consumers.
    @mkabi I think we're heading in the direction of putting the DAC (digitizer) in the mic housing like you are suggesting. Certainly the consumer world is. The Sennheiser XS Lav Mobile and the Rode VideoMic NTG are examples, and then there are a couple of bodypacks like the Zoom F2 or Deity BP-TRX that are moving there for the wireless world. I think putting the DAC in the mic is especially beneficial for consumer products, because it means that the audio company is solely responsible for the audio signal chain.
    However, for the foreseeable future, if there is a wire between the mic and recorder, then analog via XLR to a DAC inside the recorder has no downsides compared to digitizing on the other end. If we're talking wireless it's a different story--it would be better to both digitize and record on the TX side, but that's a legal minefield.
    Also, the reason lavs get away with thin cables and 3.5mm jacks is because they have short runs. You can run unbalanced audio from your shirt to your belt, not 50m across a stage.
  19. Haha
    KnightsFan got a reaction from 92F in Apple is Coming For Y'all: Disruptive Video Production Technologies   
    I think this is great news. The more we can do with generic devices like smartphones that can run 3rd party software, the more creative options open up to us. Some purists are married to using large sensors to get "real" depth of field control, but I say that once it can be simulated to the point of not being able to tell the difference, we'll all be free to use smaller, lighter gear with fewer expensive accessories. Will this camera be indistinguishable from a full frame camera with a 50mm f1.4? Probably not--but it's getting excitingly close!
    I'm not an Apple user as I don't like their closed ecosystem, but good for them for pushing a little farther into pro imaging quality with iPhones.
     
    Yeah since apparently it's physically impossible to make a ProRes encoder that is smaller than a Ninja V.
  20. Like
    KnightsFan got a reaction from billdoubleu in Android phone as 1080 HDMI monitor.   
    In addition to the new Z Cam model, Accsoon announced their own product, which seems better designed physically. Rather than needing to mount it somewhere, it is the phone holder. It has an NPF battery sled, and I'd assume that means it'll charge your phone simultaneously, plus a cold shoe on top. I also see some nice locating pin holes on the bottom.
    Z Cam though has the benefit of controlling compatible cameras with a USB cable. Would be nice to combine them and get the physical design of Accsoon with camera control.

    It would be cool (and not that surprising!) if Accsoon made a version that could receive a wireless signal from their CineEye transmitter in addition to HDMI.

  21. Like
    KnightsFan reacted to billdoubleu in Android phone as 1080 HDMI monitor.   
    Glad to see Z Cam is still chugging away on some useful things.
    https://www.instagram.com/p/CT9I65OrGTY/?utm_medium=copy_link
    https://www.instagram.com/p/CT9JVpSrjBW/?utm_medium=copy_link
  22. Haha
    KnightsFan got a reaction from UncleBobsPhotography in Apple is Coming For Y'all: Disruptive Video Production Technologies   
    I think this is great news. The more we can do with generic devices like smartphones that can run 3rd party software, the more creative options open up to us. Some purists are married to using large sensors to get "real" depth of field control, but I say that once it can be simulated to the point of not being able to tell the difference, we'll all be free to use smaller, lighter gear with fewer expensive accessories. Will this camera be indistinguishable from a full frame camera with a 50mm f1.4? Probably not--but it's getting excitingly close!
    I'm not an Apple user as I don't like their closed ecosystem, but good for them for pushing a little farther into pro imaging quality with iPhones.
     
    Yeah since apparently it's physically impossible to make a ProRes encoder that is smaller than a Ninja V.
  23. Like
    KnightsFan got a reaction from Emanuel in Apple is Coming For Y'all: Disruptive Video Production Technologies   
    I think this is great news. The more we can do with generic devices like smartphones that can run 3rd party software, the more creative options open up to us. Some purists are married to using large sensors to get "real" depth of field control, but I say that once it can be simulated to the point of not being able to tell the difference, we'll all be free to use smaller, lighter gear with fewer expensive accessories. Will this camera be indistinguishable from a full frame camera with a 50mm f1.4? Probably not--but it's getting excitingly close!
    I'm not an Apple user as I don't like their closed ecosystem, but good for them for pushing a little farther into pro imaging quality with iPhones.
     
    Yeah since apparently it's physically impossible to make a ProRes encoder that is smaller than a Ninja V.
  24. Like
    KnightsFan got a reaction from maxmizer in Is anyone else interested in a screen-less (small) external recorder?   
    But what you're saying in this topic is that you'd like a 3rd party to make an HDMI recorder, putting you in their hands for recording format instead of the camera's. Unless you're envisioning the external recorder space getting large enough that there is legitimate competition with a variety of different options. Why not wish for camera companies to allow more flexibility in choosing codecs, and higher quality, instead of adding a whole extra layer of compatibility, accessories, and specs?
    Remember when Panasonic added 10 bit to the G9 years later via firmware? Remember when hackers added compressed raw recording to the 5D3? Or when Blackmagic made a pocket camera that shot ProRes and raw at a fraction of the price of contemporary DSLRs, without fans and heatsinks? My sarcasm in this topic is because, in my opinion, wishing for small external recorders is a roundabout solution that would also encourage the implementation of bad internal video. I'd rather pay $100 for firmware that unlocks pro video formats than $600 plus batteries for a recorder, even if it's the size of a battery grip.
    This is the exact reason I hope for the Octopus cinema camera to turn into a real product, and why I'd like to see cameras running vanilla Android or Linux OSes. Getting on my soapbox here, but the migration away from standardized computers (both hardware and software) to dozens of different devices running proprietary firmware is a harmful trend.
  25. Like
    KnightsFan got a reaction from Video Hummus in Is anyone else interested in a screen-less (small) external recorder?   
    But what you're saying in this topic is that you'd like a 3rd party to make an HDMI recorder, putting you in their hands for recording format instead of the camera's. Unless you're envisioning the external recorder space getting large enough that there is legitimate competition with a variety of different options. Why not wish for camera companies to allow more flexibility in choosing codecs, and higher quality, instead of adding a whole extra layer of compatibility, accessories, and specs?
    Remember when Panasonic added 10 bit to the G9 years later via firmware? Remember when hackers added compressed raw recording to the 5D3? Or when Blackmagic made a pocket camera that shot ProRes and raw at a fraction of the price of contemporary DSLRs, without fans and heatsinks? My sarcasm in this topic is because, in my opinion, wishing for small external recorders is a roundabout solution that would also encourage the implementation of bad internal video. I'd rather pay $100 for firmware that unlocks pro video formats than $600 plus batteries for a recorder, even if it's the size of a battery grip.
    This is the exact reason I hope for the Octopus cinema camera to turn into a real product, and why I'd like to see cameras running vanilla Android or Linux OSes. Getting on my soapbox here, but the migration away from standardized computers (both hardware and software) to dozens of different devices running proprietary firmware is a harmful trend.