majoraxis

Reputation Activity

  1. Like
    majoraxis reacted to Gianluca in Unreal and python background remover   
    In this case it was more complicated to combine the two videos ... You have to orient the direction of travel perfectly, otherwise it looks like you are walking too fast or too slowly ... Even the aspect ratio was not easy to get right ... What do you think?
  2. Like
    majoraxis reacted to kye in Interesting affordable stabilization device   
    The Newsshooter article says that normal retail price will be $3,500 or so, which seems reasonable.
    The price of mechanical and electronic components has plummeted over the last few decades thanks to centralisation in China.  The factor that makes or breaks products these days is the design of the physical unit and the software that controls it.
    The proof is in the pudding, as they say, so I would reserve judgement until there are units in the wild, and of course, all PR materials that are real (it's common for even things like car commercials to be 3D renders now) will have been shot under the absolute best possible circumstances.  Anyone who has ever used a GoPro knows that you'd have to be luckier than a lotto winner to get shots like they put in their promos.
  3. Like
    majoraxis reacted to bjohn in Interesting affordable stabilization device   
    There was a pre-review a while back on Newsshooter; see the comments as well: https://www.newsshooter.com/2022/01/16/snoppa-rover-camera-dolly/
  4. Like
  5. Like
    majoraxis reacted to Gianluca in Unreal and python background remover   
    Yes, I was very surprised by the last part... I've done some other tests with this sequence in other environments, and the best part for me is that I can change the location by importing another world and my subject will be aligned perfectly with the floor... This will greatly speed up my work when I combine multiple scenes..
    Unfortunately I have not found a way to make the file from the mobile phone perfectly match the one from the mirrorless; the problem is probably that the mobile phone records at a variable frame rate while the a6300 records at a fixed frame rate...
    Even comparing the two files in Resolve, without doing anything, you can see that they have different durations.. Maybe I'm doing something wrong, but at the moment I have to settle for the mobile phone file, which is unfortunately really bad, also because the application records in 1080p at most...
    Now I'm trying to upscale the video with python to see if it improves a bit, but even if it doesn't, I have in mind to make a little movie where both the locations and my son will have a cartoon effect, so that definition becomes a secondary problem ...
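    One thing that might help with the variable-frame-rate mismatch (my own suggestion, not something from the original post) is re-encoding the phone clip to a constant frame rate before syncing it against the mirrorless footage, for example by calling ffmpeg from Python. The file names and target frame rate below are placeholders:

```python
# Sketch: re-encode a variable-frame-rate phone clip to a constant frame rate
# so it can be lined up against constant-frame-rate mirrorless footage.
# Assumes ffmpeg is installed and on the PATH; paths and fps are examples only.
import subprocess

def to_constant_frame_rate(src: str, dst: str, fps: int = 25) -> None:
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,
            "-vsync", "cfr",   # force constant-frame-rate output
            "-r", str(fps),    # target frame rate
            "-c:a", "copy",    # keep the audio untouched for syncing
            dst,
        ],
        check=True,
    )

to_constant_frame_rate("phone_clip.mp4", "phone_clip_cfr.mp4", fps=25)
```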
     
  6. Like
    majoraxis reacted to Gianluca in Unreal and python background remover   
    Yesterday, finally, after an exhausting day arguing with Unreal Engine, I managed to get a video with credible tracking, and the merit is all down to a free smartphone application called blendartrack.
    I have never been able to use Blender's camera tracking; moreover, if there are moving subjects you have to sit there deleting the points by hand wherever the subject overlaps, then you have to align the planes etc etc.... With blendartrack all of this is really a walk in the park: from when I open Blender to when I export for Unreal it takes no more than 15 seconds, truly a liberation... In this example the tracking, in my opinion, is not even at its maximum potential because, in the rush with my son who wanted to do something else, I forgot to add markers, but it is already very good...
    What surprised me then is that the focal length used by the mobile phone is also exported, and that the Unreal world (or at least the one used in this example) responds perfectly to the movements of the real world, without having to scale anything. The only big drawback at the moment is that the video taken with the mobile phone has a very low quality compared to what I can get with the a6300 and its dedicated optics.. I have to figure out how to track with the mobile phone file and then use a file shot with the mirrorless.
     
  7. Like
    majoraxis reacted to Gianluca in Unreal and python background remover   
    Yes, it's Robust Video Matting..
    In this video the tracking is bad, but that's my fault: the fps of the two videos don't match...
    I'm using Virtual Plugin, an Unreal Engine app that you install on a smartphone...
     
    A projected background is better, but you NEED a studio, a very large studio, with lights etc etc..
     
    Here I'm in my garden.... Now I want to use Blender's camera tracking for videos where an exact match is needed..
     
  8. Like
    majoraxis reacted to KnightsFan in Unreal and python background remover   
    The matte is pretty good! Is it this repo you are using? You mentioned RVM in the other topic. https://github.com/PeterL1n/RobustVideoMatting
    Tracking of course needs some work. How are you currently tracking your camera? Is this all done in real time, or are you compositing after the fact? I assume that you are compositing later since you mention syncing tracks by audio. If I were you, I would ditch the crane if you're over the weight limit, just get some wide camera handles and make slow deliberate movements, and mount some proper tracking devices on top instead of a phone if that's what you're using now.
    Of course the downside of this approach compared to the projected background we're discussing in the other topic is that lighting is easier to merge with a projected background, and with this approach you also need to synchronize a LOT more settings between your virtual and real camera. With a projected background you only need to worry about focus; with this approach you need to match exposure, focus, zoom, noise pattern, color response, and on and on. It's all work that can be done, but it makes the whole process very tedious to me.
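    For reference, if it is the RobustVideoMatting repo linked above, the matting can be driven from Python roughly like this. This is a sketch based on the project's published PyTorch Hub interface; the file names and settings are placeholders, not anything from Gianluca's actual workflow:

```python
# Sketch: run RobustVideoMatting on a clip via PyTorch Hub.
# Assumes torch/torchvision are installed; "input.mp4" etc. are placeholders.
import torch

# Load the pretrained matting model and the video conversion helper.
model = torch.hub.load("PeterL1n/RobustVideoMatting", "mobilenetv3")
convert_video = torch.hub.load("PeterL1n/RobustVideoMatting", "converter")

convert_video(
    model,
    input_source="input.mp4",            # clip shot against a reasonably plain background
    output_type="video",
    output_composition="composite.mp4",  # subject composited over green, ready for keying
    output_alpha="alpha.mp4",            # the matte itself
    seq_chunk=12,                        # frames per batch; lower this if short on VRAM
)
```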
  9. Like
    majoraxis reacted to Gianluca in Unreal and python background remover   
    Hello everyone ... I preferred to open a new topic rather than continue to abuse "My Journey To Virtual Production"
    With the carnival I was able to do some more tests with Unreal, and this time I recorded my subject against a yellow wall ... As you can see, in controlled working conditions the results can be really good ... Obviously there is still a lot of room for improvement: for example, I have to synchronize the two video tracks by recording a common audio track, and I have to balance the gimbal better (I have a Crane-M and with the mobile phone mounted I exceed the weight it can support, so it vibrates a lot), but apart from that, if I had an actress who did something sensible other than shooting all the time 🙂 I might at this point think I could shoot something decent ... What do you think? Opinions, advice?
     
  10. Thanks
    majoraxis reacted to BTM_Pix in My Journey To Virtual Production   
    Yes, consider it an MMX of sorts that takes input from the AFX and then talks to the lens of the virtual camera in the same way the MMX talks to FIZ motors.
    And it speaks to both simultaneously of course.
    The additional development is in the integration of a positioning element.
    More (relatively) soon.
  11. Like
    majoraxis reacted to BTM_Pix in My Journey To Virtual Production   
    The role of it in this scenario is just as a cheap and dirty way for me to generate basic input so an LPD-3806 would be overkill 😉 
    The real time dynamic information is coming from the camera system.
    Essentially, this is sending the raw data and the Blueprint itself is doing the interpretation, so it's conceptually the same. 
    Because I am generating the MIDI data directly rather than taking it from a control surface and re-interpreting it, I'm able to use the standard MIDI data structures in different ways to work around the limitations of 7-bit (0-127) values, because the Blueprint will know how to interpret them.
    So if I want to change the aperture to f5.6, for example, I can split that across a basic Note On message on a specific channel, sending it on channel 7 with a note value of 5 and a velocity value of 6.
    The Blueprint knows that anything it receives on MIDI channel 7 will be an aperture value, takes the note value as the integer part and the velocity value as the fractional part, and reconstitutes it as 5.6 for the virtual camera.
    Using 14-bit MIDI messages such as Pitch Bend or Breath Control also offers more scope for workarounds for negative values: using, for example, the midway value of the range (8191) as a "0" point, interpreting positive and negative values relative to that, and using division if necessary to create fractional values (e.g. -819.1 to +819.1 or -81.91 to +81.91 etc).
    The reason to take the MIDI path was to use BLE MIDI to keep everything wireless and enable it to sit within the existing BLE architecture that I have created for camera and lens/lens motor control so it made for much faster integration.
    Having said all that......
    As the saying goes, there is no point having a mind if you can't change it and things do change with this over the subsequent steps.
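    To make the scheme concrete, here is a rough Python sketch of the kind of encode/decode described above. This is my own illustration, not BTM_Pix's firmware or Blueprint; the channel and scaling choices just follow the f5.6-on-channel-7 example:

```python
# Illustration of packing a fractional aperture value into a Note On message
# and unpacking it again, plus the 14-bit signed-value trick, as described above.

APERTURE_CHANNEL = 7  # channel reserved for aperture values in the example

def encode_aperture(f_stop: float) -> tuple[int, int, int]:
    """Split e.g. 5.6 into note=5 (integer part) and velocity=6 (first decimal)."""
    units = int(f_stop)
    fraction = round((f_stop - units) * 10)
    return APERTURE_CHANNEL, units, fraction

def decode_aperture(channel: int, note: int, velocity: int) -> float:
    """Reconstitute the aperture on the receiving (Blueprint) side."""
    assert channel == APERTURE_CHANNEL
    return note + velocity / 10

# 14-bit messages (e.g. Pitch Bend) can carry signed values by treating the
# midpoint of the 0-16383 range as zero.
MIDPOINT = 8191

def encode_signed(value: float, scale: float = 10.0) -> int:
    """Map e.g. -819.1..+819.1 onto 0..16383 (scale=10 gives one decimal place)."""
    return MIDPOINT + int(round(value * scale))

def decode_signed(raw: int, scale: float = 10.0) -> float:
    return (raw - MIDPOINT) / scale

print(encode_aperture(5.6))                                         # (7, 5, 6)
print(decode_aperture(7, 5, 6))                                     # 5.6
print(decode_signed(encode_signed(-81.91, scale=100), scale=100))   # -81.91
```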
  12. Like
    majoraxis reacted to BTM_Pix in My Journey To Virtual Production   
    The video is purely a prototyping illustration of the transmission of parameter changes to the virtual camera from an external source, rather than a demonstration of the performance, hence the coarse parameter changes from the encoder.
    So it is purely a communication hub that sits within a far more comprehensive system, one that is taking dynamic input from a positional system but also direct input from the physical camera itself in terms of focus position, focal length, aperture, white balance etc.
    As a tracking-only option the Virtual Plugin is very good though, but yes, my goal is not currently the actual end use of the virtual camera.
    A spoiler alert though, is that what you see here is a description of where I was quite a few weeks ago 😉 
  13. Like
    majoraxis reacted to KnightsFan in My Journey To Virtual Production   
    I have a control surface I made for various software. I have a couple of rotary encoders just like the one you have, which I use for adjusting selections, but I got a higher resolution one (LPD-3806) for finer controls, like rotating objects or controlling automation curves. Just like you said, having infinite scrolling is imperative for flexible control.
    I recommend still passing raw data from the dev board to the PC, and using desktop software to interpret the raw data. It's much faster to iterate, and you have much more CPU power and memory available. I wrote an app that receives the raw data from my control surface over USB, then transmits messages out to the controlled software using OSC. I like OSC better than MIDI because you aren't limited to low-resolution 7-bit messages; you can send float or even string values. Plus OSC is much more explicit about port numbers, at least in the implementations I've used. But having desktop software interpreting everything was a game changer for me compared to sending MIDI directly from the Arduino.
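    As an illustration of that kind of setup (not KnightsFan's actual app; the serial port, message format, OSC addresses and port number are all made up), a desktop-side relay using the python-osc package could look something like this:

```python
# Sketch: read raw values from a serial-connected control surface on the desktop,
# interpret them there, and forward high-resolution values to the target app over OSC.
import serial                                 # pyserial
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 9000)      # host/port of the controlled software
board = serial.Serial("/dev/ttyUSB0", 115200) # dev board streaming raw encoder data

while True:
    line = board.readline().decode().strip()  # e.g. "enc0:+3" from the dev board
    name, delta = line.split(":")
    if name == "enc0":
        # Interpret on the desktop: scale the raw encoder delta to a float value
        # and send it as an OSC message instead of a 7-bit MIDI value.
        osc.send_message("/camera/focus_distance", float(delta) * 0.01)
```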
  14. Like
    majoraxis reacted to Gianluca in My Journey To Virtual Production   
    I know very well that you are a very good programmer and surely you are trying to make your own system to control the Unreal camera, but after seeing the example in the video, I wonder if it would not be much more convenient to use an already made app, such as Virtual Plugin, which I used in my tests ... You can control different parameters such as movement and focus, you can move around the world with virtual joysticks as well as by walking, you can lock the axes at will etc etc .. Give it a look; there is also a demo that can be requested via email ...
    https://www.unrealengine.com/marketplace/en-US/product/virtual-plugin
     
  15. Like
    majoraxis reacted to BTM_Pix in My Journey To Virtual Production   
    So....getting into the time machine and going back to where the journey was a few weeks ago.
    To recap, my interest in these initial steps is in exploring methods of externally controlling the virtual camera inside Unreal Engine rather than getting live video into it to place in the virtual world.
    To me, that is the bigger challenge as, although far from trivial, the live video and keying aspect is a known entity to me and the virtual camera control is the glue that will tie it together.
    For that, we need to get the data from the outside real world in terms of the camera position, orientation and image settings in to Unreal Engine and then do the processing to act upon that and translate it to the virtual camera.
    To achieve this, Unreal Engine offers Plug Ins to get the data in and Blueprints to process it.
    The Plug Ins on offer for different types of data for Unreal Engine are numerous and range from those covering positional data from VR trackers to various protocols to handle lens position encoders, translators for PTZ camera systems and more generic protocols such as DMX and MIDI.
    Although it is obviously most closely associated with music, it is this final one that I decided to use.
    The first commercially available MIDI synth was the Sequential Prophet 600, which I bought (and still have!) a couple of years after it was released in 1982, and in the intervening four decades (yikes!) I have used MIDI for numerous projects outside of just music, so it's not only a protocol that I'm very familiar with but one that also offers a couple of advantages for this experimentation.
    The first is that, due to its age, it is a simple, well documented protocol to work with, and the second is that, due to its ubiquity, there are tons of cheaply available control surfaces that greatly help when you are prodding about.
    And I also happen to have quite a few lying around including these two, the Novation LaunchControl and the Behringer X-Touch Mini.


    The process here, then, is to connect either of these control surfaces to the computer running Unreal Engine and use the MIDI plug in to link their various rotary controls and/or switches to the operation of the virtual camera.
    Which brings us now to using Blueprints.
    In very simple terms, Blueprints allow the linking of elements within the virtual world to events both inside it and external.
    So, for example, I could create a Blueprint so that when I press the letter 'L' on the keyboard it toggles a light on and off in a scene, or when I press a key from 1 to 9 it varies its intensity etc.
    For my purpose, I will be creating a Blueprint that effectively says "when you receive a value from pot number 6 of the LaunchControl then change the focal length on the virtual camera to match that value" and "when you receive a value from pot number 7 of the LaunchControl then change the focus distance on the virtual camera to match that value" and so on and so on.
    By defining those actions for each element of the virtual camera, we will then be able to operate all of its functions externally from the control surface in an intuitive manner.
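    Outside Unreal, the same kind of "pot N drives parameter X" mapping could be prototyped in plain Python with the mido library. This is my own sketch rather than the Blueprint itself, and the controller numbers are just the example pots mentioned above:

```python
# Sketch of the "pot N -> camera parameter" mapping described above, done with
# mido instead of a Blueprint. Controller numbers and ranges are illustrative.
import mido

# Which control-change number drives which virtual camera parameter.
CC_MAP = {
    6: "focal_length",
    7: "focus_distance",
}

camera = {"focal_length": 35.0, "focus_distance": 1.0}

with mido.open_input() as port:           # first available MIDI input
    for msg in port:
        if msg.type == "control_change" and msg.control in CC_MAP:
            param = CC_MAP[msg.control]
            camera[param] = msg.value     # raw 0-127 for now; scaling comes later
            print(param, "->", camera[param])
```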
    Blueprints are created using a visual scripting language comprising nodes that are connected together; it is accessible but contains enough options, and full access to the internals, to create deep functionality if required.
    This is the overview of the Blueprint that I created for our camera control.

    Essentially, what is happening here is that I have created a number of functions within the main Blueprint which are triggered by different events.
    The first function that is triggered on startup is this Find All MIDI Devices function, which finds our MIDI controller and is shown in expanded form here.

    This function cycles through the attached MIDI controllers and lists them on the screen and then steps on to find our specific device that we have defined in another function.
    Once the MIDI device is found, the main Blueprint then processes incoming MIDI messages and conditionally distributes them to the various different functions that I have created for processing them and controlling the virtual camera such as this one for controlling the roll position of the camera.

    When the Blueprint is running, the values of the parameters of the virtual camera are then controlled by the MIDI controller in real time.
    This image shows the viewport of the virtual camera in the editor with its values changed by the MIDI controller.

    So far so good in theory but the MIDI protocol does present us with some challenges but also some opportunities.
    The first challenge is that the value for most "regular" messages, such as Note On, Control Change and so on, that you would use as a vehicle to transmit data only ranges from 0 to 127 and doesn't support fractional or negative numbers.
    If you look at the pots on the LaunchControl, you will see that they are regular pots with finite travel so if you rotate it all the way to the left it will output 0 and will progressively output a higher value until it reaches 127 when it is rotated all the way to the right.
    As many of the parameters that we will be controlling have much greater ranges (such as focus position) or have fractional values (such as f stop) then another solution is required.
    The pots on the X-Touch Mini, on the other hand, are digital encoders, so they have infinite travel: they only report relative movement rather than absolute values like the pots on the LaunchControl do.
    This enables them to take advantage of being mapped to MIDI events such as Pitch Bend, which can have 16,384 values.
    If we use the X-Touch Mini, then, we can create the Blueprint in such a way that it increments or decrements a value by a certain amount rather than setting absolute values, which takes care of parameters with a much greater range than 0 to 127, those with negative values and those with fractional values.
    Essentially we are using the encoders or switches on the X-Touch Mini as over-engineered plus and minus buttons, with the extent of each increment/decrement set inside the Blueprint.
    Workable, but not ideal, and it also makes it trickier to determine a step size for making large stepped changes (ISO 100 to 200 to 400 to 640 is only a few clicks on your camera, but a lot more if you have to go in increments of, say, 20), and of course every focus change would be a looooongg pull.
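    For the value-range problem, the conversion inside the Blueprint amounts to something like the following. This is my own sketch with example ranges, not the actual implementation:

```python
# Two ways of turning controller input into a camera parameter, as discussed above.

def from_absolute(cc_value: int, lo: float, hi: float) -> float:
    """Map a finite pot's 0-127 value onto an arbitrary parameter range."""
    return lo + (cc_value / 127.0) * (hi - lo)

def from_relative(current: float, delta: int, step: float,
                  lo: float, hi: float) -> float:
    """Apply an encoder's +/- delta with a step size set inside the Blueprint."""
    return min(hi, max(lo, current + delta * step))

# Absolute pot: CC value 64 mapped onto a 14-135mm zoom range.
print(from_absolute(64, 14.0, 135.0))               # ~75mm

# Relative encoder: three detents clockwise at 0.5mm per detent.
focal = 50.0
focal = from_relative(focal, +3, 0.5, 14.0, 135.0)
print(focal)                                        # 51.5
```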
    There is also the aspect that, whilst perfectly fine for initial development, these two MIDI controllers are not only wired but also pretty big.
    As our longer term goal is for something to integrate with a camera, the requirement would quickly be for something far more compact, self powered and wireless.
    So, the solution I came up with was to lash together a development board and a rotary encoder and write some firmware that would make it a much smaller MIDI controller that would transmit to Unreal Engine over wireless Bluetooth MIDI.

    Not pretty but functional enough.
    To change through the different parameters to control, you just press the encoder in and it then changes to control that one.
    If that seems like less than real-time camera control because you have to switch, then you are right, it is, but this form is not its end goal, so the encoder is mainly for debugging and testing.
    The way such a module will fit in is that a camera will talk to it and it will then process that data for onward transmission to Unreal Engine, so it's just an interface platform for other prototypes.
    Because, in concert with the Blueprint, we now control both the format of the data being transmitted and how it is then interpreted, we have all the options we need to encode and decode it so that we can fit what we need within MIDI's constraints and easily handle the fractional numbers, negative numbers and large numbers that we will need.
    It also offers the capability of different degrees of incremental control as well as direct.
    But that's for another day.
    Meanwhile, here is a quick video of the output of it controlling the virtual camera in coarse incremental mode.
     
  16. Like
    majoraxis reacted to Gianluca in My Journey To Virtual Production   
    You probably want something like this.. 
     
     
  17. Like
    majoraxis got a reaction from BTM_Pix in My Journey To Virtual Production   
    @Gianluca - Fantastic!
    I am wondering if it would be possible to film a virtual background behind the actor, say a 10' x 5' short-throw projected image, and then extend the virtual background in sync with the computer-generated projected background being recorded behind the actor.
    Kind of like in the screen grab from your video that reveals grass, a house, a bush and a trampoline. If the physical background in this shot was the projected image behind the actor, and the Minecraft extended background seamlessly blended with the small section of projected background being recorded along with the actor, that would be super interesting to me, as I am putting together a projected background.
    I am hoping to combine the best of both worlds and I am wondering where that line is.  Maybe there's still some benefit to shooting an actor with a cine lens on a projected background for aesthetic/artistic reasons? Maybe not?
    Thanks!
    (I hope it's OK that I took a screen grab of your video to illustrate my point. If not, please ask Andrew to remove it. That's of course fine with me.)

  18. Like
    majoraxis reacted to Gianluca in My Journey To Virtual Production   
    While I am learning to import worlds into Unreal, learning to use the sequencer etc etc, I am also doing some tests, which I would call extreme, to understand how far I can go ...
    I would call this test a failure, but I have already seen that with fairly homogeneous backgrounds, with the subject close enough and in focus (here I used a manual focus lens), and if I don't have to frame the feet (in those cases I will use the tripod), it is possible to catapult the subject into the virtual world quite easily and in a realistic way ..
    The secret is to shoot many small scenes, with the values fed to the script optimized for each particular scene ..
    Next time I'll try to post something better ..
     
     
  19. Like
    majoraxis reacted to Gianluca in My Journey To Virtual Production   
    Sorry if I often post my tests, but honestly I've never seen anything like it, and for a month I even paid 15 euros for RunwayML, which in comparison is unwatchable ..
    We are probably not yet at professional levels, but by retouching the mask a little in Fusion, in my opinion we are almost there ... I am really amazed by what you can do...
     
     
  20. Like
    majoraxis reacted to Lovund in Music video shot on Kowa anamorphic   
    Hi,
    Just wanted to share my latest work which I produced and directed; a music video for Kygo ft. DNCE (Joe Jonas). It’s shot on Arri Mini + Arri Mini LF with Kowa anamorphic glass. I’ve been on this forum since I bought my (now retired) GH4 8 years ago.
    Thought some of you might find it interesting ☺️
     
     
  21. Thanks
    majoraxis reacted to BTM_Pix in My Journey To Virtual Production   
    My updates to this thread are way behind real time so, at the risk of ruining future episodes, let's just say that we passed the "viable" stage a few weeks ago 😉
     
  22. Like
    majoraxis reacted to Gianluca in My Journey To Virtual Production   
    From the same developers of the python software that lets you get a green screen (and a matte) from a tripod shot plus a background plate without the subject, I found this other one: it's called RVM, and it allows you to do the same thing even with moving shots, starting directly with the subject in the frame .. For static shots the former is still much superior, but this one has its advantages when used in the best conditions. With a well-lit subject and a solid background, it can replace a green screen
     
    Or with a green or blue background you can get a perfect subject even if she barely stands out from the background, or even if the clothes she is wearing contain green or blue
    Now I am really ready to shoot small scenes in Unreal Engine, and I can also move slightly within the world itself thanks to the Virtual Plugin and these python tools
    The problem is that the actor has no intention of collaborating ..
  23. Like
    majoraxis reacted to BTM_Pix in My Journey To Virtual Production   
    In case you were wondering where I'm up to...
    Currently at the integration of some LIDAR, some Bluetooth, some WiFi, some Serial, some 9 axis IMU, some camera, lens and imaging control, some electrical tape and more than some coffee stage of proceedings.

    It's weeks ahead of the last update so I'll do a proper update at some point about the twists and turns of the journey up to this point.
     
  24. Like
    majoraxis reacted to jgharding in My Journey To Virtual Production   
    I'm at intermediate level with Unreal in general, and though I've not used VP templates yet, I have made motion graphics pieces and developed small games (including entirely custom player controllers built from the ground up etc etc), so I'm fascinated by VP as well.

    As far as I understand it, it's something where cost spirals very quickly as you scale up to multiple screens. With one screen, let's say you buy a 90-inch 4K OLED or something, you can get away with a consumer GPU.

    Once you want to sync walls of video screens you'll need to use a Quadro Sync and have colossal amounts of VRAM available. https://www.nvidia.com/en-gb/design-visualization/solutions/quadro-sync/

    Like most Unreal things it's a WIP, and you benefit if you know someone who is good with C++ and Unreal, which is hard to find. For example, the Movie Render Queue in 4.27 is much improved compared to previous versions, but it still has some really hacky and opaque aspects, such as the need to enter arbitrary CVARs to make certain things look right, which you just have to go and find out; it doesn't hold your hand. It's no Resolve...

    In general that's the hardest part. I've been on the Unreal journey for 3 years and it has never gotten easier, mainly because there is always something new to learn and stuff just keeps changing. It's very rewarding though, good luck and please keep us updated!
  25. Like
    majoraxis reacted to Gianluca in My Journey To Virtual Production   
    In this example I used a python script that lets you get perfectly outlined images without any green screen; the camera must be placed on a tripod, and then just a background image without the subject is enough and the script will do everything else ... not 100% reliable, but when it works it's really perfect ...
    I then used the "Composure" plugin in Unreal to try to put the real and the virtual into the same perspective ...
     
    Now I'm experimenting with special effects to be inserted directly into Unreal ..
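    The general idea behind that kind of clean-plate script can be sketched with OpenCV. This is a generic illustration of difference matting, not the specific script used above, and the file names are placeholders:

```python
# Generic illustration of difference matting from a locked-off (tripod) shot:
# compare each frame against a clean background plate and keep what differs.
import cv2
import numpy as np

background = cv2.imread("clean_plate.jpg")        # frame without the subject
cap = cv2.VideoCapture("shot_with_subject.mp4")   # must match the plate's framing/size

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Per-pixel difference between the current frame and the clean plate.
    diff = cv2.absdiff(frame, background)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    # Threshold the difference into a rough matte, then clean it up a little.
    _, matte = cv2.threshold(gray, 25, 255, cv2.THRESH_BINARY)
    matte = cv2.morphologyEx(matte, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    subject = cv2.bitwise_and(frame, frame, mask=matte)
    cv2.imshow("matte", matte)
    if cv2.waitKey(1) == 27:                      # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```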