
My Journey To Virtual Production


BTM_Pix


From the same developers of the Python software that lets you pull a green screen (and a matte) from a tripod shot of the background without the subject, I found another tool called RVM. It does the same thing even with moving shots, and it can start with the subject already in the frame. For static shots the former is still much superior, but this one has its advantages when used in the right conditions. With a well-lit subject and a solid background, it can replace a green screen.
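If you want to try it, this is roughly how RVM is run, going by the project's published PyTorch Hub interface (the file names are just examples and argument names may differ between versions, so treat it as a sketch):

```python
import torch

# Load the pretrained matting model and the video converter helper via torch.hub.
model = torch.hub.load("PeterL1n/RobustVideoMatting", "mobilenetv3")  # or "resnet50"
convert_video = torch.hub.load("PeterL1n/RobustVideoMatting", "converter")

convert_video(
    model,
    input_source="clip.mp4",        # the original footage, no green screen needed
    output_composition="com.mp4",   # subject composited over a solid colour
    output_alpha="pha.mp4",         # the matte
    downsample_ratio=None,          # None lets it choose; lower it for big frames
    seq_chunk=12,                   # frames processed per batch
)
```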

 

Or, with a green or blue background, you can get a clean subject even if they barely stand out from the background, or even if the clothes they are wearing contain green or blue.

Now I am really ready to shoot small scenes in Unreal Engine, and I can also move slightly within the world itself thanks to the Virtual Plugin and these Python tools.

The problem is that the actor has no intention of cooperating...




This is fantastic stuff! Very impressive quality with less production overhead. @Gianluca thanks for continuing to share your virtual production journey with us!

@BTM_Pix is the concept of taking real-time focus motor and zoom motor position data from the AFX and feeding it to the virtual camera, so that the background responds appropriately, viable?

Thanks!


On 2/23/2022 at 10:13 PM, majoraxis said:

 

@BTM_Pix is the concept of taking real-time focus motor and zoom motor position data from the AFX and feeding it to the virtual camera, so that the background responds appropriately, viable?

My updates to this thread are way behind real time so, at the risk of ruining future episodes, let's just say that we passed the "viable" stage a few weeks ago 😉

 


Sorry if I post my tests so often, but honestly I've never seen anything like it, and for a month I also paid 15 euros for RunwayML, which in comparison is unwatchable...

We are probably not yet at professional levels, but by retouching the mask a little in Fusion, in my opinion we are almost there... I am really amazed by what you can do...

 

 


While I am learning to import worlds into Unreal, learning to use the Sequencer, and so on, I am also doing some tests, which I would call extreme, to understand how far I can go...

I would call this test a failure, but I have already seen that with fairly homogeneous backgrounds, with the subject close enough and in focus (here I used a manual focus lens), and as long as I do not have to frame the feet (in those cases I will use the tripod), it is possible to drop the subject into the virtual world quite easily and in a realistic way...

The secret is to shoot many small scenes, with the values fed to the script optimized for each particular scene...
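To give an idea, it's like keeping a small table of settings per clip and feeding them to the matting script (the names and values here are invented, just to show the idea):

```python
# Hypothetical per-clip settings for the matting script; each short scene gets
# its own tuned values instead of one global setup.
SCENE_SETTINGS = {
    "garden_wide":  {"downsample_ratio": 0.4, "seq_chunk": 8},
    "garden_close": {"downsample_ratio": 0.6, "seq_chunk": 12},
}

def run_scene(convert_video, model, name):
    # convert_video and model come from the RVM example earlier in the thread.
    convert_video(
        model,
        input_source=f"{name}.mp4",
        output_alpha=f"{name}_matte.mp4",
        **SCENE_SETTINGS[name],
    )
```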

Next time I'll try to post something better made...

 

 


@Gianluca - Fantastic!

I am wondering if it would be possible to film a virtual background behind the actor, say a 10' x 5' short-throw projected image, and then extend the virtual background in sync with the computer-generated projected background being recorded behind the actor.

Kind of like in the screen grab from your video that reveals grass, a house, a bush and a trampoline. If the physical background in that shot were the projected image behind the actor, and the Minecraft extended background seamlessly blended with the small section of projected background being recorded along with the actor, that would be super interesting to me, as I am putting together a projected background.

I am hoping to combine the best of both worlds and I am wondering where that line is. Maybe there are still some benefits to shooting an actor with a cine lens on a projected background for aesthetic/artistic reasons? Maybe not?

Thanks!

(I hope it's OK that I took a screen grab of your video to illustrate my point. If not, please ask Andrew to remove it. That's of course fine with me.)

[Screenshot from Gianluca's video]


Absolutely no problem with the screenshot, but I didn't understand how you could move around a subject and at the same time always have the projected image behind them... It would take a huge set like on The Mandalorian, and I think the person who opened this topic is trying to do something similar... I, on the other hand, having no money, am trying to extract the subject as if they were in front of a green screen even when they are not, and the smartphone app connected to Unreal Engine then does the rest... For scenes where the subject's feet and movement within the frame are required, you can use the Blender camera, but so far I've never managed to get anything good out of it...



So....getting into the time machine and going back to where the journey was a few weeks ago.

To recap, my interest in these initial steps is in exploring methods of externally controlling the virtual camera inside Unreal Engine rather than getting live video into it to place in the virtual world.

To me, that is the bigger challenge as, although far from trivial, the live video and keying aspect is a known entity to me and the virtual camera control is the glue that will tie it together.

For that, we need to get the data from the outside real world in terms of the camera position, orientation and image settings in to Unreal Engine and then do the processing to act upon that and translate it to the virtual camera.

To achieve this, Unreal Engine offers Plug Ins to get the data in and Blueprints to process it.

The Plug Ins on offer for different types of data for Unreal Engine are numerous and range from those covering positional data from VR trackers to various protocols to handle lens position encoders, translators for PTZ camera systems and more generic protocols such as DMX and MIDI.

Although it is obviously most closely associated with music, it is this final one that I decided to use.

The first commercially available MIDI synth was the Sequential Prophet-600, which I bought (and still have!) a couple of years after it was released in 1982. In the intervening four decades (yikes!) I have used MIDI for numerous projects outside of just music, so it's not only a protocol that I'm very familiar with but one that also offers a couple of advantages for this experimentation.

The first is that, due to its age, it is a simple, well-documented protocol to work with, and the second is that, due to its ubiquity, there are tons of cheaply available control surfaces that greatly help when you are prodding about.

And I also happen to have quite a few lying around including these two, the Novation LaunchControl and the Behringer X-Touch Mini.

[Image: Novation LaunchControl]

[Image: Behringer X-Touch Mini]

The process here, then, is to connect either of these control surfaces to the computer running Unreal Engine and use the MIDI plug in to link their various rotary controls and/or switches to the operation of the virtual camera.

Which brings us now to using Blueprints.

In very simple terms, Blueprints allow the linking of elements within the virtual world to events both inside it and external.

So, for example, I could create a Blueprint that toggles a light on and off in a scene when I press the letter 'L' on the keyboard, or varies its intensity when I press a key from 1 to 9, etc.

For my purpose, I will be creating a Blueprint that effectively says "when you receive a value from pot number 6 of the LaunchControl then change the focal length on the virtual camera to match that value" and "when you receive a value from pot number 7 of the LaunchControl then change the focus distance on the virtual camera to match that value" and so on and so on.

By defining those actions for each element of the virtual camera, we will then be able to operate all of its functions externally from the control surface in an intuitive manner.
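To make that concrete outside of Unreal, the same mapping logic would look roughly like this as a Python sketch using the mido library (the pot/CC numbers, ranges and port name are purely illustrative):

```python
import mido

# Hypothetical CC-number-to-parameter map: pot 6 -> focal length, pot 7 -> focus.
CC_MAP = {
    6: ("focal_length_mm", 14.0, 200.0),     # (parameter, min, max)
    7: ("focus_distance_cm", 10.0, 1000.0),
}

def set_virtual_camera(param, value):
    # Stand-in for "set this value on the virtual camera inside Unreal Engine".
    print(f"{param} = {value:.1f}")

with mido.open_input("Launch Control MIDI 1") as port:   # port name is illustrative
    for msg in port:
        if msg.type == "control_change" and msg.control in CC_MAP:
            param, lo, hi = CC_MAP[msg.control]
            # Scale the 0-127 pot value onto the parameter's own range.
            set_virtual_camera(param, lo + (msg.value / 127.0) * (hi - lo))
```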

Blueprints are created using a visual scripting language made up of nodes that are connected together; it is accessible but contains enough options, and full access to the internals, to create deep functionality if required.

This is the overview of the Blueprint that I created for our camera control.

[Screenshot: overview of the camera control Blueprint]

Essentially, what is happening here is that I have created a number of functions within the main Blueprint which are triggered by different events.

The first function, triggered on startup, is this Find All MIDI Devices function that finds our MIDI controller, shown in expanded form here.

[Screenshot: the Find All MIDI Devices function in expanded form]

This function cycles through the attached MIDI controllers and lists them on the screen and then steps on to find our specific device that we have defined in another function.
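The non-Blueprint equivalent would be something like this (again a mido sketch; the target device name is just an example):

```python
import mido

TARGET = "X-TOUCH MINI"   # the device name we are looking for (illustrative)

# List every attached MIDI input, then pick out the one we want.
names = mido.get_input_names()
for name in names:
    print("Found MIDI device:", name)

port_name = next((n for n in names if TARGET in n), None)
if port_name is None:
    raise RuntimeError(f"{TARGET} not found")
port = mido.open_input(port_name)
```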

Once the MIDI device is found, the main Blueprint then processes incoming MIDI messages and conditionally distributes them to the various different functions that I have created for processing them and controlling the virtual camera such as this one for controlling the roll position of the camera.

[Screenshot: the function controlling the roll position of the camera]

When the Blueprint is running, the values of the parameters of the virtual camera are then controlled by the MIDI controller in real time.

This image shows the viewport of the virtual camera in the editor with its values changed by the MIDI controller.

[Screenshot: the virtual camera viewport in the editor, with values changed by the MIDI controller]

So far so good in theory, but the MIDI protocol does present us with some challenges, as well as some opportunities.

The first challenge is that the value carried by most "regular" messages, such as note on, control change and so on, that you would use as a vehicle to transmit data only ranges from 0 to 127 and doesn't support fractional or negative numbers.

If you look at the pots on the LaunchControl, you will see that they are regular pots with finite travel, so if you rotate one all the way to the left it will output 0 and will progressively output a higher value until it reaches 127 when rotated all the way to the right.

As many of the parameters that we will be controlling have much greater ranges (such as focus position) or have fractional values (such as f stop) then another solution is required.

The pots on the X-Touch Mini, by contrast, are digital encoders and so have infinite travel, as they only report relative movement rather than absolute values like the pots on the LaunchControl do.

This enables them to take advantage of being mapped to MIDI events such as Pitch Bend which can have 16,384 values.

If we use the X-Touch Mini, then, we can create the Blueprint in such a way that it either increments or decrements a value by a certain amount rather than absolute values, which takes care of parameters with much greater range than 0 to 127, those which have negative values and those which have fractional values.
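In code terms, the two approaches look roughly like this (a mido sketch; the CC number, step size and ranges are invented, and the relative-encoder convention depends on which mode the X-Touch Mini is in):

```python
import mido

focus_distance_cm = 300.0   # the value the Blueprint holds for the virtual camera
STEP_CM = 5.0               # increment size decided inside the Blueprint

def handle(msg):
    global focus_distance_cm
    if msg.type == "pitchwheel":
        # Absolute 14-bit value: mido reports -8192..8191, i.e. 16,384 steps,
        # which we can spread across a 0-1000 cm focus range.
        focus_distance_cm = (msg.pitch + 8192) / 16383.0 * 1000.0
    elif msg.type == "control_change" and msg.control == 1:
        # Relative encoder: treat the value as a direction and nudge up or down.
        direction = 1 if msg.value > 64 else -1
        focus_distance_cm += direction * STEP_CM

# Usage (port name illustrative):
# for msg in mido.open_input("X-TOUCH MINI MIDI 1"):
#     handle(msg)
```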

Essentially we are using the encoders or switches on the X-Touch Mini as over-engineered plus and minus buttons, with the extent of each increment/decrement set inside the Blueprint.

Workable, but not ideal. It also makes it trickier to determine a step size for making large stepped changes (ISO 100 to 200 to 400 to 640 is only four steps on your camera, but a lot more if you have to go in increments of, say, 20) and, of course, every focus change would be a looooongg pull.

There is also the aspect that, whilst perfectly fine for initial development, these two MIDI controllers are not only wired but also pretty big.

As our longer term goal is for something to integrate with a camera, the requirement would quickly be for something far more compact, self powered and wireless.

So, the solution I came up with was to lash together a development board and a rotary encoder and write some firmware that would make it a much smaller MIDI controller that would transmit to Unreal Engine over wireless Bluetooth MIDI.

[Image: the development board and rotary encoder prototype]

Not pretty but functional enough.

To switch between the different parameters, you just press the encoder in and it then changes to controlling the next one.

If that seems like less than real-time camera control because you have to switch, then you are right, it is, but this form is not its end goal, so the encoder is mainly for debugging and testing.

The point of such a module is that a camera will talk to it, and it will then process that data for onward transmission to Unreal Engine. So it's just an interface platform for other prototypes.

In concert with the Blueprint, because we now control both the format of the data being transmitted and how it is interpreted, we have all the options we need to encode and decode it so that we can fit what we need within MIDI's constraints and easily handle the fractional numbers, negative numbers and large numbers that we will need.

It also offers the capability of different degrees of incremental control as well as direct.

But that's for another day.

Meanwhile, here is a quick video of the output of it controlling the virtual camera in coarse incremental mode.

 


I know very well that you are a very good programmer and that you are surely trying to make your own system to control the Unreal camera, but after seeing the example in the video, I wonder if it would not be much more convenient to use a ready-made app, such as Virtual Plugin, which I used in my tests... You can control different parameters such as movement and focus, you can move around the world with virtual joysticks as well as walk, you can lock the axes at will, and so on. Try giving it a look; there is also a demo that can be requested via email...

https://www.unrealengine.com/marketplace/en-US/product/virtual-plugin

 


2 hours ago, BTM_Pix said:

So, the solution I came up with was to lash together a development board and a rotary encoder and write some firmware that would make it a much smaller MIDI controller that would transmit to Unreal Engine over wireless Bluetooth MIDI.

I have a control surface I made for various software. I have a couple of rotary encoders just like the one you have, which I use for adjusting selections, but I got a higher resolution one (LPD-3806) for finer controls, like rotating objects or controlling automation curves. Just like you said, having infinite scrolling is imperative for flexible control.

2 hours ago, BTM_Pix said:

The purpose of how such a module will fit in is in how a camera will talk to it and it then processing that data for onward transmission to Unreal Engine. So its a just an interface platform for other prototypes.

I recommend still passing raw data from the dev board to the PC, and using desktop software to interpret the raw data. It's much faster to iterate, and you have much more CPU power and memory available. I wrote an app that receives the raw data from my control surface over USB, then transmits messages out to the controlled software using OSC. I like OSC better than MIDI because you aren't limited to low-resolution 7-bit messages; you can send float or even string values. Plus, OSC is much more explicit about port numbers, at least in the implementations I've used. But having desktop software interpreting everything was a game changer for me compared to sending MIDI directly from the Arduino.
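For anyone curious, the OSC side of that is only a few lines with the python-osc package (the host, port and addresses here are just examples):

```python
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)   # hypothetical host and port

# Floats and strings go straight through; no packing into 0-127 values needed.
client.send_message("/vcam/focus_distance", 312.5)
client.send_message("/vcam/aperture", 5.6)
client.send_message("/vcam/preset", "wide_garden")
```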


1 hour ago, Gianluca said:

I know very well that you are a very good programmer and that you are surely trying to make your own system to control the Unreal camera, but after seeing the example in the video, I wonder if it would not be much more convenient to use a ready-made app, such as Virtual Plugin, which I used in my tests

The video is purely a prototyping illustration of transmitting parameter changes to the virtual camera from an external source, rather than a demonstration of performance, hence the coarse parameter changes from the encoder.

So it is purely a communication hub that sits within a far more comprehensive system, one that takes dynamic input from a positional system but also direct input from the physical camera itself in terms of focus position, focal length, aperture, white balance, etc.

As a tracking-only option, the Virtual Plugin is very good though, but, yes, my goal is not currently the actual end use of the virtual camera.

A spoiler alert though, is that what you see here is a description of where I was quite a few weeks ago 😉 


1 hour ago, KnightsFan said:

I got a higher resolution one (LPD-3806) for finer controls, like rotating objects or controlling automation curve

The role of it in this scenario is just as a cheap and dirty way for me to generate basic input so an LPD-3806 would be overkill 😉 

The real time dynamic information is coming from the camera system.

1 hour ago, KnightsFan said:

I recommend still passing raw data from the dev board to the PC, and using desktop software to interpret the raw data. It's much faster to iterate, and you have much more CPU power and memory available. I wrote an app that receives the raw data from my control surface over USB, then transmits messages out to the controlled software using OSC. I like OSC better than MIDI because you aren't limited to low-resolution 7-bit messages; you can send float or even string values. Plus, OSC is much more explicit about port numbers, at least in the implementations I've used. But having desktop software interpreting everything was a game changer for me compared to sending MIDI directly from the Arduino.

Essentially, this is sending the raw data and the Blueprint itself is doing the interpretation, so it's conceptually the same.

Because I am generating the MIDI data directly rather than taking it from a control surface and re-interpreting it, I'm able to use the standard MIDI data structures in different ways to work around the limitations of unsigned 7-bit (0 to 127) values, because the Blueprint will know how to interpret them.

So if I want to change the aperture to f5.6, for example, then I can split that inside a basic Note On message on a specific channel so that I send that on channel 7 with a note value of 5 and a velocity value of 6.

The Blueprint knows that anything it receives on MIDI channel 7 will be an aperture value, takes the note value as the units and the velocity value as the fractional part, and reconstitutes it as 5.6 for the virtual camera.

Using the 14 bit MIDI messages such as Pitch Bend or Breath Control also offers more scope to create workarounds for negative values by using, for example, the midway value of the range (8191) as a "0" point and then interpreting positive and negative values relative to that and using division if necessary to create fractional values (e.g. -819.1 to +819.1 or -81.91 to +81.91 etc).
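As a concrete sketch of both tricks from the sending side (written here in Python with mido purely for illustration; the real thing lives in the dev board firmware, and the channel and scale choices are just the examples above):

```python
import mido

out = mido.open_output("BLE MIDI bridge")   # illustrative port name

def send_aperture(f_stop):
    # f/5.6 -> note 5, velocity 6 on "channel 7" (mido channels are 0-based,
    # hence channel=6 here). A whole stop like f/8.0 would give velocity 0,
    # which many receivers read as note-off, so that case would need its own rule.
    units, tenths = divmod(round(f_stop * 10), 10)
    out.send(mido.Message("note_on", channel=6, note=units, velocity=tenths))

def send_signed(value, scale=10.0):
    # Pitch bend as a signed value around the midpoint: mido already centres
    # pitch on 0 (-8192..8191), so scale=10 covers roughly -819.2..+819.1
    # and scale=100 covers roughly -81.92..+81.91.
    out.send(mido.Message("pitchwheel", pitch=int(value * scale)))
```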

The reason to take the MIDI path was to use BLE MIDI to keep everything wireless and enable it to sit within the existing BLE architecture that I have created for camera and lens/lens motor control so it made for much faster integration.

Having said all that......

As the saying goes, there is no point having a mind if you can't change it and things do change with this over the subsequent steps.


@BTM_Pix thanks for sharing your developments!

Would it be possible to take the AFX focus information and automate, in real time, the amount of focus/defocus of a projected background image (that is being captured along with the subject), based on the distance from the subject to the camera and the distance from the screen to the subject?

If the camera is stationary then inputting a measurement for calibration should not be a problem (I assume), but if the camera is moving then some type of proximity sensor array, like the ones used for robot geofencing, might do the trick.

It seems like this would be the holy grail of making the subject and background seamlessly integrated for shooting scenes where the projected background image is close to the subject, because we only have so much room in our living rooms or garages.
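For a rough feel of the numbers, a simple thin-lens approximation gives the blur-spot size of the screen when the lens is focused on the subject, which is how defocused the projected image already appears in camera (a sketch only; real lenses breathe and this ignores the projector's own softness):

```python
def background_blur_mm(focal_mm, f_number, subject_dist_mm, screen_dist_mm):
    """Thin-lens estimate of the blur-spot diameter (on the sensor, in mm) of a
    background screen at screen_dist_mm when focus is set to subject_dist_mm."""
    f, s1, s2 = focal_mm, subject_dist_mm, screen_dist_mm
    return (f * f / f_number) * abs(s2 - s1) / (s2 * (s1 - f))

# Example: 50mm lens at f/2, subject 2 m away, screen 1.5 m behind the subject.
print(background_blur_mm(50, 2.0, 2000, 3500))   # roughly 0.27 mm on the sensor
```

In principle, the AFX focus distance could drive the subject distance in real time and a distance sensor could supply the screen distance, and comparing this value with the defocus the virtual background is supposed to have would tell you how much extra blur to apply to the projected image.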



Yes, consider it an MMX of sorts that takes input from the AFX and then talks to the lens of the virtual camera in the same way the MMX talks to FIZ motors.

And it speaks to both simultaneously of course.

The additional development is in the integration of a positioning element.

More (relatively) soon.


  • 3 weeks later...

Anyone else wondering how well the Mac Studio will run Unreal for Virtual Production?

I need a number of video outputs, so it seems to be a good choice when it comes to outputting a lot of pixels. I could be wrong about its ability to use multiple USB 4 ports to display one large composite image…


Not the greatest test, but it looks like Unreal 5.0.0 Early Access runs on the M1 Max MacBook Pro. There are some caveats regarding Unreal 5 functionality on macOS, namely that it currently does not support Nanite. From the Unreal forums I read that Lumen runs in Unreal 5 Preview 1, so things are moving in the right direction. Maybe with the release of the Mac Studio, Epic will focus on supporting it, as it seems like it is going to be very popular...

Here's an overview of Lumen in Unreal 5:

Here's an overview of Nanite (it sounds like something everyone will want on macOS):

 


  • 2 weeks later...

Unreal Engine 5 is now released.

https://www.unrealengine.com/en-US/blog/unreal-engine-5-is-now-available

 

"The City" from The Matrix Awakens: An Unreal Engine 5 Experience is available a free download to play with Unreal Engine 5. Information below:

https://www.unrealengine.com/en-US/wakeup?utm_source=GoogleSearch&utm_medium=Performance&utm_campaign=an*3Q_pr*UnrealEngine_ct*Search_pl*Brand_co*US_cr*Frosty&utm_id=15986877528&sub_campaign=&utm_content=July2020_Generic_V2&utm_term=matrix awakens

 

It seems like a good minimum system config would be:

 

Windows 10 64-bit

64 GB RAM

256 GB SSD (OS drive)

2 TB SSD (data drive)

NVIDIA GeForce RTX 2080 SUPER (an RTX 3090 or RTX 3090 Ti would be better)

12- to 16-core CPU (so an AMD Ryzen 9 5950X or AMD Ryzen 9 5900X)

 

 


I had a chance to check out "The City" map running on Unreal Engine 5. The quality was excellent. The AI-driven behavior of traffic and crowds is excellent and scalable. What was very interesting is that they have window skins which, even though technically a 2D render, have parallax, so when you move past them, what looks to be furniture inside has proper visual movement.

