Gianluca

  1. I've also played with the city sample... I have a 2070 Super, and in-game it's choppy, but I can use the map for cinematics and the quality is very, very good...
  2. I've made some other small but significant progress... I can finally set up the Unreal sequencer quickly and predictably; before, there was always some step I hadn't done, and I'd waste hours just figuring out why it behaved the way it did. I've also finally understood how to create foreground planes in a way that increases the realism of the scene... This rough example of an animation really helped me understand it, and as always with Unreal, it wasn't easy at all, at least for me... ;)
  3. In this case it was more complicated to combine the two videos... You have to match the direction of travel perfectly, otherwise the subject seems to walk too fast or too slowly... Even the aspect ratio wasn't easy to get right... What do you think?
  4. I'm adding this video even though, honestly, I can't add anything new to the discussion, just to show the exceptional tracking that blendartrack achieves with embarrassing simplicity... Everything is then transferred as-is to Unreal with the same camera settings, and this is the result... If it weren't for the algorithm's interpretation problems, because the file is low resolution (1080p from a phone), there's a bicycle in the shot, and the background isn't homogeneous enough, you could really do anything with minimal effort... Unfortunately I can only use this type of tracking for a few carefully planned scenes... I then tried to digitize my son with RealityCapture, but I couldn't manage that either; I'll probably have to try again with different camera settings.
  5. Yes, the last part surprised me too... I've done some other tests with this sequence in other environments, and the best part for me is that I can change the location by importing another world and my subject stays perfectly aligned with the floor... This will greatly speed up my work when I combine multiple scenes. Unfortunately, I haven't found a way to make the phone's file match the mirrorless's perfectly; the problem is probably that the phone records at a variable framerate while the a6300 records at a fixed framerate... Even comparing the two files in Resolve, without doing anything else, you can see they have different durations. Maybe I'm doing something wrong, but for the moment I have to settle for the phone's file, which is unfortunately really bad, also because the application records at most in 1080p... Now I'm trying to upscale the video with Python to see if it improves a bit, but even if it doesn't help much, I have in mind to make a little movie where both the locations and my son get a cartoon effect, so that definition becomes a secondary problem...
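The upscaling idea can be sketched in plain Python. This is a toy nearest-neighbour upscaler on a tiny grayscale "frame" represented as nested lists; the actual script isn't shown in the thread, and real work would use something like OpenCV's `cv2.resize` or an AI upscaler applied to every video frame:

```python
# Toy nearest-neighbour 2x upscale of a single grayscale frame.
# A frame is a list of rows of pixel values; each pixel is repeated
# `factor` times horizontally and each row `factor` times vertically.
# Illustration only: a real pipeline would use OpenCV or an AI model.

def upscale_nearest(frame, factor=2):
    """Return the frame enlarged `factor`x with nearest-neighbour sampling."""
    out = []
    for row in frame:
        wide_row = [px for px in row for _ in range(factor)]
        out.extend([list(wide_row) for _ in range(factor)])
    return out

frame = [[1, 2],
         [3, 4]]
big = upscale_nearest(frame)  # 2x2 frame becomes 4x4
```

Nearest-neighbour adds no detail, which is why learned upscalers are the interesting option here; but the loop structure is the same either way: produce a larger grid and fill each output pixel from the source.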
  6. Yesterday, after an exhausting day arguing with Unreal Engine, I finally managed to get a video with credible tracking, and the merit is entirely due to a free smartphone application called blendartrack. I've never been able to use Blender's own camera tracking; moreover, if there are moving subjects, you have to sit there deleting points by hand wherever the subject overlaps them, then align the planes, and so on... With blendartrack all of this is really a walk in the park: from when I open Blender to when I export for Unreal, it takes no more than 15 seconds. Truly a liberation... In this example the tracking, in my opinion, isn't even at its full potential, because in the rush with my son, who wanted to do something else, I forgot to add markers, but it's already very good... What surprised me is that the focal length used by the phone is also exported, and that the Unreal world (or at least the one used in this example) responds perfectly to real-world movements, without having to scale anything. The only big drawback at the moment is that video taken with the phone has very low quality compared to what I can get with the a6300 and its dedicated lenses... I have to figure out how to track with the phone's file and then use a file shot with the mirrorless.
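The exported focal length "just working" makes sense: field of view follows from focal length and sensor width via the standard pinhole relation, so if both carry over, the virtual camera matches the real one. A minimal sketch (the 26 mm / 36 mm numbers are made-up example values, not the actual phone's):

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal field of view of a pinhole camera, in degrees:
    FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Hypothetical example: a 26 mm full-frame-equivalent lens
# on a 36 mm-wide (full-frame) sensor gives roughly a 69-degree FOV.
fov = horizontal_fov_deg(26.0, 36.0)
```

This is why mismatched sensor settings (a real focal length entered against the wrong virtual filmback) make a tracked shot slide even when the solve itself is good.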
  7. Yes, it's Robust Video Matting... In this video the tracking is bad, but that's my fault: the fps of the two videos don't match... I'm using Virtual Plugin, an app for Unreal Engine that you install on a smartphone... A projected background is better, but you NEED a studio, a very large studio, with lights and so on; here I'm in my garden... Now I want to use Blender camera tracking for videos where an exact match is needed.
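An fps mismatch matters more than it might seem, because even a tiny difference accumulates over a shot. A quick sketch of the drift (the framerates here are generic examples, not the actual files):

```python
def drift_seconds(n_frames, fps_a, fps_b):
    """How far apart two clips with the same frame count end up
    when played back at different framerates."""
    return abs(n_frames / fps_a - n_frames / fps_b)

# Example: 1800 frames (about one minute) at 30 fps vs 29.97 fps
# already drift by roughly 0.06 s, i.e. around two frames.
d = drift_seconds(1800, 30.0, 29.97)
```

A couple of frames of slip is enough to make a camera track look wrong against the footage, which is consistent with what the video shows.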
  8. Hello everyone... I preferred to open a new topic rather than keep abusing "My journey to virtual production". With the carnival I was able to do some more tests with Unreal, and this time I recorded my subject up against a yellow wall... As you can see, in controlled working conditions the results can be really good... Obviously there's still a lot of room for improvement: for example, I have to synchronize the two video tracks by recording a common audio track, and I have to balance the gimbal better (I have a Crane-M, and with the phone mounted I exceed the weight it can support, so it vibrates a lot). But apart from that, if I had an actress who did something sensible instead of shooting all the time 🙂 I might at this point be able to shoot something decent... What do you think? Opinions, advice?
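Syncing two tracks from a common audio recording is usually done by cross-correlation: slide one waveform against the other and keep the offset where they line up best. A minimal pure-Python sketch on toy sample lists (real footage would use audio samples extracted from both files, e.g. with ffmpeg):

```python
def best_offset(a, b, max_shift):
    """Return how many samples `b` lags behind `a`, found by trying
    every shift in [-max_shift, max_shift] and scoring the overlap
    with a simple dot product (a brute-force cross-correlation)."""
    best, best_score = 0, float("-inf")
    for shift in range(-max_shift, max_shift + 1):
        score = sum(a[i] * b[i + shift]
                    for i in range(len(a))
                    if 0 <= i + shift < len(b))
        if score > best_score:
            best, best_score = shift, score
    return best

clap = [0, 0, 0, 9, 1, 0, 0, 0]      # reference track: clap at sample 3
delayed = [0, 0, 0, 0, 0, 9, 1, 0]   # same clap, two samples later
offset = best_offset(clap, delayed, max_shift=4)  # -> 2
```

At real sample rates the same idea runs over FFT-based correlation rather than this O(n * shifts) loop, but the principle, and the reason a single sharp clap works so well, is the same: one dominant peak makes the best offset unambiguous.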
  9. I know very well that you're a very good programmer, and surely you're trying to build your own system to control the Unreal camera, but after seeing the example in the video, I wonder whether it wouldn't be much more convenient to use a ready-made app, such as Virtual Plugin, which I used in my tests... You can control different parameters such as movement and focus, you can move around the world with virtual joysticks as well as by walking, you can lock the axes at will, and so on... Give it a look; there's also a demo that can be requested via email... https://www.unrealengine.com/marketplace/en-US/product/virtual-plugin
  10. You probably want something like this..
  11. Absolutely no problem for the screenshot, but I don't understand how you could go around a subject and at the same time always have the projected image behind them... It would take a huge set like in The Mandalorian, and I think the person who opened the topic is trying to do something similar... I, on the other hand, having no money, am trying to extract the subject as if they were in front of a green background even when they're not. The smartphone app connected to Unreal Engine then does the rest... For scenes that require the subject's feet and camera movement, you can use Blender's camera tracking, but I've never managed to get anything good so far...
  12. While I'm learning to import worlds into Unreal, learning to use the sequencer, and so on, I'm also doing some tests, which I'd call extreme, to understand how far I can go... I'd call this test a failure, but I've already seen that with fairly homogeneous backgrounds, with the subject close enough and in focus (here I used a manual-focus lens), and if I don't have to frame the feet (in those cases I'll use the tripod), it's possible to catapult the subject into the virtual world quite easily and realistically... The secret is to shoot many small scenes, with the values fed to the script optimized for each particular scene... Next time I'll try to post something made better...
  13. Sorry if I post my tests so often, but honestly I've never seen anything like it, and for a month I also paid 15 euros for RunwayML, which in comparison is unwatchable... We're probably not yet at professional levels, but by retouching the mask a little with Fusion, in my opinion we're almost there... I'm really amazed by what you can do...
  14. From the same developers of the Python software that gives you a green screen (and a matte) from a tripod shot plus a background image without the subject, I found this other one, called RVM (Robust Video Matting), which lets you do the same thing even with moving shots, starting directly with the subject in the frame. For static shots the former is still much superior, but this one has its advantages when used in the right conditions. With a well-lit subject and a solid background, it can replace a green screen; or, with a green or blue background, you can get a perfect subject even if they barely stand out from the background, or even if their clothes contain green or blue. Now I'm really ready to shoot small scenes in Unreal Engine, and I can also move slightly within the world itself thanks to Virtual Plugin and these Python tools. The problem is that the actor has no intention of collaborating...
  15. In this example I used a Python script that gives you perfectly outlined images without any green screen: the camera must be placed on a tripod, and then a background image without the subject is enough, and the script does everything else... It's not 100% reliable, but when it works it's really perfect... I then used the Composure plugin in Unreal to try to put the real and the virtual in the same perspective... Now I'm experimenting with special effects inserted directly in Unreal...
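The clean-plate idea behind that kind of script can be sketched as a per-pixel comparison against the empty background. This is a toy grayscale threshold matte, not the actual tool (which is a learned matting model that also produces soft edges), but it shows why the tripod and the subject-free plate are required:

```python
def difference_matte(frame, clean_plate, threshold=30):
    """Toy background-difference matte on grayscale frames:
    1 where the shot differs enough from the empty-background plate
    (subject), 0 where it matches (background). A tripod is needed
    so that frame and plate stay pixel-aligned."""
    return [[1 if abs(f - c) > threshold else 0
             for f, c in zip(frow, crow)]
            for frow, crow in zip(frame, clean_plate)]

plate = [[100, 100, 100],
         [100, 100, 100]]
shot  = [[100, 200, 100],   # bright subject pixels in the middle column
         [100, 210, 100]]
matte = difference_matte(shot, plate)  # [[0, 1, 0], [0, 1, 0]]
```

A hard threshold like this fails exactly where the script is reported to fail: subject pixels close in value to the background. The real model learns context around each pixel, which is what makes the "when it works it's perfect" results possible.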