
majoraxis

Members
  • Posts

    489
  • Joined

  • Last visited

Everything posted by majoraxis

  1. Besides the Kinefinity MAVO Edge 6K digital cinema camera, I am not aware of any other professional video cameras being released at NAB 2022. It used to be that companies would book purchases with manufacturers for the next TV season at NAB. That does not seem to be a thing anymore, but I could be wrong. It would really make sense for there to be a $5k URSA 8K box camera with the same 12K sensor, just with limited resolution, and maybe a Canon RF mount and on-sensor electronic ND. And at the same time release a 12K G2 with on-sensor electronic ND for $6,995. There would be lots of people willing to upgrade for electronic NDs... maybe a late-summer camera line refresh highlighting an electronic ND management system that always keeps the video signal under peaking and has metadata for smooth exposure ramping in post.
  2. Looks like no new cameras from Blackmagic at NAB 2022... I installed the Resolve 18 beta and it seems pretty stable (at least for audio processing). I did have it crash when I was trying to drag and drop an audio plugin from one buss to another buss, so I restarted it, copied the plugin settings, instantiated the same plugin on a different track, pasted the plugin settings, and it worked fine. Also, the same session opened up and played back the same audio plugin combination, so it seems they are not adding processing overhead to audio plugins in Resolve 18. That was not expected to be a problem, but it's nice to see that it is not an issue. Does anyone know if it is $5 per month to host your projects on the Blackmagic Cloud, or is it $5 per project? It seems like it is per project, but I have not taken the plunge yet... Thanks!
  3. https://www.blackmagicdesign.com/products/davinciresolve/whatsnew I'm looking forward to many of the new AI masking features in Resolve 18. I imagine Resolve 18 will be at a level where many people who are on the fence about switching will not be able to resist the power of 18!
  4. If you are looking to tame harshness and add density and presence, G-Sonique's Analog Tape ASX-72 is on sale for €18.9 as an intro price for the first 100 customers. This is a Windows-only plugin. https://www.g-sonique.com/analog-tape-asx-72-vst-plugin.html I've tried many tape emulations and this is one of the best, especially for not harming the original signal. A great application would be to make an inexpensive camera mic preamp sound more rich and full. If you are not happy with the tone of your audio recording, this will reframe it in a pleasing and present manner.
  5. Hi, There are a number of inexpensive (when on sale) audio problem solvers to demo to see if they just might take your mix to the next level. 1. FIRE The Clip - Make your mix louder without unwanted side effects. Best clipper available IMHO. Use instead of or after a limiter. €49.00 https://www.acustica-audio.com/store/products/firetheclip 2. CLARITY Vx - Voice noise reduction. On sale now for $29.99 https://www.waves.com/plugins/clarity-vx#clarity-vx-vocal-noise-reduction 3. TRUE MID/SIDE - Can be used to enhance ambience and balance your stereo field. $29.95 https://www.raisingjakestudios.com/utilitiy and fx.html#True:32Mid:47Side All of these have a single knob to control the main effect, so they are super easy to use.
  6. @Gianluca I would love to see what you do with “The City” in the Unreal and Python background remover thread! I just got a used computer with a 3090 in it. When I use the mouse to navigate there is a little screen tearing, but it plays back in real time fine. When I use the arrow keys it is pretty much smooth. I believe it is because I only have a 6-core CPU (AMD 5600). I also did not change the quality setting, which can make a difference. My next move is to purchase an AMD 5950X CPU and add a large, fast second hard drive to pull the data from. When I’ve done these things I will have met the minimum specification for The City, so I think it will be fine with most maps. The thing is, for virtual production, if you are just loading a fixed set/location rather than a 4-kilometer area map with lots of AI traffic and crowds, it should be fine for real time if you meet the minimum specifications. What is fascinating to me is how efficiently Unreal Engine 5 uses and allocates resources. It is as if it has a George Lucas producer mindset: what is the minimum needed to fool the audience by taking advantage of cutting-edge technology? What I don’t think is there yet are the MetaHumans… so real actors shot against a live virtual background is about as real as Unreal can get right now…
  7. I had a chance to check out "The City" map running on Unreal Engine 5. The quality was excellent. The AI-driven behavior of traffic and crowds is excellent and scalable. What was very interesting is that they have window skins that, even though they are technically 2D renders, have parallax, so when you move past one, what looks to be furniture inside has proper visual movement.
  8. Unreal Engine 5 is now released. https://www.unrealengine.com/en-US/blog/unreal-engine-5-is-now-available "The City" from The Matrix Awakens: An Unreal Engine 5 Experience is available as a free download to play with in Unreal Engine 5. Information below: https://www.unrealengine.com/en-US/wakeup?utm_source=GoogleSearch&utm_medium=Performance&utm_campaign=an*3Q_pr*UnrealEngine_ct*Search_pl*Brand_co*US_cr*Frosty&utm_id=15986877528&sub_campaign=&utm_content=July2020_Generic_V2&utm_term=matrix awakens It seems like a good minimum system config would be: Windows 10 64-bit, 64 GB RAM, 256 GB SSD (OS drive), 2 TB SSD (data drive), NVIDIA GeForce RTX 2080 SUPER (an RTX 3090 or RTX 3090 Ti would be better), and a 12-to-16-core CPU (so an AMD Ryzen 9 5900X or Ryzen 9 5950X).
  9. Hi Andrew, I’ll be the first to jump in. I’m happy to help with the outline of the movie if you would like to send me a link to your “virtual whiteboard”. Please let us know the terms of engagement. I suggest you create a platform to do what you are suggesting, so that others can do what you have done: ask others to collaborate. If you can figure out how to make it enjoyable and profitable to participate, then I think others will make a go of it as well. Basically, create a secure pipeline/model/portal/web community where people use their skills where their interests align, and have an expectation of revenue share if and when it succeeds.
  10. Not the greatest test, but it looks like Unreal 5.0.0 Early Access runs on the M1 Max MacBook Pro. There are some caveats regarding Unreal 5 functionality on macOS, namely that it currently does not support Nanite. From the Unreal forums I read that Lumen runs on Unreal 5 Preview 1, so things are moving in the right direction. Maybe with the release of the Mac Studio, Unreal will focus on supporting it, as it seems like it is going to be very popular... Here's an overview of Lumen in Unreal 5: Here's an overview of Nanite (sounds like it is something everyone will want on macOS):
  11. Anyone else wondering how well the Mac Studio will run Unreal for virtual production? I need a number of video outputs, so it seems to be a good choice when it comes to outputting a lot of pixels. I could be wrong about its ability to use multiple USB 4 ports to display one large composite image…
  12. Here's a website with lots of side-by-side comparison shots of Cooke lenses that are coated, have an uncoated front element, or have uncoated front and rear elements. http://www.cookeminirentals.com/uncoated-elements I prefer the examples with coated lens elements, but then I find nothing wrong with the look of a coated Cooke lens in the first place. I think the way to go is to find a lens that flares by the nature of its design (which could incorporate less-than-ideal lens coatings and lead to lens flares as well), thus a vintage zoom lens may be just what the doctor ordered.
  13. @Gianluca that video was really impressive. From looking at the edge of the Python key, it seems that there is a transition to the background at the edge, so it is most convincing when the background of the subject video is similar in color/lighting to the replacement background. Maybe if you lit your subject with a light from one side, kept the background behind the subject darker, and then substituted a similarly lit and oriented virtual background, you would have something that is even more seamless. Lighting your subject in anticipation of the background lighting you plan to use with the Python script should look even better, if you can get the real and virtual background lighting to match closely. For my application, I would like a projected background that is in sync with the camera and lens "position", with the replacement background synced in post to the camera movement relative to the projected background, and then, using the Python keying script, be able to transition from real to virtual with as little giveaway as possible. My goal is to seamlessly transition from a subject shot on a minimal projected set to an infinite virtual set, in a way that is compatible with traditional lighting and shooting techniques. I'm looking forward to what you come up with next!
  14. @Gianluca Wow - the tracking looks so realistic, especially when the camera shakes! I love the last part where the camera twists and it looks like it tracked perfectly. This is really getting good. Looking forward to seeing how you make this work with your a6300. How do you synchronize the tracking data from the phone (assuming you put it on top of the camera) with the video file created in the camera? I imagine the sync would have to be spot on to keep that same level of believability...
  15. @BTM_Pix thanks for sharing your developments! Would it be possible to take the AFX focus information and automate, in real time, the amount of focus/defocus applied to a projected background image (one that is being captured along with the subject), based on the distance from the subject to the camera and the distance from the screen to the subject? If the camera is stationary, then inputting a measurement for calibration should not be a problem (I assume), but if the camera is moving, then some type of proximity sensor array, like the ones used for robot geofencing, might do the trick. Seems like this would be the holy grail of making the subject and background seamlessly integrated when shooting scenes where the projected background image is close to the subject, because we only have so much room in our living rooms and garages.
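One way to sketch the idea: if the focus distance is known (e.g. reported by the AFX) and the subject-to-screen distance is measured, a thin-lens approximation gives the blur-circle diameter the real lens would render at the screen plane, and that value could drive how much extra defocus the projected background needs. This is just an illustrative Python sketch under that thin-lens assumption; the function name, the example distances, and the mapping from blur diameter to projector-side blur are all hypothetical, not an actual AFX API:

```python
def blur_diameter_mm(focal_mm, f_number, focus_dist_mm, object_dist_mm):
    """Thin-lens estimate of the blur-circle diameter (on the sensor, in mm)
    for an object at object_dist_mm when the lens is focused at focus_dist_mm."""
    aperture_mm = focal_mm / f_number  # entrance pupil diameter
    return (aperture_mm * focal_mm / (focus_dist_mm - focal_mm)
            * abs(object_dist_mm - focus_dist_mm) / object_dist_mm)

# Hypothetical numbers: 50 mm lens at f/2.8, subject (focus plane) at 2 m,
# projection screen 1.5 m behind the subject, i.e. 3.5 m from the camera.
screen_blur = blur_diameter_mm(50, 2.8, 2000, 3500)
print(round(screen_blur, 3))  # ~0.196 mm of blur at the screen plane
```

The output would then be scaled from sensor millimeters to background-image pixels before applying a matching blur to the projected plate; an object exactly at the focus distance returns zero, so a subject moving toward the screen would smoothly reduce the added defocus.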
  16. @Gianluca - Fantastic! I am wondering if it would be possible to film a virtual background behind the actor, say a 10' x 5' short-throw projected image, and then extend the virtual background in sync with the computer-generated projected background being recorded behind the actor. Kind of like in the screen grab from your video that reveals grass, a house, a bush and a trampoline. If the physical background in this shot was the projected image behind the actor, and the Minecraft extended background blended seamlessly with the small section of projected background being recorded along with the actor, that would be super interesting to me, as I am putting together a projected background. I am hoping to combine the best of both worlds and I am wondering where that line is. Maybe there are still some benefits to shooting an actor with a cine lens on a projected background for aesthetic/artistic reasons? Maybe not? Thanks! (I hope it's OK that I took a screen grab of your video to illustrate my point. If not, please ask Andrew to remove it. That's of course fine with me.)
  17. This is fantastic stuff! Very impressive quality with less production overhead. @Gianluca thanks for continuing to share your virtual production journey with us! @BTM_Pix is the concept of taking real-time focus motor and zoom motor position data from the AFX and feeding it to the virtual camera, so that the background responds appropriately, viable? Thanks!
  18. Who knows, maybe a few years ago Sony asked AP what specifications they would like and whether that would be enough to make them switch from Canon. Maybe Sony, after doing their market research, listened and delivered what their customers wanted, and now has an advantage in the marketplace because of it. Or maybe the system is corrupt, and someone inside AP is getting a kickback from Sony and wrote the bid to only allow Sony products to be purchased.
  19. Interesting web page format and content. Movie posters paired with the camera and lenses used, as well as a brief description. It reminds me of a "quad chart", where a predictable format makes information easy to consume by an audience with little time to waste. I love looking at information presented in this way.
  20. When you say "I cleaned up the effect quite a bit", is that in reference to making the movement of the background more fluid? If so, could you explain the basic concept of how you went from the background motion of BGTest to the smooth background motion of BGTest 2? That would be appreciated. Thanks!
  21. Great topic! I have been researching projectors for just such an application. Here are some of my findings.
      1. Laser projectors have better contrast ratios than lamp-based projectors, so laser projector blacks are less grey and preferred.
      2. Laser projectors have better color consistency, degrade predictably and uniformly over time, and are typically good for 20,000 hours, so laser is preferred over a lamp as a projector light source.
      3. You can damage your vision by looking into a laser projector lens (when it is on, of course), so a short throw projector positioned behind the subject, or rear projection, is the safest configuration when shooting human or animal subjects. Some laser projectors have proximity sensors and dim when an object gets too close to the lens; others do not.
      4. DLP vs. LCD: DLP can produce a rainbow effect when filming the projected image, so LCD is preferred.
      5. DLP vs. LCD: LCD has better color brightness and saturation than DLP, so LCD is preferred.
      6. Home vs. commercial projectors: some commercial projectors offer "projection mapping" capabilities, a sophisticated means of combining the images of multiple projectors, blending the overlapping image edges and compensating for geometric issues, as well as offering image calibration across multiple projectors. If you plan to use more than one projector to make a single large image, you will want commercial projectors with projection mapping capabilities.
      7. Lens shift and throw distance: short throw projectors seem to be designed with a narrow throw-distance range and limited (if any) optical/physical lens shift capabilities. Many commercial-grade projectors have removable lenses with remotely operated motorized lens shift, zoom and focus. For creative and projection mapping applications, a commercial projector with a detachable lens provides greater placement and screen-size options.
      8. "4K" projection: true 4K LCD laser projectors are far more expensive than 4K-enhanced LCD projectors that use LCD or DLP imaging-element shifting to create the perception of about half the resolution of true 4K (like the LG CineBeam or Epson LS500 do).
      Here's what a 3LCD (1920x1200 native) ultra short throw laser projector with 4,000 lumens of light output looks like under trade show conditions, front projected and rear projected. It is the Epson PowerLite 700U and is about $1,400 refurbished. For those who would like to consider using multiple projectors and doing projection mapping, the Epson PowerLite L1100U (6,000 lumens; it can take a 4K signal but does not do the 4K enhancement when projection mapping) would be a good entry point at about $2,300 each refurbished. Here's a video of how a room looks mapped with multiple projectors. I imagine an ALR (ambient light rejecting) screen would help in trying to achieve a usable ratio of screen light to subject light, along with higher-ISO cameras. @BTM_Pix such a great topic. Thank you! I will be following this thread closely.
  22. Beautiful Cinematography! RED Komodo looking great! There is something special about slow motion shot with a global shutter (I assume).
  23. If they increase the sensor size with 25% larger photosites (better low light and dynamic range?), they could potentially use virtually the same electronics and modify the hardware with a larger ND window and a larger throat to the sensor etc. to accommodate a full frame sensor. If they added on-sensor electronic NDs, they could reengineer the lens mount to accommodate shorter flange distance mounts like RF and even Micro 4/3. I would also like to see/hear 32-bit audio on their next URSA Mini Pro release. Then we would have pseudo-RAW video and audio. One of the benefits of the BRAW architecture is that I can record 12K or 8K and, if my computer has trouble playing it back, downsample it to 4K or 2K on the fly and it will play back fine without my having to make a proxy. A USB-C connection to NVMe M.2 drives makes this workflow possible and affordable. These drives are so fast and currently on sale for about the same price as a 2.5" SSD. Seems like Blackmagic has solved most of the problems at the same time: affordable high-performance storage via USB-C; a scalable RAW-like codec for high-resolution, low-bandwidth acquisition; an architecture that supports scalable playback without transcoding; an editing environment that supports BRAW; and a cinema camera form factor with professional connections like XLR and SDI, PL mount and B4 mount lens options, a viewfinder option, and timecode. So what are they missing? Full frame? World-class autofocus? Electronic ND? RF mount? 32-bit audio? Better dynamic range? Better low light? Selling their own lenses and lighting so that they can offer end-to-end solutions and support. Renting and loaning their own cameras, lenses, lighting and editing workstations at a discount in markets they want to enter. Providing local support offices in every major market so they can personally fix any problem that may arise. Providing workflow case studies so that productions can justify adoption. "Acquiring" name-brand endorsers. Focusing efforts on being the camera that award-winning productions are listed as shot on. Charging 5 to 10 times as much to pay for market-dominating infrastructure and acquisitions. As someone who wants to continue to be able to afford Blackmagic cinema products, I hope they continue to grow organically (like Aputure) into professional markets rather than build the infrastructure and acquire the adjacent product lines to try to dominate the professional market by force of will. With a better economy, Blackmagic probably would not have had to lower the price of the 12K to get it to sell, but I'm glad they did, because otherwise I would not be able to justify purchasing one (used). Thanks Blackmagic! Your approach is just right. Steady as she goes, Captain Petty, steady as she goes. Regardless of the price drop and how many 12Ks are selling, the Blackmagic URSA Mini Pro 12K is the king of affordable professional cinema cameras.
  24. The URSA Mini Pro 12K with Camera Update 7.7 promises improved performance, including: improved color balance; improved demosaic for better shadow detail MTF; improved demosaic to reduce moiré effects on fine patterns; USB-C connection improvements; and improved compatibility with ExAscend U.2 disks on the URSA Mini Recorder. Here's a link to the press release: https://www.blackmagicdesign.com/media/release/20211220-01 Not bad considering that you can now get them used for $5k. Thanks for continuing to invest, Blackmagic; the 12K is my next camera for sure! Does this mean I do not need to purchase the RAWLITE optical low-pass filter? I hope so...
  25. The great low-light sensor of the URSA Broadcast G2 makes adapted B4 lenses react to lighting conditions the way you would expect of a cinema camera with average low-light capabilities, and that's not a bad thing; in fact, getting net "average" low-light performance from adapted B4 lenses in a professional camera body is a fantastic improvement.
      1. The Blackmagic 4K Broadcast Lens Adapter loses at least a stop of light when expanding the 2/3-inch image circle to make it large enough to cover the 4K sensor crop of the 6K sensor.
      2. Depending on the B4 lens, stopping it down can increase resolving power, increase contrast, and allow easier focus following when shooting action scenes, which may be preferred over shooting it wide open.
      3. When zooming, some B4 lenses lose a fraction of a stop up to a full stop of light at the long end, as only some hold a constant aperture through the entire zoom range. If you start with such a lens stopped down, you can achieve a constant aperture through the entire zoom range, but at the cost of some light loss; you also increase the depth of field, which may be a benefit at the long end of the zoom to allow easier focus acquisition.
      4. If you have and engage the 2x extender on your broadcast B4 zoom lens, it doubles the focal length and you lose 2 stops of light.
      So, starting at f1.9 wide open on a B4 zoom lens at the wide end, then expanding the 2/3-inch image with the 4K Broadcast Lens Adapter (included with the Broadcast G2) to cover the 4K sensor crop, you are at f2.8. If you stop down for constant aperture and/or better resolution and contrast, you are at f3.9. If you engage the 2x extender on top of the above, you are at f7.8... and you have a fantastic image that looks like it was shot on a cinema camera with a sensor with average low-light capabilities, plus you get the creative freedom that a B4 zoom lens provides.
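The light-loss arithmetic above is easy to sanity-check: each lost stop multiplies the f-number by the square root of 2 (and the 2x extender's 2 stops multiply it by exactly 2). A minimal Python sketch, using the stop counts quoted above; the small differences from the f2.8/f3.9/f7.8 figures come from rounding each intermediate value:

```python
import math

def apply_stop_loss(f_number: float, stops_lost: float) -> float:
    """Each lost stop of light multiplies the effective f-number by sqrt(2)."""
    return f_number * math.sqrt(2) ** stops_lost

f = 1.9                      # B4 zoom wide open at the wide end
f = apply_stop_loss(f, 1)    # 4K Broadcast Lens Adapter: ~1 stop
print(round(f, 1))           # ~2.7 (quoted above as f2.8)
f = apply_stop_loss(f, 1)    # stopped down one stop for constant aperture
print(round(f, 1))           # ~3.8 (quoted above as f3.9)
f = apply_stop_loss(f, 2)    # 2x extender: 2 stops, i.e. f-number doubles
print(round(f, 1))           # ~7.6 (quoted above as f7.8)
```

The same helper can estimate any other chain of adapters and extenders: total light loss in stops just adds, and the f-number grows by a factor of sqrt(2) per stop.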