

Everything posted by majoraxis

  1. Once a year we have the opportunity to fill in the gaps or expand our capabilities at potentially the best prices of the year. What's on your shopping list? Here are some of my favorite Black Friday discounted purchases: 1. Topaz Labs Everything Bundle (Topaz Video AI, Topaz Photo AI, Topaz Sharpen AI, Topaz Gigapixel AI, Topaz DeNoise AI), all for $279 2. Affinity Photo 2, Affinity Designer 2, Affinity Publisher 2, and the iPad version of each, all for $99 3. WaveLab 11 for 50% off, and even more because of the exchange rate if you purchase through www.bestservice.com 4. Sonoris Mastering Compressor 25% off. 5. Relab Developments LX480 Dual-Engine Reverb V4 (basically a perfect copy of the Lexicon 480L dual-engine reverb) for 50% off.
  2. Fantastic! I am wondering if anyone will use the time-lapse record function for creating titles that morph over time. I just checked their website and not only is the sale still on, but you can get all of their apps for $20, which is even cheaper than the previous discounted price of $10 each. That said, if you only get one, Rebelle Pro is the one to get. https://www.escapemotions.com/blog/ten-years-of-escape-motions
  3. Hi, Rebelle 5 Pro simulates watercolors with realistic color blending and also does oils, acrylics, pencil and ink. If you ever wanted to try your hand at being a (digital) painter, get this deal: regularly $149, only $10, ends Oct 13. https://www.escapemotions.com/blog/ten-years-of-escape-motions https://www.instagram.com/p/CjQb5Faq4yC/
  4. Is this feature a big deal? “You can also use Ultimatte 12 to layer computer generated augmented reality foreground objects into a scene, complete with realistic transparency that your talent can walk behind” Could Unreal Engine 5 render, in real time, both the landscape/background and a foreground image with transparency, while changing how in focus the foreground is based on the subject's distance from the camera, as determined by the AFX focus distance? Thanks!
  5. I’ve owned a number of those cameras (GH1, NX1, BMCC 2.5K). I would say that the GH5 is a great low-total-cost-of-ownership camera. It’s 10-bit with a decent bit rate for about $750 used. Maybe that’s more than people would consider cheap, but it is hack and hassle free with a broadcast quality image. So have we had access to great low-cost cine images for a long time? Sure, but each camera had its challenges and flaws!
  6. A perspective from someone who mostly reads topics, sometimes posts when I have something to add, and (very infrequently) starts a new topic if I think it would be of interest to other forum members. I come here for information about video cameras, what to add to them to make them perform better, and comparison videos when two or more cameras or products might serve the same purpose. Basically, I want to gain some knowledge, and I am willing to contribute if I feel like I have something to offer. Maybe create a camera, lens, lighting, sound, and production equipment database where forum members could contribute their knowledge and experience through product reviews and tutorials, and answer questions that are posted to their product reviews. Everybody has a product they are passionate about and are willing to tell you about. Maybe create a structure for people to contribute, like a Wikipedia for production, with a questions and comments section following each review. Oh, and if I post a review of a product, I get to provide a link to my ShareGrid post so that I can rent it out, and maybe EOSHD gets a cut of the rental referral.
  7. Thanks Lilly, looking forward to what you come up with! If you want to get us started with the second question, about what makes up a camera tracking system for virtual production from Antilatency and how it compares feature- and cost-wise to other professional camera tracking systems, that would be a great place to start...
  8. @ohheylilly Welcome! We have a number of members who are interested in virtual production. Here's a topic for your review: some people on this forum have professional cameras, lenses, focus and zoom controllers, lighting, and green screens. It would be greatly appreciated if you could provide an example of how your system works with: a non-stationary camera (dolly or handheld); a non-stationary subject (distance relative to the camera and focal distance changing, with the operator also changing the zoom setting); a green screen background that gets replaced by a virtual background that changes its perspective and depth of field based on the relationship of the subject to the camera; real-time reporting of the zoom and focus positions of a manual zoom lens using the AFX system by our very own BTM_Pix; and Unreal Engine 5. What makes up a camera tracking system for virtual production from Antilatency, and how does it compare feature- and cost-wise to other professional camera tracking systems? Thanks for your interest in our forum and our virtual production needs!
  9. Will the URSA Mini Pro 12K get this? I hope so. I also hope that the rest of the URSA Mini Pros and the Broadcast get this as well. From what I have seen test-wise (thanks for the tests, @FHDcrew and @Davide Roveri!), I like the look and feel of the gyro stabilization; it reminds me of a professional handheld shot rather than a gimbal shot, and I am pretty sure I prefer the look of the gyro. I would also like to see what a gimbal shot without a stabilized lens, followed by gyro stabilization, looks like. It should probably look similar, but maybe gyro stabilization will make it look more like handheld footage, and hopefully it will be the best of both worlds. This is a game-changing feature for me, as I like the experience of shooting handheld, but it always looks worse than I think it will when I review the footage. Now I can live in my movie-making fantasy world and not have to apologize for following my creative impulses at the end of the day. Thank you Blackmagic!
  10. Very impressed that it is not 8K or 6K, so it should have better low light at $4.6k than if it were a higher resolution. I would love to see what 20 stops looks like compared to 17 stops, etc. Will it be noticeable? And will they have a log curve that maps 20 stops into a 16- or 12-bit storage file bit depth to retain all of the dynamic range? They have been thinking about how to deliver these specifications for a number of years, so it will be informative to see how they solve higher dynamic range capture.
  11. I came across a free leveler that is worth checking out (if you are on Windows): https://luveler.blogspot.com/ It has a great natural sound that reinterprets the dynamics to make the result more natural sounding, even if you are only changing the volume slightly. I am using it after a multiband compressor on the master bus, and it makes the compressor sound more natural. I have a feeling this is going to be a secret weapon of sorts, as it can really reframe the dynamic response of a track or mix for the better. Thomas also makes LoudMax, a free limiter that runs on both Windows and Mac: https://loudmax.blogspot.com/ Both are worth checking out.
  12. The URSA 12K highlights are pleasing, and their transition to being blown out is smooth. Skin tones are nice, but the color grading was a bit too green at times for my taste... I like the full-spectrum color rendering in daylight. The sensor technology shows its advantages under the right conditions; full-spectrum light is required to get the most out of this camera.
  13. Besides the Kinefinity MAVO Edge 6K digital cinema camera, I am not aware of any other professional video cameras being released at NAB 2022. It used to be that companies would book purchases with manufacturers for the next TV season at NAB. That does not seem to be a thing anymore, but I could be wrong. It really makes sense for there to be a $5k URSA 8K box camera with the same 12K sensor, just with limited resolution and maybe a Canon RF mount and on-sensor electronic ND. And at the same time, release the 12K G2 with on-sensor electronic ND and audio for $6,995. There would be lots of people willing to upgrade for electronic NDs... maybe a late-summer camera line refresh highlighting an electronic ND management system that always keeps the video signal under peaking and has metadata for smooth exposure ramping in post.
  14. Looks like no new cameras from Blackmagic at NAB 2022... I installed the Resolve 18 beta and it seems pretty stable (at least for audio processing). I did have it crash when I was trying to drag and drop an audio plugin from one bus to another, so I restarted it, copied the plugin settings, instantiated the same plugin on a different track, and pasted the plugin settings, and it worked fine. Also, the same session opened up and played back the same audio plugin combination, so it seems they are not adding processing overhead to audio plugins in Resolve 18. That was not expected to be a problem, but it is nice to see that it is not an issue. Does anyone know if it is $5 per month to host your projects on the Blackmagic Cloud, or is it $5 per project? It seems like it is per project, but I have not taken the plunge yet... Thanks!
  15. https://www.blackmagicdesign.com/products/davinciresolve/whatsnew I'm looking forward to many of the new AI masking features in Resolve 18. I imagine Resolve 18 will be at a level where many people who are on the fence about switching will not be able to resist the power of 18!
  16. If you are looking to tame harshness and add density and presence, G-Sonique's Analog Tape ASX-72 is on sale for €18.90 as an intro price for the first 100 customers. This is a Windows-only plugin. https://www.g-sonique.com/analog-tape-asx-72-vst-plugin.html I've tried many tape emulations and this is one of the best, especially for not harming the original signal. A great application would be making an inexpensive camera mic preamp sound more rich and full. If you are not happy with the tone of your audio recording, this will reframe it in a pleasing and present manner.
  17. Hi, there are a number of inexpensive (when on sale) audio problem solvers worth demoing to see if they just might take your mix to the next level. 1. FIRE The Clip - make your mix louder without unwanted side effects. Best clipper available IMHO. Use instead of or after a limiter. €49.00 https://www.acustica-audio.com/store/products/firetheclip 2. CLARITY Vx - voice noise reduction. On sale now for $29.99 https://www.waves.com/plugins/clarity-vx#clarity-vx-vocal-noise-reduction 3. TRUE MID/SIDE - can be used to enhance ambience and balance your stereo field. $29.95 https://www.raisingjakestudios.com/utilitiy and fx.html#True:32Mid:47Side All of these have a single knob to control the main effect, so they are super easy to use.
  18. @Gianluca I would love to see what you do with “The City” in the Unreal and Python background remover thread! I just got a used computer with a 3090 in it. When I use the mouse to navigate there is a little screen tearing, but it plays back fine in real time. When I use the arrow keys it is pretty much smooth. I believe this is because I only have a 6-core CPU (AMD 5600). I also did not change the quality settings, which can make a difference. My next move is to purchase an AMD 5950X CPU and add a large, fast second hard drive to pull the data from. When I’ve done these things I will have met the minimum specification for The City, so I think it will be fine with most maps. The thing is, for virtual production, if you are just loading a fixed set/location rather than a 4-kilometer area map with lots of AI traffic and crowds, it should be fine for real time if you meet the minimum specifications. What is fascinating to me is how efficiently Unreal Engine 5 uses and allocates resources. It is as if it has a George Lucas producer mindset: what is the minimum needed to fool the audience by taking advantage of cutting-edge technology? What I don’t think is there yet are the MetaHumans... so real actors shot against a live virtual background is about as real as Unreal can get right now...
  19. I had a chance to check out "The City" map running on Unreal Engine 5. The quality was excellent. The AI-driven behavior of traffic and crowds is excellent and scalable. What was very interesting is that they have window skins that, even though they are technically 2D renders, have parallax, so when you move past one, what looks to be furniture inside has proper visual movement.
  20. Unreal Engine 5 is now released. https://www.unrealengine.com/en-US/blog/unreal-engine-5-is-now-available "The City" from The Matrix Awakens: An Unreal Engine 5 Experience is available as a free download to play with in Unreal Engine 5. Information below: https://www.unrealengine.com/en-US/wakeup?utm_source=GoogleSearch&utm_medium=Performance&utm_campaign=an*3Q_pr*UnrealEngine_ct*Search_pl*Brand_co*US_cr*Frosty&utm_id=15986877528&sub_campaign=&utm_content=July2020_Generic_V2&utm_term=matrix awakens It seems like a good minimum system config would be: Windows 10 64-bit; 64 GB RAM; 256 GB SSD (OS drive); 2 TB SSD (data drive); NVIDIA GeForce RTX 2080 SUPER (an RTX 3090 or RTX 3090 Ti would be better); and a 12- to 16-core CPU (so an AMD Ryzen 9 5950X or AMD Ryzen 9 5900X).
  21. Hi Andrew, I’ll be the first to jump in. I’m happy to help with the outline of the movie if you would like to send me a link to your “virtual whiteboard”. Please let us know the terms of engagement. I suggest you create a platform to do what you are suggesting, so that others can do what you have done and ask others to collaborate. If you can figure out how to make it enjoyable and profitable to participate, then I think others will make a go of it as well. Basically, create a secure pipeline/model/portal/web community where people use their skills where their interests align and have an expectation of revenue share if and when it succeeds.
  22. Not the greatest test, but it looks like Unreal 5.0.0 early access runs on the M1 Max MacBook Pro. There are some caveats regarding Unreal 5 functionality on macOS, namely that it currently does not support Nanite. From the Unreal forums I read that Lumen runs on Unreal 5 Preview 1, so things are moving in the right direction. Maybe with the release of the Mac Studio, Unreal will focus on supporting it, as it seems like it is going to be very popular... Here's an overview of Lumen in Unreal 5: Here's an overview of Nanite (sounds like it is something everyone will want on macOS):
  23. Anyone else wondering how well the Mac Studio will run Unreal for Virtual Production? I need a number of video outputs so it seems to be a good choice when it comes to outputting a lot of pixels. I could be wrong about its ability to use multiple usb 4 ports to display one large composite image…
  24. Here's a website with lots of side-by-side comparison shots of Cooke lenses that are coated, have an uncoated front element, or have uncoated front and rear elements. http://www.cookeminirentals.com/uncoated-elements I prefer the examples with coated lens elements, but I find nothing wrong with the look of a coated Cooke lens in the first place. I think the way to go is to find a lens that flares by the nature of its design (which could incorporate less-than-ideal lens coatings and lead to lens flares as well), thus a vintage zoom lens may be just what the doctor ordered.
  25. @Gianluca that video was really impressive. Looking at the edge of the Python key, it seems that there is a transition to the background at the edge, so it is most convincing when the background of the subject video is similar in color/lighting to the replacement background video. Maybe if you lit your subject with a light from one side, kept the background behind/on the other side of the subject darker, and then substituted a similarly lit and oriented virtual background, you would have something even more seamless. Lighting your subject in anticipation of the background lighting you plan to use with the Python script should look even better, if you can get the real and the virtual background lighting to match closely. For my application, I would like a projected background that is in sync with the camera and lens "position", with the replacement background synced in post with the camera movement in relation to the projected background, and then, using the Python keying script, to be able to transition from real to virtual with as little giveaway as possible. My goal is to seamlessly transition from a subject shot on a minimal projected set to an infinite virtual set, which is compatible with traditional lighting and shooting techniques. I'm looking forward to what you come up with next!
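For anyone curious about the soft edge-transition behavior discussed above, here is a minimal sketch of the idea in NumPy. To be clear, this is not Gianluca's actual script; the function names and the `low`/`high` thresholds are my own, just for illustration. The point is the soft ramp: instead of a hard foreground/background cut, pixels near the key threshold get fractional alpha, which is exactly why the edge blends best when the real and virtual backgrounds are lit similarly.

```python
import numpy as np

def soft_green_key(frame, low=0.3, high=0.6):
    """Return an alpha matte (0 = background, 1 = foreground) for an
    RGB frame with values normalized to [0, 1], keyed on how much the
    green channel exceeds the other channels.  `low`/`high` define a
    soft ramp so edge pixels get fractional alpha instead of a hard
    cut (hypothetical thresholds, tune per shot)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    greenness = g - np.maximum(r, b)  # > 0 where green dominates
    return np.clip((high - greenness) / (high - low), 0.0, 1.0)

def composite(fg, bg, alpha):
    """Blend the foreground over a replacement background using the
    soft matte; fractional alpha at the edges does the blending."""
    a = alpha[..., None]
    return fg * a + bg * (1.0 - a)

# Tiny 1x2 example: one pure green-screen pixel, one gray subject pixel.
frame = np.array([[[0.0, 1.0, 0.0], [0.5, 0.5, 0.5]]])
alpha = soft_green_key(frame)  # green pixel -> 0.0, gray pixel -> 1.0
```

A real keyer would also despill the foreground (pull residual green out of edge pixels), but this shows the core matte-and-blend step.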