NVIDIA is done with CUDA on Mac: the final nail in the coffin for NVIDIA cards on Mac


Shell64


NVIDIA and Apple have had a bad relationship for a while. NVIDIA wanted system access that Apple wouldn't allow, and NVIDIA wouldn't comply, so Apple moved to AMD GPUs for all of its products, including the $6,000 Mac Pro. Now the relationship has practically ended: NVIDIA is done updating CUDA drivers for the Mac. That means CUDA support in software like Premiere will eventually cease to function on future macOS versions, leaving many NVIDIA Mac/hackintosh users screwed. It's unfortunate, but it was bound to happen.

Here is the article:

https://gizmodo.com/apple-and-nvidia-are-over-1840015246



Well, at least GPU swaps are possible on self-assembled PC rigs. 

I haven't ever done it, but my understanding is that hackintoshes are more robust with AMD anyway...

Besides, if you have a PC that doesn't play nice with your preferred OS, just change the OS.

I'm not a big fan of Windows, but since all I'm doing on my PC build is Premiere, why really care?


I am running an MBP with a Sapphire AMD Radeon RX 470 in an external enclosure, eGPU style. It connects via a Thunderbolt 3 cable that charges the laptop as well as connecting to the eGPU.

It is plug and play, with the exception that you have to eject it before pulling it out (I did that by accident once and the graphics were all glitchy until I restarted - oops!), but it's pretty nice after that.  Resolve can auto-detect the GPU in the MBP and the eGPU and split the load between both (not that the MBP one is much help), and I have been gaming with the AMD eGPU as well on my external 32" panel.

I was a bit sad that Nvidia cards weren't supported, but I went AMD because I just wanted it to work, and it does.  The bang-for-buck ratio of the Nvidia cards was better than AMD's, but the new Radeon RX 5700 is about on par (IIRC), so I think that performance gap has closed.

I'm not a fan of manufacturers forcing customers into closed ecosystems, but overall my eGPU experience hasn't been too bad.


One thing about the eGPU is that there's kind of a limit to how much use it is depending on your CPU.

I bought one hoping that I would be able to decode the 5K 10-bit H.265 from the GH5 in real time, so I could edit without rendering proxies, but it's nowhere close.  It can't even handle the 4K 10-bit H.264, and the problem is that my CPU is the bottleneck, not the GPU.  I went through Resolve support (which is excellent, BTW) and they said that Resolve 16 was the first version with GPU-accelerated decoding of H.264/H.265 on AMD, so there might be optimisations in future versions, but 16.1 didn't change anything.
I was poised to buy the RX 5700 as an upgrade (it's way faster than my 470), but I'm not sure I'd really get any benefit.  It would run OFX plugins faster and maybe export faster, but those aren't things I really care about.  If I have to render proxies anyway, I can render 720p proxies, and then the laptop is powerful enough to apply filters and grading in real time without the eGPU, so in that sense it doesn't really help much.

In that sense I'd rather everything became GPU-based, as you can upgrade an eGPU really easily but not a CPU, especially in a laptop.

I'm anticipating that eGPU support and utilisation will improve over time, as GPUs are now being pushed for AI and we'll be piggy-backing on that.

I remember at university I did a unit on programming parallel computers, and we had access to a cluster with 2048 CPUs, but you had to program it explicitly to use that many CPUs and had direct control over which ones did what, rather than it sorting out the allocation by itself.  That's largely been worked out now (programming multi-core CPUs isn't done explicitly like it used to be; the software allocates tasks automatically), so I'm hoping that using multiple GPUs will become the same.
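To illustrate the shift described above: with a modern worker pool you just submit tasks and the runtime decides which worker runs what, with none of the old "CPU 17 takes frames 800-850" bookkeeping. A minimal Python sketch (the per-frame work here is a made-up stand-in, not a real grading operation):

```python
from concurrent.futures import ProcessPoolExecutor

def grade_frame(frame_no):
    # Stand-in for per-frame work (e.g. applying a filter or LUT);
    # here we just do some arithmetic so the example is runnable.
    return frame_no * frame_no

if __name__ == "__main__":
    frames = range(100)
    # The pool assigns frames to worker processes automatically,
    # and map() still returns results in submission order.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(grade_frame, frames))
    print(results[:5])  # -> [0, 1, 4, 9, 16]
```

The same `map`-over-a-pool pattern is what makes automatic scheduling possible: the programmer describes the work per item, not the placement of work on cores.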

This would mean you could have a modest computer setup but connect it to a bank of however many eGPUs, and everything would just organise the work amongst the GPUs; it would be like plugging in more horsepower to get the job done.  We're not there yet, but we're getting closer now that eGPU support is completely native in macOS.
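The "bank of eGPUs" idea boils down to a scheduler handing jobs to whichever devices are attached. No real GPU API is used here; the device names and job names are invented, and this pure-Python round-robin sketch only shows the shape of the idea:

```python
from itertools import cycle

def schedule(jobs, gpus):
    """Round-robin a list of jobs across a bank of (hypothetical) GPUs.

    Real frameworks would also balance by load and memory; this sketch
    just alternates jobs across devices to show the distribution idea.
    """
    assignment = {gpu: [] for gpu in gpus}
    for job, gpu in zip(jobs, cycle(gpus)):
        assignment[gpu].append(job)
    return assignment

print(schedule(["noise-reduce", "grade", "export", "scopes"],
               ["eGPU-0", "eGPU-1"]))
# -> {'eGPU-0': ['noise-reduce', 'export'], 'eGPU-1': ['grade', 'scopes']}
```

Plugging in another enclosure would just mean adding a name to the `gpus` list; the point is that the user shouldn't have to think about which device does which job.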


On 11/24/2019 at 6:36 PM, kye said:

One thing about the eGPU is that there's kind of a limit to how much use it is depending on your CPU. [...]

I recently got an RTX 2060 for my PC. It has an old i7-3770, so I thought the GPU would speed things up. It helps with effects, but playback is pretty much unaffected; very small difference. I am using the free version of Resolve, but I transcode to DNxHR.

Having it all GPU-based would be fantastic; older systems like mine could still edit really well without having to go buy a new mobo.


16 minutes ago, Shell64 said:

I recently got an RTX 2060 for my PC. It has an old i7-3770, so I thought the GPU would speed things up. [...]

I think it's about bottlenecks.  It's interesting to see what CPU and GPU load you get when doing various things.  I do think that Resolve does all the image processing on the GPU, but that might not include image file decoding, although I could be wrong.  I definitely get a boost when I go from my laptop's puny GPU to the eGPU (about 3x the FPS, although it's not enough, and it maxes the CPU and not the GPU), and if you're doing heavy image processing then you'll likely get a big boost from having serious GPUs.

If you're rendering proxies, then just render whatever resolution works smoothly for the amount of effects you're using, I guess.
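Resolve can generate proxies itself, but for anyone scripting it by hand, a hedged sketch of building a 720p DNxHR LB proxy command for ffmpeg (the file names are placeholders, and you'd want to confirm your ffmpeg build includes the `dnxhd` encoder):

```python
def proxy_cmd(src, dst):
    # 720p DNxHR LB proxy: scale to 720p keeping aspect ratio
    # (-2 keeps the width divisible by 2), DNxHR low-bandwidth
    # profile for light editing, PCM audio for easy scrubbing.
    return [
        "ffmpeg", "-i", src,
        "-vf", "scale=-2:720",
        "-c:v", "dnxhd", "-profile:v", "dnxhr_lb",
        "-c:a", "pcm_s16le",
        dst,
    ]

print(" ".join(proxy_cmd("clip.mov", "clip_proxy.mov")))
```

You'd run the returned list through `subprocess.run`; building it as a list rather than a shell string avoids quoting problems with file names containing spaces.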

A couple of docs that might be useful:

https://www.pugetsystems.com/recommended/Recommended-Systems-for-DaVinci-Resolve-187/Hardware-Recommendations

https://documents.blackmagicdesign.com/ConfigGuides/DaVinci_Resolve_15_Mac_Configuration_Guide.pdf


5 hours ago, Shell64 said:

I recently got an RTX 2060 for my PC. It has an old i7-3770, so I thought the GPU would speed things up. [...]

We upgraded our PC from an i7-4770 with a GTX 1070 to an i7-9700K with an RTX 2070 this year. Some rendering is about 4x faster (a 2-hour video usually took 2 hours to export on the old PC; on the new PC it's under 30 minutes), 5.7K 360 footage plays back in real time without lag, which was impossible on the older PC, and editing multi-stream 4K50p with a LUT is a piece of cake, while on the older PC one stream of 4K50p was already killing it.

 

Four cores are definitely the bottleneck; some stuff pushes even our 8-core i7 to the limit. A Ryzen 9 would definitely be better, or maybe an i9?

