DevonChris

Macbook Pro or Surface Book 2 for 4k editing?


2 hours ago, Anaconda_ said:

Does anyone know if the BM box is upgradable?

No, it isn't:

Quote

Can I upgrade the GPU chip in the Blackmagic eGPU?
No, the design has been optimized for quiet operation so it’s better suited for creative customers. This means the design is not a simple chassis with a PCIe card plugged in, but an integrated electronics, mechanical and cooling design that cannot have the GPU chip upgraded or changed.

As for the performance of the eGPU with Resolve, no guess work is needed:

Quote

What is the GPU used in the Blackmagic eGPU?
The Blackmagic eGPU is built around a Radeon Pro 580 GPU which is the GPU found in the high end 2017 model of 27 inch iMac.

... and Resolve performs as well as FCP X on those iMacs (and in some respects even better!).

Another thing to keep in mind is that, for GPU performance, this eGPU doesn't need a *new* MBP:

Quote

Does the Blackmagic eGPU work with all macOS computers?
The Blackmagic eGPU will work with all Thunderbolt 3 based Apple computers that are running macOS 10.13.5 or later. This includes MacBook Pro computers from the 2016 model year and later, 2017 iMac and iMac Pro.

10 hours ago, TwoScoops said:

Wonder if Apple will sell the new BMPCC like they did/do the Raven... 

That Apple and BM are still engaged in such joint ventures may be a good thing, insofar as, in my very humble opinion, ProRes RAW could replace CinemaDNG in the long run. Got both FCP and Resolve on your computer? Then you can download snippets of both from here.

Grant Petty argued that they could get CinemaDNG data rates down through compression to almost match those of PRR. Be that as it may, they won't be able to achieve the same performance that way. So it's possible that by Christmas time, when relevant numbers of BMPCC 4Ks are being shipped, they will have the PRR firmware ...

3 hours ago, Anaconda_ said:

That seems a bit limited compared to this though... maybe it has a better graphics card, but from where I'm sitting you can do more with the one @BTM_Pix shared earlier in this thread (https://www.scan.co.uk/products/gigabyte-aorus-gtx-1070-gaming-box-thunderbolt-3-rgb-graphics-box)

It seems smaller and can run an additional 4 displays. For myself, I like to edit with 3 screens, and the BM would only allow me to do that if I use the MacBook display as one of them. Which is fine I guess, but you can't beat 3 identical displays, all calibrated and coloured the same way.

Does anyone know if the BM box is upgradable?

Regarding the bold part: I would imagine BMD intends this to be used with DaVinci Resolve, and their stance is that you need an I/O box anyway because monitoring via the GUI isn't colour accurate due to OS drivers/profiles. So they probably expect one to use 2 screens plus a grading monitor via the Intensity Shuttle (which connects through Thunderbolt or USB).

1 hour ago, Phil A said:

So they probably expect one to use 2 screens plus a grading monitor via the Intensity Shuttle (which connects through Thunderbolt or USB)

I suppose so. But the other link I posted can still do that as well. I've only used Resolve a little, but I remember the dual-screen mode giving you full-screen windows on both monitors. I also like to have room for the script etc. So even with this Resolve setup and the BMD eGPU, 3 screens AND a colour-accurate monitor would fit the bill.

Anyway, it's kind of off topic, and comes down to personal preferences. For some reason, when I saw this product, I assumed it would be much better than the competition, but to me it seems the same or even less flexible for more cash.

27 minutes ago, Anaconda_ said:

Anyway, it's kind of off topic, and comes down to personal preferences. For some reason, when I saw this product, I assumed it would be much better than the competition, but to me it seems the same or even less flexible for more cash.

Yeah I’m a little underwhelmed by the Blackmagic eGPU as well, all that effort and cost and only the same GPU as the iMac? I hope there’s an option in the future for at least a Vega chip from the iMac Pro. 

2 hours ago, jase said:

now if they make FCPX work with that Blackmagic eGPU thingy...

I'm wondering whether this is where the link-up between the two might be going: when Apple does enable eGPU support in FCPX, it will only be for the BM product.

It makes sense from a support point of view to have only one variable, but also from a financial point of view if Apple are selling them in their own stores.


The eGPU is still a mess on the Mac side even though Apple has started to support it officially. While the macOS changes have made it easy to plug in an AMD eGPU, FCPX updates have actually made performance worse. Fortunately, there are still ways to use an NVIDIA eGPU on the Mac, but it requires custom scripts (not that hard) and, mainly, using Resolve or Premiere.

The new Blackmagic eGPU is a waste of money. I would strongly advise anyone against it, since it is $700 for an old, non-upgradable chip. If you want to go the eGPU route, get an enclosure and a GPU that you can upgrade. It will be cheaper and faster. I don't understand why they didn't use the Vega chip, or even better, 2 Vega chips.

I think the main factor in deciding on the computer is whether you want to use FCPX or Resolve. Both have strong points and it is up to each user to decide. If you go with Resolve, then there are laptops with better dGPUs (the Razer Blade is one example) and you can use an eGPU much more easily.

In a few weeks I will have my hands on the new MacBook Pro 15 and I will do some tests with the eGPU.


No one using the XPS 9570 here? Instead of upgrading my old MBP I'm about to go with the XPS. It seems to be almost as well built as the MBP, it's no gaming laptop design-wise, and it's cheaper than the Microsoft Surface. But I barely see people talking about it with Resolve. Maybe the 1050 Ti is not good enough for DNG footage, but it's certainly better than the Radeon 560.


The XPS 9570 still suffers from CPU and GPU throttling when run hard (like my 9560), but as it has Thunderbolt it can use an external GPU. These slim laptops are just not designed to deal with the thermal stress of sustained high CPU and GPU loads, but if you can offload this work to an external box, it may solve it. I have a 4-disk RAID0 on my 9560 via Thunderbolt which gives me 500 MB/s data rates, but the Thunderbolt chip side of the laptop gets very hot. If you have a big RAID storage box and an eGPU plus a laptop, you have to ask yourself why you are not using a small desktop (micro-ATX form factor) PC and a screen...
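As a rough sanity check on those numbers: a striped array's sequential throughput scales with the number of disks until the host link saturates. A minimal sketch, where the per-disk speed and usable Thunderbolt 3 bandwidth are illustrative assumptions rather than measurements:

```python
def raid0_throughput(disks, per_disk_mb_s, link_limit_mb_s):
    """Estimate sequential RAID0 throughput in MB/s.

    Striping across N disks scales roughly linearly with N until
    the host link (here, Thunderbolt) becomes the ceiling.
    """
    return min(disks * per_disk_mb_s, link_limit_mb_s)

# Assumed figures: ~130 MB/s per spinning disk,
# ~2500 MB/s usable Thunderbolt 3 bandwidth.
print(raid0_throughput(4, 130, 2500))  # 520 -- in the ballpark of the quoted ~500 MB/s
```

With those assumed figures, four disks land close to the ~500 MB/s reported above, well short of the link limit.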

21 minutes ago, Shirozina said:

The XPS 9570 still suffers from CPU and GPU throttling when run hard (like my 9560), but as it has Thunderbolt it can use an external GPU. These slim laptops are just not designed to deal with the thermal stress of sustained high CPU and GPU loads, but if you can offload this work to an external box, it may solve it. I have a 4-disk RAID0 on my 9560 via Thunderbolt which gives me 500 MB/s data rates, but the Thunderbolt chip side of the laptop gets very hot. If you have a big RAID storage box and an eGPU plus a laptop, you have to ask yourself why you are not using a small desktop (micro-ATX form factor) PC and a screen...

Out of likes. Well stated.


I agree with previous comments about this not being value for money. This is a new space, and early adopters will pay heavily for the privilege - either by navigating software limitations and customisations as @Don Kotlos mentioned, or by buying plug-in solutions that aren't great value from a power-to-dollar perspective.

For those unfamiliar with the eGPU space, this is a pretty good resource (this link is for Mac, but the site covers everything): https://egpu.io/setup-guide-external-graphics-card-mac/

9 hours ago, Shirozina said:

The XPS 9570 still suffers from CPU and GPU throttling when run hard (like my 9560), but as it has Thunderbolt it can use an external GPU. These slim laptops are just not designed to deal with the thermal stress of sustained high CPU and GPU loads, but if you can offload this work to an external box, it may solve it. I have a 4-disk RAID0 on my 9560 via Thunderbolt which gives me 500 MB/s data rates, but the Thunderbolt chip side of the laptop gets very hot. If you have a big RAID storage box and an eGPU plus a laptop, you have to ask yourself why you are not using a small desktop (micro-ATX form factor) PC and a screen...

Totally agree.

Basically everything is more expensive for laptops - just look at the prices for the BM external monitor cards, PCIe vs USB... The 4K PCIe card is $199, but I think the UltraStudio 4K is the cheapest external 4K converter, and it's $995!

I think the only reason not to have a desktop computer for editing is that laptops are portable. Obviously if you're a pro working from an office (with controlled lighting etc.) then this argument doesn't apply, but if you're like me and edit on the move, or you don't want to run two computers (and manage all the syncing that requires), then a laptop is the only option. Travel film-makers, YT creators with demanding publishing schedules, etc. are in this situation.

I read that more and more producers and directors want a colourist on set to provide feedback on the 'look' of footage, so flexibility might be worth something. Or, if you're at the lower end of the market and using Resolve as your all-in-one (I hear the media management features are excellent for ingesting footage) but can only afford one computer, then a laptop might be a compromise that makes sense.

I'm waiting for support for multiple eGPUs to take off; then it won't matter what the computer is, because you'll be plugging in 4 or 6 of them and have a real-time render farm. Resolve should be well suited to this, as I hear it's more reliant on GPU than CPU, and if they're partnering with Apple, that might give them access to the macOS bits that might need to change there too. Plus, the ability to sell multiple eGPUs to each person would be a huge deal.

9 hours ago, kye said:

I'm waiting for support for multiple eGPUs to take off; then it won't matter what the computer is, because you'll be plugging in 4 or 6 of them and have a real-time render farm. Resolve should be well suited to this, as I hear it's more reliant on GPU than CPU, and if they're partnering with Apple, that might give them access to the macOS bits that might need to change there too. Plus, the ability to sell multiple eGPUs to each person would be a huge deal.

I doubt the industry is going to go in the direction of multiple external GPUs for either gaming or video editing.

I find my 8-core 145 W CPU is the constraint on my PC rather than the Nvidia 1080, so I seriously doubt we will get to the stage where mobile CPUs are not constrained with multiple GPUs. (I am admittedly using Premiere with a compressed codec.)

BUT if you go here.....

https://www.pugetsystems.com/

...you will find a lot of testing and a lot of hardware recommendations. In general, they don't recommend multiple GPUs (these days), even with desktop CPUs.

In general, software developers don't have any real incentive to produce software optimized for multiple GPUs because those users are a very small demographic...


I think in the future there will be no CPUs at all in computers. They won't be needed. GPUs, for reasons way beyond my knowledge, are way better suited to massively parallel computations. I know they have tons more cores than any CPU, and do rewrites at unbelievable speeds. Intel and AMD are on their way out. Nvidia is just plain kicking ass as of late. And Resolve seems to be about the only NLE that is taking real advantage of GPUs.

17 hours ago, webrunner5 said:

I think in the future there will be no CPUs at all in computers. They won't be needed. GPUs, for reasons way beyond my knowledge, are way better suited to massively parallel computations. I know they have tons more cores than any CPU, and do rewrites at unbelievable speeds. Intel and AMD are on their way out. Nvidia is just plain kicking ass as of late. And Resolve seems to be about the only NLE that is taking real advantage of GPUs.

Not contradicting you, but is there a link to provide evidence of this? I was under the impression that Final Cut makes very efficient use of the GPUs Apple’s computers come installed with... Thanks.

54 minutes ago, jonpais said:

Not contradicting you, but is there a link to provide evidence of this? I was under the impression that Final Cut makes very efficient use of the GPUs Apple’s computers come installed with... Thanks.

You may be looking at the problem backwards. Apple computers may make fairly efficient use of GPUs with FCPX, but in general they don't have very powerful GPUs to start with. So a US$4,000 iMac has a US$250 Radeon 580 GPU. Probably the same reason the Blackmagic eGPU doesn't include a more powerful GPU.

5 minutes ago, Robert Collins said:

You may be looking at the problem backwards. Apple computers may make fairly efficient use of GPUs with FCPX, but in general they don't have very powerful GPUs to start with. So a US$4,000 iMac has a US$250 Radeon 580 GPU. Probably the same reason the Blackmagic eGPU doesn't include a more powerful GPU.

No, I understand the GPUs in Apple's computers aren't the best. I'm just asking about the NLEs themselves. Sorry if that wasn't clear.

8 minutes ago, jonpais said:

No, I understand the GPUs in Apple's computers aren't the best. I'm just asking about the NLEs themselves. Sorry if that wasn't clear.

And I am arguing that you are looking at the problem back to front. The 'reason' Apple computers don't use 'better GPUs' is that NLEs are not capable of taking advantage of them.

They make efficient use of what they have got, but if they had more, they wouldn't.

(Case in point: if I moved from a US$700 1080 to a US$3,500 Titan V 12 GB GPU, for Premiere on a PC there would be virtually no performance advantage.)

14 minutes ago, Robert Collins said:

And I am arguing that you are looking at the problem back to front. The 'reason' Apple computers don't use 'better GPUs' is that NLEs are not capable of taking advantage of them.

They make efficient use of what they have got, but if they had more, they wouldn't.

(Case in point: if I moved from a US$700 1080 to a US$3,500 Titan V 12 GB GPU, for Premiere on a PC there would be virtually no performance advantage.)

So you're saying that if you installed a Titan V 12GB, there would be a performance increase in Resolve, but not Final Cut, is that correct?

The logic that Apple doesn't put the very latest and greatest GPUs in their machines because of Final Cut doesn't hold water with me. First of all, their computers are used for purposes other than video editing. Secondly, they're always upgrading their machines with better GPUs.

2 minutes ago, jonpais said:

So you're saying that if you installed a Titan V 12GB, there would be a performance increase in Resolve, but not Final Cut, is that correct?

Broadly, what I am saying is this. CPUs and GPUs do fundamentally different things. GPUs have a lot of cores operating in parallel and are good at running essentially the same calculation over a lot of data very fast - hence graphics in games and bitcoin mining. CPUs have fewer, more flexible cores designed to juggle many different kinds of tasks at once, such as running an operating system.

In general, NLEs are likely to be CPU-constrained. (By that I mean: if your CPU is running at 100% and your GPU at 80%, putting in a 4x more powerful GPU will not help performance, as it will simply run at 20%.)
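The arithmetic in that parenthetical can be sketched as a toy model (function name and utilisation figures are mine, taken from the example in the post):

```python
def gpu_util_after_upgrade(cpu_util, gpu_util, gpu_speedup):
    """Toy model of a CPU-bound editing pipeline.

    If the CPU is already saturated, overall throughput cannot rise,
    so a GPU that is `gpu_speedup`x faster finishes each frame's work
    sooner and simply idles more: its utilisation drops proportionally.
    """
    if cpu_util >= 1.0:  # CPU-bound: frame rate is unchanged
        return gpu_util / gpu_speedup
    return gpu_util  # not CPU-bound: this simple model says nothing new

# The figures from the post: CPU at 100%, GPU at 80%, a 4x faster GPU
print(gpu_util_after_upgrade(1.0, 0.80, 4))  # 0.2
```

In other words, the expensive new GPU spends most of its time waiting on the CPU, which is why the upgrade buys almost nothing.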

So, to the extent that all NLEs are generally CPU-constrained, I think Resolve probably makes the best use of the GPU. In other words, a higher-end GPU might have a greater impact in Resolve than in FCP X.

You can go here if you want a lot of detailed analysis of this:

https://www.pugetsystems.com/

But the simple point is that NLEs are largely CPU-constrained, so 'more powerful' GPUs don't add much performance beyond a certain point.

