Posts posted by Shell64

  1. Fellow G7 user here. Honestly, I would put the money toward better glass, lighting, etc. This all depends on whether you already have those. If you don't, or what you have is lower end, consider going that route: maybe buy an 18-35 and a Viltrox Speed Booster, and an Aputure 120D or something. I've been tempted to upgrade my G7, but I've decided high-quality glass and lighting is the better investment.

    If you are fine with the lighting and lenses you have, I personally think your next camera really depends on your shooting style. Doing run-and-gun? Need a complete, ultra-reliable package with great IBIS? The GH5/G9 is for you. Don't need IBIS and want a larger sensor/better stills? The X-T3 is for you. Really, we can't determine this for you; it comes down to how you shoot.

    Honestly, I still think the G7 is fine. I would invest in better glass and lights; you will notice more improvement there.

  2. On 11/24/2019 at 6:36 PM, kye said:

    One thing about the eGPU is that there's kind of a limit to how much use it is depending on your CPU.

    I bought one hoping that I would be able to decode the 5K 10-bit h265 from the GH5 real-time, so I could edit without rendering proxies, but it's nowhere close.  It can't even do the 4K 10-bit h264 either, and the challenge is my CPU is the bottleneck, not the GPU.  I went through the Resolve support (which is excellent BTW) and they said that Resolve 16 had GPU accelerated decoding of h264/h265 on AMD, but that it was the first version so there might be optimisations in future versions, but 16.1 didn't change anything.
    I was poised to buy the RX 5700 as an upgrade (it's way faster than my 470) but I'm not sure I'd get any benefit really.  It would run OFX plugins faster and maybe export faster, but they're not things I care about really.  If I have to render proxies, then I can render 720p proxies and then the laptop is powerful enough to apply filters and grading in real-time without the eGPU so in that sense it doesn't really help that much.

    In this sense I'd rather that everything becomes GPU-based as you can upgrade an eGPU really easily but not a CPU, especially in a laptop.

    I'm anticipating that eGPU support and utilisation will improve over time as GPUs are now being pushed for AI, and we'll be piggy-backing off that.

    I remember in university I did a unit in programming parallel computers and we had access to a cluster that had 2048 CPUs but you had to program it explicitly to use that many CPUs and had direct control over which ones did what, rather than it sorting out which ones did stuff by itself.  They've largely worked those things out now (programming multi-core CPUs isn't done explicitly like it used to be, the software allocates tasks automatically) so I'm hoping that GPU usage will become the same with using multiple GPUs.

    This would mean that you can have a modest computer setup but connect it to a bank of however many eGPUs and everything will just organise the work amongst the GPUs and it will be like plugging in more horsepower to get the job done.  We're not there yet but we're getting closer with eGPU support now completely native in OSX.

    I recently got an RTX 2060 for my PC. It has an old i7-3770, so I thought the GPU would speed things up. It helps with effects, but playback is pretty much unaffected; the difference is very small. I am using the free version of Resolve, but I transcode to DNxHR.

    Having it all GPU-based would be fantastic; older systems like mine could still edit really well without having to go buy a new mobo. (A rough sketch of that automatic scheduling idea is below.)
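    The automatic task allocation kye describes is roughly what modern CPU task pools already do: you hand the runtime a pile of independent jobs and it decides which worker handles each one, instead of the old cluster-style explicit mapping. Here is a minimal Python sketch of that idea, with a made-up render_frame() standing in for real per-frame work (this is not Resolve's API, just an illustration of automatic scheduling):

        # Submit independent "frame" jobs to a pool and let the runtime decide
        # which worker picks up each one -- no explicit task-to-worker mapping.
        from concurrent.futures import ProcessPoolExecutor

        def render_frame(frame_number: int) -> str:
            # Stand-in for real per-frame work (decode, grade, encode).
            return f"frame {frame_number:04d} done"

        if __name__ == "__main__":
            # Worker count is chosen automatically from the available cores.
            with ProcessPoolExecutor() as pool:
                results = list(pool.map(render_frame, range(240)))  # ~10 s at 24 fps
            print(results[0], "...", results[-1])

    The hope expressed above is that multi-GPU support eventually feels the same: plug in more eGPUs and the scheduler spreads the frames across them, without the application being rewritten for a specific GPU count.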

  3. NVIDIA and Apple have had a bad relationship for a while. NVIDIA wanted low-level system access that Apple wouldn't allow, and neither side would budge. As a result, Apple uses AMD GPUs in all of its products, including the $6,000 Mac Pro. Now the relationship has practically ended: NVIDIA is done updating CUDA drivers for the Mac, which means CUDA support in software like Premiere will eventually stop working in future versions, leaving many NVIDIA Mac/hackintosh users screwed. This is unfortunate, but it was bound to happen.

    Here is the article:

    https://gizmodo.com/apple-and-nvidia-are-over-1840015246
