
Apple M1 - my take on it


Andrew Reid


I think it's important to note that although ARM is making these chips for Apple, they are still Apple-designed (1,000+ engineers just for that). I doubt similar chips will be made for Windows. These devices are going to trash similar setups in the Windows world in terms of performance per watt. It's about time they came back with something good, but I'll wait a while before buying anything myself. They'd better get their act together when it comes to keyboards too - the main reason I haven't upgraded anything.

It's really good news that they're using integrated graphics. I've lost count of how many laptop designs Apple has screwed up (incorrect transistors, etc.). I'd think they'll do better with integrated graphics.

Also, I'm tired of the iPad stuff - it's just not as capable without jumping through hoops. We STILL cannot sync external audio and video automatically!



I’m pretty jazzed about this announcement. That Mini is mighty tempting at $700 since I have everything else and already edit off external SSDs. H.265 across the board would be my preference for file size, and the performance so far looks pretty impressive. They ship next week, so we’ll see benchmarks soon.

Wonder if the M1 bits will be in the next iPad Pro. Do it, Apple. Then I can pass projects between the Mini and the iPad in FCPX.

I second replacing Big Sur with Apple OS and having the same OS across all devices. We’re certainly heading in that direction.

cheers

chris


I want to see Resolve running with 16 GB of RAM shared with the GPU... when for an 8K timeline the official BM recommendation is 24 GB of GPU VRAM and 64 GB of RAM! I bet that on these machines you can play back 8K ProRes videos in real time, as Apple says - on a FHD timeline in Resolve...


3 hours ago, John Matthews said:

I think it's important to note that although ARM is making these chips for Apple, they are still Apple-designed (1,000+ engineers just for that).

It's really good news that they're using integrated graphics. I've lost count of how many laptop designs Apple has screwed up (incorrect transistors, etc.). I'd think they'll do better with integrated graphics.

The first sentence doesn't seem quite right: ARM designs chips; they don't produce them. Apple licenses one (or more likely several) designs from ARM and combines them with their own designs, and possibly others, into the M1 chip.

IMO, integrated graphics is the way to go for laptops, but not desktop computers. One of Apple's problems has been that they went with Radeon and Intel Iris GPUs. Both Nvidia and standard Intel GPUs are much better supported than Radeon and Iris. Going for ARM processors, they might use something like the Mali GPU. It will be crucial that whatever GPU they use works well with Adobe Premiere Pro/Resolve, but it's hard to predict how it will go, as neither of those currently runs on the iPad Pro as far as I know.


6 minutes ago, UncleBobsPhotography said:

The first sentence doesn't seem quite right: ARM designs chips; they don't produce them. Apple licenses one (or more likely several) designs from ARM and combines them with their own designs, and possibly others, into the M1 chip.

IMO, integrated graphics is the way to go for laptops, but not desktop computers. One of Apple's problems has been that they went with Radeon and Intel Iris GPUs. Both Nvidia and standard Intel GPUs are much better supported than Radeon and Iris. Going for ARM processors, they might use something like the Mali GPU. It will be crucial that whatever GPU they use works well with Adobe Premiere Pro/Resolve, but it's hard to predict how it will go, as neither of those currently runs on the iPad Pro as far as I know.

Too late to edit, but it seems like Apple designs their own GPUs for the iPad, which makes it likely they do the same for the M1 chip.


After using my old one for 4 years, I replaced my 13" MBP with a new one only a few months ago.

I did this deliberately, knowing that the new architecture was coming, because I didn't want to be a beta tester.

I haven't read in detail what was in the announcement, but assuming that it was anything close to what was predicted, it's an interesting thing.
The things I think are most interesting are that by transitioning all the Apple hardware to ARM:

  • they can optimise the hell out of everything as they'll have control over the whole software/hardware stack
  • it essentially merges the hardware platforms of phones/tablets/computers, meaning one App Store, and app developers only having to write one version of their apps instead of two
  • all iPhone/tablet apps could be run natively on the computers
  • potentially all the computer apps would be able to be run on iPhone/tablets

The reason I waited is that it also means they'll have to rewrite OSX from the ground up, potentially putting into place a huge backlog of fundamental architecture changes that have been accumulating since OSX went to a Unix platform, which is going to be a huge and very messy process. It also means that every OSX program will have to be rewritten, or will have to run in an emulator. That's not something I want to beta test on something as sensitive to performance as video editing.

The end-game of this technology choice is that your phone will become your computer.
I've said this before, but imagine coming home, taking your phone out of your pocket and docking it; the dock provides power and also connects it to your monitors / peripherals / eGPU / storage arrays, and it goes into 'desktop' mode and becomes a 'PC'.
This might sound like science fiction, but I saw someone actually do this years ago, running Linux on an Android device - it had a tablet mode and a desktop mode, similar to how modern laptops with touchscreens now have a tablet mode and a PC mode.  Modern processors are good at being efficient while they sit almost idling in the background, but then turn into screaming power-thirsty race-horses when asked to do something huge (anyone whose phone has crashed will know it can drain the battery before you even notice anything is wrong).  When you dock it, full power becomes available and only cooling may be a limiting factor.

The other aspect that supports this is the external processing architecture that has been worked out.  OSX supports having many eGPUs, and Resolve will happily use more than one, although currently you don't get much improvement from having more than a few of them.  It's not inconceivable that in future an eGPU will be available that appears to the OS as a small cluster of eGPUs, with the computer simply becoming a controller.  When I studied computer science we wrote software for a couple of toroidal clusters, one of which had 2048 CPUs (IIRC).  The architecture is getting there, and video processing and graphics are a perfect application for it, as you can just divide up the screen, or separate tasks such as decompressing video / rendering the UI / scaling the video / doing VFX processing / colour grading / etc.
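To make the "divide up the screen" idea concrete, here's a toy Python sketch (the tile-splitting and grading functions are made up for illustration, and thread workers stand in for GPUs) showing how a frame can be carved into tiles and processed by a pool of workers:

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_tiles(frame, rows, cols):
    """Divide a frame (a 2-D list of pixel values) into rows*cols tiles."""
    h, w = len(frame), len(frame[0])
    th, tw = h // rows, w // cols
    return [[row[c * tw:(c + 1) * tw] for row in frame[r * th:(r + 1) * th]]
            for r in range(rows) for c in range(cols)]

def grade_tile(tile):
    """Stand-in for per-tile work (say, a colour-grade pass): brighten by 10."""
    return [[min(255, px + 10) for px in row] for row in tile]

frame = [[100] * 8 for _ in range(8)]      # a tiny 8x8 stand-in "frame"
tiles = split_into_tiles(frame, 2, 2)      # four regions of the screen
with ThreadPoolExecutor() as pool:         # each worker plays the role of one GPU
    graded = list(pool.map(grade_tile, tiles))
print(graded[0][0][:4])                    # -> [110, 110, 110, 110]
```

The tiles are independent, which is exactly why this kind of work spreads so well across a cluster of GPUs.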


Geekbench scores are out. Keep in mind these are synthetic benchmarks, but they offer a good estimate of performance. This doesn't take into account the M1's Neural Engine or hardware encoder/decoder acceleration.

The MacBook Pro with the M1 chip and 16GB of RAM posted a single-core score of 1714 and a multi-core score of 6802.

To put that into some perspective: a 2019 high-end 16-inch MacBook Pro earned a single-core score of 1096 and a multi-core score of 6870.
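Working out what those scores mean is just two divisions; a quick sketch using the numbers quoted above:

```python
# Rough speedup ratios from the Geekbench 5 scores quoted above
# (M1 MacBook Pro vs. the 2019 high-end 16-inch Intel MacBook Pro).
m1_single, m1_multi = 1714, 6802
intel_single, intel_multi = 1096, 6870

single_ratio = m1_single / intel_single   # ~1.56x faster per core
multi_ratio = m1_multi / intel_multi      # ~0.99x, i.e. roughly on par

print(f"single-core: {single_ratio:.2f}x, multi-core: {multi_ratio:.2f}x")
```

So roughly a 56% single-core lead over last year's top laptop chip, with multi-core parity despite far fewer high-power cores and a fraction of the wattage.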

16 GB is limiting. I wish they had a 32 GB option. However, combined with the extremely fast LPDDR memory, the unified SoC and I/O, blazing-fast SSD storage, and how macOS manages and maps memory, I suspect 16GB in Final Cut Pro is enough for most projects.

Looking good, especially performance per watt!


2 hours ago, kye said:

The reason I waited is that it also means that they'll have to re-write OSX from the ground up, potentially putting into place a huge backlog of fundamental architecture changes that have been accumulating since OSX went to a unix platform, which is going to be a huge and very messy process.

This process has been happening for 8+ years. iOS and OS X have essentially merged. OS X is no more; it is now macOS, and it shares more architecture in common with iOS.

2 hours ago, kye said:

Also, that means that every OSX program will have to be re-written, or will have to run in an emulator.

Apple is the master of architecture changes. They’ve done two in the past 15 years: PowerPC -> Intel -> ARM.

They have had universal binaries for years now, going back to the PowerPC-to-Intel switch. Universal binaries ship native code for both architectures in one “app bundle”. The result is native performance.

App developers that don’t recompile their code as universal binaries will have to use Rosetta to translate the code on the fly. This sounds bad, but it ties into the Clang JIT compiler infrastructure they have been developing for years. It’s world-class and one of the fastest JIT compilers in development.
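The translate-on-the-fly trick boils down to "translate each block of foreign code once, cache it, then reuse the cached version", which is why translated apps mostly run at near-native speed after warm-up. A deliberately simplified Python sketch (uppercasing stands in for real instruction translation; nothing here is Rosetta's actual mechanism):

```python
# Toy sketch of translate-and-cache, the basic idea behind binary
# translators: a block of "foreign" code is translated on first sight,
# then every later execution hits the cache instead of re-translating.
translation_cache = {}
translations_done = 0

def translate(block):
    """Stand-in for real instruction translation: just uppercase the text."""
    global translations_done
    translations_done += 1
    return block.upper()

def execute(block):
    if block not in translation_cache:   # translate only on first sight
        translation_cache[block] = translate(block)
    return translation_cache[block]      # cached result reused thereafter

for _ in range(3):
    execute("mov x0, x1")
print(translations_done)   # -> 1: the second and third runs hit the cache
```

The real translator deals with instruction semantics, memory ordering, and so on, but the pay-once/run-many shape is the same.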

Apple’s move to Apple Silicon has very much been in play for many years, and it’s finally a return to Apple developing both hardware and software from the ground up. That can be good or bad depending on your perspective (increased performance versus lock-in).

I am quite impressed with Apple’s steadfast march from acquiring an ARM CPU chipset developer to now leading the world in designing the fastest ARM CPUs out there. Fast enough to blow past even Intel’s best desktop CPUs at a quarter of the power and thermal envelope. A pretty amazing win for Apple and RISC-based ARM CPUs.


11 hours ago, UncleBobsPhotography said:

Too late to edit, but it seems like Apple designs their own GPUs for the iPad, which makes it likely they do the same for the M1 chip.

Sorry, my first sentence wasn't clear. ARM designs chips and outsources their production. Apple Silicon uses elements of the ARM architecture, but the chip is designed by Apple.

For those quoting specs like "only 16GB of RAM" with no way to add more, I don't think it matters quite as much, because everything (CPUs, GPUs, and RAM) is located on the M1 chip itself, making the chip much more effective at handling processes. Also, RAM and SSDs are getting closer and closer in speed. In short, this would not stop me from buying one of these machines.

Remember what Alan Kay once said: “People who are really serious about software should make their own hardware.” Steve Jobs quoted this many times. A machine with slower specs can outperform one with higher specs, given optimized software. I'm not an Apple software fanboy, as they've dumbed down so many of their products in recent years, making them much less capable (e.g. FaceTime, iTunes), but they'll always have a serious advantage over Wintel and AMD, especially in performance per watt.


12 minutes ago, John Matthews said:

everything (CPUs, GPUs, and RAM) is located on the M1 chip itself

Interesting. Was this in the presentation? 

 

17 minutes ago, John Matthews said:

Sorry, my first sentence wasn't clear. ARM designs chips and outsources their production. Apple Silicon uses elements of the ARM architecture, but the chip is designed by Apple.

ARM designs the cores. They also have designs for the (multi-core) processor function, but those are limited to connectivity architecture and instructions (DynamIQ vs. big.LITTLE, for example). So SoC designers (such as Samsung, Qualcomm, Apple, MediaTek, etc.) actually mix and match combinations of ARM processors, laying them out within the processor based upon the budget, power and processing requirements, the product it's made for, etc. Most of the engineering is figuring out the best combination of cores and instructions for the job, the size of the node, the sizes of the various caches, etc. Apple probably isn't doing anything significantly different from the other processor designers.

 

Apparently the MacBook Air's (performance cores?) were running at 3.20 GHz. I am wondering how many cores run at that frequency; I would guess 4 at most. That would be similar to the smartphone ARM processor structure, where the cores are divided between efficiency cores (mostly for browsing and other non-demanding functions) and performance cores (used for shooting photo and video, editing work, and everything else requiring processor-heavy work).

https://browser.geekbench.com/v5/cpu/4648107
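The efficiency-vs-performance split described above boils down to a scheduling policy: route light work to the frugal cores and heavy work to the fast ones. A toy Python sketch (the task costs and the threshold are invented purely for illustration, not anything Apple or ARM publishes):

```python
# Toy sketch of big.LITTLE-style dispatch: light tasks go to efficiency
# cores, heavy tasks to performance cores. Costs are in arbitrary units.
EFFICIENCY, PERFORMANCE = "efficiency", "performance"

def pick_core(task_cost, threshold=50):
    """Route a task to a core type based on its estimated cost."""
    return PERFORMANCE if task_cost >= threshold else EFFICIENCY

tasks = {"web browsing": 10, "email sync": 5, "4K export": 90, "photo edit": 70}
for name, cost in tasks.items():
    print(f"{name}: {pick_core(cost)} core")
```

Real schedulers also weigh battery state, thermals, and core load, but the basic routing decision looks like this.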


2 hours ago, sanveer said:

Interesting. Was this in the presentation? 

Yes, it was, between the 7-8 minute mark, where they mention that DDR4 is included on the M1 chip.

2 hours ago, sanveer said:

Apple probably isn't doing anything significantly different from the other processor designers.

You're probably right. However, what is different is the level of integration in terms of security and software. They're going to be tough to match.


There is an in-depth analysis of the M1 on Anandtech here - https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive

It confirms what the Apple presentation suggests (if you look very carefully at the diagrams/images) - the DRAM is not part of the M1 SoC itself, it's alongside it, with the M1 and DRAM placed very close together inside a single multi-chip package.

Given the chips they've been designing for years (for iPhones and iPads), I assume that Apple has an (expensive) ARM 'architecture license' so they can design their own CPU cores that conform to ARM's architecture and instruction set standards. Some of the other big ARM-based chip suppliers do the same - it means they can design and produce chips that lower-level ARM licensees don't have access to, thus potentially gaining a competitive advantage.


21 hours ago, John Matthews said:

I think it's important to note that although ARM is making these chips for Apple, they are still Apple-designed (1,000+ engineers just for that).

Actually, a correction: ARM is not making these chips for Apple. ARM doesn't make chips at all; they license their designs in two distinct ways.

The first way is licensing the full design of an actual chip, so all someone like HTC or LG has to do is pay the licensing fees, send the design to a fab (where chips are built), and they're done.

The second (and much less common) way is licensing what's called an "ISA" (Instruction Set Architecture), which is basically (to put it simply) the set of words/instructions that the "language" of an ARM chip speaks. Someone like Apple can license this ISA and then design and implement its own chip designs *from scratch*, and this is where things get interesting, as Apple has world-class, leading-edge engineers designing its own chips from ground zero. So although Apple's chips are "ARM-based", they are not the generic chips most other manufacturers buy, and this is where Apple shines. A great thing about this approach (which, as you can imagine, is very expensive) is that Apple can modify the chip as it sees fit, adding all the extra circuitry needed to optimize it for specific things (like macOS acceleration, real-time encryption, video encoding and decoding, Machine Learning, etc.).

Bottom line: even if you compare two ARM-derived chips running at the same frequency, say one from a generic Android device against an Apple-designed one, it is highly likely that Apple's chip will be more efficient in real-world terms.
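The ISA-versus-implementation distinction can be sketched in a few lines of Python: two "cores" that speak the same toy instruction set but execute it differently under the hood (a deliberate simplification with an invented two-instruction ISA, nothing like real ARM):

```python
# Two implementations of the same toy ISA (ADD/MUL on one accumulator).
# Both must produce identical results; *how* they do it is up to the
# chip designer - that freedom is what an architecture license buys.
class GenericCore:
    def run(self, program, acc=0):
        for op, n in program:
            acc = acc + n if op == "ADD" else acc * n
        return acc

class CustomCore:
    """Same ISA, different implementation: folds consecutive ADDs first."""
    def run(self, program, acc=0):
        folded, pending = [], 0
        for op, n in program:
            if op == "ADD":
                pending += n                  # merge runs of ADDs
            else:
                if pending:
                    folded.append(("ADD", pending)); pending = 0
                folded.append((op, n))
        if pending:
            folded.append(("ADD", pending))
        return GenericCore().run(folded, acc)

prog = [("ADD", 2), ("ADD", 3), ("MUL", 4)]
assert GenericCore().run(prog) == CustomCore().run(prog) == 20
```

Software only sees the ISA contract, so an optimized implementation stays fully compatible while doing less work internally.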

For those interested in this topic, I wrote a lengthy article about Apple Silicon. Below are the original link (in Spanish) and the Google-translated English version (the translation is surprisingly good):

http://eliax.com/index.cfm?post_id=11569

https://translate.google.com/translate?sl=es&tl=en&u=http%3A%2F%2Feliax.com%2Findex.cfm%3Fpost_id%3D11569

Hope that explains a bit of what's going on here (as a microprocessor engineer and computer scientist I know a bit about these things and the industry as a whole, and love to share my knowledge in layman's terms).

 

 


7 hours ago, John Matthews said:

Yes, it was, between the 7-8 minute mark, where they mention that DDR4 is included on the M1 chip.

Oh OK. I am guessing that it probably implies a smartphone-type setup, rather than a laptop or desktop type, in terms of the processor: it's an SoC, instead of merely a processor. One cannot really change the RAM on a smartphone. I obviously could be wrong, but Apple usually exaggerates things. Their Retina display was like 720p when many smartphones were offering QHD and more. So they're probably just selling smartphone SoCs like they invented the wheel.


4 minutes ago, sanveer said:

Oh OK. I am guessing that it probably implies a smartphone-type setup, rather than a laptop or desktop type, in terms of the processor: it's an SoC, instead of merely a processor. One cannot really change the RAM on a smartphone. I obviously could be wrong, but Apple usually exaggerates things. Their Retina display was like 720p when many smartphones were offering QHD and more. So they're probably just selling smartphone SoCs like they invented the wheel.

There's conflicting info on this. When I look at it, it seems like the RAM is part of the M1 chip. I've never had problems with their displays; for me, Retina means you can't see pixels, and I've been quite happy with them.


21 minutes ago, John Matthews said:

There's conflicting info on this. When I look at it, it seems like the RAM is part of the M1 chip. I've never had problems with their displays; for me, Retina means you can't see pixels, and I've been quite happy with them.

I guess more information on the SoC will slowly start trickling in, and we'll learn specs and performance figures. I am guessing their graphics processing should be enough for most tasks too. I personally want a huge tablet (18-inch?) on which desktop-grade editing would be possible.

 

Retina is a fancy name for not even giving customers 1080p and covering it with wordplay. That was smart, but dishonest.

