Why oh why is Apple switching to ARM?
Image credit: Apple Mac Mini on my desk this morning.

Before we look at Apple itself, it helps to see what was changing drastically, right in front of us, in the semiconductor industry.

EDIT: spooky coincidence? I added a note at the bottom with respect to ARM.

AMD

AMD was all but dead about five years ago. The x86 market is cutthroat: because reliability and performance are key, board certification for server and cloud chips takes at least a year. And history taught AMD a harsh lesson: if marketing overpromises, it destroys the brand in the highest-margin business, the servers.

Hence, they did it differently, a three step process:

  • Use the first Ryzen CPU to catch the spotlight: AMD is back in CPUs and GPUs. A cost-effective CPU, nothing more.
  • The second-generation Ryzen had to challenge mid-range processors with a better price tag.
  • The third-generation Ryzen has more cores, more performance per watt, and a price advantage as well. This is TCO, total cost of ownership, in the data-center market!
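The TCO argument in that third step can be made concrete with a back-of-the-envelope calculation. All figures below are hypothetical, purely to illustrate how purchase price and power draw combine into a data-center cost number:

```python
# Hypothetical TCO sketch: compare two server CPUs over a 3-year deployment.
# All figures are illustrative, not real AMD/Intel numbers.

def tco(chip_price_usd, watts, years=3, usd_per_kwh=0.10):
    """Total cost of ownership: purchase price plus electricity."""
    hours = years * 365 * 24
    energy_cost = watts / 1000 * hours * usd_per_kwh
    return chip_price_usd + energy_cost

# More performance at lower power and a lower price wins on TCO,
# even before counting rack space and cooling.
incumbent  = tco(chip_price_usd=8000, watts=205)
challenger = tco(chip_price_usd=5000, watts=180)
print(f"incumbent:  ${incumbent:,.0f}")
print(f"challenger: ${challenger:,.0f}")
```

Real TCO models also fold in cooling, rack density and software licensing per socket, which is exactly where more cores per watt pays off again.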

As an aside, #TSMC N7, the technology node AMD uses now, is more advanced than the 14nm++ node Intel uses for Xeon.

But it requires a different ASIC design approach because of manufacturing defects. Instead of one big die, the chips contain multiple "chiplets" connected by an interconnect or interposer. Hybrid chips become possible, where an I/O die in 12/14nm is combined in one package with two 7nm compute chiplets and a GPU die.
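The defect argument behind chiplets can be sketched with the classic Poisson yield model (yield ≈ e^(−D·A) for defect density D and die area A). The defect density and die areas below are made-up illustrative values, not TSMC figures:

```python
import math

def poisson_yield(defect_density_per_cm2, die_area_cm2):
    """Classic Poisson model: fraction of dies with zero defects."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

D = 0.5  # defects per cm^2 (hypothetical for a young 7nm-class node)

# One big 4 cm^2 monolithic die...
monolithic = poisson_yield(D, 4.0)

# ...versus 1 cm^2 chiplets. Bad chiplets are discarded before
# packaging, so a defect costs one small die, not the whole chip.
chiplet = poisson_yield(D, 1.0)

print(f"monolithic die yield: {monolithic:.1%}")   # ~13.5%
print(f"single chiplet yield: {chiplet:.1%}")      # ~60.7%
```

Small known-good dies are also why the compute chiplets can sit on the expensive leading node while the I/O die stays on a cheap, mature 12/14nm node.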

The once-in-a-lifetime zero-to-hero push AMD pulled off is proof, once again, of the mistake the conservative semiconductor industry keeps making.

People who coordinate best-in-class marketing with best-in-class silicon experts come back from the dead to the top.

On a smaller scale, Movidius, Habana Labs and other hardware startups show that in a rather short time (short by silicon R&D standards) they can bring out an amazing chip of incredible value, and get acquired for it.

Compare the size of the teams, the budgets and the timelines: they beat the big guns. They are not afraid of Goliath. By the time David is considered a competitor, it is already too late.

Apple

Apple has designed its own iPhone A-series processors for a long time, based on licensed ARM architectures.

Many have tried to build ARM-based servers; even AMD tried, in 2016 if I'm not mistaken. Today, x86 still rules the server market. But ARM has pushed from its traditional phone/tablet market into the notebook/laptop market. They keep trying to close in.

Ampere, the startup (not Nvidia Ampere), has an 80-core server chip with a single thread per core. If anything, it shows the continued push to challenge x86. And Apple is certainly interested in simplifying its underlying hardware systems. Today it ships ARM-based silicon in mobile and x86 in the MacBook, Mini and others. The push into NLP (Natural Language Processing) and image classification on our mobile devices is harder on x86 systems. Apple took GPU design in-house a few years ago. Because a GPU has a lot of parallel processing capability for pixels, it is used in academic circles to accelerate various AI algorithms.
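The reason GPUs accelerate these AI workloads is that image and neural-network operations are "embarrassingly parallel": the same tiny kernel runs independently on every element. A toy sketch (plain Python standing in for thousands of GPU lanes):

```python
# Toy illustration of GPU-style data parallelism: the same small
# per-element op applies independently to every pixel, so the work
# can be split across any number of parallel lanes.

def relu(x):
    """A typical per-element neural-network activation."""
    return x if x > 0 else 0

image = [-3, 5, -1, 8, 0, -7, 2]   # a tiny 1-D "image"

# Each output depends on exactly one input element: no ordering,
# no shared state -- which is why a GPU can do this in parallel.
activated = [relu(p) for p in image]
print(activated)   # [0, 5, 0, 8, 0, 0, 2]
```

Owning both the CPU and the GPU lets Apple decide where such kernels run, instead of negotiating that boundary with a third-party chip vendor.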

Look at #FANGMAN, an acronym from the investment world.

What do you know about Facebook? Pure software company?

WRONG! They have ASIC teams now; contractors make $100 per hour in the US.

What do you know about Amazon? Warehouses, delivery, AWS?

Well, AWS now has the second-generation Graviton2, a processor developed for Amazon based on ...? You guessed it: ARM.

Google? The TPU and the Edge TPU.

When you are valued at $1 trillion, you can design your own chips (possibly subcontracted, and manufactured in a foundry model) and NOT buy them from semiconductor giants. Remember the opening Apple had in the 5G battle versus Qualcomm? Apple finally acquired Intel's modem division and took 5G in-house.

The volume is in the millions of chips and that makes it very profitable to do so.
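A rough amortization model shows why volume is the deciding factor. All numbers below are hypothetical, only to show how the fixed design cost vanishes per chip at phone-scale volumes:

```python
# Hypothetical build-vs-buy sketch. NRE = non-recurring engineering
# (design team, masks, tape-out); all figures are illustrative.

def cost_per_chip(nre_usd, unit_cost_usd, volume):
    """Per-chip cost once the fixed design cost is spread over volume."""
    return nre_usd / volume + unit_cost_usd

NRE = 500_000_000      # a full custom SoC program (made-up figure)
FOUNDRY_UNIT = 50      # manufacturing cost per chip at a foundry
MERCHANT_PRICE = 150   # price per chip from a merchant silicon vendor

for volume in (1_000_000, 10_000_000, 100_000_000):
    own = cost_per_chip(NRE, FOUNDRY_UNIT, volume)
    print(f"{volume:>11,} units: in-house ${own:,.0f} vs buy ${MERCHANT_PRICE}")
```

At a million units the merchant chip wins; at a hundred million, the in-house design costs a third as much per chip, and you keep the margin and the roadmap.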

They own the processor and the hardware acceleration (a GPU is video acceleration in hardware) for any (AI) algorithm. Image classification in particular will be much faster on your MacBook, Mini and other machines. It simplifies the whole platform for apps as well. Apart from each device's sensors, the ARM processor and its acceleration will be similar across devices.

[another edit] And last but not least, the Mac App Store is somewhat dying; not a lot is happening there, because people need to port their software to the x86 platform, which is annoying. If those apps require little adaptation across iPhone, iPad and Mac, it is going to be one helluva better ecosystem. Handoff from iPhone to Mac to iPad, anyone? Recent iPads have the Sidecar feature (unfortunately not the Mac Mini in the picture at the top of this article; that is the 2012 version). Both macOS and iPadOS, and the underlying hardware, must support it.

This links back to the stall we saw in processor and GPU innovation. I am working today on a Mac Mini from 2012. Except for Sidecar, it still runs smoothly. The Belgian car brand Minerva made that mistake and went out of business. Their problem? Allegedly the cars didn't need to be replaced regularly; they kept going!

CONCLUSION/PREDICTION

Never before have innovation and disruption made such an impact on the semiconductor industry. AMD's pipeline is on fire; it will continue to pump out better silicon. But eventually it will slow down, because nobody wants to cannibalize their previous generation, and "good enough" will start to creep back in.

But that is not the case for #FANGMAN. They gather big troves of data that need processing. They will continue to find more cost-effective ways to turn software algorithms into Application-Specific Integrated Circuits for TCO optimization. And, with Jim Keller's surprise exit, the old conservative way of wasting billions and never seeing the obvious WILL make the market for hardware accelerators move from pure semiconductor giants to the new $1T club. They will outsource parts, I am sure; cost is always on an MBA's mind. But they will never return to buying their core processing power from chip vendors.

How did AMD do it? Why is ARM still pushing and getting closer to server chips? Why is Apple switching to ARM for the Mac? Why don't the big semiconductor companies change before it is too late? Don't they see the red flags everywhere?

Rest assured, they don't. As long as the ship floats, the bling bling keeps coming. Nobody cares about 2021; no executive is ever sure they will still be with the company next year. Meanwhile, software companies are going remote. They will attract the talent, the out-of-the-box innovators. I predict a further brain drain from traditional semiconductor companies to software companies and to hardware positions inside software companies. Hardware engineers are paid a lot more there than in traditional semi companies.

Backburning is a technique in forest management to use a controlled fire to get rid of dead trees and make room for new life!

If you are like me and point out big inefficiencies that could save millions of USD, then you will find out everything about the true nature of a concrete wall. It will be abundantly clear why bunkers are made of that material. We spot what others don't want to see.

BTW: I saw AMD's rise coming; all the signs were there. I have posted about it many times on the Q&A site Quora. #vlsi #semiconductors #semiconductorindustry #technology

If anything, Apple is focused on smart business, and simplifying while saving cost is smart.

EDIT June 23rd, 2020

After posting this, I had to take a few moments (so much intellectual thinking and stuff, you know how it goes) and was looking at some HPC news. Something posted 19 hours ago:

“It’s a very well balanced machine. It was designed to do supercomputing – that is to say, it wasn’t cobbled together from commodity processors and GPUs. It was designed specifically for this high end, high performance computing.”

The above is a quote from this article:

ARM-based Fugaku Supercomputer on Summit of New Top500 – Surpasses Exaflops on AI Benchmark - insideHPC

Never underestimate the power of ARM!

Lambert L.

Linux SysAdmin with a penchant for cyber security

4y

A question though, and this might be a dumb question coming from a young'un: If ARM/RISC/ASIC are superior and taking over the market, why didn't they take over the market decades ago? Why did x86/CISC dominate in the first place and dominate for so long? I feel like if RISC/ASIC are taking over now, they should have taken over a long time ago.

Bharathwaj T A

Principal Engineer - Physical Design

4y

Nice one

Américo Dias

Analog IC Design Engineer | MBA Candidate

4y

Very nice article Bert! Thanks
