The AI race extends to AI chips – and British chip designer Arm is gearing up to make a big play. Arm is already the dominant chip designer for 99% of the world’s mobile devices, and there’s talk it plans to become not just a chip designer, but a manufacturer as well in 2025.
The AI Innovator recently spoke with Chief Commercial Officer Will Abbey to discuss the company’s plans for AI, not only for mobile devices but in data centers and other gadgets. We also talked about the coming competition from a group that backs the x86 computing architecture, which has been in place for decades.
What follows is an edited version of that conversation.
Arm is well-known in semiconductor circles but perhaps not outside them. Can you introduce our readers to Arm?
Will Abbey: Everything is a computer of sorts, whether it's a smart watch, a smart car, or a smart washing machine. And the brains of anything that's smart is the CPU. From our inception as a company, we were producing power-efficient, high-performance CPUs, and the first incarnation of the product we developed is what ended up in that iconic Nokia 6210 mobile device. … That's what paved the way and drove the smartphone revolution. Fast forward, where are we today? … The Arm partnership has shipped almost 300 billion devices.
… We have four lines of business. The first is anything that has a display. Building on the success of our mobile business, we expanded that to laptops, to watches, to TVs. That is one line of business, which we call client. … The second line of business is our infrastructure data center business. The third is IoT, which is anything low power – a consumer device that is connected to the internet. And last but not least is autonomy, including electric vehicles and robotics. …
In order to bring a CPU to life, in order to bring an architecture to life, you need a rich software development community. That's what paved the way for our success in the smartphone revolution and our success in all of the other markets I've talked about. We're proud to boast a 20-million-strong developer community that's creating content for Arm across all of those markets. So it becomes a flywheel effect, if you will.
In designing your chips, do you tap the outside developer community for help?
We don't make the chips. We license our technology to companies that build the chips. Let's look at an Android phone – take Samsung devices. They will contain a central processing unit, or an SoC, a system on a chip. We would license our designs, our platform, to a company like System LSI, which is a chip company within Samsung, and we would also license to companies like Qualcomm and MediaTek. They would then supply it to the handset manufacturer, and the handset manufacturer produces the handset that you and I would use. So we're somewhat removed in that respect.
But there is a trend taking place – whether it's on the PC side, the car side, or the phone side – that the manufacturers of those end devices are also looking at building silicon teams within their organizations. So we do supply to the SoC manufacturer, but we're also working very closely with the end equipment manufacturer to license the technology to them so they can build those chips themselves.
Can you give me a sense of the breadth of your market share?
In mobile, we're about 99%. Anyone building a mobile device will use Arm. The central processing unit is the applications processor – that's the main brains of the phone – but there are things like modems and touch screen controllers that also use Arm. So we have 99% market share on the mobile side. On PCs, especially with this new wave of PCs called AI PCs, there's a shift away from the traditional x86 chips … and so it's a growing market for us.
(In automotive,) we've been the architecture of choice for applications like body sensors, body control, security, infotainment, and, under the bonnet, engine management. But as the world has moved to electrification, that's provided a great opportunity for us to expand our market share into emerging use cases – ADAS, lane following, infotainment. As companies start to embrace more autonomy at Level 3 and beyond, we're ideally positioned to grow market share in those areas. (We have a) fairly significant overall market share in automotive.
In infrastructure, … every major hyperscaler has an Arm fleet that they are deploying alongside (traditionally dominant) x86 (processors), and we're growing market share in the data center space as well.
How are you positioning yourself in the AI chip market?
AI, fundamentally, is all about machine learning, and machine learning is all about matrix multiplications. There are a number of ways in which any company can provide technology to fuel AI applications: there is the CPU, there is the GPU, and then there's the NPU (neural processing unit). When we think about deployment today, there's training that takes place in the cloud – the data centers we've already talked about. Nvidia is a great example of training taking place in the data center. Grace Blackwell is a good example of that – their GPUs coupled with an Arm CPU complex.
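To make Abbey's point concrete, here is a toy sketch (our illustration, not Arm's) of the matrix multiplication at the heart of machine learning: a neural-network layer's output is essentially a weight matrix applied to an input vector, and CPUs, GPUs, and NPUs differ mainly in how fast they can churn through this one operation.

```python
# Toy illustration of the core ML operation: multiplying a weight
# matrix by an input vector. Plain Python, no libraries assumed;
# real hardware accelerates exactly this arithmetic at huge scale.

def matmul_vec(weights, x):
    """Multiply a weight matrix (a list of rows) by an input vector."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

weights = [[1, 2],
           [3, 4]]   # hypothetical layer weights
x = [5, 6]           # hypothetical input vector

print(matmul_vec(weights, x))  # [17, 39]
```

A trained model is billions of these multiply-accumulate steps, which is why the interview keeps returning to performance per watt.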
But when you start to think outside of that, into more personal devices like smartphones, there are a number of great use cases based on Arm today, such as audio processing and video processing. From an AI perspective, all of that is being powered by the CPU, because the compute performance required is sufficient to run it on a CPU; some of it runs on the GPU as well, where one is available for those use cases.
And then we start to look at digital televisions – great examples in terms of video, and the performance requirements for AI video correction are not that huge. … There are two things that are very popular right now that digital TV manufacturers are working on and will be deploying this year. One is live translation, or captioning. … You can run that off a CPU, and Arm is ideally positioned through the companies that are producing AI chips for live translation in DTV. Another good example is the ability to look at a scene and use AI to dynamically twist and shape the video so it's more immersive based on that content. Those are just great examples of using Arm CPUs to deliver the performance that's required.
Any comment on the x86 advisory group that Intel and AMD, Microsoft, HP, Google Cloud, and others, formed to go against you?
We've been very passionate about providing broad and flexible capability and access to our partners, so you don't need to come to Arm for an Arm chip – you can go to any number of silicon partners to source the chip of your choice. We're a big believer in standards. We're a big believer in choice. And so the fact that Intel and AMD are partnering to give customers choice is a huge endorsement of the approach that we've taken. We do believe that giving partners choice is a good thing. And competition is healthy for the whole ecosystem.
Do you plan to make your own chips at some point?
No comment on that.
What innovations do you have coming up that make Arm a go-to solution for high-performance AI computing in the cloud?
The reality of where AI is going is that the demand for performance, and the implications for power, are going to continue to soar, and we as a society are going to have to make informed choices: 'Do we want to keep our lights on, or do we want to keep compute running for AI?' The same way that the mobile revolution was made possible through high-performance, power-efficient designs wrapped around a rich and growing software development ecosystem, those things are going to become really important when we think about bringing AI to the masses.
We're continuing to improve our power-efficiency story, and we're continuing to drive for higher performance at the same time, whilst growing our software development community. Those three key elements – software development, power efficiency and performance – are going to be critical if we're serious about bringing AI to the masses. And that's what we're focusing on.
I guess you work with everyone, like TSMC, Broadcom, everybody, right?
It's hard for me to think of a silicon partner that we don't work very closely with, given that our network extends to over 1,000 partners.
Can you talk about new trends in AI chip designs, or just in AI generally that affect your industry?
This idea of shifting AI from the cloud to devices at the edge of the network – devices that you and I interact with – is a trend that's going to continue to increase. For that trend to be realizable, a couple of elements are really important. We've already talked about increased performance. Security of our personal data is another important aspect, and one of the things we haven't covered is how seriously we build security into our designs. So again: high performance, low power, and making sure that our data is secure. Those are the trends we're going to continue to see in 2025 and beyond, and Arm is focused on addressing all of them.