British chip designer Arm, which dominates the mobile chip market, said it will begin producing its own data center silicon for the first time, a major departure from its long-standing model of licensing chip designs to others as it seeks to capitalize on the AI boom.
The company introduced the "Arm AGI CPU," a processor designed for AI data centers that run continuously operating software agents capable of reasoning and executing tasks. The move positions Arm to compete more directly with CPU leaders such as Intel and Advanced Micro Devices, and to encroach on the turf of Nvidia, which builds GPUs and AI accelerators.
Arm said the rise of agentic AI is driving a surge in computing demand, particularly for CPUs that handle coordination, memory access and data movement. Data centers may need more than four times the current CPU capacity per gigawatt of power as AI workloads scale, the company said.
The new chip will feature up to 136 cores and is designed to deliver higher performance and efficiency within existing power constraints. Arm said it could provide more than twice the performance per rack of traditional x86-based systems, potentially reducing capital costs for large-scale AI deployments.
Meta is serving as a lead partner and co-developer, integrating the chip with its own AI infrastructure. Other companies, including OpenAI, Cloudflare and SAP, are expected to deploy the processor.
The announcement reflects a broader industry shift toward custom silicon as AI workloads strain traditional computing architectures. Major cloud providers such as AWS and Google Cloud have already developed in-house chips to improve performance and reduce costs.
More recently, Elon Musk announced plans to build an integrated chipmaking plant, Terafab. The facility aims to produce 1 terawatt of compute annually for Tesla, xAI and SpaceX.