Gartner: AI chips power 21% surge in global semiconductor revenue to $793bn in 2025

Global semiconductor revenue rose 21% in 2025 to $793.4 billion, as the AI infrastructure boom pushed demand for accelerators, high-bandwidth memory and networking silicon to record levels, according to preliminary results from Gartner released on January 12, 2026.

The topline number matters, but the mix matters more. Gartner's core point is that the chip market's growth is no longer being driven by a broad rebound across categories; instead, it is being pulled forward by the specific "bill of materials" needed to build and run large-scale AI systems. That includes not only processors, but also the memory and interconnect required to keep data moving through massive clusters.

"AI semiconductors, including processors, high-bandwidth memory (HBM) and networking components, continued to drive unprecedented growth in the semiconductor market, accounting for nearly one-third of total sales in 2025," said Rajeev Rajput, senior principal analyst at Gartner. He added that this dominance is set to rise as AI infrastructure spending is forecast to surpass $1.3 trillion in 2026.

Nvidia extends its lead as memory becomes a frontline battleground

The biggest winner in Gartner's vendor ranking is Nvidia, which extended its lead over Samsung Electronics to $53 billion and became the first semiconductor vendor to cross $100 billion in annual chip sales. Nvidia's 2025 semiconductor revenue is listed at $125.7 billion, giving it 15.8% market share.

Samsung held the No. 2 spot at $72.5 billion. Gartner notes Samsung’s chip revenue was supported by memory growth, while its non-memory business fell year on year — a useful reminder that “AI-led” growth is not evenly distributed across every part of the chip stack.

SK hynix moved into the No. 3 position at $60.6 billion, up 37.2% year on year, which Gartner explicitly links to strong demand for HBM in AI servers. Intel fell to No. 4 at $47.9 billion and continued to lose share, ending 2025 at 6% market share — half of what it held in 2021.

The table also highlights sharp gains for suppliers tied to the data-centre upgrade cycle. Micron rose 50.2% to $41.5 billion, while Broadcom grew 23.3% to $34.3 billion — reinforcing Gartner’s argument that this cycle is being powered by compute plus memory plus networking, not compute alone.
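The vendor figures above are internally consistent, which is worth checking when working from preliminary data. A minimal sketch, using only the revenue numbers cited in this article (all in billions of US dollars), reproduces both Nvidia's reported market share and the size of its lead over Samsung:

```python
# Revenues in billions of US dollars, as cited from Gartner's
# preliminary 2025 results in the article above.
total = 793.4

revenue = {
    "Nvidia": 125.7,
    "Samsung": 72.5,
    "SK hynix": 60.6,
    "Intel": 47.9,
    "Micron": 41.5,
    "Broadcom": 34.3,
}

# Market share of each vendor as a percentage of total 2025 revenue.
share = {vendor: round(r / total * 100, 1) for vendor, r in revenue.items()}
print(share["Nvidia"])  # 15.8 — matches the share Gartner reports
print(share["Intel"])   # 6.0 — matches Intel's reported 6% share

# Nvidia's lead over Samsung in absolute revenue terms.
lead = round(revenue["Nvidia"] - revenue["Samsung"], 1)
print(lead)  # 53.2 — roughly the $53 billion gap
```

The same arithmetic confirms Intel's 6% share, which the article notes is half of its 2021 position.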

Why HBM is the story inside the story

Gartner’s release gives an unusually clear signal on what is becoming a strategic choke point. In 2025, HBM represented 23% of the overall DRAM market and surpassed $30 billion in sales, while AI processors exceeded $200 billion in sales. The message is that the AI buildout is now reshaping the memory market’s centre of gravity, turning HBM from a premium niche into a defining feature of industry growth.
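Those two data points also let readers infer the scale of the overall DRAM market, a figure Gartner's release does not state directly. A back-of-envelope sketch, assuming HBM sales of exactly $30 billion (the release only says "surpassed $30 billion"):

```python
# Back-of-envelope: if HBM was 23% of the DRAM market and topped
# $30 billion in sales, the implied overall DRAM market is at least
# roughly $130 billion. Figures in billions of US dollars.
hbm_sales = 30.0   # lower bound: "surpassed $30 billion"
hbm_share = 0.23   # "23% of the overall DRAM market"

implied_dram_market = hbm_sales / hbm_share
print(round(implied_dram_market, 1))  # 130.4
```

Since $30 billion is a floor rather than an exact figure, the true DRAM market implied by Gartner's numbers is somewhat larger than ~$130 billion.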

That shift matters because HBM is not simply “more DRAM”. It is tightly coupled to advanced packaging, manufacturing capacity and qualification cycles. For AI buyers — hyperscalers, sovereign cloud programmes and enterprises building their own AI clusters — the constraint is increasingly the ability to secure complete systems, not just GPUs. If the accelerator arrives without the right memory and networking ecosystem behind it, the performance and utilisation story breaks down quickly.

The “how”: AI changes the semiconductor market’s power map

Gartner’s data captures a broader strategic transition: AI is changing where pricing power sits.

In previous cycles, smartphones and PCs could set the pace for large parts of the industry. This cycle is being set by data centres and the AI workloads inside them. That shifts value toward vendors positioned in three places:

  • compute platforms that are widely adopted for training and inference,

  • specialised memory suppliers with HBM capacity,

  • networking and interconnect providers that keep clusters scalable and efficient.

It also helps explain why Gartner expects the AI-driven portion of the chip market to expand further. The firm says AI semiconductors are set to represent more than 50% of total semiconductor sales by 2029, indicating it sees AI not as a temporary surge, but as a structural reweighting of the market.

What to watch in 2026

If AI infrastructure spending does pass $1.3 trillion in 2026, the next year becomes less about whether demand exists and more about how resilient the supply chain and economics are under sustained pressure.

Three issues stand out.

First, capacity and qualification in HBM and advanced packaging will remain central, because they determine how fast full AI systems can ship, not just individual chips.

Second, the market’s concentration risk rises as growth becomes more dependent on a narrower set of AI-driven categories. Strong headline growth can coexist with weakness in segments that are not directly pulled by AI.

Third, the competitive dynamics between the top suppliers will increasingly be shaped by ecosystem control — software, platforms, system partnerships and the ability to deliver at scale — rather than pure silicon performance alone.
