At the core of this expansion is the GPU, which remains the dominant processor architecture for AI workloads due to its unmatched parallel processing capability and mature software ecosystem. Nvidia continues to hold an overwhelming share of this segment, with successive generations - from Hopper to Blackwell to Rubin and beyond - each delivering step-change improvements in compute density, memory bandwidth, and energy efficiency. AMD provides meaningful competition with its MI-series accelerators, while the broader landscape is being reshaped by hyperscalers developing their own custom silicon to reduce dependency on merchant chip vendors and lower total cost of ownership.
AI ASICs represent the fastest-growing processor category, as companies including Google, Amazon Web Services, Microsoft, and Meta invest heavily in purpose-built chips optimised for specific workloads such as inference, recommendation, and training. These internally developed accelerators - including Google's TPU series, AWS's Trainium and Inferentia, Microsoft's Maia, and Meta's MTIA - are increasingly displacing third-party GPUs for certain use cases, fundamentally altering the competitive dynamics of the market and creating a parallel ecosystem of chip co-designers and advanced packaging specialists.
The server CPU market, though more mature, continues to evolve rapidly. Intel and AMD maintain leading positions with their x86 architectures, but face mounting pressure from Arm-based alternatives championed by hyperscalers such as AWS with Graviton, Google with Axion, Microsoft with Cobalt, and Nvidia with Grace and Vera. RISC-V is also emerging as a credible contender for specific workloads, particularly as open-source hardware ecosystems mature. Meanwhile, FPGAs continue to serve niche roles in low-latency and specialised inference applications.
Underpinning all of this is a complex and increasingly strained supply chain. Advanced semiconductor manufacturing is concentrated at TSMC, Samsung, and Intel Foundry, with leading-edge nodes below 5nm commanding the majority of AI chip demand. High Bandwidth Memory, supplied primarily by SK Hynix, Samsung, and Micron, has emerged as a critical bottleneck, while advanced packaging technologies such as CoWoS are operating at near-full capacity. Hyperscaler capital expenditure continues to flow into data centre construction, power infrastructure, and silicon procurement at a scale that is reshaping global semiconductor supply chains.
Geopolitics adds a further layer of complexity. US export controls on advanced AI chips have accelerated China's drive toward semiconductor self-sufficiency, with domestic players such as Huawei HiSilicon, Cambricon, Biren, and Hygon developing increasingly capable alternatives. The bifurcation of the global AI compute market into US-aligned and China-domestic supply chains is one of the defining structural trends of the decade, with profound implications for technology strategy, investment allocation, and national industrial policy.
The Global Market for Computing and AI for Data Centers 2026-2040 is a comprehensive strategic intelligence report covering the full landscape of data centre processor technology, market dynamics, competitive positioning, and long-range forecasting through to 2040. Produced for technology executives, semiconductor investors, strategic planners, and policy analysts, the report provides the depth of quantitative rigour and qualitative insight required to navigate one of the most rapidly evolving markets in the global economy.
The report opens with a set of preliminary materials including a detailed glossary of technical terms and abbreviations, a clear articulation of research objectives and scope, biographical profiles of the authoring team, and a candid retrospective on previous forecast accuracy. This is followed by a three-page summary and a full executive summary designed for senior readers who require rapid orientation to the report's key findings without sacrificing analytical depth.
Chapter one establishes the macroeconomic and geopolitical context, examining global AI infrastructure investment trends, hyperscaler capital expenditure trajectories for both US and Chinese players, the evolving regulatory landscape including US export controls, and the widening technology divide between Western and Chinese semiconductor ecosystems.
Chapter two forms the quantitative heart of the report, delivering granular market forecasts from 2021 to 2040 across all major processor categories. Revenue, average selling price, unit volume, wafer consumption, and server tray forecasts are provided at the vendor, product, and technology node level, enabling readers to build detailed bottom-up views of market opportunity and competitive exposure. Separate analytical lenses are provided for CPU, GPU, and AI ASIC dynamics, including HBM-driven revenue disaggregation and compute die forecasting.
Chapter three addresses the market forces shaping demand, including the falling cost of generative AI inference and training, the emergence of agentic and physical AI, the compute demands of recommendation engines and coding assistants, the competition between LLMs and traditional search, and broader questions around the CapEx and OpEx economics of AI infrastructure. An exploratory section examines the longer-term possibility of space-based data centre architectures.
Chapter four maps the competitive landscape in detail, providing ecosystem maps for both the data centre processor supply chain and the foundation model developer community. It includes financial benchmarking of leading chip designers, a deep-dive case study on OpenAI's revenue and compute trajectory, comprehensive market share analysis, and a dedicated section on Mainland China covering domestic market sizing, hyperscaler demand, manufacturer profiles, and supply chain structure.
Chapter five delivers an authoritative review of technology trends across all processor categories, covering process node roadmaps, chiplet architectures, rack-scale system designs, memory and packaging technology, and emerging computing paradigms including photonics, neuromorphic, and quantum computing. Unique assets include a full AI ASIC technology specification database and a start-up landscape analysis.
The report concludes with a forward-looking outlook chapter presenting bull, base, and bear case scenarios for the market through 2031 and beyond to 2040, a comprehensive risk register, and strategic recommendations. An extensive company profiles section - covering 81 organisations with one dedicated page per company - rounds out the report, providing standardised strategic and financial snapshots of every major player in the ecosystem.
Report Contents include:
- Macroeconomic and Geopolitical Context
  - Global AI infrastructure and investment landscape
  - US and Chinese hyperscaler CapEx trends and projections
  - AI regulatory landscape and export controls
  - The US-China technology divide
- Market Forecasts (2021-2040)
  - Total data centre processor revenue forecast
  - GPU, AI ASIC, CPU and FPGA revenue forecasts
  - Average selling price (ASP) forecasts by vendor and product tier
  - Processor unit shipment forecasts
  - Wafer starts by technology node and foundry (TSMC, Samsung, Intel Foundry)
  - GPU and AI ASIC compute die forecasts
  - HBM-driven revenue separation
  - Server tray volume forecasts
  - Dedicated CPU focus and GPU/AI ASIC focus sections
- Market Trends
  - Cost of generative AI inference and training
  - From agentic AI to physical AI
  - Recommendation models for social networks
  - Coding assistants
  - Search engines vs. LLMs
  - OpenClaw
  - CapEx vs. OpEx in the generative AI era
  - The future of space-based AI data centres
- Market Share & Supply Chain
  - Data centre ecosystem map
  - Foundation models ecosystem map
  - US vs. China tech war timeline
  - Financial metrics of data centre chip designers
  - Case study: OpenAI revenue and gigawatt forecast
  - Market share analysis - CPU, GPU, AI ASIC, XPU co-designers
  - Mainland China focus: market size, hyperscaler demand, manufacturer profiles, supply chain
- Technology Trends
  - CPU: x86, Arm, RISC-V, workload specialisation
  - GPU: process nodes, chiplets, rack-scale architecture, HBM integration, interconnects
  - AI ASIC: hyperscaler roadmaps, start-up landscape, specification database, disaggregated inference
  - GPU vs. AI ASIC comparative analysis
  - Advanced packaging and HBM (HBM2E through HBM4), CoWoS, AI rack bill of materials
  - Emerging computing: photonics, neuromorphic, quantum
- Outlook
  - Market outlook 2026-2040 with bull/base/bear scenarios
  - Technology outlook 2026-2040
  - Key risks and opportunities
  - Strategic recommendations
- Company Profiles
Companies Mentioned (Partial List)
A selection of companies mentioned in this report includes, but is not limited to:
- 01.AI
- Achronix Semiconductor
- Advanced Micro Devices (AMD)
- AI21 Labs
- Alchip Technologies
- Aleph Alpha
- Alibaba Group / T-Head Semiconductor
- Amazon Web Services (AWS)
- Ampere Computing
- Anthropic
- Arm Holdings
- Axelera AI
- Baidu
- Biren Technology
- Broadcom
- ByteDance
- Cambricon Technologies
- Cerebras Systems
- China Mobile
- Cisco Systems
- Cohere
- CoreWeave
- d-Matrix
- DeepSeek
- Dell Technologies
- Enflame Technology
- Esperanto Technologies
- Etched
- Fujitsu
- Furiosa AI
- GlobalFoundries (GF)
- Google (DeepMind / TPU Programme)
- GrAI Matter Labs
- Graphcore
- Groq
- GUC (Global Unichip Corp.)
- Hewlett Packard Enterprise (HPE)
- HiSilicon Technologies
- Huawei Technologies
- Hygon Information Technology
- IBM
- Iluvatar CoreX
- Intel Corporation
- Kalray
- Lattice Semiconductor
- Lightmatter

