When an AI company signs for gigawatts of future compute, the story is bigger than one model provider. It tells us what the next decade of AI competition will be built on: access to power, chips, cooling, land, grid reliability, and enough capital to turn intelligence into infrastructure.
On April 6, 2026, Anthropic announced a new agreement with Google and Broadcom for multiple gigawatts of next-generation TPU capacity expected to come online starting in 2027. The company said the expansion would support frontier Claude models and rising customer demand. It also reported that Claude's run-rate revenue had surpassed $30 billion, up from about $9 billion at the end of 2025, and that more than 1,000 business customers were each spending over $1 million on an annualized basis.
The infrastructure story
Modern AI depends on more than clever algorithms. Frontier models need enormous training and inference capacity. As agents move into real business workflows, inference demand also grows: not just one prompt and one answer, but planning, tool calls, memory retrieval, document analysis, evaluations, and background work.
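The shift from one prompt and one answer to multi-step agents can be sketched with simple arithmetic. Every number below is an illustrative assumption chosen for the sketch, not a measured figure from any provider:

```python
# Back-of-envelope sketch of why agentic workflows multiply inference
# demand. All step counts and token sizes are illustrative assumptions.

def tokens_per_task(steps: int, tokens_per_step: int, overhead_tokens: int) -> int:
    """Estimate total tokens an AI task consumes across its model calls."""
    return steps * tokens_per_step + overhead_tokens

# Simple chat: one prompt, one answer (assume ~1,500 tokens total).
chat = tokens_per_task(steps=1, tokens_per_step=1_500, overhead_tokens=0)

# Agentic task: planning, tool calls, retrieval, evaluation. Assume
# 12 model calls of ~2,000 tokens each, plus ~6,000 tokens of retrieved
# documents and memory carried between steps.
agent = tokens_per_task(steps=12, tokens_per_step=2_000, overhead_tokens=6_000)

print(f"chat task:  {chat:,} tokens")    # 1,500 tokens
print(f"agent task: {agent:,} tokens")   # 30,000 tokens
print(f"multiplier: {agent / chat:.0f}x")  # 20x
```

Under these assumed numbers, one agentic task consumes about twenty times the inference compute of a single chat exchange, which is why agent adoption drives capacity planning even when user counts stay flat.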
That turns compute into industrial infrastructure, and the bottleneck list sounds familiar in Alberta: electricity, land, permitting, heat, resilience, and long-term capacity planning.
Why Canada is moving
Canada's Sovereign AI Compute Strategy is trying to answer the same question at a national scale. The federal strategy frames AI compute as the foundational infrastructure that powers AI and includes investments in both public and commercial systems. The Government of Canada has also opened the AI Sovereign Compute Infrastructure Program, which makes roughly $890 million available for the infrastructure build layer of a large-scale sovereign public AI supercomputer over seven fiscal years beginning in 2026-27.
For Canada, sovereignty is not a slogan. It means Canadian-located and Canadian-governed systems, data residency, operational control, and the ability for Canadian firms and researchers to build without depending entirely on foreign infrastructure.
Alberta's opening
The federal strategy explicitly points to Canada's advantages in energy, land, and climate. Alberta has all three, plus industrial experience building large physical systems. That does not mean every AI data centre should be built here. It does mean Alberta should be at the table when Canada decides where sovereign compute belongs.
- Energy: AI data centres require firm, reliable power and long-horizon grid planning, not wishful thinking.
- Industrial skill: Alberta understands infrastructure, permitting, operations, safety, and maintenance.
- Applied demand: energy, construction, agriculture, logistics, municipal services, and manufacturing all need practical AI systems.
- Data sovereignty: Canadian companies increasingly need AI options that keep sensitive data under Canadian governance.
Frontier AI may be invented in labs, but it will be scaled by whoever can build dependable infrastructure.
For Alberta businesses, the near-term lesson is not to build a data centre tomorrow. It is to understand that compute strategy will affect pricing, availability, data residency, and vendor risk. In other words: the AI stack is now part of business strategy.