
Compute is the new oil

Anthropic's multi-gigawatt TPU agreement with Google and Broadcom is a reminder that frontier AI is not only a software race. It is a power, data centre, silicon, and sovereignty race.

April 25, 2026 · Opcelerate Neural News Desk · 8 min read
[Image: AI compute infrastructure and data centre systems]

When an AI company signs for gigawatts of future compute, the story is bigger than one model provider. It tells us what the next decade of AI competition will be built on: access to power, chips, cooling, land, grid reliability, and enough capital to turn intelligence into infrastructure.

On April 6, 2026, Anthropic announced a new agreement with Google and Broadcom for multiple gigawatts of next-generation TPU capacity expected to come online starting in 2027. The company said the expansion would support frontier Claude models and rising customer demand. It also reported that Claude's run-rate revenue had surpassed $30 billion, up from about $9 billion at the end of 2025, and that more than 1,000 business customers were spending over $1 million annualized.

2027: New TPU capacity expected to begin coming online.
$30B+: Anthropic's reported 2026 run-rate revenue.
1,000+: Business customers spending over $1M annualized.

The infrastructure story

Modern AI depends on more than clever algorithms. Frontier models need enormous training and inference capacity. As agents move into real business workflows, inference demand also grows: not just one prompt and one answer, but planning, tool calls, memory retrieval, document analysis, evaluations, and background work.
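The gap between a single chat turn and an agent workflow can be sketched with back-of-envelope arithmetic. Every number below is an assumption chosen for illustration, not a measured figure from the article or any vendor; the point is only that multi-step workflows multiply token demand:

```python
# Illustrative back-of-envelope only: all parameters are assumptions,
# not measured figures.

def tokens_for_chat(prompt_tokens=500, answer_tokens=500):
    """One prompt and one answer."""
    return prompt_tokens + answer_tokens

def tokens_for_agent_task(
    planning_steps=3,         # assumed: model reasons before acting
    tool_calls=5,             # assumed: each call is another model round-trip
    retrieved_context=4_000,  # assumed: memory/document tokens re-read per step
    tokens_per_step=800,      # assumed: average generation per step
):
    """A multi-step agent workflow with tool use and retrieval."""
    steps = planning_steps + tool_calls
    return steps * (retrieved_context + tokens_per_step)

chat = tokens_for_chat()
agent = tokens_for_agent_task()
print(f"chat turn:  {chat:,} tokens")
print(f"agent task: {agent:,} tokens ({agent / chat:.0f}x the chat turn)")
```

Under these made-up parameters, one agent task consumes tens of times the tokens of a single exchange, which is the shape of the demand curve that compute deals like Anthropic's are meant to absorb.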

That turns compute into industrial infrastructure. The bottlenecks start to sound familiar to Albertans: electricity, land, permitting, heat, resilience, and long-term capacity planning.

Why Canada is moving

Canada's Sovereign AI Compute Strategy is trying to answer the same question nationally. The federal strategy treats compute as the enabling layer of the AI stack and includes investments in both public and commercial infrastructure. The Government of Canada has also opened the AI Sovereign Compute Infrastructure Program, which makes roughly $890 million available over seven fiscal years, beginning in 2026-27, for the infrastructure build layer of a large-scale sovereign public AI supercomputer.

For Canada, sovereignty is not a slogan. It means Canadian-located and Canadian-governed systems, data residency, operational control, and the ability for Canadian firms and researchers to build without depending entirely on foreign infrastructure.

Alberta's opening

The federal strategy explicitly points to Canada's advantages in energy, land, and climate. Alberta has all three, plus industrial experience building large physical systems. That does not mean every AI data centre should be built here. It does mean Alberta should be at the table when Canada decides where sovereign compute belongs.

Frontier AI may be invented in labs, but it will be scaled by whoever can build dependable infrastructure.

For Alberta businesses, the near-term lesson is not to build a data centre tomorrow. It is to understand that compute strategy will affect pricing, availability, data residency, and vendor risk. In other words: the AI stack is now part of business strategy.

Sources checked

Anthropic: Google and Broadcom compute partnership

Government of Canada: Canadian Sovereign AI Compute Strategy

Government of Canada: AI Sovereign Compute Infrastructure Program

Opcelerate Neural helps Alberta companies turn AI infrastructure trends into practical software decisions. Start with a private AI readiness audit.