The AI industry’s bottleneck is shifting from model intelligence to compute capacity. Anthropic’s projected $100 billion in annual recurring revenue, noted on All-In, exposes the limits of flat-fee consumer subscriptions. Its metered enterprise model lets revenue scale directly with usage: an "electricity model" that outpaces OpenAI's 3x growth.
"Anthropic is on track to hit $100 billion in ARR by year-end."
- David Sacks, All-In
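The contrast between the two revenue models can be sketched with toy numbers. Everything below is illustrative: the subscriber counts, fees, and per-token prices are assumptions for the sake of the arithmetic, not actual vendor pricing.

```python
# Toy comparison of flat-fee vs. metered ("electricity model") revenue.
# All figures are illustrative assumptions, not real pricing data.

def flat_fee_revenue(subscribers: int, monthly_fee: float) -> float:
    """Flat-fee revenue is capped by subscriber count, not usage."""
    return subscribers * monthly_fee * 12

def metered_revenue(tokens_per_year: float, price_per_million: float) -> float:
    """Metered revenue scales linearly with consumption."""
    return (tokens_per_year / 1_000_000) * price_per_million

# A flat-fee subscriber who doubles their usage pays the same amount...
flat = flat_fee_revenue(subscribers=1_000_000, monthly_fee=20.0)

# ...while a metered customer's spend doubles with their token volume.
baseline = metered_revenue(tokens_per_year=2e12, price_per_million=15.0)
doubled = metered_revenue(tokens_per_year=4e12, price_per_million=15.0)

print(f"flat-fee ARR:       ${flat:,.0f}")
print(f"metered (baseline): ${baseline:,.0f}")
print(f"metered (2x usage): ${doubled:,.0f}")

assert doubled == 2 * baseline  # revenue tracks usage directly
```

The point of the sketch: under a flat fee, heavier usage only raises costs, while under metering it raises revenue in lockstep, which is what makes the "electricity model" compound with enterprise demand.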
Meeting this demand requires power and land. Chamath Palihapitiya warns that frontier labs are hitting a physical limit: to survive, they must own their infrastructure. Relying on Amazon or Google creates a strategic dependency that competitors can throttle. Over 40% of contested data center builds are now being canceled, and Maine has banned new builds outright, citing grid strain.
The hardware race is intensifying. On This Week in AI, Chris Lattner argues that NVIDIA's dominance is a software lock-in problem, not a silicon lead: CUDA is a 20-year-old legacy system ill-suited to modern generative AI. Google, with seven generations of Tensor Processing Units, possesses better scale-out capabilities. Amazon’s Trainium and Inferentia chips are also gaining ground with high-end users like Anthropic.
"Google has been building Tensor Processing Units (TPUs) for seven generations and currently possesses better scale-out capabilities than NVIDIA."
- Chris Lattner, This Week in AI
China is proving that hardware sanctions haven't crippled its AI development. Z.ai’s open-source GLM 5.1, trained entirely on Huawei chips and capable of 1,700-step autonomous work cycles, has shrunk the performance gap with Western models to months.
Enterprise deployment is the next hurdle. Anthropic’s Managed Agents platform abstracts away the complex distributed-systems engineering that autonomy requires, turning it into a prompt-engineering task. Yet, as Nathaniel Whittemore notes on The AI Daily Brief, human oversight remains critical: agents write briefs overnight, but a sharp operator still tunes the prompt every Friday.
The labs winning this decade won't just have the best models. They'll control their power, bypass proprietary hardware silos, and deploy autonomous agents at scale.