The race to scale AI has turned into a scramble for physical resources, where money alone can't buy you a seat.
According to Dylan Patel on the Dwarkesh Podcast, Big Tech's massive capital expenditures are a multi-year bet, funding power turbines for 2028 and data centers for 2027. The AI labs need capacity now. OpenAI's aggressive early deal-making locked in cheaper cloud capacity, creating a decisive advantage. Anthropic's conservative financial stance left it exposed; its explosive growth now forces it to chase last-minute compute deals at premium prices.
The hardware constraint is driving innovation in how compute is structured. On This Week in Startups, the founders of the Hippius subnet described building a distributed cloud storage service on Bittensor's decentralized network, positioning it as a cheaper, more resilient alternative to Amazon S3. They argue that centralization creates systemic risk, while a distributed architecture offers inherent fail-safes.
This push for decentralization faces its own bottlenecks. On the Presidio Bitcoin Jam, the discussion highlighted that despite open-source models, real control in AI often sits with a few entities. Training data, compute, and distribution remain centralized, making true decentralization more aspiration than reality.
The competition is no longer just about models or algorithms. It's about securing the physical and structural foundations to run them. The winners will be those who control the pipes, not just the payload.
Dylan Patel, Dwarkesh Podcast:
- In some sense, a lot of the financial freakouts in the second half of last year were because, "OpenAI signed all these deals but they didn't have the money to pay for them…"
- Anthropic was a lot more conservative. They were like, "We'll sign contracts, but we'll be principled."