AI's biggest bottleneck is no longer algorithms, but electricity, water, and silicon.
On the Dwarkesh Podcast, Dylan Patel explained the high-stakes race for physical infrastructure. Big Tech's $600 billion capex funds compute years in advance. AI labs need it now. OpenAI's early, aggressive deal-making locked in cheaper capacity, while a more conservative Anthropic must now hunt for last-minute chips at premium prices. This divergence reveals a new strategic layer: scaling AI is a war for depreciating physical assets.
That war is colliding with Earth's limits. This Week in AI host Philip Johnston noted that communities like Tucson, Arizona, are unanimously voting down gigawatt-scale data centers over water and energy concerns. The backlash is forcing a search for alternative locations, including space. Johnston's startup, Aethero, is launching an H100 GPU next week to test orbital data centers, betting that reusable rockets can make space-based solar cheaper than terrestrial farms.
Decentralization offers another path. On This Week in Startups, the founders of Hippius Subnet 75 pitched their service as a drop-in replacement for Amazon S3, distributing storage across a global network of hard drives. They argue centralization creates systemic fragility, a risk that grows as compute concentrates.
Meanwhile, the promised intelligence feels increasingly distant. Podcasting 2.0 dissected Sam Altman's vague retreat from defining AGI, noting that the business model is explicit: hook developers, then raise prices. The messy reality of local AI tooling, described on the show as "a pile of stinking bullcrap," contrasts with the corporate mystique.
The practical work is less about reasoning and more about memory. On TFTC, Brian Murray and Paul Itoi highlighted the daily frustration of AI assistants that forget everything between sessions. The next leap, they argue, won't be better language models but tools that remember, using graph databases to create persistent knowledge webs.
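The "knowledge web" idea above can be made concrete with a minimal sketch: facts stored as subject-relation-object edges that persist between sessions. This is an illustrative toy using SQLite, not how Murray and Itoi's tools work; class and method names here are hypothetical.

```python
import sqlite3

class MemoryGraph:
    """Toy persistent knowledge web: facts are (subject, relation, object)
    edges in SQLite, so they survive between assistant sessions.
    Illustrative only; production systems use dedicated graph databases."""

    def __init__(self, path=":memory:"):
        # Pass a file path instead of ":memory:" to persist across runs.
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS edges "
            "(subject TEXT, relation TEXT, object TEXT, "
            "UNIQUE(subject, relation, object))"
        )

    def remember(self, subject, relation, obj):
        # INSERT OR IGNORE makes repeated facts idempotent.
        self.db.execute(
            "INSERT OR IGNORE INTO edges VALUES (?, ?, ?)",
            (subject, relation, obj),
        )
        self.db.commit()

    def recall(self, subject):
        # Return every edge fanning out from a subject node.
        rows = self.db.execute(
            "SELECT relation, object FROM edges WHERE subject = ?",
            (subject,),
        )
        return list(rows)

g = MemoryGraph()
g.remember("user", "prefers", "dark mode")
g.remember("user", "works_on", "orbital data centers")
print(g.recall("user"))
```

The point of the graph shape is that recall is a traversal from a known node rather than a fresh conversation from zero, which is the gap the podcast guests describe.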
The industry is bifurcating. One path chases physical scale at any location, from orbital clusters to decentralized nets. The other path focuses on making the intelligence we already have actually useful. Both are reactions to the same truth: the software is hitting hardware walls.
Dylan Patel, Dwarkesh Podcast:
- "In some sense, a lot of the financial freakouts in the second half of last year were because OpenAI signed all these deals but they didn't have the money to pay for them."
- "Anthropic was a lot more conservative. They were like, 'We'll sign contracts, but we'll be principled.'"




