The trillion-dollar AI buildout is slamming into a wall of transformers, plumbers, and copper cables. While software updates ship in weeks, physical infrastructure takes years, creating a supply choke that is reshaping the tech industry’s fundamental economics. According to David Sacks on The Conversation with Dasha Burns, the U.S. faces an 'infinite game' for global market share in which slowing domestic innovation merely offshores progress to adversaries.
Hyperscalers are responding by becoming energy companies. Microsoft, Google, and Amazon have announced combined capital expenditures nearing $800 billion for 2026, crushing free cash flow. As Chamath Palihapitiya noted on All-In, they are locking in long-term power purchase agreements at rates more than double the spot price just to guarantee supply. The asset-light software era is over; these firms are becoming highly leveraged, capital-intensive industrial utilities.
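The premium on those power purchase agreements can be sketched with back-of-envelope math. The 2x multiple comes from the discussion above; the spot price and campus size below are hypothetical assumptions chosen only to show the scale involved.

```python
# Back-of-envelope: what locking in power at 2x spot costs per year.
# SPOT_PRICE_USD_PER_MWH and CAMPUS_POWER_MW are assumed, illustrative figures.

SPOT_PRICE_USD_PER_MWH = 40.0   # assumed wholesale spot price
PPA_MULTIPLE = 2.0              # "more than double the spot price"
CAMPUS_POWER_MW = 1_000         # assumed 1 GW data center campus
HOURS_PER_YEAR = 8_760

annual_mwh = CAMPUS_POWER_MW * HOURS_PER_YEAR
spot_cost = annual_mwh * SPOT_PRICE_USD_PER_MWH
ppa_cost = spot_cost * PPA_MULTIPLE

print(f"Annual energy: {annual_mwh:,} MWh")
print(f"At spot:   ${spot_cost / 1e6:,.0f}M/yr")
print(f"Under PPA: ${ppa_cost / 1e6:,.0f}M/yr "
      f"(premium: ${(ppa_cost - spot_cost) / 1e6:,.0f}M/yr)")
```

Even under these modest assumptions, a single gigawatt-scale campus pays a nine-figure annual premium for guaranteed supply, which is why the spending shows up in free cash flow.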
The compute crunch is absolute. Baseten CEO Tuhin Srivastava reports his clusters run at utilization in the mid-90s percent with 'zero slack,' forcing three-to-five-year contract lock-ins with 30% upfront cash. This isn't a temporary shortage but a structural shift in which access to GPUs is itself a strategic barrier. The scarcity is so severe that even with this spending, firms are throttling service - Anthropic’s Opus 4.7 is currently 'compute-gated,' pushing users to older models.
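The contract terms above imply serious balance-sheet commitments. A hedged sketch, using hypothetical cluster size, rate, and term (only the 30% upfront figure and the 3-to-5-year range come from the source):

```python
# Illustrative cash math for a multi-year GPU reservation with 30% upfront.
# GPUS, RATE_USD_PER_GPU_HOUR, and TERM_YEARS are assumed figures.

GPUS = 1_000                  # assumed cluster size
RATE_USD_PER_GPU_HOUR = 2.0   # assumed reserved-capacity rate
TERM_YEARS = 3                # low end of the 3-to-5-year lock-in
UPFRONT_FRACTION = 0.30       # "30% upfront cash"
HOURS_PER_YEAR = 8_760

total_contract = GPUS * RATE_USD_PER_GPU_HOUR * HOURS_PER_YEAR * TERM_YEARS
upfront_cash = total_contract * UPFRONT_FRACTION

print(f"Total contract value: ${total_contract / 1e6:,.2f}M")
print(f"Cash due at signing:  ${upfront_cash / 1e6:,.2f}M")
```

Under these assumptions, a modest 1,000-GPU reservation requires roughly $16M in cash before a single token is served, which explains why GPU access now functions as a capital barrier rather than a line item.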
“If global data centers run on Huawei chips and DeepSeek models five years from now, the U.S. loses its primary lever of soft power.”
- David Sacks, The Conversation with Dasha Burns
Supply constraints are exposing physical bottlenecks far beyond silicon. Reiner Pope explained on the Dwarkesh Podcast that a key limit for scaling massive Mixture-of-Experts models is the physical space for copper cables within a rack, which dictates how many GPUs can communicate at high speed. Meanwhile, the immediate economic impact is inflationary, not deflationary. Steve Hou argued on Forward Guidance that the massive infrastructure spending competes for scarce tradespeople, boosting wages for electricians and plumbers by 25-30% while leaving the Fed no slack.
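The cabling constraint Pope describes can be made concrete: if every GPU in a rack needs a fixed number of point-to-point copper links, cable bulk grows linearly with GPU count and eventually exhausts the rack's routing volume. A minimal sketch, with all figures below being hypothetical assumptions for illustration, not real rack specifications:

```python
# Sketch of the copper-cabling constraint: cable cross-section per GPU times
# GPU count must fit within the rack's available cable-routing area.
# All three parameters are assumed, illustrative values.

LINKS_PER_GPU = 18        # assumed high-speed copper links per GPU
CABLE_AREA_CM2 = 0.5      # assumed cross-section per cable
AVAILABLE_AREA_CM2 = 700  # assumed cable-routing area in the rack

def max_gpus_per_rack(links_per_gpu: int, cable_area: float,
                      available_area: float) -> int:
    """GPUs that fit before the cabling area runs out."""
    area_per_gpu = links_per_gpu * cable_area
    return int(available_area // area_per_gpu)

print(max_gpus_per_rack(LINKS_PER_GPU, CABLE_AREA_CM2, AVAILABLE_AREA_CM2))
```

The point of the sketch is the scaling behavior, not the specific numbers: doubling per-GPU link count halves the GPUs a rack can hold, which is why interconnect density, not floor space, caps rack-scale model parallelism.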
Sacks advocates a federal regulatory framework focused on specific harms like child safety, pre-empting a patchwork of 1,200 state-level AI bills he sees as knee-jerk governance. His proposed 'ratepayer protection pledge' would ease permitting for data centers if companies bring their own power infrastructure, turning AI firms into energy providers that sell excess back to the grid. This is a pragmatic recognition that the buildout's fate hinges on local politics and physical logistics.
“We are currently over-provisioning on memory capacity just to get the necessary bandwidth. We have a surplus of space but a deficit of speed.”
- Reiner Pope, Dwarkesh Podcast
The competition extends beyond infrastructure. Sacks cited a stark optimism gap: 83% of Chinese respondents believe AI will be more beneficial than harmful, compared to under 40% of Americans. He views this cultural divergence as a bigger threat to U.S. leadership than any single technical setback. The race isn't just to build bigger models, but to define the global ecosystem in which they operate.
The path forward is through specialization, not sheer scale. Srivastava noted that over 95% of Baseten’s inference traffic uses custom-tuned models, not raw open-source weights. Companies like Abridge survive not by outspending frontier labs, but by owning deep user workflows - like clinician interactions - that generate proprietary training signals. This suggests the ultimate winners will control unique data loops, not just massive compute.





