The AI revolution is no longer just about code. It's about power grids, transformers, and data centers.
According to financial expert Jordy Visser on the *Bitcoin And* podcast, AI has hit its “physical limits.” This is forcing a massive investment pivot from software-centric companies to the raw materials and energy infrastructure that power them. The era of easy growth fueled by algorithms is giving way to a capital-intensive buildout of the physical world.
Elon Musk is taking this to its logical extreme. Convinced the legacy semiconductor industry is too cautious, he plans to build a “Terafab” - a single facility the size of three Central Parks designed to vertically integrate chip production. Brett Winton of ARK Invest, speaking on *FYI*, explained that Musk sees access to chips as the primary bottleneck for building galaxy-spanning intelligence. The project is a high-stakes move to force the entire supply chain to expand.
Brett Winton, FYI - For Your Innovation:
- Access to chips is his anticipated choke point because he believes he can launch terawatts of energy into space.
- He just needs terawatts of chips to accompany that energy to train and infer massively intelligent AI models.
Just as this industrial mobilization begins, political resistance is mounting. On *The AI Daily Brief*, Nathaniel Whittemore detailed a bill from Bernie Sanders and Alexandria Ocasio-Cortez that would pause all new data center construction in the U.S. The proposal recasts the infrastructure buildout from a technical issue into a populist one, creating a new layer of uncertainty for investors and builders.
Meanwhile, the people actually building the infrastructure are signing long-term deals. Michael Intrator, CEO of cloud provider CoreWeave, dismissed fears of rapid GPU obsolescence on the *All-In* podcast. He called the argument “nonsense” pushed by short-sellers, noting his clients sign five-year contracts and that prices for older A100 chips are actually appreciating. For Intrator, the sustained demand for inference - the practical application of AI models - proves this is a long-term capital cycle.
Michael Intrator, All-In:
- My take on the GPU depreciation debate is that it's nonsense.
- It's a debate that is being brought to the forefront by some traders who have a short position in the stock and are trying to talk it down.
Some companies are trying to engineer their way around the problem. Google's “TurboQuant” algorithm and Apple's strategy of “distilling” large models onto iPhones could reduce reliance on massive, centralized data centers. But these efficiencies are running up against an explosion in demand.
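The efficiency lever behind techniques like quantization can be illustrated with a minimal sketch. The article doesn't describe how TurboQuant works, so this is a generic symmetric int8 quantization example (not Google's actual algorithm): storing weights as 8-bit integers plus one float scale cuts memory 4x versus float32, which is why such techniques can shrink the hardware footprint of a model.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: map float32 weights
    to 8-bit integers plus a single float scale factor."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 tensor."""
    return q.astype(np.float32) * scale

# Hypothetical weight tensor standing in for one layer of a model.
weights = np.random.default_rng(0).standard_normal(1024).astype(np.float32)
q, scale = quantize_int8(weights)

# int8 storage is 4x smaller than float32 for the same tensor.
print(weights.nbytes // q.nbytes)  # 4
# Round-to-nearest keeps the reconstruction error below one scale step.
print(float(np.max(np.abs(dequantize(q, scale) - weights))) < scale)  # True
```

Real deployments layer refinements on top of this (per-channel scales, outlier handling, calibration), but the storage arithmetic is the same: fewer bits per weight means fewer GPUs, and less data-center capacity, per model served.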
The future of AI will be decided not just by better algorithms, but by who can secure the power and industrial capacity to run them.