AI is colliding with its physical limits. Chase Lochmiller of Crusoe AI is spearheading projects like the 1.2-gigawatt data center for OpenAI and Oracle, a build that illustrates the immense power demands of today's AI. That scale underscores how much infrastructure is currently needed just to keep pace with demand.
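To put 1.2 gigawatts in perspective, here is a rough back-of-envelope sketch; the household and brain wattage figures below are assumed round numbers for illustration, not figures from the episode:

```python
# Back-of-envelope scale check (illustrative assumptions, not figures from the episode).
DATA_CENTER_POWER_W = 1.2e9   # 1.2 GW, as cited for the Crusoe-built campus
AVG_US_HOME_POWER_W = 1.2e3   # ~1.2 kW average household draw (assumed round number)
HUMAN_BRAIN_POWER_W = 20      # commonly cited ~20 W estimate for the human brain

homes_equivalent = DATA_CENTER_POWER_W / AVG_US_HOME_POWER_W
brains_equivalent = DATA_CENTER_POWER_W / HUMAN_BRAIN_POWER_W

print(f"~{homes_equivalent:,.0f} average US homes")   # ~1,000,000 homes
print(f"~{brains_equivalent:,.0f} human brains")       # ~60,000,000 brains
```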
Lochmiller's focus reflects a brute-force approach: scale up infrastructure and pour in energy. Yet as demand skyrockets, that strategy runs into sustainability limits. Enter Naveen Rao of Unconventional AI, who argues the real problem is an outdated computing model.
Rao critiques the industry's reliance on an 80-year-old computing paradigm designed for workloads very different from neural networks. He proposes circuits that directly mirror neuronal activity, and his target is a transformative leap: potentially a thousand-fold improvement in energy efficiency.
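As a rough illustration of what a thousand-fold gain would mean in energy-per-operation terms, consider the sketch below; the per-operation figures are order-of-magnitude assumptions for illustration, not numbers Rao cites:

```python
# Illustrative energy-per-operation comparison (order-of-magnitude assumptions only).
GPU_ENERGY_PER_OP_J = 1e-12    # assumed ~1 pJ per low-precision multiply-accumulate
BRAIN_ENERGY_PER_OP_J = 1e-15  # assumed ~1 fJ per synaptic event (commonly cited order)

efficiency_gap = GPU_ENERGY_PER_OP_J / BRAIN_ENERGY_PER_OP_J
print(f"Brain-like circuits would be ~{efficiency_gap:,.0f}x more efficient per op")  # ~1,000x

# At that efficiency, a 1.2 GW budget covers ~1,000x more operations per second,
# or today's workload would fit in roughly 1.2 MW.
equivalent_power_w = 1.2e9 / efficiency_gap
print(f"Equivalent power for today's workload: ~{equivalent_power_w / 1e6:.1f} MW")
```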
Rao's vision goes beyond matching the human brain's efficiency; unlocking synthetic intelligence, he argues, means exceeding it. With vast energy capacity available worldwide, the real constraint isn't availability but how effectively that energy is turned into computation.
Naveen Rao, This Week in AI:
- We're kind of thinking about the computer that we all know and love.
- It's something that's an 80-year-old paradigm.
