The era where raw compute power was the primary constraint on artificial intelligence is over. The new bottleneck, according to analysts, is human oversight. Nathaniel Whittemore explained on The AI Daily Brief that with state-of-the-art models cutting hallucination rates from 21.8% in 2021 to roughly 0.7% in 2025, the barrier has shifted from technical reliability to managerial judgment. When every employee can generate a 100-page memo with a click, the only remaining value is knowing which words to keep.
This creates a stark division between proficient users and skeptics. Whittemore's survey data shows 97% of his audience uses AI daily, with over 60% engaging in advanced agentic workflows. These power users treat AI as a suite of specialized tools, employing an average of 3.5 different models for distinct tasks. The skill lies not in perfect initial prompting (models now auto-refine messy instructions) but in iterative feedback and in knowing when to scrap the output.
Philosopher Bradley Rettler, on What Bitcoin Did, warns this convenience comes with a cognitive tax. He argues that outsourcing reasoning creates a dangerous loop: the more you use AI as a substitute for your own thinking, the worse you get at thinking yourself. Empirical studies bear this out: groups allowed to use AI for a task complete it faster, but perform markedly worse when later asked to do it unaided.
Bradley Rettler, What Bitcoin Did:
- If we give up doing that thinking, the AI just keeps reproducing what we've already done and we don't make progress.
The danger is a centralized thought monopoly. If humans stop contributing original ideas, AI systems merely repackage a sanitized average of past human thought, curated by a handful of tech companies. Yet Rettler also sees a countervailing force: AI’s ability to find novel connections across vast datasets is ushering in a golden era for philosophy, where semantic reasoning trumps syntactic mastery.
The consensus across these perspectives is that the AI stack now has a new, human layer. The work is no longer production but curation. As Whittemore put it, volume is free, making discernment the only scarce resource.
Nathaniel Whittemore, The AI Daily Brief:
- You absolutely do not need to know some complicated set of tricks to get a lot out of these models.
- The whole idea is that you just talk to them in English and they will figure it out.