Scarcity has a new address. Intelligence is now cheap; verifying its output is the only asset that matters.
On Bankless, MIT economist Christian Catalini frames AI as a deflationary shock to knowledge work. When generating a legal brief or a marketing strategy costs almost nothing, economic value concentrates in the human authority who signs off on it. This creates a ‘missing junior loop’ - AI automates the grunt work that once trained novices, starving the pipeline of future verifiers.
Even seasoned experts aren’t safe. Catalini notes that top professionals in law and finance are hired by AI labs to create evaluation data, effectively digitizing their intuition into the models that may one day replace their judgment. He dismisses the notion of irreplaceable human taste as ‘cope’ - to an economist, anything measurable can be replicated.
This need for deep, hard-to-replicate expertise is the core safeguard against AI’s dangers. On What Bitcoin Did, Junseth argues that without domain knowledge, users cannot spot catastrophic errors. He recounts asking an LLM chemistry questions and receiving formulas that would have caused explosions if followed. The machine is a tool for the expert, not a substitute for the education that produces one.
Junseth, What Bitcoin Did:
- The language of every single industry is not best spoken by an English major.
- The language of science is best spoken by a scientist.
The verification economy reorders traditional hierarchies. The bottleneck is no longer who can do the work but who has the authority to declare it finished. As AI agents proliferate, the human role narrows to that of final gatekeeper - the scarce, responsible adult in a room full of artificially intelligent children.
Christian Catalini, Bankless:
- If you're entry level, if you haven't really acquired that tacit knowledge... AI is out of the box often a good substitute for you across every domain.
- Everybody now has access to a pretty good marketer or pretty good engineering lead.