The Frontier

Your signal. Your price.

Dwarkesh Podcast 1d ago
  • The Michelson-Morley experiment (1887) did not prove the ether nonexistent. It only falsified certain ether theories, such as those predicting an ether wind. Michelson continued to believe in the ether until his death in 1931.

  • Nielsen argues that falsification in science is far more complicated than naive models suggest. The Michelson-Morley result did not by itself lead to special relativity; it merely ruled out some ether models while others remained viable.

  • Lorentz derived the mathematical transformations that form the basis of special relativity before Einstein, but interpreted them as physical effects of moving through the ether. His theory was experimentally indistinguishable from Einstein's until later tests, such as muon time-dilation experiments in the 1940s.

  • Scientific communities can converge on a correct interpretation before definitive experimental proof arrives, as with the acceptance of heliocentrism centuries before stellar parallax was measured in 1838.

  • Poincaré understood key postulates of special relativity but clung to a dynamical explanation for length contraction, which Nielsen suggests shows how deep expertise can sometimes obstruct fundamental conceptual shifts.

  • Nielsen argues that major bottlenecks in science occur where existing heuristics and processes no longer apply. Progress requires a diversity of research programs exploring many promising ideas, as with the different responses to anomalies in the orbits of Uranus (leading to Neptune) and Mercury (requiring general relativity).

  • The theory of natural selection emerged independently in the 1850s because necessary building blocks like deep geological time (established by Lyell in the 1830s) and global biogeography from colonial voyages were finally in place.

  • Nielsen contends AlphaFold's success is primarily a story of decades of expensive experimental data acquisition (the Protein Data Bank), with AI modeling representing only a small fraction of the total investment.

  • Nielsen sees complex AI models like AlphaFold not as classic explanations but as new types of objects. They may contain embedded explanations that can be extracted through interpretability work, or enable novel operations like merging and distillation.

  • Nielsen believes the technology and science tree is vastly larger than we realize, with different civilizations likely exploring different branches. This creates the potential for massive future gains from trade in ideas, not just resources.

  • The explosion of new fields like computer science shows diminishing returns arguments fail because unseen 'desserts' are constantly added to the buffet of knowledge, allowing fresh progress without mastering prior centuries of work.

  • Nielsen argues new fundamental primitives keep being discovered within established frameworks, like public key cryptography and blockchain ideas emerging decades after the Church-Turing thesis defined computation.

  • Quantum computing emerged as a field in the 1980s because two historically contingent trends matured simultaneously: the salience of personal computing and the new ability to manipulate single quantum states with ion traps.

End of 7-day edition — 13 results