Lightmatter is building chips that use light, not electricity, to move data between processors. Co-founder Nick Harris, speaking on This Week in Startups, argued that copper wiring has hit a physical wall, creating a bottleneck for sprawling AI data centers. His company's photonic technology pushes 1.6 terabits per second over a single fiber, allowing GPUs to be spaced a kilometer apart while acting as one system. This structural advantage, Harris claims, can triple the training speed for massive foundation models.
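A back-of-envelope sketch puts those numbers in perspective. The only figure taken from the episode is the 1.6 Tb/s per-fiber rate; the fiber speed constant and the 10 GB payload are illustrative assumptions, not Lightmatter specifications:

```python
# Rough arithmetic on optical GPU interconnects. Assumptions: light in
# fiber travels at ~2/3 of c, and the 1.6 Tb/s figure applies to a single
# fiber link. The 10 GB payload is a hypothetical gradient shard.

C_FIBER_M_PER_S = 2.0e8      # approx. speed of light in optical fiber
LINK_BITS_PER_S = 1.6e12     # 1.6 Tb/s per fiber, as quoted

def fiber_latency_us(distance_m: float) -> float:
    """One-way propagation delay over the fiber, in microseconds."""
    return distance_m / C_FIBER_M_PER_S * 1e6

def transfer_time_ms(payload_bytes: float) -> float:
    """Time to push a payload over one 1.6 Tb/s link, in milliseconds."""
    return payload_bytes * 8 / LINK_BITS_PER_S * 1e3

# GPUs a kilometer apart: ~5 microseconds of pure light travel time.
print(f"1 km one-way latency: {fiber_latency_us(1_000):.1f} us")

# A hypothetical 10 GB shard takes ~50 ms over a single fiber.
print(f"10 GB transfer: {transfer_time_ms(10e9):.0f} ms")
```

Under these assumptions, the kilometer of distance adds only microseconds of delay, which is why the fiber's bandwidth, not its length, becomes the binding constraint.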
The race for faster AI hardware underscores a broader market shift toward deterministic enterprise applications. While generative AI captured headlines, the real money is in structured data: the rows and columns of spreadsheets and databases that power fraud detection and supply chains. On This Week in AI, Jeremy Fraenkel explained that traditional Large Language Models falter here because their outputs can change if you simply reorder table columns. His company, Fundamental, is building Large Tabular Models (LTMs) designed for consistency across billions of rows.
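The column-order problem Fraenkel describes can be seen in miniature. This is a toy sketch, not Fundamental's method: serializing a table into text the way an LLM prompt does preserves whatever column order the data arrived in, while a canonical encoding that sorts column names is invariant to reordering:

```python
# Toy illustration of column-order sensitivity (an assumption-laden
# sketch, not Fundamental's actual approach). Two tables with identical
# data but different column order produce different naive text
# encodings, but identical canonical ones.

def naive_serialize(rows: list[dict]) -> str:
    """Flatten rows to text in whatever key order the dicts carry."""
    return "\n".join(
        ", ".join(f"{k}={v}" for k, v in row.items()) for row in rows
    )

def canonical_serialize(rows: list[dict]) -> str:
    """Sort columns by name first, so the encoding ignores column order."""
    return "\n".join(
        ", ".join(f"{k}={row[k]}" for k in sorted(row)) for row in rows
    )

table_a = [{"amount": 120, "merchant": "acme", "flagged": False}]
table_b = [{"merchant": "acme", "flagged": False, "amount": 120}]  # same data, reordered

print(naive_serialize(table_a) == naive_serialize(table_b))          # False
print(canonical_serialize(table_a) == canonical_serialize(table_b))  # True
```

A model conditioned on the naive encoding sees two different inputs for the same table; an architecture built for tabular data has to bake in that invariance rather than leave it to prompt formatting.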
This pivot from creative to operational AI is reshaping corporate strategies. Victor Riparbelli pointed to OpenAI's discontinuation of its video model Sora as a lesson in focus. While OpenAI chased multiple modalities, Anthropic concentrated solely on B2B and code generation. That discipline paid off as Claude Code became a developer favorite, forcing OpenAI to retrench. The era of AI as a general-purpose toy is giving way to a battle for core business workflows.
The convergence is clear: the hardware to train models faster, the specialized models to exploit structured data, and the corporate focus to monetize it. Lightmatter's photonics offer a path past the energy and thermal limits of copper. LTMs promise to unlock the 80% of enterprise data currently ill-served by LLMs. And the companies that can execute on a narrow, high-value use case, like Anthropic did with coding, are pulling ahead. The next phase of AI won't be about what it can generate, but how reliably it can compute.
Nick Harris, This Week in Startups:
- We can actually 3x faster time to train.
- The first companies that adopt this photonic technology for linking up GPUs and AI data centers are going to have an enormous advantage.
Jeremy Fraenkel, This Week in AI:
- If you change the order of your sentence in Claude or ChatGPT, you can get a different output.
- But with tables, you actually don't want that.

