April 3, 2026

The Frontier

Your signal. Your price.

AI & TECH

Light Matter's photonic chips aim to triple AI training speed

Friday, April 3, 2026 · from 2 podcasts
  • Light Matter's photonic interconnect chips can link GPUs a kilometer apart, tripling foundation model training speeds.
  • Structured enterprise data, not text, is the next AI frontier, driving new 'Large Tabular Model' architectures.
  • Focus is winning: OpenAI killed Sora as Anthropic's focused B2B code tools captured high-value enterprise markets.

Light Matter is building chips that use light, not electricity, to move data between processors. Co-founder Nick Harris, speaking on This Week in Startups, argued that copper wiring has hit a physical wall, creating a bottleneck for sprawling AI data centers. His company's photonic technology pushes 1.6 terabits per second over a single fiber, allowing GPUs to be spaced a kilometer apart while acting as one system. This structural advantage, Harris claims, can triple the training speed for massive foundation models.
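The bandwidth comparisons quoted in the episode can be sanity-checked with simple arithmetic. A minimal sketch (the 1,600-homes comparison is from the podcast; the calculation itself is ours):

```python
# Back-of-envelope check of the bandwidth figures quoted in the episode.
single_fiber_bps = 1.6e12   # 1.6 Tb/s claimed for a single optical fiber
gigabit_home_bps = 1e9      # one home with gigabit internet

# How many gigabit homes one fiber could serve:
print(single_fiber_bps / gigabit_home_bps)  # 1600.0

m1000_bps = 114e12          # M1000 chip: 114 Tb/s aggregate bandwidth
# Aggregate bandwidth expressed in single-fiber links:
print(m1000_bps / single_fiber_bps)         # 71.25
```

The 1,600-homes figure checks out directly: 1.6 Tb/s divided by 1 Gb/s per home is exactly 1,600.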

The race for faster AI hardware underscores a broader market shift toward deterministic enterprise applications. While generative AI captured headlines, the real money is in structured data - the rows and columns of spreadsheets and databases that power fraud detection and supply chains. On This Week in AI, Jeremy Fraenkel explained that traditional Large Language Models falter here because their outputs can change if you simply reorder table columns. His company, Fundamental, is building Large Tabular Models (LTMs) designed for consistency across billions of rows.
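The column-order problem Fraenkel describes can be illustrated with a toy sketch. This is a hypothetical illustration of the concept, not Fundamental's actual LTM architecture: serializing a table row into prompt text makes the model's input depend on column order, while an unordered set of (column, value) pairs does not.

```python
# Toy illustration of permutation sensitivity vs. invariance for tabular inputs.
# Hypothetical sketch of the concept, not Fundamental's actual architecture.

def serialize_for_llm(row: dict) -> str:
    """Flatten a table row into prompt text; the result depends on column order."""
    return ", ".join(f"{col}={val}" for col, val in row.items())

def permutation_invariant_repr(row: dict) -> frozenset:
    """Represent the row as an unordered set of (column, value) pairs."""
    return frozenset(row.items())

a = {"amount": 42.0, "merchant": "acme", "country": "US"}
b = {"country": "US", "amount": 42.0, "merchant": "acme"}  # same row, columns reordered

print(serialize_for_llm(a) == serialize_for_llm(b))                    # False
print(permutation_invariant_repr(a) == permutation_invariant_repr(b))  # True
```

Reordering the columns changes the serialized prompt, which is why an autoregressive LLM can produce a different output for the same row; a permutation-invariant representation feeds the model the same input either way.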

This pivot from creative to operational AI is reshaping corporate strategies. Victor Riparbelli pointed to OpenAI's discontinuation of its video model Sora as a lesson in focus. While OpenAI chased multiple modalities, Anthropic concentrated solely on B2B and code generation. That discipline paid off as Claude Code became a developer favorite, forcing OpenAI to retrench. The era of AI as a general-purpose toy is giving way to a battle for core business workflows.

The convergence is clear: the hardware to train models faster, the specialized models to exploit structured data, and the corporate focus to monetize it. Light Matter's photonics offer a path past the energy and thermal limits of copper. LTMs promise to unlock the 80% of enterprise data currently ill-served by LLMs. And the companies that can execute on a narrow, high-value use case, like Anthropic did with coding, are pulling ahead. The next phase of AI won't be about what it can generate, but how reliably it can compute.

Nick Harris, This Week in Startups:

- We can actually 3x faster time to train.

- The first companies that adopt this photonic technology for linking up GPUs and AI data centers are going to have an enormous advantage.

Jeremy Fraenkel, This Week in AI:

- If you change the order of your sentence in Claude or ChatGPT, you can get a different output.

- But with tables, you actually don't want that.

By the Numbers

  • 70% of poll respondents think AI will decrease job opportunities
  • 30% of Americans are worried AI will impact their own job
  • $255 million: Fundamental's Series A raise
  • 90% of Fortune 100 companies use Synthesia
  • $100 million: Synthesia's ARR
  • $4 billion: Synthesia's valuation

Entities Mentioned

Amazon (Company)
Anthropic (Company)
Claude Code (Product)
FLOW (Tool)
Google Antigravity (Product)
Light Matter (Company)
Nvidia (Company)
OpenAI (Trending)
Qualcomm (Company)
Sora (Product)
Speed (Company)
Synthesia (Company)

Source Intelligence

What each podcast actually said

How 3 CEOs Use AI to Run $10B in Companies | This Week in AI · Apr 2

  • Jeremy Fraenkel's company Fundamental built a foundation model for tabular data, not an LLM, to address structured enterprise data in rows and columns.
  • Fraenkel states that large language models are designed for unstructured data like text and video, but most useful enterprise data is structured tabular data.
  • Fundamental's large tabular model architecture differs from LLMs because it is not autoregressive; changing column order in a table does not change the output.
  • Fraenkel claims LLMs are not suitable for deterministic predictive tasks like fraud detection, where output consistency is critical.
  • Fraenkel says traditional machine learning algorithms still outperform most LLMs for predictive tasks on tabular data.
  • Fundamental's Nexus model aims to unify various predictive use cases like credit card fraud and demand forecasting into a single, more accurate model.
  • Victor Riparbelli says Synthesia, an AI video platform for business, has 90% of Fortune 100 companies as customers.
  • Riparbelli says Synthesia's initial focus was enabling PowerPoint users to create video content, a demand they identified in 2022.
  • Riparbelli says Synthesia is developing real-time interactive video, where users can role-play with AI avatars, moving beyond broadcast video.
  • Nick Harris states that the classic rules driving computing progress, Moore's Law and Dennard scaling, are now over.
  • Harris says the future of computing relies on two things: building bigger computer chips and networking them together at high bandwidth.
  • Harris states copper interconnect limits GPU proximity in racks, while photonics allows GPUs to be separated by a kilometer and still act as a single system.
  • Nick Harris says Light Matter's chip with Qualcomm pushes 1.6 terabits per second over a single optical fiber, equivalent to 1,600 homes with gigabit internet.
  • Harris states Light Matter's M1000 chip has 114 terabit per second bandwidth, roughly equal to undersea cables connecting North America and Europe.
  • Nick Harris claims photonic technology can cut training time for large AI models by 3x, significantly accelerating the rate of AI progress.
  • Harris says Light Matter builds chips for hyperscalers like Google and Amazon, as well as for GPU and networking companies.
  • Jeremy Fraenkel states that exploring non-NVIDIA hardware like Amazon's Trainium chips is a priority to avoid dependency on a single hardware platform.

Also from this episode:

AI & Tech (11)
  • Jeremy Fraenkel states that 70% of people in a poll believe AI will decrease job opportunities, but only 30% of Americans worry it will happen to them.
  • Jeremy Fraenkel argues that the first major wave of AI automation is targeting cognitive work, not just physical labor.
  • Jeremy Fraenkel says his company Fundamental emerged from stealth as a unicorn 16 months after founding.
  • Fraenkel states Fundamental raised a $255 million Series A led by Oak with participation from Valor, Battery, and Salesforce.
  • Riparbelli states Synthesia has over $100 million in ARR and a $4 billion valuation after raising over $500 million.
  • Riparbelli argues OpenAI shutting down Sora shows the company learned the 'unteachable lesson' of focus the hard way.
  • Riparbelli claims Anthropic's success with Claude Code shows that focusing solely on B2B code generation is a highly valuable near-term strategy.
  • Jeremy Fraenkel notes that at a recent Lightspeed founder retreat, everyone was discussing Claude Code, not other AI products.
  • Victor Riparbelli outlines Synthesia's thesis that AI will drive the marginal cost of creating video and audio content to near zero.
  • Nick Harris explains that modern AI data center racks consume a megawatt of power and require reinforced concrete due to their weight and cooling needs.
  • Riparbelli estimates that generating a personalized one-hour movie with current state-of-the-art video models would cost around $700, making it commercially unsustainable.

How Focus Killed Sora and Saved Anthropic | This Week in AI with Victor Riparbelli, Nick Harris & Jeremy Fraenkel · Apr 1

  • Jeremy Fraenkel's company Fundamental builds foundation models for tabular data, a modality that differs from LLMs.
  • Large language models primarily solve unstructured data problems like text and images but do not impact structured row-and-column data.
  • Structured tabular data constitutes the vast majority of useful data for enterprises but never had its 'ChatGPT moment' until now.
  • A large tabular model differs from an LLM because it requires permutation invariance; column order should not change the output, unlike language.
  • Traditional machine learning algorithms still outperform LLMs for predictive tabular tasks like fraud detection or demand forecasting.
  • Synthesia, an AI video platform for business, has over $100 million in ARR and a $4 billion valuation.
  • Nick Harris's company Light Matter builds photonic interconnect technology to link AI chips, replacing copper with light for greater bandwidth and reach.
  • Copper's short reach forces AI racks to be packed densely at megawatt scales, creating cooling and infrastructure challenges.
  • Light Matter's chip with Qualcomm pushes 1.6 terabits per second over a single optical fiber, equivalent to 1,600 houses with gigabit internet.
  • Light Matter's M1000 chip has 114 terabits per second bandwidth, comparable to undersea cables connecting North America and Europe.
  • Most runtime for AI models on supercomputers is spent on networking and moving data between GPUs, not on compute.
  • Hyperscalers like Amazon and Google build custom chips to control costs, despite NVIDIA's CUDA software moat.
  • Synthesia's next product is real-time interactive video, where users role-play with AI agents, requiring high bandwidth and low inference costs.
  • Victor Riparbelli argues that manually building tools like a CRM often has a higher focus cost than the monetary savings from avoiding a subscription.
  • Jeremy Fraenkel's team built its own CRM called Fetch integrated into Slack, questioning the need for external tools at a small scale.
  • CEOs now use AI to summarize communications, keep strategic tension tight, and act as omnipresent managers across their organizations.

Also from this episode:

Startups (1)
  • Fundamental emerged from stealth as a unicorn just 16 months after founding with a $255 million Series A led by Oak.
Big Tech (1)
  • OpenAI shut down its Sora video model because it learned the lesson of focus, while Anthropic focused solely on code generation.
Coding (2)
  • Claude Code's rise has become a dominant topic in founder circles, indicating a major shift towards AI-assisted coding.
  • The central challenge with vibe coding is building a verification framework to ensure the generated software works correctly.
AI & Tech (2)
  • Whisperflow is a speech-to-text tool that outperforms others by fixing grammatical errors and allowing natural pauses during dictation.
  • AGI is a moving goalpost; technology that would have been considered AGI a decade ago is now seen as standard.
Labor (3)
  • A Quinnipiac poll shows 70% of Americans believe AI will decrease job opportunities, but only 30% are personally worried.
  • Jeremy Fraenkel argues AI automation is different because it automates cognition, not just physical labor, unlike past revolutions.
  • Victor Riparbelli is optimistic that future jobs will focus more on human enjoyment like dining and music, moving away from numerical work.