04-20-2026

The Frontier

Your signal. Your price.

AI & TECH

AI inference costs break SaaS economics

Monday, April 20, 2026 · from 3 podcasts
  • Every new AI user adds direct compute costs, killing the zero-marginal-cost software model.
  • Enterprises now pay 10x more to replicate platforms with raw LLMs.
  • Investors flee legacy SaaS as AI agents scale without headcount.

The software industry’s foundational promise - near-zero marginal cost - has been nullified by AI inference. For decades, adding a user to a SaaS platform meant negligible additional expense. Now, each user triggers GPU cycles and token costs that scale linearly with growth. Anish Acharya on The Kevin Rose Show cited a founder needing $25 million just to support 100,000 monthly active users. Free-forever models are no longer viable; the math has changed.
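The anecdote above implies a startling per-user number. A rough sketch of the math, in Python: only the $25 million / 100,000 MAU figure comes from the episode; the token volume and per-token price below are hypothetical assumptions for illustration.

```python
# Back-of-the-envelope sketch of AI-era unit economics.
# Only the $25M-for-100K-MAU figure is from the episode;
# the per-token assumptions below are hypothetical.

def inference_cost_per_user(tokens_per_user_month: int,
                            usd_per_million_tokens: float) -> float:
    """Monthly inference spend attributable to one active user."""
    return tokens_per_user_month / 1_000_000 * usd_per_million_tokens

# Classic SaaS: marginal cost of one more user is effectively zero.
saas_marginal_cost = 0.0

# AI-native app (hypothetical usage profile):
# 5M tokens/user/month at $15 per million tokens = $75/user/month.
ai_marginal_cost = inference_cost_per_user(5_000_000, 15.0)

# The episode's anecdote implies roughly $250 per monthly active user.
implied_cost_per_user = 25_000_000 / 100_000

print(f"SaaS marginal cost/user: ${saas_marginal_cost:.2f}")
print(f"AI marginal cost/user:   ${ai_marginal_cost:.2f}")
print(f"Implied cost/user:       ${implied_cost_per_user:.2f}")
```

Whatever the exact usage profile, the point stands: marginal cost is no longer a rounding error, so free tiers become a direct function of token prices.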

This cost structure is reshaping venture incentives. Startups can spin up technical clones in hours, eroding code as a moat. But as Nathaniel Whittemore notes on The AI Daily Brief, the real bottleneck is no longer engineering - it’s network effects and data quality. The market’s pivot away from 'time savings' to 'Opportunity AI' reflects a deeper shift: companies now prioritize new capabilities over efficiency. GEO, or Generative Engine Optimization, is projected to hit $34 billion by 2034, signaling that firms are building workflows that didn’t exist before.

Investors have reacted swiftly. Legacy software stocks took a $400 billion hit this quarter, dubbed a 'SaaS apocalypse' by Whittemore. Block cut 40% of its staff, a signal that agentic automation is already displacing roles. Meanwhile, AI-native firms like Anthropic hit a $19 billion run rate, with Claude Code doubling revenue in two months. The capital isn’t disappearing - it’s migrating to tools that automate, not just assist.

"The moat was never actually the software. It was the network that ran away before the clones could catch up."

- Anish Acharya, The Kevin Rose Show

ServiceNow CEO Bill McDermott dismisses the doom narrative but confirms the economic shift. He argues that rebuilding deterministic workflows with LLMs is 10 times more expensive than using established platforms. A simple ServiceNow app replicated with a raw LLM would incur massive GPU and token costs, not to mention the cost of rebuilding tribal knowledge and integration logic. For enterprises, reliability trumps novelty - they’ll forgive a human error, but not a software failure.

McDermott sees a future of 2.2 billion digital agents entering the workforce, already evident in ServiceNow’s 90% AI-handled support cases. This decouples growth from headcount: companies can scale without hiring. HR and finance roles will shrink, while high-EQ and creative engineering roles rise. The new competitive edge isn’t automation alone - it’s integrating AI safely across hyperscalers, data silos, and legacy systems.

"The winner of the AI era won't be a single model, but the platform that integrates them all."

- Bill McDermott, No Priors

The platform layer is becoming the new battleground. McDermott positions ServiceNow as an 'AI control tower' using zero-copy data strategies to execute where data lives. This isn’t just integration - it’s defensibility. The complexity of connecting AWS, LLMs, and security stacks creates a moat that raw models can’t replicate. As AI reshapes work, the real question isn’t whether models are smart enough, but whether businesses can build systems that act reliably - and affordably.

Source Intelligence

- Deep dive into what was said in the episodes

How the Best Companies Use AI · Apr 19

  • Investors now fear AI is too effective, threatening the traditional SaaS business model with obsolescence.
  • Enterprises are abandoning 'time savings' metrics to focus on AI-driven revenue and new capabilities.
  • Anthropic’s standoff with the Pentagon signals a deepening rift over the weaponization of frontier models.

Network Effects, AI Costs, and the Future of Consumer Investing with Anish Acharya on The Kevin Rose Show · Apr 19

Also from this episode: (13)

Other (13)

  • Kevin Rose, an investor, notes that AI models have made coding accessible to him, eliminating the need to remember syntax and allowing creative ideas to be built without manual code review.
  • Anish Acharya argues that traditional software moats built on engineering effort are shrinking, with the time to copy a feature like Instagram filters down to 48 hours. Meanwhile, AI inference costs threaten venture economics by pushing consumer startups to skip early funding rounds.
  • Kevin Rose builds personal projects on the belief that information should be platform-agnostic, using markdown as a portable 'lowest atomic unit' for file structure, allowing data to flow freely across diverse platforms and formats.
  • Anish Acharya states OpenAI holds an 'unassailable advantage' in consumer AI with 950 million weekly active users, making it the largest AI consumer product by orders of magnitude and a dominant force in model quality.
  • Anish Acharya expresses skepticism about claims of an AI model being 'too dangerous to release,' suggesting alternative reasons such as offensive/defensive capabilities, GPU shortages, or strategic marketing to create 'aura farming.'
  • OpenAI has proposed heavy taxation on AI profits and a safety net fund to address job displacement, a concept Kevin Rose supports for re-distributing wealth back to the public.
  • Anish Acharya argues that 'universal basic purpose' is more critical than universal basic income, as societal unrest is less about insufficient money and more about people lacking important work or a sense of 'hero's journey.'
  • Anish Acharya predicts a 'barbell' effect in AI model pricing: cutting-edge models will become more expensive and potentially unavailable via API, while older models like GPT-4o will experience massive deflation, with token prices dropping 100x since release.
  • Kevin Rose is optimistic about hardware devices that encourage disconnection and shared reality, citing 'Tin Can,' a simple phone for children, as an example that fosters excitement and real connection.
  • Kevin Rose defines 'dark data' as tacit knowledge - like specific mechanical adjustments or optimal mic setup - that remains unrecorded but holds immense value for AI models if unearthed and integrated. He cites estimating Charlie Rose's table dimensions from screenshots as an example of back-calculating such data.
  • Kevin Rose predicts that within five years, humans will no longer write code, as AI will compile directly to binary, eliminating the need for coding languages and database selection decisions, thereby simplifying infrastructure.
  • Kevin Rose predicts that within three years, 50-100 new peptides will be discovered and introduced into humans, significantly altering longevity by acting as 'upstream regulators' for various bodily functions, such as healing injuries or improving sleep.
  • Anish Acharya predicts a 'uniquely American' four-day work week, driven by AI's radical productivity gains rather than societal shifts, similar to how Ozempic addressed diet issues instead of fundamental behavioral changes.
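Acharya's 'barbell' pricing prediction from the list above can be made concrete with a toy calculation. Only the ~100x deflation multiple for older models is from the episode; the dollar prices here are hypothetical placeholders.

```python
# Toy illustration of the 'barbell' pricing effect Acharya describes.
# Prices are hypothetical; only the ~100x token-price deflation for
# older models is taken from the episode.

launch_price = 30.0     # hypothetical $/1M tokens for an older model at release
deflation_factor = 100  # cited ~100x token-price drop since release
deflated_price = launch_price / deflation_factor

frontier_price = 60.0   # hypothetical premium for a cutting-edge model

print(f"Older model, today: ${deflated_price:.2f} per 1M tokens")
print(f"Frontier model:     ${frontier_price:.2f} per 1M tokens")
print(f"Price spread:       {frontier_price / deflated_price:.0f}x")
```

The widening spread is the point: commodity inference gets radically cheaper while frontier access becomes a premium (or unavailable) product.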

Scaling Global Organizations in the Age of AI with ServiceNow CEO Bill McDermott · Apr 17

  • Bill McDermott bought his first delicatessen business for $5,500, with the total cost rising to $7,000 with interest. He secured inventory on consignment by leveraging supplier relationships.
  • According to McDermott, for a simple application on ServiceNow's platform, it would cost 10 times more to try replicating it with a large language model. This factors in rebuilding costs, human capital, and GPU and token expenses.
  • McDermott argues enterprise buyers tolerate human error but will not forgive software for making mistakes, placing a premium on deterministic and context-aware workflow platforms over probabilistic LLMs.
  • ServiceNow processes more than 85 billion workflows and seven trillion transactions in-flight on its platform, representing major global brands.
  • McDermott states that cybercrime is the world's third-largest economy at $1 trillion per month, behind the US and China, prompting ServiceNow's expansion into security with Armis.
  • ServiceNow integrated its acquisition of security company Armis in 20 days, which McDermott cites as evidence of its engineering power to execute complex integrations quickly.
  • Within ServiceNow, AI agents now manage 90% of customer service cases, with only 10% requiring human intervention, shifting employee roles toward critical thinking and judgment.
  • McDermott says only about 11% of companies in Brazil have moved beyond the AI experimentation phase into mainstream deployment, illustrating the early-stage adoption curve globally.
  • Enterprise AI adoption varies by sector: public sector focuses on fraud prevention, healthcare on modernization, and financial services on headcount optimization and business model reinvention.
  • McDermott personally conducts 72 one-on-one conversations with quota-carrying sales representatives in a month to stay grounded in customer feedback and frontline insights.
Also from this episode: (1)

AI & Tech (1)

  • McDermott expects 2.2 billion AI agents to enter the workforce in the next few years, fundamentally changing headcount strategies. He foresees net new hiring dramatically decreasing as agents handle more work.