07/08/2025

Next Big Shift in Search: From Product to Infrastructure

For 25 years, search meant going to Google’s input box and getting ten blue links. Over the past two years, search has shifted: now, you go to Google or SearchGPT and get ten paragraphs.

However, a bigger and less noticeable shift is underway in search. It doesn’t involve products from specific companies like Google or OpenAI; in fact, that’s exactly the point. Search is no longer a product you go to. Search is instead becoming internet infrastructure, embedded within millions of products without you even knowing.

From product to infrastructure

It should come as no surprise that AI is eating software. We believe that design tools like Figma, coding tools like Cursor, and knowledge management tools like Notion are no longer static tools you tell what to do; now they talk back. The world will soon be filled with intelligent products that you can chat with.

As intelligence is embedded into every product, so too will search.

That’s because AIs need to search. They can’t memorize all the world’s information. They are too small and static, while the world’s information is immense and ever-changing. LLMs need tools that can search over the world’s information. LLMs need search engines.

Companies are at different stages of integrating search into their AI products. We believe Cursor and Notion are early. Others will follow. Eventually, most software products will have search embedded inside them, and we’ll soon reach a day when more searches happen outside traditional search engines than within them.

A new world of needs

When search becomes infrastructure, many changes follow. Here are three:

  1. Fragmentation. Every product will have its own search and, therefore, its own specialized search needs. Some will want fast search; others will prioritize quality. Some want fun results; others want serious ones. Search tools will fragment and specialize to satisfy these varying needs.
  2. New revenue models. Traditional engines monetize clicks; embedded search monetizes outcomes. That shifts power to pure‑play search providers that sell retrieval‑as‑a‑service.
  3. Retrieval quality becomes the differentiator. Agents care about recall, latency, and structured entities, not ad-filled results or SEO clickbait.

It seems every software company has an AI strategy these days. They will soon need a search strategy too.

 

Opportunity #1: Web search for LLMs

LLMs need a live index of the public web, i.e., they need web search, and it needs to be accessible via API. That’s what companies like Exa are building: a search engine built from scratch that we believe is optimized for AIs. When AIs are the end-users, all aspects of search — crawling, embedding models, and interfaces — need to be rethought for optimal downstream LLM performance.

Addressable market?  Every product that uses LLMs.
Pricing?  Usage‑based, not ad auctions.
Benefit?  LLMs get high-quality, real-time knowledge.
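To make the pattern concrete, here is a minimal sketch of how a product might ground an LLM prompt with fresh web results. The `web_search` function is a stub standing in for an HTTPS call to a search API provider (such as Exa); the result fields and prompt format are illustrative assumptions, not any vendor's actual schema.

```python
from dataclasses import dataclass


@dataclass
class SearchResult:
    title: str
    url: str
    snippet: str


def web_search(query: str, num_results: int = 3) -> list[SearchResult]:
    """Stub for an API-based web search call.

    In production this would be an HTTPS request to a search
    provider; here it returns canned results so the sketch runs
    offline.
    """
    return [
        SearchResult(
            title=f"Result {i} for {query!r}",
            url=f"https://example.com/{i}",
            snippet=f"Snippet {i} about {query}.",
        )
        for i in range(1, num_results + 1)
    ]


def build_grounded_prompt(question: str) -> str:
    """Fetch fresh context, then splice it into the LLM prompt."""
    results = web_search(question)
    context = "\n".join(f"[{r.url}] {r.snippet}" for r in results)
    return (
        "Answer using only the sources below.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )


prompt = build_grounded_prompt("latest LLM context window sizes")
print(prompt)
```

The key design point is that retrieval happens at request time, so the model's answer can reflect information newer than its training data.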

 

Opportunity #2: Deep research for humans

LLMs enable deep research for humans that goes far beyond the traditional search engine’s half-second results. We believe that OpenAI’s deep research proves the appetite. We also believe that Exa’s Websets play here by unleashing an army of agents to provide a comprehensive set of information for the hardest searches. Both are now available as APIs to power new search products that can tolerate long-running jobs.

Addressable market?  Hundreds of millions of knowledge workers.
Pricing?  Subscriptions, monetizing saved hours rather than clicks.
Benefit?  10x reduction in research time.
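Because deep research runs for minutes rather than milliseconds, these APIs typically follow a submit-then-poll pattern rather than a single blocking request. The sketch below simulates that client flow with an in-memory job store; the function names and response shapes are hypothetical, not any provider's real API.

```python
import itertools

# In-memory stand-in for a deep-research API: jobs complete
# after a fixed number of polls instead of minutes of agent work.
_JOBS: dict[str, dict] = {}
_ids = itertools.count(1)


def create_research_job(query: str) -> str:
    """Submit a long-running research job; returns a job handle."""
    job_id = f"job-{next(_ids)}"
    _JOBS[job_id] = {"query": query, "polls_left": 2, "report": None}
    return job_id


def poll_job(job_id: str) -> dict:
    """Check job status; returns the report once the job is done."""
    job = _JOBS[job_id]
    if job["polls_left"] > 0:
        job["polls_left"] -= 1
        return {"status": "running"}
    job["report"] = f"Comprehensive report on: {job['query']}"
    return {"status": "done", "report": job["report"]}


# Client loop: submit, then poll until the long-running job finishes.
job_id = create_research_job("market map of retrieval-as-a-service vendors")
while (result := poll_job(job_id))["status"] != "done":
    pass  # a real client would sleep with backoff between polls
print(result["report"])
```

The asynchronous shape is what lets a research product trade latency for comprehensiveness, which is exactly the opposite trade-off from the half-second results page.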

 

Opportunity #3: Private data search for LLMs

Some corporate knowledge is locked behind SaaS silos, like Slack, Drive, and Salesforce. A security‑aware retrieval layer, combined with a chat interface, surfaces it intelligently and instantly. We believe that Glean’s momentum ($7.2 B valuation, June 2025) shows the scale.

Addressable market?  The $400 B+ enterprise productivity stack — every employee who asks, “where’s that doc?”
Pricing?  Per‑seat SaaS with natural upsell to workflow automation and agent workloads.
Benefit?  Employee productivity and preserving institutional memory.
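The "security-aware" part is the crux: access controls must be enforced inside the retrieval layer, so the LLM never sees text the requesting user cannot. Here is a minimal sketch of that idea, assuming a toy keyword matcher and a group-based ACL model; the field names and matching logic are illustrative, not how any particular product works.

```python
from dataclasses import dataclass, field


@dataclass
class Document:
    doc_id: str
    source: str  # e.g. "slack", "drive", "salesforce"
    text: str
    allowed_groups: set[str] = field(default_factory=set)


def search_private(corpus: list[Document], query: str,
                   user_groups: set[str]) -> list[Document]:
    """Keyword match, then drop anything the user may not see.

    Filtering by ACL before results leave the retrieval layer
    means the LLM only ever receives text the requesting user
    is permitted to read.
    """
    matches = [d for d in corpus if query.lower() in d.text.lower()]
    return [d for d in matches if d.allowed_groups & user_groups]


corpus = [
    Document("d1", "drive", "Q3 revenue forecast", {"finance"}),
    Document("d2", "slack", "Q3 revenue all-hands notes", {"everyone"}),
]
visible = search_private(corpus, "q3 revenue", {"everyone", "eng"})
print([d.doc_id for d in visible])  # the finance-only doc is filtered out
```

A production system would use semantic retrieval and per-document permission syncing from each SaaS source, but the invariant is the same: permissions are checked at query time, per user, before generation.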

 

How we think the dust will settle

  • Search APIs win big: The most valuable search companies may never display a results page. Instead, they’ll be an invisible infrastructure layer powering a new world of agents.
  • Fragmented search ecosystem: Expect many winners. API search infra companies power millions of products in the background. Chat apps are the frontend layer for most knowledge workers. And, we think Google continues to lead free, ad‑driven consumer search.
  • A more informed world: Search is embedded into every app, every agent, every context window. Products become more trustworthy and wise as they incorporate the world’s information into each token.

 

I’m biased — Lightspeed led early investments in Exa and Glean — but it’s the broader thesis that drives the conviction. As search melts into the operating system of both humans and machines, we’ll need specialized infrastructure to keep our knowledge fresh, comprehensive, and trustworthy.

If you’re building the plumbing that will feed the next trillion agent requests — or the interfaces that will make them usable — I’d love to chat.

Search isn’t a product anymore. It’s the water main. The question is: Who owns the pipes?

 

 

The content here should not be viewed as investment advice, nor does it constitute an offer to sell, or a solicitation of an offer to buy, any securities. The views expressed here are those of the individual Lightspeed Management Company, L.L.C. (“Lightspeed”) personnel and are not the views of Lightspeed or its affiliates; other market participants could take different views.

Unless otherwise indicated, the inclusion of any third-party firm and/or company names, brands and/or logos does not imply any affiliation with these firms or companies.
