
What Surprised Us Most In 2025

Y Combinator Startup Podcast

Summary

The podcast discusses surprising trends observed in 2025, including the stabilization of the AI economy into distinct model, application, and infrastructure layers, and a shift in developer preference from OpenAI to Anthropic as the leading LLM provider.

It also explores emerging infrastructure solutions such as space-based data centers and fusion energy, the increasing accessibility of building AI models, and the return to a normal, pre-boom level of difficulty in finding startup ideas.

Key Points

  • Anthropic has surpassed OpenAI as the preferred LLM provider among Y Combinator applicants in the Winter 2026 batch, a significant shift in developer preference.
  • Gemini's usage is also climbing noticeably, with its reasoning capabilities and integration with real-time information highlighted as key strengths.
  • The concept of an "AI bubble" is being reframed: current infrastructure investment is likened to the telecom boom of the '90s, which ultimately paved the way for new applications and opportunities.
  • There's a growing trend of founders building an "orchestration layer" that allows them to swap between different LLMs based on task-specific performance, rather than being loyal to a single model provider (a minimal sketch of this pattern follows the list).
  • The difficulty of finding startup ideas has returned to pre-AI-boom levels, suggesting a maturing AI landscape in which new model releases no longer constantly create sudden opportunities.
  • Infrastructure challenges, such as power generation and land availability for data centers, are driving innovative solutions like space-based data centers and advancements in fusion energy.
  • Smaller, domain-specific AI models are becoming easier to build and fine-tune, and can even outperform larger models in certain niches, though they must be continuously improved to keep pace with leading general models.
  • The trend of lean startups achieving significant revenue with minimal staff holds early on, but post-Series A, companies still need to hire substantial teams to meet rising customer expectations and competitive pressure.
  • Some see the "AI bubble" as a positive sign: a glut of resources that lowers costs and creates opportunities for startups to build new applications, much as excess bandwidth fueled the internet's growth.
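
The "orchestration layer" mentioned above can be made concrete with a small sketch. This is a minimal illustration, not any vendor's actual API: the model names, routing table, and call_model() helper are hypothetical placeholders. The point is simply that the application chooses a model per task instead of hard-wiring a single provider.

```python
# Minimal sketch of an LLM orchestration layer: route each task to whichever
# model currently performs best for that task type. Model names, the routing
# table, and call_model() are hypothetical placeholders, not real SDK calls.

from dataclasses import dataclass
from typing import Dict


@dataclass
class Task:
    kind: str    # e.g. "code", "reasoning", "realtime"
    prompt: str


# Hypothetical routing table, kept up to date from internal evals as models change.
ROUTING_TABLE: Dict[str, str] = {
    "code": "claude-latest",       # placeholder identifier
    "reasoning": "gemini-latest",  # placeholder identifier
    "default": "gpt-latest",       # placeholder identifier
}


def call_model(model: str, prompt: str) -> str:
    """Stand-in for the provider-specific API call (Anthropic, Google, OpenAI, ...)."""
    return f"[{model}] response to: {prompt[:40]}"


def route(task: Task) -> str:
    """Pick a model per task so no single provider is hard-wired into the product."""
    model = ROUTING_TABLE.get(task.kind, ROUTING_TABLE["default"])
    return call_model(model, task.prompt)


if __name__ == "__main__":
    print(route(Task(kind="code", prompt="Refactor this parser to stream tokens.")))
    print(route(Task(kind="reasoning", prompt="Plan a data-center capacity model.")))
```

With this structure, swapping providers becomes an update to the routing table (or to whatever evals drive it) rather than a code change scattered across the product.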

Conclusion

The AI economy has matured into a stable ecosystem with distinct layers, creating a predictable environment for startups and investors.

Current infrastructure investments, while appearing excessive, are setting the stage for widespread AI application proliferation, similar to past technological revolutions.

The increasing accessibility of AI model development and the emergence of innovative infrastructure solutions point to continued growth and opportunity in the AI space for founders.

Discussion Topics

  • How will the increasing competition among LLM providers shape the future of AI development and application innovation?
  • What are the most promising areas for AI-driven infrastructure innovation, and what are the biggest hurdles to overcome?
  • As AI adoption matures, how will the definition of a successful startup evolve in terms of team size, revenue growth, and resource allocation?

Key Terms

LLM
Large Language Model - An AI model trained on vast amounts of text data, capable of understanding and generating human-like text.
YC
Y Combinator - A highly selective startup accelerator program that provides seed funding and mentorship.
API
Application Programming Interface - A set of rules and protocols that allows different software applications to communicate with each other.
Tech Stack
A set of technologies used to build and run an application or system.
GPU
Graphics Processing Unit - A specialized electronic circuit designed to rapidly manipulate memory to accelerate image rendering; increasingly used for AI computation.
AMD
Advanced Micro Devices - A technology company that produces computer processors and graphics cards, a competitor to NVIDIA.
TPU
Tensor Processing Unit - Google's custom application-specific integrated circuit (ASIC) designed specifically for machine learning and deep learning workloads.
Capex
Capital Expenditure - Money spent by a company to acquire, maintain, or improve its fixed assets, such as buildings, machinery, and equipment.
SSE
Scalable Storage Engine - Likely referring to a distributed storage system.
Tokamak
A device that uses a powerful magnetic field to confine plasma in a toroidal shape for fusion energy research.
RL
Reinforcement Learning - A type of machine learning where an agent learns to make decisions by taking actions in an environment to maximize a reward.
ARR
Annual Recurring Revenue - The predictable revenue a company expects to receive on a yearly basis.
CEQA
California Environmental Quality Act - A state law that requires state and local agencies to evaluate the environmental impacts of proposed projects.
Foie Gras Startups
Likely referring to startups force-fed with investor capital, like geese fattened for foie gras, pushing them toward overvaluation or unsustainable growth.

Timeline

00:00:00

The AI economy has stabilized with defined layers for models, applications, and infrastructure.

00:00:50

Anthropic has become the number one preferred LLM API for Y Combinator applicants, surpassing OpenAI.

00:03:19

Gemini is also climbing in rankings, with its reasoning and real-time information capabilities noted.

00:06:42

Founders are arbitraging multiple LLMs by building an orchestration layer to select the best model for specific tasks.

00:08:40

The concept of an AI bubble is viewed as an opportunity: like the telecom boom, it creates a glut of resources.

00:12:28

The current phase of AI development is compared to the "installation phase" of technological revolutions, characterized by heavy investment and a feeling of a bubble, which will lead to a "deployment phase" of abundance.

00:14:37

Space data centers and fusion energy are emerging as solutions to infrastructure challenges like power generation and land scarcity.

00:17:33

There is an increasing interest in building smaller, domain-specific AI models.

00:21:00

Biocoding has emerged as a significant new category in AI applications.

00:22:25

The AI economy has matured, with a stable structure and a clearer playbook for building AI companies.

00:23:03

The difficulty of finding startup ideas has normalized, moving away from constant disruption by new AI announcements.

00:24:07

The "fast takeoff" argument for AI is tempered by log-linear scaling and human resistance to organizational change.

00:25:49

The trend of companies achieving high revenue with very few employees is evolving, with a return to hiring teams post-Series A due to increased customer expectations.

Episode Details

Podcast
Y Combinator Startup Podcast
Episode
What Surprised Us Most In 2025
Published
December 22, 2025