Key Takeaways
- AI and scaling laws are driving massive investment and competition among tech giants like Google, Meta, Microsoft, and others in a "race to create a digital God"
- Infrastructure efficiency is becoming a critical success factor for AI companies, measured by metrics like model flops utilization (MFU)
- Data centers and semiconductors are evolving rapidly to support AI, with innovations needed in networking, storage, cooling, and chip design
- Robotics and self-driving cars may be the next major frontier, with Tesla potentially having a significant advantage due to its data and AI capabilities
- The investing landscape is likely to shift, with fundamental investors potentially gaining an edge through strategic use of AI tools over the next 5-10 years
Introduction
In this episode of Invest Like the Best, host Patrick O'Shaughnessy interviews Gavin Baker, managing partner and CIO of Atreides Management. Gavin is a seasoned technology investor with over 20 years of experience covering companies like Nvidia. The conversation focuses heavily on the current state and future of artificial intelligence, semiconductors, data centers, robotics, and how these technologies are reshaping both the tech industry and the investment landscape.
Topics Discussed
The AI Arms Race and Scaling Laws (5:14)
Gavin begins by discussing how the "Magnificent Seven" tech giants (Google, Meta, Microsoft, Amazon, Apple, Nvidia, and Tesla) are now in direct competition due to the rise of generative AI:
- These companies were previously in separate "swim lanes" but are now competing directly
- There's a belief that scaling laws will continue, meaning larger AI models will keep getting more capable
- Companies are willing to spend enormous sums, with some leaders saying "I am willing to go bankrupt rather than lose this race"
Gavin explains that progress in AI capabilities is closely tied to advances in GPU technology and the ability to create larger, more coherent training clusters:
- Recent innovations like xAI's 100,000-GPU cluster are pushing the boundaries
- The next generation of Nvidia GPUs (Blackwell) is expected to enable another leap forward
- There's a debate over whether emergent properties in AI are real or just "in-context learning"
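The scaling laws referenced here are usually expressed as power laws: loss falls predictably as training compute grows. A toy sketch of that shape (the constants are illustrative, not figures from the episode):

```python
# Toy illustration of a compute scaling law: model loss falls as a
# power of training compute. Constants are illustrative only.

def scaling_loss(compute_flops: float, a: float = 1e3, alpha: float = 0.05) -> float:
    """Hypothetical loss as a power law in training compute."""
    return a * compute_flops ** -alpha

# A power law means each doubling of compute buys the same
# multiplicative improvement in loss:
for c in (1e21, 1e22, 1e23):
    print(f"compute={c:.0e}  loss={scaling_loss(c):.3f}")
```

The key property, and the reason companies keep spending, is that the curve has no obvious knee: every 10x of compute buys another constant-factor improvement.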
Infrastructure Efficiency and AI Economics (18:00)
A key point Gavin emphasizes is how AI differs from traditional software economics:
- AI has high marginal costs, unlike software's near-zero marginal cost
- Infrastructure efficiency is becoming a critical differentiator for AI companies
- Model Flops Utilization (MFU) is a key metric, typically around 35-40% currently
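MFU has a simple definition: the flops a training job actually sustains, divided by the cluster's theoretical peak. A minimal sketch (the 1 PFLOP/s per-GPU rating and cluster size are round illustrative numbers, not figures from the episode):

```python
def model_flops_utilization(achieved_flops_per_s: float,
                            num_gpus: int,
                            peak_flops_per_gpu: float) -> float:
    """MFU: fraction of the cluster's theoretical peak flops that the
    training job actually sustains end to end."""
    return achieved_flops_per_s / (num_gpus * peak_flops_per_gpu)

# Illustrative: a job sustaining 350 PFLOP/s on 1,000 GPUs,
# each rated at a (hypothetical) peak of 1 PFLOP/s.
mfu = model_flops_utilization(3.5e17, 1000, 1e15)
print(f"MFU = {mfu:.0%}")  # prints: MFU = 35%
```

At a 35-40% MFU, well over half of the capital spent on GPUs is effectively idle, which is why Gavin treats this single ratio as a competitive differentiator.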
Gavin introduces a framework for thinking about AI efficiency:
- MAMF (Maximum Achievable Matrix Multiplication Flops) - measures software efficiency
- SFU (System Flops Efficiency) - captures networking, storage, and memory efficiency
- Checkpointing frequency - how often models need to be saved due to hardware failures
- PUE (Power Usage Effectiveness) - energy efficiency of data centers
He argues that companies that can optimize these factors will have a significant advantage in both training and inference costs.
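Two of the metrics in this framework reduce to simple ratios. A hedged sketch (all numbers illustrative; the episode does not give specific values):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over power that
    reaches IT equipment. 1.0 would mean zero overhead for cooling etc."""
    return total_facility_kw / it_equipment_kw

def checkpoint_overhead(step_time_s: float,
                        steps_between_ckpts: int,
                        ckpt_write_s: float) -> float:
    """Fraction of wall-clock time lost to writing checkpoints, which
    large clusters need frequently because of hardware failures."""
    useful_time = step_time_s * steps_between_ckpts
    return ckpt_write_s / (useful_time + ckpt_write_s)

# Illustrative: 1.3 MW drawn for 1.0 MW of compute, and a 60 s
# checkpoint write every 100 ten-second training steps.
print(f"PUE = {pue(1300, 1000):.2f}")
print(f"Checkpoint overhead = {checkpoint_overhead(10, 100, 60):.1%}")
```

The point of the framework is that these losses compound: a cluster can lose compute to software inefficiency (MAMF), to networking and storage stalls (SFU), to checkpointing, and to power overhead (PUE) all at once.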
Data Centers and Semiconductors (25:55)
Gavin provides a detailed overview of the current state and challenges in data center and semiconductor technology:
- GPUs have gotten roughly 50x faster in the last five years, while other data center components have only gotten 4-5x faster
- This imbalance leads to inefficiencies and is a major focus for improvement
- Innovations are needed in networking, storage, cooling, and chip design
- Companies like Nvidia, AMD, and others are racing to solve these challenges
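The imbalance above can be made concrete with an Amdahl's-law-style estimate: if only part of a workload's time benefits from the 50x GPU speedup, the rest bottlenecks the system. The 80/20 time split below is an assumed illustration, not a figure from the episode:

```python
def end_to_end_speedup(compute_frac: float,
                       compute_speedup: float,
                       other_speedup: float) -> float:
    """Amdahl-style estimate: overall speedup when only the compute
    fraction of the workload got the big speedup."""
    new_time = compute_frac / compute_speedup + (1 - compute_frac) / other_speedup
    return 1.0 / new_time

# Illustrative: 80% of time was GPU compute (now 50x faster),
# 20% was networking/storage/memory (only 4x faster).
print(f"Overall speedup: {end_to_end_speedup(0.8, 50, 4):.1f}x")
```

The result lands far below 50x, which is why Gavin frames networking, storage, and memory as the binding constraints on the next generation of clusters.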
He uses an analogy of a restaurant kitchen to explain data center components:
- GPU as the head chef
- Storage as the delivery truck bringing ingredients
- Memory as the refrigerator
- Networking as the connections between all these elements
The Future of AI and Data (35:29)
Gavin addresses concerns about running out of training data for AI:
- Synthetic data appears to be working well, though it's not fully understood why
- Unique data sources and real-time data will be key differentiators for AI companies
- Companies like Meta, Google, and Tesla have significant advantages in proprietary data
He also discusses the potential for AI to shift to more decentralized compute:
- Inference (running AI models) may increasingly happen on phones and other edge devices
- This could lead to "super phones" with much more powerful on-device AI capabilities
- Apple and Google are well-positioned to benefit from this trend
Robotics and Self-Driving Cars (1:01:45)
Gavin argues that robotics, particularly self-driving cars and humanoid robots, may be the next major frontier:
- Tesla's Full Self-Driving (FSD) technology is making rapid progress
- The combination of computer vision, large language models, and robotics is powerful
- Humanoid robots may become more cost-effective than specialized robots due to scale
He believes Tesla has a significant advantage in self-driving due to its data and AI capabilities:
- "FSD is now on the same scaling law and arguably on a faster scaling law because they have a lot of catch up to do that GPTs have been on."
- Tesla's data advantage may be 100-1000x larger than that of its competitors
Leadership in Tech Giants (1:14:54)
Gavin shares insights on what makes leaders like Elon Musk (Tesla, SpaceX), Jensen Huang (Nvidia), and Lisa Su (AMD) exceptional:
- They work directly on the most critical problems in their companies
- They welcome bad news and have flat hierarchies for subject matter experts
- They are deeply mission-driven, attracting top talent with compelling visions
"Elon and Jensen love hearing bad news at those companies. If there's bad news, it must immediately go to them. And that's very differentiating."
The Evolution of Investing (1:24:35)
Gavin discusses how AI might change the investment landscape:
- AI tools may give fundamental investors an edge over quantitative strategies for 5-10 years
- The emphasis in venture capital may shift more towards judgment and emotional intelligence
- Growth equity is likely to evolve towards providing more genuine operational value-add
He introduces the concept of different "quotients" for investors:
- IQ (Intelligence Quotient)
- EQ (Emotional Quotient)
- KQ (Knowledge Quotient)
- JQ (Judgment Quotient)
Gavin argues that AI will level the playing field on KQ, making JQ and EQ more important differentiators for investors.
Conclusion
This wide-ranging conversation covered the cutting edge of AI, semiconductors, robotics, and their implications for both the tech industry and investing. Gavin Baker's deep expertise and passion for these topics shine through, providing valuable insights into how these technologies are reshaping our world. The discussion highlights the massive changes underway in AI infrastructure, the potential for robotics to transform labor markets, and how investors might navigate this rapidly evolving landscape. As AI continues to advance at a breakneck pace, it's clear that we're entering a new era of technological development with far-reaching consequences for business, society, and the global economy.