Leigh Marie Braswell, Partner, Kleiner Perkins; Sarah Guo, Founder and Managing Partner, Conviction; Rebecca Kaden, Managing Partner, Union Square Ventures; Allie Garfinkle, Fortune

Category: 🤖 Tech
Transcript
00:00Hello to all of you. Thank you so much for being here. I am really excited to be here
00:04with all of you. This morning at the Term Sheet breakfast, I asked the room, how many
00:09people here still have a lot of big questions about how AI is going to play out? Actually,
00:14who here has... Actually, let's ask the room. How many people here have big questions about
00:17how AI is going to play out? Excellent. Hopefully, we're going to help directionally come up
00:23with some answers here. Does that sound like a plan?
00:26Let's try. Let's try. We're going to do it. So that being the case, let's start with the
00:29hardest thing. What is the trickiest thing about investing in AI right now? Leigh Marie,
00:34you're right next to me, so start us off. I think it's becoming very clear over the
00:39past two years since ChatGPT came out that there is a massive opportunity to build companies
00:46in AI. Even if the models weren't going to improve, they definitely will. We should probably
00:51just discuss that. But even if they weren't going to improve, there's just so many ways
00:54that both enterprises and consumers are getting value out of this tech today. Whether it's
00:59a coding assistant, whether it's some sort of customer support chatbot, we're already
01:03seeing a lot of these things in the enterprise. So a really challenging part of my job and
01:07a high-stakes part of my job is figuring out who are the winners, especially in these certain
01:11application areas where we know a lot of value is going to accrue.
01:15Sarah? I'm a super visual person, and so I always have these two kind of scary images
01:21in my mind. One is there's an idiom, picking up pennies in front of a steamroller. And
01:28so you have this amazing set of capabilities. And one thing we think about all the time
01:32is how do we make sure we take the most advantage of such capabilities and also not do something
01:38that is better fulfilled by a general player, a foundation model player? And so making sure
01:44that we are at the right point of the frontier is something that we struggle a lot with.
01:50I think what we all know is that the opportunity is massive. The pace is extraordinary. There's
01:55so much in front of us. But to your first point, I think there are fundamental questions
02:00that we don't know the answer to right now and whose answers are indicative of the opportunity.
02:07One is probably the rate of advancement, how fast things are going to move, and can we
02:11assume that the rate we're on is indicative of the rate we will be on. The other is probably
02:16what this ecosystem looks like. Is it closed, dominated by a few big models? Is it very
02:21open, dominated by many? And the answers to those questions are determinant, I think,
02:26about the world we live in and where those opportunities sit. And so until we answer
02:31those questions and until we know them and figure out if they are knowable, things like
02:35how much of the stack of foundation model dominates, where do the applications sit,
02:40we can guess at them, but we don't really know without knowing kind of those two axes.
02:45And so I think that's kind of the biggest question on our minds right now.
02:48I really am curious about this rate of change, because it's something you hear all the time.
02:51People will say, AI is moving so fast. How fast is it moving? How would you describe
02:55it? I'd actually love to start with you, Rebecca, since you brought it up.
02:59Sure. I mean, I think we all can see AI isn't new and it's not new in the last couple of
03:04years, but that the pace has been accelerating and the breakthroughs feel frequent and potentially
03:09more and more frequent. So I think it is moving quite quickly. What we don't know, we know
03:14there are asymptotes. We know there are barriers we hit today and we know there are barriers
03:18we'll hit in the future. We don't know whether past is indicative of the future and where
03:22those asymptotes lie and how hard they are to overcome. And those I actually think are
03:28kind of unknowable questions today. So it's been moving very quickly, but we don't know
03:32kind of the go forward right there. What do you think, Sarah?
03:36The floor is lava, right? I think this is the most fun, most challenging time to be
03:41in technology in the last two decades of my career, but it's also a really big opportunity
03:46for leadership or if you're an investor for, you know, for greed, right? Because the ability
03:51to change the dynamics of an industry are, they're more limited in the middle of a cycle,
03:57but when you have the opportunity to just actually change unit economics or change the
04:02markets that you serve because of these capabilities, I think people can be much more dynamic with
04:08their strategy. And so, you know, the speed of change is very high and that's very good
04:13for us. What about you, Leigh Marie?
04:15I mean, I agree with what's been said. I mean, I think one reason why it's so challenging
04:19to predict right now is we look at what's happening with the foundation model players
04:24and it does seem like the strategy has shifted a bit. Like at this point, you know, we've
04:28seen the massive gains that pre-training these models can have, but it is now becoming economically
04:34very challenging to continue just pouring more and more money into that strategy. And
04:38so we've seen incredible progress across post-training and then also something called test time compute.
04:43What does that mean?
04:44Basically giving these models more time to reason, giving them time to explore solutions
04:48kind of in parallel and iteratively. And sort of, that's kind of how o1, which is, you know,
04:54OpenAI's latest model, can come up with the answers to some harder questions. So it is
04:58really challenging to predict because it's like, okay, this is now a new sort of paradigm.
05:03How fast is that going to accelerate? Is it as capital intensive as pre-training was?
05:09Like will this actually democratize more and more people sort of building on top of these
05:12large pre-trained models, especially now that there are open source pre-trained models
05:16like Llama? So I think as a, you know, recovering competitive math kid who's been
05:21obsessed with machine learning for the past, you know, many years, it's just such an exciting
05:25time to make predictions about how all this is going to play out.
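
To make the test-time compute idea above concrete, here is a minimal Python sketch of one common pattern, best-of-N sampling with a verifier. It is only a sketch under simplifying assumptions: the generate and score functions are hypothetical placeholders rather than any lab's actual API, and real systems layer search, self-reflection, and learned reward models on top of this basic shape.

```python
# A minimal sketch of one test-time compute pattern (best-of-N sampling):
# spend extra inference-time budget by drawing several candidate answers
# and keeping the one a verifier scores highest. `generate` and `score`
# are hypothetical placeholders, not any specific vendor's API.
import random
from typing import List

def generate(prompt: str, temperature: float = 0.8) -> str:
    """Placeholder for one sampled model completion."""
    return f"candidate {random.random():.3f} for: {prompt}"

def score(prompt: str, answer: str) -> float:
    """Placeholder verifier: in practice a reward model, unit tests, or other checks."""
    return random.random()

def best_of_n(prompt: str, n: int = 8) -> str:
    """Sample n candidates (conceptually in parallel) and return the best-scored one."""
    candidates: List[str] = [generate(prompt) for _ in range(n)]
    return max(candidates, key=lambda ans: score(prompt, ans))

if __name__ == "__main__":
    print(best_of_n("Prove that the sum of two even numbers is even."))
```
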
05:28I really want to ask more about being a recovering competitive math kid, but that sounds like
05:31a different panel. Since you brought up OpenAI, valued at $157 billion right now, one of the
05:39biggest questions I get from people who are not in venture capital is how on earth are
05:44these valuations going to play out? What kind of exits can, what can exits really look like?
05:48How do you justify some of these valuations? Because to my dad in Florida, they look crazy,
05:53right? Who wants to take on the valuation question first?
05:58I'm happy to. Look, we can all paint many scenarios where it's not worth $157 billion,
06:05but I think there is one where it might be, right? And that's the world where you have
06:09rapid acceleration of these models, right? Where past is precedent and we keep going
06:13and we keep training and its scale keeps accumulating and, you know, compounding
06:17on itself and you have a closed ecosystem. And so you imagine OpenAI as the ultimate
06:23Apple App Store of this iteration where what they can extract isn't 30%, but it's maybe
06:28everything, right? They can own the stack and the opportunity to build applications
06:33on top of it, get smaller and smaller. To be clear, when we're talking about everything,
06:37what do you mean? So the question is, the internet history
06:40has always been a balance of protocol and infrastructure layer and application layer.
06:45And there's always been eras where protocol and infrastructure owns more and eras where
06:50applications own more. And we're kind of in this process of figuring out where we are
06:54here. And there's a world where actually we're in a moment where the protocols, which are
07:01the models in this kind of analogy, own everything because their ability to do what the applications
07:07do is comprehensive. I'm not sure we land there, but we don't know. And in that imaginative
07:13scenario, I think it is worth what it is worth today and probably more.
07:17Sarah, you're nodding. I worked at Goldman Sachs for a brief and
07:24inglorious year. And so in that period of time...
07:27Also sounds like another panel. Love Goldman, actually. Amazing organization;
07:32I'm more of a tech person. At Goldman, at most investment banks, you learn to do sum-of-parts
07:40valuation or different scenario analysis, even if you're in venture. I think the first decision
07:49point when people think about what OpenAI is worth is like, okay, well, how good of a business
07:55is ChatGPT as a subscription consumer and prosumer business? And the reported numbers
08:01on OpenAI revenue are $3 to $4 billion in run rate right now. That's quite a large business. So
08:06you could even say it might be worth that pretty soon. But there are questions on how large of a
08:15consumer subscription business you can build. I think then people ask, can you build an ads
08:20business? Because Google's worth quite a bit more than $157 billion. And if you look at,
08:26you know, especially like Gen Z and younger, their habits around search or asking questions
08:32on the internet are changing. And so I think that's really interesting. OpenAI has made some
08:39hires of people who have done ads before. I think the really interesting question about whether or
08:44not there's a venture return from there is when you ask, well, what does the organization believe?
08:49And the organization believes in AGI, right? And I don't spend like a lot of time thinking
08:54about this question because I find it very hard to reason about. Why do you find it hard to reason
08:58about? Well, because, well, I think most people, maybe it's just me. I think most people struggle
09:03to picture a world where you're like, oh, I have, you know, for the cost of compute,
09:09the capability of any knowledge worker on demand, right? And how does that restructure
09:19industries and what can I do with it? I think the premise of being a venture investor
09:27outside of OpenAI or the winning foundation model labs, in the case of takeoff
09:32and AGI, is not super interesting if you don't believe that the last mile of integration of
09:37these capabilities into the economy is pretty hard. And I think for any business leader here,
09:42you're like, oh, it is pretty hard. I got to manage all these people. I got to deal with
09:45competing incentives. I got to move an organization. We have all these interfaces. I still have a
09:50mainframe over here, right? And so I think the last mile is pretty long and the credible but pretty
10:00simple answers for how does AGI make money kind of amount to we're going to let the model trade
10:05or something. And I think that's like uninspiring. It might work to capture value, but I think that's
10:11kind of the spectrum of how you think about the value. Well, the thing I like about that is that
10:14what you wind up in, which is kind of the moment I believe we're in, is as much a kind of opportunity
10:21of imagination as an opportunity of technology. And so, you know, I believe when you think about
10:25the value that's going to be created, it's as much of this conversation of where the limitations are
10:30and where the speeds are and where the asymptotes are as who can dream it up and who can imagine
10:34something that is really hard to get our minds around today and think about how you productize
10:38that into fixtures of our lives and our business. And that's the fun part about being an investor.
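
As a rough illustration of the sum-of-parts and scenario framing Sarah describes above, here is a small Python sketch. The roughly $157 billion valuation and $3 to $4 billion run rate are the figures mentioned on the panel; every scenario value and probability in the code is a made-up placeholder for the shape of the exercise, not anyone's actual model of OpenAI.

```python
# Illustrative sketch of the scenario-weighted ("sum of the parts") framing above.
# The ~$157B valuation and ~$3-4B revenue run rate come from the conversation;
# every scenario value and probability below is a made-up placeholder.
valuation_b = 157.0
run_rate_revenue_b = 3.5  # midpoint of the $3-4B range mentioned on the panel

print(f"Implied run-rate revenue multiple: ~{valuation_b / run_rate_revenue_b:.0f}x")

# Probability-weighted value across outcomes (all numbers purely hypothetical).
scenarios = {
    "consumer/prosumer subscription only": (0.4, 50.0),   # (probability, value in $B)
    "subscription plus an ads business":   (0.4, 200.0),
    "dominant closed 'app store' outcome": (0.2, 1000.0),
}
expected_value_b = sum(p * v for p, v in scenarios.values())
print(f"Probability-weighted value: ${expected_value_b:.0f}B (illustrative only)")
```
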
10:43That's very much in our wheelhouse, because you're in an imagination crisis, which is, I think, a big
10:48moment of opportunity. Leigh Marie? Valuations. I guess I want to start with a big number,
10:54right? Let's do it. So I think, you know, by some estimation in 2023, I think enterprises already
11:01spent, you know, $600 million on sort of gen AI applications. So that's not even the models.
11:06That's just like the applications themselves. And that's 8x'd this year. So it's like not even just
11:11a dream. It's, you know, literally happening right now. And I mean, I work with a company,
11:16this company Codeium. They're an AI-powered developer tooling platform. Upwards of 40%
11:20of their users' code is generated through Codeium. They're like speeding up programmers by literally
11:25writing 40% of their code in some cases. So it's just, it's clear there's so much value being
11:30created today. And then just the rate of growth of these companies like Codeium, even with a small
11:35team, we are seeing things that blow way past the sort of, you know, historical SaaS benchmark of
11:40triple, triple, double, double. You know, we're just seeing like a totally new paradigm with them
11:44and some of these other teams. And so then as an investor, you're seeing, wow, this solves real
11:49problems today. Enterprises are already eager to adopt it so early on in its lifecycle. And then
11:54these models are just getting better. So there's just more and more sort of end-to-end capabilities
11:57that they're going to be able to do. And so I think all that excitement, especially if you feel
12:02conviction around, this is the team that's going to capture this vertical specific opportunity,
12:07totally justifies a high valuation. You look like you have a thought, Sarah.
12:12I think Leigh Marie is right. Like it's a general technology and the way it's going to get
12:18delivered is by unbundling it into all of the ways in which it's actually useful to people.
12:24Listening to the narrative in AI is very dangerous. I feel like a year and a half ago,
12:29the narrative was like nobody will ever catch OpenAI. Feels more questionable now. And like
12:34the application layer has no value. They called them GPT wrappers, right? And there's a version
12:39of that, right? If it's just like three prompts and an SEO page, like it's not a valuable company.
12:44But if you are building with much more understanding of the domain and product,
12:50and this is an opportunity for incumbents and for startups, I think there's a lot of value
12:57in every nook and cranny where we can use this really general capability.
13:00That's actually a great segue. As a heads up, I'm going to be coming to you all for questions soon.
13:05But first, I'd actually love to talk about use cases. That seems like a good place to go from
13:10here. Sarah, I'd like to start with you. What AI use cases are you jazzed about? I know you're
13:15really interested in defense right now. It is true. I think an increasing number of people in
13:21the technical community and the investor community are aware of how important AI is or just innovation
13:28and technology is to national security in this era. And so I just joined the board of a company
13:33in this space that we can't talk about yet. Are you sure? Yes, I'm sure. I had to try. I hope to
13:40tell you guys more soon. I do too. I'd say broadly, I think electronic warfare is an
13:46increasingly critical part of great power conflict in the future. A huge part of what
13:54motivates me as an investor is seeing democratization of capability, giving people
13:59the ability to build applications, be it faster or even if they can't write code at all, to create
14:05videos, to do design. And the better our tools are, the more democratized those capabilities are.
14:12To Leigh Marie's point, we have companies in our portfolio that have really extraordinary growth
14:17rates, companies like Harvey, where what they want to do is democratize the ability to have
14:22high quality legal counsel. And so I think the most inspiring thing to me about working in AI is
14:29these algorithms apply to so many different fields and the demand for these different services,
14:35I think, is quite elastic. I think many more people want best-in-class counsel or the
14:42ability to make amazing videos or to write applications or even just write better than
14:47people may have imagined, to Rebecca's point about imagination. Rebecca? Yeah, I mean, I think
14:54what AI on the application layer is going to let us do is dramatically change
14:58three curves, right? Cost, quality, and ease of use. And if you think about where there's
15:04opportunity to dramatically change industries from those three factors, it gets really, really
15:08exciting. Legal is one example. Healthcare is obviously another one. Education is another one.
15:14And there are these industries that have been fundamentally structured in a way with blockers,
15:19right? Where you have buckets of spend or tricky things to get around or difficulty in adoption,
15:27where if you change the approach to the market, the ease of adoption, and the quality that can
15:32be delivered at low cost to the end user, you can transform the market that you're working with.
15:37We think a lot about what it would take to have ubiquitous care at zero cost. And that has been
15:43something that has been effectively impossible to do that I think is now a challenge of imagination
15:48and kind of approach versus fundamental ability. Personalized, perfect learning, no matter where
15:55you are, no matter the way you learn. Again, structurally very difficult in a society we set
16:00up, has been very difficult to do from a cost curve, now I think is a product problem and not
16:04a capability problem. And when you think of it that way, I think we're going to enter a time of
16:08massive transformation of those markets. Leigh Marie, use cases.
16:13Totally agree with what's been said so far. I mean, you think about there's hundreds of
16:18billions of dollars spent on software, but then there's trillions of dollars spent on services.
16:23And you've got all these use cases across industries where the most valuable companies
16:29can afford to hire consultants to go off and maybe do some competitive analysis for them.
16:34The vast majority of other companies can't. So I love the sort of idea Sarah said of democratizing
16:40access to better intelligence, cost cutting, all this stuff is going to be possible with
16:45these models eventually. And I think there are really excellent opportunities across verticals,
16:49whether it's healthcare, whether it's financial services, whether it's even in design and
16:53manufacturing, using things that we learn from this generation of models and then from the next
16:58generation of generative models across the board. I just think a lot of these industries are going
17:02to look totally different in the next few years. One thing that's always fun to think about is
17:07what the second order effects of a change you imagine could mean for the ecosystem or our
17:15businesses. And I'll take one because Leigh Marie has been talking about code generation.
17:21How many people work at a company here that uses SAP? A reasonable number. Most of the world's
17:28largest businesses run on SAP. Anybody who knows a lot about ERP would say, all right, the
17:35implementation and change management and upgrade cycle of SAP is really painful. And that limits
17:43many things you can do in business. It's an amazing system. There's very few replacements.
17:48But if you're on a five-year, 10-year upgrade cycle for SAP, there's a lot of money being put
17:54into the consulting and maintenance of that. And I think one thing that we're really excited about,
17:59because we think that process, there's a company called Nova AI in this space. We think that
18:03process is going to get automated. It unlocks a lot of budget and a lot of IT capability to go do
18:10other things if you're spending less energy and money on just keeping the trains running.
18:16We have a question over here. Please say your name and your company.
18:20Yep. I'm John, the founder of 1440. So at re:Invent last week, obviously Bedrock,
18:26they announced a data marketplace, I think, for foundational models. They invested a lot of money
18:31in Anthropic. They launched Nova. I'm curious your thoughts on Amazon, AWS,
18:37and obviously the Anthropic partnership. What do you think?
18:49So I think Matt Garman did this wonderful podcast with us on No Priors where he...
18:57I wasn't going to say, but I was like, actually, you've done some work on this, haven't you?
19:00Well, I think the cloud players and the model players are strategic partners to all of us
19:05now. And they've been for a long time, but continue to be. And Matt, in so many words,
19:11said no matter what happens, AWS wins. And that might be true. I think a lot of customers, they
19:20have large commitments with Amazon and then want to run models and experiment with them,
19:26and Bedrock is a safe place to do so. And that company has been in the business of
19:32building data centers for a long time. That being said, and others here are involved
19:39in companies in the inference and infrastructure layer, I think it is understated how far backward
19:45in time we've gone at the infrastructure level. There are companies in our portfolio that have
19:51a GPU server or a rack with sufficient power in their office. I have young people who work for
20:00me who have never seen a server closet. And so I think that the change in hardware capabilities
20:06is really dramatic, and that creates opportunities for new players, even in infrastructure.
20:12Yeah, I mean, this is something we're spending some time on. I think the opportunity below the
20:18layer of foundation models in terms of energy, energy efficiency, the energy infrastructure
20:24that's going to support this at scale is going to demand a reinvention of where we've been,
20:31both really on the hardware and probably the software side of managing it too.
20:35And that is a completely horizontal layer that's going to sit below this. And I think you'll see
20:39some of these big providers, Amazon, et cetera, play big roles in that, but there will also be
20:45room for large new entrants to come in that take a more first principles kind of ground up view
20:50that will be a different layer of winners in this world.
20:55Any thoughts, Leigh Marie? You're like, yeah, that makes sense.
20:59I mean, I think another thing that I want to underscore is just that it is,
21:04especially from the perspective of these application companies, amazing that you have
21:07so many players, including Amazon, that are offering cutting edge models to then use and
21:13that are partnering with these companies to sort of help them get distribution. So, I mean, I think,
21:18you know, as Sarah was saying, maybe two years ago, people thought it was just going to be OpenAI
21:22and Microsoft, kind of like the only game in town. And no, I have many companies that work with Amazon
21:27and Claude through Amazon. And so, it's been really incredible seeing sort of the reduction
21:32in platform risk. I'll add one thing here. I don't know how controversial this view is now,
21:37but I think it was controversial a year and a half ago. Like, open source matters in this space.
21:43Like, we have companies who use it. I'm sure many of your companies use it, but
21:47the open source models are already better than anything that was released last year from the
21:53proprietary foundation model labs. And I relate this to the Amazon question because if you think
21:58it is all going to be in the labs, including inference and like infrastructure in the next
22:03generation, those questions matter less. I do not believe that to be true. You know,
22:08we are very clear that some of the labs themselves cannot keep up with the inference needs of their
22:12customers. And so, I think a lot of people are going to want choice and variety, and that's going
22:17to end up with new players like Baseten or Together, whoever it is, and cloud players,
22:23which is Amazon's refrain, right? Customers want choice. But we have another question over here.
22:27Hey, Steven Wolf Boneta, CEO of Alpha. Hey, Rebecca, good to see you.
22:31There was a recent episode of Y Combinator where they said vertical AI agents are going to be 10
22:35times bigger than SaaS companies. I'd love to get your take. Do you need to be AI native to win in
22:40where we're going? And what happens to all the SaaS players, like the Salesforces of the world?
22:47Look- Is it agent time?
22:49Yeah. I think there's kind of future state, and then there's this question of how long does it
22:55take us to get there? Do I believe there's a future state where agents have the capability,
23:01consistency, and accessibility to do a lot of those things and are dominant?
23:07Totally. I think we don't really know how far away we are from that, and that the last 1%, 2%,
23:145% of quality and consistency matters a huge amount in transitioning the world for that to
23:21be the case. And we may be at 80%, 90%. We're not at 100% there. So I see it as a dominant likelihood
23:31of the future, but I think like in any consumer deliverable, the last 5% matters more than
23:37anything else. And I think sometimes we're underweighting how far we are from that last 5%
23:43to really happen. And my hot take is that agents don't work. And what I mean by that is if you get
23:49on Twitter and you see how the vast majority of people on Twitter define agents, it's these things,
23:54they're coming for your job, and they're totally going to replace you today. And I just-
23:59Right now.
23:59Right now. Yesterday. You're already late. They already replaced you. And that's just not the
24:04case. They do not yet work reliably for the vast majority of use cases. I think there's a really
24:09great analogy between agents and self-driving cars, and actually a lot of LLMs and self-driving cars.
24:14And I think right now we are kind of in the world where, yes, the car can make a lot of decisions
24:21itself, but you still need to be at the wheel. And there are times when you need to intervene.
24:26And I think right now, some agent companies, especially verticalized ones, especially in
24:31coding and things like that, there are these awesome processes where agents can do most of
24:34the work, but you have to intervene a bit to sort of guide it. And then eventually maybe we get to a
24:39world of tele-op agents. So the agent does lots of things, and then you'll be occasionally called in.
24:45You won't have to actually monitor it. But I'd say right now we're in that world where, yes,
24:49agents can really speed you up, but the human in the loop for basically all of the use cases
24:54is vitally important.
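
Leigh Marie's human-in-the-loop point maps onto a common pattern in agent implementations: the agent proposes each step, and risky actions are gated on explicit human approval before they run. Below is a minimal, hypothetical Python sketch of that loop; the function names and the notion of a "risky" action are illustrative assumptions, not any particular agent framework's API.

```python
# A minimal sketch of the human-in-the-loop pattern described above: the agent
# proposes each step, but risky steps wait for explicit human approval before
# executing. All functions here are hypothetical placeholders, not a real framework.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    risky: bool  # e.g. pushes code, sends an email, touches production data

def propose_action(task: str, step: int) -> Action:
    """Placeholder for the agent (an LLM call) proposing its next step."""
    return Action(name=f"step {step} toward: {task}", risky=(step == 2))

def execute(action: Action) -> str:
    """Placeholder for actually running a tool call."""
    return f"executed {action.name}"

def run_agent(task: str, max_steps: int = 3) -> None:
    for step in range(1, max_steps + 1):
        action = propose_action(task, step)
        if action.risky:
            answer = input(f"Approve risky action '{action.name}'? [y/N] ")
            if answer.strip().lower() != "y":
                print("Skipped: the human intervened.")
                continue
        print(execute(action))

if __name__ == "__main__":
    run_agent("triage a customer support ticket")
```
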
24:55Where do you stand on agents, Sarah?
24:58I think agents work. We have instances in our portfolio of agents that do useful things. I
25:04would agree with Leigh Marie that the X, the Twitter version, where it's like, ah, I built an agent that
25:10replaces you with three... That's not a thing. Most jobs are actually quite complicated, and
25:16you require a bunch of guardrails and evaluations to figure out how to make something work in a
25:20real context. But in categories like incident response and security investigation, where we don't have
25:26enough talent, it's really useful for customers. You guys should try Bolt.new. You can write an
25:33application in natural language and deploy it to the world, and it will be a full-fledged
25:37application with authentication without a developer. That's really exciting. It's only
25:42going to get better. Your question was really about do you need to be native? I think there's
25:48the deepest innovator's dilemma in this area. If you take a core software category like CRM,
25:54for example, there's some version of CRM. There's a company called Day AI you can take a look at.
26:00It's very SMB-oriented today, but there's some version of CRM where there are no reps that are
26:05populating the database with opportunity updates. That data exists in the world somewhere. It's in
26:12your call records. It's in your visits. It's in your email. You should not, as a human, be toiling
26:18on database updates. If you're Salesforce, do you want to be like, oh, you don't need any of the
26:24business logic we just put 30 years of work into? No. You're like, oh, it'll work with Salesforce
26:29somehow. I think there's actually an argument that the ease of adoption of this technology
26:34fundamentally is much easier than any past transformation. That actually creates a world
26:41where you need to be adaptable, but you actually don't need to be native, because the ability to
26:46integrate it into workflows that already exist should be much easier than we've ever seen before.
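
Sarah's no-reps CRM example reduces to a simple pipeline: take raw communications such as email and call records, have a model extract a structured opportunity update, and write it into the CRM. Here is a minimal, hypothetical Python sketch of that shape; extract_update stands in for an LLM extraction call, and none of the names refer to a real product's API.

```python
# A minimal sketch of the "no reps updating the database" CRM idea above:
# structured opportunity updates are extracted from raw communications (email,
# call notes) instead of typed in by a salesperson. `extract_update` stands in
# for an LLM extraction call; nothing here is a real product's API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OpportunityUpdate:
    account: str
    stage: str
    next_step: Optional[str] = None

def extract_update(raw_text: str) -> OpportunityUpdate:
    """Placeholder for a model call that parses an email or call transcript
    into structured CRM fields."""
    return OpportunityUpdate(account="Acme Corp", stage="negotiation",
                             next_step="send revised pricing by Friday")

def sync_to_crm(update: OpportunityUpdate) -> None:
    """Placeholder write to the CRM's API."""
    print(f"{update.account}: stage={update.stage}, next step={update.next_step}")

if __name__ == "__main__":
    email = "Thanks for the call today -- let's finalize pricing by Friday."
    sync_to_crm(extract_update(email))
```
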
26:52So you all have agreed, sort of, to play a game with me. It's going to be a quick word
26:56association game. I'm going to say one word, and you're going to say the first thing that pops
26:59into your head. Don't even think about it, and we're just going to go. Ready? 2025, Rebecca.
27:06Wild West. Sarah. Abundance. Leigh Marie. Progress. AI. Exciting.
27:15The fund. Revolutionary. Value. Applications. Elastic. You stole mine.
27:27First thing other than applications, go. First thing other than applications,
27:31what is the value? Models. NVIDIA. That's all the time we have. The answer is NVIDIA.
27:40Thank you all so much.
