On this special episode of Decoder, Google CEO Sundar Pichai sat down with Nilay Patel this week following the company's I/O developer conference to talk about the state of AI, the major changes rolling out now to Google Search, and the future of the web.
Transcript
00:00 [MUSIC PLAYING]
00:03 Sundar Pichai, you are the CEO of both Alphabet and Google.
00:06 Welcome to Decoder.
00:07 Nilay.
00:08 Good to be here.
00:08 I am excited to talk to you.
00:09 I feel like I talk to you every year at Google I/O,
00:11 and we talk about all the things you've announced.
00:13 There's a lot of things to talk about.
00:15 There's a lot of AI news to talk about.
00:18 As you know, I am particularly interested
00:19 in the future of the web, so I really
00:21 want to talk about that with you.
00:22 But I figured I would start with an easy one.
00:25 Do you think language is the same as intelligence?
00:29 Wow.
00:30 That's not an easy question.
00:33 I don't think I'm the expert on it.
00:35 I think language does encode a lot of intelligence,
00:41 probably more than people thought.
00:45 Explains the successes of large language models
00:49 to a great extent.
00:51 But I think my intuition tells me,
00:53 as humans, the way we consume information,
00:56 I think there's a lot more to it than language alone.
00:58 But I'd say language is a lot more than people think it is.
01:03 Yeah.
01:04 The reason I asked that question to start
01:05 is I look at the announcements at I/O with AI
01:08 and what you're doing.
01:09 I look at your competitors with AI and what they're doing.
01:11 And everything is very language-heavy, right?
01:13 It's LLMs that have really led to this explosion of interest
01:16 and innovation and investment.
01:19 And I wonder if the intelligence is
01:21 increasing at the same rate as the facility with language.
01:25 And I kind of don't see it.
01:26 To be perfectly honest, I see computers
01:28 getting much better at language and actually, in some cases,
01:31 getting dumber.
01:32 And I'm wondering if you see that same gap.
01:34 It's a great question, though.
01:35 Part of the reason we made Gemini natively multimodal
01:39 is so that we trained it with audio, video, text, images,
01:44 and code, and you're beginning to see glimpses of it now,
01:49 but it hasn't all actually made its way into products fully yet.
01:53 So maybe the next cycle, when we have multimodality working
01:56 on the input and output side, and we are training
01:59 models using all that, I think that will encapsulate a lot
02:03 more than just today, which is primarily text-based.
02:06 So I think that continuum will shift
02:08 as we take in a lot more information that way.
02:12 So maybe there's more to come.
02:14 The reason I ask that is it feels like last year,
02:18 the tagline was bold but responsible.
02:20 That's Google's approach.
02:21 You said it again on stage this year.
02:23 And then I look at our reactions to AI getting things wrong,
02:27 and it seems like they're getting more and more tempered
02:29 over time.
02:30 I'll give you an example.
02:31 In the demos you had yesterday, you
02:33 showed multimodal video search of someone trying
02:36 to fix a broken film camera.
02:38 And the answer was just wrong.
02:40 Sort of straightforwardly, the answer
02:42 that was highlighted in the video
02:43 was just open the back of the film camera and jiggle it.
02:46 And it's like, well, that would ruin all of your film.
02:48 And no one who had an intelligent understanding
02:51 of how that camera works would just suggest that.
02:53 - Now, ironically, I was talking to the team.
02:56 You know, as part of making the video,
02:59 they consulted with a bunch of subject matter experts
03:04 who all reviewed the answer and thought it was okay.
03:07 I understand the nuance.
03:08 I agree with you, obviously.
03:09 You don't want to expose your film
03:11 by taking it outside of a dark room.
03:14 There are certain contexts in which it makes sense
03:17 to do that.
03:18 You know, if you don't want to break the camera
03:20 and if what you've taken is not that valuable.
03:23 - Sure. - Right.
03:24 It makes sense to do that.
03:26 You know, it's a good example of, you're right,
03:29 there is a lot of nuance in it.
03:31 And, you know, part of what I hope search serves to do
03:35 is to, you know, give you a lot more context
03:39 around that answer
03:41 and allow people to explore it deeply.
03:44 But I think, you know, these are the kind of things
03:47 for us to keep getting better at.
03:50 But to your earlier question, look, I think
03:52 I do see the capability frontier continuing to move forward.
03:58 I think we are a bit limited
03:59 if we were just training on text data,
04:01 but I think we are all making it more multimodal.
04:04 So I see more opportunities there.
04:06 - Let's talk about search.
04:07 This is the thing that I'm most interested in.
04:09 I think this is the thing that is changing the most
04:12 sort of in an abstract way.
04:13 It's the thing that's the most exciting.
04:14 - Yeah. - Right.
04:15 You can ask a computer a question
04:16 and it will just like happily tell you an answer.
04:18 - Yeah. - That feels new.
04:19 I see the excitement around it.
04:21 Yesterday you announced AI overviews are coming to search.
04:24 That's an extension of what was called
04:26 the search generative experience
04:27 and that's gonna roll out to everyone in the United States.
04:29 I would describe the reactions to that news
04:32 from the people who make websites
04:34 as fundamentally apocalyptic.
04:36 The CEO of the News Media Alliance said to CNN,
04:39 "This will be catastrophic to our traffic."
04:42 Another media CEO forwarded me a newsletter
04:45 and the headline was, "This is a death blow to publishers."
04:48 Were you expecting that kind of response
04:51 to rolling out AI overviews in search?
04:53 - Look, I definitely, I recall in 2010,
04:59 there were headlines that the web is dead, right?
05:03 I mean, I've long worked on the web, obviously.
05:06 I deeply care about it.
05:08 When the transition from desktop to mobile happened,
05:11 there were a lot of concerns because people were like,
05:12 "Oh, it's a small screen.
05:13 How will people read content?
05:16 Like, why would they look at content?"
05:18 We had started introducing
05:20 what we internally call web answers in 2014,
05:23 which are featured snippets to the outside world.
05:25 So you had questions like that.
05:27 I remain optimistic.
05:31 Empirically, what we are seeing throughout these years,
05:35 I think human curiosity is boundless.
05:40 - Yeah.
05:41 - When people come,
05:42 and it's something I think we have deeply understood
05:44 in search, more than any other company,
05:48 I think we will differentiate ourselves in our approach,
05:51 even through this transition.
05:53 I think as a company, we realized the value
05:57 of this ecosystem and it's symbiotic.
06:01 If there isn't a rich ecosystem
06:05 making unique and useful content,
06:08 you know, what are you putting together and organizing?
06:11 Right? And so we feel it.
06:13 I would say through all these transitions,
06:18 things have played out a bit differently.
06:21 I think high quality content, users are looking for it.
06:25 The counterintuitive part,
06:27 which I think almost always plays out,
06:29 is it's not a zero-sum game in terms of AI overviews.
06:34 People are responding very positively to it.
06:37 It's one of the most positive changes I've seen in search
06:40 based on metrics we see, but people do jump off of it.
06:44 And when you give context around it,
06:45 they actually jump off it.
06:47 It actually helps them understand.
06:50 And so they engage with content underneath too.
06:54 In fact, if you put content and links within AI overviews,
06:57 they get higher click-through rates
06:58 than if you put it outside of AI overviews.
07:01 - Yeah.
07:02 - But I understand the sentiment, right?
07:04 You know, it's a big change.
07:06 You know, these are disruptive moments.
07:08 AI is a big platform shift.
07:11 And people are projecting out,
07:12 and people are putting a lot into creating content.
07:15 It's their businesses.
07:17 So I understand the perspective.
07:19 So, you know, I'm not surprised.
07:20 We are engaging with a lot of players,
07:22 both directly and indirectly.
07:24 But I remain optimistic how it'll actually play out.
07:28 But it's a good question, but happy to talk about it more.
07:31 - So I think you know that I have this concept
07:33 I call Google Zero, which is born of my own paranoia.
07:36 Every referrer that The Verge has ever had
07:39 has gone up, and then it's gone down.
07:40 And Google is the last large-scale referrer of traffic
07:43 on the web for almost every website now.
07:46 And I can see that for a lot of sites,
07:48 Google Zero is playing out.
07:49 Google traffic has gone to zero,
07:51 particularly independent sites
07:53 that aren't part of some huge publishing conglomerate.
07:55 So there's an air purifier blog
07:57 that we cover called House Fresh.
07:58 There's a gaming site we cover called Retro Dodo.
08:01 Both of these sites have said,
08:02 "Look, our Google traffic went to zero.
08:03 "Our businesses are doomed."
08:05 Is that the right outcome here of all this,
08:07 that the people who care so much about video games
08:11 or air purifiers, who actually started websites
08:14 and made the content for the web
08:15 are the ones getting hurt the most in the platform shift?
08:18 - No, look, I mean, it's always difficult
08:22 to talk about individual cases, right?
08:24 And like, you know, at the end of the day,
08:28 we are trying to satisfy user expectations,
08:30 and users are voting with their feet, right?
08:32 So when people are trying to figure out
08:35 what's valuable to them, and we are doing it at scale,
08:38 and, you know, I can't answer on the particular site.
08:41 - But it's that thing where a bunch of small players
08:45 are feeling the hurt, like, loudly.
08:47 Like, they're saying it, like,
08:48 "Our businesses are going away."
08:49 And that's the thing you're saying,
08:50 like, we're engaging, we're talking,
08:52 but this thing is happening very clearly.
08:55 - But it's not clear to me that's a uniform trend.
08:57 Like, I have to look at data on an aggregate, right?
08:59 So anecdotally, when people give,
09:01 there are always times when people have come in an area
09:03 and said, "I, as a specific site, have done worse."
09:07 But that may be a moment in which that's because
09:10 it's like an individual restaurant coming and saying,
09:12 "I've started getting less customers this year."
09:15 People have stopped eating food or whatever it is, right?
09:18 Like, that's not true necessarily, right?
09:21 Some other restaurant might have opened next door,
09:23 which is doing very well, right?
09:24 So it's tough to say, I think, from our standpoint,
09:28 when I look at historically, even over the past decade,
09:33 we have provided more traffic to the ecosystem
09:36 and we've driven that growth, right?
09:38 And you may be making a secondary point
09:41 around small sites versus larger aggregating sites,
09:44 which is the second point you're talking about.
09:47 Ironically, there are times when we have done changes
09:52 to actually send more traffic to the smaller sites.
09:55 Some of the sites which complain a lot
09:57 are the aggregators in the middle, right?
09:59 So should the traffic go to the restaurant,
10:03 or should the restaurant create a website
10:05 with their menus and stuff,
10:06 or people writing about these restaurants?
10:10 These are deep questions.
10:10 I'm not saying there's a right answer.
10:12 - But you're about to flip over the whole apple cart, right?
10:14 You're about to start answering
10:15 some of these questions very directly.
10:17 And where that content comes from in the future,
10:20 I think you want the people who care the most
10:22 to publish that information directly
10:24 to be the thing that you synthesize.
10:25 - Yeah, and I--
10:27 - And the incentives for that seem to be
10:28 getting lower and lower, on the web anyway.
10:30 - Yeah, I feel it's the opposite.
10:32 And if anything, I feel like through AI overviews,
10:37 when you give people context,
10:40 yes, there are times people come
10:43 and all they want is a quick answer and they bounce back.
10:46 But overall, when we look at user journeys,
10:49 when you give the context,
10:50 it also exposes people to areas
10:53 branching off, jumping off points.
10:54 And so they engage more.
10:57 So actually, this is what drives growth over time.
11:01 - I look at like desktop to mobile,
11:03 questions were similar.
11:04 In fact, I think there was a cover,
11:06 which I'm almost tempted to pull out, saying the web is dead.
11:09 And there was a Google zero argument 10 years ago.
11:13 But you yourself made the point that we,
11:15 it's not an accident, I think we still remain
11:17 one of the largest referrers.
11:19 Because we've cared about it deeply for a long, long time.
11:24 I look at our journey, even the last one year
11:27 through the search generative experience,
11:29 I constantly found us prioritizing approaches,
11:34 which would send more traffic
11:37 while meeting user expectations.
11:40 You know, we think that through deeply
11:41 and we actually change our approach.
11:44 And if there are areas where we feel like
11:45 we fully haven't gotten it right,
11:47 we are careful about rolling it there.
11:50 But I think what's positively surprising us
11:53 is that people engage more
11:56 and that will lead to more growth over time, I think,
11:58 for high quality content.
12:00 And there's a lot of debate about
12:02 what's high quality content.
12:04 And, you know, but I think I would hope that,
12:08 you know, at least in my experience,
12:11 you know, I value independent sources,
12:15 I value smaller things, I want more authentic voices.
12:18 And I think those are important attributes
12:20 we are trying to constantly improve.
12:22 - You mentioned that you think more people
12:24 will click through links in AI overviews.
12:26 I think Liz, who runs search, had a blog post
12:29 making the same claim.
12:30 There's no public data that says that this is true yet.
12:34 Are you going to release that data?
12:35 Are you going to show people
12:35 that this is actually happening?
12:37 - At an aggregate, I think people,
12:40 I mean, we rely on this value of the ecosystem, right?
12:42 If people over time on an aggregate don't see value,
12:47 if website owners don't see value coming back from Google,
12:49 I think we'll pay a price.
12:51 So I think we have the right incentive structure.
12:54 But obviously, look, I think we are careful about,
12:57 there are a lot of individual variations
12:59 and some of it is users choosing which way to go.
13:02 And so I think that part is hard to sort out.
13:06 But I do think we are committed at an aggregate level
13:08 to do the right thing.
13:09 - Yeah.
13:10 I was reading some SEO community trade publications
13:13 this morning, responding to the changes.
13:15 And one of the things that was pointed out
13:17 was that in Search Console,
13:19 it doesn't show you if the clicks are coming
13:21 from a featured snippet or an AI preview
13:24 or just a regular Google 10 blue links.
13:26 Would you break that out?
13:27 Would you commit to breaking that out
13:28 so people can actually audit and verify and measure
13:31 that the AI overviews are sending out
13:32 as much traffic as you say they are?
13:34 - You know, I think it's a good question
13:38 for the search team. They think about this
13:41 at a deeper level than I do.
13:46 I think we are constantly trying to give more visibility
13:48 in a way, but we also don't want people to,
13:51 we want people to create the content that's good.
13:53 And we are trying to rank it and organize it.
13:57 So I think there's a balance to be had.
13:59 You know, the more we spec it out,
14:03 then the more people design for that.
14:05 So I think there's a trade-off there.
14:08 I think, so it's not clear to me what the right answer is.
14:12 - Yeah, that trade-off between what you spec out
14:15 and say and what people make.
14:16 - Yeah.
14:17 - I think that's been the story of the web
14:18 for quite some time.
14:19 And it had reached, I think, a steady state.
14:21 Whether you think that steady state was good or bad,
14:24 but it was at least a steady state.
14:26 Now that state is changing, right?
14:28 AI is obviously changing it.
14:30 The 10 blue links model, the old steady state,
14:33 very much based on an exchange, right?
14:36 We're gonna let you index our content.
14:38 We're gonna have featured snippets.
14:39 We're gonna let you see all of our information.
14:41 In return, you will send us traffic.
14:43 That formed the basis of what you might call
14:45 a fair use argument, right?
14:47 Google's gonna index the stuff.
14:48 There's not gonna be a lot of payments in the middle.
14:50 In the AI era, no one knows how that's gonna go, right?
14:54 There are some major lawsuits happening.
14:56 There are deals being made by Google,
14:58 and OpenAI for training data.
15:00 Do you think it's appropriate for Google
15:03 to start making more deals to pay for data
15:05 to train on search results?
15:07 Because those AI snippets are not really the same
15:10 as the 10 blue links or anything else
15:12 you've done in the past.
15:13 - Here we have,
15:14 it's a good question.
15:18 To be very clear,
15:20 there's a myth that Google's search
15:22 has been 10 blue links forever.
15:24 Like, I look at our mobile experience
15:26 over many, many years, right?
15:28 And we've had answers.
15:30 We allow you to refine questions and so on.
15:33 We've had featured snippets, right?
15:36 And so on.
15:38 So the product has evolved significantly.
15:42 I think, but having said that,
15:46 as a company, even as we look at AI,
15:52 we've had News Showcase, and we have done licensing deals.
15:56 To the extent there is value,
15:59 we obviously think there is a case for fair use
16:05 in the context of beneficial transformative use.
16:08 I'm not gonna argue that with you, given your background,
16:12 but I think there are cases in which we will see
16:15 dedicated incremental value to our models.
16:19 And we'll be looking at partnerships to get at that.
16:22 So I do think we'll approach it that way.
16:24 - Let me ask this question in a different way.
16:26 And I won't do too much fair use analysis with you,
16:28 I promise, as much as I like doing it.
16:30 Yeah, there were some news reports recently
16:32 that OpenAI had trained
16:34 its video generation product, Sora, on YouTube.
16:37 How did you feel when you heard that news?
16:40 - Look, we don't know the details.
16:44 I think our YouTube team is following up
16:48 and trying to understand.
16:49 Look, we have terms and conditions, right?
16:53 And we would expect people to abide
16:55 by those terms and conditions, right?
16:57 And so I think when you build a product,
16:59 and so that's how I felt about it, right?
17:02 - So you felt like they had broken your terms
17:03 and conditions, or potentially, or if they had,
17:05 that wouldn't have been appropriate.
17:06 - That's right, yeah, that's right.
17:08 - The reason I asked that question,
17:10 which is a much more emotional question,
17:11 is, okay, maybe that's not appropriate.
17:13 And what OpenAI has said,
17:16 whatever they've said is essentially in the order of,
17:18 we've trained on publicly available information,
17:20 which means we found it on the web.
17:22 Most people don't get to make that deal, right?
17:24 They don't have a YouTube team of licensing professionals
17:27 who can say, "We had terms and conditions."
17:29 They don't even have terms and conditions.
17:30 They're just putting their stuff on the internet.
17:32 Do you understand why, emotionally,
17:35 there's the reaction to AI from the creative community,
17:38 that it feels the same way as you might have felt
17:40 about OpenAI training on YouTube?
17:42 - Absolutely.
17:43 I mean, I think, be it website owners,
17:50 or content creators, or artists,
17:52 I can understand how emotional a transformation this is.
17:56 And I think part of the reason you saw,
17:59 even through Google I/O,
18:00 when we were working on products like music generation,
18:04 we've really taken an approach by which we are working first
18:07 to make tools for artists.
18:09 We haven't put a general purpose tool out there
18:11 for anyone to create songs, right?
18:14 So the way we have taken that approach
18:16 in many of these cases is to put the creator community
18:19 as much at the center of it as possible.
18:22 We've long done that with YouTube.
18:24 And through it all, I think we are trying to figure out
18:27 what are the right ways to approach this.
18:30 But it is a transformative moment as well.
18:32 And there are other players in this.
18:34 We are not the only player in the ecosystem.
18:39 But to your earlier question, yes,
18:40 I understand people's emotions through it.
18:44 I definitely am very empathetic
18:48 to how people are perceiving this moment.
18:50 - 'Cause they feel like it's a taking, right?
18:52 That they put work on the internet
18:53 and the big companies are coming,
18:55 they're taking it for free,
18:57 and then they're making products
18:58 that you're charging 20 bucks a month for,
19:00 or that will lift their creative work
19:02 and remix it for other people.
19:04 And the thing that makes it feel like a taking
19:06 is very little value accrues back to them.
19:08 And that's really the thing I'm asking about,
19:10 is how do you bring value back to them?
19:11 How do you bring incentives back to the small creator,
19:14 the independent business,
19:16 that's saying, "Look, this feels like a taking."
19:17 - I mean, look, the whole reason we have spent,
19:22 I think we've been successful on platforms like YouTube
19:25 is we've worked hard to answer this question well.
19:28 And so you'll continue to see us dig deep
19:30 about how to do this well.
19:32 And I think the players who end up doing better here
19:37 will have more winning strategies over time.
19:39 I genuinely believe that.
19:40 And I think across everything we do,
19:44 we have to sort that out.
19:47 Anytime you're running a platform,
19:49 I think it's the basis on which
19:51 you can build a sustainable long-term platform.
19:53 So I view through this AI moment,
19:56 over time, there'll be players
19:58 who will do better by the content creators,
20:01 which support their platforms,
20:03 and whoever does it better will emerge as the winners.
20:06 I think that, I believe that to be a tenet, right?
20:09 And we can work in these things over time.
20:11 - Yeah, one thing that I think is really interesting
20:13 about the YouTube comparison in particular,
20:15 it's been described to me many times
20:17 that YouTube is a licensing business, right?
20:19 You license a lot of content from the creators,
20:20 you obviously pay them back
20:22 in terms of the advertising model there.
20:23 The music industry has a huge licensing business
20:27 with YouTube.
20:27 It is an existential relationship, I think, for both sides.
20:31 Susan Wojcicki used to describe YouTube as a music service,
20:34 which I think confused everyone
20:35 until you looked at the data.
20:36 - I think that's a really interesting point.
20:58 - I look at other players and how they've approached it.
21:01 - You mean you're talking about OpenAI,
21:02 which is just out there taking stuff, right?
21:04 - Look, in general, when you look at how we have approached
21:07 search generative experience, even through a moment like
21:10 this, the time we have taken to test, iterate,
21:14 prioritize approaches, and the way we have done it
21:17 over the years, I would say, I definitely disagree
21:22 with the notion we don't listen, right?
21:24 So we deeply care, we listen.
21:28 Not everything we do, people may agree,
21:30 when you're running an ecosystem,
21:31 you are balancing across different needs,
21:36 but I would definitely think, and I hope we always do,
21:40 because I think that's the essence of what makes
21:42 a product successful.
21:43 - Yeah, let me talk about the other side of this.
21:44 So there's search and people are going to game search
21:46 and that's always going to happen,
21:47 and that's a chicken and egg problem.
21:50 The other thing I see happening is the web is being
21:52 flooded with AI content.
21:53 There was an example a few months ago,
21:55 some unsavory SEO character said,
21:57 "Here's this thing I just did.
21:59 "I stole a bunch of traffic from a competitor.
22:00 "I copied their site map, I fed it into an AI,
22:04 "and had it generate me copy for a website
22:06 "that matched their site map.
22:07 "I put up this website, I stole a bunch of traffic
22:09 "from that website, my competitor."
22:11 I think that's a bad outcome.
22:12 I don't think we want to incentivize that
22:13 in any way, shape, or form.
22:15 That's going to happen at scale, right?
22:17 And more and more of the internet that we experience
22:19 will be synthetic in some important way.
22:22 How do you, on the one hand, build the systems
22:25 that create the synthetic content for people,
22:27 and on the other hand, rank it so that you're only
22:29 getting the best stuff?
22:30 Because at some point, the defining line for a lot
22:32 of people is, "I want stuff made by a human
22:35 "and not stuff made by an AI."
22:37 - I think there are multiple parts to your question, right?
22:40 So one, how do we sift through high quality
22:44 from low quality?
22:45 I'm like, I literally view that as our mission statement.
22:47 It's, you know, and it is what has defined search
22:50 over many, many years.
22:53 I actually think people underestimate it.
22:56 You know, anytime you have these disruptive
22:59 platform shifts, you know, you're going to go
23:02 through a phase like this.
23:04 I have seen the teams invest so much.
23:07 Our entire search quality teams, you know,
23:09 have been spending the last year, right, gearing up
23:14 our ranking systems, et cetera, to better get at
23:19 what is high quality content.
23:20 I think in the next decade, people who can do
23:24 that better, who can sift through that,
23:26 will win out.
23:29 I think you're right in your assessment that,
23:31 you know, people will value human created experiences.
23:35 I hope the data bears that out.
23:37 And, you know, we have to be careful every time
23:40 there's a new technology.
23:41 There are old filmmakers, if you go and talk about CGI
23:44 and films, they're going to react very emotionally, right?
23:47 And there are still esteemed filmmakers
23:49 who never use CGI in films.
23:53 But then there are people who use it
23:56 and produce great films, right?
23:59 And so I think you can't just say anything with AI.
24:04 You know, you may be using AI to lay out, you know,
24:07 enhance video effects in your video, et cetera.
24:10 But I agree with you.
24:12 I think using AI to mass-produce content
24:17 without adding any value, et cetera,
24:19 I don't think is what users are looking for.
24:21 Yeah. Right.
24:22 But there is a big continuum and, you know,
24:24 over time, users are adapting.
24:27 We are trying hard to make sure we do it
24:32 in a responsible way, but also listening
24:34 to what users actually find as high quality versus not.
24:38 Yeah.
24:39 And trying to get that balance right, right?
24:40 And that continuum will look different
24:43 a few years out than it is today.
24:46 But I think it's, I view it as the essence
24:48 of what search quality is.
24:50 And do I feel confident we will be able
24:54 to approach it better than others?
24:55 Yes. Right.
24:57 And I think that's what defines the work we do.
24:59 For the listener, there have been a lot of subtle shots
25:01 at OpenAI today.
25:02 Can I put this into practice by showing you a search?
25:05 I actually just did this search.
25:06 It is a search for best Chromebook
25:10 As you know, I once bought my mother a Chromebook Pixel.
25:13 I don't believe it.
25:14 It's one of my favorite tech purchases of all time.
25:16 So this is a search for best Chromebook.
25:17 I'm gonna hit generate at the top.
25:18 It's gonna generate the answer.
25:19 And then I'm gonna do something terrifying,
25:21 which is I'm gonna hand my phone to the CEO of Google.
25:23 This is my personal phone.
25:24 Yeah.
25:25 Don't dig through it.
25:26 So you look at that and, you know,
25:28 it's the same generation that I've seen earlier.
25:30 I asked for best Chromebook and it says,
25:31 here's some stuff you might think of.
25:32 And then you scroll and it's some Chromebooks,
25:35 it doesn't say whether they're the best Chromebooks.
25:37 And then it's a bunch of headlines.
25:39 Some of it's from like verge headlines.
25:40 It's like, here's some best Chromebooks.
25:42 That feels like the exact kind of thing
25:46 that an AI generated search could answer in a better way.
25:49 Like, do you think that's a good experience today?
25:50 Is that a way point or is that the destination?
25:53 - I think, look, you're showing me a query
25:54 in which we didn't automatically generate the AI overview.
25:57 - Well, there was a button that said, do you wanna do it?
25:58 - But that's, let me push back, right?
26:00 There's an important differentiation, right?
26:01 There's a reason we are giving a view
26:06 without the generated AI overview.
26:08 And as a user, you're initiating an action, right?
26:11 So we are respecting the user intent there.
26:14 And when I scroll it, I see Chromebooks.
26:16 I also see a whole set of links, which I can go,
26:20 which tell me all the ways you can think about Chromebooks.
26:23 - Yeah.
26:24 - So I see a lot of links.
26:26 So we both didn't show an AI overview in this case.
26:30 As a user, you're generating the follow-up question.
26:35 - I think it's right that we respect the user intent.
26:38 - Yeah.
26:38 - If you don't do that, right,
26:39 people will go somewhere else too, right?
26:41 I think so, you know, so.
26:43 - But I'm saying the answer to the question,
26:45 I did not write what is the best Chromebook.
26:47 I just wrote best Chromebook.
26:48 The answer, a thing that identifies itself as an answer
26:52 is not on that page.
26:53 And the leap from me having to push the button
26:55 to Google pushing the button for me,
26:57 and then saying what it believes to be the answer,
26:59 is very small.
27:01 And I'm wondering if you think a page like that today
27:04 that is the destination of the search experience,
27:07 or if this is a waypoint and you can see a future,
27:09 better version of that experience.
27:10 - Oh, I'll give you your phone back.
27:13 I'm tempted to check email right now out of habit.
27:17 Look, but you know, I think the direction
27:20 of how these things will go,
27:22 you know, it's really tough to predict, you know,
27:25 we are, you know, users keep evolving, right?
27:28 It's a more dynamic moment than ever.
27:33 We are testing all of this, right?
27:35 And like, you know, and this is a case
27:37 where we didn't trigger the AI overview
27:40 because we felt like our AI overview
27:43 is not necessarily the first experience
27:44 we want to provide for that query
27:46 because what's underlying is maybe a better first look
27:49 for the user, right?
27:50 And those are all quality trade-offs we are making.
27:53 But if the user is asking for a summary, right?
27:56 We are summarizing and giving links.
27:59 I think that seems like a reasonable direction to me.
28:01 - Yeah.
28:02 Can I show you, I'll show you another one.
28:04 I'll show you another one where it did expand automatically.
28:06 This one I only have screenshots for.
28:08 So this is Dave Lee from Bloomberg did a search.
28:10 He got an AI overview and he just searched
28:12 for JetBlue Mint Lounge SFO.
28:15 And it just says the answer, which I think is fine.
28:18 And that's the answer.
28:19 If you swipe one over, I cannot believe
28:20 I'm letting the CEO of Google swipe on my camera roll.
28:22 But if you swipe one over, you see where it pulled from.
28:25 You see the site it pulled from.
28:27 It is a word for word rewrite of that site.
28:30 This is the thing I'm getting at, right?
28:32 - Sorry, you're saying?
28:34 - The AI generated preview of that answer.
28:37 If you just look at where it came from,
28:39 it is almost the same sentence that exists
28:42 on the site, on the source of it.
28:44 That's what I mean.
28:45 At some point, the better experience
28:47 is the AI overview.
28:49 And it's just the thing that exists
28:50 on all the sites underneath it.
28:52 It's the same information.
28:53 - In my experience, and that's not what users...
28:57 Look, the thing with search,
28:59 we handle billions of queries.
29:02 You can absolutely find a query and hand it to me and say,
29:05 "Could we have done better on that query?"
29:08 Yes, for sure.
29:10 But when I look across, in many cases,
29:13 part of what is making people respond positively
29:16 to AI overviews is the summary we are providing
29:19 clearly adds value, helps them look at things
29:22 they may not have otherwise thought about.
29:25 If you are adding value at that level,
29:28 I think people notice it over time.
29:30 And I think that's the bar you're trying to meet.
29:33 Our data would show over 25 years,
29:39 if you aren't doing something which users find valuable
29:41 or enjoyable, they let us know right away.
29:45 Over and over again, we see that.
29:46 And through this transition, everything is the opposite.
29:52 It's one of the biggest quality improvements
29:55 we are driving in our product.
29:57 People are valuing this experience.
30:00 So, I think I would place a lot of –
30:03 I think there's a general presumption
30:05 that people don't know what they are doing,
30:07 which I disagree with strongly.
30:09 Like, people who use Google are savvy.
30:12 They understand.
30:14 And so, to me, I can give plenty of examples
30:19 where I've used AI overviews as a user.
30:22 I'm like, "Oh, this is giving context.
30:23 Oh, maybe there are these dimensions
30:25 I didn't even think of in my original query."
30:27 How do I expand upon it and look at it?
30:30 Yeah.
30:31 You've made oblique mention to OpenAI a few times, I think.
30:35 I actually haven't.
30:37 You keep saying "others."
30:38 There's one other big competitor
30:39 that is, I think, a little more...
30:41 I mean, you're putting words in my mouth, but that's okay.
30:44 Okay.
30:45 I would say I saw OpenAI's demo the other day
30:49 of GPT-4.0 Omni.
30:53 It looked a lot like the demos you gave at I/O,
30:55 this idea of multimodal search,
30:56 the idea that you have this character you can talk to.
30:59 You had gems, which are the same kind of idea.
31:01 It feels like there's a race
31:03 to get to kind of the same outcome
31:06 for a search-like experience or an agent-like experience.
31:09 Do you feel the pressure from that competition?
31:11 Well, I mean, this is no different from Siri and Alexa.
31:15 And we worked in the industry.
31:16 I think when you're working in the technology industry,
31:19 I think there is relentless innovation, right?
31:23 We felt a few years ago,
31:26 all of us building voice assistants,
31:28 you could have asked the same version of this question,
31:30 right?
31:31 And what was Alexa trying to do?
31:33 And what was Siri trying to do?
31:35 So I think it's a natural extension of that.
31:38 I think you have a new technology now,
31:40 and it's evolving rapidly.
31:43 Do I feel, you know,
31:45 I felt like it was a good week for technology.
31:47 There was a lot of innovation,
31:48 I felt on Monday and Tuesday and so on.
31:50 That's how I feel.
31:52 And I think it's going to be that way for a while.
31:55 For a while.
31:56 I'd rather have it that way.
31:58 You know, you'd rather be in a place
31:59 where the underlying technology is evolving,
32:03 which means you can radically improve your experiences,
32:05 which you're putting out.
32:07 I'd rather have that anytime than a static phase
32:10 in which you feel like, you know,
32:12 you're not able to move forward fast.
32:15 I think a lot of us have had this vision
32:16 for what a powerful assistant can be.
32:19 But we were held back by the underlying technology,
32:22 not being able to, you know, serve that goal.
32:26 I think we have a technology
32:27 which is better able to serve that.
32:28 That's why you're seeing the progress again.
32:31 So I think that's exciting.
32:32 To me, I look at it and say,
32:33 we can actually make Google Assistant a whole lot better.
32:36 You're seeing visions of that with Project Astra, right?
32:39 It's, you know, it's incredibly magical to me when I use it.
32:42 So, you know, I'm very excited by it.
32:46 - Yeah, and this brings me back
32:47 to the first question I asked, right?
32:49 Language versus intelligence.
32:52 To make these products,
32:53 I think you need a core level of intelligence.
32:56 Do you have in your head a measure
32:59 of this is when it's going to be good enough,
33:00 where I can trust this?
33:02 On all of your demo slides
33:03 and all of OpenAI's demo slides,
33:05 there's a disclaimer that says, "Check this info."
33:08 And to me, it's ready when you don't need that anymore.
33:11 Right, you didn't have,
33:11 "Check this info" at the bottom of the 10 blue links.
33:14 You don't have, "Check this info"
33:16 at the bottom of featured snippets necessarily.
33:18 - You're getting at a deeper point
33:20 where hallucination is still an unsolved problem, right?
33:23 - Yep. - I know.
33:24 In some ways, it's an inherent feature.
33:26 It's what makes these models very creative, right?
33:30 You know, it's why it can immediately write a poem
33:33 about Thomas Jefferson in the style of Nilay.
33:37 It can do that, right?
33:37 It's incredibly creative.
33:39 But, you know, LLMs aren't necessarily the best approach
33:44 to always get at factuality, right?
33:48 And which is part of why I feel excited about Search
33:52 because in Search, we are bringing LLMs in a way,
33:55 but we are grounding it with all the work we do in Search
33:59 and layering it with enough context, I think,
34:01 I think we can deliver a better experience
34:05 from that perspective.
34:06 But I think the reason you're seeing those disclaimers
34:09 is because of the inherent nature, right?
34:11 There are still times it's going to get it wrong.
34:13 - Yeah.
34:14 - But I don't think I would look at that
34:16 and underestimate how useful it can be at the same time.
34:20 I think that would be a wrong way to think about it.
34:23 - Yeah.
34:23 - Google Lens is a good example, right?
34:25 When we did Google Lens first, when we put it out,
34:28 it would get, you know, it didn't recognize all objects well,
34:32 but the curve year on year has been pretty dramatic
34:35 and users are using it more and more.
34:37 We've had billions of queries now with Google Lens.
34:41 It's because, you know, the underlying image recognition
34:44 paired with our knowledge entity understanding
34:47 has dramatically expanded over time.
34:50 So I would view it as a continuum, right?
34:53 And I think, again, I go back to this saying,
34:56 users vote with their feet, right?
35:00 Fewer people used Lens in the first year.
35:03 We also didn't put it everywhere
35:04 because we realized the limitations of the product.
35:08 - When you talk to the DeepMind Google Brain team,
35:11 is there on the roadmap a solution
35:13 to the hallucination problem?
35:15 - It's Google DeepMind, you know,
35:16 but are we making progress?
35:20 Yes, we are.
35:21 We have definitely made progress, you know,
35:24 when we look at metrics on factuality year on year.
35:27 So we are all making it better, but it's not solved.
35:32 Are there interesting ideas and approaches
35:35 which they are working on?
35:36 Yes.
35:37 You know, yes, but time will tell, right?
35:40 But I would view it as LLMs are an aspect of AI, right?
35:45 You know, we are working on AI in a much broader way,
35:49 but it's an area where I think we are all working
35:53 definitely to drive more progress.
35:55 - All right, last question.
35:57 I think it's the theme of this conversation.
35:59 Five years from now, this technology,
36:00 we'll be through the paradigm shift, it feels like.
36:04 What does the best version of the web look like
36:07 for you five years from now?
36:09 - I hope the web is much richer in terms of modality.
36:14 I think today, I feel like the way humans
36:19 consume information, you know,
36:22 is still not fully encapsulated in the web.
36:25 Today, things exist in very different ways, right?
36:27 You have web pages, you have YouTube, et cetera,
36:29 but over time, I hope the web is much more multimodal.
36:33 It's much richer, much more interactive.
36:38 It is a lot more stateful, which it's not today.
36:42 So, you know, I view it as,
36:46 while fully acknowledging the point,
36:49 people may use AI to generate a lot of spam.
36:52 I also feel every time there's a new wave of technology,
36:55 people don't quite know how to use it.
36:57 When mobile came, everyone took web pages
37:00 and, like, shoved them into mobile applications.
37:02 Then later, people developed really native mobile applications.
37:07 So the way people use AI to actually solve new things,
37:11 new use cases, et cetera, is yet to come.
37:14 So when that happens,
37:15 I think the web will be much, much richer too.
37:17 So I think, you know, you'll be, you know,
37:21 dynamically composing a UI
37:23 in a way that makes sense for you, right?
37:27 And different people have different needs, right?
37:32 But today, you know, you're not dynamically composing that UI.
37:36 - Yeah.
37:37 - AI can help you do that over time.
37:39 You can also do it badly and wrongly
37:40 and people can use it shallowly,
37:43 but there will be entrepreneurs
37:45 who figure out an extraordinarily good way to do it.
37:48 And out of it, there'll be great new things to come.
37:52 So, yeah.
37:53 - Google creates a lot of incentives
37:54 for development on the web through search, through Chrome,
37:57 through everything that you do.
37:58 How do you make sure those incentives
38:00 are aligned toward those goals?
38:01 'Cause I think maybe the biggest thing here
38:04 is that the web ecosystem is in a moment of change
38:07 and Google has a lot of trust to build and rebuild.
38:11 How do you think about making sure
38:12 those incentives point into right goals?
38:14 - Not everything is in Google's control.
38:17 I wish I could influence how,
38:19 what is the single toughest experience
38:21 when I go to websites today?
38:23 As a user,
38:25 you have a lot of cookie dialogs to accept, et cetera.
38:28 Right?
38:29 So I would argue there are many things outside our control.
38:31 You can go poll a hundred users, right?
38:33 And like, you know,
38:34 but the incentives we would like to create,
38:36 look, I think, and there's a complex question,
38:41 which is how do you reward originality, creativity,
38:46 independent voice at whatever scale
38:49 at which you're able to do and give a chance
38:52 for that to thrive in this content ecosystem we create.
38:56 Right?
38:57 And that's what I think about.
38:58 That's what the search team thinks about.
39:01 But I think it's an important principle
39:02 and I think it'll be important for the web
39:04 and important for us as a company.
39:06 - That's great.
39:07 Well, Sundar, thank you so much for the time.
39:07 Thank you for being on "Decoder."
39:09 - Thanks.
39:09 Thanks, Nilay.
39:10 Greatly enjoyed it.
39:11 - That was great.
39:12 - I appreciate it.
39:13 (upbeat music)
