The Verge's Nilay Patel, Alex Cranz, and David Pierce discuss Telegram CEO being charged in a French criminal investigation over content moderation, Yelp suing Google for antitrust violations, a week in AI-generated nonsense, and more.
Transcript
00:00:00Hello and welcome to The Vergecast, the flagship podcast of the Boox Palma fan club.
00:00:09Again might be true, like legitimately might be true.
00:00:12I have one.
00:00:13I've bought a lot of dumb gadgets because of David recently.
00:00:21And I don't know that those are the right decisions.
00:00:25Well you can't read.
00:00:26We should say that up front.
00:00:28Famously illiterate.
00:00:29I hate reading.
00:00:30Hi, I'm your friend, Nilay.
00:00:33I love reading and I think you should do more of it.
00:00:36David Pierce is here.
00:00:37Hi.
00:00:38Alex Cranz is here.
00:00:39I'm just thinking about Reading Rainbow now.
00:00:40You know, it's fun when you have a small child, you're like, in reading again?
00:00:44That's a real thing.
00:00:45And what Max wants to read with me most is Calvin and Hobbes, which is, I love it.
00:00:50The dream.
00:00:51That rules.
00:00:52Have you ever tried to read Calvin and Hobbes out loud to another person who doesn't understand
00:00:58the jokes from the literal 1980s?
00:01:01She's like, what's a telephone?
00:01:04Calvin and Hobbes is definitely a cartoon for adults.
00:01:07Like it is not.
00:01:08The jokes in it and like the messages in it, I find like deep and important and profound
00:01:13now and I'm like very old and I don't know how to explain any of that to a child.
00:01:20Try.
00:01:21If you have a kid and you're like, here's what we can do.
00:01:23We can read comic books together.
00:01:25We were at, I already own all these books and we were at an estate sale, like down the
00:01:28street from our house and we bought the full collection again cause it was so cheap.
00:01:32And I was like, this is what money is for.
00:01:34Like this is why you make it, to buy all the Calvin and Hobbes books whenever they're available
00:01:37to you.
00:01:38And so she was excited cause we bought all these, what appeared to her to be comic books.
00:01:42I encourage you, if you have children, to just try to read any comic strip.
00:01:46You're like, first of all, that's bananas.
00:01:48And then these are like the multi layers.
00:01:50Like I'm giggling and she's like, I don't.
00:01:52Is the, is the tiger alive?
00:01:54I don't understand.
00:01:55Get her in on Cathy.
00:01:56She looks at me.
00:01:57Oh man.
00:01:58She looked at me very seriously the other day and she goes, my lion never comes alive.
00:02:07And I was like, well, you're growing up kid.
00:02:08You're in first grade now.
00:02:09Yeah.
00:02:10Work on your imagination, Max.
00:02:11How about that?
00:02:12All right.
00:02:13There's a lot going on this week.
00:02:16It's kind of a, we're two weeks out from the iPhone event.
00:02:19It's I keep joking.
00:02:20It's like summer's over.
00:02:21That's where we are.
00:02:22Yep.
00:02:24I candidly have just like end of summer, whatever.
00:02:27Like that's my brain.
00:02:29Like what if I, what if I just shut it down for a couple of weeks before we ramp up into
00:02:32gadget season?
00:02:33But there's still a lot of news, a ton.
00:02:35And all of it is like policy news.
00:03:36Like the, the, the legal systems of the world just keep generating PDFs for me to read on
00:03:41a Boox Palma.
00:02:42I mean, literally it's just everyone trying to do the work that they have to do so that
00:02:47they can have a long weekend over Labor Day.
00:02:49I'm convinced that this is the same thing that happens like right before Thanksgiving.
00:02:54It's just a bunch of people who are like, I need a minute.
00:02:57Let me, let me file all my paperwork.
00:02:59Let me get all the work done.
00:03:00Let me push publish on the thing and then get out.
00:03:03Like nothing is going to happen on Friday.
00:03:06Nothing.
00:03:07Everybody's on vacation.
00:03:08They're like, I did it already.
00:03:09Four day weekend.
00:03:10And you're listening to this on a Friday.
00:03:11So get ready.
00:03:12Yeah.
00:03:13We're going to recapitulate everyone else's week of work.
00:03:14If you've done any work today, stop it.
00:03:15I do think like the, the theory falls a little short when we consider France because they
00:03:21don't do Labor Day the same way.
00:03:23Every day in France is Labor Day.
00:03:25That's true.
00:03:26They're like, were you counting on the buses today?
00:03:27A strike.
00:03:28French people talk.
00:03:29Can I just talk about the Boox for one more second?
00:03:34So governments around the world generate PDFs and it is our job to read them, in broad
00:03:39strokes.
00:03:40That's how I think of the Verge policy team's coverage.
00:03:43David made everybody buy a Boox Palma, right?
00:03:45He came to your house and with threat of violence made you buy an E Ink Android phone.
00:03:50He just knocked on my door, but it was, it was polite, but it happened.
00:03:55The other function of The Verge besides responding to government PDFs is making people buy stupid
00:04:01gadgets.
00:04:02I'm very confident in this dynamic.
00:04:04Like this is how it should be.
00:04:06So I buy a Boox Palma and I'm like, this is how I'm going to do PDFs because so much
00:04:09of my job is reading PDFs.
00:04:10But then you need another piece of software because it's not good at anything by itself.
00:04:15Cause it's just a low-end Android phone with an E Ink display.
00:04:18So then you got to get ReadWise Reader and it feels like those two companies should just
00:04:26put it together.
00:04:27Those are the two things.
00:04:29It's not actually the Boox Palma that people want.
00:04:31It's ReadWise Reader on an E Ink screen.
00:04:33A hundred percent.
00:04:34It could not agree more.
00:04:36And, and like every time Alex and I both go through this, every time we talk to a company
00:04:41that makes e-readers, we say, when are you going to make a decent app for reading things?
00:04:44And every time we talk to a company that makes a reading app, it's like, why don't you make
00:04:47hardware?
00:04:48And at some point somebody is going to do it and it's going to be great because you're
00:04:52exactly correct.
00:04:54Didn't Amazon kind of do it?
00:04:55Right.
00:04:56This is what I'm arguing for is lock-in.
00:04:57I'm just pointing out software.
00:04:59That's the word.
00:05:00Good is so, so, so important.
00:05:02That's true.
00:05:03ReadWise Reader.
00:05:04Good software.
00:05:05There's good software out there.
00:05:06Omnivore.
00:05:07Free app.
00:05:08Good software.
00:05:09Make a, make a thing for it.
00:05:11Or, or work together with other company that's making the thing.
00:05:14Like if you buy the Boox, you would not know that ReadWise Reader exists and is the
00:05:18thing that unlocks the hardware.
00:05:20Right.
00:05:21Right.
00:05:22I feel like they could just close that loop.
00:05:23Anyway, ReadWise Reader, the thing that you do to it is you upload the PDF and it basically
00:05:26OCRs it into a format that looks good on an E-Ink screen, which is great.
00:05:31Only that takes time and I have no attention span and no patience and I was like, screw
00:05:36it.
00:05:37I'm just reading this PDF on my iPad.
00:05:38And that has happened like five times in a row.
00:05:39Yeah.
00:05:41And our features editor is constantly dealing with this because we're always getting like
00:05:43galleys of books and stuff.
00:05:46And every time I get like a PDF of a book that's about to be released that I have to
00:05:50read for whatever reason, I first try to upload it into my Kindle where it doesn't work and
00:05:55looks bad.
00:05:56Then I try to put it in ReadWise where it doesn't work and looks bad.
00:05:58And then I end up reading it like in Dropbox on my iPad.
00:06:01And I'm like, how is this the best possible solution here?
00:06:06And yet it is.
00:06:07Every time.
00:06:08So I've come this close to going from a Boox Palma to buying an iPad Mini like pilots around
00:06:14the world.
00:06:15By the way, people now just send us creep shots of pilots with iPads.
00:06:19It's the best.
00:06:20It's my favorite.
00:06:21Very unexpected outcome of hosting a podcast about technology.
00:06:26But all the pilots look great.
00:06:28Everyone looks happy with their iPad.
00:06:29Okay.
00:06:30Let's talk about some actual tech news.
00:06:31So we got to talk about Telegram.
00:06:34There's a bunch of other legal stuff that happened this week and then we got a true lightning
00:06:37round where we're going to try to get through them all to wrap this thing up.
00:06:41But we should start with Telegram, which is, I would say, not only news of the week, but
00:06:45the ongoing news of time to come.
00:06:49Because the idea that social or messaging platform owners face criminal liability for
00:06:54what happens on our platforms, it might feel new and shocking because the French government
00:06:58arrested this guy.
00:06:59But it's also been building for a very, very long time that some amount of responsibility
00:07:04for what happens on our platform will be imposed.
00:07:07And our government has wanted to do this, right?
00:07:10I mean, Congress is like, Jack Dorsey, we're going to yell at you for a while.
00:07:14This one, I think, is particularly shocking because of just the facts of what Telegram
00:07:17is, the fact that Pavel Durov, the CEO, landed in France on a private jet and the French
00:07:23authorities immediately arrested him.
00:07:25Why land your plane where the people are going to arrest you?
00:07:28His sort of unrepentant attitude about what's going on the platform.
00:07:30David, catch us up on the basics.
00:07:32Sure.
00:07:33Basically, last Sunday, Pavel Durov, the CEO of Telegram, was arrested, like you said,
00:07:39after landing the PJ in France.
00:07:42Just don't land your PJ there.
00:07:43It's an odd move.
00:07:44You're not connecting.
00:07:45Don't call it PJ.
00:07:46No.
00:07:47Just don't land the PJ.
00:07:48No.
00:07:49Is a good...
00:07:50We can make shirts that say, just don't land the PJ.
00:07:52That could work.
00:07:53Stay in the sky, Pavel.
00:07:55And all we knew at the time was that he was arrested, I believe the French authorities
00:07:59said like in conjunction with an investigation into crimes.
00:08:04It was very vague, but he was arrested and over the next several days, it came out that
00:08:10essentially what was happening is that he was being arrested based on criminal activity
00:08:15that was happening on Telegram.
00:08:18He was not, as far as I understand, being accused of doing the crimes, but of what the
00:08:22French government ended up calling complicity in the criminal activity that was happening
00:08:27on Telegram.
00:08:28He's just been charged with a bunch of things, most of which are complicity for things like
00:08:33money laundering and child abuse, child sexual abuse material, all sorts of other internet
00:08:38crimes.
00:08:39And the overarching theme seems to be, A, a lot of this bad stuff was happening in relative
00:08:45plain sight on Telegram.
00:08:47And B, Telegram not only knew it was hosting this stuff and allowed it, but actively resisted
00:08:56working with governments to help.
00:08:59I think, again, this is the very beginning of what I think is going to be a very long
00:09:03story.
00:09:04But I think in terms of the basics of what we know, I think that's kind of where we are
00:09:07at this moment.
00:09:08Yeah.
00:09:09Am I missing anything?
00:09:10No.
00:09:11Those are the basics.
00:09:12Obviously, none of us are experts in French law.
00:09:13If you are an expert in French law, hit us up, we'd love to talk to you.
00:09:19But the pieces of the puzzle that are really important here that I think we can understand
00:09:23and talk about credibly are, one, how Telegram works.
00:09:28Because a lot of people want to impute how other platforms work onto Telegram, and they're
00:09:32actually really different.
00:09:34So it's a very common misconception that Telegram is encrypted, and it is just not.
00:09:39It's not even secret chats, one-to-one chats.
00:09:43You can push a button and make them encrypted, but that is not the default.
00:09:47Telegram groups are not encrypted.
00:09:49There's nothing inherently secure about Telegram.
00:09:52I will tell you, I know a lot of activists who know, who know that there is CSAM on iMessage,
00:09:59that there is bad stuff on Signal, and the defense is these are end-to-end encrypted.
00:10:05No one can see the data except the participants.
00:10:08So Apple knows it's there.
00:10:10People have told them.
00:10:11They have actually built some tools to try to detect it and gotten yelled at for essentially
00:10:15breaking the encryption.
00:10:16We've talked about that story at length.
00:10:18But fundamentally, it can't see it.
00:10:21So it has this out.
00:10:23These companies have these outs.
00:10:25People are doing bad things on our platforms, but we can't see it, so we don't know.
00:10:30You should find other ways to catch them.
00:10:32This is a lot of the argument about end-to-end encryption is the cops want to see the bad
00:10:37stuff, and they want the companies to build them back doors to give the bad stuff to them.
00:10:42I think there's a temptation to assume Telegram is operating the same way, and it is not.
00:10:48And I think that is actually the first most important thing, is all the stuff that's happening
00:10:52out in the wide open in Telegram, in these huge channels, in these unencrypted chats,
00:10:59Telegram can see almost everything that happens unless you press secret chat, which no one
00:11:02is pressing, and it only works in one-to-one chats anyway.
00:11:05And so it is a very unusual service in the sense that it knows.
00:11:10It has the technical ability to know what is happening on the service, and lots of people
00:11:14can see what's happening on the service.
00:11:16Have y'all used Telegram a whole lot?
00:11:19In bits and pieces.
00:11:20Today felt like the wrong day to start.
00:11:22Yeah.
00:11:23No, I remember using it, I used it when I went to Taipei at one point just because it
00:11:27was easy to communicate with my friends, and when I got back, COVID hit, was living in
00:11:32New York, and so all the protests and stuff were happening.
00:11:34So there was a ton of organizing happening on Telegram.
00:11:38Most miserable experience I've ever had using a chat program.
00:11:41Just awful.
00:11:42Because it is just like, it's a chat room.
00:11:46You're now in a chat room, and you're going to go to bed and wake up to 200 new messages
00:11:50in a chat room, and you have no control over that.
00:11:54You make a good point.
00:11:55Let me just quickly explain the rough structure of Telegram, because like you were saying,
00:11:58it is actually instructive for what's happening here.
00:12:01It tells you a lot about the story, yeah.
00:12:02So there are basically three levels of Telegram, which exists, I would say, somewhere in the
00:12:08if you imagine the overlapping Venn diagram of WhatsApp and Discord and WeChat, Telegram
00:12:16is somewhere in the middle of those three things.
00:12:19So all the way at the top, you have public channels, which is basically the equivalent
00:12:24of an Instagram feed.
00:12:26You post something, lots of people can see it.
00:12:30It's essentially just a public one-way feed.
00:12:32The middle thing, which I think is probably Telegram's most, I don't know, if not most
00:12:38used and sort of most unique thing is the group chats.
00:12:41And the group chats can have, I think the number is up to 200,000 people in them, which
00:12:44is a crazy number, but is is a huge part of the reason that Telegram has been used for
00:12:50things like political organizing and for huge like government communications.
00:12:54It's because you can have that many people literally in a space together.
00:12:58It becomes abject chaos, but it can be really useful.
00:13:02And then at the bottom of that is one to one chat.
00:13:05So you can go all the way from you and I are texting each other to like one to millions
00:13:13inside of Telegram.
00:13:14And there's also like there's there's app stuff going on in there.
00:13:17There are all kinds of like plug-in things you can do inside of Telegram.
00:13:19It has some of that like WeChat operating system-y dynamic to it.
00:13:24But most of what happens-
00:13:25Truly terrible emojis.
00:13:28It's not a good looking app in the same way that WeChat is not a good looking app, but
00:13:31like it does a lot.
00:13:32Yeah.
00:13:33And so it's very useful for those reasons.
00:13:35And so the reason most people use it, and for our purposes, I think the reason it's
00:13:41particularly interesting right now because of what's going on with Pavel Durov is really
00:13:46probably the top two things, right?
00:13:48The thing where one person can communicate with a lot of people very quickly and the
00:13:52sort of giant teeming group chats of up to 200,000 people, right?
00:13:56So like there's nothing else that works quite that same way on the internet.
00:13:59And to be clear, if you have a one to 200,000 person group chat, it doesn't matter if that's
00:14:05encrypted.
00:14:06Definitionally the thing is leaking, right?
00:14:09You're not trying to keep that secret.
00:14:10Like that's not the point of talking to 200,000 people.
00:14:13So there's all these arguments for why it would and would not be encrypted.
00:14:16But I think the first thing to just be very clear on is it's not like the other platforms
00:14:21that are used for political organizing.
00:14:22It's not like the other platforms that are encrypted or claim to be encrypted.
00:14:27It is inherently this broadcast medium.
00:14:29And then the other piece, which is tremendously relevant to all of this, is that it has no
00:14:35content moderation, which is not like a technical decision.
00:14:39That is a policy decision.
00:14:40They've just decided this is going to be fine.
00:14:43It's a free for all.
00:14:44We don't care.
00:14:45We support free speech.
00:14:46And the policy decisions that flow from that get all the way to like, how do we treat governments?
00:14:50And also just simply get things like ISIS lives on Telegram.
00:14:53This is just going to be the app that ISIS uses.
00:14:55And people have known this for a long time.
00:14:56We are going to accept an enormous amount of pornography, including some of the worst
00:15:01pornography, including some child sexual abuse material, perhaps lots of child sexual abuse
00:15:05material.
00:15:06And everyone can just see it.
00:15:07And researchers can just file report after report about the prevalence of this material
00:15:11on Telegram.
00:15:12And the company is going to do nothing about it.
00:15:15And then that brings you to the actual policy decisions, which is the way they've structured
00:15:19the data on Telegram, means that if a government wants to issue a warrant or a subpoena, they
00:15:24actually have to issue like 20 in countries across the world in order to get the data
00:15:29and prove the case against the bad people.
00:15:31And Telegram is very proud of this.
00:15:32And all platforms have their transparency reports.
00:15:38In other times, they've had what's called canaries, where they have a sentence on their
00:15:44websites like, we've never done this.
00:15:46And then they silently change it, get rid of that sentence, so people know it happened.
00:15:50But even then, they haven't disclosed anything.
00:15:52Telegram's line is like, we've just never given up any data.
00:15:55Everyone can see all the bad things on our platforms.
00:15:57They want to catch the bad people.
00:15:58And we have made it so you cannot.
00:16:00You can't do it.
00:16:01Like, it's so hard to do it.
00:16:02No government has ever managed to do this.
00:16:05And I think that is the liability.
00:16:07Again, I don't really know French law, but I know how our government thinks about it.
00:16:13I know how researchers think about it.
00:16:14I know how academics think about it.
00:16:16And once you get to, you know it's happening, and you won't even help the cops stop it,
00:16:21you end up with, maybe you're responsible for it yourself.
00:16:24And that is, that's a lot of steps.
00:16:26You got to get through a lot of steps.
00:16:29I don't think there are any networks in the United States that are anywhere close to that,
00:16:33even though the usual morons are screaming about free speech on Telegram.
00:16:38That's just not how it works here.
00:16:39Like, no one is that stupid.
00:16:42But here, it feels like there was such an active effort to frustrate authorities from
00:16:47prosecuting just straightforwardly criminal behavior that dude landed his PJ in France
00:16:53and got arrested.
00:16:54Yeah.
00:16:55I think we've got to find out more about the case.
00:16:56Again, not an expert in French law, like there's a whole bunch of stuff, but you know the crimes
00:17:01are happening, and you won't even help us stop it.
00:17:04That's where you get that collapse of liability to, we're just going to arrest the CEO of
00:17:07the social network.
00:17:08Well, and it goes back to the questions of encryption, too, because I think you look
00:17:12at something like iMessage, right?
00:17:15Where Apple's stance, right or wrong, believe it or don't believe it, is privacy is more
00:17:20important than everything else.
00:17:22We can't see it, neither can you, right?
00:17:23Yeah, we don't even know it's happening.
00:17:25You believe it's happening, but there's no way for us to know it's happening.
00:17:28Right.
00:17:29Agree or disagree, that is a stance.
00:17:30That is a whole stance.
00:17:32What Telegram has said is the opposite, right?
00:17:34Like, all the things you're saying about, we can see it and we're doing nothing about
00:17:38it, they've, like, touted this as part of the point over the years.
00:17:42And one of the things that Pavel Durov has said many times is, like, you can't build
00:17:46a private, secure, safe place for people to talk except for terrorists.
00:17:51And like, again, a bunch of really interesting arguments behind that sentence, but it's not
00:17:55encrypted.
00:17:56And this is the thing that it comes back to, right, is it would be a different argument
00:18:00if Telegram were an encrypted app on which all of this stuff was kind of loosely known
00:18:04to be happening, but you couldn't see it.
00:18:05But, like, you can see it.
00:18:07It's just, it's just there.
00:18:09I heard that all these researchers in the last few days who are coming out and saying,
00:17:12like, the nudify AI apps are a thing that you see a lot on Telegram.
00:17:18And Will Oremus at the Washington Post wrote a great story about a researcher who was like,
00:18:22I just went and found a bunch of them just right there.
00:18:24They're just sitting there.
00:18:25It's not like a secret.
00:18:27This stuff is known.
00:18:28And so it seems like that disconnect between it's just sitting right there.
00:18:33The evidence is right in front of us, like, all I have to do is look with my eyes, and
00:18:37it's right there.
00:18:39And yet you are willingly obfuscating my ability to do anything about it.
00:18:43That is, I feel like the bridge that I've never seen another company cross this way.
00:18:46Yeah.
00:18:47Well, I think the one example is like Kim Dotcom, right, where he was kind of egregiously
00:18:51ignoring the fact that his Megaupload site was full of pirated stuff.
00:18:59And then one day the New Zealand authorities came and said, yeah, you can't
00:19:02do that anymore.
00:19:03And now he's being extradited, like on his extradition tour every couple of years, gets
00:19:08sent to a new country to be extradited.
00:19:10So I think there's like examples of this before.
00:19:14And Pavel, I look, why did this guy come to France, because this is the biggest question
00:19:19I have.
00:19:20Why land a plane in France?
00:19:21Like, did he said before, like, I'm not going to go to places where they want to arrest
00:19:25me for this stuff.
00:19:26And they hate what I'm doing.
00:19:27France, like, did he miss the Olympics?
00:19:31Maybe he thought they were just distracted, like they're coming off the Olympics.
00:19:34He's like, I'm gonna stop and get a baguette and a cigarette, you know, see how it goes.
00:19:37Two things about Kim Dotcom.
00:19:38One, we had the best headline in Verge history.
00:19:41Kim Dotcom megauploaded to the United States on copyright charges, which people thought
00:19:46was a body shaming joke and like, no, that's the name of the company.
00:19:49I'm sorry.
00:19:50It was just very funny.
00:19:52That whole sequence of events was very funny. Two, copyright law.
00:19:56The only one where we just accept the speech regulation.
00:19:59It is wild to me.
00:20:01You're like, you pirated a bunch of Disney movies, jail, the entire international community is
00:20:05going to put you in jail all together at the same time.
00:20:08Everyone's like, huh, that's not a problem.
00:20:10Elon Musk is not like, justice for Kim Dotcom, right?
00:20:14And then it's like, we're talking about ISIS and CSAM and we're having a free conversation.
00:20:19It's like, what are you talking about?
00:20:22So there's just a huge disconnect in how we perceive speech regulations, even in this
00:20:26case, where two people are facing criminal prosecution around the world for, you know,
00:20:31for violating our speech norms.
00:20:32One just happens to be Disney, like Hollywood.
00:20:35And that seems to be fine.
00:20:36So that's just weird.
00:20:38Like just on its face, Alex, I agree that there's a weirdness there.
00:20:41But the other piece of the puzzle is that even in the United States, where we have strong prohibitions
00:20:44against speech regulation in the First Amendment and things like Section 230, which insulates
00:20:51platform owners from the speech that happens on their platforms.
00:20:54We make a big exception for criminal behavior on your platforms.
00:20:58Like part of Section 230 is no effect on criminal prosecution.
00:21:03Like it's the heading.
00:21:04It's like you don't get insulated from crimes, right?
00:21:08If the people are doing crimes in your platforms and we can connect it to you, we're going
00:21:12to, you are the crimes.
00:21:13That's you.
00:21:14FOSTA and SESTA, the exception to 230 that basically made talking
00:21:22about sex work illegal on these platforms.
00:21:23It was an anti trafficking law, but it landed with basically sex workers can't do business
00:21:29on platforms anymore.
00:21:31That is a criminal.
00:21:32That's a criminal statute.
00:21:34So like, this is illegal now, you'll be responsible for this.
00:21:36You'll be liable for this, and the platforms all take it down, and most people are sort
00:21:41of fine with that.
00:21:42There's a large controversy brewing about that, that we don't have time to talk about,
00:21:45but that was the trade off that was made in that policy here.
00:21:49I think there's just a deep confusion about, A, French law.
00:21:54No, no, no.
00:21:55So A, it's French law, and B, like, how much do you have to know?
00:21:57When do you have to know it? Is Mark Zuckerberg going to go to jail because all of this stuff
00:22:01exists on Meta platforms, but they actively try to shut it down.
00:22:05They work with law enforcement.
00:22:07So is he getting in jail?
00:22:08If you do slightly less than Meta, do you go to jail?
00:22:11If you're Elon Musk, do you go to jail?
00:22:13Right?
00:22:14Like I don't think we understand the gradation.
00:22:17And then on top of that, every country on the planet right now is grappling with how
00:22:22to regulate social media and the bad things that happen on social media.
00:22:25And they're all going to come to different, wildly different answers.
00:22:29And I think all of these platform owners are kind of like, okay, this is the tip of the
00:22:32spear.
00:22:33Like this is the beginning of the end, even though it's so out of bounds, even though
00:22:38it's so far.
00:22:39Like, no one else is doing it the way Telegram is doing it.
00:22:41Do you guys consider telegram social media?
00:22:43I always considered it like a chat platform.
00:22:46What is a photo?
00:22:47Yeah.
00:22:48I basically wanted to just do that.
00:22:52What is a social media platform?
00:22:54Like it's, it's Instagram enough in the way that people use it in some ways that I think,
00:23:00I think it counts.
00:23:01Okay.
00:23:02I'm all the way at like anything where you can post.
00:23:05Yeah.
00:23:06So like, uh, and that post might go to people you don't know is a social media platform.
00:23:11Yeah.
00:23:12Right.
00:23:13And that's a very loose one, so you can find exceptions to that one all
00:23:16over the place, but that's kind of my broad definition.
00:23:20Can you post?
00:23:21I like that.
00:23:22Well, can you post and, and can the person on the other, like, do you have to know the
00:23:24person on the other end of it?
00:23:26So like by this definition, like discord is a social media platform, right?
00:23:29Twitch is a social media platform.
00:23:30Um, Twitch is, in my mind, obviously social. YouTube is a social media platform
00:23:34by this definition.
00:23:35Um, so I think that's a broad one, but this basic idea of like, if
00:23:41you run a platform where someone can talk to a lot of strangers, what responsibility do
00:23:45you as a platform owner have?
00:23:47And pretty much the only responsibility we've decided any of these companies truly have
00:23:53is making sure there's not copyright infringement, like is making sure that Hollywood and the
00:23:58music industry are protected.
00:23:59That is pretty much it.
00:24:01Right.
00:24:02And then there's a, there's a litany of, of really bad things that they've decided that
00:24:06they will keep off on their own.
00:24:08Um, and that is it.
00:24:10That's the answer that like across the world, like that's the baseline answer is copyright
00:24:13infringement.
00:24:14And then there's TikTok, which is like, what if we did it anyway?
00:24:17Yeah.
00:24:18But like, I would just point out that that always is where these arguments break down.
00:24:23If you're okay with sending Kim Dotcom to jail for copyright infringement, you
00:24:29might have to reconsider how you feel about the other stuff, or you might have to reconsider
00:24:33how you feel about copyright infringement.
00:24:34I just think this stuff is so openly horrible that actively thwarting the authorities should
00:24:41probably land you in some hot water.
00:24:44But I think the question of who, who is you in that sentence, right?
00:24:49I think is, is doing a lot of work in this question because like what, what we see most
00:24:53of the time is companies get in trouble, right?
00:24:57Like companies get fined or everybody, like the government sends things to lawyers and,
00:25:04and it's all this sort of entity to entity thing.
00:25:07And so I think the fact that it was like a dude who got arrested for it is so unusual.
00:25:13I mean, it's like if somebody did a bad tweet and Jack Dorsey got arrested, like back
00:25:18in the day. Imagine how different our lives would have been if Dorsey could
00:25:22have been arrested for bad tweets.
00:25:24This is what I mean.
00:25:25And I think we'd live in a utopia.
00:25:26I think if you did that, the jails would just be full of Jack Dorsey.
00:25:35But I think he'd be on his extradition tour.
00:25:37Yeah, exactly.
00:25:38Send him from jail to jail around the world.
00:25:41I think if you, if you cast this out and in a certain way that it goes, like, you know,
00:25:46we talk about chilling effects of laws all the time.
00:25:48How are you going to feel right now?
00:25:49If you're a CEO of a company that is dealing with any of this, you're like, Oh my God,
00:25:53suddenly not only is my company at risk, but literally I personally could go to jail, which
00:25:58I guarantee you is not something most people are thinking about.
00:26:02Yeah.
00:26:03My access to high quality wine and cheese is at risk today.
00:26:06We cannot go to France.
00:26:07I was like, who are those?
00:26:08Who are those CEOs?
00:26:09It's like Mark Zuckerberg.
00:26:10What?
00:26:11Elon Musk.
00:26:12They're fine.
00:26:13It's a lot of them.
00:26:14Right.
00:26:16I think Andy Jassy could go to jail because something bad happened on Twitch, but all
00:26:21of us.
00:26:22Yes.
00:26:23He should for the Amazon Kindle app. But like, Sundar should go to jail for what happens on YouTube
00:26:27is a pretty weird jump.
00:26:29Right.
00:26:30Um, at the same time, all of those companies actively cooperate with law enforcement all
00:26:35the time.
00:26:36For sure.
00:26:37In ways, in ways, good and bad.
00:26:38Uh, I think on our lightning round list is like Mark Zuckerberg sending a letter to Jim
00:26:42Jordan in Congress this week saying, I, we caved to the Biden administration too much.
00:26:46I think that letter is like political theater, but like, that's what I mean by active cooperation
00:26:50in the government on both sides.
00:26:52He's saying we cooperated with the Biden administration, and he's cooperating with Jim Jordan by saying
00:26:55we cooperated with the Biden administration.
00:26:57Like the, the big companies in this country at least are actively engaged with various
00:27:03governments around the world in various ways.
00:27:06And I think that insulates them from Mark Zuckerberg getting arrested in France.
00:27:11Whereas you just have this other thing that is so different, like architecturally, in
00:27:15how the app works, uh, user experience wise, and, you know, the one-to-200,000 kind
00:27:20of opportunities you have moderation wise, in that they do literally nothing with the
00:27:24worst content anyone can possibly imagine.
00:27:27And then policy wise, in that they have structured it to make it impossible for the authorities
00:27:32to even act on what they can see with their own eyes, giving them the middle finger
00:27:37in the process.
00:27:38And you just, you just like stack all that up and you're like, well, that is a lot of
00:27:41decisions that land you in French jail.
00:27:44I wonder how it's going to affect Elon because he'd really been trying to take the same path
00:27:50with X that Durov's been doing of just being like the free speech absolutionist.
00:27:54Is that how you say it?
00:27:55Oh, I, I completely disagree with that.
00:27:56I understand the argument.
00:27:57That's what he wants us all to think.
00:27:58Like, yeah, he's always like, ah, it's all about free speech and he fundamentally is
00:28:03very different in how he actually does that because he's always like, oh, you want me
00:28:07to take that down?
00:28:08But like, yeah, I mean his, his definition of free speech and we can, you know, look
00:28:13at the tweet is, uh, if people want speech regulations, they should pass laws and that's
00:28:19how democracy is supposed to work and I will follow the law and you're like, you're, that's
00:28:23the weirdest definition of free speech that exists because you're asking for government
00:28:28speech regulations and what they have followed historically has been draconian speech laws
00:28:33in other countries.
00:28:34Yeah.
00:28:35Right.
00:28:36I mean, India has draconian speech laws.
00:28:37X complies with them.
00:28:38Brazil has new draconian speech laws and he doesn't like the government, so he won't.
00:28:41It's like, it's this weird balance of like, actually I'll do whatever I want, but like
00:28:48if I don't want to get into a fight with the government or I like the government there,
00:28:51I'll do what they say, which is just not a position, right?
00:28:54That is just pure opportunism and I think the noise around Durov getting arrested in
00:29:01France is like useful for him and his ongoing quest to be perceived as a free speech martyr.
00:29:07But if you look at his posts lately, you can tell he's like in an intellectual spiral because
00:29:13his position has no center, right?
00:29:16There's no, there's actually no ideological commitment to anything.
00:29:19So if you're like, I would wish to defy the government, but I might get arrested.
00:29:22Also some of this stuff is horrible.
00:29:23Also I'm still like, I'm suing advertisers for antitrust.
00:29:27There's nothing there.
00:29:29There's literally no center, there's no intellectual center to that position and you can kind of
00:29:33see him be like, oh crap, this is what free speech might actually mean.
00:29:38I don't want to over intellectualize Elon's, you know, ketamine musings or whatever the
00:29:43fuck it is, but it's like you can see it's starting, like the pressure, welcome to hell
00:29:48is basically what I'm saying.
00:29:49Like the internal contradictions of owning a platform like X and all of the demands that
00:29:53are placed on that, I think it's starting to wear on him a little bit.
00:29:57Yeah.
00:29:58Right.
00:29:59And I think we should probably move on from this, but the thing that keeps coming up to
00:30:03me as I read these stories and talk to people about this, there's been a ton of good reporting
00:30:06on this by the way.
00:30:07Good week for the journalists out there, uh, is how much the sort of legal
00:30:14push against the tech industry is still going along the lines of, we have to protect the
00:30:18children.
00:30:19Uh, like so much of this is about the CSAM stuff that's happening on Telegram and the
00:30:24laws that have been passed recently.
00:30:26Uh, I think it was, I think it was Britain that passed a law last year, uh,
00:30:32that explicitly allows the executives of tech companies to be held personally responsible
00:30:37for CSAM stuff and child safety stuff on their platform in general, if they're told about
00:30:43it and don't do anything about it, like that's, that's now a law that exists and, and this
00:30:47idea of, uh, we have to protect children being like the most bipartisan thing you'll find
00:30:53anywhere on any line of political reasoning.
00:30:58That's how we're getting towards all of this.
00:31:00And it's just, it's just fascinating that that continues to be the thing that is like
00:31:03the deeper this goes, the more it's about child safety.
00:31:06That's all.
00:31:07That's always the argument for censorship.
00:31:08Really?
00:31:09That's always like the first line of arguments for censorship is we must think of the children.
00:31:12Well, no, it's Hollywood first and then child safety.
00:31:15Yeah.
00:31:16It's, it's never, it's never not Mickey Mouse first.
00:31:18Yeah.
00:31:19Mickey Mouse.
00:31:20And then child safety almost always.
00:31:21Like there was a great Simpsons episode about that exact thing.
00:31:23And then all of the big fights in the nineties over a TV ratings and stuff like that, where
00:31:29you had like Clinton and then all these Republicans getting together being like, won't anyone
00:31:33think of these babies?
00:31:34Right.
00:31:35You can get anything done if you can convince people it's about keeping children safe.
00:31:38Yeah.
00:31:39And I agree with you and you know, Tipper Gore put the explicit lyrics label on all the
00:31:46CDs and that was Tipper Gore.
00:31:49Very weird.
00:31:50Yeah.
00:31:51Tipper Gore versus NWA.
00:31:52Just imagine how far we've come.
00:31:55It's super weird.
00:31:57But the, the reason that you do that is if you can make the counter arguments seem morally
00:32:03indefensible, you, you win.
00:32:05Right.
00:32:06Right.
00:32:07And, but here I just want to come back to, it is really there, it, the CSAM is there,
00:32:12it's very bad.
00:32:14And so like sometimes you just have to say it's there and it's bad and you have to do
00:32:19something about it.
00:32:20Right.
00:32:21And I, that's, I, if you've listened to this show or Decoder, or read us for years,
00:32:25like the free speech debate online has raged and we have struggled with it and we've done
00:32:29endless episodes about it, but like you need an ideological center, right?
00:32:34That's what I'm saying.
00:32:35And it's really, it's there, like ISIS is there, like you don't have to overthink it.
00:32:41You just have to make it go away and hold the people responsible for it accountable.
00:32:45And I think here it's like, well, that's an easy line.
00:32:47There are lots of gray areas and lots of much harder lines.
00:32:50That one feels pretty easy to me.
00:32:51Yeah.
00:32:52All right.
00:32:53We should wrap it up.
00:32:54I promise we'll talk about doing other more fun crimes in the next section.
00:32:59Support for the Verge cast comes from Shopify.
00:33:02Now look, time spent on the internet, it's not always a waste of time.
00:33:06Sometimes you can learn a new skill.
00:33:07Maybe you're curating the finest of cat memes, or you could turn a profit by growing your
00:33:13business with Shopify.
00:33:15Shopify is the global commerce platform that helps you sell at every stage of your business.
00:33:20You could be using their all-in-one e-commerce platform or their in-person points of sale.
00:33:25Shopify helps you sell everywhere, whatever it is you're selling.
00:33:29And according to Shopify, they help you turn browsers into buyers 36% better than average
00:33:34when you compare them to other leading commerce platforms.
00:33:37They also have an AI powered tool to help make the most of your time.
00:33:41It's called Shopify Magic.
00:33:42And some of your favorite brands, they're already using Shopify for e-commerce.
00:33:47Brands like Allbirds, Rothy's, Brooklinen, and more.
00:33:50Because businesses that grow, grow with Shopify.
00:33:53And you can sign up for a $1 per month trial at Shopify.com slash Vergecast.
00:33:58That's all lowercase, Shopify.com slash Vergecast now to grow your business no matter what stage
00:34:04you're in.
00:34:05Shopify.com slash Vergecast.
00:34:10We're back with more crimes.
00:34:13Fun crimes.
00:34:14Antitrust crimes.
00:34:15Yeah, I was gonna, are these, they're less bummer crimes.
00:34:19I don't know if they're more fun crimes.
00:34:21I would say that-
00:34:22Not a lot of whimsical crimes in this episode.
00:34:25The grok AI being so out of control that you can just make Hillary Clinton do GTA is legitimately
00:34:33very funny to me.
00:34:34Okay, yeah.
00:34:35Fair enough.
00:34:36I'll give you that.
00:34:37I have a lot of deep reservations about deepfakes and what is a photo and lies on the internet.
00:34:42And then I'm like, it's also hilarious to have Elon Musk go on an ayahuasca journey
00:34:46at the end of it.
00:34:47Have his deepfake character say, I'm starting in soup kitchen, which is a real video I watched
00:34:51today.
00:34:52It's like-
00:34:53Oh, brother.
00:34:54Reproachoso, brother.
00:34:55I don't know what to tell you.
00:34:56It's all very good.
00:34:57With an Elon voice too.
00:34:58Yes.
00:34:59All of it's great.
00:35:07Look at it.
00:35:09Respect the truth.
00:35:10Photography is sacrosanct.
00:35:11I laughed.
00:35:12I did lol.
00:35:13Let's start with the big one, David.
00:35:17You said this is the story of segment two.
00:35:19It's Yelp sues Google for antitrust violations.
00:35:22So okay.
00:35:23Did you guys think that Yelp had already sued Google?
00:35:26Because I definitely thought Yelp sued Google a long time ago.
00:35:30Yeah.
00:35:31Yelp at this point is a front for suing Google, right?
00:35:33Is that not how it works?
00:35:34Basically.
00:35:35Yeah.
00:35:36I think that Yelp is just lawyers now.
00:35:37Yes.
00:35:38But anyway, so very recently, earlier this summer, a couple of weeks ago, Google lost
00:35:43its antitrust search trial.
00:35:46We're still in the remedies phase and then there will be appeals and then God only knows.
00:35:49But for now it lost.
00:35:51And so Yelp, clearly emboldened by that fact, filed a suit of its own that basically brings
00:35:58up something that got thrown out of the last case against Google, which I find really interesting.
00:36:04And essentially the argument is that Google prioritizes its own stuff in Google search
00:36:10results and thus deprioritizes Yelp and TripAdvisor and other companies like it and has thus killed
00:36:16them.
00:36:18This is an argument we've been hearing forever.
00:36:20This is an argument Yelp has been making loudly forever.
00:36:23Literally I assumed Yelp had already sued Google.
00:36:26Apparently it hadn't.
00:36:28And now what Yelp seems to be thinking, like Jeremy Stoppelman, the CEO, said in as many
00:36:34words, the winds on antitrust have shifted dramatically.
00:36:37So this company clearly senses this is the moment to finally pick the fight it's been
00:36:43wanting to pick for many years, which I think is very odd, given that that particular fight
00:36:48has been thrown out of court in the same search trial that Google just lost.
00:36:53So I'm confused.
00:36:54But also I think it's fascinating.
00:36:56And yet again, like Google is just up against it, man.
00:37:01Like the Epic stuff is happening.
00:37:03The search stuff is happening.
00:37:04This is happening.
00:37:05It's just it just keeps happening for Google.
00:37:07There's an ad tech trial.
00:37:09Yeah.
00:37:10I say that like people know it is.
00:37:11Google has yet another antitrust trial coming up over its ad tech stack, which is the money.
00:37:17Yeah.
00:37:18I mean, wasn't this like Google strategy?
00:37:20They would spend 20 years fucking around and then.
00:37:24Now they're in their find out phase, and that's why they have all the lawyers that they pay
00:37:28a lot of money to.
00:37:29Yeah.
00:37:30I mean, there's something to that. I agree with Jeremy Stoppelman, and the winds of antitrust have definitely
00:37:35changed.
00:37:36I actually can point to a very, very odd example of this.
00:37:40Luther Lowe, who is Yelp's old policy person, recently left Yelp.
00:37:45He's doing all this stuff.
00:37:46I think he's a Y Combinator now.
00:37:48And they did like a meet and greet in D.C., you know, like a little thing.
00:37:52And J.D. Vance, before he was the vice presidential candidate, showed up at the meet and greet
00:37:56with all the startups to talk about how he used to be in ad tech and he thought big tech was
00:38:00censoring everyone and the woke mob was out of control.
00:38:02And in the middle of all that nonsense, he was like, I think Lina Khan's doing a good
00:38:06job.
00:38:07Yes.
00:38:08Weird.
00:38:09Weird.
00:38:10Right.
00:38:11Like that's how much the winds of antitrust have changed.
00:38:12He's like, the only person in the Biden administration doing a good job is Lina Khan, because we
00:38:15should break up big tech.
00:38:16And really his point was like, because this woke bullshit is out of control and like maybe
00:38:20that doesn't all put together like that puzzle, you put all the pieces together and like that
00:38:24doesn't look like anything.
00:38:26But he still said it, right?
00:38:27Like that's where the mindset is, is these companies have too much control and we should
00:38:32take that control away from them and do something else.
00:38:35What the something else is, I think all of us are very curious to find out.
00:38:38It's clearly going to be something else.
00:38:41But you look at Google, which is just taking it on the chin lately, right?
00:38:46In the search case, in the Epic case they lost, and the judge in that case has
00:38:50been very loud.
00:38:51Like you did it.
00:38:52I'm going to do something about it.
00:38:54And then the ad tech case, which, you know, we don't know what's going to happen.
00:38:58Every trial is kind of a coin flip, but we're going to get documents about Google's money
00:39:03in that case.
00:39:04Like it's going to, some stuff is going to come out because it's going to be the business
00:39:06side of Google talking about the money.
00:39:09And I think Google's image is very cuddly, as though the money just appears from nowhere,
00:39:13and that's not true.
00:39:15And that's also the trial.
00:39:16Let's not forget that Google wrote a, just without any questions, just wrote a check
00:39:20for the maximum.
00:39:21Oh my God.
00:39:22I forgot about this.
00:39:23To avoid having a jury trial.
00:39:24Like innocent, you know, innocent until proven guilty.
00:39:28But that's a tough look.
00:39:29Can we just sit on that for one second?
00:39:30This is a real thing.
00:39:31We have a picture of the check on the website.
00:39:33I believe they just wrote a check to the department of justice for the maximum, like a cashier's
00:39:39check.
00:39:40I believe it's drawn on Wells Fargo and they're like, here's all of the money we could possibly
00:39:45owe you.
00:39:46We would like to settle the case.
00:39:47Like out of the blue.
00:39:48Yeah.
00:39:49It's incredible.
00:39:50Just to not have a jury trial.
00:39:52And I mean, it's at Wells Fargo, so it'll take about what, 12 years before it gets cashed.
00:39:59The blockchain solves this, Alex.
00:40:01Sorry.
00:40:02Sorry.
00:40:03Have you heard of lightning?
00:40:04Okay.
00:40:05So back to Yelp.
00:40:06I think the winds having changed is an opportunity, maybe not to, you know, win like Epic,
00:40:13right?
00:40:14They don't have a Fortnite in the background to fund all this stuff, but I think maybe
00:40:19to claw a settlement out of Google, to somehow put some pressure on Google to open
00:40:23up and make this go away, to appease the Europeans in some way, whatever you think of Google's
00:40:28relationship to the internet, it is changing.
00:40:30And I think that as we've said a million times, that means the internet is going to change.
00:40:33So you think Yelp wants to be to Google what like the Delta emulator folks have been to
00:40:39Apple, which is just, just sort of the wedge in there, just like prying it open slightly.
00:40:46And they're in a position of weakness and going to have to start to make policy changes
00:40:50that are good for you, but also good for everybody.
00:40:52And so Yelp is like, we don't, we don't want to break up Google.
00:40:54We just, we just think this is a chance to like extract some of what we want from the
00:40:59company because they're going to decide it's easier to do that.
00:41:01It's kind of necessary for Yelp at this point, because they've been struggling for a while,
00:41:07right?
00:41:08And they have resorted to basically bullying restaurants and stuff and making sure that
00:41:13they're on their, their platform and everything.
00:41:15Here's what we can do.
00:41:16We can shake down millions of local restaurants or we can get a check from Google, which is
00:41:20Yeah.
00:41:21I mean, it kind of feels like what it is for Yelp.
00:41:24There's so many companies you can describe with that sentence.
00:41:27Good God.
00:41:28Well, so there was somebody on Threads who posted that I've been more cantankerous
00:41:33lately.
00:41:34Can I, can I just describe my Joker moment broadly?
00:41:39Because I love that these companies are having these fights.
00:41:40I love that the internet's going to be potentially more open.
00:41:42I love that at least we're in a season of change.
00:41:44All that's great.
00:41:45I do not for one second believe any of this is truly idealistic, right?
00:41:49I think this is a version of capitalism at play where a lot of very self-interested parties
00:41:55are looking out for their own self-interest and they're colliding in weird ways.
00:41:58Great.
00:41:59The reason I say it that way is because a long time ago I used to cover net neutrality
00:42:03like every other day.
00:42:04And the big player in net neutrality was Netflix.
00:42:08If you will recall, Netflix and Reddit were like out in front, like black out the internet,
00:42:15fight for net neutrality.
00:42:16Don't let Comcast throttle us. Disclosure: Comcast is an investor in this company.
00:42:19And boy, did they not like my net neutrality coverage.
00:42:23Just putting it all out there.
00:42:25And then one year, Reed Hastings was on stage at the Code Conference, I believe with Peter Kafka,
00:42:30and Peter said, you've been really quiet about net neutrality lately.
00:42:32And Reed Hastings looked at him and said, yeah, we're so big it doesn't matter anymore.
00:42:36And I turned into the Joker.
00:42:37That's honest.
00:42:38Wow.
00:42:39And like, fine.
00:42:40Right?
00:42:41Like, fine.
00:42:42And so I like the idea that Yelp is an idealistic freedom fighter, or even Delta, even though
00:42:48it's a smaller company, I do think they are pressing the regulatory advantage to open
00:42:51up these platforms.
00:42:52Oh, yeah.
00:42:53But I don't think that they are. Like, they're going to get theirs.
00:42:56Right?
00:42:57And then all bets are off.
00:42:58And I think all of this change is like, who's going to get theirs at the end of this?
00:43:02And so like, we're going to shake down a bunch of restaurants or shake down Google is kind
00:43:05of the best, in my mind, like the best way to understand, like, everyone's trying to
00:43:09get paid.
00:43:10Whether or not the legal system can shift the money around in a more equitable way at
00:43:13the end of all this, totally remains to be seen.
00:43:16I truly do not know.
00:43:18In a funny way, what you're arguing for is capitalism working as intended.
00:43:21Yeah.
00:43:22Everyone fighting for money is what it should be.
00:43:25What we've landed on is nobody can fight for money because Google took it all.
00:43:28Yeah.
00:43:29That's true.
00:43:30Capitalism has never been tried, David.
00:43:32I think that's what I'm trying to tell you.
00:43:34Yeah.
00:43:35What if it was just a street fight for money?
00:43:37Let's find out.
00:43:38Is postmodern capitalism just Google opening its wallet?
00:43:42Yeah.
00:43:43You have, like, sorry, you've come all the way around.
00:43:46That's the horseshoe theory of internet capitalism: Google is the planned economy director
00:43:49just handing money to people.
00:43:53I'm not saying that's not what everybody wants.
00:43:55Okay.
00:43:56So that's Google.
00:43:57We'll see what happens with Yelp and Google.
00:43:58But you can see, like, as we keep saying, whatever happens to Google is the internet.
00:44:02You can just see the internet sort of fracturing around that already.
00:44:06This other one I think is really interesting.
00:44:09It's a weird ruling.
00:44:10TikTok has to face a lawsuit for recommending the blackout challenge.
00:44:15And this is a very, very sad story.
00:44:16There is a 10-year-old who died doing the blackout challenge.
00:44:22Parents sued TikTok.
00:44:24One of those cases where it's like TikTok didn't make the blackout challenge videos.
00:44:32They're not telling you to do it.
00:44:33Right.
00:44:34The people are seeing the content on the platform.
00:44:35They're taking the action. A horrible, extremely depressing thing happened.
00:44:41And now we're going to try to hold TikTok liable.
00:44:42And throughout most of this, we are not holding platforms liable for what the recommendation
00:44:48algorithms do, except we just had a Supreme Court ruling which said, maybe we should like
00:44:54more or less.
00:44:55Maybe we should.
00:44:56So one of the weirdest things about this case is that it's it's like a it's like a puzzle
00:45:01the Supreme Court put itself in.
00:45:03So there were content moderation laws passed in Texas and Florida.
00:45:07They went up to the Supreme Court, the Supreme Court basically said, no, these are these
00:45:11are weird.
00:45:12You're just obviously conservative states being mad at Facebook, but you didn't think
00:45:16about what would happen to other platforms, smaller platforms.
00:45:20You have to go think about this before you go back and figure it out.
00:45:23The one thing the Supreme Court was very clear about was that it is protected first party
00:45:28speech when they curate other people's content.
00:45:32Right.
00:45:33So Section 230 says you, Facebook, are not responsible for what people post on your platform,
00:45:38mostly.
00:45:40That's their speech, not your speech.
00:45:43The way you moderate that speech and present it to other people is your speech.
00:45:47So that's protected by the First Amendment.
00:45:49The government cannot set content moderation rules because the content moderation
00:45:54itself is speech.
00:45:56Right.
00:45:57Okay.
00:45:58That feels right.
00:45:59Generally, that feels right.
00:46:00That's how you get a market in platforms.
00:46:01That's how you end up with a Truth Social and an X and then whatever kindergarten subreddit
00:46:06rules you want to participate in.
00:46:07Right.
00:46:08Like, that's how you get the whole range of expression in the market and sort of decide
00:46:11how much content moderation you want.
00:46:13The court, the Third Circuit evaluating the TikTok case, says, well, quote, given the Supreme
00:46:19Court's observation that platforms engage in protected first party speech under the First
00:46:22Amendment when they curate compilations of other people's content via their expressive
00:46:25algorithms, it follows that doing so amounts to first party speech under Section 230.
00:46:32So the algorithm is your speech.
00:46:34That means you're liable for your speech.
00:46:36So if you recommend a bunch of blackout challenge videos, that's your speech.
00:46:40That's a choice you've made.
00:46:41Now you're liable for your speech and it's calling that curation, which is really fascinating.
00:46:45So this is just like a long, a long chain of reasoning that ends with, okay, some of
00:46:49this belongs to you.
00:46:50So the government can tell you what to do, but it belongs to you.
00:46:53Now other people can get mad at you for it.
00:46:56And that, I don't know how that's going to shake out.
00:46:57I don't know if TikTok is the right defendant for that.
00:46:59They're TikTok after all.
00:47:01We just talked about save the children is a theme, TikTok versus the United States government
00:47:06versus China.
00:47:07That's all in the mix there.
00:47:09Whether or not TikTok can, like, mount a challenge to this, it's kind of a huge precedent to say these
00:47:15algorithms are not only your speech and can be protected from laws and can be protected
00:47:20from government interference, but also now other people can sue you for your algorithms
00:47:24too.
00:47:25Yeah.
00:47:26I mean that like breaks the internet, right?
00:47:29Like, cause it doesn't just break TikTok, which arguably should be broken because I
00:47:33hate its algorithm right now, but it breaks, it breaks Facebook.
00:47:38It breaks Google because Google is using an algorithm to recommend search results.
00:47:42Yeah.
00:47:43"You're showing me bad stuff" is now legally actionable is like, whoa, off we go, you know?
00:47:49And like, yeah, fine.
00:47:51But we're, I don't think we're ready for that world either.
00:47:53So the other side of the coin, the non Google side of the coin also sort of up for grabs
00:47:56because of what's happening in the legal system, which just, again, say this to you, I'm trying
00:48:00to have the end of summer here.
00:48:02I'm trying to drink a pina colada and the PDFs just keep coming nonstop killing me.
00:48:07All right, let's do one more PDF and then we got to get out of here.
00:48:10The piña colada can wait.
00:48:12This one is a tough one cause it's not actually the law yet.
00:48:15And I actually don't know if Gavin Newsom is going to sign this bill, but the California
00:48:19state assembly passed an AI safety bill.
00:48:21What's going on in here, David?
00:48:23So it's called the safe and secure innovation for frontier artificial intelligence models
00:48:27act, which just rolls off the tongue.
00:48:30Does that spell anything? Or, S-S-I-F-A-I-M-A? Ssifaima?
00:48:37No, all good laws are backronyms.
00:48:41Everyone needs to remember this.
00:48:42If your law is not a backronym, it's not a good law.
00:48:45It's just, it's so important.
00:48:47Ask ChatGPT, you need to make a name, man.
00:48:49Yeah.
00:48:50Right.
00:48:51Can you like the, the insane things people have done?
00:48:53I think the people writing this bill weren't fans of AI, or they would have
00:48:58had ChatGPT do it.
00:49:00This law, they should have found a way to make it spell out artificial with all the
00:49:03letters and it would have been amazing, but alas it is mostly just called SB 1047.
00:49:10It's an AI safety bill that I would say concerns itself as we've been talking about in part
00:49:16with who is responsible for bad things that happen downstream of AI models.
00:49:21And it changes the way that companies need to test and train these things.
00:49:25It changes the way that they need to talk to the government.
00:49:27Like it's, it's kind of a sweeping, like, here's how we want to oversee and think about
00:49:33AI models.
00:49:36There's been a lot of backing and forthing. OpenAI kind of came out against it.
00:49:40Anthropic was like, we love it.
00:49:42Let's change some stuff.
00:49:44There have been a bunch of politicians who are against it, a bunch of politicians who
00:49:48are for it.
00:49:49I agree.
00:49:50It seems very debatable whether Gavin Newsom will sign this thing as it currently stands.
00:49:56But it does still feel like a moment that it passed out of the assembly the way that
00:50:02it has.
00:50:03If he signs it, does that just mean like Google, OpenAI, a lot of these companies relocate?
00:50:07Like, well, California is still a huge market.
00:50:10Yeah.
00:50:11But also, one of the things a lot of the companies have said, essentially, is, why are you making
00:50:17a law to make us do what we've already promised to do, which is essentially make sure our
00:50:22AI is not going to destroy the world.
00:50:24And I would argue that's a very funny line of reasoning to be like, we said we're going
00:50:28to be cool.
00:50:29Why are you going to make me be cool is an odd stance.
00:50:34But my sense is, and again, all of this could change, especially as the law, you know, continues
00:50:40to change.
00:50:41And depending on how it gets implemented, is that a lot of these companies are going
00:50:43to not like it, but we'll figure out a way to play along.
00:50:46I don't think this is the sort of thing that is like going to run anybody out of California.
00:50:50Yeah.
00:50:51Yeah.
00:50:52Anthropic sent a letter basically supporting it.
00:50:55And to your point, they're like, yeah, but then you read the letter and it's like, we should
00:51:00definitely do something.
00:51:01Here's just a line from Anthropic's letter.
00:51:02We believe SB 1047, particularly after recent amendments, likely presents a feasible compliance
00:51:07burden for companies like ours.
00:51:09In light of the importance of averting catastrophic misuse.
00:51:13Yeah.
00:51:14Cool.
00:51:15Yeah.
00:51:16There's a real chance we might destroy the world.
00:51:19So let's do a feasible compliance burden.
00:51:22Whoops.
00:51:23Yeah.
00:51:24I don't know how this plays, you know. Biden had the AI executive order that just came out.
00:51:29A bunch of companies said they were going to join the AI model board that make sure
00:51:34there's safety testing and releases results.
00:51:37This one feels like in particular, Gavin Newsom has to make a decision and then that
00:51:44will decide whether a bunch of similar state laws get passed all around the country.
00:51:49Because once California does it, it's like, fine, like New York will do it tomorrow.
00:51:53Right.
00:51:54Right.
00:51:55Because now you're not making them do anything.
00:51:56You're just like off to the races.
00:51:58And then Alex, your point, the question is like whether Texas does or doesn't do it,
00:52:02and it doesn't matter because all the researchers are in California anyway, right?
00:52:06And you have to sell to the California market and then there you go.
00:52:09So I'm very confused about this one in the sense that what you really need is a federal
00:52:15law.
00:52:16And that doesn't seem likely and it's also election season and who knows what's going
00:52:21to happen.
00:52:22But it also seems like Newsom kind of doesn't want to sign it, like he hasn't said anything.
00:52:26Like usually when you're the governor of the big state with the first in the nation AI
00:52:31safety bill, you're like pounding the pavement.
00:52:33And here he literally has not said anything to anyone.
00:52:35It affects who gives him money for his presidential campaign in four years.
00:52:40Yeah.
00:52:41I mean, yeah.
00:52:42I mean, I also think like we're in a moment right now where all of the money on planet
00:52:49Earth is being thrown at AI.
00:52:52And there are a lot of very powerful, very rich people who will be happy to yell about
00:52:57you being anti-progress if you sign a bill that does anything perceived to be slowing
00:53:02it down.
00:53:03And I think it was OpenAI that was like, we need a federally driven set of AI policies
00:53:08rather than, you know, just a bunch of different state laws about how this and that's what
00:53:12will really foster innovation.
00:53:13And this is like the same thing everybody says about everything, which is like we would
00:53:17so so love to have regulation as long as it's like the perfect regulation and doesn't slow
00:53:24anyone down or cost us any money.
00:53:26But it would be like so good.
00:53:27Like Mark Zuckerberg did this, the crypto industry has been doing this forever.
00:53:31Nobody wants there to be rules until there are rules.
00:53:34And then everybody's like, well, what if there was just like a simpler set of rules that
00:53:36I could just like, yeah, like, that'd be fine.
00:53:38By the way, in case you're wondering what SB 1047 says, you have to make it possible
00:53:43to quickly and fully shut the model down.
00:53:45You have to write.
00:53:46You just need to unplug the terminator.
00:53:48That seems fine.
00:53:49Yeah.
00:53:50You have to ensure the model is protected against unsafe post-training modifications,
00:53:54which is hard.
00:53:55And then you have to have a testing procedure to see whether a model or its derivatives
00:53:59are especially at risk of causing critical harm.
00:54:02So this is a lot like some of that is pretty fuzzy.
00:54:04Some of it is like you just have to be able to unplug it.
00:54:06I was like, those last two are super fuzzy.
00:54:08Yeah.
00:54:09They're like, how do you prove this isn't going to destroy the world is actually a really
00:54:12hard question.
00:54:13What is critical?
00:54:14I would argue that if you're launching a product and you have to ask that question, we should
00:54:19ask that question a lot, right?
00:54:21Like Ford doesn't come out with a 2025 Bronco Sport and immediately be like, will this destroy
00:54:27the world?
00:54:28Did you watch Oppenheimer?
00:54:29I'm just saying.
00:54:30He said the probability is near zero.
00:54:31There's so few products in the world where we're like, what, is this going to destroy
00:54:37the world?
00:54:38If you are really crushing as a project manager, you're not going to ask those questions.
00:54:41You're going to be happier hitting deadlines.
00:54:43You got user stories to collect.
00:54:44You don't have time to worry about that.
00:54:47I'm just like, you know, the TikToker is doing like bottled water.
00:54:50It's not like...
00:54:51All right.
00:54:52By the way, there's just a reminder for me, mental note, there's like two or three TikTok
00:54:58gadget companies.
00:54:59I think we should write profiles.
00:55:01I won't tell you who they are, but if you have thoughts, email us at The Verge.
00:55:05There's one in particular.
00:55:06I'm like, where did they come from?
00:55:07Are they the next anchor?
00:55:09Just a little mystery drop.
00:55:11All right.
00:55:12We got to take a break.
00:55:13We're going to come back.
00:55:14Lightning Round Part Two.
00:55:15The fun one.
00:55:16Huh?
00:55:17The fun one.
00:55:18Unsponsored.
00:55:20All right, we're back.
00:55:26In the break, I disclosed to David what company I was thinking of and Liam immediately said
00:55:31he bought something.
00:55:32And now David is writing a profile.
00:55:33It's happening.
00:55:34It's going to happen.
00:55:35I'm not telling you who it is.
00:55:36Just immediate.
00:55:37It was incredible.
00:55:38If you can guess, if you send us an email and you guess, maybe something good will happen
00:55:42to you.
00:55:43We'll ship you whatever Liam bought from them.
00:55:45No.
00:55:46All right.
00:55:47Okay.
00:55:48We're doing it this way.
00:55:49We're doing true Lightning Round, unsponsored.
00:55:52No one can tell us what to do because they haven't paid us any money.
00:55:56What I'm suggesting here is you could pay us money.
00:55:58I don't know how that works and I probably still won't do what you say, but live a life
00:56:04of possibility.
00:56:05True Lightning Round.
00:56:06I had one Vergecast listener slash startup executive, who I will not name, who asked
00:56:11me very seriously recently if we actually want people to sponsor the Lightning Round
00:56:16or if we enjoy this bit so much that we actually don't want people.
00:56:19And I was like, no, to be very clear, I like money more than I like bits.
00:56:22We will take, please sponsor the Lightning Round.
00:56:25Can I do an aside about the influencer?
00:56:27I've been thinking a lot about journalists, influencers, the whole thing.
00:56:31Here's the problem that we have.
00:56:33When you give other people money, they do what you say.
00:56:38People come to our sales team and are like, well, they'll do what we say.
00:56:40And we're kind of like, no, just like flatly no.
00:56:44There's a real problem with journalism as a whole right now, fighting an uphill fight.
00:56:48But there's always a chance, huh?
00:56:52I do think our enthusiasm for a sponsor of the Lightning Round might make it the greatest
00:56:57ROI in the history of the Verge.
00:57:00We will talk about whoever that sponsor is forever, unless they suck, in which case we
00:57:05will see.
00:57:06You never know.
00:57:07Life's a game of chance.
00:57:09Roll the dice, people.
00:57:10Sponsor the Lightning Round.
00:57:11All right.
00:57:12True Lightning Round.
00:57:13I'm going to read a headline.
00:57:14We're going to get back to it.
00:57:15We're going to move on.
00:57:16Right?
00:57:17This is the idea?
00:57:18Yeah.
00:57:19Okay.
00:57:20Here we go.
00:57:21Google Gemini will let you create AI-generated people again.
00:57:22Remember the diverse Nazis?
00:57:23No.
00:57:24Yeah.
00:57:25Those were the days.
00:57:26We're back.
00:57:27Google says they fixed it.
00:57:28Only white Nazis now.
00:57:29Wait.
00:57:30You all talked at length about what is a photo last week, with Reimagine on the Pixel?
00:57:33Yeah.
00:57:34I had a bit of a freak out that I still am getting a lot of notes from people about,
00:57:38and I feel very good about as time goes on.
00:57:41Yeah.
00:57:42By the way, yes, we're going to defend the sanctity of photography here at The Verge.
00:57:47I feel nothing about this.
00:57:49My only question is, it feels like Google's attitude about this has slightly shifted.
00:57:53Right?
00:57:54Like, the diverse Nazis were a thing, and they killed it because now they're like ... Actually,
00:57:58they killed it because a bunch of people were like, we should be able to make diverse Nazis.
00:58:02Wouldn't it be weird?
00:58:03And they took it down because they're like, we don't want to ... I don't know how Gemini
00:58:07is going to work now, but reimagine, they're just like, yeah, let it ride.
00:58:10Right?
00:58:12I guess that's their attitude towards it.
00:58:13Yep.
00:58:14Well, it feels like something has shifted there, and I'm not sure what with Google.
00:58:17Was it six months ago that the Gemini thing first happened?
00:58:20Yeah.
00:58:21It was earlier this year.
00:58:22Yeah.
00:58:23Wild how much this has shifted in six months, because I totally agree with you.
00:58:27There's more safety rules there.
00:58:29Reimagine is not supposed to add a bunch of drugs to ... I don't think that's what they
00:58:33want.
00:58:34But you can't add people at all.
00:58:36You can't add people.
00:58:37It won't touch people.
00:58:38We don't know how this one's going to work.
00:58:39So there's some shift.
00:58:40All I'm saying is there's a shift.
00:58:42It's not the next thing we're going to talk about, which is just a free-for-all, but there's
00:58:46some little shift inside of Google that says, okay, the market is willing to accept a little
00:58:50bit more crazy, and here we are.
00:58:52Speaking of free-for-alls, xAI's new Grok image generator floods X with controversial AI fakes.
00:58:58That is a very soft headline.
00:58:59Again-
00:59:00Wait, that's ... Okay.
00:59:01Here, little quick-
00:59:02We're almost like a little too nice.
00:59:04No, little quick how the internet works thing.
00:59:07That's actually the SEO optimized headline that goes to Google.
00:59:10Can I read you the headline that's on our website?
00:59:12This makes more sense.
00:59:13Yes.
00:59:14X's new AI image generator will make anything from Taylor Swift in lingerie to Kamala Harris
00:59:17with a gun.
00:59:18Yeah.
00:59:19We should ... I don't know.
00:59:20That's what Google wants.
00:59:21Kamala Harris with a gun is all anybody's searching nowadays, let alone the other thing.
00:59:27It's bananas.
00:59:29Again, I've watched hilarious videos of Donald Trump and Hillary Clinton in an arm standoff
00:59:36on X, and the underlying technology is called Flux, which is an open-source AI system.
00:59:42Even if X added the controls, you've now sort of opened everyone's eyes to Flux, which is
00:59:47open-source.
00:59:48You can just do whatever you want with it.
00:59:49Flux is, by all accounts, very good.
00:59:51It's very good.
00:59:52You can just go watch the thing.
00:59:53It's just very funny that Grok ... Elon Musk, he wants to be a famous person, right?
01:00:00He released his own completely out-of-control deepfake tool on his own platform, and the
01:00:05number of deepfakes of him are out of control.
01:00:10What was the one you were saying you were seeing before?
01:00:13It's Elon Musk.
01:00:14It's a long video where Elon Musk narrates his ayahuasca journey.
01:00:21There's a nightmare dream sequence in the middle of it, and at the end of it, he commits
01:00:26to giving free food to everyone, like soup kitchens, and also says he's going to build
01:00:30a public transport system, and it's perfect.
01:00:35It is a perfect Elon Musk troll.
01:00:37That's amazing.
01:00:38Robot Elon Musk voice.
01:00:40Deepfakes should be illegal.
01:00:43Speaking of laws that are coming, a lot of people agree that deepfakes should be illegal,
01:00:47especially non-consensual pornography deepfakes.
01:00:50This is going to be a weird fight.
01:00:51It's going to be a weird fight.
01:00:54Scarlett Johansson does not want her voice being used by OpenAI.
01:00:57She's filed that complaint.
01:00:59I do not know how these are going to go, because it feels like we just let the cat out of the
01:01:04bag.
01:01:05It's not feasible compliance.
01:01:06Here we go.
01:01:07We have a picture of Mickey Mouse smoking a cigarette on TheVerge.com today.
01:01:10It's so good.
01:01:11Yes.
01:01:12Next one.
01:01:13Smart home company Brilliant has found a buyer.
01:01:14David, did you ever have a Brilliant?
01:01:16I did, briefly.
01:01:17Wow.
01:01:18I knew it.
01:01:19I had one of the earliest, earliest ones.
01:01:21I met with their founders before they even launched the product, and they gave me a prototype
01:01:25of one.
01:01:26I put it in my wall, and then I had a balcony at my apartment, and it turned on the balcony
01:01:31light permanently.
01:01:33It wouldn't turn it off.
01:01:36And that was my experience with the first Brilliant.
01:01:39Brilliant stuff actually got very good.
01:01:41They made most of their business going into apartment buildings and stuff, but just never
01:01:46really made it work.
01:01:48Yeah.
01:01:49This is the, what if you had an iPod touch in your wall that controlled all your smart
01:01:51home stuff.
01:01:52Right?
01:01:53Yeah, exactly.
01:01:54And Jen Tuohy, our brilliant smart home reviewer, reported they were going out of business,
01:01:59which they kind of denied, and then they went out of business.
01:02:02Rescued.
01:02:03Rescued.
01:02:04Rescued by those white knights of American capitalism.
01:02:05Private equity.
01:02:06Ah, our people.
01:02:08We love them.
01:02:09Rescued.
01:02:10The company is now called Brilliant Next Gen, which is just brutal.
01:02:15And it seems like what we know so far is that people who had Brilliant stuff, their stuff
01:02:21is going to continue to work.
01:02:24What happens after that, I don't know.
01:02:28The new leadership did say they're going to stop selling direct-to-consumer stuff anywhere
01:02:32other than their website, which I think is a pretty strong signal that they are going
01:02:37further and further into the selling into new construction business, which makes a certain
01:02:43amount of sense.
01:02:44A lot of these companies, like Amazon has focused a lot on that with Alexa stuff.
01:02:47There's been this big race to be the sort of default choice for professional builders.
01:02:51I know that when I was still in the rental market, what I looked for in an apartment
01:02:55was a six-year-old unsupported iPod Touch built right into the wall.
01:02:58It's the dream.
01:03:00Same.
01:03:01So I was like, get these non-iPod Touch apartments out of my face.
01:03:07Yeah.
01:03:08But I think this company has always been, I would say, more sort of B2B than a thing
01:03:12a regular person would consider buying.
01:03:15You go to Home Depot and you're going to buy a Lutron, not this.
01:03:18I think that's pretty clear.
01:03:21But it'll be interesting to see if they can get it righted or if this just becomes kind
01:03:26of a slow death for a smart home company.
01:03:29Yeah.
01:03:30I feel like if you have one of these in your wall, you're racing the clock on that
01:03:32cloud service continuing to work.
01:03:34Yeah.
01:03:35That's right.
01:03:36Yeah.
01:03:37All right.
01:03:38ESPN's where to watch feature finds where to stream sporting events.
01:03:41They basically made TV Guide, but for streaming.
01:03:43Best thing that has ever happened to me in my entire life.
01:03:44This is pretty funny.
01:03:45I spent like a lot of yesterday trying to find the catch in this.
01:03:48And I think there's like a big galaxy brain strategy, but also this is just like a good
01:03:52nice thing for the Internet.
01:03:53ESPN released a thing that just gives you a list of all the games that are being played,
01:03:57which is like a thing that every sports app has, right, except for Apple Sports, which
01:04:00doesn't.
01:04:01But you can go and you can see where all the games are, what the score is, whatever.
01:04:05And now it just has a TV Guide thing where you can.
01:04:08It'll say it's playing on MLB TV or it's playing on TNT or this MLS game is on Apple TV plus
01:04:14like it just tells you where to stream sports, which seems like a thing that should not be
01:04:19complicated or need to exist.
01:04:20Is there a picture of a pirate ship and a Reddit logo anywhere in this app?
01:04:25Yeah, there's just one that says, in parentheses,
01:04:27it's a Russian website, against like half of these.
01:04:32You will be extradited.
01:04:36But like ESPN did this whole big media day on Wednesday this week and essentially made
01:04:44clear that it is making this pivot from being a cable channel to like a sports lifestyle
01:04:48brand. That was how Sara Fischer at Axios explained it.
01:04:52And I think that was really right. They're about to launch a streaming service that shows
01:04:56you all the ESPN stuff like ESPN, the cable channel is going to be a streaming service.
01:05:01They're part of Venu Sports.
01:05:01ESPN whole thing is they're just like, we want you to come to ESPN when you want to
01:05:05watch sports, even if you go somewhere else because they have the rights.
01:05:08If you open the ESPN app first, we win, right, because that's how you bet.
01:05:11That's how you get into fantasy, all this stuff.
01:05:13And so ESPN is like deep in the weeds of like, how do we become a destination for all
01:05:20things sports all the time?
01:05:21And this is both a very good idea in that vein and also just like a useful page that I
01:05:26am going to load every single day for the rest of my life.
01:05:28I have a question. Was this originally supposed to be a Venu launch party or
01:05:33something? Because wasn't Venu like supposed to launch around now before Fubo
01:05:36killed it? It was.
01:05:37So they sued them.
01:05:39Another another antitrust lawsuit in the background.
01:05:42Yeah, the entire Internet economy, maybe.
01:05:45But Venu was supposed to have everything in it.
01:05:47Venu was the opposite strategy.
01:05:49Right. ESPN is trying to send you to every other service.
01:05:52And Venu was like, we're going to buy everything.
01:05:54And Fubo's dead. And Fubo was like, hold on a minute.
01:05:58ESPN was going to be a part of this.
01:05:59So then, you know, you'd go and you'd look and be like, oh, what should I watch?
01:06:02Oh, it's all on Venu. I sure should subscribe to Venu.
01:06:05It's a TV guide. But the answer is always one channel.
01:06:08It's just one channel every time.
01:06:10I don't see it.
01:06:11But you know what's so telling about the sports industry is it wouldn't have been
01:06:15that way. The thing that is ostensibly the streaming service for all the sports
01:06:19is not actually the streaming service for all the sports.
01:06:22This is why it's so terrible.
01:06:23Yeah, it's going to be like 60 percent, I think.
01:06:25Yeah, which is fine.
01:06:27And I think Venu, if it eventually launches, will be a useful thing
01:06:30that I'm sure I will give too much money to.
01:06:31But like the fact that you can go on here and it will tell you
01:06:35the game is on Prime Video or that the game is on Apple TV Plus
01:06:39is like a genuine user interface victory.
01:06:43And I also think it makes a lot of sense for ESPN as like a big
01:06:46how low our standards have fallen.
01:06:48This is what I mean. TV guide.
01:06:50Literally. All right, it looks like we're moving on. David,
01:06:53I feel like you're going to try to convince some people to buy this thing.
01:06:55And I'm telling you already immediately saying it.
01:06:59I have it. I have it.
01:07:00That's why. This thing is called the Plaud NotePin. Yeah.
01:07:04And it's we're just in this phase of everybody launching
01:07:07little tiny voice recorders that then use chat GPT
01:07:11to summarize whatever they hear, like they all have bigger ideas.
01:07:15But that's essentially what it is.
01:07:17This one is just cool because it's like a little it's a little guy.
01:07:20It's a wearable.
01:07:21It comes with a lanyard and a clip and something else.
01:07:27Oh, and a thing you can wear on your wrist.
01:07:28So the idea is like, it's a wearable.
01:07:30You tap it. You talk to it.
01:07:32It summarizes your notes.
01:07:35Is that anything?
01:07:37I don't know.
01:07:39But this is like this is the thing, right?
01:07:40Like Microsoft recall is a version of this.
01:07:42The limitless thing which we've talked about is a version of this.
01:07:45This idea of like, how do we make it easy for you to input
01:07:48all of your stuff into an AI system and then it will make something out of that
01:07:53for you is like the big new product question for a lot of these companies.
01:07:58I don't know if any of it amounts to anything.
01:08:00I have just been sitting here like yelling thoughts into this thing all day,
01:08:03and it just keeps transcribing them, being like,
01:08:06you need to remember that your meeting is in an hour.
01:08:09I'm like, is this useful?
01:08:10What did I get from this?
01:08:11This is like, what were those journals, or the pens and the notebooks?
01:08:15That had the Livescribe dots. Yeah.
01:08:17And I wanted to be a Livescribe person so bad, so bad.
01:08:22And now it's a little trashy Bluetooth bauble device.
01:08:25I look forward to your review.
01:08:26I encourage everyone to wait for the inevitable review.
01:08:30But the good news is this is a thing AI can actually do,
01:08:34which is like, summarize. Voice to text to summarization,
01:08:40AI is good at all of those steps, unlike so many other things.
01:08:44Like I watched a very funny video of somebody reviewing the Brilliant Frame
01:08:48AR AI glasses, and I got such flashbacks to doing the review of the Humane
01:08:53AI Pin where you're just sitting there asking it really basic questions.
01:08:56He was sitting there holding up like a
01:08:58a I forget what it was a drink of some kind, I think.
01:09:01And it kept being like, that's a bag of potato chips.
01:09:03He's like, no, it's not.
01:09:05And it's just like, oh, yeah, this is where AI is.
01:09:06But summarizing text? On point.
01:09:09All right. I look again.
01:09:10I look forward to your review.
01:09:11And if you don't have the chart of wearable bullshit in the review,
01:09:14here's what I'm saying.
01:09:15The maker of the Boox Palma has a new, cheaper e-reader.
01:09:19Don't get it. You don't get it.
01:09:21No, wait, you're saying not to get it.
01:09:23OK, so it's one hundred fifty dollars.
01:09:24It's the books six go.
01:09:26And it's their new small, low budget, cheap e-reader that also runs Android.
01:09:31So you think, oh, these are all good things.
01:09:33Oh, it's cheaper than the Boox
01:09:34Palma. But I get a wider screen.
01:09:36That sounds great.
01:09:37Then you look at how much RAM it has
01:09:39and then you look at how much RAM the Boox Palma has.
01:09:42Oh, my God. It's only two gigs of RAM.
01:09:44Only two gigs of RAM.
01:09:45Don't. Like, I've had a couple of Boox products
01:09:48where they skimp on the RAM and you feel it.
01:09:52It's real, real rough.
01:09:53Yeah, so that's the sort of thing that like on a Kindle,
01:09:55you don't really have to worry about RAM
01:09:56because it's kind of only doing one thing.
01:09:58Yeah, but like you're going to accidentally have three apps open
01:10:01at the same time on this thing, and it is just going to set itself on fire.
01:10:04Yep. So don't like resist the urge.
01:10:07I know. Boox, put more RAM in. Like, it's not that expensive. You're not Apple.
01:10:12I mean, at this point, we are just directly controlling Boox's
01:10:14business. I'm trying.
01:10:16We will tell you what to do.
01:10:18All right. Speaking of RAM,
01:10:21it makes no sense.
01:10:23Sure. The Dyson Airwrap ID is a new, smarter hair curler.
01:10:27So I put this on here because one is Alex has mentioned to us many, many times
01:10:31the Dyson Supersonic, that's the hair dryer and the Airwrap
01:10:34legitimate gadgets like insane gadgets in the classic Dyson mold of we made a fan.
01:10:41What can have a fan?
01:10:42And remember when they got all the way to the car and they're like, no fan.
01:10:45You know, they stopped fans.
01:10:48They do, like, a hair serum now.
01:10:51And I was like, where is the fan in this hair serum?
01:10:53Yeah, I'm calling it right now.
01:10:54Dyson, you're way out of your lane. Yeah.
01:10:56Fans only. Yep.
01:10:58I really wanted the car to have a fan.
01:11:00Anyway, tell me about it.
01:11:02So this is the new Airwrap.
01:11:04They've released a couple of different editions at this point.
01:11:07This one's got a couple of different
01:11:10extensions, one that like just is going to suck your hair slightly differently.
01:11:15I probably should explain what an Airwrap does after saying that.
01:11:18It sucks your hair.
01:11:19I don't understand.
01:11:19Yeah, the Airwrap basically sucks your hair.
01:11:22You go in one of two directions.
01:11:24It sucks your hair in one of two directions.
01:11:26And and then you like you can put hot air on it or cold air on it.
01:11:30Yeah. And it's magic.
01:11:32And if you have ever tried to curl your hair or have thought about it, it's
01:11:36it's horrible. And this makes it easier.
01:11:38It's for those of us who don't want to worry about having hot metal near our faces.
01:11:42OK, so I'm going to ask you this question about this one.
01:11:44Yeah. The addition to this one is that it has Bluetooth.
01:11:48It's got an app. What?
01:11:51What? I think the app could probably be, like, you could just not get this
01:11:57and use YouTube or TikTok.
01:12:00It would give you the same thing.
01:12:02But yeah, it's supposed to kind of guide you through doing the hair
01:12:05and making sure you're holding it for long enough,
01:12:07because a lot of people would just do it once and be like, it looks good.
01:12:09Is this like the smart oven theory where you like
01:12:12you have a QR code of a hairstyle and then it just like does the settings for you?
01:12:16Yeah. Yeah. Yeah. It kind of walks you through it.
01:12:19I'm going to play with this app, obviously, but that's kind of the idea.
01:12:22It's just supposed to help you make things a little easier
01:12:25through some Bluetooth. And so, yeah, more gadget.
01:12:29But honestly, the the extensions for it are the cool part.
01:12:32They're doing like one for curly hair and one for straighter hair.
01:12:35So curly hair people get like more diffusers and stuff, which is very exciting.
01:12:40And it's just as stupidly expensive as the Airwrap was.
01:12:44And it's worth it. So whatever.
01:12:45Oh, somebody in your life is going to ask for one to get it.
01:12:48I bought one for Becky because Alex has been so high on these things for so long
01:12:52and, in the spirit of no one knows how to use it,
01:12:56I bought the one without the Bluetooth. The non-Bluetooth one. Yeah.
01:12:58Also, also, if I bought Becky a Bluetooth device, she'd be like, no, no, thank you.
01:13:03I'm never I'm never using this functionality.
01:13:06But for like a week, both Becky and Max looked like full Texas
01:13:09cheerleaders, just big hair every day.
01:13:13Yeah. And that's how, you know, somebody gets one.
01:13:15Yeah. And then the weird thing is our air purifiers like light up
01:13:18every time we use it.
01:13:19And someone told me it's because of the hot air hitting, like, the hair products you use.
01:13:23It like lights up the air purifiers.
01:13:24Yeah, that would make sense. Your hairspray.
01:13:27Yeah, you're not. Don't breathe that stuff.
01:13:29I mean, it smells great, but don't huff it.
01:13:31Here's I'm saying, Dyson, find more things to put fans in.
01:13:34That's your lane.
01:13:35And I fully support it.
01:13:38Get to a car, get to a hovercraft car.
01:13:42This is what I'm trying to say.
01:13:43This is why your first car failed.
01:13:45It wasn't floating on a bed of perfectly produced air.
01:13:49The next car. They got this. I believe in them.
01:13:51Do you think they were like great hair?
01:13:53Do you think they were like electric motors are kind of like fans, right?
01:13:56They're fans without the blades.
01:13:57And they're like, we'll do electric motors at scale.
01:13:59I'm dying. James Dyson, Sir James Dyson, come on the show.
01:14:03I have only one question for you.
01:14:04It's not worth a full decoder.
01:14:06Did you think electric cars are fans without the blades?
01:14:08Yes or no, sir?
01:14:11Just a yes or no.
01:14:12We're going to get James Dyson on the show and we're going to play.
01:14:14Can you put a fan in that?
01:14:16We're going to do that for one full hour of the broadcast,
01:14:19and it's going to be incredible.
01:14:20And I honestly believe he would say yes to doing that.
01:14:24This is my new goal.
01:14:25Can you put a fan in that with James Dyson?
01:14:29Gauntlet thrown. Yeah, let's go.
01:14:31Snapchat finally launched an iPad app.
01:14:34Good, right?
01:14:37Huh?
01:14:38A little like the 13 year olds.
01:14:40They're like, they can use their iPads now.
01:14:43That's great.
01:14:44I feel like this belongs in our Is This Anything?
01:14:46Hall of Fame.
01:14:48Snapchat for iPad.
01:14:49Right.
01:14:49It's like it's a bunch of weird AI gadgets.
01:14:51And it's like Snapchat on iPad.
01:14:53Like, is this anything?
01:14:56What people want is Instagram.
01:14:58Yeah.
01:14:59And they're never going to get it.
01:15:01And Snap is like it clearly wants you to use it to, like,
01:15:05watch Snap originals and stories and stuff.
01:15:09I don't think this is going to be like a super kick ass
01:15:12messaging system.
01:15:14Snapchat for the iPad.
01:15:16Not so sure.
01:15:18All right.
01:15:18Speaking of Instagram, another classic all time headline
01:15:21on TheVerge.com: Instagram adds what photos have always needed: words.
01:15:26It's good.
01:15:27Yeah.
01:15:28I look at a lot of photos and think, what if it just had a letter on it?
01:15:31Basically, you can now put text on photos in the Instagram editor,
01:15:35which is fine because you could already do that.
01:15:37I was going to say, everyone has figured out how to do this anyway.
01:15:40But the fact that Instagram is slowly becoming Canva because
01:15:43Instagram is slowly becoming Craigslist.
01:15:46Very real.
01:15:47Oh, that's OK.
01:15:48So the reason I put this in here is I was going to ask,
01:15:51are we headed towards a world in which threads and Instagram
01:15:55are just the same thing?
01:15:56Instagram is going to get texty and threads is going to get photosy
01:16:00and they're just going to be the same.
01:16:01But what you're saying is threads is going to become Instagram
01:16:03because Instagram is becoming Craigslist.
01:16:05No, threads isn't going to become Instagram.
01:16:07Yeah, I think they know that the point of threads is extremely
01:16:10leading questions about nothing.
01:16:12And they're just going to keep leaning right into it.
01:16:14I come up with lists of questions to ask threads like every day.
01:16:18I was walking down the street and I saw a guy wearing headphones.
01:16:21Has anyone seen one of these before?
01:16:23All day long.
01:16:24I could be the most popular person on threads tomorrow.
01:16:29You can just do it all day long.
01:16:31You just like look at things.
01:16:32Has anyone ever seen this before?
01:16:34I saw one today.
01:16:36Today, a person posted, they were on an airplane.
01:16:38And you know, sometimes they're on an airplane,
01:16:40you're boarding and there's conversation going on.
01:16:42And they're like, Oh, I saw that.
01:16:43I saw that.
01:16:44They were just like, I've never seen this on a plane before.
01:16:46And I fly all the time.
01:16:47My plane is wet.
01:16:48Just like millions of replies of people being like,
01:16:51you've never been on a plane before.
01:16:52Yeah.
01:16:53A lot of people were asking.
01:16:54She was asking if she was going to die.
01:16:56And people were like, no.
01:16:58But I'm just telling you the fact that like millions of people will
01:17:01fall for the worst engagement bait on threads means we've learned
01:17:04nothing.
01:17:05You know how we just did the whole, you think
01:17:13people are smarter than Photoshop, thing, you know?
01:17:16And everyone's like, we already know this.
01:17:18I'm like, I'm going to show you this picture of condensation on a
01:17:21plane and millions of people bringing one of our biggest tech
01:17:25companies' algorithms to its knees over the dumbest engagement bait in the
01:17:29world.
01:17:30It is fun.
01:17:31Just as like an intellectual exercise to look around and just think
01:17:34like, what engagement bait question could I ask?
01:17:37You can just post a picture of anything, but what is this?
01:17:39Yeah.
01:17:40Like my water has bubbles in it.
01:17:41How did that happen?
01:17:42This is what we're doing from now on.
01:17:45I'm just saying: get ready.
01:17:46It's coming.
01:17:47Anyway.
01:17:48My, my point is, I think they know that they want that thing.
01:17:50They want a bunch of calm tweets during NBA games, right?
01:17:54Like you, that's not actually like a visual communication moment,
01:17:57but it's a place for people to engage and they just want the
01:17:59engagement and put the ads there, whether or not that's photos or
01:18:01not.
01:18:02I think they're happy with what they've got.
01:18:03I'm just saying Instagram is becoming a marketing platform.
01:18:05It already was a marketing platform and increasingly what they're
01:18:08marketing is small business services.
01:18:09So being able to just put that picture of your yoga studio with the
01:18:13text, like, yoga class.
01:18:14Like they're just going to do it.
01:18:15And that, I think, is why I was like, it's Canva.
01:18:18Cause that's what Canva does for people.
01:18:21Great.
01:18:22Cool.
01:18:23Soon.
01:18:24It will just be AI.
01:18:25Like generate me a photo of a yoga studio.
01:18:26It'll be fine.
01:18:27All right.
01:18:28Last one.
01:18:29We've gone this whole time without talking about it.
01:18:31We're just going to talk about it for one second in the lightning round
01:18:34because I'm confident next week we'll have a full preview.
01:18:36Apple's iPhone 16 launch event is September 9th.
01:18:39It sure is Glowtime, right?
01:18:42That's the tagline.
01:18:43That's gotta be for the Siri, right?
01:18:45The Siri effect.
01:18:46Yeah.
01:18:47Which I'm excited about.
01:18:49Okay.
01:18:50I see this going one of two ways, and I'm curious which
01:18:53you guys think it is. One:
01:18:56This is just another AI show.
01:18:59There's going to be some new hardware, but all they're
01:19:01going to talk about is how, like, the new
01:19:04whatever chip inside of the iPhone 16 gives you more AI stuff.
01:19:11Or two: this is going to be like a massive new hardware rev.
01:19:18We're going to get the camera buttons.
01:19:19We're going to get new designs, new colors, all kinds of stuff.
01:19:22Like I could see this being either incredibly boring or incredibly huge,
01:19:27cool, exciting new iPhone year.
01:19:29And I kind of feel like there's nothing in between, and it seems to me
01:19:32it depends on how exciting Apple really thinks Apple Intelligence is.
01:19:36I think it'll be in between, cause that's the thing:
01:19:38they threaded that needle at WWDC where they took all the really cool stuff
01:19:42that was going to happen.
01:19:43And they were like, this is happening.
01:19:45So we're going to be like, but also, cool new phones for the rest of the show.
01:19:50That is true.
01:19:51Like WWDC was actually like very good and totally overshadowed by the
01:19:56weirdness of Apple Intelligence.
01:19:58Yeah. I think, you know, they've got to talk about it.
01:20:00It's the thing that's coming next year and this phone will be the bleeding
01:20:04edge of it.
01:20:05I think the real question is whether they bring those features to the base
01:20:09model phone for lack of a better word, right?
01:20:12Does the iPhone 16 get Apple Intelligence, or just the 16 Pro?
01:20:15Cause only the 15 Pro is getting it, right?
01:20:17If the 16 is getting it, the regular 16,
01:20:20that's the big upgrade cycle and they're going to reintroduce all of the
01:20:24features because that's,
01:20:26they know that general consumers do not pay attention to WWDC, right?
01:20:31That's, that's the tech audience. That's developers.
01:20:33Apple releases new iPhone.
01:20:35It's going to have the features coming later this year.
01:20:38You'll be able to tell Siri to do whatever you can tell Siri to do.
01:20:40That's your Good Morning America.
01:20:42Yeah. We've seen some rumors.
01:20:44I think that it was going to come for the 16, right?
01:20:46I assume so, but I'm just saying like, that's my guess that like, this is,
01:20:50that's how they balance that out. I'm excited for a bunch of other stuff.
01:20:53I think we're also expecting new watches, uh,
01:20:57new AirPods, and an M4 Mac mini, maybe. Right?
01:21:01Is it weird that that is the thing I want by a mile?
01:21:05We're not going to get any computer.
01:21:06I can keep everything else the same for another year, but I,
01:21:10an M4 Mac mini to replace the M1 Mac mini that is sitting here.
01:21:13I want it so bad.
01:21:14If that is here in two weeks, I don't know.
01:21:17I just want to pay you guys money.
01:21:18All right, we got to stop. This was the lightning round.
01:21:20We're going to have a full preview next week.
01:21:22We're going to go do some reporting, think some real thoughts.
01:21:26That's that's next week's episode.
01:21:28That's why we ended the lightning round, the unsponsored lightning round.
01:21:31Tim Cook, you could have tried.
01:21:34You could have actually planned.
01:21:37That's the big announcement on September 9th.
01:21:41Finally, the iVerge is here.
01:21:43They're going to put it up.
01:21:44The iVerge is real.
01:21:46That's a deep cut.
01:21:47That's the verge cast.
01:21:48We've got one big story.
01:21:50We did a deep dive into OpenSea this week.
01:21:52It is a great story with some incredible quotes in it.
01:21:54And also one of those,
01:21:56one of those moments that, like, fills an editor-in-chief's heart with joy.
01:21:59Part of our story is we didn't know if OpenSea had gotten an SEC
01:22:02investigation notice,
01:22:03and they tried to front-run our story by announcing the notice the day
01:22:05before we published.
01:22:06Sure did.
01:22:07Always, always a win.
01:22:08Go read that story.
01:22:09It's really good.
01:22:10We should also say, by the way, we're off Sunday and Tuesday.
01:22:12The third episode of our productivity series is actually going to run
01:22:15September 8th.
01:22:16So we'll be back Friday to do an Apple preview and then Sunday with that.
01:22:23And then all kinds of Apple stuff the week after it's going to be wild.
01:22:26All right.
01:22:27That's it.
01:22:28That's The Vergecast.
01:22:32And that's it for the verge cast this week.
01:22:34Hey, we'd love to hear from you.
01:22:36Give us a call at 866-VERGE11.
01:22:39The Vergecast is a production of The Verge and the Vox Media Podcast Network.
01:22:42Our show is produced by Andrew Marino and Liam James.
01:22:45That's it.