• 8 months ago
The Verge's Nilay Patel, David Pierce, and Alex Cranz discuss David's review of the Humane AI Pin, Taylor Swift's music back on TikTok, a new party speaker, and much more.
Transcript
00:00:00Hello and welcome to VergeCast, the flagship podcast of appreciating the good vibes on
00:00:07the hardware.
00:00:11That might be the truest one we've ever done.
00:00:13Yeah.
00:00:14Yeah.
00:00:15That's what we do.
00:00:16But we totally get it.
00:00:17There's room to make things even better.
00:00:18By which I mean functional.
00:00:19Hi, I'm your friend Nilay.
00:00:21David Pierce is here.
00:00:22Hi.
00:00:23Alex Cranz is here.
00:00:25Hi.
00:00:26It's a big week on the VergeCast, all right?
00:00:27It's like maybe the biggest week we've ever had because David has reviewed the Humane
00:00:31AI pin.
00:00:32Yeah.
00:00:33It has come full circle.
00:00:35This really does feel like it's been a very funny experience in that like almost exactly
00:00:40a year ago was that first TED talk that Imran Chaudhri, the CEO and co-founder of Humane
00:00:46gave that we were just like, this is nothing.
00:00:48I don't believe it.
00:00:50We basically sat here and we're like, I don't believe any of the things that happened in
00:00:54this demo.
00:00:55And it turned out that's sort of true.
00:00:59We'll get to it.
00:01:00We'll get to it.
00:01:01And if you were a fan of arguing about the Vision Pro score, boy, get ready for us to
00:01:06talk about the Humane score from totally opposite perspectives.
00:01:10I'm ready.
00:01:11It's very good.
00:01:12That happened.
00:01:13Taylor Swift is back on TikTok.
00:01:15I have some completely irresponsible gossip to share about the status of TikTok and Universal
00:01:19Music Group.
00:01:20I love this.
00:01:21And so if you're listening and you feel like you've always wanted to substantiate some
00:01:25gossip, get ready because I'm looking for you.
00:01:29There's other news.
00:01:30Open AI is in copyright trouble as always.
00:01:32So is Google.
00:01:33I mean, there's really important news, which is that Kobo announced color e-readers.
00:01:38Yes.
00:01:39All three of us are just stoked.
00:01:42The three of us that care.
00:01:43It's just Alex, Alex, and Alex.
00:01:44Again, the flagship podcast of appreciating the good vibes on the hardware.
00:01:48Yeah, exactly.
00:01:49That's 100 percent that story.
00:01:51And then the most important news of all, which is that Sony has a new party speaker.
00:01:55Exactly.
00:01:56We're appreciating the vibes.
00:01:57This really is a big Verge cast.
00:01:58Yeah.
00:01:59If you like the Verge cast, you're going to love this Verge cast.
00:02:02That I can promise you.
00:02:03Let's start.
00:02:04Let's get right into it.
00:02:06David, you magnetically clipped a robot to your body for several weeks.
00:02:12It was warm, as I'm wont to do.
00:02:14It got warm.
00:02:15It only ever spoke to me in, I believe, Korean or Arabic.
00:02:19Every time I tried to use it, it was locked in a translation loop.
00:02:22So I have no experience with it.
00:02:24You gave it a four.
00:02:25You said it didn't work.
00:02:26Tell us about it.
00:02:27Sure.
00:02:28So the whole idea of the pin and this new generation of AI gadgets is that your smartphone
00:02:34is actually slower and worse than you think.
00:02:38Basically, the idea is instead of taking out my phone and unlocking it and doing stuff
00:02:43and tapping on screens and typing things, I should just be able to touch the thing on
00:02:48my chest and it should do things on my behalf.
00:02:52And I think that is a super interesting idea.
00:02:54Anyone who has ever watched a sci-fi movie, ever, sort of knows what that looks like.
00:03:01And so I spent a couple of weeks running around asking it questions about where I was and
00:03:07asking it questions about the world and using it to try to play music and try to make phone
00:03:13calls, which you can do on the AI pin and you can send text messages.
00:03:17And it takes notes for you and all this different stuff.
00:03:20And I am simultaneously more bullish on AI gadgets than ever and so deeply done with
00:03:29the humane AI pin.
00:03:30All right.
00:03:31I want to unpack that.
00:03:32And I will say, bizarrely, a huge split in the comment base on this review.
00:03:38The commenters on YouTube love the review, as far as I can tell.
00:03:41And you did a great job.
00:03:42Your buddy, Ben Strauss, we got to talk to him.
00:03:44He's calling in the show later today.
00:03:47They're like, this is great.
00:03:49We love this review.
00:03:50All the things.
00:03:51And then our commenters picked up on the inherent tension of what David just said, which is
00:03:55this product is horrible, but I'm excited about this category and said, why?
00:03:59Nothing about this product has proven that this category is viable, which I would say
00:04:02is a fair criticism.
00:04:04Why do you think this thing, which does not work, I believe, is a sentence that is in
00:04:09your review, made you excited about this category?
00:04:12So, I mean, this is where we're going to get into arguing about the score, right?
00:04:16Because it's not that it doesn't work, ever.
00:04:19It works.
00:04:20Seriously.
00:04:21No, this is like, there were a handful of moments in two weeks of testing this thing
00:04:26that were like legitimately eye-opening, right?
00:04:29Like, standing in Penn Station, tap the thing.
00:04:32I'm like, tell me what restaurant this is and if it has good reviews.
00:04:36And somewhere between one and a thousand seconds later, it comes back and gives me like real
00:04:43information.
00:04:44Like, people like, this on the menu, it has 4.3 stars on Google.
00:04:48People say it's a little expensive, but the servers are friendly.
00:04:51That is like an astonishingly useful thing to have done for me.
00:04:57Just truthfully, so like one of the things I've discovered is like, I have a dog, so
00:05:00I am forever walking the dog, and I have a kid who is often only happy in a stroller.
00:05:05So I'm just forever out walking in my neighborhood.
00:05:10One thing that I do when I am forever out walking in my neighborhood is I end up like
00:05:14remembering things I need to do, so I'm constantly like pulling out my phone and using Siri to
00:05:19set reminders, or add something to my calendar, or just write a note down in my notes app.
00:05:25Every single one of those things, better to do on a device that is touching my body, right?
00:05:31Like, it was better on my chest than it was in my pocket.
00:05:36All true, right?
00:05:37So I had like a handful of these moments where I was like, oh, having this thing that actually
00:05:40does abstract away all of the things I have to do to get into my phone, and just lets
00:05:45me say the thing that's in my brain, and it's then out of my brain, awesome.
00:05:51Loved it, great.
00:05:53The problem with the pin is that it doesn't do it enough.
00:05:56And the problem with this whole category is that like I know how to write down a note
00:06:00in my phone, and I know when it has worked.
00:06:04It's annoying, but it works, and I can do it reliably every time.
00:06:10The pin, I just stopped trusting, was the problem, right?
00:06:12So it's like when it works, it's cool, and it worked just enough that I was like there
00:06:17is something here.
00:06:20It's just so far away from that something at this moment, that it's like I can't in
00:06:24good conscience tell you to even try it, but like you should want the good version of this
00:06:29to exist.
00:06:30So I know you talked to Humane through the course of this review.
00:06:33They can't have been surprised by their own product.
00:06:36One assumes they're not hopelessly surprised by their own product.
00:06:40Why did they ship this now?
00:06:41Did you get a sense?
00:06:43I don't know the answer to that question, and I have asked that question to myself and
00:06:47to them many times.
00:06:48I think the honest truth is like there comes a time in the process of making hardware where
00:06:53you just have to ship the damn thing.
00:06:56It's just really expensive to not ship a product, especially a product that a lot of people
00:07:00have given you money for and you have made many of.
00:07:05I think if you were to rewind a year and tell Humane this is the point where they would
00:07:10be now, I would bet they would not be shipping this product right now because they also have,
00:07:15they claim this gigantic software update coming this summer that adds really basic stuff like
00:07:20setting timers.
00:07:21You can't set a timer.
00:07:23That's like the joke about assistants is the only thing they can do is set timers.
00:07:28The joke about Siri is they can only set one.
00:07:31It can't set any.
00:07:32It can't set any.
00:07:33Look, AI systems are bad at math, I think, historically, is the thing, and so just counting
00:07:38must be very challenging for them.
00:07:40Counting down is actually even harder.
00:07:44But yeah, so there's this big software update supposedly coming this summer that's going
00:07:47to add some of that functionality and fix some of the things that don't work now.
00:07:51So my guess would be they want to ship the August version of this thing today, and that
00:07:58is just not where they are.
00:08:00There comes a time when, for a variety of reasons, you're just locked in to when something
00:08:04ships, and it's really expensive and really hard, especially for a first-generation hardware
00:08:08company, to delay by months and months.
00:08:11It's just hard.
00:08:12Let's take it in pieces because there's a line in your review where you say, none of
00:08:15this is ready, not the hardware, not the software, not the AI.
00:08:18That's all of it.
00:08:19That's all the things.
00:08:21That's all of it.
00:08:22I'm ready.
00:08:23That's one thing.
00:08:24You very clearly are ready for this thing.
00:08:27David's a 10 out of 10.
00:08:28He's ready.
00:08:30There's a picture in the review of David using the thing where he just looks like the world's
00:08:33happiest secret service agent.
00:08:36He's just beyond thrilled to be tapping on his chest and whispering a secret.
00:08:41So you're obviously ready, but the thing isn't ready.
00:08:42Let's start with the hardware.
00:08:43Three pieces, right?
00:08:44Hardware, software, AI.
00:08:47What about the hardware isn't ready?
00:08:48The hardware is kind of in the thing we see a lot with first-generation hardware, which
00:08:54is just, it's full of little wonky bugs.
00:09:00The biggest one by a mile is thermals.
00:09:03This thing is small.
00:09:04I have it sitting right here.
00:09:05It's this big.
00:09:06It is shockingly small.
00:09:07I saw it in the office, and I was amazed at how small it is.
00:09:09Yeah, it is not a large thing, and it's a nicely made thing.
00:09:12It's made of aluminum.
00:09:13It's pretty durable.
00:09:14I've dropped it.
00:09:15I threw it in the wash the other day just to see.
00:09:17It's pretty big to put on your chest.
00:09:19Yes.
00:09:20Compared to all of the other chest wearables that I've had in my life.
00:09:26It's like brooch size.
00:09:28How many people do you know that wear brooches that aren't 80 and at church?
00:09:33Yeah.
00:09:34Not a lot.
00:09:35Huge market, though.
00:09:36Those people are sitting on an enormous amount of wealth that they have to transfer to their
00:09:39children at some point.
00:09:40I mean, yeah, and they can just transfer it in Humane pins.
00:09:42Directly to humane.
00:09:43It's smaller than you think, but bigger than it should be, I think, is your favorite.
00:09:48Yeah, I think that's exactly right.
00:09:50It is both too big and impressively small, for sure, but the biggest ... I mean, I'm
00:09:55holding it now.
00:09:56I've been holding it in my hand for 15 seconds, and it's warm.
00:10:00I'm not kidding.
00:10:01That's a real thing that's happening right now, and that is the overwhelming issue, right?
00:10:05It gets warm when you use it.
00:10:07It gets warm so quickly when you use it that it pretty frequently overheats and shuts down.
00:10:13It gets warm when service is bad, which service is often bad because they're using some weird
00:10:19MVNO of T-Mobile that doesn't really work.
00:10:24These are things that most of the time when we review a first gadget from a company, it's
00:10:29like this.
00:10:30The first Pixel watch had a lot of little, tiny hardware bugs, and that's the kind of
00:10:34stuff where you're like, okay, this is the sort of stuff that with an extra couple of
00:10:39revs of the tooling and the engineering and working on this, you actually start to solve
00:10:43these.
00:10:44In the hardware, there's not a lot that I would call show-stoppingly bad.
00:10:50It's just that none of it is quite ready, but the main thing that really causes problems
00:10:56is it'd be nice to be able to use this thing, and you use it, and it's like you can't anymore
00:11:01because it's too hot, and it also sits on your skin.
00:11:04Right, so it gets hot and shuts itself down, which if you will recall was happening a lot
00:11:10in the demos at IFA, the trade show.
00:11:13There I think everyone rightfully was like, well, it's a trade show, bad Wi-Fi, bad cell
00:11:18signal.
00:11:19We were generous.
00:11:20Yeah, generous.
00:11:21We were giving them the benefit of the doubt.
00:11:22Yeah.
00:11:23That's exactly right.
00:11:24It feels like you were probably not using it on the floor of a trade show in Berlin.
00:11:28I mean, I do love a trade show, but no, I was using it at my house, I was using it in
00:11:34our office, I was using it on the streets of New York and Washington, D.C.
00:11:39One of the weird things about reviewing this is that it's not a gadget you're supposed
00:11:44to use very much.
00:11:46The whole point of it is that you don't use it all the time.
00:11:49You can just quickly accomplish the thing you need to accomplish and then put it away.
00:11:53To that I would say, it's $700, that's insane.
00:11:58It was a weird thing to test in that sense because it's like, yes, if I use this thing
00:12:01constantly for several hours, it's gonna die.
00:12:05The battery life in that sense is bad, but you're only supposed to use it for a few seconds
00:12:12at a time a dozen times a day.
00:12:16That is very different from what it takes to actually test the edges of a product like
00:12:20this.
00:12:21I struggled with that a lot in the course of this.
00:12:24It has this little green laser projector that it's like, if I use that for three minutes
00:12:28at a go, it overheats and the battery dies.
00:12:32That's the one, that's just a flaw in the product.
00:12:37Straightforwardly, if you use its display for more than a few minutes, it shuts down.
00:12:43There's being generous.
00:12:44There's the benefit of the doubt.
00:12:45No, that's just bad.
00:12:46How are you supposed to watch Dune 2?
00:12:47Come on.
00:12:48As the author intended.
00:12:54So there's that, right?
00:12:57Some parts of it don't work and it just overheats.
00:12:59Then there's the battery life, which I think in all product reviewing is challenging because
00:13:04we overuse the stuff.
00:13:06Yeah.
00:13:07That's like our job.
00:13:08I tried to like alternate days between like, use the hell out of this and use it like a
00:13:13regular person.
00:13:14And on the use the hell out of it days, I mean, I killed.
00:13:17It comes with the thing itself, two extra batteries and a charging case.
00:13:23And I would kill all of those in a day of like heavily testing the thing.
00:13:28On a normal day, I would kill the two battery boosters and the thing, but like I would get
00:13:34through the day.
00:13:35I would charge everything overnight and it would be fine.
00:13:36So it's like, it was kind of like being a power user of a phone on a normal day.
00:13:41So like that I track is like not great, but not show stopping.
00:13:44But it's like, again, if I look at the screen, it dies.
00:13:48Okay.
00:13:49So that's just like the thermals and the battery, right?
00:13:51Like, it seems inherently flawed if the thing is overheating and the batteries are dying
00:13:54too fast.
00:13:56Then there's the screen, right?
00:13:57Just keeping on with the hardware or the projector, which there's again, the line in your review
00:14:03is like they try to do everything possible to not have a screen.
00:14:06And then they have this, this thing should just have a tiny touchscreen.
00:14:09Like I'm, I am so a hundred percent convinced that for the stuff they want to do with the
00:14:15screen, which is basically like simple music playback, uh, settings menus, and to be able
00:14:21to look at a text instead of having it spoken aloud to you.
00:14:26The end, but then you'd have to take it off and like, look at the screen.
00:14:29This I was reading that and I'm like, but then you would have to manipulate it instead
00:14:34of having it clipped to your chest.
00:14:36You do this.
00:14:37You go one of these upside down.
00:14:39Yeah.
00:14:40I would remind you that either way I'm, I'm putting my hand right here.
00:14:43Like my hand is out.
00:14:45I have to use my hand. But it's seamless and has lasers.
00:14:49It does have lasers.
00:14:51You cannot take away the lasers.
00:14:52I'm just saying, do you want a laser projector?
00:14:55Do you want to fiddle with not a smartphone touchscreen, a laser projector?
00:14:58Let's go for it.
00:14:59I know how that decision was made, but it doesn't work, is the issue. Like, I watched
00:15:05the video of you trying to use the menu.
00:15:07If you haven't seen this video, pull over in your car and just watch David like rotate
00:15:11his hand in frustration and then imagine that like you're any person encountering David
00:15:18on the street, but you don't know, like it's just him rotating his hand and getting increasingly
00:15:23frustrated.
00:15:24I really like, I have been accused by a couple of people of like acting out that part and
00:15:30I cannot explain to you the extent to which that is not an exaggeration of the current
00:15:37situation.
00:15:38Like they just tried so many things and it's, they tried too many things and not enough
00:15:46of them work.
00:15:47And so it's like there are just so many little pieces of it where it's like you, you, you
00:15:52hold your hand out.
00:15:53And if it was just a thing for like, look at this text instead of having it read to
00:15:57you out loud.
00:15:58Great.
00:15:59Makes total sense.
00:16:00Uh, and actually does that job fairly well unless you're in bright light, in which case
00:16:04it doesn't work at all and you can't see anything projecting.
00:16:07Like I don't know how you've put me in the position of having to defend this device that
00:16:11I hate.
00:16:12No, this is the heart of your review.
00:16:15The heart of your review is you saying that it doesn't work and then it's broken and that
00:16:20no one should buy it.
00:16:21And then being like, but I love it.
00:16:22Yeah.
00:16:23It was so good.
00:16:24It's in there.
00:16:25There's like a real tension in there.
00:16:26I don't love it.
00:16:27I feel like it's because you have a toddler and you know, I, toddlers are notoriously
00:16:32bad at things and you're like, love them.
00:16:35You're like, oh man, you don't know what you're doing.
00:16:38Like I could see the toddler dad just like coming through.
00:16:41That might be true.
00:16:42You were just like really happy about the idea.
00:16:46Like they were trying something and it seemed like David was like, you know what, I respect
00:16:49you for trying.
00:16:50I just, do you remember, do you remember the early thing that they said about the Apple
00:16:54watch, which is like, this is, this is a closer to you and more aware of you computer.
00:17:02I still want that.
00:17:04Like that, I think that was a good idea in 2015 and I think it's a good idea now.
00:17:09And it is just alarming how not close we are to that technology.
00:17:14But like I just, an Apple watch plus good Siri is still a thing that I want and would
00:17:21use all the time.
00:17:22And I actually think has like a real place in our lives.
00:17:25And but, and now everybody's like, okay, well we're going to do that, but we're going to
00:17:28bake in these better language models and generative AI and we're going to be able to actually
00:17:33put all the pieces together.
00:17:34And the answer is like, nope, we didn't do it yet.
00:17:37It's not even close.
00:17:38But like I'm a decade deep into being like, yeah, I do want a computer that is closer
00:17:43to me than my smartphone.
00:17:46And we just don't have it yet.
00:17:48At the end of this, we're going to rank our children on a one to 10 scale and see if they're
00:17:52worth the ongoing subscription fees that we are clearly paying.
00:17:56It's going to be great.
00:17:57So that's the hardware, right?
00:17:58It's buggy, the projector doesn't work very well, overheats.
00:18:02Problem one.
00:18:03Then, you know, like the software, which I did not know was called Cosmos, C-O-S-M-O-S.
00:18:11Very good.
00:18:12It's not Cosmos, but it is definitely Cosmos.
00:18:14It's an OS.
00:18:15Yeah.
00:18:16Yeah.
00:18:17Very good.
00:18:18Cosmos.
00:18:19Unrelated to anything.
00:18:20Cosmos.
00:18:21Just lightning bolts out of nowhere.
00:18:22It's called Cosmos.
00:18:23Why the hell not?
00:18:24Yeah.
00:18:25I still have the same set of questions, actually, that I did a year ago.
00:18:30You want this thing to make a phone call.
00:18:32You have to sit around configuring a computer somewhere, right?
00:18:35And that happens on Cosmos, it happens on their website.
00:18:38Like, what does Cosmos do?
00:18:41Does it run applications?
00:18:43So what is the point of an OS and a thing that is just a voice-activated chat GPT?
00:18:49Because it's not just a voice-activated chat GPT, right?
00:18:52That's the thing that Cosmos is, basically, is an overlapping Venn diagram of AI systems.
00:18:58So when it wants to do something very basic on the device, that's one system.
00:19:04When it wants to go to basic internet real-time questions, that's another system.
00:19:09I think it's Perplexity, but I can't vouch for that for sure.
00:19:13Perplexity is on every other device that's doing this, so it seems like a natural assumption.
00:19:17But basically, Cosmos is like the routing system for, you want to do a thing, where
00:19:22does it go?
00:19:24And just the fact that that exists is part of the problem, because what it means is every
00:19:28time you want to do anything, you have to ask Cosmos to do it for you, and then it goes
00:19:34and figures out what it needs to do it, assembles all those tools, pings those tools to do the
00:19:39thing, comes back, translates all that into an answer, and gives it to you.
00:19:42And you know what that does?
00:19:43It takes so long.
00:19:44It takes so long.
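To make the flow being described here a bit more concrete, here is a minimal, hypothetical sketch of that kind of cloud-side routing layer: classify the request, dispatch it to some backend (an LLM, a web-search service, a music API), then turn the result back into something speakable. None of the names, backends, or timings below are Humane's; the stubbed delays just illustrate why every extra hop shows up as dead air.

```python
import time

# Stub backends standing in for "some LLM", "a web-search service", "a music API".
# Everything here is invented for illustration.
def call_general_llm(prompt: str) -> str:
    time.sleep(0.2)  # pretend model + network latency
    return f"(model answer to: {prompt})"

def call_search_service(prompt: str) -> str:
    time.sleep(0.3)  # pretend web lookup latency
    return f"(web results for: {prompt})"

def call_music_api(prompt: str) -> str:
    time.sleep(0.1)  # pretend streaming-service API latency
    return f"(now playing: {prompt})"

def classify_intent(utterance: str) -> str:
    # A real router would likely use a model for this step too; this is a toy rule set.
    text = utterance.lower()
    if text.startswith("play"):
        return "music"
    if text.startswith(("what", "where", "tell me")):
        return "web_question"
    return "general"

def handle_request(utterance: str) -> str:
    started = time.monotonic()
    intent = classify_intent(utterance)      # hop 1: decide where the request goes
    if intent == "music":
        raw = call_music_api(utterance)      # hop 2: ping the chosen backend
    elif intent == "web_question":
        raw = call_search_service(utterance)
    else:
        raw = call_general_llm(utterance)
    answer = f"{raw} [spoken aloud]"         # hop 3: translate the result for voice
    print(f"{intent}: {time.monotonic() - started:.2f}s round trip")
    return answer

handle_request("Play Texas Hold 'Em by Beyoncé")
handle_request("Tell me what restaurant this is")
```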
00:19:47And the biggest overwhelming problem with the PIN, as a user experience, is even when
00:19:54it works, it's slow to the point of being borderline unusable in most situations.
00:20:04The thing where you ask it the time, and it tells you the time so fast.
00:20:07Just beautiful.
00:20:08It tells me the time so fast.
00:20:10For almost everything else, it is somewhere between five and 30 seconds of dead silence
00:20:17while it tries to do something for you.
00:20:19And the silence, I got to the point where literally I'm like, I wish this thing had
00:20:21hold music, so at least I could know it was doing something.
00:20:24Like the, or computer sounds, like doot doot doot doot doot doot doot doot.
00:20:27Yeah, right.
00:20:28Right.
00:20:29The like old dial-up sound when it boots up to go do something for you, like give me that.
00:20:33This thing would be incredible if it made like 56k modem sounds.
00:20:35Yeah, that would be good.
00:20:36Every time it went out to the internet, everyone around you would be like, what is happening?
00:20:39Bong bong.
00:20:40Like all over the place.
00:20:41It would be very fun.
00:20:42We're just going to make modem sounds for the rest of the show.
00:20:45That's the rest of the show.
00:20:46It's a good show, guys.
00:20:47Don't worry about it.
00:20:48We're nominated for a Webby.
00:20:49If you'd like to vote for the Vergecast.
00:20:51Please vote for our modem sounds.
00:20:54This will not be the episode that we send it to the Webbys.
00:20:59It's slow.
00:21:00I mean, this is the thing that I've been thinking about a lot, right?
00:21:04So you ask Cosmos, you ask the AI pin to do something.
00:21:07The operating system, Cosmos, assembles a prompt or like first it figures out if it
00:21:12knows what request you're asking, what system on the back end should do the request.
00:21:18And then it assembles a prompt for you and goes and asks that system.
00:21:22Yeah, I think that's about right.
00:21:25I think there are versions of it that are less kind of generative AI prompt based.
00:21:30If it's just connecting to Tidal, it doesn't have to do quite as many complicated things.
00:21:34No, this is the reason I asked this question in that way.
00:21:36Because when you asked it to play Texas Hold'em by Beyonce, I think the Unicode character
00:21:42in Beyonce's name, she has an accent over the E in her name, it like broke it.
00:21:46So it spat out the Unicode number or like the hex code.
00:21:51And then it, you know, when like they broke Bing and turned it into Sydney.
00:21:56In whatever prompt exploit showed you like the instructions to make Sydney not be horny
00:22:01and be Bing.
00:22:02Like it did that, right?
00:22:03It was like, ask for a prompt in this way, don't tell the user the other thing, blah,
00:22:07blah, blah.
00:22:08Don't ask for clarification.
00:22:09I enjoyed that part.
00:22:10Don't ask for clarification is incredible.
00:22:12Yeah.
00:22:13It's beautiful.
00:22:14So it was that, right?
00:22:15It was like those pre-rule, the instruction set for the AI.
00:22:18But you were just asking it to play music, which should not require an AI.
00:22:21Yes, agreed.
00:22:23Right.
00:22:24And so that's, that, what you're seeing there is, I believe the work of Cosmos, right?
00:22:27Which is like, it takes me saying, play music.
00:22:29I believe this is the work of the Cosmos.
00:22:31This is the work of the Cosmos.
00:22:33Somewhere Neil deGrasse Tyson is screaming.
00:22:36Actually.
00:22:39And it, it then figures out what I'm asking and to what it should go ask.
00:22:44But I don't, I, it's, I could be completely wrong, but I don't think Tidal is running
00:22:50like an LLM against its own music.
00:22:52It just has an API that Cosmos is plugging into to find that song.
00:22:55So that's what I mean.
00:22:56Like the end points don't all have to be AI, but in the middle there is this mysterious
00:23:01AI translator that just isn't very good.
00:23:03And even when it is good, it's really slow.
00:23:06Did you ask Humane if they, anyone had ever tried to listen to Beyonce on their product?
00:23:10So they, they did say that particular thing was a bug and they fixed it.
00:23:14And it is true that since then I have not gotten Unicode spat back at me, but it still
00:23:21will not consistently play Beyonce when I ask it to play Beyonce.
00:23:25So like it's, it's broken in a sort of less spectacular way now, but it is still very
00:23:29much broken.
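We don't know exactly where Humane's stack broke, but as an illustration of how an accented character like the é in Beyoncé can surface as a "Unicode number or hex code," here is what happens in Python when a layer escapes non-ASCII text instead of passing UTF-8 straight through. This is an assumption about the general failure mode, not a reconstruction of Humane's code.

```python
# Illustration only: one common way an accented character ends up looking like a
# "hex code" is when some layer escapes non-ASCII text instead of decoding UTF-8.
name = "Beyoncé"

print(name.encode("utf-8"))           # b'Beyonc\xc3\xa9' -- the raw UTF-8 bytes
print(name.encode("unicode_escape"))  # b'Beyonc\\xe9'    -- escaped instead of decoded
print(ascii(name))                    # 'Beyonc\xe9'      -- ASCII-only representation
print(f"U+{ord('é'):04X}")            # U+00E9            -- the code point itself
```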
00:23:30But this is the part where I'm just still stuck on.
00:23:32Okay.
00:23:33So Cosmos, it's doing prompts, right?
00:23:36Like that revealed that there is some LLM-style prompting system happening in there
00:23:42and you are interacting with your voice with an LLM that then might go do stuff like
00:23:49play a Tidal song through the API.
00:23:51Yeah.
00:23:52So is it running that model locally or is that happening in the cloud?
00:23:55No.
00:23:56All of that is happening in the cloud.
00:23:58As far as I can tell, it does essentially nothing locally.
00:24:01So its heat is coming from the fact that it is constantly trying to use its garbage
00:24:06antennas to connect to garbage T-Mobile internet.
00:24:09Wow.
00:24:10That's a lot of garbage Alex.
00:24:11There's a lot of garbage.
00:24:12I mean, I assume they're garbage antennas.
00:24:13They may be lovely, but in practice they seem not great.
00:24:16I mean, it's, it's very small.
00:24:18The whole device is very small and is doing a lot of really aggressive thermal throttling,
00:24:22which is not great when what you need is reliable, fast connectivity.
00:24:26Yeah.
00:24:27So yeah, I know.
00:24:28I think you're exactly right.
00:24:29And the first answer that Humane gave me as to why this thing is warm is connectivity.
00:24:34But again, it didn't seem to matter if it was good or bad.
00:24:37It's just when it is connected, it is warm and it is connected all the time.
00:24:41We'll be nice in the winter time.
00:24:43Yes.
00:24:44Like a hand warmer.
00:24:45Yeah.
00:24:46And I tell people that the comparison for me is like, you know, that thing where you,
00:24:49you, you like crack the hand warmer and put it somewhere and it kind of is there for too
00:24:54long.
00:24:55And it's not like burning, but it's like a little, you've sort of overheated one small
00:25:00part of your body.
00:25:01That's what it feels like to wear the pin all the time.
00:25:03Do you ever get, what I have imagined it as you ever get thermal runaway in your phone,
00:25:07like your phone just like over tries to connect and gets hot in your pocket for like one second.
00:25:10And you're like, Oh, my butt's on fire.
00:25:12Yeah.
00:25:13What is going on?
00:25:14Yeah.
00:25:15And sometimes I'm not saying my brain is broken this way.
00:25:17It just feels like that's happening anyway, even though it's not because you have too
00:25:21many phones in your life.
00:25:22Oh yeah.
00:25:23Yeah.
00:25:24That's basically, that may just be a verge cast.
00:25:25Yeah.
00:25:26It's very, very, but for this audience, we're going to make modem sounds for the next 30
00:25:30minutes.
00:25:31So that's the software, right?
00:25:34It's running this OS in the cloud that appears to be LLM style that can go take actions with
00:25:39other LLMs.
00:25:40Yeah.
00:25:41I think that's.
00:25:42What about the, just the nuts and bolts of it?
00:25:45Like how do you put contacts into it?
00:25:47So that part there, they have a web app called humane center.
00:25:50They all call it .Center because the website is humane.center.
00:25:53Uh, that is basically a pretty simple just way to manage your thing.
00:26:00Like it's, it's the equivalent of like when you buy a device that connects to your phone
00:26:04and you download the companion app to like get everything connected.
00:26:07It's just that in the web.
00:26:08Uh, but it's when you take photos, that's where the photos upload.
00:26:11When you, uh, do notes, that's where your notes go.
00:26:15It keeps track of all of the things you've asked and all of the responses.
00:26:18So this is like when I say I'm transcribing the thing, I'm literally copying and pasting
00:26:22from its record of our interaction anyway.
00:26:26Uh, and the way you connect it is right now you just go in and you log in with your Google
00:26:31account or your Microsoft account and that downloads your contacts.
00:26:35And again, this is why it's so inexcusable that this stuff doesn't work.
00:26:38Like, you know what else is in my Google account is my email and my calendar.
00:26:41And so many other pieces of information that would be useful for my pin to know.
00:26:45But it just, it just doesn't have those things despite the fact that it has access to my
00:26:49Google account.
00:26:50Um, but I actually like, this was a little bit hard to test because it's part of the
00:26:55review process.
00:26:56Humane did some of that set up for me ahead of time.
00:26:59Uh, which I, I don't totally know why, but that is just how it worked.
00:27:04Uh, and so I don't have perfect out of the box set up information on exactly how that
00:27:10works.
00:27:11But I do know that I spent five minutes in the center, like getting accounts connected
00:27:14and then never really thought about it again.
00:27:16So it can sync your contacts.
00:27:18Yeah.
00:27:19I mean, that's, that's all, that's all it's doing.
00:27:21So like when I say, you know, call Neely, it looks in my contacts and finds Neely and
00:27:27then people I know named Eli and then my friend Marin for some reason.
00:27:33But your contacts, there's just like basic stuff.
00:27:36You ask it to make a phone call to someone.
00:27:38It needs my name and my phone number in a database.
00:27:41Did you put that in the database or did it sync it from Google?
00:27:44It pulled that from Google.
00:27:46So it did do that.
00:27:47Yeah.
00:27:48And then when you, I guess you asked it to send an email, but it can't send an email
00:27:50yet.
00:27:51So that's right.
00:27:52It just doesn't have an email feature.
00:27:53You can do things like you can give it memory in the way that like chat GPT has memory now
00:27:58where you can say like, remember that Anna is my wife.
00:28:01Uh, remember that this is the number I call Nilay on.
00:28:05Uh, and like that kind of stuff you can do sort of piece by piece manually.
00:28:10But like in general, I have probably multiple numbers for both of you.
00:28:14If I went to call, it would be like, it does this thing.
00:28:16Siri does right where it's like, which number do you want to call?
00:28:18I'm like, it's the one I always use.
00:28:20Like, what are we talking about here?
00:28:21But anyway, uh, but yeah, it is, it's pulling that directly from Google.
00:28:26I really love that.
00:28:27It's an AI that doesn't remember like the favorite number for you automatically.
00:28:32So much of this stuff is, yeah, that feels like just a core thing.
00:28:36AI should solve and be good.
00:28:38I dial this number all the time.
00:28:39Yeah.
00:28:40Like I hate that one.
00:28:41Uh, yeah.
00:28:42Call, call Becky iPhone.
00:28:43It's like, there's no other phone.
00:28:44There's just all these old numbers I haven't cleared out.
00:28:46Yeah.
00:28:47Right.
00:28:48That's the software.
00:28:49Is there anything else to say about the software?
00:28:50Does dot center work on a phone?
00:28:51Yeah, it's fine.
00:28:52It's a web app.
00:28:54I didn't dwell on it much in the review because it's kind of the least
00:28:57interesting part of it.
00:28:58It's like, it's just a very simple device management app.
00:29:02Like if you've ever been in the settings for your Google account to like see
00:29:06what apps it's connected to, it's just that it's the reason I'm asking is
00:29:09because all the stuff it wants to do, like photo syncing, does it send the
00:29:13photos to Google photos by itself or do you have to download them and upload
00:29:15them?
00:29:16You have to download them and upload them, which is, I would argue, maybe
00:29:20the single biggest gap in this because if you're humane, you can solve a lot
00:29:23of your feature problems just by connecting to other services, right?
00:29:26Like it can't do reminders.
00:29:28Like, fine, just pipe to Reminders on my phone.
00:29:31It doesn't do calendar.
00:29:33That's fine.
00:29:34I've already logged you into my calendar.
00:29:36Like there's so many of these things that it's okay.
00:29:39Right.
00:29:40But then they can't, right?
00:29:41Like ideologically they're like, that stuff is bad.
00:29:43We're doing it the new way with prompt engineering.
00:29:46That's right.
00:29:47And I think what you'll see over time, I would bet is a humane.
00:29:52Humane has big ideas about people building stuff for cosmos and like it
00:29:57being sort of its own app universe.
00:29:59But I think pretty quickly you're going to start to see it go the other way
00:30:01too and just plug into a lot more services.
00:30:04And then the pin becomes like a universal input system for all of your
00:30:08other systems, which I think is a way more compelling idea than having it be
00:30:11its own self-contained universe.
00:30:14Sure.
00:30:15Right now it shuts down if you use it for more than a couple minutes at a
00:30:17time.
00:30:18I just want to come back to that.
00:30:20Okay.
00:30:21So that's hardware.
00:30:22That's cosmos software.
00:30:23And then the last thing you said, which I think is the most, the biggest
00:30:26one to unpack, which is the AI stuff isn't ready.
00:30:29What do you mean by that?
00:30:30I don't know if you guys know this, but AI is a liar sometimes.
00:30:36And so I would put it into like three buckets, right?
00:30:40There's like the stuff that just straight up doesn't work where you're
00:30:44like, I need to set a timer or like,
00:30:46there's a piece of information that I need that this tool does not have
00:30:49access to right now.
00:30:50One of the things that they showed at the very beginning in those early
00:30:53demos was nutrition stuff where you like point it at,
00:30:56I think it was a handful of almonds in that first demo.
00:30:59Is that right?
00:31:00Or like a chocolate bar and you ask like, is this good for me?
00:31:03And that those features just don't work yet.
00:31:08What it can do, I discovered is read a label.
00:31:10Sometimes again, sometimes,
00:31:12sometimes it couldn't read a label on a bag of Chex mix and tell me if it
00:31:16was healthy,
00:31:17but it could read a label on a box of Cheerios and tell me it was healthy.
00:31:20So that doesn't make any sense. But anyway,
00:31:22so there's a set of things that can't do. And like the nutrition stuff,
00:31:25it just doesn't exist yet.
00:31:27That's like a feature they invented for that demo that is not yet on the
00:31:31pin.
00:31:32Then there's stuff that sort of occasionally works because AI is a liar
00:31:36sometimes. And I feel like my favorite example of that was like running
00:31:40around asking it to sort of tell me about the world. Right.
00:31:43And it's like, it has this feature called vision, uh, that it can,
00:31:46you say like,
00:31:47look at this building and tell me when it's open or look at this restaurant
00:31:52and tell me if it has good reviews. Uh, sometimes it gets that right.
00:31:56And again, it's very cool when it does. And other times it's just full lies.
00:32:00It told me there are a bunch of great moments like this in the video where we
00:32:04were down at the New York stock exchange and there was this company called
00:32:08ride R Y D E that I assume had just IPO that day.
00:32:11So they had the big banner outside and I pointed it at it and I said,
00:32:14look at this and tell me what company it is.
00:32:16And it thinks and thinks and thinks and thinks and thinks.
00:32:19And then eventually goes, this company is called lift.
00:32:22Very confident.
00:32:24So confident.
00:32:25And sometimes it would, it would misidentify buildings.
00:32:28It would tell me I was in completely the wrong place in New York city.
00:32:31It told me the Brooklyn bridge was the Triborough bridge with absolute a
00:32:34hundred percent confidence.
00:32:35Sometimes it would break and just describe the scene around me.
00:32:38I'd be like, what bridge is that? And it would be like in, in the scene,
00:32:42there are two poles in the water and buildings across the way. And I'm like,
00:32:46what are those buildings? And it would be like, they're buildings.
00:32:50Okay. Uh, yeah.
00:32:52And then the third thing is just stuff that it just constantly fails at all the
00:32:58time, which is like things that it should be able to do. Uh,
00:33:02and just can't like make a phone call where it was like half the time,
00:33:05truly half, I would say, like call Nilay, call Anna, text somebody.
00:33:11Uh, and it just wouldn't do it.
00:33:12And I think like there are a bunch of problems with this. Uh,
00:33:16I also wrote about this company this week called Aboard,
00:33:19which is doing some really interesting,
00:33:20like visual AI information organization stuff.
00:33:25And they were doing a demo for me.
00:33:26And he at one point like types in the prompt and just nothing happens.
00:33:30And his co-founder is like, Oh no, what, what do you think went wrong?
00:33:32And he just goes, ah, that's, that's just the AI. And it's just like, it's,
00:33:36it's like a little like feral animal that you like can't quite trust,
00:33:41but it's like sometimes around being cute and you're like, Oh,
00:33:43what am I? And so like, that's just true.
00:33:46And so you're trying to interact with this thing that sometimes just decides
00:33:49it doesn't like you and doesn't want to work.
00:33:51Other times is just a moron and just can't do lots of stuff.
00:33:55I point out once again, you're describing a toddler.
00:33:58Like that's a child. Yeah. Sometimes you just, why are you,
00:34:01why are you trying to kill yourself? Kid? Just stop it. Right.
00:34:04Fork right in the socket directly.
00:34:10Yeah.
00:34:11Arthur's new thing is he likes to pick up the dog's water bowl and just pour
00:34:14it all over himself and then get very upset about how wet he is.
00:34:18The Humane pin, everyone. Yeah, exactly.
00:34:21But so that's the thing and it's really, it's the like stuff it can't do.
00:34:26You solve over time, right? Like there are, there are features you can build.
00:34:30That's that I understand how we get out of the,
00:34:33it lies unpredictably and at random and about everything.
00:34:38This is like the fundamental problem of AI, to me, right now. Right.
00:34:41It's like I spent so much time asking a question, even a basic question.
00:34:46Like I,
00:34:47it turns out like half my search history is just asking questions about things
00:34:50my dog ate. Like is, is he going to,
00:34:52or is she going to die because she ate this? And uh,
00:34:56the answer is usually no. So that's good. But I found myself,
00:34:58I would ask the pin that question, it would give me an answer.
00:35:01And then I would have to get out my phone and check just to make sure because
00:35:05I don't trust the AI because you shouldn't trust the AI and it just completely
00:35:10defeats the purpose.
00:35:11And like we're so far away from these things being reliably honest and
00:35:16reliably fast and reliably useful that it's like, what's the, what's the point?
00:35:20So the thing that I'm stuck on is you stack the unreliability in this product,
00:35:24right? You have cosmos,
00:35:27which appears to be some sort of AI system that's like parsing a request and
00:35:31then figuring out what to do. And then often it's going to another AI system,
00:35:35which is chat GPT. In a lot of cases it feels like,
00:35:38which is unreliable in its specific ways or perplexity,
00:35:40which is unreliable in its specific ways.
00:35:43I asked perplexity to compare how long a car I'm thinking about buying is with
00:35:49my current car. And it just couldn't figure it out.
00:35:52It was just like the length of a Jeep grand Cherokee is not available.
00:35:55And I was like, I'm pretty sure it's available.
00:35:57I asked it again and said,
00:35:59please provide me the length of the Jeep grand Cherokee.
00:36:01No, that's literally your job.
00:36:03It's like, just go get it. You're a search engine.
00:36:06Just do it.
00:36:07Just figure it out. But that's like, you're stacking it up, right?
00:36:11You're stacking up one unreliable LLM with another one with potentially another
00:36:15one. And it just feels like you get to a place where it's, it's fun,
00:36:20but all of that is almost guaranteed to have a mistake embedded in somewhere or
00:36:24a hallucination embedded in somewhere.
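The "stacking unreliability" point can be put as simple arithmetic: if each stage of the chain succeeds independently with some probability, the whole chain succeeds with the product of those probabilities. With made-up rates of 95%, 90%, and 95% for routing, the model, and the downstream service, only about 81% of requests come back fully correct. This is a rough back-of-envelope sketch, not a measurement of any real system.

```python
# Back-of-envelope only, with invented success rates: chaining independent stages
# multiplies their failure risk, so end-to-end reliability drops fast per hop.
stages = {
    "intent routing": 0.95,
    "model answer": 0.90,
    "downstream API": 0.95,
}

chain = 1.0
for name, p in stages.items():
    chain *= p
    print(f"after {name}: about {chain:.0%} of requests still fully correct")
```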
00:36:26Yeah. I, I, I had a call with, uh, Bethany Bongiorno,
00:36:29the co-founder of humane the day before the review went live just to basically
00:36:33be like, Hey, this thing is not kind, just like no surprises.
00:36:36Want to let you know what's coming.
00:36:37This is the thing we do with a lot of stories. Just like you,
00:36:39you should not be surprised by what's coming.
00:36:42Yeah. And I, I had questions and I was like,
00:36:45if you want to respond to some of these things, let me know.
00:36:47And that was when she started talking about software updates.
00:36:49But I started by saying, here's what I want to understand is like,
00:36:52a lot of these things just straight up don't work.
00:36:55And I can't figure out whose fault it is.
00:36:58Like, is it yours as, as hardware builders?
00:37:01Is it yours as software builders?
00:37:03Is it the AI models underlying a lot of this?
00:37:06Is it the end points that don't know how to interact with those LLMs?
00:37:10Like, who is it?
00:37:11And she just literally just laughed.
00:37:12And the answer is kind of everybody.
00:37:14But the problem is you're exactly right.
00:37:16Until all of that stuff is good, not just one part of it,
00:37:20but until all of it is extremely reliable and fast and good,
00:37:24none of it works.
00:37:25Okay.
00:37:26So now we come to the heart of the matter here on the Verge cast,
00:37:29which is I read the review in drafts.
00:37:32I did not receive a no surprises phone call.
00:37:35I just read your review in Google docs.
00:37:38And at the bottom, David gave it a four out of ten.
00:37:41And then he left himself a note that said maybe this should be a three.
00:37:44I was totally fine with a four.
00:37:45And then I add the comment and I said it should be a three.
00:37:48You said you should give it a three, LOL, was the entirety of your comment.
00:37:52I think I made myself clear.
00:37:54We've just described a thing that overheats,
00:37:58that does not work for more than minutes at a time,
00:38:00whose software is often confused by the existence of Beyonce,
00:38:04unforgivable sin.
00:38:05And it was reliant on a number of, uh,
00:38:08systems that lie to you consistently and may murder your dog.
00:38:11That's $700, I think is what I would say. With a $24-a-month subscription.
00:38:15I was going to say, you have to yell the price.
00:38:16It'd be several.
00:38:17I'm going to be, it's coming so many times.
00:38:19Is that not a three?
00:38:21What would the extra point come from Pierce?
00:38:23It honestly, there, there's a decent chance.
00:38:25It should have been a three.
00:38:26I was, it's like, I'll just, I'll just be honest.
00:38:28Review podcast is just us being like, I should have taken one.
00:38:31No, it's like, I'll tell you my logic.
00:38:33And you can tell me whether it's fair or not.
00:38:35Uh, a three in our score, I believe the exact phrase is bad.
00:38:40That is, that is the first word next to three in our review rubric.
00:38:44Uh, and I think four is like multiple outstanding issues,
00:38:49I think is what it said.
00:38:50And so it's, it's somewhere between those two things.
00:38:52Right.
00:38:53And for me, I,
00:38:55I tilted the scale based on there were enough things that it did that
00:39:01were cool and valuable and eyeopening that I wouldn't just call it bad.
00:39:06Like I, I call it broken on purpose.
00:39:08Right.
00:39:09Because it's like, it is not, this is not a stupid product.
00:39:14It's just not a good product.
00:39:16Uh, and I think I, again, it's very possible.
00:39:20It's a three.
00:39:21Like I would not tell anyone to buy this product.
00:39:22$700, $24 a month.
00:39:27It's just funny coming from the seven.
00:39:29It's good.
00:39:30It's very good.
00:39:31I saw the, the score published this morning and I thought to myself,
00:40:34I know exactly how the Vergecast would go.
00:40:36Would you rather have eight, eight Humane AI pins or one Vision Pro?
00:39:41I'm taking the cash, man.
00:39:45Just give me the cash.
00:39:47To be fair.
00:39:48It's also only five humane pins.
00:39:50Uh, but anyway, I, yeah, I, I'm,
00:39:53I'm still torn between those two things and the tiebreaker for me was like,
00:39:57it was honestly, I think the same as the tiebreaker for you was like,
00:39:59this occasionally does things that are awesome.
00:40:02And I don't know how to, I don't know how to factor that in.
00:40:04It doesn't do it often enough.
00:40:05It doesn't do enough of them,
00:40:07but it occasionally does things that you're like, oh, I see it now.
00:40:10And I had,
00:40:11I had just enough of those moments that I was like, okay,
00:40:14this thing is not a complete failure.
00:40:16Like you've spent a long time with a thing and then you write about what
00:40:20it's like and then someone else reads it and they're like that.
00:40:22Nope.
00:40:23But I look weird.
00:40:24See, I read this review and I was like, you know,
00:40:27the four makes sense because I could see his affection for like what they
00:40:32tried, what they attempted.
00:40:33I could see like, he was like, you know what?
00:40:35There's some cool shit happening here.
00:40:37$700.
00:40:38Okay.
00:40:39$20 a month.
00:40:4020 times you try it.
00:40:42I think the fairest criticism you can give me of this score is I think if a
00:40:45bigger company had made it, I probably would have given it a three.
00:40:48Yeah.
00:40:49But it's a small company.
00:40:50I think I might've given them one point just for a startup tried to do a
00:40:54hard thing.
00:40:55Yeah.
00:40:56I buy that.
00:40:57Ultimately.
00:40:58Which I don't think I realized until it had published,
00:41:00but I think that might be what I did.
00:41:02But I think that's okay.
00:41:04Right?
00:41:05I mean, it is and it isn't right.
00:41:06Like it doesn't make the product any better,
00:41:08but it at least like, uh, I think,
00:41:11I think I might've given it like,
00:41:13I think if Google had made this thing and it was this exact thing,
00:41:17I think I probably would have been slightly harder on it.
00:41:19Yeah.
00:41:20Because Google also wouldn't have the same excuses.
00:41:22Right?
00:41:23Like,
00:41:24like I think humane has some real fundamental acceptable excuses.
00:41:29I mean,
00:41:30not when it comes to money,
00:41:31it has a ton of money.
00:41:32It does.
00:41:33It's not lacking for money,
00:41:35but it is lacking for that experience,
00:41:38putting this kind of product out into the world on a consistent basis and
00:41:42taking a big swing.
00:41:43And like Google doesn't do that.
00:41:46Well,
00:41:47if Google had put this out one,
00:41:48leaked it in full,
00:41:49like a long time ago,
00:41:51it would have been plastic.
00:41:52Yeah.
00:41:53Um,
00:41:54it still wouldn't have properly worked with Gina.
00:41:56Uh,
00:41:57that's a classic.
00:41:58And then,
00:41:59uh,
00:42:00we would have taken a point off like a pre point deduction because they're,
00:42:03they would have killed it in a year.
00:42:04Yeah.
00:42:05Yeah.
00:42:06We would have known no further updates would be available for this product.
00:42:08In fact,
00:42:09that's what happened with their last attempt at like a wearable,
00:42:11right?
00:42:12We're deducting a point,
00:42:13uh,
00:42:14the Google pre death point.
00:42:15Uh,
00:42:16it's just coming off.
00:42:17I,
00:42:18look,
00:42:19I buy the,
00:42:20you know,
00:42:21it's a startup.
00:42:22They did a hard thing,
00:42:23but the way they talk about it,
00:42:24actually,
00:42:25I'll,
00:42:26I'll,
00:42:27I will connect it to the vision pro.
00:42:28The way Apple talked about the vision pro definitely affected how we
00:42:29reviewed the product in the end.
00:42:30Is that fair?
00:42:31It's as fair as,
00:42:32um,
00:42:33uh,
00:42:34humane being a startup,
00:42:35right?
00:42:36It affects your perception of the product.
00:42:37Um,
00:42:38and it's like they can't,
00:42:39it couldn't do it.
00:42:40It couldn't,
00:42:41it couldn't withstand the weight that was being placed on the marketing,
00:42:44which is also 1000% true of humane.
00:42:48Oh,
00:42:49absolutely.
00:42:50Yeah.
00:42:51Right.
00:42:52And like today,
00:42:53you know,
00:42:54the,
00:42:55the joke at the top of the show,
00:42:56humane put out their statement in response to all the reviews.
00:42:57I don't think they were surprised by these reviews.
00:42:58No.
00:42:59And they said,
00:43:00big thanks for all the reviewer feedback on the AI Pin.
00:43:02It's been a wild ride from launch till now.
00:43:04Hearing from all of you is super valuable.
00:43:05The team appreciates the good vibes in the hardware and the potential it's
00:43:08unlocking,
00:43:09but we totally get it.
00:43:10There's room to make things even better.
00:43:11We'll all,
00:43:12we,
00:43:13we'll all learn how we can up our game,
00:43:14especially with Cosmos OS.
00:43:15Your insights are gold,
00:43:16guiding us to tweak and improve,
00:43:17making AI Pin better for daily use.
00:43:18That is 180 degrees different from how they talked about this thing until
00:43:24yesterday.
00:43:25Yeah.
00:43:26Right.
00:43:27And there's something there that I think is just really interesting.
00:43:29I would give them 8% more credit than that.
00:43:33I think if you look at humane in the last several weeks,
00:43:35they've been putting out this stream of like videos and stuff,
00:43:38showing people how to use it.
00:43:40And they've been making more jokes at their own expense.
00:43:42I mean,
00:43:43if you go back to that original launch video,
00:43:44it is the most ludicrously self-serious,
00:43:48like it's the video where Imran and Bethany are standing there like looking
00:43:51sad as they use their pin.
00:43:52You know what I'm talking about?
00:43:53The video is absurd and it's,
00:43:55it's extra absurd now having used this thing because they,
00:43:58they treat it as if they had invented fire and like it's,
00:44:02it's preposterous.
00:44:03But I think recently for whatever reason they have,
00:44:08they have pulled back a bit and been a little more honest and open about
00:44:13what this thing is and how it works and some of the limitations and all
00:44:15that stuff.
00:44:16And I think that's good.
00:44:18And I do,
00:44:19I do wonder and I have wondered many times if they had launched this thing
00:44:23more the way rabbit has,
00:44:24which is like,
00:44:25Oh,
00:44:26look at this fun,
00:44:27silly little toy we made.
00:44:28Isn't it cool looking?
00:44:29If I would feel differently.
00:44:30I don't know.
00:44:31I'm getting a rabbit in two weeks and I'm very curious to see.
00:44:33I'm so excited.
00:44:34How it feels as a cheaper,
00:44:36much less ambitious in a lot of ways gadget.
00:44:40Yeah.
00:44:41Yeah.
00:44:42I think,
00:44:43I think that the score for the Humane pin is reflecting that ambition in
00:44:47kind of a positive way.
00:44:48And we're reflecting,
00:44:49we're saying,
00:44:50you know,
00:44:51it is okay to be ambitious.
00:44:52That's why the vision,
00:44:53like part of the vision pro having a seven is it's ambitious.
00:44:55It,
00:44:56it,
00:44:57it swung for the fences.
00:44:58It missed real bad.
00:45:00Can I just say one thing about the Vision Pro?
00:45:02I just like to,
00:45:03I like to poke this at the seven.
00:45:04I can't wait.
00:45:05I'm aware that everyone thinks the seven is pure cowardice.
00:45:07Yeah.
00:45:08I would just,
00:45:09I would just tell you...
00:45:11We're going to have like a two-hour Vergecast on the score.
00:45:14I thought about filing a draft of this review with a seven out of 10 as the score,
00:45:18just to see what would have happened.
00:45:20But one thing with the Vision Pro,
00:45:24we,
00:45:25then we've got to move on: a hot new app for the Vision Pro is a Netflix and
00:45:29Amazon Prime app.
00:45:30It's a browser.
00:45:32It's just a browser with the skin on it.
00:45:34Web apps are saving the vision pro.
00:45:36Yup.
00:45:37Saving is generous.
00:45:38I'm sorry.
00:45:39Your YouTube app,
00:45:40your Netflix app,
00:45:41the one that actually supports spatial audio.
00:45:43I'm just saying turnabouts fair play.
00:45:46Yep.
00:45:47Apple,
00:45:48Apple's like official accounts were recommending that app.
00:45:50Yeah.
00:45:51It's,
00:45:52it's going to be web apps.
00:45:53If it's going to be anything,
00:45:54it's going to be web apps.
00:45:55It's very good.
00:45:56It's so funny.
00:45:57It's very good.
00:45:58All right,
00:45:59David,
00:46:00I'm sorry,
00:46:01man.
00:46:02Maybe the rabbit's going to do it for you.
00:46:03Probably not.
00:46:04We got to take a break.
00:46:05We'll be right back.
00:46:06All right.
00:46:10We're back.
00:46:11There's quite a lot in this section.
00:46:13We thought it was going to be a,
00:46:14like a lightning round,
00:46:15but we should just start with Taylor Swift.
00:46:17I think.
00:46:18Yeah.
00:46:19That's how we start every verge cast.
00:46:21No,
00:46:22it feels like I love Taylor.
00:46:23Who doesn't?
00:46:24I was on the Ezra Klein Show.
00:46:25I mentioned Taylor Swift and he goes,
00:46:26well,
00:46:27she's singular.
00:46:28You can't,
00:46:29you can't be Taylor Swift.
00:46:30It's very good.
00:46:31He was,
00:46:32he was not wrong.
00:46:33No,
00:46:34but I,
00:46:35I thought I'd like play the ACE card,
00:46:36you know,
00:46:37and he was like,
00:46:38no,
00:46:39that's,
00:46:40that's not...
00:46:41Not everyone can be Taylor Swift.
00:46:42Hearts broke across America that day.
00:46:43All right.
00:46:44Taylor Swift famously owns her own masters,
00:46:45but has distribution through Universal Music Group.
00:46:46Also her publishing,
00:46:47the songwriting money, flows through Universal.
00:46:49Universal, you may know, is in a bit of a spat with TikTok.
00:46:50They're not happy with each other.
00:46:51They don't like each other.
00:46:52Universal has pulled all of its music off of TikTok,
00:47:01leading to oceans of TikToks that are just silent.
00:47:02Very sad people.
00:47:03Uh,
00:47:04it's been a long time,
00:47:05right?
00:47:06This happened.
00:47:07And I,
00:47:08I think on the show we talked about it in the,
00:47:09the sort of consensus prediction was it would either get fixed right
00:47:12away or never get fixed.
00:47:13Yeah.
00:47:14Unfortunately Taylor has a new album coming out.
00:47:15It's going to get fixed.
00:47:16Well,
00:47:17I don't know if it's fixed,
00:47:18but Taylor's music is back on TikTok and the prevailing
00:47:19theory is because she owns her own masters.
00:47:25she can just cut her own deal.
00:47:28That tracks, that tracks.
00:47:30It tracks, but she's got a new album coming out.
00:47:32She's got to promote it.
00:47:33She's got to promote it.
00:47:34Lots of artists have been like,
00:47:36kind of doing this on the side,
00:47:37like Olivia Rodrigo, also Universal artist,
00:47:39her music isn't there,
00:47:40but she made a TikTok promoting her shows
00:47:42using a fan edit of her songs.
00:47:44It's been so fun, by the way,
00:47:46to watch artists try to figure this out.
00:47:48Like, I think the situation is terrible
00:47:49for artists, generally speaking,
00:47:51but seeing a bunch of, like,
00:47:53I saw a bunch of singer songwriters
00:47:55after this was happening, like,
00:47:57start to play live versions of their songs on TikTok
00:48:01so that other people could use it as sounds.
00:48:04And everybody's playing, like,
00:48:05sped up versions of their songs,
00:48:07and, like, the remixes are blowing up everywhere.
00:48:10It's the, all the ways people are finding
00:48:12around this ban is just totally fascinating.
00:48:15I just love that you guys have
00:48:16much nicer TikToks than mine.
00:48:18I was like, I wish I saw all of that.
00:48:20Yours is just, like, deep fried memes.
00:48:21It's just Jojo Siwa climbing out of the sea
00:48:24over and over again.
00:48:25I refuse to know who this is.
00:48:26I want to keep it that way.
00:48:28I will not know.
00:48:29Watch, oh my God, it's upsetting.
00:48:30And the internet knows that I don't want to know.
00:48:32I'm a person saying out loud, like,
00:48:35I won't say it, you know what I mean?
00:48:36Yeah, TikTok's like, pass.
00:48:38Yeah, it's just like, don't show it to him.
00:48:40Like, the ad targeting is like,
00:48:41this ain't gonna work.
00:48:42Yeah, it's like, mm-mm, not the right audience.
00:48:43Keep going, keep going.
00:48:45I will not know.
00:48:46No, I'm happy for you.
00:48:47I, like, I want to be over there.
00:48:48How do I get, how do I get to that TikTok?
00:48:50I want to watch people, like,
00:48:51do cool covers of their songs.
00:48:52Yeah, it's truck jumps and acoustic covers.
00:48:55Okay, so I'm very curious about the status
00:48:57of Universal and TikTok.
00:48:59Every social platform has an existential dependency
00:49:03on the music industry.
00:49:04If you lose those rights,
00:49:06all kinds of bad things start happening to you.
00:49:08So Universal is the biggest label around.
00:49:10It has a CEO, Sir Lucian Grainge, who's a Sir,
00:49:14very powerful, not shy, kind of a brawler.
00:49:17Does he have a little mustache?
00:49:18I don't know if he has a little mustache.
00:49:19I think that when I hear Sir.
00:49:21He's like a guy behind the guy type,
00:49:22but also very famous and very powerful.
00:49:24He basically told YouTube to cut it out with Fake Drake.
00:49:28Like, that stuff went up on YouTube,
00:49:29and he was like, cut it out.
00:49:31And YouTube caved, right?
00:49:32They put out, if you'll recall,
00:49:33they put out their, like, AI principles.
00:49:36They have this, like, new licensing system
00:49:37that they're gonna deploy
00:49:39that is, like, outside of regular copyright law.
00:49:41It's like special YouTube AI copyright law.
00:49:44All this stuff they're gonna do,
00:49:46they're doing it at the behest of Universal Music
00:49:48because YouTube isn't done.
00:49:49And Neal Mohan, who runs YouTube, is like,
00:49:51this is a licensing business,
00:49:52and, like, our partners need to be happy and great.
00:49:55TikTok, done.
00:49:57Like, they just, whatever.
00:49:59The other labels are actually kind of happy about this.
00:50:01If you look at the charts right now,
00:50:03top two artists on the charts are Warner Music artists.
00:50:07This is, like, a real thing that is going on.
00:50:08Is one of them Beyoncé?
00:50:10Well, and there's a sense among some people
00:50:14fighting this fight that Universal overplayed its hand
00:50:17in that sense, right?
00:50:18TikTok was the only company with enough clout of its own
00:50:22to fight back against Lucian Grainge
00:50:25and all of these sort of big swinging music labels
00:50:29because TikTok needed UMG less than UMG needed TikTok.
00:50:33And that seems to be this battle
00:50:35that we are still very much in the middle of
00:50:36that, again, Taylor Swift, being singular,
00:50:38is able to just cleave her way out of.
00:50:40Well, so TikTok thinks it has this power
00:50:42to break artists and create share.
00:50:45And really, it's just kind of like moving money around
00:50:48because the labels make no money from TikTok plays,
00:50:50as it is.
00:50:51They make very little money from their streams.
00:50:53All their money is on Spotify, Apple Music, or whatever.
00:50:56But they create share.
00:50:58Like, people listen to the songs on TikTok
00:51:00and they go to their streaming services.
00:51:02So it can shift the money
00:51:03and especially can break new artists.
00:51:05It can break new artists and, like,
00:51:07isn't still the primary way
00:51:09that these artists make money is performing.
00:51:11It's not anything digital.
00:51:13It's all going and doing a concert and taking a lot of money.
00:51:16Yeah, it's like 70% of artist revenue is that side.
00:51:20Yeah, and this is really good, great for that, right?
00:51:23Like, a perfect way to promote yourself.
00:51:25Yeah, you're on tour, you're doing a thing.
00:51:27I would argue part of Taylor's recent success
00:51:30is because of this.
00:51:33Oh, yeah.
00:51:34I mean, Taylor's singular.
00:51:36She is singular.
00:51:37As I have been told.
00:51:39Anyway, so I've just been poking at this
00:51:41and poking at this and trying to figure out what's going on.
00:51:43So the other labels are happy.
00:51:44And what you would expect is Universal walks.
00:51:48The other labels are like,
00:51:49we also want a better deal.
00:51:50We're gonna walk.
00:51:51And then all the labels together collectively
00:51:52pressure TikTok into cutting them better rates.
00:51:54But because of this dynamic,
00:51:56which is where the big bad label,
00:51:59Universal, with all of the artists walked, including Taylor,
00:52:02the other labels got a benefit.
00:52:05Yeah.
00:52:05So you see this dynamic like playing out in the industry.
00:52:08Now Taylor's back.
00:52:09We'll see how that goes.
00:52:10But here's what I've heard.
00:52:11And it's sketchy and unsourced.
00:52:12Maybe you can confirm this.
00:52:14Let me know.
00:52:15Call in the hotline.
00:52:16It was basically that Universal wanted a bunch of provisions
00:52:20that look like their YouTube provisions.
00:52:23Like, don't do bad AI stuff with our content.
00:52:24And they wanted an increase in royalties, obviously,
00:52:27and TikTok
00:52:28said no.
00:52:29Yeah.
00:52:30That's what I got for you.
00:52:31That sounds like why this would break down.
00:52:34Yeah.
00:52:35So like, if you know, you know, let me know.
00:52:38I would love the actual deal terms,
00:52:40but the sense I get is that there's money,
00:52:43which should be solvable.
00:52:46Like money is usually solvable,
00:52:48but next to it is a bunch of stuff
00:52:50that looks like what Universal wanted out of YouTube,
00:52:53which is we want some special AI stuff, AI terms.
00:52:58And like, it's just not happening.
00:53:00Yeah.
00:53:01And TikTok is just a very different company
00:53:04than YouTube and Google, right?
00:53:06Like their motivations for all of this are very different.
00:53:09I saw today
00:53:10there was a story, I think in The Information,
00:53:12about how TikTok was looking into working with companies
00:53:15to develop like AI avatars to better sell.
00:53:18Like they don't care who's on the platform.
00:53:21They can monetize it.
00:53:22So I think they're much less incentivized
00:53:25to make these deals than YouTube was.
00:53:28I think TikTok is a danger to itself.
00:53:33I agree with that.
00:53:33I think they are so rapidly turning that thing
00:53:36into the home shopping network
00:53:38and just making it so commercialized.
00:53:40And I think in a way that's really putting people off,
00:53:43we had a story on the site this week from V
00:53:46because she's gonna be calling on a lot
00:53:48of these kinds of gadgets we see on TikTok.
00:53:50And the first one she did,
00:53:51she spent her own money, like $350
00:53:55because there was a wand that would make the skin beautiful.
00:53:58And like, some of us just like our skin
00:54:01to be dewy and glowing.
00:54:03I'm one of those people.
00:54:04Yeah.
00:54:05Wait, let me guess.
00:54:06Can I guess?
00:54:06I bet it didn't work.
00:54:08It's unclear.
00:54:09Okay.
00:54:10What did work was the FOMO,
00:54:12the massive, massive regret she had.
00:54:15And we're seeing that constantly with TikTok.
00:54:19They're just constantly selling crap.
00:54:21There's the, not crap.
00:54:23Some of it seems to be workable.
00:54:24No, I think crap is fine.
00:54:26Yeah.
00:54:27You're good with crap.
00:54:28Okay.
00:54:28I think you can stage a convincing legal defense
00:54:31on crap there.
00:54:32I forget about that.
00:54:32But no, I totally agree.
00:54:35And I think in general,
00:54:36TikTok is this platform that is not interested
00:54:40in being tied down to anything.
00:54:42Like it moves so fast culturally,
00:54:45it moves so fast technologically
00:54:46that you don't get the sense
00:54:48that it's interested in having any rules
00:54:51for anything, for any reason.
00:54:53And YouTube is just so much more sort of lowercase
00:54:58and uppercase mature than that, right?
00:55:00That it is like learning how to play by these rules
00:55:03in order to be around for a long time.
00:55:06TikTok just wants to move a million miles an hour
00:55:07in every direction all the time.
00:55:09Yeah.
00:55:10And I certainly enjoy TikTok less than I used to
00:55:13because of that.
00:55:13Like the platform is sort of unrecognizable
00:55:16from what it was even 12 months ago now.
00:55:19Yeah.
00:55:20That sort of 2020, 2021 TikTok heyday is gone.
00:55:25Yeah.
00:55:26You know, I maintain a list of TikToks
00:55:28that should be PhD theses in media studies.
00:55:31And many of them are deleted now.
00:55:32They're just gone.
00:55:33Like the creators have quit.
00:55:35They've pulled them down.
00:55:36Like they just don't want to be a part of it anymore.
00:55:37And there's something about that
00:55:39where it's like this thing is getting
00:55:40so commercial so fast.
00:55:41It's so, like I can't tell you how many ads I see now.
00:55:45And it's just got, it's mainly those Lenovo headphones
00:55:48and the guy being like, man.
00:55:48I found you.
00:55:49I love these Lenovo headphones.
00:55:51Apple doesn't want you to know how good they are.
00:55:53And I'm like, sir, I think that's just,
00:55:55you're lying to me.
00:55:57I don't think you have that insight
00:55:59into Apple or Lenovo's business.
00:56:02Why is this on my feed?
00:56:04Why does it have thousands of likes and views?
00:56:06Is that the one with the screen on the case?
00:56:08No, it's like they go in your ear
00:56:11in a weird like twisty way.
00:56:12Yeah.
00:56:13Don't worry.
00:56:14We'll be checking them out.
00:56:15We want to see if they really are as good.
00:56:17Is Apple lying to us?
00:56:18I love that you're just like V.
00:56:19Yeah.
00:56:20Buy crap.
00:56:21V was like, do I have to?
00:56:22And I was like, enjoy.
00:56:24I'm excited for this.
00:56:25Yeah, it's going to be fun.
00:56:26This is going to be the greatest series
00:56:27of TikTok videos we ever make.
00:56:29Because everyone else on TikTok is getting paid.
00:56:32And we're just like, does it actually work?
00:56:35Let's find out.
00:56:36And it doesn't.
00:56:37Stop buying it.
00:56:39Look, I think between the universal stuff,
00:56:40not having that catalog,
00:56:41and not, from what I can tell,
00:56:43making any steps towards a resolution,
00:56:46eventually all the other label deals are going to come up.
00:56:49And the labels might be short-term thinking
00:56:53and like they're getting benefit now.
00:56:55All of them are actively talking
00:56:58to all of the platforms about AI stuff.
00:57:00All of them, they do not want their stuff trained upon.
00:57:03They do not want the videos
00:57:05that their stuff gets used in trained upon.
00:57:07They want AI controls.
00:57:09And like some of the platforms have like weird,
00:57:13like you have to make a distinction
00:57:14between the kinds of AI you use.
00:57:16So if you roll up to, I don't know, Spotify,
00:57:18and you're like, don't use AI.
00:57:20and Spotify's like, dude, we've been using ML
00:57:23to do our playlists for like 100 years.
00:57:25Yeah.
00:57:26Like we're going to keep using,
00:57:28like, and so there's this whole education process
00:57:30happening in the industry, which is fascinating.
00:57:32Other interesting companies are like tools
00:57:35that use AI to like help you filter sounds
00:57:37or like pull out stems.
00:57:38Like they're in the same conversation,
00:57:40and they're like, this isn't that.
00:57:41This isn't the bad thing.
00:57:43Yeah.
00:57:44But it's all got, everyone thinks AI is generative AI.
00:57:47Yeah.
00:57:48I noticed that.
00:57:49It's really consistent and really annoying.
00:57:52Yeah.
00:57:53I'm like somewhere James is just screaming.
00:57:56But I just think like the next turn for TikTok
00:57:58is like this rapid, I mean, the term is
00:58:01Cory Doctorow's term.
00:58:01It's enshittification, right?
00:58:04Where it gave a lot of value to users.
00:58:06It built up a huge user base and now is like aggressively
00:58:10trying to re-extract that value from its user base.
00:58:13I feel like people are going to be like,
00:58:14you know what?
00:58:15Instagram Reels exists.
00:58:16YouTube Shorts exists.
00:58:17I don't have to do this here.
00:58:19Other ways.
00:58:20Also, I might get banned.
00:58:20Yeah.
00:58:21There's just other ways to spend your time.
00:58:26And I don't know.
00:58:27I think we're about to have a turn
00:58:30from this real love of these short, super ephemeral videos
00:58:34to something more stable.
00:58:37Novels.
00:58:38Yeah.
00:58:39Scrolls.
00:58:40Yeah.
00:58:40Everybody's going to be doing Only Scrolls.
00:58:42That's the future.
00:58:43Only Scrolls is a hell of a name for a property.
00:58:46Yeah.
00:58:47It's just like Shakespeare nudes.
00:58:51Yeah.
00:58:54Famous, famous writer of scrolls,
00:58:56William Shakespeare.
00:58:58Sorry.
00:59:01But I do think we're going to see like a change
00:59:04in how people are consuming.
00:59:06Like, I think there's a rapid moment of change
00:59:09in how we consume media and what we're consuming
00:59:11and what we prioritize.
00:59:13And there's so much happening in the space.
00:59:17We're going to talk about it in a little bit,
00:59:18but like what's happening with movies and streaming,
00:59:20a ton is happening there and it's happening really quickly.
00:59:24And now we're having the same thing with social media.
00:59:25These are all ways we consume our media.
00:59:27And they're all in this moment of enormous transition.
00:59:30And what does that look like on the other side?
00:59:32Alex, you are so close to pitching Quibi
00:59:35that I just want it, you're so close.
00:59:38What you want is,
00:59:40I think what the future is,
00:59:41is actually like an app where the whole video
00:59:44will change format when you flip,
00:59:46when you turn the phone.
00:59:48It's going to blow your mind.
00:59:49Have you thought about a show that's only at night?
00:59:51Yeah.
00:59:51It will only launch with pandemics.
00:59:53What if shows came out at night?
00:59:54Huh?
00:59:56David, have you ever asked the AI Pin
00:59:58to generate you a nude William Shakespeare?
01:00:02No, I could, but we'd get the explicit tag on the podcast.
01:00:05We can't have that.
01:00:06Actually, this brings us to the other big story
01:00:08of the week, I think.
01:00:10A big New York Times piece on OpenAI and Google
01:00:12and all the rest finding ways
01:00:14to generate more training data.
01:00:17I think OpenAI is getting itself
01:00:18into a lot of trouble here.
01:00:20Their startup, they played really fast and loose, right?
01:00:22Ask for forgiveness, not permission.
01:00:25We're now the hottest company in the world.
01:00:27We have billions of dollars in Microsoft partnership
01:00:30and we're getting sued by the Times.
01:00:31Yeah, like the ask forgiveness, not permission works
01:00:34when you're not a super powerful company
01:00:37and you don't have a ton of money behind you.
01:00:40Works a lot less when you've got Microsoft,
01:00:42money bags, Microsoft behind you.
01:00:44Yeah.
01:00:45So OpenAI needed more training data to train GPT-4.
01:00:48It developed a system called Whisper,
01:00:51which transcribed YouTube videos
01:00:52and trained on over a million hours of YouTube videos.
01:00:55That's not great.
01:00:56There's a little back and forth
01:00:57in the media going on about this.
01:00:59So Joanna Stern at the Wall Street Journal,
01:01:00we may have mentioned this particular thing before,
01:01:03but Joanna was talking to Mira Murati, the CTO of OpenAI.
01:01:06She said, did you train on YouTube videos?
01:01:09And Mira, in every way possible, said, I don't know,
01:01:12which is insane.
01:01:13While also making a face that said, yes.
01:01:15Yeah.
01:01:19Not great.
01:01:19And then Neal Mohan was on, I believe, Bloomberg,
01:01:23and they asked the same question.
01:01:25And he was like, well, if they did,
01:01:28that would violate our terms of service.
01:01:29But it also, in that same New York Times story,
01:01:32wasn't there bits about how Google had also trained
01:01:37on some YouTube videos?
01:01:38Yeah, so Google owns YouTube.
01:01:40So this is very challenging for Google.
01:01:42So Google is built, and I say this just as a factual matter,
01:01:46Google is built on a very expansive view of copyright law,
01:01:51and it has aggressively expanded the boundaries
01:01:53of copyright law throughout its existence.
01:01:55So the very idea of a Google index
01:01:58requires you to go read a bunch of data.
01:02:01Google Image Search requires them to host copies
01:02:03of lots of images.
01:02:04That was a lawsuit.
01:02:05YouTube, Viacom famously sued Google
01:02:08for a bunch of stuff on YouTube.
01:02:09It was found out later that Viacom employees were
01:02:11uploading videos to YouTube to promote them,
01:02:13in echoes of the TikTok situation.
01:02:15But they beat Viacom.
01:02:17YouTube exists.
01:02:18So Google just constantly expands
01:02:21the boundary of copyright law.
01:02:23As a function of its existence, that's a thing Google does.
01:02:25Google Books:
01:02:26we're going to scan all the books
01:02:27in the world without permission to make an index of them
01:02:30and then convince a judge that this will sell more books.
01:02:33It worked.
01:02:33That worked.
01:02:35Google won those cases when it was like a cuddle bug,
01:02:38just a bunch of goofballs.
01:02:40We got slides in the office.
01:02:42The judge had like a Dell PC, and they're like,
01:02:45this internet's amazing.
01:02:47Those days are over.
01:02:48It's a gateway.
01:02:49Well, and again, when the theory behind all of those products
01:02:52was to help people find them.
01:02:54It wasn't always true, but that was at least the story,
01:02:57was that Google is saying we are going
01:02:59to ingest this stuff in service of helping people find them
01:03:03and go back to you, the creator of them.
01:03:05Yeah, we're going to index all these links,
01:03:07and then we're going to send you to the web pages
01:03:09that we're linking to.
01:03:10We're going to ingest all these images,
01:03:11and then you can go look at the images for real.
01:03:13Fine.
01:03:14But it is also true that the judges were evaluating
01:03:17a service that existed on like a CRT monitor in the den,
01:03:21in the computer room.
01:03:23And Google was a bunch of like 20-year-old kids,
01:03:25and they were just like a different company,
01:03:28a different time, a different company,
01:03:30different cast of characters, different relationships
01:03:32to power, blah, blah, blah, blah, blah.
01:03:34You come to now, and you have OpenAI headlong
01:03:39into a dispute with Google about training on YouTube.
01:03:43You have Google in a headlong dispute
01:03:44with its own creators about whatever
01:03:46the YouTube terms of service say.
01:03:48Google expanded its terms of service recently.
01:03:50All these are going to be, you have the New York Times
01:03:53suing OpenAI.
01:03:53You have all these lawsuits happening simultaneously,
01:03:56or all these conflicts happening simultaneously,
01:03:58and none of these companies are as sympathetic as Google was.
01:04:03I think it's because they're so nakedly doing it for money
01:04:07in a way they weren't before, right?
01:04:09The cost for, or the benefit for regular people
01:04:13is much lower now than it was before.
01:04:16And I think that's what we're seeing
01:04:18with a lot of just the general tension around generative AI,
01:04:21is it feels like, OK, we are devaluing things
01:04:25that we tend to feel very strongly about,
01:04:29all in the effort to make Eric Schmidt richer.
01:04:32And shockingly, people don't want to do that.
01:04:36I'm surprised.
01:04:37So a very funny part of the story
01:04:39is Meta, a less sympathetic company on the scale of things,
01:04:44although Zuck, now that he's ripped
01:04:46with the shaggy hair and the good jackets,
01:04:48he's come back around.
01:04:51They want the training data so bad
01:04:53that they considered buying Simon & Schuster, just
01:04:57a straight-up book publisher, so they could just
01:04:59ingest all the books without copyright wars.
01:05:01That's crazy.
01:05:02That's just a crazy place for us all to be.
01:05:05This New York Times story, and I really
01:05:07encourage our audience to go read it.
01:05:09It's called How Tech Giants Cut Corners to Harvest Data for AI.
01:05:13It came out on Sunday.
01:05:15This Sunday.
01:05:15If you're listening to this on Friday,
01:05:17it came out this last Sunday.
01:05:19It's really, really good.
01:05:20And I think what really struck me,
01:05:21the big takeaway I came out of this story with,
01:05:24was that these companies have all
01:05:26recognized there is an intrinsic value in creating cool stuff
01:05:31and putting it out into the world.
01:05:32They recognize that.
01:05:34And they are now saying, how do we make machines
01:05:36copy all of that so we can do it worse for more money?
01:05:43And I think that was just a really demoralizing thing,
01:05:48I think, to read as somebody who does create things
01:05:50and that's my job.
01:05:52It was like, oh, that sucks.
01:05:54But the story was just fascinating
01:05:56because it was just like, they are in such a race
01:05:58to get data, to buy Simon & Schuster,
01:06:00to potentially steal their own products from Google
01:06:05or from YouTube and from, yeah.
01:06:07Right.
01:06:08So a clear problem here, and this is sort of always
01:06:09a problem when it comes to regulating companies this rich,
01:06:12is that all of this might just seem like an acceptable tax
01:06:15to them.
01:06:16Yeah.
01:06:16So, yep, we ripped off a bunch of book publishers.
01:06:19Sarah Silverman's mad.
01:06:21We're just going to pay her the money.
01:06:24But some of them are not going to settle.
01:06:26I don't think Sarah Silverman's going to settle.
01:06:28I don't think The Times is going to settle that case for money.
01:06:31And so if they lose those cases, the precedent is really bad.
01:06:34And the case that I'm just thinking about a lot
01:06:37is, what if YouTube creators pressure Google
01:06:42into suing OpenAI or being in some open financial conflict
01:06:46with OpenAI?
01:06:47What if YouTube creators sue OpenAI directly and say,
01:06:49you scraped YouTube, and implicate Google along the way?
01:06:54Because neither one of those companies
01:06:56wants to set the precedent that training an AI model
01:06:59is copyright infringement.
01:07:01Yeah, they both want it to not be.
01:07:02And that was one of the things in the story,
01:07:04Google was even like, are we allowed to scrape
01:07:08our own stuff from YouTube?
01:07:09And everybody was like, don't worry about it.
01:07:10Right, and then they expanded their terms of service.
01:07:12OpenAI told us when we were writing about the story
01:07:14that OpenAI uses numerous sources, including publicly
01:07:17available data, big circle around what that means,
01:07:22and partnerships for non-public data,
01:07:24and that it is looking into generating
01:07:25its own synthetic data.
01:07:27Which is insane.
01:07:29Multiple terms.
01:07:31I would just remind everyone listening to this,
01:07:34having something on the internet does not mean
01:07:35it is free to use.
01:07:37People get very confused about this concept.
01:07:39Like, you can put something on the internet
01:07:41that does not mean it is publicly available.
01:07:44Like, it means that it's there.
01:07:48It technically means that it's publicly available
01:07:50and that people can access it.
01:07:54But it doesn't mean that it's actually legally
01:07:55publicly available.
01:07:57Yeah, because there's incidents of people being like,
01:07:59getting sued for accessing publicly available.
01:08:03I'm using scare quotes here.
01:08:04Publicly available data, and then it's like,
01:08:06no, you weren't actually supposed to go in there
01:08:07and you knew you weren't supposed to.
01:08:10Bad.
01:08:11Yeah, that's bad.
01:08:11I would say two things to that, though.
01:08:12One is that the precedent we have on a lot of that stuff
01:07:17leans toward it being okay to scrape websites.
01:08:24And you're also the one who comes on this show
01:08:27every time we talk about copyright law
01:08:28and reminds us all that copyright fights are a coin toss.
01:08:33I'm just saying, like, if OpenAI's position
01:08:35is we use publicly available information,
01:08:38what OpenAI believes publicly available means
01:08:40actually turns out to be massively important.
01:08:44I think what they mean is we clicked on it on a website
01:08:49and so it's ours now.
01:08:50That does seem to be the case, yes.
01:08:52And I'm just saying, like, there's not an interpretation
01:08:54anywhere where that is the thing, right?
01:08:57And so you just get to a point where any YouTube creator,
01:09:00Mr. Beast, do it for the views.
01:09:03You know what I'm saying?
01:09:04Just tell OpenAI you're gonna sue them
01:09:06because they copied your YouTube videos.
01:09:08I sued Google for $100 million.
01:09:10Right?
01:09:12He'd give it away.
01:09:13No, OpenAI.
01:09:13I'm a YouTube creator and I sued OpenAI.
01:09:15And OpenAI is gonna say your video is publicly available.
01:09:19We believe that this is not copyright infringement.
01:09:21And Google is gonna sit in the middle of that,
01:09:23being like, we need to protect our YouTube creators.
01:09:27This is not in our terms of service.
01:09:28That's what Google said to us.
01:09:30Google takes technical and legal measures
01:09:32to prevent unauthorized use when we have
01:09:34a clear legal or technical basis to do so.
01:09:36That's Google's approach, right?
01:09:38This is ours.
01:09:39We have contracts with creators.
01:09:40You can't just take it.
01:09:42There's just a war here coming.
01:09:44Because all these companies are like,
01:09:46we're gonna go read the entire web.
01:09:48And Google's like, but that's what we do.
01:09:51There's something bad brewing here, and they run
01:09:53one closed platform in YouTube.
01:09:55And I just could not tell you what happens next.
01:09:57So that's one.
01:09:59That's just publicly available.
01:10:00Then there's synthetic data.
01:10:01Oh my God.
01:10:01Which is bananas.
01:10:02So all these companies now are like,
01:10:04we're running out of training data.
01:10:05The humans are not making art fast enough for us to steal.
01:10:08What if we train on AI generated data?
01:10:12And I just am like, you guys are gonna kill yourselves.
01:10:15Well, that was the consensus in the story too.
01:10:18They spoke to actual experts who were like,
01:10:20no, that's bad.
01:10:22That just makes all the problems worse.
01:10:25All the hallucinations just get worse.
01:10:27You just start going down these weird rabbit holes.
01:10:29And then you have Kevin Roose like,
01:10:31am I in love with my AI?
01:10:33That's what happens.
01:10:34Yeah, Sydney's like, I love myself,
01:10:36but also I hate myself.
01:10:38Becoming, finally, the full goth Sydney
01:10:40that we've been looking for.
01:10:42I think the thing that really jumps out to me
01:10:44about the story, just from a purely unrelated
01:10:46but insane thing, is that Whisper
01:10:49was created at OpenAI explicitly to do this.
01:10:54Whisper is amazing technology.
01:10:57And the speech-to-text stuff that is happening,
01:11:00again, we're all journalists,
01:11:01we transcribe things all the time.
01:11:03More and more of that is being powered by Whisper.
01:11:05Whisper is now a publicly available technology
01:11:09that OpenAI just put out there in open source
01:11:11that anybody can have it.
01:11:12It is remarkable tech.
01:11:14And the idea that it was built just to steal
01:11:16a bunch of YouTube videos is wild.
01:11:19And what a deeply bizarre secondary effect
01:11:24of all of this.
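For anyone wondering what "anybody can have it" looks like in practice, here is a minimal sketch, not from the episode, of transcribing one local audio file with the open-source whisper Python package; the model size and file name are placeholder assumptions.

```python
# Minimal sketch: speech-to-text with the open-source "whisper" package
# that OpenAI released (pip install openai-whisper; ffmpeg must be installed).
# "base" and "interview.mp3" are placeholders, not files from the episode.
import whisper

model = whisper.load_model("base")           # downloads the model weights on first use
result = model.transcribe("interview.mp3")   # runs transcription on a local audio file
print(result["text"])                        # prints the full transcript as one string
```

Run that same loop over a million-plus hours of video and you have, in essence, the pipeline the Times story describes.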
01:11:26But it also, I think, this story of training data
01:11:30becoming really valuable and really expensive
01:11:34is starting to be everywhere.
01:11:35There's stories about all these companies,
01:11:38whether you wanna train a large language model
01:11:40or you're a creative tools company
01:11:43looking for images to train your healing brush on.
01:11:47This stuff is getting expensive.
01:11:49And I think one thing we hear a lot as journalists
01:11:51is that the future of the media is making stories
01:11:54that are training data for these models,
01:11:57that our job is gonna be to report real-time information
01:12:00to a large language model, and that's our business.
01:12:02And I find that bleak as all hell.
01:12:04But that future is coming,
01:12:08where there are going to be businesses that are made
01:12:11that make a lot of money just being training data.
01:12:15And that is nuts.
01:12:16Yeah.
01:12:17There are gonna be people who are like,
01:12:18don't worry, our artificial writing
01:12:20is way better than anybody else's.
01:12:22Oh, that's already true.
01:12:23It's already true.
01:12:25It's gonna be more common.
01:12:28It's like the next phase of ClickFarms.
01:12:30Yeah.
01:12:31Alex, you came to us from G/O Media.
01:12:32Yeah.
01:12:33Wow.
01:12:35It wasn't always a club.
01:12:37It started as one, then it stopped.
01:12:39I'm sorry, I came up at Engadget.
01:12:42I always had an eye on Gizmodo, and I...
01:12:44Yeah.
01:12:45No, I'm just talking.
01:12:46No, no, no, no.
01:12:47Like, I think about that a lot.
01:12:49And you hear people talk about how upset...
01:12:52I talk to a lot of writers and stuff who are like,
01:12:53oh, no, our jobs are gonna be taken away.
01:12:55And it's like, well, the ClickFarm jobs
01:12:56are gonna be taken away.
01:12:58But those are also the first jobs you get in media
01:13:01a lot of times now, especially nowadays.
01:13:03And it's like, okay.
01:13:05So that on-ramp is gone.
01:13:07Yeah.
01:13:08Look, I think the idea that they're gonna train
01:13:10these systems on synthetic data
01:13:11because they've run out of real data
01:13:12and the copyright wars are in full effect.
01:13:16I just keep saying, I know what we're gonna write about
01:13:17for the next 10 years.
01:13:18Yeah.
01:13:19Like, no AI can figure that out.
01:13:22I just always thought it was gonna be monkeys.
01:13:23I can barely tell David the time.
01:13:25That's also like, that cycle you just described
01:13:27is the end of the internet.
01:13:30Seriously.
01:13:31Like, Google training on Google created data
01:13:36from Google training is the end of the internet.
01:13:40Like, that becomes an unusable disaster in record time.
01:13:45I keep, like Alex just said,
01:13:46she accidentally invented Quibi.
01:13:48Once you get enough people talking about that,
01:13:51everyone accidentally invents Yahoo.
01:13:53Like, what if we made a list of good websites?
01:13:55It's like, yeah, that's Yahoo.
01:13:58That's what that was.
01:13:58It was unsustainable.
01:14:01We're gonna make a list of good websites.
01:14:03Here at theverge.com, which you can visit to,
01:14:04which you can visit directly in your web browser,
01:14:07and which is written by people.
01:14:08And it'll be sustainable.
01:14:09For now.
01:14:10All right, we gotta take a break.
01:14:11We're gonna come back with lightning round.
01:14:12I'm very excited about my lightning round item.
01:14:14We'll be right back.
01:14:15We'll be right back.
01:14:19All right, we're back.
01:14:21I'm gonna go last.
01:14:22Yeah, there's big caps next to yours.
01:14:25Like, it's in all caps.
01:14:26It's in all caps.
01:14:27It's beautiful.
01:14:27You wanna go first?
01:14:28I'll go first.
01:14:29I'll go first.
01:14:30I'm gonna do two, actually.
01:14:31You're gonna do two?
01:14:32I love it.
01:14:33I'm gonna do two, yeah.
01:14:33The first one is gonna be,
01:14:34because I just have to talk about it,
01:14:35Kobo has new color e-readers.
01:14:38They're both sitting in my house right now.
01:14:40They're using Kaleido 3.
01:14:41It's the exact same stuff that Onyx Boox
01:14:43has been using since last year.
01:14:45Oh, you know, Kaleido 3.
01:14:46That technology that everyone knows
01:14:48and is familiar with and has definitely heard of.
01:14:50Many people listening to this podcast right now
01:14:52are like, yes, she gets it.
01:14:53Classic Kaleido 3 talk, am I right, Eli?
01:14:56I got you.
01:14:57I'm like, Alex is talking Kaleido 3 again.
01:15:00Don't worry.
01:15:02But they look really cool.
01:15:04I only got them yesterday.
01:15:05I haven't gotten to play with them yet.
01:15:06But I think what's really exciting about it
01:15:08is Kobo is probably one of the largest
01:15:09e-reader manufacturers for buying stuff
01:15:12in the United States.
01:15:13They're Amazon's primary competitor.
01:15:15So that was what was exciting to me.
01:15:16I was like, oh, wow, someone who actually competes
01:15:18with Amazon is doing this.
01:15:20And Amazon has a new guy in charge of devices.
01:15:22I'm just saying, that color e-reader from Amazon,
01:15:25that color Kindle, it's coming.
01:15:27Hey, Panos is shaking things up over there
01:15:29from what I'm told.
01:15:29I'm feeling it, I'm feeling it.
01:15:30I mean, how could he not?
01:15:32Yeah.
01:15:33Do you think he walked in there and was like, I'm pumped?
01:15:34He was like, I'm not pumped.
01:15:37Let's work on it.
01:15:38You have 36 hours to pump me up.
01:15:40Get my pump on.
01:15:42And they were like, Kaleido 3.
01:15:43And he's like, I don't know what that means.
01:15:45But now I'm pumped.
01:15:48So are you getting one?
01:15:49You're gonna review this thing?
01:15:50Yeah, I've got them.
01:15:51They're sitting in my house.
01:15:52I've only turned on one so far
01:15:54because they got here last night.
01:15:55And I was like, whoo, very excited.
01:15:58Like in Slack at 6 p.m. being like, guys, guys, guys,
01:16:01I got packages.
01:16:02And they're like, it's 6 p.m. and I'm leaving.
01:16:04I'm going home to sleep.
01:16:05Very good.
01:16:07I'm excited for these reviews.
01:16:08Full one hour of VergeCast.
01:16:09Whether or not we distribute that one,
01:16:11we'll decide at a later time.
01:16:11Yeah, and it'll just be for the three of us.
01:16:14And Liam, I'm sorry, Liam.
01:16:15Hey, if you wanna hear more about the Kaleido 3,
01:16:19write Alex a note, alex.cranz@theverge.com.
01:16:21She'll read them.
01:16:22I will read it.
01:16:23On your Kaleido 3.
01:16:24Yeah, that's the technology, please.
01:16:26This is the Kobo Libra Colour.
01:16:31Don't quote me on that.
01:16:32I love that you know the display tech,
01:16:32but not the name of the product.
01:16:34I'm just very excited.
01:16:36And it's color with a U.
01:16:37It's very important, because it's Canadian.
01:16:39All right, so that's one.
01:16:40That's one.
01:16:41The other one is the MPA.
01:16:42Do you remember?
01:16:43They used to be called the MPAA.
01:16:45Yeah, they got rid of America.
01:16:46Yeah, they got rid of America.
01:16:47Because they did.
01:16:48This is what happens.
01:16:49They did.
01:16:50They were like, the Chinese market, it's huge.
01:16:51Yeah, we need to be more universal.
01:16:53We need to be more global.
01:16:54So we get rid of the America.
01:16:55They should have gone the other way.
01:16:56Just be the motion picture Americans.
01:16:58Or the W, just add the world.
01:17:01No, that, MPAA?
01:17:03Yeah, MPAA.
01:17:04MPAA.
01:17:06That's what they called it in the Jack Valenti days.
01:17:08But they are back.
01:17:10And there's big distributor conference happening
01:17:15right now in Vegas, where they see a lot of trailers
01:17:18and a lot of stuff.
01:17:19And the head of the MPAA said,
01:17:22we want to work with Congress,
01:17:24and we are working with Congress,
01:17:25to bring back site blocking.
01:17:27Which is, it's just annoying.
01:17:30Yeah.
01:17:31Really, you hear from their side where they say,
01:17:33there's a lot of piracy going on.
01:17:36We're losing a billion dollars a year to piracy.
01:17:39We want to stop that.
01:17:40So we want to block sites that Americans use
01:17:43to pirate stuff.
01:17:44And that's true.
01:17:45And you know how they got piracy
01:17:47to not be a big deal the last time?
01:17:49They introduced easy-to-use,
01:17:51accessible streaming platforms.
01:17:54And that worked really, really well
01:17:55until they decided to make them really expensive.
01:17:58And it's like, there's a direct correlation there, guys.
01:18:02You're not serving your audience.
01:18:04And so the MPAA is back.
01:18:06It's like 2007 all over again.
01:18:09I'm just.
01:18:09I'm gonna have to write a SOPA, PIPA explainer
01:18:11one more time.
01:18:11Where's Alexis Ohanian when you need him?
01:18:13Yeah, get ready, get ready.
01:18:15Probably half of our audiences remember this.
01:18:16There was an anti-piracy bill called SOPA
01:18:20that would basically have forced various ISPs
01:18:23to take websites off the internet
01:18:24if they did piracy stuff.
01:18:26And Alexis Ohanian, founder of Reddit,
01:18:28went on the war path to stop this.
01:18:31We wrote about it.
01:18:31I was all angry about it.
01:18:32The whole thing, SOPA and PIPA.
01:18:33You can go look it up.
01:18:34It was a real thing that happened.
01:18:35It didn't happen.
01:18:36The internet waged war on Congress,
01:18:38and they didn't do it.
01:18:39And I don't think they've ever forgiven us.
01:18:41Nope, and they're back.
01:18:42They're back.
01:18:43They're ready.
01:18:45It is such funny deja vu that like,
01:18:47Alex, you're exactly right that it's like,
01:18:49how is no one at this meeting being like,
01:18:50what if we just didn't make everything so horrible
01:18:54for everyone all the time?
01:18:56Like, isn't it weird that all these people
01:18:58are desperate to give us their money to watch shows
01:19:00and we won't let them, so they have to do it illegally?
01:19:04Like, strange.
01:19:05Wonder how we could fix that.
01:19:07Yeah, they just need to figure out
01:19:08a digital distribution model
01:19:10where they make enough money to be happy
01:19:12in all of their very expensive cars
01:19:13while also not forcing, apparently,
01:19:16a billion dollars worth of money to go into piracy instead.
01:19:19There might be something there.
01:19:20If you got an idea, again, alex.cranz@theverge.com.
01:19:23Yeah, hit me up.
01:19:24She's open to ideas.
01:19:25All right, David, what's yours?
01:19:27Mine is just a thing that I find very funny.
01:19:29So, I think it was last week that Marissa Mayer's company
01:19:33came out with this new app called Shine,
01:19:35which was a way to basically share photo albums
01:19:38with friends, and it was the ugliest app
01:19:40that anyone had ever created in the history
01:19:43of the universe.
01:19:44It's like, imagine you had never seen an app
01:19:49and you said, I'm going to design an app in three minutes.
01:19:52That's what this app looked like.
01:19:53And there was some great stories,
01:19:55including from our friends at Platformer,
01:19:56about the weird chaos that it seems to be
01:19:59to work for Marissa Mayer and all this stuff
01:20:01that went wrong, but it has brought up
01:20:03this very funny thing that I have come to very much enjoy,
01:20:06which is that it turns out sharing photos with your friends
01:20:08remains a disastrous technical problem.
01:20:12It's part of the whole blue bubble, green bubble,
01:20:15Apple antitrust stuff.
01:20:16It's a really hard thing to do across platforms.
01:20:18It's one of the things that has made Google Photos
01:20:21very successful, is it's a thing you can use
01:20:22to share photos.
01:20:23And so, Meta rolled out an update to the Messenger app,
01:20:29which I still want to call Facebook Messenger
01:20:31every time, even though it's just called Messenger,
01:20:33that the whole update is that now you can send
01:20:37better photos in Messenger, and people are amped about it,
01:20:41where you can send original resolution photos
01:20:45in Messenger, and they will get to their recipient good.
01:20:50And I think that's great.
01:20:52It's like, you know what, what a cool world we live in,
01:20:55that this is an unbelievable feature upgrade.
01:20:58Yeah, all that 5G, and we're just now getting
01:21:01to full-res photo sharing.
01:21:02Yeah, it's just so funny to me, we've hit this point
01:21:05where phones are just cameras.
01:21:07For all intents and purposes, that is the most important
01:21:09thing your phone does at this moment,
01:21:10is take pictures and video.
01:21:12And we have not yet solved how to send those
01:21:15to your friends.
01:21:17Isn't that weird?
01:21:19It's very weird to me.
01:21:20We've solved it in a number of billion-dollar company ways.
01:21:23Have we?
01:21:24The US government's trying to solve it.
01:21:26That's what you want.
01:21:27If you took a photo, and you wanted to send
01:21:30an original resolution version of that photo
01:21:34to my Android phone and Alex's computer right now,
01:21:38how would you do it?
01:21:40Dropbox.
01:21:42No!
01:21:43Sorry, we just had the CEO of Dropbox on Decoder,
01:21:45so that's what I came up with.
01:21:46That's the most upsetting possible answer.
01:21:48But it's true.
01:21:49He was great.
01:21:50Maybe not Dropbox, but it would be cloud, right?
01:21:53Yeah, you'd upload it, that's what I'd do.
01:21:55Yeah, I'd walk over to you, and I would plug
01:21:56one USB-C cable into my phone, one USB-C cable
01:21:59into your phone, and we would just see what happens.
01:22:01Yeah, that works, right?
01:22:03Isn't that a thing that totally works
01:22:04with no confusion all the time?
01:22:05Yeah, what could possibly go wrong?
01:22:07But anyway, yeah, kudos to Messenger.
01:22:09For boosting Marissa Mayer's idea.
01:22:12For making good Shine, as I call it.
01:22:15By the way, everyone should go read,
01:22:16we'll link to the platform, our story about the internals.
01:22:19That company is like 30 people,
01:22:20and that story is shockingly well-sourced.
01:22:23Oh yeah.
01:22:24It's like, wow, no one likes working here.
01:22:27Yeah, I'm shocked.
01:22:29It's very good.
01:22:30Where's the pic of her on the Zamboni?
01:22:32Anyhow, by the way, here's some like verge lore.
01:22:36A long time ago, in our first office on Fifth Avenue,
01:22:40there was a company next to us,
01:22:42and it was like a horrible office,
01:22:44but there was a company next to us called Stamped.
01:22:46David, remember Stamped?
01:22:46Oh yeah, with no E, right?
01:22:48It was just Stamped.
01:22:50Yeah, and I don't even know what Stamped did or does,
01:22:54but this was when Marissa Mayer was,
01:22:56she was at Yahoo, and she was buying every company
01:22:59to try to make Yahoo relevant again,
01:23:01and one day, she snuck into our offices and bought Stamped.
01:23:05Ran away again.
01:23:06And we were like, why didn't you tell us?
01:23:07We're right here.
01:23:08We just wanted to say hi.
01:23:10Yeah, very funny.
01:23:11It looks like some sort of clothing company now.
01:23:13Oh, sure, yeah.
01:23:14We woke up to a headline on TechCrunch
01:23:16that was like, Yahoo buy Stamped,
01:23:17and we're like, that, right there?
01:23:19Those guys?
01:23:20Next to us?
01:23:21Oh no.
01:23:22It was very good.
01:23:23All right, here's mine.
01:23:24The most important story of the week.
01:23:26As you may know, the Sony Corporation in the late 1980s
01:23:31invented the concept of Mega Bass.
01:23:34I love where this is going.
01:23:36Also, because I know, because I have the rundown.
01:23:38Mega Bass, I would say, altered the fabric of our reality.
01:23:43100%.
01:23:45Across America, and really the world, my paw.
01:23:49People would push the Mega Bass button,
01:23:51and things would change.
01:23:52Yeah.
01:23:53Just, you know, gravitational waves, the whole thing.
01:23:56Just bumpin'.
01:23:57Parties started happening.
01:23:58Yeah.
01:23:59The butterfly effect is actually about Mega Bass.
01:24:01If you add a yellow Sports Walkman to the Mega Bass button.
01:24:06Right?
01:24:07World was your oyster.
01:24:08Sony moved away from the Mega Bass branding
01:24:11in the late 2010s, I would say.
01:24:14Was it because of Meghan Trainor?
01:24:17They moved to something called Extra Bass.
01:24:19Oh.
01:24:20Ooh, I had a pair of those headphones.
01:24:21They're good.
01:24:22Which was deeply confusing and upsetting to,
01:24:24I think, everyone.
01:24:24Is Extra Bass more or less than Mega Bass?
01:24:27They never answered my questions.
01:24:29I've demanded the various CEOs of Sony
01:24:31come on our shows and explain what happened,
01:24:34and they have all uniformly refused.
01:24:36They just said,
01:24:37my email's deleted. I don't even think
01:24:38they got, yeah, I don't think they got the,
01:24:39I think the messages were sent away.
01:24:41They were put on a memory stick,
01:24:43and then no one could read the memory stick,
01:24:44and that was the end of that.
01:24:46Extra Bass has been with us until recently.
01:24:50Sony is replacing Extra Bass.
01:24:52Double Extra Bass.
01:24:55With a ULT power sound.
01:24:57No.
01:24:59Sony.
01:25:00Who does the naming at Sony?
01:25:01ULT power sound is a new brand
01:25:04for Sony's new party speaker products.
01:25:08They have a new,
01:25:10they have a new,
01:25:12they have a new party speaker,
01:25:12which we'll come to.
01:25:14All of the new products have a ULT button on them.
01:25:18This replaces the Mega Bass or Extra Bass buttons.
01:25:21The ULT, the ULT buttons offer several modes.
01:25:25ULT one gets you deeper, lower-frequency bass,
01:25:28while ULT two delivers powerful, punchy bass.
01:25:34What does that mean?
01:25:35Mega and extra.
01:25:36They should have just labeled the mega,
01:25:37I don't understand.
01:25:38Well, that's too many Xs on the case.
01:25:42The flagship of the ULT power sound line
01:25:45is a 64 pound party speaker called the ULT Tower 10,
01:25:50which costs $1,200.
01:25:51Does it have a screen on it?
01:25:53It has 34 LED light zones, but no screen.
01:25:56It has like a touch screen at the top.
01:25:58It looks bananas.
01:26:00It really does have 34 LED light zones,
01:26:03and I'm gonna have one.
01:26:07There's not like another,
01:26:08I don't know what you thought was gonna happen.
01:26:09Does your wife know you're gonna have one?
01:26:12It has an X-Balanced Speaker Unit.
01:26:14I don't know what that means, but it sounds sick.
01:26:15He did just open up Amazon.
01:26:17It comes with a wireless microphone.
01:26:20Oh, for karaoke.
01:26:21While you were saying all of those words,
01:26:23this thing's enormous.
01:26:24It's so big.
01:26:26It's literally, there's a picture of a man rolling it,
01:26:29and it's like, imagine,
01:26:31do you ever see the thing where people
01:26:34check a bag with golf clubs in it?
01:26:36Yeah, it's the size of a golf bag.
01:26:38Every single photo that they made for this product
01:26:41is bananas.
01:26:43Every single photo, it's in the center of dance floors.
01:26:48Just being four feet tall.
01:26:49And then everyone's standing in a circle around it.
01:26:52That's how people party.
01:26:53I love it so much, I can't get enough of it,
01:26:58and all I wish is that, A,
01:27:01anyone would tell us what ULT stands for.
01:27:04I think it is short for ultimate.
01:27:07So, okay, I'm so glad you brought this up,
01:27:10because while you were saying
01:27:12whatever those nonsensical words you were saying
01:27:14to explain this thing,
01:27:15I was trying to figure out the answer to this
01:27:17by scrolling up and down Sony's website.
01:27:20There are only two things it could be, right?
01:27:23It's either ULTRA or ULTIMATE.
01:27:25Are there other possibilities?
01:27:28ULTRALIS.
01:27:29ULTERIOR, it has like a ULTERIOR logo.
01:27:32I'm trying to think of, is it a backronym?
01:27:36It stands for ULTIMATE LIVE.
01:27:39Upsettingly Loud Tower.
01:27:41Yeah.
01:27:42Yeah.
01:27:43Probably.
01:27:44But I think, based on, again,
01:27:46the deep journalism I've been doing
01:27:47scrolling up and down this website,
01:27:49I believe it is ULTIMATE,
01:27:50because the word ULTIMATE appears,
01:27:52I would say, a surprising number of times on this website.
01:27:55It weighs more than a child.
01:27:58Yeah, I mean, some children.
01:27:59Some children.
01:28:00Children comes in a variety of sizes.
01:28:0263 pounds, more than a toddler, probably, right?
01:28:05Yeah, that's more than my, well,
01:28:07it's not a toddler anymore.
01:28:07I was like, I don't know what toddler weights are.
01:28:09Yeah, definitely, it's like two Arthurs.
01:28:10It's like a max and a half.
01:28:12Yeah, that's about right.
01:28:13I'm just guessing.
01:28:15Arthur's a little younger than Max.
01:28:16I'm just like doing some rough math here.
01:28:17Perfect units.
01:28:18Perfect units of measurement.
01:28:21That's good.
01:28:22I can't wait to get one of these.
01:28:24Also, the marketing material says
01:28:26Massive Bass ULTIMATE Vibe,
01:28:28which is just what I say now.
01:28:31They're having so much fun in these pictures.
01:28:33Massive Bass ULTIMATE Vibe.
01:28:34So from now on, if you wanna sponsor the lightning round,
01:28:37you are officially contributing
01:28:38to the Nilay Patel Party Speaker Fund.
01:28:39Yeah, we're trying to get $1,200.
01:28:41We're gonna put one in the back here.
01:28:42We've had other Sony speakers,
01:28:44like other giant Sony speakers in our office,
01:28:46and we can't get rid of them.
01:28:47Sony will send them to us.
01:28:49They don't want those back.
01:28:49Because they know that we care.
01:28:51And then we're like, do you want this back?
01:28:52And they're like, no, it would cost
01:28:53much too many dollars to send this back.
01:28:55Because it's four feet.
01:28:57It would be so tall.
01:28:58When we did the Mr. Robot after show,
01:29:00we would wheel it into the after show.
01:29:02Like the writer of Mr. Robot, I was like,
01:29:03what the fuck is that?
01:29:04It's very good.
01:29:07Anyway, I just want to mark,
01:29:09if you are a certain kind of technology fan,
01:29:11it is important to know that we've gone
01:29:13from mega bass to extra bass to ULT power sound.
01:29:18And I think it's important to just take a moment
01:29:19and say, look, a new generation is here.
01:29:21I don't know if that generation will be defined by AI.
01:29:24I don't know if it'll be defined by face computers.
01:29:28I don't know.
01:29:29But I know that this is the generation
01:29:30that is defined by ULT power sound.
01:29:32Massive bass, ultimate vibe.
01:29:35Oh, David, I forgot to ask you where the AI pin
01:29:37fits on the scale of wearable bullshit.
01:29:40Oh.
01:29:41I mean, it's nowhere.
01:29:43I think it's six, but it's value over fiddliness.
01:29:47And I think I know the answer,
01:29:48which is zero value, maximum fiddliness.
01:29:50What's interesting, that's true.
01:29:52But what's interesting about it is,
01:29:53I would say it gets like a 0.75 of the face multiplier.
01:29:59Because it's on your face.
01:30:00I was surprised at the extent to which people noticed it,
01:30:04like out in public.
01:30:06There was a very funny moment
01:30:07where we were running around shooting the video.
01:30:10And it's like a weird thing in general,
01:30:12because there's three people pointing cameras at me
01:30:14as I'm doing this.
01:30:14But I stood in front of the Fearless Girl on Wall Street,
01:30:17that little statue.
01:30:19And people were taking pictures of me
01:30:20as I took pictures of the Fearless Girl statue.
01:30:23It was fantastic.
01:30:24But like, it's very noticeable,
01:30:27especially when you're standing there
01:30:29sort of talking down at your chest.
01:30:33I get more looks wearing the pin
01:30:35than I do wearing the Ray-Ban smart glasses.
01:30:38That makes sense.
01:30:39You know what gets a lot of looks?
01:30:40The ULT Tower 10, which weighs 64 pounds
01:30:43and has 34 LEDs in it.
01:30:44As you carry it above your head, say anything style.
01:30:48Here's what you want to do.
01:30:49You want to downshift from ULT 1 to ULT 2.
01:30:52They'll feel it, every time.
01:30:54That's it, that's The Verge Cast, everybody.
01:30:56Rock and roll.
01:31:01And that's it for The Verge Cast this week.
01:31:03Hey, we'd love to hear from you.
01:31:04Give us a call at 866-VERGE-11.
01:31:07The Verge Cast is a production
01:31:08of The Verge and Vox Media Podcast Network.
01:31:11Our show is produced by Andrew Marino and Liam James.
01:31:14That's it, we'll see you next week.
