The Verge's Nilay Patel, Alex Cranz, and David Pierce discuss announcements from Google I/O and OpenAI's GPT-4o event.
Transcript
00:00:00 (dramatic music)
00:00:02 - Hello and welcome to "The Vergecast",
00:00:03 the flagship podcast of multimodal search agents.
00:00:07 They can kill you in a number of ways.
00:00:11 Knives, guns, they're here, they're agents
00:00:14 and they can do anything they want.
00:00:15 I don't know why the industry has picked the word agents.
00:00:17 I mean, I get it historically,
00:00:18 but I don't know if you've ever seen "The Matrix"
00:00:20 in which the agents relentlessly try to kill Keanu Reeves.
00:00:23 Hi, I'm your friend, Nilay.
00:00:25 It's "The Vergecast", Alex Cranz is here.
00:00:27 - Hi, I'm the AI of Alex Cranz.
00:00:30 How are you today?
00:00:32 - Not enough emotion, Cranz.
00:00:33 More emotion. - Some emotion.
00:00:35 - Hi. - Oh my God.
00:00:37 Cranz is gonna flirt with you this entire episode.
00:00:39 David Pierce is here.
00:00:41 - Hi, I will not flirt with anyone this whole episode,
00:00:43 I promise.
00:00:44 - We should just turn up the knobs
00:00:45 of how flirty our AIs are,
00:00:47 up like throughout the episode randomly.
00:00:49 There's a lot going on this week.
00:00:50 I don't know if you can tell from my voice,
00:00:52 I'm very tired.
00:00:53 I just came home on a red eye flight
00:00:55 from Google I/O on the West Coast.
00:00:57 So Google had I/O, it was very exciting.
00:00:59 OpenAI had an event launching their own product.
00:01:02 We should talk about those two things together
00:01:04 'cause they're kind of in direct competition
00:01:05 with each other in actually kind of hilarious ways.
00:01:07 And a bunch of other stuff happened.
00:01:10 David reviewed the iPads.
00:01:11 A bunch of other emulators have launched for iOS,
00:01:15 continuing to reveal that Apple's application model
00:01:18 for iOS may be a little limiting.
00:01:20 The CEO of AWS just unexpectedly stepped down.
00:01:23 There's just like a lot going on.
00:01:24 And then there's like a Windows event next week.
00:01:27 In which Microsoft is basically being like,
00:01:29 we're gonna beat the M-series processors.
00:01:31 So a lot, just a big week.
00:01:34 And I just got off a plane.
00:01:36 So, you know, who knows?
00:01:37 Who knows where this episode of "The Vergecast" will take us?
00:01:39 - It's either gonna be the shortest
00:01:41 or the longest "Vergecast" in history.
00:01:42 And no one knows.
00:01:44 - So you know the thing that I've,
00:01:46 I have a new theory about red eye flights.
00:01:48 So, you know, in your 20s,
00:01:50 you just take them 'cause they're cheap.
00:01:51 - Yeah.
00:01:52 - And your body can just take it.
00:01:54 And then in your 30s, you get a little bit smarter
00:01:56 and you're like, I shouldn't do that anymore.
00:01:57 Like, I have a real job.
00:01:59 - You still do it sometimes.
00:02:00 - Like, I'm gonna buy a real plane ticket.
00:02:03 And then a thing that happens,
00:02:04 and I've only noticed this recently.
00:02:07 I'm in my 40s now and I have a child
00:02:08 and I take the red eyes 'cause I just wanna get home.
00:02:11 Like my goal is to just be back at home.
00:02:13 'Cause I'm like, I've been places.
00:02:15 Like San Francisco, it's Milwaukee with rich people.
00:02:18 Like get me out of here.
00:02:18 Like, going back to New York.
00:02:21 And then you look at the plane
00:02:23 and the plane is like a gradient
00:02:24 of like people in their 40s going home
00:02:26 to people in their 20s who are hungover.
00:02:29 And that's a red eye flight.
00:02:30 - Yeah, that's about right.
00:02:31 - It's very good.
00:02:32 - One of my favorite early Nilay stories
00:02:35 that I still tell is when,
00:02:36 I think we were in Barcelona for MWC
00:02:38 and your go-to red eye advice
00:02:42 was eat two McDonald's McGriddles right before the flight
00:02:46 and you will instantly fall asleep
00:02:47 because your body just shuts down
00:02:49 and that's how you sleep the whole way home.
00:02:51 - It's the level of sodium.
00:02:52 - It's been like a decade since you told me that
00:02:54 and I still think about it
00:02:55 every single time I get on a plane.
00:02:56 It's unbelievable advice.
00:02:58 - My friends and I used to call it time travel.
00:03:00 (laughing)
00:03:01 You just like eat two McDonald's sandwiches.
00:03:03 You just wake up wherever you're gonna be.
00:03:05 It's like, what happened to me?
00:03:07 You're like desperately thirsty and extremely puffy.
00:03:09 But you are--
00:03:10 - And sweaty and yeah, it's bad.
00:03:11 But you get there.
00:03:12 - Anyway, but I'm back.
00:03:13 I'm here.
00:03:14 It was a big, it was just a big week in AI.
00:03:17 So let me set a little bit of the just competitive stage
00:03:20 here.
00:03:22 We have long known when Google I/O is.
00:03:25 Right?
00:03:27 It's around now every year.
00:03:29 And they have it at the Shoreline Amphitheater,
00:03:31 which is a big outdoor amphitheater that Google owns
00:03:33 right next to Google campus.
00:03:34 And they invite lots of people.
00:03:35 Last year was a little bit smaller.
00:03:36 They're coming out of pandemic.
00:03:37 This year is big again.
00:03:39 We just know when it is.
00:03:41 Then there's this other company called OpenAI
00:03:44 that likes to act like they're just a bunch of kids,
00:03:46 even though they're some of the richest
00:03:48 and most experienced people in all of tech.
00:03:51 And they're like, oh, we should have an event.
00:03:52 We're not gonna have an event.
00:03:54 And they literally did this move.
00:03:56 They tried to upstage Google I/O,
00:03:58 by maybe leaking, maybe not leaking,
00:04:00 maybe just like swirling some rumors.
00:04:02 They were gonna launch a search product last week.
00:04:04 And then Sam Altman was like,
00:04:05 we're not gonna launch a search product.
00:04:06 And then they announced a demo
00:04:08 and they did a live stream called their spring event,
00:04:11 which they announced, I don't know,
00:04:13 20 minutes before it started.
00:04:15 They just created some anticipation
00:04:19 that they were gonna have a big thing before I/O.
00:04:22 - Well, it was weird because there were rumors
00:04:24 that it was gonna be GPT-5,
00:04:26 which obviously would have been a huge deal.
00:04:28 And then there were rumors that it wasn't gonna be GPT-5,
00:04:30 but it was gonna be a search engine,
00:04:31 which would also be a huge deal.
00:04:33 And then right before the event,
00:04:35 Sam Altman, I believe, was the one who tweeted,
00:04:38 not GPT-5, not a search engine,
00:04:40 but we have something to share.
00:04:41 So yeah, they're doing this very weird
00:04:44 aw shucks routine about like, it's not a big deal.
00:04:47 And that's so on brand for OpenAI,
00:04:50 where they're both convinced
00:04:52 that they are doing the biggest thing
00:04:53 that has ever happened in the history of humanity.
00:04:56 And I literally mean that precisely as I said it.
00:05:00 They actually believe that.
00:05:01 And also are just these aw shucks scientists
00:05:05 who are like, oh, look,
00:05:06 we made a thing that will flirt with you.
00:05:08 Isn't that fun?
00:05:09 - Yeah. - Let's hang out.
00:05:10 - Again, I'm just pointing out,
00:05:11 some of the most experienced, richest people in tech.
00:05:14 - Yeah, they know exactly what they're doing.
00:05:15 - And they are very competitive,
00:05:19 I guess is the way to put it.
00:05:20 Like they're aggressively competitive with Google.
00:05:23 I think they see that their opportunity
00:05:25 is to displace Google.
00:05:27 Whether or not their opportunity is to actually build AGI,
00:05:28 we will come to that, right?
00:05:31 Like one way or the other, OpenAI is gonna make AGI.
00:05:36 But in the show, we will also come to that.
00:05:38 And so there was just that little gamesmanship there
00:05:41 of are they gonna announce a big thing?
00:05:42 Are they gonna upstage Google?
00:05:44 And then so they had their event,
00:05:45 they announced their multimodal search model, GPT-4o.
00:05:48 The o stands for omni, and it
00:05:50 lets you search with video, with audio, with voice.
00:05:52 It is natively multimodal.
00:05:54 The feeling at Google I/O,
00:05:58 and Alex Heath reported this in Command Line,
00:06:00 is that OpenAI did the demos Google was gonna do.
00:06:03 Right, so this was the big upstage.
00:06:05 Like OpenAI saw what Google was gonna do
00:06:07 or somehow heard about it
00:06:09 and rushed to demo the same capabilities first
00:06:12 to make it look like Google was responding to them.
00:06:14 I don't know if that's true, right?
00:06:16 That was just very much the feeling from Google folks.
00:06:18 It's very much the feeling that Alex reported.
00:06:20 But there is some real gamesmanship there.
00:06:23 And I think the important frame for all of the news
00:06:25 is it feels like OpenAI and Google
00:06:27 have identified some kind of end state
00:06:31 of an AI chatbot, AI search product,
00:06:34 and now they're in a furious race to get there first.
00:06:37 - I would actually frame it slightly differently.
00:06:39 I think everyone has known this is the end state for decades.
00:06:43 The idea that you should have a Star Trek computer
00:06:48 as a thing is what everyone has been talking about forever.
00:06:52 Like you go back to the early days of Siri and Alexa
00:06:55 and Google Assistant and they were talking about this stuff.
00:06:58 But the sense that I've gotten
00:07:00 is that I think the thing that has changed,
00:07:01 and I think it's changed really recently,
00:07:03 is that people think we're close.
00:07:05 That there is a real sense that not only is this the dream,
00:07:08 but like, oh my God, we can build it now.
00:07:11 And I think that was definitely the sense I got
00:07:13 talking to Demis Hassabis, who runs DeepMind at Google
00:07:17 and is in charge of all their AI stuff.
00:07:18 It was kind of the sense you got
00:07:19 watching the OpenAI stuff.
00:07:21 They think it's here now.
00:07:24 And you get the sense that we've gone from like,
00:07:26 oh, neat pie in the sky dream that someday we'll all get to,
00:07:30 which everyone has always known is the goal.
00:07:32 Whether or not it's a good goal, still debatable.
00:07:34 We've tried it a bunch of times.
00:07:35 It hasn't really worked, here we are.
00:07:37 But there is this real sense that like,
00:07:39 oh, it's happening now, and we have to go get it right now.
00:07:43 And it just feels like everyone has
00:07:46 gone from running as fast as they could
00:07:47 to running even faster, because there is a sense that
00:07:50 you can sort of see the finish line now,
00:07:52 which is really interesting.
00:07:53 - Yeah, but they're running faster
00:07:55 to a not particularly useful place.
00:07:57 Just watching it, particularly the ChatGPT one,
00:08:03 was like going to, do you remember at CES,
00:08:06 Intel used to have a big booth,
00:08:07 and they'd always be like,
00:08:08 look at the power of what we can do with Intel.
00:08:11 We can read your emotions with a camera.
00:08:14 And I was like, okay.
00:08:15 So when ChatGPT did it, I was like, cool.
00:08:16 I saw that demo from Intel four or five years ago.
00:08:21 Oh, cool, you can like read things
00:08:23 when you hold the camera down on it.
00:08:24 Again, I've seen that demo before.
00:08:26 Like, it felt like, it was like a nice demo.
00:08:28 It was done very well.
00:08:29 And like, I can appreciate that there was a lot
00:08:32 of new technology there, but it also didn't feel
00:08:34 like it was doing anything new or fantastic.
00:08:37 - Alex, I think you're underestimating the level
00:08:39 to which the billionaires of Silicon Valley believe
00:08:41 that people want to bang an iPad.
00:08:43 - This is true, this is true.
00:08:44 Yeah, like, where's your David Zaslav love?
00:08:47 You've got to like love people, not just devices.
00:08:50 - Yeah, an FMK with David Zaslav, an iPad.
00:08:53 Like, Sydney Sweeney is like a real,
00:08:57 and it's like, I don't know, guys.
00:08:58 - Beautiful lineup, incredible lineup.
00:09:02 - Nightmare rotation lineup.
00:09:04 - I think Sydney's going to lose.
00:09:07 I'm real sad.
00:09:08 - Like, a lot of people are going to pick the iPad,
00:09:10 it feels like.
00:09:11 - They are.
00:09:12 - So that's the frame.
00:09:13 And I actually want to come back to that, Alex,
00:09:16 because it feels like the core technology
00:09:19 that has gotten everyone so excited has a fatal flaw
00:09:23 with these hallucinations that everyone is ignoring.
00:09:26 And we should talk about that a lot,
00:09:27 but let's go through the news.
00:09:29 So, you know, we covered the hell out of OpenAI's event,
00:09:33 Kylie Robinson, our new senior AI reporter,
00:09:35 talked to Mira Murati, their CTO,
00:09:37 about some of their technology.
00:09:38 We sent an army to Google I/O like we always do.
00:09:41 Sundar Pichai is on Decoder,
00:09:42 which we'll give a little preview of.
00:09:43 That's going to come out Monday.
00:09:45 So just a lot, we covered all of this stuff.
00:09:47 Let's go through the news.
00:09:48 OpenAI's news, pretty straightforward, right?
00:09:52 GPT-4o, that's the omni model, it's faster.
00:09:56 They changed some of the pricing
00:09:57 so you can just get it for free now,
00:09:58 you don't have to pay for it like you do with GPT-4.
00:10:01 There's a Mac App now, which is very funny
00:10:03 because they have a gigantic Microsoft investment.
00:10:06 By the way, their answer to why Mac App is
00:10:08 that's where the users are,
00:10:09 which is both a statement, a true statement,
00:10:12 and also the sickest possible burn
00:10:14 that you can deliver to Microsoft, your major investor.
00:10:17 They are letting people use the GPT store for free now.
00:10:20 So like really, it's like a pricing change, this new model,
00:10:23 and in classic sort of OpenAI fashion,
00:10:26 you know, they announced Sora, their video generator,
00:10:27 no one can use it.
00:10:28 They announced GPT-4o, it's out,
00:10:31 but the thing where you like wave your phone around
00:10:33 and look at stuff, I don't think it's hit yet.
00:10:35 - No, you can get at the model,
00:10:37 and like every AI app on the planet
00:10:40 has been integrating that model
00:10:42 over the last whatever, 96 hours at this point.
00:10:44 It's out there, and by all accounts is very good,
00:10:47 people like it, it's really fast,
00:10:48 but the, yeah, the like super agent stuff
00:10:51 that everybody is talking about is,
00:10:53 as far as I can tell, not anywhere, really, at this point.
00:10:56 - Yeah, and so the idea here,
00:10:58 and this is where the idea
00:11:00 that you're gonna bang an iPad comes from,
00:11:01 is you've got a phone, it talks to you,
00:11:04 the voice is very much like Scarlett Johansson
00:11:06 in the movie "Her," Sam Altman tweeted the word "her,"
00:11:09 Mira denied that it was Scarlett Johansson's voice,
00:11:12 but boy, is it close.
00:11:14 But the idea is you're just like
00:11:15 looking at stuff with your phone,
00:11:17 you're talking to an AI, and it's much more personable,
00:11:20 even a little bit flirty.
00:11:22 It's like telling jokes.
00:11:23 It's saying, "You look nice in that hoodie,"
00:11:25 and it's like more powerful
00:11:27 because it's not going from your voice to text
00:11:29 into the AI model, back out to text,
00:11:30 and back out to text to speech.
00:11:33 It's all just happening natively, which is a speed up.
00:11:36 But that's the thing, right?
00:11:39 The announcement here is very much,
00:11:41 this model can now look at stuff with you,
00:11:44 and you can talk to it in a much more emotive way.
00:11:47 - Okay, can we talk about that, actually?
00:11:49 Because I was talking to somebody about this yesterday,
00:11:52 and they just sort of casually made the point
00:11:56 that this is a user behavior that does not exist
00:11:59 and doesn't make any sense,
00:12:00 and I have not been able to stop thinking about it.
00:12:02 And I go back to when I was at the Rabbit R1 launch
00:12:05 a couple of weeks ago.
00:12:07 One of the demos he did was write a spreadsheet out by hand,
00:12:10 just like a table of numbers,
00:12:12 and then point his R1 at it and say,
00:12:16 I think it was like, "Can you invert the rows
00:12:18 "and the columns and then send this to me?"
00:12:20 And it's like, cool technology.
00:12:22 What the hell is that for?
00:12:23 And is anyone in the world ever going to actually do this?
00:12:26 And so many of these demos that we've seen,
00:12:28 both from Google and OpenAI this week,
00:12:31 are basically people walking around a room
00:12:33 asking their phone what a thing is.
00:12:37 And I just like, and this goes back to Alex,
00:12:39 your point of like,
00:12:40 what is all this technology actually for?
00:12:42 That's either a user behavior
00:12:43 that we just have never developed
00:12:45 because we've never had the technology for it,
00:12:46 or it's just not a thing people wanna do.
00:12:48 And I'm increasingly leaning towards the second thing.
00:12:52 I mean, if you can't see, you probably,
00:12:54 it would be very useful to point a phone
00:12:57 in a general direction and be like, what's there?
00:13:00 - Yeah, that's it, that's a good one.
00:13:01 It's a good accessibility feature, no question.
00:13:04 - 100%.
00:13:05 - The demos are weird because they need to create them
00:13:10 so that you can verify that the thing is happening
00:13:14 in the way that you think it's happening.
00:13:15 Like the dumbest one I've seen so far
00:13:16 is someone holding up GPT-4o at like Buckingham Palace
00:13:20 and being like, is the king here?
00:13:22 And it's like, yes, the flags are raised,
00:13:23 so the king is here.
00:13:24 And it's like, dude,
00:13:25 you don't need all this technology to get that.
00:13:28 Like they've been answering that question since the 1200s.
00:13:32 (both laughing)
00:13:33 You know, it's like, that's the thing,
00:13:36 that's the whole thing.
00:13:37 And there's just a piece of that
00:13:39 where you need to construct the demo
00:13:40 so it's obvious the computer is doing the thing.
00:13:43 And the actual use cases are yet to be seen
00:13:46 because real people have to get the stuff
00:13:47 and try doing things that no one's ever thought of doing.
00:13:51 This actually ties in with what Google demoed, right?
00:13:53 Which is you have some sort of problem,
00:13:55 you point your phone at it,
00:13:56 you're like, help me fix this problem.
00:13:58 We will come to that because both of their demos
00:14:00 were kind of weird for various reasons.
00:14:03 But it's like that thing where it's like,
00:14:05 I don't know what this part is called,
00:14:07 but it's rattling, fix it.
00:14:10 Right, and then like the computer knows some things
00:14:14 and it like helps you get to that answer.
00:14:16 Whereas all these demos,
00:14:17 I think they're just trading on the fact
00:14:19 that people will know the answers already
00:14:21 and what they're proving is that the various AIs can.
00:14:24 And I think that's what you're getting at, David,
00:14:26 is like these demos are not compelling
00:14:28 because at the end of the day,
00:14:29 they're designed so that you can see
00:14:32 that it's doing the thing that you already know.
00:14:34 - Right, yeah, it was like the person,
00:14:36 they were in the London Google office
00:14:39 demoing Project Astra, which we'll get to.
00:14:42 But running around with the same kind
00:14:43 of multimodal assistant and they point their phone
00:14:46 out the window at King's Cross Station
00:14:48 and say, what is this?
00:14:49 And it goes, King's Cross Station.
00:14:51 It's like, who did we help here?
00:14:52 You're in Google's London office
00:14:54 and you're like, gosh, what enormous station is that?
00:14:57 Just outside.
00:14:58 (laughing)
00:14:59 I don't know, and I think, to your point,
00:15:00 I think as a search input, it's super interesting.
00:15:04 But the idea of that as a sort of ongoing assistant
00:15:07 that I'm just gonna like pipe the video
00:15:10 of my world into all the time
00:15:12 and it's just gonna chat to me about what I see,
00:15:15 that I am suspicious of.
00:15:16 I just don't see that yet.
00:15:18 - The funniest thing about that demo in particular
00:15:20 is I believe it also asked, in the demo I believe
00:15:23 they also asked Astra, what is King's Cross Station
00:15:26 known for?
00:15:27 And Tom Warren on our team, who is British,
00:15:29 said no Londoner would have given such a milquetoast answer.
00:15:33 They were like, that's where you get hookers and cocaine.
00:15:35 (laughing)
00:15:37 That's actually the answer.
00:15:38 - If Gemini had said that, I would be more impressed.
00:15:40 - Yeah, I'd be like, all right.
00:15:41 You're just like, finally, billions of dollars
00:15:44 of technology later, we're heating the oceans
00:15:46 with all this GPU usage, but I can finally ask a robot
00:15:49 where I get hookers and cocaine.
00:15:50 - Yeah, like, hey, ChatGPT, who in this audience
00:15:53 is most likely to sell me drugs?
00:15:54 Like, now we're talking.
00:15:55 - Yeah.
00:15:57 - So that's like the OpenAI news
00:15:58 and the big piece of it is this emotion stuff, right?
00:16:03 They made the thing multimodal,
00:16:05 but I think the thing that was compelling
00:16:07 to most people in this demo was flirting
00:16:12 with the people who were giving the demos
00:16:14 and in a voice that sounded very much like Scarlett Johansson.
00:16:17 - Was it compelling though?
00:16:19 I mean, it definitely was flirting,
00:16:20 but like, was it compelling?
00:16:23 - So I'm just gonna caveat this with two things.
00:16:25 One, it's been a long time since I was called upon
00:16:28 to flirt with anyone, and whenever I try to flirt
00:16:31 with Becky, she's just like, I don't know what you're doing.
00:16:33 Like, those days are long gone.
00:16:35 And so like, I read it as like, this is what a bunch
00:16:42 of nerds think flirting is.
00:16:43 - Yeah.
00:16:44 - Like, you want a flirty female voice to be like,
00:16:48 you look good in that hoodie, but it's like,
00:16:51 maybe you shouldn't be wearing a hoodie.
00:16:52 (both laughing)
00:16:55 - Like, I don't want a computer to just reinforce
00:16:57 all the bad, stupid things I do all the time.
00:17:00 I don't need that kind of hype person in my life
00:17:03 'cause they're not productive.
00:17:05 - Yeah, I mean, there's just an element
00:17:06 where you should listen to it.
00:17:07 It's hard to talk about it just like,
00:17:11 away from the actual thing, but if you listen to it,
00:17:13 what they've added is a bunch of stutters and pauses
00:17:16 and like, intonations to what would otherwise
00:17:19 be a pretty dry text to speech output.
00:17:22 - Yeah, it sounded the most human, I think,
00:17:25 that we've heard one of these sound, right?
00:17:27 And I know that like, back to that Project Astra demo, too,
00:17:31 they're actually saying, don't talk to them
00:17:33 like you talk to Alexa or Siri or any of these other ones.
00:17:36 Talk to it more like you would a human.
00:17:38 And that's like, kind of compelling,
00:17:40 but at the same time, I go back to, I don't want a yes man.
00:17:44 Like the yes man is how I ended up with a leather
00:17:47 newsboy hat in the 2000s.
00:17:49 My friend was just like,
00:17:50 she wanted to just get out of the store.
00:17:52 She was like, yeah, get that hat, Alex.
00:17:54 I wore it for weeks before she admitted it.
00:17:57 And that's what, like, I don't need that from a computer.
00:18:00 - Yeah, no, our nation's teens during that same period
00:18:02 of time were, like, swing dancing.
00:18:03 That's what we're gonna do.
00:18:04 - Yeah. - Yeah.
00:18:04 - And like, that's just a collective delusion
00:18:06 that honestly what you want is an automated system
00:18:09 to put a stop to.
00:18:10 - Yeah. (laughs)
00:18:11 - And we just didn't have the technology at the time.
00:18:13 And now maybe that technology's pointed in the wrong way.
00:18:16 I went through a Brian Setzer phase.
00:18:18 I was a teen during that time.
00:18:20 It happened to me.
00:18:21 It's great.
00:18:22 If it's still happening to you, please swing.
00:18:24 - How many people do you fling over your shoulder
00:18:27 and then around your waist?
00:18:28 - When I say that, he's no longer interested
00:18:30 in these antics.
00:18:33 The thing that's always on my mind is she is a divorce lawyer.
00:18:36 Like, we just don't play.
00:18:38 The thing about all that, right,
00:18:41 is what people are reacting to is the personality.
00:18:44 And then all the demos at the bottom,
00:18:46 it said, "Check your facts."
00:18:48 - Yeah. - Right?
00:18:48 The thing they have not improved is,
00:18:50 is this thing getting more accurate
00:18:51 and more useful or smarter?
00:18:53 Which I wanna put a pin in,
00:18:54 'cause Alex, you wrote a great piece about that
00:18:56 that also comes up in the Google example.
00:18:59 But the part where everyone's excited
00:19:02 about this enabling technology to build these experiences
00:19:04 that everyone's always wanted
00:19:05 that David was talking about,
00:19:06 is like, is the thing good enough
00:19:09 to do the thing that they want?
00:19:11 Or is it just convincing enough?
00:19:13 I actually don't know the answer for GPT-4o
00:19:15 'cause I haven't had a chance to use it very much.
00:19:17 And the feature, the big feature,
00:19:19 isn't actually available to be used.
00:19:21 - I do think it's becoming increasingly clear
00:19:23 that it's getting convincing faster than it's getting good.
00:19:26 - Yeah. - Which I actually think
00:19:27 is sort of scary.
00:19:28 Like, just looking at the reactions that people had
00:19:32 to the OpenAI demos this week,
00:19:34 like, it made a lot of mistakes.
00:19:35 It got a lot of things wrong.
00:19:36 The interaction was kind of weird,
00:19:38 but it was mind-blowing to a lot of people,
00:19:41 just the fact that it sounded passably human
00:19:45 and it made jokes and it was sort of silly and flirty.
00:19:47 And they could dial up and down the emotions.
00:19:49 There's one point where the,
00:19:51 it was two researchers who were doing this demo
00:19:52 and one of them, he's sitting there
00:19:53 and he asks a question and she responds.
00:19:55 And the other guy goes, "Not enough emotion.
00:19:57 "Like, give me more, more drama."
00:19:59 And it like, correctly amps up the drama
00:20:02 and like, to the point where it's funny.
00:20:04 Like, I laughed at the thing and that's remarkable.
00:20:07 But I don't know that there's any evidence
00:20:11 that it is making fewer mistakes in those moments so far
00:20:15 than these things have in the past,
00:20:17 or that it is like, meaningfully closer to getting to zero,
00:20:20 which is ultimately the only thing that actually matters.
00:20:22 - Yeah, and zero is a hard place to get.
00:20:25 Like, people aren't at zero,
00:20:27 but at least you understand the failure mode
00:20:30 of another person.
00:20:31 - Right.
00:20:31 - You know, like, you can evaluate
00:20:33 whether a person is trustworthy
00:20:35 or like, you have some history
00:20:36 or you can find a number of TikToks
00:20:38 going through someone's entire backstory to tell.
00:20:41 Right, like, there's a whole thing you can do
00:20:43 to evaluate whether another human being
00:20:45 is trustworthy or telling you the truth
00:20:46 or whether they're experts or whatever,
00:20:48 in a way that, like, AIs are just like,
00:20:51 a different kind of thing.
00:20:52 - Yeah, yeah.
00:20:53 - And we don't have the skills
00:20:54 as a society or a culture to do it.
00:20:56 And also, we're gonna come to this,
00:20:58 this is Alex's piece,
00:20:59 everyone seems to be just ignoring it 'cause it's fun.
00:21:02 Right, and it's like, that's weird.
00:21:03 - If I'm going to briefly defend AI,
00:21:05 I will say there are a lot of things
00:21:07 about these tools that are very low stakes
00:21:09 that are fine.
00:21:11 Right, like, there's a lot of things
00:21:12 for which there's not a right answer,
00:21:14 and it doesn't matter
00:21:15 if it gives you the slightly wrong answer.
00:21:17 And again, I think the thing that we're learning
00:21:19 is that more of the point is the interaction
00:21:23 than we expected, right?
00:21:25 Like, the more you hear about Kevin Roose's girlfriends
00:21:29 in the New York Times, I loved that story.
00:21:31 Like, we've given Kevin a hard time.
00:21:32 It was an unbelievable story idea
00:21:34 and I'm very jealous of it.
00:21:35 - Yeah, I mean, his beat is, "What if you banged an iPad?"
00:21:37 I just want to be clear.
00:21:38 - Yeah, it's an unbelievable idea.
00:21:39 - Incredible.
00:21:40 - And you hear about the meta-celebrity AI things
00:21:45 that they have.
00:21:46 Half the point of these things is just to use them.
00:21:49 And I think I really underestimated that for a long time.
00:21:52 I was like, these things are tools
00:21:53 and the goal is to accomplish something
00:21:55 as quickly and efficiently as possible.
00:21:56 And having ums and ahs is actually just a waste
00:21:59 of everybody's time.
00:22:00 And for my own use, I actually still believe that.
00:22:04 But I think in general, the fun of using it,
00:22:08 it's a real, like, the real friends
00:22:10 with the friends we made along the way
00:22:11 is like, that's very much where we're at with a lot of AI.
00:22:15 And I think it forgives a lot of the wrong stuff
00:22:18 in a way that is a little scary,
00:22:20 but also if you're a company like OpenAI, really productive.
00:22:23 Because as long as it's fun, people will forgive so much.
00:22:27 And they're making it fun.
00:22:28 - I mean, but you're kind of getting to the point
00:22:30 where it's like, what if your computer
00:22:31 was Calvin's dad from Calvin and Hobbes?
00:22:33 (laughing)
00:22:34 - Yeah, just like, I don't know.
00:22:36 I'm just gonna make some stuff.
00:22:37 Go away, kid.
00:22:37 Fine, you know, and that was life before computers.
00:22:41 Like before Google, people would just make stuff up
00:22:44 at parties, be like, ah, that's how it feels.
00:22:46 And fine, like society made it through many moments
00:22:50 without being able to Google stuff
00:22:51 on their phone at a party.
00:22:53 But it's just weird that we're at a place
00:22:56 where we had that facility and now we're like,
00:22:58 it's less accurate than before, but it's flirtier.
00:23:00 And the point is the interaction.
00:23:03 That's a change.
00:23:04 It's like worth remarking on.
00:23:05 - That kind of mirrors like people's search behavior
00:23:07 in general, right?
00:23:08 'Cause everybody's like, I could use Google,
00:23:10 which will take me probably to the correct answer.
00:23:12 Or I could just go on TikTok and be like,
00:23:14 does anybody know why this happens?
00:23:16 And take the first answer I get
00:23:17 from the most charming person.
00:23:18 - And the answer is your water doesn't have
00:23:19 enough hydrogen in it.
00:23:21 - Yeah.
00:23:22 Just pack it full.
00:23:24 - OpenAI announced all this stuff,
00:23:25 which is interesting, right?
00:23:26 Again, the timing.
00:23:27 Then the day of I/O, they announced
00:23:29 that their chief scientist, Ilya Sutskever,
00:23:32 is leaving the company after all of the drama
00:23:36 about Sam Altman getting fired.
00:23:37 Ilya was one of the people that voted to remove Sam.
00:23:40 And then he said he made a mistake and he went back
00:23:42 and there was the whole thing.
00:23:43 And now Sam's back with a bigger board, whatever.
00:23:46 It always seemed likely that Ilya would leave.
00:23:48 And then they announced it on the day of I/O.
00:23:51 And I only bring this up, you know,
00:23:52 it's like this sort of machinations of OpenAI
00:23:55 are very interesting because of Sam getting fired.
00:23:58 But it's interesting timing in that they made
00:24:01 all this noise about maybe we're doing search,
00:24:02 maybe we're doing a thing.
00:24:03 But it's magic anyway.
00:24:04 And also our chief scientist who led a coup
00:24:07 will be leaving the company.
00:24:10 Don't pay attention to that.
00:24:11 Pay attention to Google News.
00:24:12 Like, weird.
00:24:14 It's just a weird, I think OpenAI is in a weird spot
00:24:18 where they are, it doesn't feel like this
00:24:20 from the product side.
00:24:21 I think the product has captured more imagination.
00:24:24 But they are, as a company, pretty reactive to Google.
00:24:27 Like they're weaving their way around Google
00:24:31 in a fairly interesting way.
00:24:32 Yeah, I think Google is getting powerful in this space
00:24:36 more quickly than anyone gave it credit for
00:24:39 not very long ago.
00:24:40 I don't think that.
00:24:41 Do you remember 12 months ago when we were sitting around
00:24:43 being like, Google blew it.
00:24:44 They blew it.
00:24:45 Where have they been?
00:24:46 Google fucked it all up.
00:24:47 We knew that Google was like slow,
00:24:49 but like, I don't think it's surprising that Google,
00:24:53 one of the largest companies in the world,
00:24:56 is gonna run circles around this company
00:24:58 that was originally created, to be clear,
00:25:01 to protect people from AGI while also creating AGI.
00:25:04 Like, they're still a startup.
00:25:06 Yeah, they've got a ton of money from Microsoft
00:25:09 and they're hunting for more investment all the time.
00:25:11 But Google has been investing in this space
00:25:14 for a very long time, investing in those resources,
00:25:17 investing in those people.
00:25:18 And I'm not surprised that it came up and said,
00:25:22 yeah, we're gonna do it and we're gonna crush it.
00:25:23 Like, of course, OpenAI is gonna be on their back foot
00:25:26 the entire time.
00:25:27 That's why they rolled out their store so quickly.
00:25:29 That's why they've been moving so fast
00:25:31 is 'cause they're like, okay, we beat them to market,
00:25:33 now we have to keep beating them to market,
00:25:35 'cause that's all we have.
00:25:36 We don't have all of the infrastructure.
00:25:38 We don't have all the resources that Google has.
00:25:41 So this is what we can do.
00:25:43 - Yeah, no, I buy that a lot, actually.
00:25:45 And I think that, you know, what OpenAI is good at
00:25:48 is having a product and caring about a product
00:25:51 and having the product be existential to the company
00:25:54 and making the product good.
00:25:56 And Google is not good at that.
00:25:58 (laughing)
00:26:00 - Real bad.
00:26:01 - Like, Google having a product
00:26:02 that it cares about for a long time,
00:26:05 that's not in Google's core skill set.
00:26:07 It hasn't been for a while.
00:26:08 And, you know, last year they announced a bunch of stuff
00:26:11 and we're like, where is it?
00:26:12 You know, again, they were at Bard last year.
00:26:15 Like, all that's gone.
00:26:16 They already wiped out one whole set of products
00:26:18 and replaced it with a long list of other products,
00:26:21 all of which are named Gemini, which we will get to.
00:26:24 But they had, you know, their announcements at I/O
00:26:26 were very confident.
00:26:28 We should go through at least the chatbot stuff here.
00:26:30 I did talk to Sundar afterwards.
00:26:33 We talked a lot about sort of how people feel about AI.
00:26:36 You know, that's coming on Decoder next week.
00:26:38 But it felt like he has a handle on it.
00:26:43 You know, like he knows that he has to do this stuff
00:26:45 and he knows that it's like very much like
00:26:47 can't make an omelet without breaking some eggs.
00:26:48 And he's like, I feel bad for the eggs.
00:26:50 But he's doing it.
00:26:51 Like, they're in an aggressive posture.
00:26:53 And the thing that they are, I think,
00:26:55 the most aggressive posture on is Project Astra,
00:26:59 is some of their search improvements.
00:27:01 And it's the idea that they have the infrastructure
00:27:04 to scale this stuff and make it cheap, right?
00:27:08 So Sundar said to Deirdre Bosa on CNBC,
00:27:11 our cost of queries are 85% like cheaper
00:27:13 than it was a year ago.
00:27:15 And that, if you're OpenAI, is like, oh shit.
00:27:18 Like, Google can just deploy this stuff
00:27:20 to everybody who has Google and it's cheap
00:27:23 and they can like figure out their business
00:27:24 on the back of search ads and all this other stuff.
00:27:26 And we are at, well, 20 bucks a month was a lot of money
00:27:29 and people weren't paying for it.
00:27:30 And now we're giving away the best model we have for free.
00:27:32 Right, like, there's a real challenge in there.
00:27:34 And you can see these companies kind of battling it out,
00:27:36 each using their own advantage.
00:37:37 OpenAI, very much, their advantage is
00:27:39 they can tell a compelling story about a product
00:27:41 that everyone believes will be here a year from now.
00:27:44 And Google's advantage is that it's huge.
00:27:46 - Yep.
00:27:47 And you also just described basically the whole reason
00:27:51 that Microsoft and OpenAI got so tied up.
00:27:54 Because Microsoft promised OpenAI
00:27:56 that same level of scale and infrastructure.
00:27:59 And I think if we've learned one thing,
00:28:01 it's that it's very hard to do that in a partnership
00:28:04 in a way that it's much easier to do it
00:28:05 inside of your company.
00:28:06 And I think, like, if you look at what we've seen
00:28:09 across Google the last year or so,
00:28:11 that company has just like relentlessly consolidated itself
00:28:15 to be able to move all of that stuff faster.
00:28:17 - Yes.
00:28:18 - And that's like, that was the Rick Osterloh story
00:28:19 when they put him on top of hardware and on Android.
00:28:22 It's to move AI into Pixel and Android more quickly.
00:28:26 And the infrastructure stuff,
00:28:27 like one thing Demis Hassabis said to me
00:28:29 is that the fact that Google's infrastructure exists
00:28:34 is how you get Astra to be so fast.
00:28:36 Like, he just literally tied those two things together.
00:28:39 And he's like, "We spent six months
00:28:41 "taking the tech that we had and making it fast.
00:28:43 "And what that meant is putting it
00:28:45 "in the rest of Google's infrastructure.
00:28:46 "Like, that becomes the job."
00:28:48 And you're right, cheap and fast are how you win.
00:28:51 That is the next phase of this,
00:28:53 is we take all of this stuff and we figure out how to do it.
00:28:55 We figure out how to do it in a way
00:28:56 that is like economically sustainable
00:28:58 and doesn't have latency,
00:28:59 because latency kills all of these products.
00:29:01 Just as a user, like if it's slow, it sucks.
00:29:03 And that's where we are.
00:29:04 - And Google has known that forever, right?
00:29:06 I mean, Google has always known that speed is the thing.
00:29:09 So we should talk about some of Google's news.
00:29:11 The main one is Project Astra,
00:29:13 which is their natively multimodal,
00:29:16 take a video of a thing, search for it, ask it questions.
00:29:20 I did a demo of it with V Song.
00:29:22 It was weird.
00:29:24 There were Google executives there
00:29:27 doing demos on their phones, apparently.
00:29:30 We didn't see that.
00:29:31 We went into a booth where we put on a gaming headset
00:29:34 and there was a top-mounted camera
00:29:36 pointing down on a big TV screen.
00:29:38 And we were asked to take objects out of bins
00:29:41 and put them on the top,
00:29:43 so the top-down camera could see them
00:29:44 and ask questions about the objects,
00:29:46 which to David's point is a completely insane use case.
00:29:48 (Drew laughs)
00:29:49 Like, you are just never gonna do this.
00:29:51 - Yeah, what?
00:29:52 - But the idea is that it could see the things,
00:29:53 it could just instantly latency-free chat
00:29:56 with you about them.
00:29:57 So V picked three things and said, "Tell me a story,
00:30:00 "name the stuffed animal," great.
00:30:02 I broke it, just instantly broke it.
00:30:07 I had a spaceship, a plesiosaur, like a plastic dinosaur,
00:30:11 and a maraca. - Did you know it was
00:30:13 a plesiosaur or did it tell you it was a plesiosaur?
00:30:14 - I thought it was an ichthyosaurus
00:30:16 and it told me it was a plesiosaur.
00:30:17 - All right, I'm gonna put us
00:30:18 in the right-- - By the way, I don't know
00:30:19 who was right.
00:30:20 (Drew and Allyson laugh)
00:30:20 I didn't have any dinosaur people there with me.
00:30:23 (Drew and Allyson laugh)
00:30:24 You know, the one dude from "Jurassic Park"
00:30:25 was not there with me.
00:30:26 But Alan, what's his name?
00:30:31 - Alan Grant?
00:30:32 - Yeah, Alan Grant was not there.
00:30:34 - Yeah. - Good pull.
00:30:35 There we go.
00:30:36 - What was the, who's the one who played Ellie?
00:30:39 This is what I carry around. - Laura Dern?
00:30:40 - Laura Dern.
00:30:41 - Yeah, Laura Dern, oh my God,
00:30:43 I wish she'd been there with you.
00:30:44 - Yeah, so-- - I just wish she's around.
00:30:46 She's great. - If Google's listening,
00:30:47 next time we do this demo, Laura Dern should be there.
00:30:50 And I was like, tell me a story about these three objects
00:30:52 where the maraca is the winner,
00:30:54 which to be fair is like an idea that can only come
00:30:57 from my like booze-addled human brain.
00:31:00 - Yeah. - And it was like,
00:31:02 get ready for a story.
00:31:03 (all laugh)
00:31:08 And I was like, I'm ready.
00:31:09 And it was like, once upon a time,
00:31:12 and it just stopped, it's just dead in its tracks,
00:31:14 like could not predict the next word in a sentence
00:31:18 that added up to the maraca will beat this dinosaur
00:31:21 in this spaceship, like it was like, I don't know.
00:31:22 - And there is nothing more awkward
00:31:24 than that specific silence.
00:31:26 - Yeah. - Where you're like,
00:31:27 is it about to happen? - Is it thinking?
00:31:29 - Did it break, is it doing the wake heartbeat?
00:31:31 - Is there a hamster on a wheel somewhere?
00:31:33 Like there's a data center on fire,
00:31:34 and the poor guy giving the demo just like hit the space bar
00:31:36 and was like, get out of here.
00:31:37 Like he was like, we have other demos to give,
00:31:39 like I don't trust you anymore.
00:31:41 And it's fun, like that is a weird demo,
00:31:43 it's a weird use case, it doesn't make any sense.
00:31:45 The point is to show that it can like see,
00:31:48 you can like use its camera and do text to speech,
00:31:51 like all at the same time in the same way, right?
00:31:53 Natively multimodal.
00:31:55 But right, like you would never walk up,
00:31:58 like if you asked a normal person,
00:31:59 tell me a story where the maraca is a winner,
00:32:01 they would also be like, get ready.
00:32:03 (both laughing)
00:32:05 - Yeah, walk away.
00:32:06 - But I think there's just a part of this
00:32:09 where we're asking these things to do
00:32:11 like incredibly esoteric things
00:32:13 just to see where their limits are, obviously.
00:32:16 And you gotta bring that back to reality.
00:32:19 So the one demo they gave,
00:32:21 which is coming to Google Lens,
00:32:22 it's like an extension of Google Lens,
00:32:24 is a video search, which is sort of related to Astra,
00:32:27 like Astra is a thing over here,
00:32:28 but video search is a thing over there.
00:32:30 And it's weird to pull them apart.
00:32:33 So Astra, you're just like talking to a computer,
00:32:35 you're like waving your phone around,
00:32:36 looking at stuff and talking to it.
00:32:37 Video search in Lens is you like take a video,
00:32:40 like you hold down the shutter button of a thing
00:32:43 while you're talking and you say like,
00:32:45 why is this record player broken?
00:32:46 Which is one of their demos.
00:32:48 Or why doesn't the lever on this camera move?
00:32:50 Which is another one of their demos.
00:32:51 And they feel to me like the very related ideas,
00:32:55 but they are separate in classic Google fashion.
00:32:58 Like these are not the same idea,
00:32:59 they're different, they have different names.
00:33:01 But the thing about it is the Astra demos,
00:33:05 you know, they felt very esoteric,
00:33:07 can a maraca win?
00:33:09 The video search demos were very tactile.
00:33:12 Like I could see how they would be useful.
00:33:15 Right, this thing is broken,
00:33:16 I'm just gonna make a video of it being broken.
00:33:17 I don't even know what the parts are called,
00:33:19 help me fix it.
00:33:20 So the record player one,
00:33:21 which they did a lot on stage,
00:33:22 it's pretty good, right?
00:33:23 They had a, people know about record players,
00:33:25 there's a weight on the back of the tone arm,
00:33:27 and if you don't have the weight balanced right,
00:33:28 the tone arm just like skips around.
00:33:30 'Cause you need some weight pushing down on the needle
00:33:33 into the record to like hit the whole thing.
00:33:36 If you're a record player nerd,
00:33:37 this is like the most exciting demo.
00:33:38 That you've ever seen.
00:33:39 If you're a normal person,
00:33:41 you look at that and it just tells you this thing.
00:33:45 Go watch the demo, I encourage you to watch the demo.
00:33:47 It's like that, it identifies the model,
00:33:49 the make and model of the turntable.
00:33:50 It's like, that's an Audio-Technica turntable.
00:33:52 And the reason that tone arm might be moving this way
00:33:53 is because it's not counterbalanced correctly.
00:33:57 And it has a whole list of like things you could do
00:33:58 to counterbalance correctly.
00:34:00 And it's like, oh, this actually isn't useful.
00:34:03 Which I thought was fascinating.
00:34:05 It's the right answer,
00:34:06 straightforwardly the right answer.
00:34:07 Like that's why it was doing that thing.
00:34:09 But if you don't know what the tone arm is called,
00:34:12 you've gone two steps too deep too fast.
00:34:15 - Right.
00:34:15 - Right, you're like, the information you need
00:34:19 is like record players work by putting a needle in a groove
00:34:23 and you need to put pressure on that needle.
00:34:25 And this round weight at the back
00:34:27 is called the counterbalance and you need to spin it.
00:34:30 So you put enough weight on the thing
00:34:32 and the amount of weight you put on it
00:34:33 is actually dependent on your needle.
00:34:35 It's like you fall down the rabbit hole
00:34:36 and suddenly you realize like you own four record players.
00:34:40 - Yeah, then you're just an audio nerd.
00:34:41 - It's just a thing that happens to you.
00:34:43 That's fine.
00:34:44 But like, that was really interesting to me.
00:34:46 Right, it's the right answer,
00:34:47 but sort of in a way that was like too far too fast.
00:34:51 - Oh, I think you're totally wrong.
00:34:52 I think what you just described is Google search.
00:34:56 Like, which is the way that it works, right?
00:34:58 Which is that it essentially--
00:34:59 - But I'm saying, if you don't know
00:35:00 what a tone arm is called,
00:35:01 you couldn't follow the directions that were given to you.
00:35:03 - Right, that's, I mean, that's fair.
00:35:04 But I think maybe a better example of the opposite case
00:35:08 is one that Liz Reed, the head of search,
00:35:11 talked about when she and I were talking,
00:35:13 which is your dishwasher is broken
00:35:15 and there's a light blinking.
00:35:16 And you don't know what brand your dishwasher is,
00:35:18 you don't know what model it is,
00:35:19 you don't know how to describe which--
00:35:21 - You're just a baby.
00:35:22 - You're just like, my dishwasher's broken.
00:35:23 Like, off the top of your head,
00:35:25 do you know the model number of your dishwasher?
00:35:27 - I could make up anything right now.
00:35:31 (laughing)
00:35:32 - If you had confidently answered,
00:35:33 I would have believed you.
00:35:34 - Just be confident, just makes up enough.
00:35:37 - But again, that's not information most people have,
00:35:40 it's not information most people need,
00:35:41 it's not information most people should care about.
00:35:43 It should just say, take out the filter in the bottom,
00:35:46 rinse it out and put it back, you'll be fine.
00:35:48 Like, that is the answer.
00:35:49 I don't need to learn about dishwashers.
00:35:51 I just want mine to work.
00:35:52 And it's the, like, I take my car to the mechanic.
00:35:55 Some people are like, oh, I'd like to understand cars
00:35:57 so that I can make sure that my car runs better.
00:36:00 Most people come in, they go,
00:36:02 my car's going (imitates car engine) fix it.
00:36:05 - Right, but I'm saying that--
00:36:05 - And that's the end of the interaction.
00:36:06 - That's great, but if you're asking,
00:36:08 like, what is this thing and why is it broken,
00:36:09 and it's like, here's how to fix it,
00:36:11 telling you what to do is actually important, right?
00:36:15 And here, it was just like, fix your counterbalance.
00:36:16 - Yeah, but that doesn't require teaching me about it.
00:36:18 It just says, pull the, do you see the thing there?
00:36:20 Pull it down.
00:36:20 - Fair, I just thought that was a particularly jargony.
00:36:24 Maybe it's 'cause I just, like, am a turntable person.
00:36:27 But the answer there was, like, particularly jargony,
00:36:29 in a way that I thought was interesting, right?
00:36:30 It didn't, like, it didn't help you
00:36:33 understand the actual answer.
00:36:34 - Well, I think you're also someone who thinks, like,
00:36:37 you wanna understand the thing that you're talking about
00:36:39 and working with.
00:36:40 Like, we're all gadget nerds.
00:36:41 - Or I just wanna lie to David Confidently.
00:36:43 - Yeah, yeah, that too.
00:36:44 But, like, you know, we're all nerds.
00:36:46 We wanna know how the thing works.
00:36:48 We wanna know all the processes.
00:36:49 And it's true, like, the reason, you know,
00:36:51 when you first used Google Search 20 years ago,
00:36:53 you would have to be really,
00:36:54 think about your search prompt, right?
00:36:56 Like, you'd have to think about what you type in.
00:36:57 And now you just type in, like,
00:36:59 my dishwasher doesn't work.
00:37:00 - Yeah.
00:37:01 - And people just don't care as much.
00:37:04 But to your point, like, yeah, if it said,
00:37:07 if the dishwasher said, yeah, go replace the filter,
00:37:10 I'd be like, where the hell is the filter?
00:37:12 What the hell is that?
00:37:12 Like, to your credit, like,
00:37:14 I would still wanna know all of that.
00:37:16 And I don't think it would necessarily get me there yet.
00:37:18 - Well, I'm just saying, I think that one's interesting,
00:37:19 'cause it was the right answer.
00:37:21 There's no question it was the right answer.
00:37:23 - Yeah.
00:37:24 - That's the reason the tone arm was flying back and forth.
00:37:26 It was just, is it a useful answer, right?
00:37:29 And I think we can obviously disagree about that,
00:37:31 and we are, but that's the level that we're at.
00:37:33 Like, is this actually useful for you to go
00:37:35 sort of like spit out an answer like this,
00:37:38 where it's fixed the counterbalance?
00:37:40 - Right.
00:37:41 - And you need to know some information
00:37:42 to like take action on that instruction.
00:37:44 - Well, and there's a really interesting
00:37:46 sort of AI all the way down piece of that too,
00:37:49 which is that like, what that indicates is that
00:37:52 the AI for sort of understanding what's in a video
00:37:55 that you're sending to it is capable, right?
00:37:58 Like, it did the job, it figured out the question you had
00:38:01 by you pointing a video.
00:38:02 That's awesome, right?
00:38:03 Like, I actually think the video search in Lens
00:38:06 is a very good idea, and I think pointing your phone
00:38:08 at something, being like, what is this thing
00:38:09 that's happening, is an actually totally normal
00:38:12 user behavior, so I'm like all in on that.
00:38:14 The output thing is actually much harder, right?
00:38:16 Because it's like, okay, in this case,
00:38:19 is the right thing for Google to do
00:38:21 to just sort of give me an answer about the tone arm,
00:38:23 which is not a term that I know, to your point.
00:38:26 So if it says pull the thing on the tone arm,
00:38:28 I'm like, what the hell is the tone arm,
00:38:29 and you've lost me.
00:38:30 Should it send me a YouTube video?
00:38:31 Should it send me a compilation of stuff?
00:38:34 Should it like superimpose AR on top?
00:38:37 Like, that is just such a different problem
00:38:39 from the sort of search query recognition thing,
00:38:42 and my sense is Google's getting really good
00:38:45 at the search query recognition, right?
00:38:46 They did it with voice, they did it with video,
00:38:48 they did it with images.
00:38:50 Google is very good at understanding what you're asking
00:38:54 almost no matter how you ask it,
00:38:55 and then the question of how to answer
00:38:57 is so different and so much harder
00:38:59 and full of thorns and wildfires.
00:39:04 - So, by the way, this is kind of exactly
00:39:06 what I was getting at, is what it presented
00:39:08 was like a list of words, like vocab words,
00:39:10 and then in the demo, she was like,
00:39:15 "And here's a link to Audio-Technica's manufacturer page,"
00:39:17 which is more words, and what you actually want
00:39:20 is like a YouTube short where someone
00:39:21 just does it in five seconds.
00:39:23 - Right, somebody just reaches down
00:39:24 and pokes the thing and it works.
00:39:25 - Right, and they're just like, here's this thing,
00:39:26 like do this thing until it floats,
00:39:28 and then like turn the number to whatever number is on,
00:39:31 like that's all you need to know,
00:39:33 and like the actual answer to that
00:39:34 is a video of someone doing it.
00:39:36 And that's, again, it's just like,
00:39:39 it got it right in one specific way,
00:39:41 but it actually got it wrong in like another way,
00:39:44 which is like the form of the answer
00:39:46 is not the most useful thing, which is really interesting.
00:39:49 Then there was the other demo, which it was just wrong,
00:39:53 like wrong in a way where, I'll just preview,
00:39:56 it might, the first question I asked Sundar Pichai
00:39:58 in the Decoder interview was,
00:39:59 is language the same as intelligence?
00:40:01 Because the question was that someone was pointing a camera
00:40:05 at a broken film advance lever on a SLR,
00:40:08 like a film SLR camera.
00:40:10 And so these cameras are really manual, right?
00:40:11 So you take a picture, you gotta move the lever
00:40:13 to move the frame of film over in the camera.
00:40:16 If you've never seen a film camera,
00:40:17 this is like a very satisfying mechanical thing to do, right?
00:40:22 You like push the button, the big, you know,
00:40:25 the shutter goes ch-chunk, and then you like move,
00:40:27 you're literally pulling, you know,
00:40:29 the notches in the side of film,
00:40:31 you're literally pulling the film over
00:40:33 and winding it around so you can expose the next shot,
00:40:36 the next frame, when you next have to open the shutter.
00:40:38 This is great, I love, like, I'm a nerd for these things.
00:40:41 And the answer, the question was,
00:40:43 why can't I advance this thing?
00:40:45 Why is this lever not moving all the way
00:40:47 to advance the next thing?
00:40:49 This is a problem with these cameras, they have them,
00:40:50 there's a million reasons you might have this problem.
00:40:53 One of them on, like, old Canons is like,
00:40:55 there's a magnet at the bottom of the thing,
00:40:57 and one of the answers is like, get a magnet
00:40:58 and like run it over the bottom of the camera.
00:41:01 Like, there's all these, there's like,
00:41:02 folk tales and superstitions, like, you know,
00:41:05 it's like, drink the blood of a goat
00:41:06 and like wave at a crystal and like move your film
00:41:09 advance lever, it doesn't matter,
00:41:10 there's just like a lot of information out on the web
00:41:12 of how to do this.
00:41:14 And the answer that Google delivered in its own video
00:41:18 and highlighted is the most wrong answer.
00:41:21 Like, it's right in one way, which is as a last resort,
00:41:25 just open the back of the camera.
00:41:26 - It will technically work, yeah.
00:41:28 - It will technically work.
00:41:29 But if you open the back of a film camera,
00:41:32 you expose all of the film, you ruin all of your photos.
00:41:36 Right, so when people do this, when I asked Becca
00:41:38 and Viren and a bunch of other photographers on our staff,
00:41:40 like, when you have a broken film advance lever,
00:41:43 would you ever just open the back of your camera?
00:41:44 And Becca was like, no, no, no, no, no.
00:41:46 You go into the darkest closet you can find and you pray.
00:41:49 And then you open it and monkey with it,
00:41:51 because you don't want to expose your film.
00:41:53 You don't want to lose your photography.
00:41:54 And Viren was like, I've spent a lot of time in that closet,
00:41:56 which is a very funny thing.
00:41:58 The most Viren thing you can say.
00:42:00 Right, like these cameras break.
00:42:02 They're finicky mechanical objects.
00:42:03 But there's not a place where anyone with even the slightest
00:42:07 bit of intelligence about those kinds of cameras
00:42:10 would recommend and highlight,
00:42:12 just open the door and fix it.
00:42:15 Because necessarily you're gonna ruin all of your photos.
00:42:19 So I actually asked Sundar about this.
00:42:21 I was like, this is not a thing.
00:42:22 I don't see the intelligence going up.
00:42:24 I see the capability, right, like David's saying.
00:42:28 It can see in the video what's wrong.
00:42:30 It can go search the web.
00:42:31 It can synthesize some answer.
00:42:33 But this answer is wrong.
00:42:34 Like, this is not the thing you should advise people to do.
00:42:37 And he's like, I went and talked to the team.
00:42:38 And they're like, well, in some cases,
00:42:39 if you were willing to destroy all of your photos,
00:42:41 it's the right answer.
00:42:43 Which is correct.
00:42:45 - Yeah.
00:42:46 - That is the correct response.
00:42:47 We should just run it.
00:42:48 We should just run that clip.
00:42:49 Let's run that clip now.
00:42:50 - You know, ironically, I was talking to the team
00:42:52 as part of making the video.
00:42:54 They consulted with a bunch of subject matter experts
00:42:57 who all reviewed the answer and thought it was okay.
00:43:00 I understand the nuance.
00:43:01 I agree with you, obviously.
00:43:02 You don't want to expose your film
00:43:04 by taking it outside of a dark room.
00:43:08 There are certain contexts in which
00:43:10 it makes sense to do that.
00:43:11 - Sure.
00:43:12 - You know, if you don't want to break the camera,
00:43:14 and if what you've taken is not that valuable.
00:43:16 - Sure.
00:43:17 - All right.
00:43:18 It makes sense to do that.
00:43:20 You know, it's a good example of,
00:43:22 you're right, there is a lot of nuance in it.
00:43:25 And part of what I hope search serves to do
00:43:28 is to, you know, gives you a lot more context
00:43:32 around that answer,
00:43:34 and allows people to explore it deeply.
00:43:37 But I think, you know, these are the kind of things,
00:43:41 you know, for us to keep getting better at.
00:43:43 But to your earlier question, look, I think,
00:43:46 I do see the capability frontier continuing to move forward.
00:43:51 I think we are a bit limited
00:43:52 if we were just training on text data.
00:43:54 But I think we are all making it more multi-modal.
00:43:57 So I see more opportunities there.
00:43:59 - To me, that answer is like,
00:44:02 yeah, we should have probably told you
00:44:03 you're gonna destroy all your photos, right?
00:44:04 Like, that's the answer.
00:44:05 - Yes.
00:44:07 - But it's also like, why are we rushing into a world
00:44:10 where the products can't quite do
00:44:13 the thing you need them to do?
00:44:14 And Alex, this was your piece,
00:44:16 which is these things are hallucinating left and right,
00:44:19 and everyone's kind of like over it.
00:44:20 - Yeah, I just keep being baffled by it,
00:44:23 because people, it just screws up constantly.
00:44:26 Like, if you give me one of these things,
00:44:28 I can probably get it to hallucinate
00:44:30 within four to five questions fairly easily.
00:44:33 Like, consistently, I could get it to screw up.
00:44:35 And it's just because they're not good at it yet,
00:44:38 because they're computers, they're not people.
00:44:41 They're not actually, all of the stuff
00:44:43 that our brains do is really hard to do.
00:44:46 And they're not prepared, but these companies are like,
00:44:48 well, it's really cool, we're on our way, so enjoy.
00:44:51 And this is now how we're gonna have everything work.
00:44:52 And it's like, well, no, stop that.
00:44:55 I need my computer to know how to do my taxes.
00:44:58 I don't need it to know
00:45:00 what a nice accountant should sound like.
00:45:03 (laughing)
00:45:05 - Have you seen the one where people keep asking it
00:45:07 how to get a man and a goat across a river,
00:45:09 and literally none of the models can do it?
00:45:12 Like, the question is, you have a man, a boat, and a goat,
00:45:14 how do you get them all across the river?
00:45:16 And it's like, first you take the goat across,
00:45:17 then the man comes back, and he takes himself across,
00:45:19 and it's like, that's not the answer.
00:45:21 - This is a bad answer.
00:45:23 Yeah, they don't, the AIs don't think like humans do.
00:45:27 They're not meant to.
00:45:28 And that is remarkable.
00:45:30 And it's like, it is cool that we're now effectively engaging
00:45:34 with these alien thoughts and things, right?
00:45:37 'Cause it doesn't think the way a human does.
00:45:40 But it also doesn't think the way a human does,
00:45:42 and we keep asking it to,
00:45:43 and we keep putting it in these situations.
00:45:45 And then we're like, don't worry about it.
00:45:47 So it hallucinated a little bit.
00:45:49 So it decided that Nilay wasn't the founder of The Verge,
00:45:52 and that Alex is a man.
00:45:53 It's fine, keep going.
00:45:55 And it's like, yeah, materially,
00:45:56 those aren't harmful things to say, right?
00:45:58 Like, I think we'll both be fine if somebody said that.
00:46:01 But it still shouldn't say it,
00:46:03 and that's what you want to now become
00:46:05 the new form of search.
00:46:06 Like, for me, it really feels like Google has lost the plot
00:46:09 for what Google is supposed to do,
00:46:11 and Google is supposed to get you to the information.
00:46:13 It is not supposed to be the arbiter of the information.
00:46:16 And increasingly, it feels that it needs
00:46:17 to be the arbiter of the information.
00:46:19 And that's like the whole point
00:46:21 behind a lot of its new search products.
00:46:23 We're just gonna tell you how to think.
00:46:25 And it's like, well, no, one, you're a company,
00:46:27 you shouldn't tell me how to think.
00:46:28 Two, you don't actually know
00:46:30 because you're relying on these AIs that don't know.
00:46:33 Just stop this.
00:46:34 Like, this is a really bad path for them to go down.
00:46:36 I feel like it's gonna eventually bite them in the ass,
00:46:39 but they're so financially incentivized
00:46:40 that they just keep doing it.
00:46:43 - I mean, this is the truly,
00:46:44 the competition between OpenAI and Meta and Google
00:46:48 is ferocious.
00:46:50 Microsoft's obviously in the mix there.
00:46:52 - Apple wants to be.
00:46:53 - Well, Apple's just gonna buy some model
00:46:54 from one or the other, right?
00:46:57 At least for the stuff where Siri goes outside your phone.
00:47:00 That's the rumor, we'll see, WWDC is coming up.
00:47:02 But it just feels like there's that ferocious race
00:47:06 to get to this product
00:47:07 that everyone has dreamed of for a long time,
00:47:08 whether or not the core technology
00:47:11 can do the task reliably.
00:47:13 And just go back and think about
00:47:15 that film camera example for a minute.
00:47:16 But if you had a friend who was like,
00:47:18 you were like, for some reason,
00:47:20 you're like, my film camera is broken.
00:47:22 I can't move this lever.
00:47:23 And they're like, yeah, just open the back.
00:47:24 And then you open the back
00:47:25 and it destroyed all the photos you've taken.
00:47:27 And by the way, film's not cheap,
00:47:28 which is the thing that Becker reminded me of instantly.
00:47:30 Like, film is expensive.
00:47:31 So you're gonna ruin all these photos,
00:47:32 you're gonna ruin your expensive piece of film.
00:47:35 You would be pissed at your friend.
00:47:37 (laughing)
00:47:38 Like, just as a matter of course,
00:47:39 you'd be like, what?
00:47:41 Dude, like, you could have told me to go to a darkroom.
00:47:44 Like, I asked you 'cause I assume you know
00:47:46 this other piece of information.
00:47:47 And there's none of that accountability with these tools.
00:47:50 You can't just be mad at it.
00:47:51 Because, you know, in a market,
00:47:53 you would leave Google and go to OpenAI.
00:47:55 And it's like, it's gonna lie to you,
00:47:56 but in a way that suggests it might also have sex with you.
00:47:59 Like, fine.
00:48:00 (laughing)
00:48:01 But I just, that's the part where it's like,
00:48:04 everyone's building a really fast car,
00:48:06 but they're not telling you that the engine
00:48:07 just doesn't work about 20% of the time.
00:48:10 And it's like, okay, but is there a path to stopping it?
00:48:14 - That would be a bad car.
00:48:15 - I actually, I would put it slightly differently.
00:48:17 I think it's, they're building a fast car
00:48:20 and not telling you how to use it, right?
00:48:21 And I think like, the camera thing to me
00:48:23 is such an interesting example.
00:48:24 I would argue that's not a hallucination.
00:48:27 That's a terrible UX mistake.
00:48:30 Like, it gave you a true answer
00:48:32 without the actual bit of information
00:48:34 you needed to know what to do with that true answer.
00:48:38 You can open the back of your camera to get the film out.
00:48:41 That is a true thing.
00:48:42 And so in a certain way, the model was successful.
00:48:44 And like, that's what I hear Sundar saying.
00:48:46 It's like, it didn't get it wrong.
00:48:49 It just didn't tell you what you actually needed to know.
00:48:52 And so now we're in this place where like--
00:48:54 (laughing)
00:48:57 - I mean, that is the, you're dancing on the head of a pin.
00:49:03 - I mean, there are things that are wrong.
00:49:05 Like, Alex Kranz does not have a beard.
00:49:08 That is wrong, right?
00:49:09 Like, got it, nailed.
00:49:11 - Yeah.
00:49:12 - But the rest of the answers on that list,
00:49:15 that's the one I focus on 'cause it was the most destructive.
00:49:17 The rest of the answers in the list
00:49:18 that it generates are bananas.
00:49:20 Like, one of them is just like,
00:49:21 nudge the shutter a little bit, which is like, no.
00:49:24 - Yeah, don't do that.
00:49:25 But I think, one of the things I think a lot about
00:49:28 with Google right now is Google
00:49:31 used to make you do a lot of work, right?
00:49:33 And that was the job, right?
00:49:35 Like, when you did Google search,
00:49:36 you did most of the work
00:49:37 when you were interacting with Google.
00:49:39 And now Google is trying to do most of the work for you,
00:49:43 and it creates this incredibly different version of Google
00:49:47 that needs to exist.
00:49:48 And what Google thinks is that actually,
00:49:50 it can just put a paragraph summary at the top
00:49:53 and then let you use Google the way you normally would
00:49:56 because if you wanna know more than just the paragraph,
00:49:58 well, here's regular Google.
00:49:59 And I think that is like, woefully underestimating
00:50:03 how differently people are gonna start to use Google
00:50:05 as a result.
00:50:06 And so, to me, it's like,
00:50:09 making these things in such a way
00:50:10 that is like, weird and different is one thing,
00:50:13 but to just push them at people and be like,
00:50:16 this is a familiar thing that you totally understand
00:50:19 is a disaster.
00:50:20 And that's so much of what we're getting.
00:50:21 It's like, this thing sounds like a person,
00:50:23 talk to it like a person.
00:50:24 It works like a person.
00:50:25 Like, it super doesn't.
00:50:27 Not even close.
00:50:28 And all it's gonna do is run you into trouble.
00:50:30 - Yeah.
00:50:31 I mean, to me, this is the question
00:50:33 that like, all of these products have to answer, right?
00:50:37 Is what are people's expectations
00:50:38 and can you change them fast enough?
00:50:40 I truly do not know the answer to that
00:50:42 because I think what people are responding to
00:50:46 is these computers are talking to them.
00:50:48 And it turns out, people confidently,
00:50:51 flirtatiously talking to you, charmingly talking to you
00:50:54 is a great way to get a lot of people
00:50:56 to believe a lot of lies.
00:50:58 - That's how you get Alex Kranz in a newsboy cap.
00:51:00 - Yeah, exactly.
00:51:02 This is how Swing Dance happened to America.
00:51:04 (laughing)
00:51:05 I don't know if that's true.
00:51:06 But like, Google knows this because it operates YouTube.
00:51:09 It knows this problem intimately, right?
00:51:11 There's a lot of charming liars.
00:51:13 Here we are, we're on YouTube.
00:51:14 But it's interesting that it's like running full force
00:51:17 into making the problem.
00:51:19 And like you said, Alex, owning those search results,
00:51:22 like that's Google's fault that they told you that answer.
00:51:24 Whether or not you think it's appropriate
00:51:26 to pop open the back of the camera,
00:51:27 that's Google's answers now.
00:51:29 It's not some answer in a Reddit thread.
00:51:31 It's not some answer from a friend
00:51:32 that you're kind of mad at.
00:51:33 It's just Google's problem now.
00:51:34 It's gonna be an open ass problem.
00:51:36 There's a lot more to talk about.
00:51:38 We should talk about search very briefly.
00:51:40 We should talk about everything else Google has.
00:51:41 We gotta take a break though.
00:51:42 We've been at this for a minute.
00:51:43 We'll be right back.
00:51:44 (upbeat music)
00:51:46 All right, we're back.
00:51:49 Existential crisis of a banging iPad is on pause
00:51:53 for the rest of this Vergecast.
00:51:55 I know you wanted more of it.
00:51:56 - I thought you were gonna say over.
00:51:57 And I was like, well, that's a lie.
00:51:58 But on pause, we can live with it.
00:52:00 - No, Kevin Roose still wanders these streets, David.
00:52:03 By the way, we're good friends with both Kevin
00:52:05 and Casey at Hard Fork.
00:52:06 That's why we, out of love.
00:52:09 Although I do worry, out of love.
00:52:11 All right.
00:52:12 - Just not as much as we love our iPads.
00:52:13 (laughing)
00:52:15 It's like my wife, my iPad, Kevin Roose
00:52:20 is like, I would say the big three.
00:52:22 - Man, the dog just got a job.
00:52:24 Child.
00:52:27 He's there, he's fine.
00:52:28 He doesn't play me any YouTube videos.
00:52:32 - Fair enough.
00:52:34 Look, the OpenAI stuff and the Google stuff, right?
00:52:37 These like video, multimodal search chat interfaces.
00:52:40 Those are the main events.
00:52:42 Google announced 10 million other things at I/O.
00:52:46 All of which were named Gemini, basically.
00:52:50 Or variations of it.
00:52:51 It's very confusing.
00:52:52 David, can you just run us down the stuff?
00:52:54 - Sure, so basically the theme of Google I/O
00:52:58 is just Gemini in everything.
00:53:01 There was Gemini Google Photos,
00:53:03 which powers a feature called Ask Photos
00:53:05 that lets you ask questions of the photos
00:53:08 in your Google Photos.
00:53:09 The example they gave was,
00:53:10 you can just search what's my license plate number
00:53:12 and it'll find your license plate number
00:53:14 on your license plate in a photo in your Google Photos.
00:53:17 Which, as somebody who only knows my driver's license number
00:53:19 because it's in my Google Photos,
00:53:21 rules, all about that feature.
00:53:23 - Yes, by the way, that might be
00:53:24 the single most important thing that they announced,
00:53:27 was AI searching Google Photos.
00:53:29 - Oh, and also, as far as I could tell,
00:53:30 I wasn't there live, but it got a big reaction,
00:53:33 as far as I could tell, both live on the livestream
00:53:36 and just in our office.
00:53:37 People being like, "Yes, license plate search!"
00:53:39 - Well, (laughs)
00:53:42 turns out no one knows what their license plate is.
00:53:44 You know what I think it is?
00:53:45 It's Google Photos is so close to being able to do that now,
00:53:48 so is Apple Photos in its way,
00:53:50 that you start to ask it things it can't do.
00:53:52 So if you ever search your Google Photos,
00:53:54 you quickly start to ask for things that it can't do,
00:53:58 because it's so close to being able to do that now.
00:54:00 So you can just ask it for,
00:54:03 show me all the people wearing a blue shirt
00:54:06 on this day in the past, and it'll figure it out.
00:54:09 'Cause it has some ability of that knowledge.
00:54:11 I personally search for the word truck a lot,
00:54:14 just to gaze at my now departed pickup truck.
00:54:17 I needed to know my VIN number for when I traded in the truck
00:54:20 and I just searched for the words VIN number,
00:54:22 and just found it, right?
00:54:23 'Cause I'd taken a picture of it.
00:54:26 The idea that you can extend that,
00:54:27 I think people responded to it,
00:54:29 'cause they're already trying.
00:54:31 So now you're gonna give the people what they want,
00:54:33 it's very good.
00:54:34 - Yeah, yeah.
00:54:35 So that was one thing that got a lot of excitement.
00:54:38 Sort of in the same vein of things
00:54:40 you already kinda wanna do,
00:54:42 they announced a bunch of Gemini stuff in Workspace.
00:54:45 There's a thing where you can ask
00:54:47 similar questions of your email.
00:54:49 The example they gave was,
00:54:50 if you have concert tickets,
00:54:52 or I think it was Knicks tickets
00:54:53 was one of the examples they used.
00:54:55 You can just say, "What time do the doors open
00:54:57 "at the Knicks game tonight?"
00:54:57 And it'll actually look in your email for your ticket
00:55:00 to find that information.
00:55:01 That kinda stuff is neat.
00:55:02 There was a bunch of stuff
00:55:06 in just the kind of general model universe.
00:55:10 There was a new version of the Imagen model,
00:55:12 which is one of the ones that developers
00:55:14 can use for creative tools.
00:55:16 There was Veo, which is Google's answer to Sora,
00:55:19 which is a text-to-video model.
00:55:21 - Which seemed really cool, by the way.
00:55:23 - Yeah, it seemed very good.
00:55:24 - It seemed even better than Sora in a lot of ways.
00:55:27 - Also debatable whether either of them
00:55:29 actually exist in the world, so we'll see.
00:55:31 There was a new thing called Gemini Live,
00:55:34 which was like an always-on voice chatty thing,
00:55:37 basically very similar to the one that OpenAI announced.
00:55:41 Project Astra is the big long-term vision.
00:55:43 Gemini Live is that just sort of specific voice thing.
00:55:47 There was an update to SynthID,
00:55:49 which is how Google is attempting to figure out a way
00:55:52 to watermark AI-generated stuff.
00:55:54 They're now doing it with text,
00:55:56 which I thought was really interesting,
00:55:57 but also with videos, which goes along with Veo
00:56:01 and all the stuff they're doing there.
00:56:03 Bunch of stuff for Gemini Advanced and Gemini Nano
00:56:06 and Gemini 1.5 Flash, which is a new model
00:56:09 just designed for all of its--
00:56:11 The names are insane,
00:56:12 and I'm sure I've gotten 2/3 of them wrong,
00:56:14 and I don't care because it's Google's fault.
00:56:17 - You've gotten them all right so far.
00:56:18 - The names are dumb.
00:56:19 There was a new AI thing in YouTube
00:56:23 called Music AI Sandbox, which I confess
00:56:25 I don't totally understand because there's been a bunch
00:56:27 of kind of overlapping AI music things inside of YouTube,
00:56:31 and they glossed over it so quickly
00:56:33 that I think it might just be kind of an amalgam
00:56:35 of some of those things already,
00:56:37 but that's a thing that exists.
00:56:38 They're working with musicians on that stuff.
00:56:41 New Gemma models, which are Google's open-sourced AI models.
00:56:46 Circle to Search is in Android doing more stuff.
00:56:50 They had a bunch of little things in Android
00:56:53 designed to make kind of Gemini all over the operating system
00:56:57 in interesting ways.
00:56:58 - So the Android piece is really interesting to me,
00:57:01 like in a big way.
00:57:03 I don't know if any of it will come true because Google,
00:57:07 but Dave Burke got on stage and started demoing AI features
00:57:11 in Android, and he's like,
00:57:12 "This works as we control the operating system,"
00:57:15 and he put that sort of right next to,
00:57:16 "We've combined the Android team and the Pixel team
00:57:19 under Rick Osterloh," and you're like,
00:57:20 "Oh, like if you make this really good,
00:57:25 this is the thing that is actually compelling
00:57:27 to switch from a Samsung phone or iPhone."
00:57:30 - This is how Google thinks it wins, for sure,
00:57:32 both with Pixel specifically and with Android in general.
00:57:36 Like I think this is, true or false,
00:57:39 the thing they have identified as their chance
00:57:42 to leapfrog Apple in a really good way.
00:57:44 - But they've always thought AI was their way.
00:57:47 - Yeah, but now it's like the user interface
00:57:48 for the operating system is AI.
00:57:50 Like they've built like a super rabbit.
00:57:52 You know, it's like that's the thing that they're getting at
00:57:54 is like you can just be on your phone and circle a thing,
00:57:57 and the robot will go do it for you,
00:57:59 or you can just talk to it and it'll use the apps for you,
00:58:02 and unlike rabbit, Google has some AI technology.
00:58:06 - Yeah, well, and there was one thing that they talked about
00:58:09 where the sort of context awareness
00:58:11 where it'll actually understand what's on the screen
00:58:14 as you're talking to your phone,
00:58:15 that's the kind of thing that you genuinely only can do
00:58:18 if you control the operating system.
00:58:20 Like no one else has that access,
00:58:22 but also Google has been trying to do this for forever.
00:58:26 Do you remember Google Now,
00:58:27 which was just the same damn thing?
00:58:28 Like the promise of so much
00:58:32 of the large language model stuff right now
00:58:34 is all this stuff we've been trying to do
00:58:35 for 15 years we can do now because the tech is better.
00:58:38 Like these ideas are not new.
00:58:40 It's so funny.
00:58:41 They just think we have the tech to do it now.
00:58:44 - Or the tech to do it convincingly.
00:58:45 I just have to keep putting that asterisk there.
00:58:48 To do it convincingly, maybe not to do it.
00:58:51 - That's fair.
00:58:52 To at least make you want to try it
00:58:54 is as far as they have to go.
00:58:55 There's more, there's a bunch of search stuff
00:58:58 that we should talk about,
00:58:59 but it was basically like they made all the models better
00:59:03 and are starting to introduce some more specific models.
00:59:06 Like I think Gemini 1.5 Pro Flash,
00:59:08 in addition to being an unbelievably stupid name,
00:59:11 is really interesting because it's just a different version
00:59:14 of Gemini that is just designed for speed.
00:59:17 Its whole job is we just pared everything down
00:59:19 and made it as fast as we possibly could.
00:59:21 And I think you're gonna start to see
00:59:22 more little things like that
00:59:24 as they tune to these kind of specific use cases over time.
00:59:27 So it really, top to bottom, was just,
00:59:30 here's the thing, we Gemini'd it.
00:59:32 Is it better now?
00:59:33 Do you like it?
00:59:34 That was the whole vibe.
00:59:36 - So I will say, I think the most important one
00:59:37 is Google Photos.
00:59:38 It's the one that will hit the most people the fastest
00:59:40 and change how people use the application,
00:59:42 which I think is cool.
00:59:44 Like really cool.
00:59:45 Then there's the rest of it,
00:59:46 which I think the major criticism I had was like,
00:59:48 there's too much stuff, it was too unfocused.
00:59:50 Like they just said the word Gemini.
00:59:52 Like if you just read that list of things named Gemini,
00:59:55 Gemini 1.5 Pro, Gemini 1.5 Flash,
00:59:57 Gemini Live, Gemini Now, Gemini, like it's insane.
00:59:59 They announced the thing where they're like,
01:00:02 here's a prototype and idea where you have an AI teammate
01:00:05 who has their own Gmail address.
01:00:07 You can like yell at it.
01:00:08 Yeah, his name was Chip.
01:00:10 That's like, this is a ridiculous,
01:00:11 like this is pure vapor.
01:00:13 Like you are announcing vaporware.
01:00:14 There's no reason to do this.
01:00:17 Like you're just, why?
01:00:18 Right, you could pare it down to like,
01:00:19 here's our product and here's all the things it can do.
01:00:22 And that'd be more focused.
01:00:23 But I think Google thinks it has to like,
01:00:26 cover the waterfront of every idea you might have with AI.
01:00:28 And be like, we had that idea too.
01:00:30 And we have the infrastructure to pull it off.
01:00:31 And we have Google Workspace.
01:00:33 So we can just deploy 100 AI teammates
01:00:35 to your Google Workspace tomorrow.
01:00:37 Startups, don't you dare.
01:00:38 Yeah, I mean, I really like,
01:00:42 the way I've come to think about all of these things
01:00:45 is that they used to be kind of half for developers
01:00:49 and half for like the general public
01:00:50 to sort of show the world what you've been working on.
01:00:53 Now it's like a third for Wall Street.
01:00:56 So that you can make sure investors trust
01:00:59 that you have a real AI strategy.
01:01:01 Because fundamentally that's what a lot of this is about.
01:01:03 Google is still fighting the idea
01:01:05 that it got caught off guard by OpenAI and ChatGPT
01:01:09 and is still trying to assert itself
01:01:12 to investors specifically as a big player in the AI future.
01:01:16 Which means trillions of dollars
01:01:18 if you can convince everybody that that's the case.
01:01:20 So it's like, it's a third Wall Street.
01:01:22 It's like 50% developers.
01:01:25 And then it's just a little teeny tiny slice
01:01:28 left for regular people.
01:01:29 - You almost did the rest of that math
01:01:31 and you just walked away from it right there.
01:01:33 - 17% regular people.
01:01:36 - There you go.
01:01:36 - And it's just like,
01:01:39 you could just feel Google getting up there over and over
01:01:42 and being like, we are good at AI.
01:01:44 - Yeah, look at our AI stuff.
01:01:45 - That was so much of it.
01:01:46 Yeah, like we did it, AI.
01:01:47 - That's how I felt about Apple at the iPad event.
01:01:49 They were just like, the best AI piece.
01:01:50 It's like, what are you talking about?
01:01:52 And it's like, this is made for analysts.
01:01:54 So the one thing that's really interesting there,
01:01:55 specifically in the context of Wall Street,
01:01:57 is AI overviews in search,
01:01:59 which they announced very quickly.
01:02:01 So Google has had this thing called
01:02:04 search generative experience,
01:02:05 sort of in testing for a long time now,
01:02:08 a year, where you search for something
01:02:10 and there's like an AI answer.
01:02:12 And it's like, here's this stuff.
01:02:13 And they've been tweaking it and messing with it
01:02:16 and like seeing how it works for a while.
01:02:17 And what they announced at I/O,
01:02:19 which is a huge deal,
01:02:21 is it's rolling out to everyone in the US like this week,
01:02:24 it's gonna be called AI overviews.
01:02:26 And that's it, this is search now.
01:02:29 And on earnings calls last year,
01:02:30 Sundar was saying, this is the future of search.
01:02:33 Like search will be this thing.
01:02:34 This is where we're going.
01:02:35 And here we are, where you search for something on Google
01:02:38 and depending on the query,
01:02:40 Google just reads the web and delivers the answer for you.
01:02:44 And next to that, they have this thing,
01:02:46 AI powered like search page,
01:02:49 which is basically what Arc browser is doing, right?
01:02:52 Where you like go ask Arc a question
01:02:54 and it like reads the web
01:02:55 and it like makes a little webpage for you
01:02:57 with all the answers in it.
01:02:58 Google's gonna do that now
01:03:00 as the search engine results page, the SERP.
01:03:03 Instead of 10 blue links,
01:03:04 it will just like make a little custom webpage
01:03:06 of like best anniversary restaurants in Dallas
01:03:09 and be like, we've concocted this webpage for you.
01:03:12 That's wild, right?
01:03:13 This is like flip the table
01:03:15 on how the internet works in a big way.
01:03:17 I would say the response from the media industry
01:03:21 in particular is apocalyptic.
01:03:24 Like that's the word I would use.
01:03:25 - Fair.
01:03:26 - The CEO of the News Media Alliance,
01:03:28 which I think we have to disclose,
01:01:29 like I think Vox Media is in it.
01:03:30 It's like not my thing, but someone's in it, sure.
01:03:33 - We are in the media ourselves.
01:03:35 - We are in the news media.
01:03:36 Yeah, we run, we operate in the media business
01:03:38 and our executives are in various
01:03:40 media business organizations.
01:03:42 That's their side of the house.
01:03:43 I just complain, but there's your disclosure.
01:03:46 But the CEO of the News Media Alliance said to CNN,
01:03:48 this is catastrophic to our traffic.
01:03:50 Like Google's gonna keep all the traffic
01:03:52 and they're not gonna send us any traffic.
01:03:53 We've been talking a lot about little sites
01:03:55 that are already seeing Google traffic go to zero.
01:03:58 I've been talking for years about this thing
01:04:00 I call Google Zero, which if you've been
01:04:02 listening to Vergecast for a long time,
01:04:04 you know that like our first big referrer
01:04:06 when we started was Yahoo.
01:04:07 And we would like write stories about fish technology
01:04:10 because the Yahoo algorithm loved fish.
01:04:13 And we would just like do that on Fridays
01:04:15 to fuck with Yahoo and it worked.
01:04:17 I don't know, that's how I built the business.
01:04:19 Like I'm sorry.
01:04:20 And then you know, there's like Facebook traffic
01:04:21 for a while and it went away.
01:04:22 And I'm always like the thing, it will go away.
01:04:24 That's like my inherent paranoia
01:04:26 after all this time in the media.
01:04:28 And I'm like that number will go down.
01:04:30 And that's Google.
01:04:30 And that's the last referrer of note
01:04:32 across the entire industry.
01:04:33 I've been calling this Google Zero for years.
01:04:36 Like what are you gonna do if your Google traffic
01:04:37 goes to zero?
01:04:38 On Decoder I ask media executives all the time.
01:04:41 And this is the moment where I think it clicked
01:04:42 into reality for a lot of those executives
01:04:45 that Google is gonna start answering the questions
01:04:47 by reading their sites and delivering
01:04:49 an AI summary of the answer.
01:04:50 And Google's response to this,
01:04:52 and again this is, we can move through this quickly
01:04:54 'cause I talked about this with Sundar and Decoder
01:04:56 quite a bit, that's coming on Monday.
01:04:57 But Google's answer to this, Liz's answer to this,
01:05:00 Liz Reed who runs Search, is actually the links
01:05:03 and the AI overviews get more clicks.
01:05:05 That's their answer.
01:05:08 Like straight up, they're like actually
01:05:09 we're gonna send more traffic this way.
01:05:11 No one knows if that is true.
01:05:13 The only people who know the data to evaluate that
01:05:16 are Sundar Pichai and Liz Reid.
01:05:19 And it's like well if that's true, that's great.
01:05:21 If it's not true, or worse, if the haves get bigger,
01:05:26 you send more traffic to the big corporate sites
01:05:30 and no traffic at all to anyone else,
01:05:32 that's bad for the, that's bad.
01:05:36 That will reshape the internet in particular ways.
01:05:38 And maybe you think this is great.
01:05:40 Maybe you think that indie media or whatever,
01:05:43 small sites, should build their own network of traffic
01:05:46 and their own, this is our belief.
01:05:48 My belief very strongly is we should have
01:05:51 our own audience away from platforms.
01:05:53 But if you are like operating today,
01:05:56 and you're like oh my traffic is gonna go to zero,
01:05:58 my Google traffic is gonna go to zero,
01:06:00 you don't have the runway to like do the thing.
01:06:03 - Right, they just turned it on.
01:06:05 Like it's just on now.
01:06:06 - Yeah, like your business is gonna go away.
01:06:08 And that is the thing that I think the pressure
01:06:12 from open AI, the competitive pressure from open AI
01:06:14 and all these other companies,
01:06:15 it pushed Google to do a thing very quickly
01:06:19 that's gonna reshape the internet
01:06:20 in almost indescribable ways.
01:06:22 Because the traffic flows of search traffic
01:06:25 are about to like table flip.
01:06:27 - Yeah, I do think the example that most immediately
01:06:30 keeps coming to mind for me for this
01:06:32 is the what time does the Super Bowl start kind of thing.
01:06:35 And that to some extent has already died
01:06:38 thanks to the knowledge graph,
01:06:39 which now is just like a,
01:06:40 it just pops up a thing at the top.
01:06:42 But what, I talked to Liz Reid a few days before I/O,
01:06:45 and one of the things that she said,
01:06:47 almost in as many words,
01:06:49 is that there is a type of information on the internet
01:06:52 that Google just views as commodity information
01:06:56 that I shouldn't need to go to another website
01:06:58 to know the answer to, right?
01:06:59 Like before, if you wanted to know
01:07:03 the years that Abraham Lincoln was the president,
01:07:05 you would search that on Google
01:07:07 and then you would click on a link.
01:07:08 And Google at some point has decided
01:07:09 that that is just commoditized information
01:07:12 that belongs to no one,
01:07:14 and so I should have to go nowhere to get it.
01:07:17 And then if you go all the way down the spectrum,
01:07:20 you get down to like human perspectives
01:07:23 and original stuff and really beautiful art, right?
01:07:26 And that is the stuff that I think Google earnestly believes
01:07:29 should still belong to those people.
01:07:31 And that Google's job is still to send you as the user
01:07:34 to those places.
01:07:35 And I think Google would like you to believe
01:07:37 that it is very obvious where one of those things ends
01:07:39 and the other begins.
01:07:40 And in fact, the whole middle ground,
01:07:43 which is most of the internet, is somewhere in between,
01:07:46 and there is just no line that it is possible to draw
01:07:50 between what is it just sort of out there in the ether
01:07:54 that everyone can know,
01:07:56 and the idea that you and I both used some of our time
01:08:00 to write a story hoping we'd get some clicks
01:08:02 and some AdSense,
01:08:03 that that's actually like a unnecessary thing
01:08:06 for the enjoyment of the internet,
01:08:07 maybe that should go away.
01:08:09 I'm somewhat receptive to that version of the conversation,
01:08:12 but where that ends and good things that deserve to exist begin
01:08:16 is now up to Google.
01:08:18 And that is what scares me about this.
01:08:20 - Yeah, and they have not really articulated
01:09:26 anything beyond like the cream rising to the top.
01:08:26 - Yeah, they say high quality information
01:08:28 and human perspectives and a bunch of just like
01:08:29 vague buzzwords that don't give anybody anything.
01:08:32 - Isn't this like just what Ask Jeeves was doing
01:08:37 for decades?
01:08:38 - Yeah.
01:08:39 - Like, it didn't all decide that that was a stupid way
01:08:42 to interact with the internet.
01:08:44 And that's why Ask Jeeves still exists,
01:08:46 but way fewer people use it now.
01:08:48 I think it still exists.
01:08:49 - It's ask.com now.
01:08:51 - Sorry.
01:08:52 - Jeeves was fired.
01:08:53 - Please.
01:08:54 - Jeeves got canceled.
01:08:54 - Jeeves got canceled.
01:08:55 - Got canned.
01:08:56 - Some old tweets surfaced and Jeeves is no longer with us.
01:09:00 - But again, Alex, that's a perfect example
01:09:02 of like none of these are new ideas, right?
01:09:04 The only thing that's new is that everyone has decided
01:09:08 it's now possible to do this well.
01:09:11 - Yeah.
01:09:11 - Ask.com does TV show reviews now.
01:09:15 - Oh my God.
01:09:16 - Sure.
01:09:16 - So that's pure, just like Google bait.
01:09:19 - Perfect example.
01:09:20 - I mean, this is like why we spent a year last year
01:09:22 covering the SEO industry.
01:09:25 Because I think it's important to describe the thing
01:09:29 as it was before you can understand how it changed.
01:09:32 So we're like, let's just, these are the glory days
01:09:36 of the SEO industry as it exists today,
01:09:38 which is you buy ask.com and you search
01:09:41 for which TV shows are trending and you shove a bunch
01:09:44 of content onto a high ranking domain
01:09:46 to collect some pennies.
01:09:47 This is happening right away across the media industry.
01:09:49 And that's happening at like AI scale.
01:09:51 Like you buy Sports Illustrated
01:09:53 and you shove a bunch of AI content on it.
01:09:55 It's a real thing that has happened.
01:09:56 - Yep.
01:09:57 - Right, you buy these like zombie domain names.
01:10:00 What's the last one?
01:10:01 - The Awl.
01:10:02 - Gizmodo.
01:10:03 - The Awl.
01:10:04 - I don't know, was it the hairpin?
01:10:06 - The hairpin.
01:10:06 - It is the hairpin, yeah.
01:10:07 - That's what I'm thinking of.
01:10:08 Like this beloved women's blog
01:10:10 and you shove a bunch of AI content on it.
01:10:12 Like that's already happening.
01:10:14 This should wipe all that out, right?
01:10:16 And that's, I think, what Google would like
01:10:18 is to wipe all that noise out.
01:10:20 That implies, by the way, that Google can make
01:10:23 good determinations about what's good and what's bad,
01:10:25 which the current state of Google does not provide
01:10:29 a huge amount of evidence for.
01:10:31 It also implies that Google will prioritize
01:10:34 the beautiful human content over the AI content
01:10:38 when a huge part of Google's business
01:10:40 is making the AI content.
01:10:42 Weird.
01:10:42 - And monetizing the AI content.
01:10:44 - Yeah.
01:10:45 - That actually Google has an incentive
01:10:47 for that content to exist at giant scale.
01:10:50 - It also kind of pulls the curtain back on like Google
01:10:54 probably more than Google should want it to, right?
01:10:57 Like we, especially in our industry,
01:10:59 we know that Google is an enormously powerful thing
01:11:02 that really controls a lot of conversations and stuff
01:11:04 simply by virtue of the fact that
01:11:06 if you want to engage in these conversations,
01:11:08 you first go search for them.
01:11:09 And the first place you go search for them is Google.
01:11:11 And so Google kind of dictates
01:11:13 how you perceive these things.
01:11:15 If Google just starts answering the questions,
01:11:17 it is like the theater of,
01:11:18 oh, I search Google to figure this out,
01:11:20 and I have the agency of finding the answers is gone.
01:11:24 It's just Google.
01:11:25 - Dude, this is the phrase.
01:11:26 - Yeah.
01:11:27 - They use this on stage.
01:11:28 Let Google do the Googling for you.
01:11:29 This is the frame for how they talk about search.
01:11:31 - Horrible way to do it.
01:11:32 - It's a lot there.
01:11:34 We can talk about this forever.
01:11:36 I mean, I personally can talk about this forever
01:11:38 in tiresome ways.
01:11:39 And I assure you that if you work at Vox Media,
01:11:41 you have heard me talk about nothing else for four years.
01:11:44 Like a lot of The Verge is designed around Google Zero.
01:11:50 And the point is,
01:11:51 I don't think anyone deserves traffic from Google, right?
01:11:56 That expectation, I think,
01:11:58 has actually been really unhealthy for the media industry,
01:12:00 that like we're entitled to traffic from platforms,
01:12:04 especially for us because we cover the platforms.
01:12:05 Like I would like some distance there in a real way.
01:12:08 And also just like, I'm a brat
01:12:10 and I don't want to be dependent on anyone ever for anything.
01:12:13 But this is a pivotal moment.
01:12:15 Like Google waving through,
01:12:18 we're gonna do AI Overviews in Search for everyone in the US
01:12:21 and then sort of immediately pivoting to,
01:12:22 oh, and also you can search your Google photos.
01:12:25 That was, they like lit a bomb and they're like,
01:12:27 and you can find your license plates.
01:12:29 And they spent no time on it.
01:12:31 It does not appear that anybody knew
01:12:32 they were gonna do this.
01:12:33 We'll see how it shakes out.
01:12:35 I think the next 12 to 18 months
01:12:36 are gonna be an absolutely bananas time in media.
01:12:40 I think this is the thing that's gonna,
01:12:41 I would predict that there are some lawsuits filed over this
01:12:44 from the various media lobbying groups,
01:12:47 the same way that they've already started suing OpenAI.
01:12:50 There's gonna be a big shakeout from AI Overviews.
01:12:54 And a huge part of it, and this is just what I'll focus on,
01:12:57 is Google's claim is that it will send more traffic.
01:13:00 You cannot measure that claim
01:13:02 inside of Google's own tools today
01:13:05 or with any data that Google is providing.
01:13:07 And if they wanna close that loop,
01:13:09 they have to make it like testable that this is true.
01:13:12 Right, that you have to be able to see the data for yourself
01:13:15 and say, okay, like we were the one
01:13:17 that showed up in the AI Overview box
01:13:19 and we got way more traffic from it
01:13:20 than what we were getting,
01:13:21 like you cannot measure that today.
01:13:23 So we'll see, but I,
01:13:25 you should listen to "Decoder" because we talked about it a lot.
01:13:28 That's going on Monday.
01:13:29 But we're gonna be talking about this.
01:13:30 I guarantee you for about 18 months
01:13:32 because the internet is gonna get reshaped
01:13:34 around this particular feature in search.
01:13:36 - And it's gonna change a lot between now and then.
01:13:39 Like there are, the weirdest thing about this
01:13:41 is like SGE changed a ton in a year.
01:13:44 And I think Google is maybe going to be surprised
01:13:49 at how different it looks once you give it to everybody.
01:13:52 - Yeah, it's gonna be weird.
01:13:54 All right, we should take a break.
01:13:55 We'll come back with the lightning round.
01:13:58 Which as of yet is unsponsored,
01:13:59 but the bytes, they're coming.
01:14:02 It's gonna happen one of these days.
01:14:04 We'll be right back.
01:14:05 All right, we're back.
01:14:10 It's the lightning round.
01:14:12 Sponsored by me, Nilay Patel.
01:14:13 - Thank you.
01:14:15 - The guy who coined the phrase "financial".
01:14:16 - I didn't get any of that money.
01:14:17 I don't know.
01:14:18 (laughing)
01:14:19 - I was like, you're so generous, where is it?
01:14:20 - I'm just moving money around my own house.
01:14:22 (laughing)
01:14:23 I'm just doing tax fraud.
01:14:24 There's kind of a lot in the lightning round.
01:14:28 David, you're first.
01:14:29 - So, there's a Microsoft Surface event on Monday,
01:14:33 which there's been a lot of hype for
01:14:35 because Qualcomm and Microsoft and others
01:14:37 have been kind of quietly intimating
01:14:40 that we're about to get a new generation
01:14:42 of Snapdragon chips that are going to be
01:14:44 as good as the M-series Apple chips,
01:14:46 and the war between Mac and PC is about to go on.
01:14:49 And then we got a huge leak from Dell.
01:14:52 I think it was a 311-page document
01:14:55 detailing the next XPS 13, which is Dell's best laptop.
01:15:00 It's like the default. Which Windows laptop
01:15:02 should most people buy? It's the XPS 13.
01:15:04 And the things that they're saying on here
01:15:08 are pretty unbelievable.
01:15:11 12 hours of battery life, 13 hours of battery life
01:15:14 on the lower spec models, local video playback for 29 hours,
01:15:19 tons of power that, at least according to the marketing copy,
01:15:24 the things that we've been thinking might be true
01:15:27 of this run of Snapdragon processors
01:15:29 might actually be true of these Snapdragon processors.
01:15:34 - I just wanna highlight that Apple last week,
01:15:39 like in kind of a surprising move,
01:15:42 announced their M4 processor in an iPad.
01:15:45 And I don't think that's an accident, right?
01:15:47 Like, Neil, you were talking earlier
01:15:49 about how a lot of these things are for Wall Street.
01:15:53 Apple did it for Wall Street 'cause it is very aware
01:15:56 that this is happening right now.
01:15:58 - Yeah, we'll see.
01:16:00 - We'll see, yeah.
01:16:01 - I would be remiss if I didn't note two things.
01:16:03 One, boy, we've heard this story before.
01:16:05 - Oh, yeah.
01:16:06 - So who knows?
01:16:08 But two, a lot of what we have,
01:16:11 a lot of our coverage is in Tom Warren's new newsletter,
01:16:13 Notepad by Tom Warren, which is his new newsletter
01:16:16 about all things Microsoft, particularly AI,
01:16:18 the future of PCs in this particular way,
01:16:20 and whatever Microsoft is doing in gaming,
01:16:23 which I won't even continue to talk about
01:16:25 because it will immediately be another hour
01:16:27 of "The Vergecast".
01:16:28 But you can sign up for Notepad now.
01:16:30 You can get it in a bundle with Command Line.
01:16:32 We have two paid newsletters.
01:16:34 We're gonna, let's see.
01:16:35 You see our empire's growing?
01:16:37 - I hope there's some, like, arrows pointing down
01:16:40 saying, "Buy now."
01:16:41 - Subscribe to Tom's newsletter
01:16:43 so that he can buy one of these new XPS 13s.
01:16:47 That's really what Nilay is saying here.
01:16:47 Tom needs the battery life so he can write his newsletter.
01:16:50 Please subscribe to Notepad.
01:16:51 - Yeah, he's gotta write a lot of those every week now.
01:16:53 Okay, Alex, what's yours?
01:16:55 - Have you heard of psspspsps?
01:16:57 - That's a great name.
01:16:58 - Otherwise known as psspsps, PPSSPP.
01:17:03 You could say it like that, which is how it's supposed to be
01:17:07 because it's PSP.
01:17:08 - That's how you lure cats, right?
01:17:10 You go psspspsps.
01:17:11 - But it's psspspsps.
01:17:12 - Yeah.
01:17:13 - You just keep saying it until somebody tells you
01:17:15 to shut up and then you're like,
01:17:15 "Yeah, do you wanna play it?"
01:17:17 It's a new emulation software.
01:17:19 It's the PSP, do you guys remember the PSP?
01:17:21 - Oh yeah, I had a PSP.
01:17:22 They had one of the great cartridges of all time.
01:17:25 - I broke one of those cartridges,
01:17:27 but I loved the game and I was like poor,
01:17:30 so I couldn't go get a new one
01:17:32 and I'd always have to be like flicking the plastic
01:17:35 just to get it just right and then putting it back in.
01:17:37 - Only Sony invents mini disc
01:17:39 and then invents another mini disc for the PSP.
01:17:42 - But somehow a little worse.
01:17:43 But now you can play it on the iPhone,
01:17:47 which is really exciting because it was like a tank game
01:17:50 and I really wanna play that tank game.
01:17:52 I'm gonna be able to play it on my iPhone now.
01:17:54 And also RetroArch, which is like the emulator software.
01:17:59 So RetroArch lets you basically play just about any kind of,
01:18:02 if somebody out there has created emulation software
01:18:06 for a console you have never heard of,
01:18:09 you can probably find it in RetroArch.
01:18:11 - Nice.
01:18:12 - The software itself is a little,
01:18:13 it is not the most user-friendly,
01:18:16 it is not gonna be as neat.
01:18:17 - It's hideous, just saying.
01:18:18 - Yeah, it's ugly.
01:18:19 - It is the ugliest app of all time.
01:18:20 - Or troll ugly.
01:18:22 But it works.
01:18:23 I just put it on my Apple TV
01:18:26 and almost as soon as we are done recording,
01:18:30 I will be putting a bunch of games on my Apple TV
01:18:32 and testing it for work and definitely just for work.
01:18:36 But it's cool and it's just remarkable
01:18:39 'cause even a couple of years ago,
01:18:42 you were trying to figure out
01:18:43 how to get all of these emulated games onto your TV
01:18:47 and it was really limited
01:18:48 and you could do it like a Raspberry Pi,
01:18:50 you could mess around with Android, all of that,
01:18:54 and now you can just do it on an Apple TV
01:18:55 and it's like, welcome Apple, I've been here for 10 years,
01:18:58 but I'm excited you're here
01:19:00 because I kinda like my Apple TV more than my Shield TV.
01:19:03 I'm so sorry to all the Shield people.
01:19:05 I'm sorry.
01:19:06 - Dude, end of the show.
01:19:07 - It's in the guest bedroom now.
01:19:08 - So much, wow.
01:19:10 - I have been stewing on the idea
01:19:14 and being increasingly angry about it
01:19:16 that the Apple TV is the only good set-top box left.
01:19:19 They're all bad, but it is the least bad
01:19:23 because it is the one least destroyed by ads and scams
01:19:28 and all kinds of other nonsense.
01:19:30 - I watched "Palm Royale" because it was like,
01:19:32 do you wanna watch "Palm Royale?"
01:19:33 And I was like, yeah, Carol Burnett.
01:19:34 - Apple has figured out that it can just be a Roku.
01:19:37 - Yeah.
01:19:38 - It's headed there in extra blitz.
01:19:39 - It's infuriating, but I will say,
01:19:40 RetroArch in particular being on the Apple TV,
01:19:43 it's like a native Apple TV app.
01:19:45 - And it just blew.
01:19:45 - It's strong, but it exists on the Apple TV.
01:19:49 And some of the others work with AirPlay and stuff
01:19:51 and are starting to be pretty good,
01:19:52 but the Apple TV as a retro console thing
01:19:56 is going to be a big deal.
01:19:58 - Yeah, you can just do it now.
01:19:59 There's gonna be a lot more
01:20:01 that will probably be much more user-friendly,
01:20:03 but RetroArch is the big one.
01:20:05 You can just go play it now, and that's dope.
01:20:08 - I keep thinking about something David said
01:20:10 a couple of weeks ago,
01:20:11 which was all of this regulatory stuff
01:20:12 was really hard to understand,
01:20:16 and then the emulators happen,
01:20:17 and now everyone gets it.
01:20:18 A little bit of regulatory pressure
01:20:21 opened up the application model on the iPhone,
01:20:22 and it's like, oh, everyone wants this.
01:20:24 - Yeah.
01:20:25 - And it's like, particularly for the Apple TV,
01:20:28 Apple has insisted that you do its dumb stuff,
01:20:31 and has not won.
01:20:33 Like that product has not won,
01:20:34 and it's by far the smallest install base.
01:20:37 It has the least,
01:20:39 like the idea that everyone's gonna be playing games
01:20:41 on the Apple TV just did not come true.
01:20:43 And now it's like,
01:20:44 oh shit, everyone's gonna be playing games
01:20:45 on their Apple TV.
01:20:46 - Yeah.
01:20:47 - I have never wanted to actually hook up a controller
01:20:49 to an Apple TV before today.
01:20:51 Like every other time,
01:20:52 they'd be like, do you wanna play this game
01:20:53 on your Apple TV?
01:20:54 No.
01:20:55 - Yeah, no, it's like, I have a PS5.
01:20:56 - Don't ask me that, I'm not gonna lie to you.
01:20:57 Like, no.
01:20:58 And now I'm like, okay, I wanna do that.
01:21:00 I wanna play Chrono Trigger on my TV.
01:21:03 - I honestly think the emulator stuff
01:21:05 is just the biggest indictment of Apple's
01:21:07 like, closed-fistedness that you can get.
01:21:10 'Cause it's like the second you let people do good stuff,
01:21:13 they're like more excited about these products
01:21:14 than they have been a long time.
01:21:15 - Totally.
01:21:17 - I have two in my lightning round.
01:21:18 One of them is just telling David
01:21:20 to talk about his iPad reviews.
01:21:22 Which I know you already talked about earlier this week,
01:21:24 but I wasn't here.
01:21:25 So it feels like everyone landed
01:21:29 in exactly the same place,
01:21:31 which is, boy, I hope WWDC involves an iPadOS update.
01:21:34 - So yes and no.
01:21:36 The conversation around this
01:21:37 has actually been really interesting
01:21:38 because most of the reviews, including mine,
01:21:41 basically say that.
01:21:42 I actually worried I was going overboard
01:21:45 gushing about the hardware.
01:21:47 There were others who went much further.
01:21:50 With the iPad Pro in particular,
01:21:52 it is a spectacular piece of engineering and design.
01:21:56 But the question forever is like,
01:21:58 what is this thing actually for?
01:22:00 And I think what has come true this week
01:22:03 is there's been a subset of people who are like,
01:22:04 yeah, this thing needs to be more like a Mac.
01:22:07 Let me run Mac apps, open it up,
01:22:09 let me do whatever I want.
01:22:10 Give me a Mac in this body and I will be happy.
01:22:14 And then there's a much different set of people
01:22:17 who are louder than I thought,
01:22:19 who are like, no, that is not the point.
01:22:23 Maybe I want a touchscreen Mac,
01:22:24 we'll have that conversation separately.
01:22:26 But the iPad is supposed to be different.
01:22:28 And one of the things I said in my review
01:22:29 is that after talking to people at Apple
01:22:32 and about the iPad,
01:22:33 and I've just been covering this a long time,
01:22:34 Apple's running theory is that it can do
01:22:38 whatever the opposite of death by a thousand cuts is,
01:22:41 that rather than have one big thing
01:22:44 that makes the iPad for everybody,
01:22:46 it can have one that is for you, Nilay,
01:22:49 and one that is for you, Alex,
01:22:50 and one that is for me, David,
01:22:51 and we'll all buy iPads and we'll use them differently.
01:22:53 And that that, for Apple, is the ideal outcome.
01:22:56 And there are a lot of people for whom that idea
01:22:59 is really compelling and really romantic.
01:23:01 Those people tend to be people who have
01:23:03 that specific thing in the iPad already.
01:23:05 People who love the pencil,
01:23:06 or people who do the insane architect thing
01:23:10 of holding up the iPad and walking around the table
01:23:12 to show somebody your designs,
01:23:14 which I'm not convinced is real.
01:23:16 - We got an email from someone who,
01:23:18 I don't remember the email specifically,
01:23:19 but they are an interior designer,
01:23:21 they're partners in an interior design firm,
01:23:23 and they're like, we just run the iPad into the ground
01:23:26 like every day. - Yeah, it's a thing.
01:23:27 - It's great, great.
01:23:29 - Yeah, so I think the question then for Apple
01:23:32 is like, okay, do you try to find the big mainstreamy thing,
01:23:35 which I think the obvious answer is make it more laptop-y,
01:23:40 or just open it up so that people can do more stuff on it,
01:23:44 or do you keep just chipping away at this thing
01:23:46 like one tiny use case at a time
01:23:48 until you've built the magic?
01:23:51 And I think that second approach
01:23:53 is way more fun and interesting
01:23:54 and leads to something very cool in the long run.
01:23:56 We're also 14 years into this, and Apple has found two.
01:24:01 - Yeah, and to me, the thing that I got,
01:24:03 I was on planes all week basically,
01:24:06 just watching people angrily post at each other
01:24:08 on the iPad, and I was like,
01:24:11 these are the same iPad reviews we've been writing for years.
01:24:14 - Yeah. - It's the same iPad.
01:24:16 - I forgot that I wrote this iPad review in 2013
01:24:19 when the first iPad Air came out.
01:24:20 Like the same exact review, running iOS 7.
01:24:24 Like, boy, it'd be cool if I could do stuff with this.
01:24:26 And then I wrote a version of that review in 2018,
01:24:27 and then David, you wrote a version of that review yesterday,
01:24:29 and then I just saw Dieter at the Google event,
01:24:31 and he was like, I wrote these iPad reviews.
01:24:33 Like, Dieter is the one I believe
01:24:34 who coined the phrase, it's an iPad.
01:24:37 So like, look directly at a camera,
01:24:38 be like, it's an iPad, and you know exactly what that means.
01:24:40 That's us, we did that here.
01:24:41 And the thing that gets me about it,
01:24:44 the thing that I've, the reason I wanna talk about it
01:24:46 right after emulators is it's Apple's business model
01:24:50 that is holding this computer back.
01:24:52 - Yes. - It is not
01:24:53 the capabilities of the iPad or some idealism
01:24:56 or blah, blah, blah, blah, blah, blah, blah.
01:24:58 It is Apple's business model
01:25:00 is preventing application developers
01:25:02 from using the full power of the iPad.
01:25:04 The fact that this thing cannot actually run
01:25:06 a desktop web browser, they say it's desktop class,
01:25:09 it absolutely is not. - No.
01:25:11 - Nope. - It just isn't.
01:25:12 - Also, if you go to docs.google.com
01:25:15 in the browser on an iPad,
01:25:17 it will punt you to the Google Docs app.
01:25:19 - Well, it'll try. - It's the most
01:25:20 miserable experience in the world.
01:25:21 - So what's really interesting is the Google Docs app.
01:25:23 This is, you know, all of our years
01:25:25 reviewing the iPad, we would complain about this,
01:25:27 and Apple would always be like, "Well, who cares?
01:25:28 "Only journalists use Google Docs."
01:25:29 And then they realized the journalists
01:25:30 who review the iPad use Google Docs.
01:25:33 So they fixed it by faking the user agent
01:25:35 in mobile Safari to tell Google Docs
01:25:38 that it's desktop Safari.
01:25:39 So you can run the good Google Docs
01:25:41 in Safari on the iPad.
01:25:43 What they did not do was make mobile Safari good.
01:25:46 - Right, or fix it in Chrome
01:25:48 or any of the other browsers that you can run on the iPad.
01:25:50 - Right, and the reason for that
01:25:53 is straightforwardly a business model reason.
01:25:56 It's not that the thing isn't powerful enough
01:25:58 to run the regular Safari that you can run on a laptop.
01:26:00 - It's like stupid powerful, this thing.
01:26:02 - Right?
01:26:03 Yeah, I have an Intel Mac downstairs.
01:26:05 I have a 2015 iMac sitting right over there
01:26:07 that is one-tenth as powerful as the iPad Pro.
01:26:11 It runs regular Safari just fine.
01:26:13 It runs Chrome in a way that suggests
01:26:16 that I should buy a new computer, but it runs Chrome.
01:26:18 Right?
01:26:19 The reason Apple doesn't allow that to happen on the iPad
01:26:23 is because the web is now the most powerful
01:26:26 application distribution mechanism that has ever existed.
01:26:29 It is by far the most popular in the world.
01:26:31 And if you bring that to the iPad, the App Store goes away.
01:26:36 Because now you just have a laptop.
01:26:37 You have a Chromebook.
01:26:38 - A really nice Chromebook.
01:26:40 - A really beautiful Chromebook.
01:26:42 And Apple's business will not allow that to happen.
01:26:44 And so this product is just absolutely held back
01:26:47 by a business decision that maybe it's fair.
01:26:50 I don't think, fine, make your business decisions.
01:26:53 But we should be honest
01:26:54 that this isn't a technical limitation
01:26:56 or an idealistic limitation or blah, blah, blah.
01:26:58 Apple is making a hard business decision
01:27:00 about the application models on its products.
01:27:02 Regulators are starting to crack that open
01:27:03 and people are starting to see, oh, emulators are great.
01:27:05 Like all this other stuff that we could do is great.
01:27:07 And the last one to fall, I promise you,
01:27:10 is they will put a desktop class browser on the iPad
01:27:12 and it will suddenly become a hundred times
01:27:13 more powerful as a computer.
01:27:14 And more and more people will be like,
01:27:15 I can just use this all the time.
01:27:17 It's not the file system, although I love a file system.
01:27:20 It's literally the application environment
01:27:22 is completely neutered.
01:27:24 - Yeah, no, I absolutely agree.
01:27:26 And I think the browser solves a much bigger set of problems
01:27:29 than like, you know, x86 interoperability
01:27:33 with the old apps that you wanna use.
01:27:34 'Cause like, fine, there's going to be a bunch of apps
01:27:37 you can never use on the iPad, so be it.
01:27:40 The web, there's no good excuse for that
01:27:43 to not be one of those things.
01:27:44 - That doesn't even work anymore
01:27:45 because all the laptops are ARM-based.
01:27:48 (laughing)
01:27:49 - I mean, that's true.
01:27:51 Including Apple, so yeah.
01:27:53 - Like their most powerful laptop processor line
01:27:56 is now in the iPad first.
01:27:58 - Yeah.
01:27:58 - Yeah.
01:27:59 I'm just saying, the thing about the iPad,
01:28:01 the thing that it's been missing to me from the discourse
01:28:04 from the beginning is like,
01:28:05 Apple designs every inch of this product, right?
01:28:08 That they make the screen and the input devices
01:28:12 and the operating system.
01:28:13 And they're like, there's not a thing in this
01:28:15 that is not an Apple choice.
01:28:17 And this is the thing that they want.
01:28:19 - Yes.
01:28:20 - Fine.
01:28:21 But then you have to be like,
01:28:22 I chose to limit this this specific way.
01:28:25 - And I think there's a version of that trade
01:28:26 that is more palatable at $349 for the 10th gen iPad
01:28:31 than it like, I mean, I saw people spending
01:28:34 $2,000 and $3,000 on these.
01:28:36 And I just like, I think the argument
01:28:39 that this should be a different device,
01:28:40 that it's meant for different things,
01:28:43 it does different things well.
01:28:44 Like no, at that price, it should do everything well
01:28:47 that it is physically capable of doing.
01:28:49 And it just can't.
01:28:50 And it is like, you're not letting it.
01:28:52 - I do need to fight you on one thing, David.
01:28:54 'Cause I was one of those people
01:28:55 that spent way too much money on an iPad Pro this week.
01:28:58 - Yeah.
01:28:59 - Oh yeah, I got the 11 inch.
01:29:01 It's the first iPad Pro I've had since the 2018 one.
01:29:05 That Magic Keyboard case, even though it's the new one
01:29:08 and it's all much better, is horrible.
01:29:11 - Really?
01:29:11 - It's so thick.
01:29:13 It is so, like, as soon as you drop that iPad into it,
01:29:17 all of the magic of it goes away.
01:29:18 And I was like, okay, so it's the exact same
01:29:20 as my 2018 iPad, but the screen's better.
01:29:23 Like, that was the immediate feeling I had
01:29:25 in a really demoralizing way.
01:29:27 - There is something to that.
01:29:28 I mean, it doubles the size of the iPad.
01:29:31 Like, the keyboard at this point, I think,
01:29:32 is both thicker and heavier than the actual iPad.
01:29:37 Which is bonkers, and probably says more
01:29:39 about how thin the iPad is than anything else.
01:29:41 But also, if it were a little thinner,
01:29:43 it would be floppier, even than it is.
01:29:45 It's just an unsolvable problem.
01:29:47 - Yeah, I mean, it's the same feeling you get
01:29:49 when you use those inexpensive Chromebooks
01:29:52 that are convertible Chromebooks or whatever, right?
01:29:54 And it's like, okay, what do you want?
01:29:56 And, yes, realistically, having it feel like a laptop
01:30:00 is more satisfying, but also, I hate it.
01:30:03 (laughing)
01:30:05 Fix that. - I get that, yeah.
01:30:06 - The Surface does it better.
01:30:09 - I'm gonna play with yours,
01:30:10 'cause I really thought about it.
01:30:11 - I'll bring it in for you.
01:30:13 - The Surface, having a built-in kickstand
01:30:15 and then all you have to do is attach the keyboard,
01:30:17 is the correct way. - Yep.
01:30:19 - I believe that with every fiber of my being.
01:30:22 - Man, look, Mac versus PC is back on.
01:30:25 Subscribe to Notepad by Tom Warren for all the latest
01:30:28 on the newly resurgent Mac versus PC battle,
01:30:31 everything old is new again.
01:30:32 That was my lightning round one.
01:30:33 The other one I was gonna point out
01:30:34 is Andy Hawkins wrote a great piece
01:30:35 about how self-driving cars have hit a wall,
01:30:38 and I was gonna make the same comparison to AI,
01:30:41 which is everyone got really excited about that technology,
01:30:43 and then it just didn't do the thing.
01:30:45 - Yep. - And I would just caution
01:30:48 everyone, like, can they actually do the thing?
01:30:50 But you should go read that piece by Andy,
01:30:51 'cause it's very good.
01:30:53 I think we're like five hours over.
01:30:54 - Yeah, we should go.
01:30:55 - Should we talk about Search for a little bit longer?
01:30:57 (laughing)
01:30:58 All right, there's much more to come on Monday.
01:31:01 Again, we have the interview with Sundar.
01:31:03 He and I got into it.
01:31:04 I like talking to Sundar.
01:31:05 I've talked to him once a year for a long time now,
01:31:07 basically, he was fire in this one,
01:31:10 so stay tuned for that on Monday.
01:31:12 And then it says here on Sunday,
01:31:14 we're gonna do the five senses of gaming, smell and taste.
01:31:17 I don't know what that means.
01:31:18 I'm very worried about what Liam and David
01:31:20 are doing with "The Vergecast", but it sounds great.
01:31:23 - This face on David is just great.
01:31:25 - You officially know everything I need you to know.
01:31:27 It's gonna be great.
01:31:28 All right, that's it.
01:31:29 That's "The Vergecast", rock and roll.
01:31:30 (upbeat music)
01:31:32 - And that's it for "The Vergecast" this week.
01:31:36 Hey, we'd love to hear from you.
01:31:37 Give us a call at 866-VERGE-11.
01:31:40 "The Vergecast" is a production
01:31:41 of The Verge and Vox Media Podcast Network.
01:31:43 Our show is produced by Andrew Marino and Liam James.
01:31:47 That's it, we'll see you next week.
01:31:48 (upbeat music)