Criticism of Oppenheimer and the Atomic Bomb - Marc Andreessen and Lex Fridman

Transcript
00:00 You mentioned this in the essay about nuclear,
00:02 which was also, I mean, you don't shy away
00:05 from a little bit of a spicy take.
00:09 So Robert Oppenheimer famously said,
00:12 "Now I am become death, the destroyer of worlds,"
00:14 as he witnessed the first detonation of a nuclear weapon
00:17 on July 16th, 1945.
00:20 And you write an interesting historical perspective,
00:23 quote, "Recall that John von Neumann responded
00:26 "to Robert Oppenheimer's famous hand-wringing
00:28 "about the role of creating nuclear weapons,
00:31 "which," you note, "helped end World War II
00:35 "and prevent World War III,
00:37 "with some people confess guilt
00:39 "to claim credit for the sin."
00:42 And you also mentioned that Truman was harsher
00:44 after meeting Oppenheimer.
00:45 He said, "Don't let that crybaby in here again."
00:49 - Real quote, real quote, by the way,
00:52 from Dean Acheson.
00:53 - Boy.
00:56 - 'Cause Oppenheimer didn't just say the famous line.
00:58 - Yeah.
00:59 - He then spent years going around,
01:00 basically moaning, going on TV,
01:02 and going into the White House,
01:03 and basically just doing this hair shirt thing,
01:06 this sort of self-critical,
01:07 like, "Oh my God, I can't believe how awful I am."
01:09 - So he's widely considered,
01:12 perhaps because of the hand-wringing,
01:15 the father of the atomic bomb.
01:17 - This was von Neumann's criticism of him:
01:21 he tried to have his cake and eat it too.
01:25 And von Neumann, of course,
01:26 was a very different kind of personality,
01:27 and he's just like, "Yeah, screw it,
01:28 "this was an incredibly useful thing,
01:29 "I'm glad we did it."
01:30 - Yeah.
01:31 Well, von Neumann is widely credited
01:35 as being one of the smartest humans of the 20th century.
01:38 Everybody who met him says,
01:40 "This is the smartest person I've ever met."
01:43 Anyway, smart doesn't mean wise.
01:47 So I would love to sort of,
01:52 can you make the case both for and against
01:54 the critique of Oppenheimer here?
01:56 'Cause we're talking about nuclear weapons,
01:59 boy, do they seem dangerous.
02:01 - So the critique goes deeper.
02:03 Here's the real substance, and I left this out
02:04 'cause I didn't wanna dwell on nukes in my AI paper.
02:07 But here's the deeper thing that happened,
02:10 and I'm really curious,
02:11 this movie coming out this summer,
02:12 I'm really curious to see how far he pushes this,
02:14 'cause this is the real drama in the story,
02:15 which is it wasn't just a question of,
02:17 are nukes good or bad?
02:18 It was a question of, should Russia also have them?
02:20 And what actually happened was:
02:24 America invented the bomb,
02:26 and Russia got the bomb through espionage.
02:28 American scientists
02:31 and foreign scientists working on the American project,
02:33 some combination of the two,
02:35 basically gave the Russians the designs for the bomb,
02:38 and that's how the Russians got the bomb.
02:40 There's this dispute to this day
02:42 about Oppenheimer's role in that.
02:44 If you read all the histories,
02:46 the kind of composite picture,
02:47 and by the way, we now know a lot actually
02:49 about Soviet espionage in that era,
02:50 'cause there's been all this declassified material
02:52 in the last 20 years
02:53 that actually shows a lot of very interesting things.
02:56 But if you kind of read all the histories,
02:57 what you kind of get is that Oppenheimer himself
03:00 probably did not hand over the nuclear secrets.
03:02 However, he was close to many people who did,
03:04 including family members,
03:06 and there were other members of the Manhattan Project
03:08 who were Soviet assets and did hand over the bomb.
03:11 And so the view that Oppenheimer and people like him had
03:15 that this thing is awful and terrible,
03:17 and oh my God, and all this stuff,
03:19 you could argue fed into this ethos at the time
03:22 that resulted in people, the Baptists of that era,
03:24 thinking that the only principled thing to do
03:26 was to give the Russians the bomb.
03:27 And so the moral beliefs on this thing
03:31 and the public discussion
03:33 and the role that the inventors of this technology play,
03:35 this is the point of this book,
03:36 when they kind of take on
03:37 this sort of public intellectual moral kind of thing,
03:40 it can have real consequences, right?
03:41 'Cause we live in a very different world today
03:44 because Russia got the bomb than we would have lived in
03:45 had they not gotten the bomb, right?
03:47 The entire second half of the 20th century
03:49 would have played out very differently
03:50 had those people not given Russia the bomb.
03:52 And so the stakes were very high then.
03:55 The good news is nobody's sitting here today,
03:58 I don't think, worrying about an analogous situation
04:01 with respect to, like, I'm not really worried
04:02 that Sam Altman's gonna decide to give the Chinese
04:04 the design for AI,
04:06 although he did just speak at a Chinese conference,
04:08 which is interesting.
04:09 However, I don't think that's what's at play here.
04:12 But what's at play here
04:13 are all these other fundamental issues
04:15 around what do we believe about this
04:16 and then what laws and regulations and restrictions
04:18 are we gonna put on it?
04:19 And that's where I draw a direct straight line.
04:22 And anyway, my reading of the history on nukes
04:24 is that the people doing the full public hair-shirt routine,
04:27 "this is awful, this is terrible,"
04:28 actually had catastrophically bad results
04:30 from taking those views.
04:32 And that's what I'm worried is gonna happen again.
04:34 - But is there a case to be made
04:35 that you really need to wake the public up
04:37 to the dangers of nuclear weapons
04:39 when they were first dropped?
04:40 Like, really educate them on,
04:43 this is an extremely dangerous and destructive weapon.
04:46 - I think the education kinda happened quick and early.
04:48 - How?
04:49 - It was pretty obvious.
04:50 - How?
04:51 - We dropped one bomb and destroyed an entire city.
04:53 - Yeah, so 80,000 people dead.
04:55 - Yeah.
04:56 And look, the nukes-
04:58 - But, like, the reporting of that,
05:00 you can report that in all kinds of ways.
05:02 You can do all kinds of slants,
05:04 like war is horrible, war is terrible.
05:06 You can make it seem like
05:10 the use of nuclear weapons is just a part of war
05:12 and all that kind of stuff.
05:13 Something about the reporting
05:15 and the discussion of nuclear weapons
05:16 resulted in us being terrified and in awe
05:21 of the power of nuclear weapons.
05:23 And that potentially fed in a positive way
05:27 towards the game theory of mutually assured destruction.
05:31 - Well, so this gets to what actually happened.
05:33 Let's get to what actually happened.
05:33 - Some of it's me playing devil's advocate here.
05:35 - Yeah, yeah, sure, of course.
05:36 Let's get to what actually happened
05:37 and then kind of back into that.
05:38 So what actually happened, I believe,
05:39 and again, I think this is a reasonable reading of history,
05:41 is what actually happened
05:42 was nukes then prevented World War III.
05:44 And they prevented World War III
05:45 through the game theory of mutually assured destruction.
05:47 Had nukes not existed, right,
05:50 there would have been no reason
05:51 why the Cold War did not go hot, right?
05:53 And then, you know, the military planners at the time,
05:56 on both sides,
05:57 thought that there was gonna be World War III
05:59 on the plains of Europe,
06:00 and they thought there was gonna be
06:00 like 100 million people dead, right?
06:02 It was like the most obvious thing in the world to happen.
06:04 Right, and it's the dog that didn't bark, right?
06:06 Like, it may be the single best net thing
06:08 that happened in the entire 20th century,
06:10 that that didn't happen.
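
To make the deterrence logic concrete, here's a minimal sketch of the 2x2 game being described, with illustrative payoff numbers chosen for this example (the numbers are assumptions of mine, not from the conversation): when retaliation is assured, a first strike can only make the striker worse off, so the only stable outcome is for both sides to hold.

```python
# Minimal sketch of the mutually-assured-destruction payoff logic.
# Assumed, illustrative payoffs: "hold"/"hold" is a tense peace (0),
# any first strike triggers assured retaliation and mutual ruin (-100),
# and a simultaneous exchange is slightly worse still (-120).

from itertools import product

MOVES = ["hold", "strike"]

# (row player's payoff, column player's payoff) for each pair of moves.
PAYOFFS = {
    ("hold", "hold"): (0, 0),
    ("strike", "hold"): (-100, -100),   # retaliation ensures mutual ruin
    ("hold", "strike"): (-100, -100),
    ("strike", "strike"): (-120, -120),
}

def best_response_row(col_move):
    """Row player's payoff-maximizing move, given the column player's move."""
    return max(MOVES, key=lambda m: PAYOFFS[(m, col_move)][0])

def best_response_col(row_move):
    """Column player's payoff-maximizing move, given the row player's move."""
    return max(MOVES, key=lambda m: PAYOFFS[(row_move, m)][1])

# A profile is a Nash equilibrium if neither side gains by deviating alone.
for row_move, col_move in product(MOVES, repeat=2):
    stable = (best_response_row(col_move) == row_move
              and best_response_col(row_move) == col_move)
    print(f"{row_move:6} / {col_move:6} -> {'equilibrium' if stable else 'not stable'}")
```

Running this prints `hold / hold -> equilibrium` and marks every other profile unstable: under these assumed payoffs, not striking is each side's best response, which is the "dog that didn't bark" outcome being pointed to.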
06:11 - Yeah, actually, just on that point,
06:13 you say a lot of really brilliant things.
06:14 It hit me just as you were saying it.
06:18 I don't know why it hit me for the first time,
06:21 but we got two world wars in a span of like 20 years.
06:26 Like, we could have kept getting more and more world wars,
06:30 each more and more ruthless.
06:32 You actually could have had a US versus Russia war.
06:35 - You could have.
06:36 By the way, there's another hypothetical scenario.
06:39 The other hypothetical scenario
06:40 is the Americans got the bomb, the Russians didn't, right?
06:43 And then America's the big dog,
06:44 and then maybe America would have had the capability
06:46 to actually roll back the Iron Curtain.
06:48 I don't know whether that would have happened,
06:50 but like, it's entirely possible, right?
06:52 And these people who had these moral positions,
06:55 'cause they thought they could forecast, they could model
06:58 the future of how the technology would get used,
06:59 made a horrific mistake,
07:00 'cause they basically ensured that the Iron Curtain
07:02 would continue for 50 years longer
07:03 than it would have otherwise.
07:04 And again, like, these are counterfactuals.
07:05 I don't know that that's what would have happened,
07:08 but like, the decision to hand the bomb over
07:12 was a big decision,
07:14 made by people who were very full of themselves.
07:16 - Yeah, but so me as an American,
07:19 me as a person that loves America,
07:21 I also wonder what it would have been like
07:23 if the US was the only one with nuclear weapons.
07:25 - That was the argument of the guys
07:29 who handed over the bomb.
07:32 That was actually their moral argument.
07:33 - Yeah, I would be careful
07:36 about the regimes you hand it over to.
07:40 Maybe give it to like the British or something.
07:43 Or like a democratically elected government.
07:47 - Well, look, there are people to this day
07:48 who think that those spies, Soviet spies,
07:49 did the right thing,
07:50 because they created a balance of terror
07:52 as opposed to the US just having it,
07:53 and by the way, let me, let me-
07:54 - Balance of terror.
07:55 - Let's tell the full version of the story.
07:56 - Has such a sexy ring to it.
07:57 - Okay, so the full version of the story is,
07:59 John von Neumann's a hero of both of ours.
08:00 The full version of the story is,
08:02 he advocated for a first strike.
08:03 So when the US had the bomb and Russia did not,
08:07 he advocated for, he said,
08:09 "We need to strike them right now."
08:11 - Strike Russia.
08:12 - Yeah.
08:13 - Ooh.
08:14 - Yes.
08:15 - Von Neumann.
08:16 - Yes, because he said, "World War III is inevitable."
08:18 He was very hardcore.
08:21 His theory was, "World War III is inevitable.
08:25 "We're definitely gonna have World War III.
08:26 "The only way to stop World War III
08:28 "is we have to take them out right now.
08:29 "And we have to take them out right now
08:30 "before they get the bomb, 'cause this is our last chance."
08:33 Now again, like--
08:34 - Is this an example of philosophers in politics?
08:36 - I don't know if that's in there or not,
08:37 but this is in the standard--
08:38 - No, but is it, I mean, meaning is that--
08:39 - Yeah, this is on the other side.
08:40 So most of the case studies
08:42 in books like this
08:43 are about the crazy people on the left.
08:45 - Yeah.
08:46 - Von Neumann is a story, arguably,
08:48 of the crazy people on the right.
08:49 - Yeah, stick to computing, John.
08:51 - Well, this is the thing,
08:52 and this is the general principle,
08:53 is it goes back to our core thing,
08:55 which is like, I don't know whether any of these people
08:56 should be making any of these calls.
08:58 - Yeah.
08:59 - 'Cause there's nothing in either Von Neumann's background
09:01 or Oppenheimer's background,
09:02 or any of these people's background
09:03 that qualifies them as moral authorities.
09:05 - Yeah, well, this actually brings up the point of,
09:07 in AI, who are the good people
09:09 to reason about the morality, the ethics?
09:13 Outside of these risks, like,
09:14 the more complicated stuff that you agree on,
09:18 you know, that this will go into the hands of bad guys,
09:22 and that it's dangerous
09:24 in interesting, unpredictable ways,
09:28 who is the right person,
09:30 who are the right kinds of people,
09:31 to make decisions about how to respond to it?
09:33 Are these tech people?
09:35 - So the history of these fields,
09:37 this is what he talks about in the book,
09:38 the history of these fields is that
09:39 the competence and capability and intelligence
09:43 and training and accomplishments of senior scientists
09:46 and technologists working on a technology,
09:48 and then being able to make moral judgments
09:51 on the use of that technology,
09:52 that track record is terrible.
09:54 That track record is, like, catastrophically bad.
09:57 - The people, just to let you know,
09:58 the people that develop that technology
10:00 are usually not going to be the right people.
10:03 - Well, why would they be?
10:04 So the claim is, of course, they're the knowledgeable ones,
10:06 but the problem is they've spent their entire life
10:08 in a lab, right?
10:09 They're not theologians.
10:11 So what you find when you read this,
10:14 when you look at these histories,
10:15 what you find is they generally are very thinly informed
10:17 on history, on sociology, on theology,
10:21 on morality, on ethics.
10:24 They tend to manufacture their own worldviews from scratch.
10:27 They tend to be very sort of thin.
10:29 Their arguments are not remotely the arguments
10:34 that you would be having if you got, like,
10:35 a group of highly qualified theologians or philosophers,
10:37 or, you know.
10:39 - Well, let me sort of, as the devil's advocate takes
10:42 a sip of whiskey, say that I agree with that,
10:47 but also it seems like the people who are running
10:50 kind of the ethics departments in these tech companies
10:54 sometimes go the other way.
10:56 - Yes.
10:57 They're definitely, yes.
10:59 - They're not nuanced on history or theology
11:03 or this kind of stuff.
11:04 It almost becomes a kind of outraged activism
11:07 towards directions that don't seem to be grounded
11:12 in history and humility and nuance.
11:15 It's, again, drenched with arrogance.
11:17 So I'm not sure which is worse.
11:20 - Oh, no, they're both bad.
11:21 Yeah, so definitely not them either.
11:23 - But I guess-
11:24 - But look, this is a hard-
11:26 - Yeah, it's a hard problem.
11:26 - This is a hard problem.
11:27 And this goes back to where we started,
11:29 which is, okay, who has the truth?
11:30 And it's like, well, you know,
11:32 like how do societies arrive at truth?
11:34 And how do we figure these things out?
11:35 And like our elected leaders play some role in it.
11:38 You know, we all play some role in it.
11:41 There have to be some set of public intellectuals
11:43 at some point that bring rationality and judgment
11:45 and humility to it.
11:46 Those people are few and far between.
11:48 We should probably prize them very highly.
11:50 - Yeah, celebrate humility in our public leaders.
