
PODCAST

Eichmann in Jerusalem: A Report on the Banality of Evil by Hannah Arendt – w/Tom Libby and Jesan Sorrells

00:00 Welcome and Introduction: Eichmann in Jerusalem: A Report on the Banality of Evil by Hannah Arendt 
01:00 The Nature of Conscience-less Leadership

09:39 Thoughtlessness and Evil Uncovered

11:49 Hannah Arendt’s Political Philosophy

19:08 Lost Stories of Past Generations

27:22 Questioning Authority and Responsibility

29:06 Opioid Crisis and Accountability

35:46 AI Accountability and Regulation Needed

41:53 Eichmann’s Distorted Kantian Ethics

46:30 Courage to Say “No”

50:58 Evolving Reactions to Pandemic Information

59:20 AI Search Quality Issues

01:04:38 Leaders Resisting AI Conformity

01:07:05 Navigating Leadership and Feedback

01:14:30 Staying on the Path: Lessons from Eichmann in Jerusalem


Opening and closing themes composed by Brian Sanyshyn of Brian Sanyshyn Music.


★ Support this podcast on Patreon ★

6
00:00:18,880 –> 00:00:22,600
Hello, my name is Jesan Sorrells, and

7
00:00:22,600 –> 00:00:26,160
this is the Leadership Lessons from the Great Books podcast,

8
00:00:26,800 –> 00:00:29,520
episode number 154.

9
00:00:31,040 –> 00:00:34,320
This is going to be kind of a long intro, so bear with me

10
00:00:35,520 –> 00:00:38,840
on this show. We have discussed the writing of

11
00:00:38,840 –> 00:00:41,680
Alexander Solzhenitsyn and talked about

12
00:00:41,840 –> 00:00:45,680
gulags. We’ve addressed the long, dark night of persecution

13
00:00:45,680 –> 00:00:48,880
brought to us in the documentation of Elie Wiesel.

14
00:00:49,440 –> 00:00:53,120
We’ve talked about the importance of leadership from Rabbi Jonathan Sacks.

15
00:00:54,100 –> 00:00:57,140
We have talked about war and peace, slavery and

16
00:00:57,380 –> 00:01:01,180
ethnic cleansing, freedom and conscience, tyranny and

17
00:01:01,180 –> 00:01:04,740
totalitarianism. And we have done it all while groping

18
00:01:04,740 –> 00:01:08,180
towards what those things may mean at a practical level for leaders

19
00:01:08,580 –> 00:01:11,300
in their real lived leadership lives.

20
00:01:13,460 –> 00:01:17,140
But we’ve never to this date on

21
00:01:17,140 –> 00:01:20,580
this show asked or attempted to answer this

22
00:01:20,580 –> 00:01:23,860
disturbing core question. Who

23
00:01:24,100 –> 00:01:27,380
exactly are the types of people, the types of leaders

24
00:01:27,860 –> 00:01:31,220
who would perpetuate bureaucratic orders

25
00:01:31,540 –> 00:01:34,980
without evincing a conscience of any kind?

26
00:01:36,260 –> 00:01:39,780
And what can possibly be done about them?

27
00:01:41,780 –> 00:01:45,580
The post World War II Nuremberg Trials of members of the Nazi government

28
00:01:45,580 –> 00:01:49,220
of Germany sought justice for the outcomes of the bureaucratic acts of

29
00:01:49,220 –> 00:01:52,630
such leaders without addressing the basic psychological and

30
00:01:52,630 –> 00:01:55,590
spiritual question, what type of people were these?

31
00:01:57,190 –> 00:02:00,950
Without fixing such justice within a shared religious, Jewish or Christian

32
00:02:01,190 –> 00:02:04,990
framework, the secular humanist modernists who won the war

33
00:02:04,990 –> 00:02:08,790
by leveraging all the tools at their disposal, including the frightening new weapon of the

34
00:02:08,790 –> 00:02:12,470
atomic bomb, attempted to reassemble a world broken by war and

35
00:02:12,470 –> 00:02:16,240
shocked into silence by the evidence at Auschwitz,

36
00:02:16,310 –> 00:02:19,670
Buchenwald and all the other Nazi concentration camps at

37
00:02:19,670 –> 00:02:22,710
Nuremberg. And they succeeded

38
00:02:23,510 –> 00:02:27,270
for a time. The reassembly of

39
00:02:27,270 –> 00:02:30,310
the world and the justice delivered at Nuremberg reinforced

40
00:02:30,310 –> 00:02:34,070
unambiguously, at least for a little while, via popular

41
00:02:34,070 –> 00:02:37,910
culture and education, and convinced a couple of generations of

42
00:02:37,910 –> 00:02:41,670
Americans to accept that such reassembly was, quote, unquote, just

43
00:02:41,670 –> 00:02:45,220
the way things are. This was assumed as part and parcel of the

44
00:02:45,220 –> 00:02:48,780
secular humanist ethic for most of the remainder of the 20th

45
00:02:48,780 –> 00:02:52,300
century. However, in our contemporary

46
00:02:52,300 –> 00:02:55,460
era, 80 plus years later, this is being

47
00:02:55,460 –> 00:02:59,139
deconstructed or forgotten. Take your pick. As

48
00:02:59,139 –> 00:03:02,860
the post World War II secular liberal world order is falling

49
00:03:02,860 –> 00:03:06,620
apart everywhere as far as those with eyes can see in the

50
00:03:06,620 –> 00:03:10,380
West. The COVID-19 pandemic, in ways both

51
00:03:10,380 –> 00:03:14,190
great and small, brought this question and many others into sharp relief

52
00:03:14,190 –> 00:03:17,790
in the minds of many people in the United States, including

53
00:03:17,870 –> 00:03:21,390
for the last three generations of people, most notably the Gen Xers,

54
00:03:22,030 –> 00:03:24,270
Millennials, and Gen Zers.

55
00:03:25,870 –> 00:03:29,550
These three new generations of post World War II Americans never

56
00:03:29,710 –> 00:03:33,150
stared the atrocities of concentration camps directly in the face.

57
00:03:33,790 –> 00:03:37,590
Generations whose connection to that World War II world is only through grainy black

58
00:03:37,590 –> 00:03:40,820
and white films like the one I just watched this weekend

59
00:03:41,380 –> 00:03:45,100
or via Baby Boomer generated film hagiographies produced in the

60
00:03:45,100 –> 00:03:47,780
last 30 years like Saving Private Ryan,

61
00:03:48,740 –> 00:03:52,220
hagiographies that extol the Greatest Generation and seek to

62
00:03:52,220 –> 00:03:55,420
reinforce the importance of seeking justice in a

63
00:03:55,420 –> 00:03:58,260
secularist, humanist society.

64
00:04:00,980 –> 00:04:04,300
However, confronting the terror of blank bureaucratic

65
00:04:04,300 –> 00:04:07,740
disinterest, governmental insistence on legal and social compliance, and the

66
00:04:07,740 –> 00:04:11,460
application of state power to those who rebel or would rebel

67
00:04:12,080 –> 00:04:15,200
feels new to us living right now.

68
00:04:16,080 –> 00:04:19,880
But these dynamics would have been very familiar to the pre World War II generations

69
00:04:19,880 –> 00:04:23,520
who fought in the trenches, freed the concentration camp prisoners,

70
00:04:23,600 –> 00:04:25,840
and prosecuted the war’s losers.

71
00:04:27,760 –> 00:04:31,400
Today on the podcast we will be talking about an

72
00:04:31,400 –> 00:04:34,400
author who wrote most of her work in direct and vehement

73
00:04:34,640 –> 00:04:38,160
opposition to totalitarianism in the forms of both fascism

74
00:04:38,900 –> 00:04:42,500
and communism. She was unapologetically

75
00:04:42,500 –> 00:04:46,140
politically philosophical during a post World War II era where women were just

76
00:04:46,140 –> 00:04:49,900
finding their feet in the space of political and social philosophy and

77
00:04:49,900 –> 00:04:53,380
where the individual was being gradually morphed into

78
00:04:53,380 –> 00:04:56,660
just another one of the masses.

79
00:04:58,500 –> 00:05:02,060
By the way, the themes that we’re going to be exploring on the show today

80
00:05:02,060 –> 00:05:05,560
dovetail quite nicely with the themes we explored in our previous

81
00:05:05,560 –> 00:05:07,680
episode on 1984.

82
00:05:09,120 –> 00:05:12,400
Orwell would have quite a bit to say to this woman, I think.

83
00:05:14,160 –> 00:05:17,200
Today on the show we will look at the major themes and explore the

84
00:05:17,200 –> 00:05:20,520
controversies within Eichmann in

85
00:05:20,520 –> 00:05:24,240
Jerusalem: A Report on the Banality of Evil

86
00:05:24,720 –> 00:05:28,400
by Hannah Arendt. Leaders,

87
00:05:28,800 –> 00:05:32,120
I am personally convinced, and this is why I’m doing this book on the show

88
00:05:32,120 –> 00:05:35,680
today, that we are forgetting the moral and ethical

89
00:05:35,680 –> 00:05:38,860
lessons of Nuremberg, and that is to our detriment

90
00:05:39,500 –> 00:05:43,300
in the West. And today on the

91
00:05:43,300 –> 00:05:47,020
show I’m of course joined by our regular contributor Tom

92
00:05:47,100 –> 00:05:49,340
Libby. How you doing today, Tom?

93
00:05:51,100 –> 00:05:54,460
I am ecstatic to be here today.

94
00:05:54,460 –> 00:05:57,660
Jesan, I,

95
00:05:58,700 –> 00:06:02,540
I thought that would get a reaction from you. Well,

96
00:06:02,540 –> 00:06:06,300
it’s, it’s going to be a thing today. This was a book I read

97
00:06:06,300 –> 00:06:08,860
probably about a year ago

98
00:06:10,540 –> 00:06:13,100
and I wasn’t going to do it on the show because we do a lot

99
00:06:13,100 –> 00:06:16,060
of heavy books on the show that are similar or that are in this vein.

100
00:06:17,660 –> 00:06:21,460
And then current events, you know, started catching

101
00:06:21,460 –> 00:06:24,980
up with me a little bit. And I started thinking about some things that I’ve

102
00:06:24,980 –> 00:06:28,020
been seeing online. And I thought, this book is a book that leaders need to

103
00:06:28,020 –> 00:06:31,780
at least consider reading, and we at least need to consider offering some of our

104
00:06:31,780 –> 00:06:35,150
thoughts on it to them. Um, and then I went and looked at Hannah Arendt

105
00:06:35,150 –> 00:06:38,790
and I looked at her life and I looked at her. Her opinions on

106
00:06:38,790 –> 00:06:42,630
things. I sent you, actually, a YouTube video from her

107
00:06:43,030 –> 00:06:46,590
foundation or whatever, and I watched that whole interview, and it was

108
00:06:46,590 –> 00:06:50,430
fascinating to listen to how she had grown up and what she thought

109
00:06:50,430 –> 00:06:53,910
about herself and especially what she thought about this book,

110
00:06:54,470 –> 00:06:57,910
particularly as it generated so much controversy when it was initially published and

111
00:06:58,150 –> 00:07:01,880
continues to generate controversy. So I think

112
00:07:01,880 –> 00:07:05,680
there are valuable lessons for leaders around that question of what kind of people.

113
00:07:07,040 –> 00:07:09,920
Well, what kind of people just follow orders?

114
00:07:10,640 –> 00:07:14,480
And how, as leaders, do we. What responsibility do we

115
00:07:14,480 –> 00:07:18,240
have for not allowing that to happen on our watch? So.

116
00:07:21,280 –> 00:07:25,040
So let’s pick up with Eichmann in Jerusalem. I’m going to read a

117
00:07:25,040 –> 00:07:28,680
couple of pages from this. By the way, you could pick up this book, a

118
00:07:28,680 –> 00:07:32,520
PDF version of this book online. Okay. So there is a publicly

119
00:07:32,520 –> 00:07:36,320
available copy. And while I’m not reading from the publicly

120
00:07:36,320 –> 00:07:39,560
available copy today, you can also pick up

121
00:07:39,960 –> 00:07:43,240
and download the original New Yorker

122
00:07:43,480 –> 00:07:47,080
correspondence, the original New Yorker article that she wrote that

123
00:07:47,080 –> 00:07:50,200
eventually became this book back in the early

124
00:07:50,600 –> 00:07:54,320
1960s. And I

125
00:07:54,320 –> 00:07:58,080
quote from Eichmann in Jerusalem. This is from the postscript. This is her wrapping up.

126
00:07:58,080 –> 00:08:01,730
This is Hannah Arendt wrapping up. There is, of course, no

127
00:08:01,730 –> 00:08:05,450
doubt that the defendant, and that was Adolf Eichmann, and the nature of his acts,

128
00:08:05,450 –> 00:08:08,690
as well as the trial itself, raised problems of a general nature which go far

129
00:08:08,690 –> 00:08:12,370
beyond the matters considered in Jerusalem. I have

130
00:08:12,370 –> 00:08:15,370
attempted to go into some of these problems in the epilogue, which ceases to be

131
00:08:15,370 –> 00:08:19,130
simple reporting. I would not have been surprised if people had found my treatment

132
00:08:19,130 –> 00:08:22,330
inadequate. And I would have welcomed a discussion of the general significance of the entire

133
00:08:22,330 –> 00:08:25,850
body of facts, which could have been made all the more meaningful the more directly

134
00:08:25,850 –> 00:08:29,430
it referred to the concrete events. I can also well imagine that an

135
00:08:29,430 –> 00:08:32,670
authentic controversy might have arisen over the subtitle of the book.

136
00:08:33,230 –> 00:08:36,710
For when I speak of the banality of evil, I do so on the

137
00:08:36,710 –> 00:08:40,190
strictly factual level, pointing to a phenomenon which

138
00:08:40,190 –> 00:08:44,030
stared one in the face at the trial. Eichmann was not

139
00:08:44,030 –> 00:08:47,710
Iago and not Macbeth, and nothing would have been

140
00:08:47,710 –> 00:08:51,510
farther from his mind than to determine with Richard III, quote, to prove a

141
00:08:51,510 –> 00:08:55,360
villain, except for an extraordinary diligence in looking

142
00:08:55,360 –> 00:08:58,920
out for his personal advancement. He had no motives

143
00:08:59,480 –> 00:09:02,880
at all. And this diligence in

144
00:09:02,880 –> 00:09:06,680
itself was in no way criminal. He certainly would never

145
00:09:06,680 –> 00:09:10,280
have murdered his superior in order to inherit his post. He merely, to put

146
00:09:10,280 –> 00:09:13,880
the matter colloquially, never realized what he was doing.

147
00:09:14,680 –> 00:09:18,320
It was precisely this lack of imagination which enabled him to sit for months on

148
00:09:18,320 –> 00:09:22,120
end facing a German Jew who was conducting the police interrogation, pouring

149
00:09:22,120 –> 00:09:24,910
out his heart to the man and explaining again and again how it was that

150
00:09:24,980 –> 00:09:27,660
he reached only the rank of Lieutenant Colonel in the SS and that it

151
00:09:27,660 –> 00:09:31,260
had not been his fault that he was not promoted.

152
00:09:31,260 –> 00:09:34,060
In principle, he knew quite well what it was all about. And in his final statement

153
00:09:34,060 –> 00:09:37,820
to the court, he spoke of the quote revaluation of values prescribed

154
00:09:37,820 –> 00:09:41,299
by the Nazi government, unquote. He was not stupid.

155
00:09:41,780 –> 00:09:45,540
It was sheer thoughtlessness, something by no means identical with

156
00:09:45,540 –> 00:09:49,300
stupidity, that predisposed him to become one of the greatest criminals of that period.

157
00:09:49,460 –> 00:09:53,310
And if this is banal and even funny, if with the best

158
00:09:53,310 –> 00:09:57,070
will in the world one cannot extract any diabolical

159
00:09:57,070 –> 00:10:00,750
or demonic profundity from Eichmann, that is still

160
00:10:00,750 –> 00:10:04,590
far from calling it commonplace. It surely cannot be so common

161
00:10:04,590 –> 00:10:07,510
that a man facing death and moreover standing beneath the gallows should be able to

162
00:10:07,510 –> 00:10:10,910
think of nothing but what he has heard at funerals all his life, and that

163
00:10:10,910 –> 00:10:14,550
these quote unquote lofty words should completely becloud the reality of his own death.

164
00:10:14,630 –> 00:10:18,350
That such remoteness from reality and such thoughtlessness can wreak more havoc than all the

165
00:10:18,350 –> 00:10:21,270
evil instincts taken together, which perhaps are inherent in man.

166
00:10:22,160 –> 00:10:25,360
That was, in fact, the lesson one could learn in Jerusalem.

167
00:10:26,640 –> 00:10:29,680
But it was a lesson neither an explanation of the phenomenon

168
00:10:30,400 –> 00:10:32,400
nor a theory about it.

169
00:10:34,400 –> 00:10:38,240
Seemingly more complicated, but in reality, far simpler than examining the

170
00:10:38,240 –> 00:10:41,960
strange interdependence of thoughtlessness and evil is the question of what kind of crime is

171
00:10:41,960 –> 00:10:45,200
actually involved here. A crime, moreover, which all agree, is

172
00:10:45,200 –> 00:10:48,890
unprecedented. For the concept of genocide, introduced

173
00:10:48,890 –> 00:10:52,330
explicitly to cover a crime unknown before, although applicable up to a point, is not

174
00:10:52,330 –> 00:10:56,010
fully adequate, for the simple reason that massacres of whole peoples

175
00:10:56,010 –> 00:10:59,770
are not unprecedented. They were the order of the day

176
00:10:59,770 –> 00:11:03,450
in antiquity, and the centuries of colonization and imperialism provide plenty of

177
00:11:03,450 –> 00:11:06,610
examples of more or less successful attempts of that sort.

178
00:11:07,250 –> 00:11:10,690
The expression administrative massacre seems better to fill the bill.

179
00:11:11,570 –> 00:11:15,250
The term arose in connection with British imperialism. The English deliberately

180
00:11:15,250 –> 00:11:19,050
rejected such procedures as a means of maintaining their rule over India. The phrase has

181
00:11:19,050 –> 00:11:22,710
the virtue of dispelling the prejudice that such monstrous acts could be committed only against

182
00:11:22,710 –> 00:11:26,470
a foreign nation or a different race. There is

183
00:11:26,470 –> 00:11:30,150
the well known fact that Hitler began his mass murders by granting mercy

184
00:11:30,150 –> 00:11:33,950
deaths to the incurably ill and that he intended to wind up his extermination

185
00:11:33,950 –> 00:11:37,590
program by doing away with genetically damaged Germans, heart and

186
00:11:37,590 –> 00:11:41,270
lung patients. But quite aside from that, it is apparent that this

187
00:11:41,270 –> 00:11:44,830
sort of killing can be directed against any given group, that is that the

188
00:11:44,830 –> 00:11:48,270
principle of selection is dependent only on circumstantial

189
00:11:48,270 –> 00:11:51,970
factors. And it is quite conceivable that in

190
00:11:51,970 –> 00:11:55,450
the automated economy of a not too distant future

191
00:11:56,250 –> 00:11:59,770
men may be tempted to exterminate all those whose

192
00:11:59,770 –> 00:12:03,370
intelligence quotient is below a certain

193
00:12:04,090 –> 00:12:04,490
level.

194
00:12:15,640 –> 00:12:19,080
We talked in my introductory episode to this episode, and I would go back and listen

195
00:12:19,080 –> 00:12:22,760
to that, which was episode 153. We talked about Hannah Arendt and her political

196
00:12:23,160 –> 00:12:26,960
philosophies and her background and where she came from. We also have

197
00:12:26,960 –> 00:12:30,680
a link in the show notes to the video where she talks about her own

198
00:12:30,840 –> 00:12:33,960
background. I would recommend going and watching that

199
00:12:35,320 –> 00:12:37,800
either after you finish listening to this episode or before

200
00:12:41,410 –> 00:12:45,090
In that episode 153, I did mention

201
00:12:45,090 –> 00:12:48,690
this: that Arendt constantly prioritized political

202
00:12:48,770 –> 00:12:52,290
over social questions and that this got her into trouble

203
00:12:52,450 –> 00:12:55,890
over various years as people insisted on pushing her

204
00:12:56,050 –> 00:12:59,730
towards a political understanding rather than a social understanding

205
00:13:00,050 –> 00:13:02,690
of things like, well, evil.

206
00:13:05,410 –> 00:13:09,210
Arendt is, of course, remembered for the controversy surrounding her reporting on the trial of

207
00:13:09,210 –> 00:13:12,850
Adolf Eichmann, particularly the conclusion that she wrote

208
00:13:12,850 –> 00:13:16,650
there. I think she was attempting to explain

209
00:13:16,650 –> 00:13:20,090
to herself as a German Jew who had been arrested by the

210
00:13:20,090 –> 00:13:23,770
Gestapo and had just barely missed going to Auschwitz

211
00:13:23,770 –> 00:13:24,290
herself.

212
00:13:28,530 –> 00:13:32,210
She was trying to look at Eichmann and explain who

213
00:13:32,210 –> 00:13:35,890
this person was in a way that seemed, in a post World War

214
00:13:35,890 –> 00:13:39,540
II context, rational without converting or

215
00:13:39,540 –> 00:13:43,300
reverting to a spiritual explanation. And she had

216
00:13:43,300 –> 00:13:46,700
nothing but spiritual language to describe what she was seeing there.

217
00:13:49,500 –> 00:13:52,940
However, her explanation was seen as an apologia, an apology

218
00:13:53,740 –> 00:13:57,380
for totalitarian systems and for how ordinary people become actors in

219
00:13:57,380 –> 00:14:01,220
those systems and for the phrase the banality

220
00:14:01,220 –> 00:14:04,890
of evil. Some thought that she was robbing

221
00:14:04,890 –> 00:14:08,330
evil of its power by claiming it or making the claim

222
00:14:08,490 –> 00:14:10,650
that because Eichmann was thoughtless

223
00:14:12,170 –> 00:14:15,690
he could also be evil. She wasn’t saying

224
00:14:15,690 –> 00:14:19,490
those two things at all. She was saying that thoughtlessness led to

225
00:14:19,490 –> 00:14:23,130
his evil. And this paraphrases or this goes along with something that we read

226
00:14:23,130 –> 00:14:26,010
about in our episode nine covering 1984

227
00:14:26,730 –> 00:14:30,060
and talking about Orwell and the English language. Back in the

228
00:14:30,060 –> 00:14:33,900
1940s, Orwell made the point

229
00:14:33,900 –> 00:14:37,580
that we are lazy in our thoughts and

230
00:14:37,580 –> 00:14:41,180
thus our speech has become lazy. And when our speech becomes

231
00:14:41,180 –> 00:14:44,980
lazy, our thoughts become lazy. I think Arendt would

232
00:14:44,980 –> 00:14:48,780
agree. And lazy speech and lazy thoughts

233
00:14:48,780 –> 00:14:50,900
lead to bureaucrats who just

234
00:14:51,940 –> 00:14:55,670
comply thoughtlessly and commit

235
00:14:55,670 –> 00:14:59,070
evil. As

236
00:14:59,070 –> 00:15:02,830
usual or not as usual. I often

237
00:15:02,990 –> 00:15:06,670
send Tom resources on the books that we are reading. And I don’t think Tom

238
00:15:06,670 –> 00:15:08,830
had read this book before or even knew of its existence.

239
00:15:10,829 –> 00:15:14,670
So I’m going to ask Tom the typical question. I don’t know

240
00:15:14,670 –> 00:15:17,910
if you watched the video. I know you clicked on the other link and read

241
00:15:17,910 –> 00:15:21,000
the other thing and we’ll talk about that in a minute. But what did you

242
00:15:21,000 –> 00:15:24,440
think? What were your first impressions of Hannah Arendt? And I know we just

243
00:15:24,440 –> 00:15:27,480
read that little piece there from Eichmann in Jerusalem, but what do you think of

244
00:15:27,480 –> 00:15:31,200
her, some of her ideas here that I’ve sort of laid out so far?

245
00:15:33,040 –> 00:15:36,880
Well, I think, I think interestingly, first of all, I

246
00:15:36,880 –> 00:15:40,520
tried, I did watch some of the video. I didn’t finish the whole thing, unfortunately,

247
00:15:40,520 –> 00:15:44,240
because I just ran out of time. Yeah, it’s long. It’s an hour.

248
00:15:44,240 –> 00:15:47,920
It’s long. Yeah, it’s an hour. And I played it on 1.25

249
00:15:48,080 –> 00:15:51,890
speed, even though you have to read the whole thing. So I was even trying

250
00:15:51,890 –> 00:15:54,730
to get through it faster by trying to speed it up and reading it was

251
00:15:54,730 –> 00:15:57,370
fine. I had no problem reading it at 1.25 speed. But.

252
00:15:58,410 –> 00:16:00,010
But I find.

253
00:16:02,250 –> 00:16:05,610
I had a lot of mixed emotions watching and

254
00:16:06,170 –> 00:16:09,530
hearing her speak.

255
00:16:11,690 –> 00:16:15,410
She seemed ahead of her time. First of

256
00:16:15,410 –> 00:16:19,220
all, number one, and I don’t mean that in a negative or positive way, just

257
00:16:19,220 –> 00:16:22,860
in the sense that being a woman back then,

258
00:16:23,900 –> 00:16:27,660
being at one

259
00:16:27,660 –> 00:16:31,340
point labeled a philosopher was unheard of for a woman back then.

260
00:16:31,660 –> 00:16:35,500
And then people tried to pigeonhole her. And even

261
00:16:35,500 –> 00:16:39,180
in the conversation you can tell the person doing the interview

262
00:16:39,180 –> 00:16:42,780
is trying to like kind of force her to answer things in a kind of

263
00:16:42,780 –> 00:16:46,540
certain way. She wasn’t having it. Which again, for a woman back then

264
00:16:47,040 –> 00:16:50,600
was pretty strong-willed. Oh yeah. Whether you like her or not or disagree

265
00:16:50,600 –> 00:16:53,920
with or agree with her or not, I do think for the time

266
00:16:54,320 –> 00:16:57,960
she was relatively unique. She was relatively unique in the

267
00:16:57,960 –> 00:17:01,680
sense that she didn’t worry about

268
00:17:01,680 –> 00:17:05,400
voicing her opinions. She wasn’t concerned about repercussions

269
00:17:05,400 –> 00:17:08,880
about her thought processes or anything like that. And, and to your point

270
00:17:08,960 –> 00:17:12,720
about her, you know, with the evil scenario, and

271
00:17:12,720 –> 00:17:16,440
I’m just paraphrasing one of her more famous quotes and I don’t know the

272
00:17:16,440 –> 00:17:19,960
quote well enough to quote it, but paraphrasing it basically saying,

273
00:17:20,280 –> 00:17:23,960
you know, someone who, who performs an evil act isn’t inherently

274
00:17:23,960 –> 00:17:27,600
evil. That just means they just did some stupid stuff. Like, you know,

275
00:17:27,600 –> 00:17:31,280
But we don’t view that the same way

276
00:17:31,280 –> 00:17:34,600
anymore. We look at how a single

277
00:17:34,760 –> 00:17:36,920
act could basically

278
00:17:38,210 –> 00:17:41,770
classify somebody for the rest of their life. Again, like we look at

279
00:17:41,770 –> 00:17:45,170
even psychology today. You have, you know, an eight-year-old kid who

280
00:17:45,170 –> 00:17:48,770
purposefully goes out of his way to harm a small animal.

281
00:17:48,770 –> 00:17:51,890
And now all of a sudden that kid is being followed by,

282
00:17:52,450 –> 00:17:55,610
you know, people so closely because they just think he’s going to become a serial

283
00:17:55,610 –> 00:17:59,210
killer. Right. Because he’s evil. Right. He hurt some little animal when he was 8

284
00:17:59,210 –> 00:18:02,930
years old, so now he’s evil. And in her philosophy that one act

285
00:18:02,930 –> 00:18:06,430
does not necessarily make that person evil. That

286
00:18:06,510 –> 00:18:10,150
it’s, you know, you have to be an evil person to be considered evil.

287
00:18:10,150 –> 00:18:13,910
You can’t just perform an evil act. So. Right. And that, that goes

288
00:18:13,910 –> 00:18:17,390
back to your point, when you were talking earlier about

289
00:18:17,870 –> 00:18:20,910
some of our disassociation with,

290
00:18:22,910 –> 00:18:26,110
I mean, let’s face it, that was pure evil. Like what happened.

291
00:18:27,230 –> 00:18:30,990
But we’re so removed from it now that

292
00:18:30,990 –> 00:18:34,590
we don’t view it the same way. And we’re starting to see

293
00:18:34,830 –> 00:18:38,110
the,

294
00:19:39,310 –> 00:19:42,390
you know, again in films, the

295
00:18:42,390 –> 00:18:45,910
glorification of the outcome. Right. Like, so the

296
00:18:45,910 –> 00:18:49,750
glorification of the outcome downplays the seriousness of the

297
00:18:49,750 –> 00:18:53,470
actual act, which I think is why we’re

298
00:18:53,470 –> 00:18:56,910
becoming desensitized to some of this stuff. Right. We’re losing some of that,

299
00:18:57,360 –> 00:19:01,120
that guttural. Most of the

300
00:19:01,120 –> 00:19:04,920
World War II veterans are starting, they’re dying off at like a 15000

301
00:19:04,920 –> 00:19:08,400
a day clip or something like that. There aren’t very many left. Yep.

302
00:19:08,480 –> 00:19:12,280
I remember growing up listening to their stories thinking, oh my good God,

303
00:19:12,280 –> 00:19:15,760
I hope this never happens again. Our next generations

304
00:19:15,920 –> 00:19:19,280
are not hearing those stories from real life people who were there.

305
00:19:19,840 –> 00:19:23,600
Like, you know what I mean? Like, so that they’re not getting that direct,

306
00:19:23,920 –> 00:19:27,640
that direct impact: my grandfather, my great-grandfather, you know,

307
00:19:27,640 –> 00:19:31,180
whatever, had to live through this, had to fight this

308
00:19:31,180 –> 00:19:35,020
evil. Like they don’t have that. They’re getting this glorification of

309
00:19:35,020 –> 00:19:38,540
it from cinema and books that people who

310
00:19:38,540 –> 00:19:42,260
weren’t there are writing and they’re not going back and reading her

311
00:19:42,260 –> 00:19:46,100
stuff, Hannah’s stuff, because it’s not modern and

312
00:19:46,100 –> 00:19:49,420
popular anymore. So they’re just reading today. I don’t know. I know I’m going

313
00:19:49,420 –> 00:19:52,780
off on a tangent here, but. No, no, no, I don’t. I think you’re onto

314
00:19:52,780 –> 00:19:56,000
something. I mean, look, I’ll use an example, because we always talk

315
00:19:56,000 –> 00:19:59,160
about film eventually. So I’ll do it early. Always. Always.

316
00:20:00,680 –> 00:20:04,240
I like The Dark Knight, the Batman movie from the mid

317
00:20:04,240 –> 00:20:07,760
2000s. I liked The Dark Knight. Right. The main character of The Dark Knight was

318
00:20:07,760 –> 00:20:11,480
the Joker. The Joker is a sociopath. Yeah. He’s an element.

319
00:20:11,560 –> 00:20:15,280
He’s represented there as an element of chaos. And

320
00:20:15,280 –> 00:20:18,400
because it’s a movie and because to your point, we’ve been

321
00:20:18,400 –> 00:20:21,720
desensitized. You know, he blows up a hospital building,

322
00:20:23,880 –> 00:20:27,720
you’re a sociopath. Right. Like,

323
00:20:27,800 –> 00:20:31,640
the constant pushback or challenge to Batman as a character.

324
00:20:31,800 –> 00:20:33,720
Even going back into the DC Comics,

325
00:20:35,480 –> 00:20:38,440
the constant question is, if you have the ability to kill that guy, why don’t

326
00:20:38,440 –> 00:20:42,280
you kill that guy? Because every time you let that guy go or let that

327
00:20:42,280 –> 00:20:45,640
guy get locked up in Arkham Asylum for the mentally insane, he’s going to escape,

328
00:20:45,640 –> 00:20:49,250
you know this, and he’s going to kill more people. So:

329
00:20:49,250 –> 00:20:52,690
What culpability do you have, Bruce,

330
00:20:53,890 –> 00:20:57,490
which was my father’s name, interestingly enough. What culpability do you have, Bruce?

331
00:20:57,490 –> 00:21:01,170
Mr. Wayne. Right. And we don’t deal with

332
00:21:01,170 –> 00:21:05,010
any of that. And of course, you know, The Dark Knight ends with, you

333
00:21:05,010 –> 00:21:08,810
know, the Joker getting locked up, of

334
00:21:08,810 –> 00:21:11,970
course, because, you know, if Heath Ledger hadn’t died, I think he probably would have

335
00:21:11,970 –> 00:21:15,450
been in the sequel. But my point is

336
00:21:15,530 –> 00:21:18,330
that that desensitization

337
00:21:19,050 –> 00:21:22,570
leads us, in an attempt to

338
00:21:22,570 –> 00:21:25,770
deconstruct everything or find truth everywhere,

339
00:21:27,130 –> 00:21:29,370
to try to find truth in villainy,

340
00:21:30,810 –> 00:21:34,610
and there’s no truth in villainy. There’s no truth in,

341
00:21:34,610 –> 00:21:37,570
I’m going to use the word, evil. But to your

342
00:21:37,570 –> 00:21:41,100
point, we’ve been desensitized to it. Plus, I have younger kids.

343
00:21:41,180 –> 00:21:44,820
And then you do, too. Your kid. Your younger kids are

344
00:21:44,820 –> 00:21:48,660
probably older than my younger kids. But I look at my younger kid,

345
00:21:48,660 –> 00:21:51,500
my youngest kid, who was born in the mid 2000s,

346
00:21:52,460 –> 00:21:55,460
and I just go. I think I was thinking about the other day, I was

347
00:21:55,460 –> 00:21:58,900
like, he’s going to live to the end of this century. He’s going to be

348
00:21:58,900 –> 00:22:02,700
so far away from World War II, it’s going to seem like ancient

349
00:22:02,860 –> 00:22:06,670
history to him. Yeah. And

350
00:22:06,670 –> 00:22:09,670
so how do we pass along. This is something I’m obsessed with, which is one

351
00:22:09,670 –> 00:22:12,550
of the reasons why I do this podcast, one of the many reasons. How do

352
00:22:12,550 –> 00:22:16,310
we pass along the lessons from the old things to people so they

353
00:22:16,310 –> 00:22:20,150
keep that visceral pull. You just read it in the passage you

354
00:22:20,150 –> 00:22:23,870
were talking about. Here’s the lies. The biggest problem in

355
00:22:23,870 –> 00:22:27,270
my opinion. It doesn’t matter. It doesn’t matter.

356
00:22:28,550 –> 00:22:32,190
Listen, World War II and the genocide of the Jewish people was not the first

357
00:22:32,190 –> 00:22:36,030
time something like this has happened. No, it was not. And by the way, it

358
00:22:36,030 –> 00:22:39,310
also wasn’t the last. No, it was not. So. So

359
00:22:39,630 –> 00:22:42,190
it, it. What frustrates the absolute

360
00:22:42,830 –> 00:22:46,350
bonkers out of me is that we have not figured out

361
00:22:46,510 –> 00:22:50,310
a way to actually learn from the past and not

362
00:22:50,310 –> 00:22:53,910
reproduce it and not go and, and do this stuff all over

363
00:22:53,910 –> 00:22:57,670
again. We’ve had genocidal actions

364
00:22:57,670 –> 00:23:01,350
throughout the course of history. Go look, look, in the course of history, it doesn’t

365
00:23:01,350 –> 00:23:04,470
like, pick an era. It doesn’t even matter. Pick an era. And I,

366
00:23:04,950 –> 00:23:08,350
I’m gonna be bold and say within 60 or

367
00:23:08,350 –> 00:23:11,830
75 years of any date you pick in history, there was some form of

368
00:23:11,990 –> 00:23:15,750
genocide. I’ll challenge people on that. Because

369
00:23:16,310 –> 00:23:20,150
even within the, the history of

370
00:23:20,150 –> 00:23:23,870
the United States alone, 250 years, whatever it was, I think we

371
00:23:23,870 –> 00:23:27,340
just know. Sorry, 400. We just celebrated the.

372
00:23:28,300 –> 00:23:31,420
Actually so where I live, the town of Plymouth,

373
00:23:31,420 –> 00:23:34,940
Massachusetts just celebrated a few years ago

374
00:23:34,940 –> 00:23:37,900
its 400th anniversary. 400 years.

375
00:23:38,300 –> 00:23:41,340
Yep. Let me just remind everybody that

376
00:23:42,060 –> 00:23:45,860
from 4. Even in our own 400 year history, you cannot put a

377
00:23:45,860 –> 00:23:49,340
pin in within 70 years and not find a genocidal act

378
00:23:49,900 –> 00:23:53,620
in our history. I’m not talking about on US soil, I’m just saying in

379
00:23:53,620 –> 00:23:56,780
the world’s history and just in the, in the existence of the United States

380
00:23:57,370 –> 00:24:01,010
from 1620 to now, it doesn’t happen. There’s

381
00:24:01,010 –> 00:24:04,690
been the Khmer Rouge, there’s been World War II, the Jewish genocide, there’s been

382
00:24:04,690 –> 00:24:08,370
Native American genocide here in the United States. There’s been the Cambodian genocide. There’s

383
00:24:08,370 –> 00:24:11,970
been. It repeats itself over and over. So until

384
00:24:11,970 –> 00:24:15,650
we figure out a way. And by the way, I’m just, I’m gonna throw

385
00:24:15,650 –> 00:24:19,290
cinema back in there again because we don’t even

386
00:24:19,290 –> 00:24:23,050
learn our lesson in movies. One of the largest grossing movies

387
00:24:23,050 –> 00:24:26,490
of our, of our time right now, Avatar.

388
00:24:26,810 –> 00:24:30,490
If you. The little blue, you know the giant blue on the planet.

389
00:24:30,810 –> 00:24:34,490
Yep. Did you, does anybody ever stop and wonder why

390
00:24:34,650 –> 00:24:37,530
the blue people, if you listen to their act,

391
00:24:38,970 –> 00:24:42,610
if you listen to their accents, they sound either African

392
00:24:42,610 –> 00:24:45,770
or Native American. There’s a reason for that.

393
00:24:47,210 –> 00:24:51,060
There’s a reason that they picked those two cultures to be

394
00:24:51,060 –> 00:24:54,700
represented on that movie. And by the way, just

395
00:24:54,700 –> 00:24:58,460
where it happens to turn out the other way, where

396
00:24:58,460 –> 00:25:02,260
they win. Great for them. But that didn’t happen so much here in the. On

397
00:25:02,260 –> 00:25:05,500
our planet. But, but the. But, but, but that’s what I’m saying. When we

398
00:25:05,500 –> 00:25:09,220
glorify some of these things in movies, right? Like, so we’re glorifying

399
00:25:09,220 –> 00:25:12,540
evil. And. And by the way, in that movie with Heath Ledger,

400
00:25:12,940 –> 00:25:16,540
everybody I know, Heath Ledger was their favorite character in that movie.

401
00:25:16,700 –> 00:25:20,240
Yeah. Oh, yeah. So, yeah. So we’re picking our favorite

402
00:25:20,240 –> 00:25:23,840
characters as the. As the personification of evil in that

403
00:25:23,840 –> 00:25:27,560
movie. We’re picking a movie and giving it the most amount of

404
00:25:27,560 –> 00:25:30,720
money any movie has ever made for the

405
00:25:30,800 –> 00:25:33,120
annihilation of another planet. And another.

406
00:25:34,480 –> 00:25:38,280
Why, until we start learning these lessons of, like, we shouldn’t be doing this

407
00:25:38,280 –> 00:25:41,560
stuff, we’re gonna continue. So I. I hate to say it this way, Jesan. This

408
00:25:41,560 –> 00:25:44,540
is what I’m getting at. I know it’s long winded. No, no, you’re fine. Your

409
00:25:44,540 –> 00:25:46,740
son is going to see this happen again.

410
00:25:48,020 –> 00:25:51,740
Like, he’s going to see it. Whether. Whether he remembers World War II or not,

411
00:25:51,740 –> 00:25:55,340
whether he learns it in his history books or not, it’s going to happen. He’s

412
00:25:55,340 –> 00:25:59,019
going to see it live. He himself is going to experience this, because we all

413
00:25:59,019 –> 00:26:02,660
have in one form. And it frustrates the hell out of

414
00:26:02,660 –> 00:26:06,340
me. So the last piece of this is where I think my son

415
00:26:06,340 –> 00:26:09,670
and people who are born in his generation will see it. This is why I

416
00:26:09,670 –> 00:26:12,830
sent you the. The one article I did. And we’re going to talk about this

417
00:26:12,830 –> 00:26:15,590
in the next section, but I’m gonna. I’m gonna. I’m gonna put it right here.

418
00:26:15,670 –> 00:26:19,430
It is quite conceivable. This is from Hannah Arendt. It is quite conceivable

419
00:26:19,430 –> 00:26:22,630
that in the automated economy of the not too distant future,

420
00:26:23,270 –> 00:26:26,310
men may be tempted to exterminate all those

421
00:26:26,630 –> 00:26:30,390
whose intelligence quotient is below a certain level.

422
00:26:30,870 –> 00:26:34,230
Yeah. Yeah. That’s where we’re going,

423
00:26:34,310 –> 00:26:37,870
kids. So it may not have anything to do with the color of your skin

424
00:26:37,870 –> 00:26:41,660
or the place you were born or whatever, but. But that’s still the. It’s

425
00:26:41,660 –> 00:26:45,460
a genocidal act either way. When. When the guy who’s running

426
00:26:45,460 –> 00:26:49,300
the World Economic Forum or was slated to run the

427
00:26:49,300 –> 00:26:53,140
World Economic Forum, Yuval Noah Harari, who wrote a book

428
00:26:53,140 –> 00:26:56,700
called Sapiens and believes that there’s no such thing as free will.

429
00:26:56,860 –> 00:27:00,620
Right? When. When he says that most people

430
00:27:00,780 –> 00:27:04,540
on the Earth are. And again, this is a direct quote from him. You can

431
00:27:04,540 –> 00:27:07,900
go to Google and find it. Are quote unquote, useless eaters.

432
00:27:15,360 –> 00:27:17,600
And he’s merely thinking bureaucratically.

433
00:27:22,160 –> 00:27:26,000
Okay, so I’m a useless eater and my kid’s a useless eater, and

434
00:27:26,000 –> 00:27:29,840
Tom’s kids are useless eaters and Tom’s a useless eater. But you, you’ve all,

435
00:27:30,400 –> 00:27:31,920
somehow, you’ve ascended

436
00:27:34,810 –> 00:27:38,250
to what? And really Yuval is not really the problem

437
00:27:38,330 –> 00:27:42,090
because he’s the tippy top of the mountain. The, the problem

438
00:27:42,250 –> 00:27:46,050
is all the little Eichmanns, all

439
00:27:46,050 –> 00:27:49,650
the little bureaucrats that are going to help him get where he

440
00:27:49,650 –> 00:27:53,330
wants to go. And those are the people, I think, that have to be

441
00:27:53,330 –> 00:27:57,090
disrupted. That’s where we have to go. And so the question that

442
00:27:57,090 –> 00:28:00,690
lays before us today, as I said in my very long opening there is

443
00:28:00,690 –> 00:28:01,930
what kind of people,

444
00:28:05,320 –> 00:28:08,800
what kind of people are these? So I’ll use, I’ll use a, I’ll use a

445
00:28:08,800 –> 00:28:11,080
modern example. So the Sacklers, right?

446
00:28:12,200 –> 00:28:15,720
Sackler family, who created, well, not created, but

447
00:28:15,960 –> 00:28:19,400
they were the ones that owned the pharmaceutical companies that

448
00:28:19,480 –> 00:28:23,320
allowed the opioid crisis to develop. They

449
00:28:23,320 –> 00:28:26,840
were sued to the tune of

450
00:28:26,840 –> 00:28:30,120
several millions, billions dollars, whatever. It’s a large number.

451
00:28:30,460 –> 00:28:33,660
Okay, we’re going to drain the

452
00:28:33,660 –> 00:28:37,460
Sacklers of their money. But that, the money is

453
00:28:37,460 –> 00:28:40,980
not the thing. Because eventually after you

454
00:28:40,980 –> 00:28:44,460
spread the money out to all the victims and the states get all their cut

455
00:28:44,540 –> 00:28:48,100
and the taxes get taken out, you still

456
00:28:48,100 –> 00:28:51,500
get less money than the life would be

457
00:28:51,500 –> 00:28:54,940
worth or the value that would be created by a person

458
00:28:55,910 –> 00:28:59,630
who, to your point, you use the word genocide. I would not use

459
00:28:59,630 –> 00:29:03,270
that term with. I think a term has to be kept very

460
00:29:03,270 –> 00:29:07,030
specially for certain specific things. Sure. But when a high

461
00:29:07,030 –> 00:29:10,870
school student gets on an

462
00:29:10,870 –> 00:29:14,310
opioid for pain medication,

463
00:29:14,790 –> 00:29:18,230
then can’t get off of it because it’s so addictive, then gets into

464
00:29:18,230 –> 00:29:22,080
heroin and fentanyl and then OD’s, you know, 10

465
00:29:22,080 –> 00:29:25,640
years later or five years later. And the Sackler

466
00:29:25,640 –> 00:29:29,040
family is paying out billions of dollars

467
00:29:29,200 –> 00:29:32,880
in aggregate. But what will come out to be merely thousands of

468
00:29:32,880 –> 00:29:36,640
dollars of the life of that dead high school student, and by the way, hundreds

469
00:29:36,640 –> 00:29:40,000
of thousands of other dead high school students across the United States in the last

470
00:29:40,240 –> 00:29:43,200
15 years, just at minimum, 15 years.

471
00:29:47,770 –> 00:29:51,290
Where was the administrator inside of the Sackler

472
00:29:51,690 –> 00:29:54,650
foundation that said, this is evil, this has to stop.

473
00:29:55,370 –> 00:29:59,090
Where was the little Eichmann in there that needed to get

474
00:29:59,090 –> 00:30:02,490
checked? Heck, I’ll go a step further. Where was the leader

475
00:30:02,729 –> 00:30:06,530
who should have checked that middle manager and said, we’re not doing this

476
00:30:06,530 –> 00:30:10,090
anymore and it doesn’t matter if we’re fired and our whole division is gone. We

477
00:30:10,090 –> 00:30:13,890
can’t be a party to this because we know what’s happening. By the way, we

478
00:30:13,890 –> 00:30:17,720
did this with the nicotine, we did this with the tobacco companies back in the

479
00:30:17,720 –> 00:30:21,120
1990s. We said the tobacco companies were so

480
00:30:21,120 –> 00:30:24,320
evil that tobacco itself

481
00:30:24,640 –> 00:30:27,440
needed to be quote unquote sued. And yet

482
00:30:28,080 –> 00:30:31,800
Philip Morris still exists. RJR

483
00:30:31,800 –> 00:30:35,640
still exists. Sure, they’re called Nabisco now, but they still exist. And

484
00:30:35,640 –> 00:30:39,240
it’s easy, by the way, for us to point at corporations for this kind of

485
00:30:39,240 –> 00:30:42,880
evil. I did mention the Sackler family and the tobacco companies.

486
00:30:43,040 –> 00:30:46,640
Those are private corporations because you don’t have to get on opioids.

487
00:30:46,990 –> 00:30:50,350
Okay? You could just gut it out. Sure, okay.

488
00:30:51,470 –> 00:30:54,990
But government, you can’t escape government.

489
00:30:55,390 –> 00:30:59,150
Government takes my taxes and RJR Nabisco’s

490
00:30:59,150 –> 00:31:02,750
taxes and Google’s taxes. Government takes all

491
00:31:02,750 –> 00:31:06,430
of our money. Government’s supposed to, in the liberal world

492
00:31:06,430 –> 00:31:09,550
order, understanding of government, serve all of us. And yet,

493
00:31:10,590 –> 00:31:13,840
and yet we saw bureaucratic

494
00:31:13,840 –> 00:31:17,600
thoughtlessness that would rival, particularly during COVID 19. That’s

495
00:31:17,600 –> 00:31:21,040
the most recent example. But we’ve seen bureau, I can name other examples during the

496
00:31:21,040 –> 00:31:24,760
course of the last 40 years. It’s just the most recent one where government

497
00:31:25,240 –> 00:31:28,880
bureaucratic thoughtlessness that was at the level of

498
00:31:28,880 –> 00:31:32,520
Adolf Eichmann killed people, led

499
00:31:32,520 –> 00:31:36,360
directly to their deaths. And yet we all sort

500
00:31:36,360 –> 00:31:39,080
of shrug our shoulders and we go, well, what are you going to do? You

501
00:31:39,080 –> 00:31:42,910
can’t fight city hall. And then we walk away. It

502
00:31:42,910 –> 00:31:46,750
might be worse than Eichmann. Honestly, Jesan, because think about

503
00:31:46,750 –> 00:31:50,550
it. Eichmann being, I mean this. So, so the World War

504
00:31:50,550 –> 00:31:53,990
II situation, what Hannah’s referring to, I mean, she’s talking about

505
00:31:54,310 –> 00:31:58,030
that it’s like our current military, right? You’re taking an order from

506
00:31:58,030 –> 00:32:01,390
an order from an order. You’re, you’re doing, you’re doing your, your job. You’re

507
00:32:01,390 –> 00:32:05,070
disconnected from the top of the ranks far enough that you

508
00:32:05,070 –> 00:32:07,710
don’t question that order because you have no idea what they know and what they

509
00:32:07,710 –> 00:32:11,260
don’t know. The government, the government bureaucracies and the government,

510
00:32:11,340 –> 00:32:13,660
people that should be fired or

511
00:32:14,700 –> 00:32:18,260
prosecuted for being evil is

512
00:32:18,260 –> 00:32:21,980
different because they know, they have first hand knowledge and they do nothing

513
00:32:21,980 –> 00:32:25,500
about it. That’s, to me, that’s different, that’s more evil than

514
00:32:25,500 –> 00:32:29,100
Eichmann. And I’m not suggesting Eichmann wasn’t evil, you know, yeah,

515
00:32:29,100 –> 00:32:32,940
anything that, that happened. But there’s, there’s a level, there’s a level of

516
00:32:32,940 –> 00:32:36,710
consciousness that happens that, that, that a bureaucrat that a. That

517
00:32:36,710 –> 00:32:40,470
a politician looks at and they make this decision based on, do

518
00:32:40,470 –> 00:32:43,390
I want to do what’s best for the country and the best for our people,

519
00:32:43,470 –> 00:32:46,670
Do I want to do what’s best for my constituents? Or do I want to

520
00:32:46,670 –> 00:32:50,390
do what’s best for me and take that lobby money and vote whatever

521
00:32:50,390 –> 00:32:53,630
way they’re asking me to vote on and protect that bureau, that

522
00:32:53,630 –> 00:32:57,270
corporation from prosecution over X, Y or Z,

523
00:32:57,270 –> 00:33:00,830
whatever. That’s how the tobacco companies were protected for so long. We eventually

524
00:33:00,830 –> 00:33:04,390
got. We eventually got politicians in office that weren’t in their

525
00:33:04,390 –> 00:33:08,190
pocket. But quite honestly, that’s. To

526
00:33:08,190 –> 00:33:11,630
me, that’s more evil. You are making a. That is not an

527
00:33:11,630 –> 00:33:15,110
unconscious decision that you’re making that you are not just following orders. You are not

528
00:33:15,110 –> 00:33:18,870
disconnected from the upper ranks of the military so that the order came down so

529
00:33:18,870 –> 00:33:22,670
many times that you just. You can’t. You can’t question it because it’s

530
00:33:22,670 –> 00:33:26,070
different. To me, this is. This is way more pure evil where

531
00:33:26,230 –> 00:33:28,250
the lobbyists and

532
00:33:29,770 –> 00:33:33,170
the big corporations can spend enough money on a

533
00:33:33,170 –> 00:33:36,970
politician or a group of politicians to circumvent or

534
00:33:37,130 –> 00:33:40,970
even better, change the laws in their favor so they don’t even have to circumvent

535
00:33:40,970 –> 00:33:44,570
the law. They can have the law changed to benefit them. And now I’m

536
00:33:44,570 –> 00:33:48,410
voting that. I’m not voting my conscience. I’m not voting the constituents. I’m

537
00:33:48,410 –> 00:33:51,890
not voting the country. I’m voting money because it puts money in my

538
00:33:51,890 –> 00:33:55,260
pocket. How many politicians do you know sit in Washington that. That are not worth

539
00:33:55,260 –> 00:33:58,900
millions of dollars? Zero. I don’t know a single one that is

540
00:33:58,900 –> 00:34:02,380
not a millionaire. So these

541
00:34:02,380 –> 00:34:06,140
people and, and we can claim all we want that we have

542
00:34:06,140 –> 00:34:09,580
control where the. We’re the. We are the people. We can vote them in, vote

543
00:34:09,580 –> 00:34:13,100
them out. But do we really. We don’t really, because when they get down to

544
00:34:13,100 –> 00:34:16,740
the nitty gritty in our. In our local constituents and they. They take their tie

545
00:34:16,740 –> 00:34:19,700
off and they look human and they’re talking to us like we. Like they’re the

546
00:34:19,700 –> 00:34:23,060
same kind of person as we are. We vote for them and then they go

547
00:34:23,060 –> 00:34:26,620
do whatever the hell they want in Washington anyway. It’s like. It’s so

548
00:34:26,620 –> 00:34:30,380
frustrating to me. Like this whole. Frustrating. No, no, no.

549
00:34:30,380 –> 00:34:33,780
This is, this is. This is something where.

550
00:34:36,420 –> 00:34:39,780
Well, well, I. Go ahead. And by the way, I. Because I. I just had

551
00:34:39,780 –> 00:34:43,620
this conversation this morning. I was talking to somebody. It’s so funny about your article

552
00:34:43,620 –> 00:34:46,580
because it brought up a conversation this morning. I was talking to somebody else about

553
00:34:46,580 –> 00:34:50,230
AI in, in our, in our work environments and that the

554
00:34:50,230 –> 00:34:53,950
whole concept of the conversation was AI friend or foe. Like do you

555
00:34:53,950 –> 00:34:57,030
think AI is great? Do you think it’s bad? Whatever, right? So we had this

556
00:34:57,030 –> 00:35:00,670
conversation and I brought up another conversation I had with my daughter over the

557
00:35:00,670 –> 00:35:04,350
weekend who happens to be in her early 20s. And she, she

558
00:35:04,350 –> 00:35:08,190
was telling me about, about a

559
00:35:08,430 –> 00:35:11,790
post that she saw. I believe it was on TikTok. It was a,

560
00:35:12,270 –> 00:35:14,080
it was an, it was an AI

561
00:35:15,760 –> 00:35:18,560
newsreel, 100% fake.

562
00:35:19,520 –> 00:35:22,880
And they, by the way, I’ll give the, the person who created it credit

563
00:35:23,600 –> 00:35:27,120
very like told everybody it was fake. He just wanted to show

564
00:35:27,120 –> 00:35:30,480
everybody how cool it was that he could make AI look real.

565
00:35:30,800 –> 00:35:34,560
My daughter flat out told me that this newsreel could have, could

566
00:35:34,560 –> 00:35:38,160
pass as real if they, if they just posted it and didn’t say anything. People

567
00:35:38,160 –> 00:35:41,100
would have bought into the fact that this was real. And by the way, it

568
00:35:41,100 –> 00:35:44,660
was an image of Washington D.C. getting bombed from

569
00:35:44,660 –> 00:35:47,940
what’s going on in the Middle east right now. Like somebody in the Middle east,

570
00:35:47,940 –> 00:35:50,980
it doesn’t matter what it is, just sent over a missile. It blew up one

571
00:35:50,980 –> 00:35:54,820
of the buildings in Washington before the US could react. And now the US

572
00:35:54,820 –> 00:35:58,260
is on full alert and we’re going full steam ahead with our military

573
00:35:59,540 –> 00:36:03,220
that if that looks that real. You think that the American public

574
00:36:03,220 –> 00:36:07,060
are, you think they’re going to go fact-check that? Anyway, I bring that up

575
00:36:07,060 –> 00:36:10,730
because I think what you’re talking about with the article that you sent me, at

576
00:36:10,730 –> 00:36:14,170
some point, do we not realize that we have to have some sort of

577
00:36:14,170 –> 00:36:17,770
governing body for this? Somebody has to stand

578
00:36:17,770 –> 00:36:21,610
up and say we’ve got to, we’ve got to put our heads

579
00:36:21,610 –> 00:36:25,370
and our hands around AI. We have to start thinking about governing

580
00:36:25,370 –> 00:36:28,770
this. And I’m not talking about censorship and I’m not talking no

581
00:36:28,930 –> 00:36:32,610
creative liberties, go for it, whatever. But there should be some

582
00:36:32,610 –> 00:36:36,290
sort of foundational knowledge that it is AI. When you’re looking at something

583
00:36:36,290 –> 00:36:39,880
on a screen, right? So somebody has to start

584
00:36:40,520 –> 00:36:44,200
the process of, of making sure that we know

585
00:36:44,200 –> 00:36:47,720
what’s real and what’s fake. Once it starts getting so real

586
00:36:47,800 –> 00:36:49,960
that we can’t tell the difference with our own eyes,

587
00:36:53,160 –> 00:36:56,720
it’s craziness. And by the way, that what I was getting the whole. Now wrap

588
00:36:56,720 –> 00:36:59,960
this up in a little bowl for you. Because the problem

589
00:37:00,280 –> 00:37:03,400
again, these bureaucrats and these and these politicians,

590
00:37:03,970 –> 00:37:06,850
they’re not doing anything about it because they’re making money off of it,

591
00:37:07,890 –> 00:37:11,530
So they have no. Because until people start dying over

592
00:37:11,530 –> 00:37:15,050
this, we’re not doing anything about it. And that’s because that’s a tragedy in

593
00:37:15,050 –> 00:37:18,610
itself. Or because they don’t understand it. Which

594
00:37:19,090 –> 00:37:22,810
part of the. Some of them don’t understand it. Just in case you

595
00:37:22,810 –> 00:37:26,530
don’t believe me. Just, just look at the last time that, that Mark

596
00:37:26,530 –> 00:37:29,650
Zuckerberg was dragged up in front of the, in front of Congress.

597
00:37:30,930 –> 00:37:34,750
And you know the youngest senator there was Ted

598
00:37:34,750 –> 00:37:38,510
Cruz at 50 something. And he was the

599
00:37:38,510 –> 00:37:42,230
only one that knew how to ask Mark Zuckerberg the correct questions

600
00:37:42,230 –> 00:37:45,630
about the algorithm because he was the only one that didn’t need

601
00:37:45,630 –> 00:37:49,310
notes from his 20 year old staff members who no

602
00:37:49,310 –> 00:37:52,790
longer use Facebook. Right. And everybody else on that

603
00:37:52,790 –> 00:37:54,710
dais, if you go look at the video,

604
00:37:56,870 –> 00:38:00,370
was over 60. Yeah, yeah,

605
00:38:01,170 –> 00:38:04,810
right. They don’t understand what they’re looking

606
00:38:04,810 –> 00:38:08,450
at. They really don’t. And they don’t understand the

607
00:38:08,450 –> 00:38:11,930
implications of it. And then the middle

608
00:38:11,930 –> 00:38:13,490
managers below them

609
00:38:15,890 –> 00:38:19,730
are thoughtless bureaucrats. But let me, let

610
00:38:19,730 –> 00:38:23,530
me, let me make my point. I’m going to let

611
00:38:23,530 –> 00:38:26,170
Tom hang on this one for just a minute because I’m going to bring up

612
00:38:26,170 –> 00:38:28,770
something else. There’s another thing that we need to bring up. Kind of been talking

613
00:38:28,770 –> 00:38:31,750
about this other article. Let me, let me introduce this idea. So back to the

614
00:38:31,750 –> 00:38:35,590
book, back to Eichmann in Jerusalem. We’re going

615
00:38:35,590 –> 00:38:39,070
to pick up with chapter eight, Duties of a Law Abiding Citizen.

616
00:38:39,390 –> 00:38:42,990
I’m going to read this short piece.

617
00:38:44,990 –> 00:38:48,790
So Eichmann’s opportunities for feeling like Pontius Pilate were many. And as

618
00:38:48,790 –> 00:38:52,110
the months and years went by, he lost the need to feel anything at all.

619
00:38:52,750 –> 00:38:55,270
This is the way things were. This was the new law of the land based

620
00:38:55,270 –> 00:38:59,080
on the Fuhrer’s order. Whatever he did, as far as he could see,

621
00:38:59,080 –> 00:39:02,360
was, to Tom’s point, as a law abiding citizen.

622
00:39:02,760 –> 00:39:06,120
He did his duty as he told the police and the court over and over

623
00:39:06,120 –> 00:39:09,960
again. He not only obeyed orders, he also obeyed the law.

624
00:39:11,080 –> 00:39:14,840
Eichmann had a muddled inkling that this could be an important distinction. But neither the

625
00:39:14,840 –> 00:39:18,320
defense nor the judges ever took him up on it. The well worn coins of

626
00:39:18,320 –> 00:39:22,080
superior orders versus acts of state were handed back and forth. They

627
00:39:22,080 –> 00:39:25,860
had governed the whole discussion of these matters during the Nuremberg trials for no other

628
00:39:25,860 –> 00:39:29,460
reason than they gave the illusion that the altogether unprecedented could be

629
00:39:29,460 –> 00:39:33,020
judged according to precedents and the standards that went with them.

630
00:39:34,060 –> 00:39:37,780
Eichmann, with his rather modest mental gifts, was certainly

631
00:39:37,780 –> 00:39:40,740
the last man in the courtroom to be expected to challenge these notions and to

632
00:39:40,740 –> 00:39:44,460
strike out on his own. Since in addition to performing what he conceived

633
00:39:44,460 –> 00:39:47,820
to be the duties of a law abiding citizen, he had also acted upon orders.

634
00:39:47,820 –> 00:39:51,460
Always so careful to be covered, he became completely muddled

635
00:39:51,460 –> 00:39:54,940
and ended by stressing alternatively the virtues and the vices of blind

636
00:39:54,940 –> 00:39:58,090
obedience or the obedience of corpses.

637
00:39:58,650 –> 00:40:01,290
Kadavergehorsam, as he himself called it.

638
00:40:02,330 –> 00:40:05,770
The first indication of Eichmann’s vague notion that there was more involved in this whole

639
00:40:05,770 –> 00:40:09,570
business than the question of soldiers carrying out orders that are clearly criminal

640
00:40:09,570 –> 00:40:13,050
in nature and intent appeared during the police examination

641
00:40:13,370 –> 00:40:16,530
when he suddenly declared with great emphasis that he had lived his whole life according

642
00:40:16,530 –> 00:40:20,250
to Kant’s moral precepts and especially

643
00:40:20,490 –> 00:40:24,140
according to the Kantian definition of duty. This was

644
00:40:24,140 –> 00:40:27,580
outrageous on the face of it and also incomprehensible. By the way, let me pause.

645
00:40:27,980 –> 00:40:31,820
Hannah Arendt knew a lot about Immanuel Kant, who was a German philosopher.

646
00:40:32,620 –> 00:40:36,460
She read him quite extensively and studied him quite extensively in college,

647
00:40:37,180 –> 00:40:40,060
where Martin Heidegger and

648
00:40:40,860 –> 00:40:44,420
Carl Jaspers were two folks that she was

649
00:40:44,420 –> 00:40:47,820
intimately familiar with, two other philosophical giants of the 20th century.

650
00:40:48,580 –> 00:40:52,340
Although Heidegger did have sympathies with the. With the Nazis

651
00:40:52,340 –> 00:40:56,100
and was a member of the National Socialist Party,

652
00:40:56,340 –> 00:41:00,060
a fact that Hannah Arendt actually critiqued him about and

653
00:41:00,060 –> 00:41:03,820
critiqued him over, over the course of many years in a post World War

654
00:41:03,820 –> 00:41:07,460
II concept or construct. Sorry. All right, back to the

655
00:41:07,460 –> 00:41:11,300
book. Since Kant’s moral philosophy I’m going to

656
00:41:11,300 –> 00:41:14,820
pick up with the sentence is so closely bound up with man’s faculty of judgment,

657
00:41:15,310 –> 00:41:19,150
which rules out blind obedience. The examining officer did not press

658
00:41:19,150 –> 00:41:22,590
the point. But Judge Raveh, either out of curiosity or out of indignation at Eichmann’s

659
00:41:22,590 –> 00:41:26,310
having dared to invoke Kant’s name in connection with his crimes, decided to question the

660
00:41:26,310 –> 00:41:30,070
accused. And to the surprise of everybody, Eichmann came

661
00:41:30,070 –> 00:41:33,870
up with an approximately correct definition of the categorical imperative

662
00:41:34,430 –> 00:41:38,110
quote I meant by my remark about Kant. The principle of my will

663
00:41:38,350 –> 00:41:41,710
must always be such that it can become the principle of general laws.

664
00:41:43,800 –> 00:41:47,240
Which is not the case with theft or murder, for instance, because the thief or

665
00:41:47,240 –> 00:41:50,240
the murderer cannot conceivably wish to live under a legal system that would give others

666
00:41:50,240 –> 00:41:52,200
the right to rob or murder him.

667
00:41:53,720 –> 00:41:57,320
Upon further questioning, he added that he had read Kant’s Critique of Practical

668
00:41:57,320 –> 00:42:00,960
Reason. He then proceeded to explain that from the moment he was charged with carrying

669
00:42:00,960 –> 00:42:04,600
out the Final Solution he had ceased to live according to Kantian principles,

670
00:42:05,000 –> 00:42:07,840
that he had known it and that he had consoled himself with the thought that

671
00:42:07,840 –> 00:42:11,440
he no longer, quote, was master of his own deeds, close quote, that he was

672
00:42:11,440 –> 00:42:15,230
unable to change anything. What he failed to point out in court was that

673
00:42:15,230 –> 00:42:18,870
in this period of crimes legislated by the state, as he himself now

674
00:42:18,870 –> 00:42:22,430
called it, he had not simply dismissed the Kantian formula as no longer

675
00:42:22,430 –> 00:42:26,190
applicable. He distorted it to read act as

676
00:42:26,190 –> 00:42:29,470
if the principle of your actions were the same as that of the legislator or

677
00:42:29,470 –> 00:42:32,750
of the law of the land, or in Hans Frank’s formulation of the

678
00:42:32,750 –> 00:42:36,150
categorical imperative of the Third Reich, which Eichmann might have known,

679
00:42:36,470 –> 00:42:39,750
act in such a way that the Fuhrer, if he knew your action, would approve

680
00:42:39,750 –> 00:42:43,570
of it. Kant, to be sure, had never intended to say anything of

681
00:42:43,570 –> 00:42:47,370
the sort. On the contrary, to him every man was a legislator the moment

682
00:42:47,370 –> 00:42:50,810
he began to act. By using his practical reason,

683
00:42:50,970 –> 00:42:54,370
man found the principles that could and should be the principles of the law. But

684
00:42:54,370 –> 00:42:58,130
it is true that Eichmann’s unconscious distortion agrees with what he himself

685
00:42:58,130 –> 00:43:01,530
called the version of Kant for the household use of the little man.

686
00:43:02,090 –> 00:43:05,490
In this household use, all that is left of Kant’s spirit is the demand that

687
00:43:05,490 –> 00:43:08,660
a man do more than obey the law, that he go beyond the mere call

688
00:43:08,660 –> 00:43:12,500
of obedience and identify his own will with the principle behind

689
00:43:12,500 –> 00:43:15,820
the law, the source from which the law sprang.

690
00:43:16,460 –> 00:43:19,740
In Kant’s philosophy, that source was practical reason. In

691
00:43:19,740 –> 00:43:23,459
Eichmann’s household use of him, it was the

692
00:43:23,459 –> 00:43:26,540
will of the Führer. Much of the horribly

693
00:43:26,540 –> 00:43:30,340
painstaking thoroughness of the execution of the

694
00:43:30,340 –> 00:43:34,180
Final Solution, a thoroughness that usually strikes the observer as typically

695
00:43:34,180 –> 00:43:37,890
German or else as a characteristic of the perfect bureaucrat, can

696
00:43:37,890 –> 00:43:41,730
be traced to the odd notion, indeed very common in Germany, that

697
00:43:41,730 –> 00:43:45,570
to be law abiding means not merely to obey the laws, but to

698
00:43:45,570 –> 00:43:49,170
act as though one were the legislator of the

699
00:43:49,170 –> 00:43:52,450
laws that one obeys. Hence the

700
00:43:52,450 –> 00:43:56,290
conviction that nothing less than going beyond the

701
00:43:56,290 –> 00:43:58,970
call of duty will do.

702
00:44:02,900 –> 00:44:06,380
I think we’re there in America, we just don’t know

703
00:44:06,380 –> 00:44:10,140
Kant. I think we’ve been there for

704
00:44:10,140 –> 00:44:13,820
a while and I think what scared the hell out

705
00:44:13,820 –> 00:44:17,620
of all of us, people who are outside of the legislative or

706
00:44:17,620 –> 00:44:21,420
bureaucratic systems, whether they are corporate or governmental. I think

707
00:44:21,420 –> 00:44:24,020
what scared the hell out of all of us with COVID was just how far

708
00:44:24,020 –> 00:44:27,790
that thinking has gone. And it has never been critiqued

709
00:44:27,790 –> 00:44:31,630
or exposed in the way that Hannah Arendt just exposed that.

710
00:44:32,030 –> 00:44:35,710
Because how many public intellectuals do we have that are even smart enough to

711
00:44:35,710 –> 00:44:39,510
explain it to us? Which is of course, one of the

712
00:44:39,510 –> 00:44:43,230
declines that we are experiencing or have experienced in the last 40 years

713
00:44:43,790 –> 00:44:44,590
in the West.

714
00:44:46,990 –> 00:44:50,790
Eichmann, you’ve brought this up before on this, on this podcast too, Jesan.

715
00:44:50,790 –> 00:44:54,430
It’s. It’s also a lack of, it’s also a lack of new,

716
00:44:54,750 –> 00:44:58,470
young, observant writers. Like, yes,

717
00:44:58,470 –> 00:45:02,270
so, so, like that era that you’re talking about, there was so much

718
00:45:02,270 –> 00:45:06,030
going on and people, people were writing about it to

719
00:45:06,030 –> 00:45:09,470
the depths of like, like she’s coming up with her own

720
00:45:09,550 –> 00:45:13,230
theories and pro. Like, we don’t have that anymore. Like, we don’t. At least

721
00:45:13,230 –> 00:45:16,910
not to my knowledge. Like, I don’t know anybody coming out with. Even today’s

722
00:45:16,910 –> 00:45:20,200
journalists are even kind of. I was gonna say a joke. I don’t want to

723
00:45:20,200 –> 00:45:23,240
make. I don’t want to. I mean, they’re still in danger, like in, in war

724
00:45:23,240 –> 00:45:27,000
zones and stuff like that. So I’m not suggesting that they’re. They’re not doing. But

725
00:45:27,000 –> 00:45:30,680
they’re not. It’s not the same. Like, you’re not, it’s. They’re not coming

726
00:45:30,680 –> 00:45:33,720
at it from a, a perspective of like,

727
00:45:34,680 –> 00:45:38,320
you know, this, this, this shouldn’t happen. Or why. Why are we doing that? Like,

728
00:45:38,320 –> 00:45:41,160
they’re just reporting. Yeah. Another bomb went off.

729
00:45:41,960 –> 00:45:45,810
Look how destructive this is. We have or, or even children

730
00:45:45,810 –> 00:45:49,570
dying in the streets over here. Like, that’s okay. Or even, or even

731
00:45:49,570 –> 00:45:53,370
worse. They want to be influencers. Yeah. Or they want to like, they want to

732
00:45:53,370 –> 00:45:56,850
like, they want to like, tell me what their inner thoughts are on Bluesky

733
00:55:56,850 –> 00:55:59,810
or on X, wherever the hell they’re going.

734
00:46:00,930 –> 00:46:04,610
Substack, of course, is also becoming popular as a place where they’re going to

735
00:46:04,610 –> 00:46:07,570
like, spout off about

736
00:46:08,290 –> 00:46:09,250
whatever. Look.

737
00:46:12,180 –> 00:46:15,900
But they’re not intellectuals. They’re not intellectuals. Right. They’re not doing

738
00:46:15,900 –> 00:46:19,460
stuff like this. They’re not. They’re not. They’re not looking at it and dissecting it

739
00:46:19,460 –> 00:46:22,860
from a, from a philosophical, political, analytical

740
00:46:22,860 –> 00:46:26,500
standpoint. They’re, to your point, they’re just spouting off stuff to get

741
00:46:26,980 –> 00:46:30,660
likes and clicks. It’s like, it’s crazy to me.

742
00:46:30,980 –> 00:46:34,740
Well, and what we need is what we require for the

743
00:46:34,740 –> 00:46:38,500
safety of a civilization. To your point earlier, that you were making

744
00:46:39,130 –> 00:46:42,490
about doing about, about human beings

745
00:46:42,810 –> 00:46:46,250
performing genocidal acts once every 60 to

746
00:46:46,250 –> 00:46:50,050
75 years. What we need if we are

747
00:46:50,050 –> 00:46:53,570
going to break that cycle, here’s what we need. We need

748
00:46:53,570 –> 00:46:57,370
people in roles. And I wrote this down.

749
00:46:57,770 –> 00:47:01,370
We need people in roles who have the moral courage to say

750
00:47:01,370 –> 00:47:05,220
no and have the

751
00:47:05,220 –> 00:47:08,860
intellectual. And I’m going to go a step

752
00:47:08,860 –> 00:47:11,620
further, the spiritual language

753
00:47:13,060 –> 00:47:16,580
to talk to other people in real terms and move

754
00:47:16,580 –> 00:47:20,340
them so that they don’t do the thing. Look,

755
00:47:20,499 –> 00:47:22,660
look, all you need

756
00:47:24,340 –> 00:47:28,100
to stop a genocide, all you need, and we

757
00:47:28,100 –> 00:47:31,710
see this repeatedly, all you need is a bunch of

758
00:47:31,710 –> 00:47:34,510
people to basically just quietly

759
00:47:36,190 –> 00:47:39,870
and confidently, with courage, say one

760
00:47:39,870 –> 00:47:43,310
word, and that word is no.

761
00:47:46,670 –> 00:47:50,430
Yeah, that’s it. And now if they said now, now here’s all the

762
00:47:50,430 –> 00:47:53,910
consequences, because people now want to engage in consequentialist thinking. When you start talking about

763
00:47:53,910 –> 00:47:56,590
this, well, what if they send drones to my house? What if they kill my

764
00:47:56,590 –> 00:47:58,790
kids? What if they lock me up? What if, what if, what if, what if,

765
00:47:58,790 –> 00:48:02,480
what if, what if you get hit walking

766
00:48:02,480 –> 00:48:04,440
your dog crossing the street?

767
00:48:06,520 –> 00:48:08,920
What if? Those are cowardly

768
00:48:11,720 –> 00:48:15,359
considerations? Now, now, you may question my

769
00:48:15,359 –> 00:48:19,080
bona fides on saying this, and I’ve never

770
00:48:19,080 –> 00:48:22,680
gone public about this, but I’m going to go public now.

771
00:48:26,130 –> 00:48:29,850
I resisted masking during COVID for a whole variety

772
00:48:29,850 –> 00:48:31,170
of reasons. I just did.

773
00:48:33,490 –> 00:48:36,770
I resisted the thing underneath it, the initial

774
00:48:36,770 –> 00:48:40,570
explanation underneath it, because I thought the science was nonsense. I

775
00:48:40,570 –> 00:48:43,890
absolutely did. People are gonna argue with me, that’s fine. Argue with me all day.

776
00:48:43,890 –> 00:48:47,090
I thought the science was nonsense; by the way, that’s been proven later on.

777
00:48:47,650 –> 00:48:51,250
I thought the reasoning behind why we were social distancing was nonsense,

778
00:48:52,040 –> 00:48:55,680
by the way, that was proven later on. You can go look. I also

779
00:48:55,680 –> 00:48:59,080
resisted getting the COVID-19 vaccine.

780
00:48:59,480 –> 00:49:03,240
I have never gotten it and I will never get it for a whole

781
00:49:03,240 –> 00:49:06,960
variety of reasons, partially health related, personal

782
00:49:06,960 –> 00:49:10,560
health related, but also because the way

783
00:49:10,560 –> 00:49:14,320
that was pitched to me, the way all three of those things were

784
00:49:14,320 –> 00:49:18,040
pitched to me, looked and sounded

785
00:49:19,180 –> 00:49:21,580
far too much like echoes

786
00:49:23,820 –> 00:49:27,580
of the categorical imperative for the little man. It

787
00:49:27,580 –> 00:49:31,420
sounded far too much like, to my ears, not for anybody

788
00:49:31,420 –> 00:49:35,100
else. I wasn’t running around advocating. And by the way, by me disclosing

789
00:49:35,100 –> 00:49:38,700
this, I’m not telling you what you should do. You do what you need to

790
00:49:38,700 –> 00:49:41,220
do for your own family and for your own household. And I do know people

791
00:49:41,220 –> 00:49:44,990
that went in a bunch of different directions. And that’s fine. You do whatever

792
00:49:44,990 –> 00:49:47,230
you do. I’m not going to judge you. I’m not saying what you should have

793
00:49:47,230 –> 00:49:50,430
done or shouldn’t have done. That’s another thing for Kant, by the way, shoulds and

794
00:49:50,430 –> 00:49:53,910
oughts. And I know the difference between the shoulds and oughts,

795
00:49:54,710 –> 00:49:58,430
okay? Oughts are based on things that can go back to

796
00:49:58,430 –> 00:50:02,230
tradition, like, I don’t know, we ought not commit, as

797
00:50:02,630 –> 00:50:06,150
Hannah Arendt brought up, we ought not commit murder. That’s an ought.

798
00:50:06,950 –> 00:50:09,990
I shouldn’t get the jab. That’s a should.

799
00:50:11,020 –> 00:50:14,620
Shoulds are flexible, oughts go back in

800
00:50:14,620 –> 00:50:18,020
tradition and can appeal to something that’s bigger than

801
00:50:18,020 –> 00:50:21,860
myself. In order to think about that effectively and

802
00:50:21,860 –> 00:50:25,340
guide my family through that, I had to actually be able to engage in critical

803
00:50:25,340 –> 00:50:25,740
thinking.

804
00:50:29,980 –> 00:50:32,980
And then, and then, and then I didn’t give a bunch of excuses. I just

805
00:50:32,980 –> 00:50:36,740
said no. I, I actually like the way you’re, you’re talking about

806
00:50:36,740 –> 00:50:40,390
that to your, to your point, right? So the, I think the, the

807
00:50:40,390 –> 00:50:44,070
main point to that is whether you did or didn’t or should or shouldn’t have.

808
00:50:44,630 –> 00:50:48,230
The critical thinking part to me is the most important because so my experience

809
00:50:48,390 –> 00:50:52,030
through Covid was a little bit different than yours, but not by

810
00:50:52,030 –> 00:50:55,630
much. So like, so again, again, you got to think of

811
00:50:55,630 –> 00:50:59,230
access to information, right? When we first, Absolutely. When the information first

812
00:50:59,230 –> 00:51:03,070
hit us, it was, it was, it was

813
00:51:03,070 –> 00:51:06,670
just fear. Fear and fear-mongering, like,

814
00:51:06,670 –> 00:51:10,350
just like information overload and all the negative. So my initial

815
00:51:10,350 –> 00:51:13,330
reaction was like, okay, I’m gonna put the mask on.

816
00:51:14,050 –> 00:51:17,170
I put the mask on. But to your point though, as I started

817
00:51:17,170 –> 00:51:20,890
understanding and I started watching the science come in, I’m going, okay,

818
00:51:20,890 –> 00:51:24,410
what, what, what, what are we doing here now? Like, right? I take it off

819
00:51:24,410 –> 00:51:27,249
and I’m going, something doesn’t seem right here. Like,

820
00:51:28,050 –> 00:51:31,690
how is this a, like, I, I, I started even

821
00:51:31,690 –> 00:51:35,370
like, understanding definitions of pandemics, epidemics,

822
00:51:35,370 –> 00:51:39,090
like, starting, I started researching, like, what does that mean? And like I’m saying, so

823
00:51:39,300 –> 00:51:42,020
I take the mask off and I went, no,

824
00:51:42,980 –> 00:51:46,820
I’m not doing this. So now, so, but again, in fairness,

825
00:51:46,820 –> 00:51:49,980
my initial reaction was to put it on because of all of the information that

826
00:51:49,980 –> 00:51:53,380
came at me all at once. And it was all fear-mongering, right?

827
00:51:54,180 –> 00:51:57,780
Now, as for the shot, I got it. But

828
00:51:58,260 –> 00:52:02,020
there was a, there was a very selfish reason for that.

829
00:52:02,340 –> 00:52:06,160
Okay? They wouldn’t let me go on vacation unless I had it. So

830
00:52:06,560 –> 00:52:10,120
I was like, I was like. Listen,

831
00:52:10,120 –> 00:52:13,120
just, I’m going to, I’m going to the Bahamas. Just stick it. I don’t give

832
00:52:13,120 –> 00:52:16,880
a crap. So there was, there was a, there was a, there was a gain.

833
00:52:17,680 –> 00:52:20,800
The pro, the gain, for my family was that we all went on

834
00:52:20,800 –> 00:52:24,400
vacation and nobody else was going on vacation at that time. So sure, we had

835
00:52:24,400 –> 00:52:26,640
an awesome cruise. There was nobody on the boat.

836
00:52:28,640 –> 00:52:32,160
And because we had that, that card that said we had been

837
00:52:32,160 –> 00:52:35,640
vaccinated, we didn’t have to wear masks on the boat. We, we were able

838
00:52:35,640 –> 00:52:39,320
to walk around the boat free and clear. So again. But it was

839
00:52:39,320 –> 00:52:42,960
calculated. It was critical thinking, to your point. Exactly your point. We

840
00:52:42,960 –> 00:52:46,760
used critical thinking and, and pros and cons and weighing the, the

841
00:52:46,760 –> 00:52:50,520
risks for our family based on

842
00:52:50,520 –> 00:52:53,640
the information that we had. Right. So again to your point about, and by the

843
00:52:53,640 –> 00:52:57,400
way, I, I feel

844
00:52:57,400 –> 00:53:01,080
kind of sheepish like I followed the sheep in the beginning of that because like

845
00:53:01,080 –> 00:53:04,800
I said, I just, I fell for the, the fear mongering that happened.

846
00:53:06,320 –> 00:53:09,080
It only took, it took me. It took us a couple weeks once we got

847
00:53:09,080 –> 00:53:12,800
through all the, of the, all that initial wave of like, oh crap,

848
00:53:12,800 –> 00:53:16,240
this is like the world is ending like that. Because that’s the information we got.

849
00:53:16,240 –> 00:53:19,920
All the news media, every, the world is ending. You’re all gonna die.

850
00:53:19,920 –> 00:53:23,000
Everyone’s gonna die. Like this is gonna kill everybody. This is gonna be the next

851
00:53:23,000 –> 00:53:26,560
Spanish flu that kills. Oh, this is the next bubonic

852
00:53:26,560 –> 00:53:30,180
plague that’s gonna kill X. That’s all we heard

853
00:53:30,180 –> 00:53:33,820
for weeks on end. So anyway, so I,

854
00:53:33,900 –> 00:53:37,340
it’s not that. No. And then by the way, I’m not suggesting you made the

855
00:53:37,340 –> 00:53:40,340
wrong, like you were saying, like I, I don’t feel like I made the wrong

856
00:53:40,340 –> 00:53:43,260
choice because again, once I realized the facts, I took

857
00:53:44,059 –> 00:53:47,340
that mask off and threw it away. Yeah. Excuse my language. But I was like, forget

858
00:53:47,340 –> 00:53:50,860
this, this is dumb. Like what are we doing here, people? Like, this is, I

859
00:53:50,860 –> 00:53:54,460
think, I think. I only wore a mask once and that was because

860
00:53:54,460 –> 00:53:57,420
somebody who I personally knew and trusted

861
00:53:58,380 –> 00:54:01,980
asked me to do so in an environment to protect, I believe,

862
00:54:01,980 –> 00:54:05,580
was their, their mother-in-law. Okay. And because

863
00:54:05,820 –> 00:54:09,340
that person asked me directly, not the state

864
00:54:09,500 –> 00:54:13,180
compelling me. Yeah. Not a bureaucrat on TV

865
00:54:14,780 –> 00:54:18,500
saying stuff. Right. That I could then go back and check

866
00:54:18,500 –> 00:54:21,180
on this great sampling tool we have called Google.

867
00:54:22,860 –> 00:54:26,340
Not because of any of those reasons, but because somebody who I knew and

868
00:54:26,340 –> 00:54:30,030
trusted said, hey, go do this. I

869
00:54:30,030 –> 00:54:33,710
said sure, because I trust you and because I value my relationship

870
00:54:33,710 –> 00:54:37,510
with you. Yeah. I don’t care in

871
00:54:37,510 –> 00:54:41,110
this environment, in this moment, to your point about fear, when

872
00:54:41,110 –> 00:54:44,950
the entire narrative is being pushed through a fear based

873
00:54:45,030 –> 00:54:48,790
lens and we’re not letting go of that. And then of

874
00:54:48,790 –> 00:54:50,950
course it’s going to spiral out into a bunch of other different things. We don’t

875
00:54:50,950 –> 00:54:54,210
need to get into all that. I’m just keeping this very narrow. Yeah. Yeah. I.

876
00:54:56,530 –> 00:55:00,330
And maybe it’s my temperament, I’ll admit that. And temperaments are

877
00:55:00,330 –> 00:55:04,170
different among people. Maybe it’s my psychological makeup.

878
00:55:04,170 –> 00:55:07,490
Maybe I’m just designed to be more rebellious. Whatever. Maybe.

879
00:55:08,530 –> 00:55:11,170
But I just said no.

880
00:55:12,770 –> 00:55:16,490
Now, if you ask me during that time why I was saying no, I

881
00:55:16,490 –> 00:55:19,800
could give you the, the meanings and the understandings all the way down. And by

882
00:55:19,800 –> 00:55:22,640
the way, I talk with my family about this. And we made decisions all the

883
00:55:22,640 –> 00:55:26,280
way down, some of which, by the way, cost us money.

884
00:55:28,440 –> 00:55:31,880
There was, there was no consequence free decision here, by the way. That’s what

885
00:55:31,880 –> 00:55:35,400
adults understand. There’s no consequence free decision. Leaders understand

886
00:55:35,560 –> 00:55:39,200
this. Every decision has consequences. Do I know people who died of

887
00:55:39,200 –> 00:55:43,000
COVID? Absolutely, I do. Do I know people who got the

888
00:55:43,960 –> 00:55:47,200
got the vaccine or got the jab, however you want to frame that, and then

889
00:55:47,200 –> 00:55:50,960
had problems later? Absolutely. And do I know people who got

890
00:55:50,960 –> 00:55:54,640
it and were just fine? Absolutely. Yeah. There you

891
00:55:54,640 –> 00:55:57,480
go. Got it and were just fine. Absolutely right.

892
00:55:59,080 –> 00:56:02,520
And went on vacation because of it. And went on vacation because of it. Actually,

893
00:56:02,520 –> 00:56:05,960
I know a guy who he. He couldn’t get his pilot’s license

894
00:56:07,480 –> 00:56:10,680
in a, in a branch of the military that he’s in.

895
00:56:11,960 –> 00:56:15,080
And he was like, well, this is my job. I got to feed my family.

896
00:56:15,820 –> 00:56:19,620
And so he made that decision based on that. And I’ve talked to him subsequent

897
00:56:19,620 –> 00:56:23,020
to that and he said, I probably wouldn’t make that same decision

898
00:56:23,340 –> 00:56:26,980
again. These are the kinds of

899
00:56:26,980 –> 00:56:30,620
dynamics that have to. My intro

900
00:56:31,740 –> 00:56:35,140
sort of brought up all this stuff that, that, that Hannah

901
00:56:35,140 –> 00:56:38,980
Arendt was talking about with Eichmann and what are

902
00:56:38,980 –> 00:56:42,660
the duties of a law abiding citizen. Now we’re in a

903
00:56:42,660 –> 00:56:46,420
space and I want to revisit what Tom said about AI. We’re in a

904
00:56:46,420 –> 00:56:50,060
space now where the algorithm, we are

905
00:56:50,060 –> 00:56:53,180
outsourcing our brains to the algorithm. We’re

906
00:56:53,180 –> 00:56:56,300
already seeing this with people, though. The algorithm tells us what to do,

907
00:56:56,859 –> 00:57:00,580
we just do what the algorithm says. And instead

908
00:57:00,580 –> 00:57:04,220
of it being the rule of nobody from nowhere,

909
00:57:04,620 –> 00:57:08,060
now it’s the rule of an AI from

910
00:57:08,060 –> 00:57:11,620
nowhere. When the AI screws up. Here’s my

911
00:57:11,620 –> 00:57:15,380
giant question. When the AI screws up and gives us

912
00:57:15,380 –> 00:57:18,460
the wrong information and we all followed it off the cliff like lemmings and no

913
00:57:18,460 –> 00:57:22,180
one, or very few say no. That’s

914
00:57:22,180 –> 00:57:25,660
a disaster. That’s a disaster. Bigger even than Covid.

915
00:57:26,140 –> 00:57:29,980
Who will we hold responsible? Because the AI cannot

916
00:57:30,780 –> 00:57:33,580
think. And if there were no.

917
00:57:37,320 –> 00:57:41,160
The 2008 financial collapse, one of the massive critiques from the left is that during

918
00:57:41,160 –> 00:57:44,720
the 2008 financial collapse, no

919
00:57:44,720 –> 00:57:47,720
banker was put on trial or went to jail.

920
00:57:49,959 –> 00:57:53,680
That’s a legitimate point. I have a problem with that. One

921
00:57:53,680 –> 00:57:57,520
of the critiques of some of the things around COVID

922
00:57:57,520 –> 00:58:00,920
19 has been, not one bureaucrat has been put in jail

923
00:58:01,560 –> 00:58:04,920
or made to do a perp walk. That’s a legitimate

924
00:58:04,920 –> 00:58:08,760
critique. Not one. As a matter of fact, there have been

925
00:58:08,760 –> 00:58:11,960
a lot of articles written from places like the Atlantic and other

926
00:58:12,360 –> 00:58:16,080
approved outlets like CNN and MSNBC that we should all just

927
00:58:16,080 –> 00:58:18,920
sort of forget this and just sort of move on. Okay.

928
00:58:21,400 –> 00:58:25,120
When there is no one to perp walk, when there

929
00:58:25,120 –> 00:58:28,610
is no one to blame, when it is the algorithm, and the

930
00:58:28,610 –> 00:58:31,890
algorithm does not think, what will we do then?

931
00:58:33,250 –> 00:58:35,890
I don’t think we have an answer for that question. Matter of fact, I don’t

932
00:58:35,890 –> 00:58:39,370
think we’re conceptualizing it. No, we don’t. But that’s kind of what I said

933
00:58:39,370 –> 00:58:43,170
earlier, and that’s what you said earlier. Yeah, somebody, somebody has to

934
00:58:43,170 –> 00:58:46,890
put some guardrails up. Somebody has to start governing

935
00:58:46,890 –> 00:58:50,050
this. Like it’s like, it is what it is. Like.

936
00:58:50,290 –> 00:58:54,100
Okay. When Google first came

937
00:58:54,100 –> 00:58:57,260
out and people were getting all this information thrown at them by Google,

938
00:58:58,540 –> 00:59:02,060
nobody said we should be making sure this information

939
00:59:02,140 –> 00:59:05,780
is legitimate. But somebody

940
00:59:05,780 –> 00:59:09,540
did. Eventually us as consumers pushed back and said, hey, Google, why

941
00:59:09,540 –> 00:59:13,260
do you keep showing me this crap like I keep asking you for, I

942
00:59:13,260 –> 00:59:16,620
don’t know how many, how many

943
00:59:16,780 –> 00:59:20,540
orangutans are, are left in Borneo. I don’t know, whatever. I’m just. Yeah, yeah, whatever.

944
00:59:20,780 –> 00:59:24,630
No idea where that just came from. But you know,

945
00:59:24,630 –> 00:59:27,790
you ask that and it comes up with some random crap and then you’re like,

946
00:59:27,870 –> 00:59:31,630
so eventually Google had to put its own guardrails up, but it was because

947
00:59:31,790 –> 00:59:35,630
we had massive amounts of people saying, your search is giving me

948
00:59:35,870 –> 00:59:39,550
crap. Like, your search is giving me garbage. Your search is giving me garbage. Like,

949
00:59:39,550 –> 00:59:43,230
fix it, fix it, fix it. And they eventually fixed it, but it was,

950
00:59:43,310 –> 00:59:46,830
but we have some, we have to start doing that now with

951
00:59:46,990 –> 00:59:50,820
the AIs of the world because we’re already starting to see what

952
00:59:50,820 –> 00:59:53,900
you’re talking about. And you just said that. So

953
00:59:55,020 –> 00:59:58,220
large language models are starting to produce content

954
00:59:58,700 –> 01:00:01,420
at a, at a pace that is

955
01:00:01,980 –> 01:00:05,740
unprecedented in our, in, in our history. And

956
01:00:07,420 –> 01:00:10,300
it’s, that’s, that’s not even the right way to word it. It’s,

957
01:00:11,980 –> 01:00:15,620
it’s. I, I, I heard a statistic the other day and I, I have

958
01:00:15,620 –> 01:00:19,390
no way to verify this, so we can fact-check it later or whatever. But it’s

959
01:00:19,390 –> 01:00:22,110
something to the, to the point of like,

960
01:00:22,990 –> 01:00:26,430
every year more data is created

961
01:00:26,510 –> 01:00:30,150
than every year before it combined. Or,

962
01:00:30,150 –> 01:00:33,470
yeah, I’ve heard that effect. Right? I’ve heard Something like that. Yeah, I’ve heard something.

963
01:00:33,790 –> 01:00:37,150
Yes. Think about that for a second. So every year, meaning,

964
01:00:39,630 –> 01:00:43,470
let’s just use small numbers for a second. Okay, sure, yeah. One

965
01:00:43,470 –> 01:00:46,880
plus one is two. So the next year

966
01:00:47,600 –> 01:00:51,120
it’s five instead of two and the next year

967
01:00:51,120 –> 01:00:54,720
it’s 10 instead of five. Like that’s what we’re talking about here, people.

968
01:00:54,800 –> 01:00:58,600
Every single year more information is being put on

969
01:00:58,600 –> 01:01:02,320
the Internet now. Especially now because of things like ChatGPT and

970
01:01:02,320 –> 01:01:05,120
Claude and Perplexity and all these other AIs

971
01:01:06,000 –> 01:01:09,680
that nobody’s policing what the information is. And

972
01:01:09,680 –> 01:01:13,260
worse, the AIs are starting to use each other

973
01:01:13,260 –> 01:01:16,780
as reference. So like, okay, again just

974
01:01:16,780 –> 01:01:20,100
let’s be clear here. ChatGPT produces a

975
01:01:20,100 –> 01:01:23,260
document that nobody fact checks and just

976
01:01:24,060 –> 01:01:27,500
gets put on the Internet. The next AI

977
01:01:27,660 –> 01:01:31,500
reads that document, it doesn’t know if it’s right or wrong or good or

978
01:01:31,500 –> 01:01:35,340
bad or, or it just uses it as a reference point for,

979
01:01:35,340 –> 01:01:38,140
to produce another document. We are having bad

980
01:01:38,770 –> 01:01:42,450
decisions influence bad decisions, which influence

981
01:01:42,450 –> 01:01:46,210
bad decisions at an enormously increased clip.

982
01:01:47,810 –> 01:01:51,610
At some point, at some point you know that, that

983
01:01:51,610 –> 01:01:55,250
old adage, garbage in, garbage out. Like, like you put garbage in, you’re going to

984
01:01:55,250 –> 01:01:58,210
get garbage out. Well, we’re seeing the garbage come out right now, people.

985
01:01:59,090 –> 01:02:02,210
Can we, we should, can we stop it please? Like, can we

986
01:02:03,010 –> 01:02:06,540
should, we should have a logo for all the LLMs:

987
01:02:06,780 –> 01:02:10,620
Allowing you to make bad decisions faster. Yeah, right.

988
01:02:11,900 –> 01:02:15,300
And I’m not saying they’re all bad. Don’t get me wrong. Like, is there a

989
01:02:15,300 –> 01:02:18,900
place for AI? Absolutely. I think there, there’s use cases for it. I think AI

990
01:02:18,900 –> 01:02:22,020
is helpful. I think there’s a lot of benefit to it. Blah, blah, blah, blah.

991
01:02:22,020 –> 01:02:25,740
Yeah, all that stuff is right. But if we, if we

992
01:02:25,740 –> 01:02:29,500
take just the good and ignore the bad, we’re going to

993
01:02:29,500 –> 01:02:31,340
end up in a bad predicament later.

994
01:02:34,790 –> 01:02:38,350
So I guess the core question as we kind of round the corner on this

995
01:02:38,350 –> 01:02:41,430
episode, there’s a ton of other stuff in Eichmann in Jerusalem.

996
01:02:42,150 –> 01:02:45,350
I strongly recommend leaders reading it.

997
01:02:45,830 –> 01:02:49,510
There’s even a good. Oh, not a good. There’s a, an

998
01:02:49,510 –> 01:02:53,270
interesting piece in there. An interesting chapter.

999
01:02:53,270 –> 01:02:56,310
Not a piece, an interesting chapter in Eichmann in Jerusalem.

1000
01:02:57,270 –> 01:02:57,830
It is,

1001
01:03:04,000 –> 01:03:07,640
is the, the chapters on how not every country

1002
01:03:07,640 –> 01:03:11,040
that the Nazis conquered deported their

1003
01:03:11,040 –> 01:03:14,480
Jews. We tend to think that every country went along

1004
01:03:14,800 –> 01:03:18,280
and not every country did. Matter of fact, a notable country that did not go

1005
01:03:18,280 –> 01:03:21,920
along was Denmark. And actually the king of Denmark

1006
01:03:21,920 –> 01:03:23,840
just simply said no.

1007
01:03:25,880 –> 01:03:29,560
Pushed back. Right. And pushed back on the Führer and said, just, no, we’re just,

1008
01:03:29,560 –> 01:03:33,080
we’re not sending you our Jews. And if you come to get them.

1009
01:03:35,080 –> 01:03:37,720
Oh, no, actually it wasn’t that one. It was. What do you say? Because it

1010
01:03:37,720 –> 01:03:41,440
was the star to identify them. He’s like, okay, well, everybody in the country will

1011
01:03:41,440 –> 01:03:45,000
wear a yellow star, so good

1012
01:03:45,000 –> 01:03:48,760
luck. So

1013
01:03:48,760 –> 01:03:52,390
there was leadership, courage during

1014
01:03:52,390 –> 01:03:56,110
World War II. Yeah. From a country that couldn’t even fight back either.

1015
01:03:56,110 –> 01:03:59,510
Like, that’s the other thing. Like, they didn’t even have the mechanism to fight. If

1016
01:03:59,510 –> 01:04:02,990
Hitler decided, okay, fine, then I’m just gonna go destroy everybody with the star.

1017
01:04:02,990 –> 01:04:06,310
Like, he would have wiped out the whole country. And. And they would. They would

1018
01:04:06,310 –> 01:04:09,470
not have been able to do much about it. But he stood up.

1019
01:04:09,790 –> 01:04:13,550
You. This is again, the core lessons in here

1020
01:04:14,270 –> 01:04:17,900
are not about a rebel yell and a

1021
01:04:17,900 –> 01:04:20,740
southern flag. You know, they’re not about

1022
01:04:21,700 –> 01:04:25,460
rednecks or MAGA or any of that other kind of garbage that people

1023
01:04:25,460 –> 01:04:28,900
think of when the right-thinking people think of

1024
01:04:28,900 –> 01:04:32,620
rebellion. It’s not. That’s not the lessons. The lessons in here aren’t even about

1025
01:04:32,620 –> 01:04:36,460
good old fashioned American, because those are people in Europe. Good old

1026
01:04:36,460 –> 01:04:40,140
fashioned American cussedness. The Patrick Henry type that we have a

1027
01:04:40,140 –> 01:04:43,810
strain of. I’ve talked about that on past podcasts. A strain of that runs in

1028
01:04:43,810 –> 01:04:47,570
our national character quite strongly and of

1029
01:04:47,570 –> 01:04:50,690
course came out during COVID uncritically, but did come out.

1030
01:04:52,050 –> 01:04:55,090
This is about humans, human leaders

1031
01:04:56,770 –> 01:04:59,650
looking at the people who are working for them

1032
01:05:00,850 –> 01:05:04,610
and saying, we’re not going to have any Eichmanns here and it doesn’t

1033
01:05:04,610 –> 01:05:08,050
matter if everybody else goes off the cliff with the AIs,

1034
01:05:08,450 –> 01:05:11,710
which is what’s going to happen in the future. It is going to happen. Someone’s

1035
01:05:11,710 –> 01:05:15,470
going to go off the cliff with the, with the algorithms. The ones

1036
01:05:15,470 –> 01:05:18,910
who don’t go off the cliff with the algorithms will be led by leaders

1037
01:05:19,070 –> 01:05:22,470
who will do two things. One, combine a

1038
01:05:22,470 –> 01:05:26,190
thoughtful bureaucrat to monitor the AI

1039
01:05:26,670 –> 01:05:30,030
and back check it. But then number two,

1040
01:05:30,510 –> 01:05:34,110
will also override both the thoughtful

1041
01:05:34,110 –> 01:05:37,480
bureaucrat who can be convinced of something and

1042
01:05:37,480 –> 01:05:41,320
override the AI and say, kind

1043
01:05:41,320 –> 01:05:44,920
of like in the great submarine movie Crimson Tide

1044
01:05:45,880 –> 01:05:49,480
or, you know, the Hunt for Red October. No,

1045
01:05:49,880 –> 01:05:53,600
we’re not going to launch the nukes. We’re

1046
01:05:53,600 –> 01:05:57,280
just not. Well, what if we die? Well, then I

1047
01:05:57,280 –> 01:05:59,000
guess we die. It’s been glorious.

1048
01:06:01,400 –> 01:06:04,870
I’ll see you all on the other side, I guess. Or maybe not.

1049
01:06:04,870 –> 01:06:08,510
I hope you’re praying up, even tithing like good

1050
01:06:08,510 –> 01:06:12,270
Catholics. But this is the posture

1051
01:06:12,270 –> 01:06:15,750
that we have to take as human leaders, I think, moving forward into our

1052
01:06:15,750 –> 01:06:19,110
AI driven future. Because when we have the rule of nobody from

1053
01:06:19,110 –> 01:06:22,830
nowhere, to paraphrase from Matthew Crawford, we don’t want

1054
01:06:22,830 –> 01:06:26,230
to have people in those bureaucratic positions who are like

1055
01:06:26,230 –> 01:06:30,030
Winston in 1984, who will just comply and say, two and

1056
01:06:30,030 –> 01:06:33,130
two is five, no matter what their eyes actually see.

1057
01:06:35,450 –> 01:06:39,050
There’s a. There’s a fundamental problem that I think we have too, as leaders.

1058
01:06:39,530 –> 01:06:43,050
So, like, okay, so let’s say. Let’s say you’re. You’re trying to build

1059
01:06:43,130 –> 01:06:46,930
an organization that gives people that autonomy to

1060
01:06:46,930 –> 01:06:50,730
say no, right? Like, you’re, you’re looking at it from a. You know,

1061
01:06:51,130 –> 01:06:54,970
again, you’ve seen this a thousand times where people will say,

1062
01:06:54,970 –> 01:06:58,050
well, I’m the kind of owner of a company, or I’m the kind of president

1063
01:06:58,050 –> 01:07:01,580
or a leader of a company, whatever. People can come to me with any idea,

1064
01:07:01,580 –> 01:07:05,100
no idea is a dumb idea, I’ll listen, blah, blah, blah, Great.

1065
01:07:05,820 –> 01:07:09,300
If I, if I tell people to do something, I want them to push back,

1066
01:07:09,300 –> 01:07:11,660
if they, if they feel compelled to do so, great.

1067
01:07:12,860 –> 01:07:16,620
That. That’s not the challenge. Saying that is easy, right?

1068
01:07:17,660 –> 01:07:21,460
When it happens, your reaction is

1069
01:07:21,460 –> 01:07:25,180
the hard part. Because here’s what I’ve

1070
01:07:25,180 –> 01:07:28,100
experienced in my lifetime with, with leaders who were

1071
01:07:30,580 –> 01:07:33,860
saying that they accept all of what we’re talking about,

1072
01:07:34,820 –> 01:07:38,660
what they never, what they have a hard time learning. And I, and I,

1073
01:07:38,660 –> 01:07:42,020
I, I would love to hear your thoughts on this and where we can learn.

1074
01:07:42,100 –> 01:07:45,820
How do you learn how to be better at this? When

1075
01:07:45,820 –> 01:07:49,500
that person tells you the idea, how do you make

1076
01:07:49,500 –> 01:07:52,740
them feel like it’s not a dumb idea when you’re not going to do it?

1077
01:07:52,740 –> 01:07:56,170
Like, so you’re. I want you to give me all your ideas. I know. No

1078
01:07:56,170 –> 01:07:59,890
idea is a dumb idea, right? Okay, Jesan, you got this great idea. And I’m

1079
01:07:59,890 –> 01:08:02,330
thinking in the back of my head, this guy’s a complete idiot. What is he

1080
01:08:02,330 –> 01:08:05,850
talking about? But I can’t say that to him, right? Because I want this culture

1081
01:08:05,850 –> 01:08:09,409
in my company of willingness to say no, willingness to push back,

1082
01:08:09,409 –> 01:08:13,050
willingness to come with ideas to the table, all that stuff that

1083
01:08:13,050 –> 01:08:16,890
we’re just talking about. And the first words out of my mouth are, you

1084
01:08:16,890 –> 01:08:19,560
know what? Hey, Jesan, that sounds like a great idea, but. And as soon as

1085
01:08:19,560 –> 01:08:23,200
you say “but,” they shut down. They’re like, I’m out. I’m done. I’m not

1086
01:08:23,200 –> 01:08:26,280
listening. Because you don’t. You don’t really think my idea is good. You don’t really

1087
01:08:26,280 –> 01:08:28,200
like me. You don’t. You don’t want to hear me. You don’t want to listen.

1088
01:08:28,200 –> 01:08:31,160
You’re not really listening. Like, they go off on this

1089
01:08:31,800 –> 01:08:35,600
tangent in their own head. Oh, yeah, how do we not do that

1090
01:08:35,600 –> 01:08:39,280
as a leader? How do we take that environment that

1091
01:08:39,280 –> 01:08:43,040
we want to build, that we all want to build? Well, I would

1092
01:08:43,040 –> 01:08:46,130
imagine most people listening to this podcast, that’s why they’re listening to it. They want

1093
01:08:46,130 –> 01:08:48,610
to hear that. Good lessons for leaders.

1094
01:08:49,570 –> 01:08:53,410
I’ve never found anybody. Now, I’m not excluding myself, by the

1095
01:08:53,410 –> 01:08:57,050
way, because I try to have that open dialogue and that openness all the

1096
01:08:57,050 –> 01:09:00,850
time. But I am one of those people that’ll say, hey, listen. And

1097
01:09:00,850 –> 01:09:04,610
I tried. I think I’m being good about it, right? And

1098
01:09:04,610 –> 01:09:08,130
I can see it right in their face. I watch the micro

1099
01:09:08,130 –> 01:09:11,749
expressions in their face; they look definitely defeated. And I don’t know what else to

1100
01:09:11,749 –> 01:09:14,989
say to them when I say, you know what, hey, Jesan, that’s a great idea,

1101
01:09:14,989 –> 01:09:18,669
but we’re going to go a little bit different direction because, you know,

1102
01:09:18,669 –> 01:09:22,269
the, the research has shown this, studies have shown us that

1103
01:09:22,269 –> 01:09:25,989
whatever. And, like, so we’re going to go in this direction now on

1104
01:09:25,989 –> 01:09:29,709
a rare occasion where somebody says, but, Tom, the research is wrong,

1105
01:09:29,789 –> 01:09:33,549
you’re looking at, you know, go check this and go check that. Okay? But that

1106
01:09:33,789 –> 01:09:37,440
doesn’t happen often. I think what we’re

1107
01:09:37,440 –> 01:09:41,080
asking people to do is learn and have the ability to say no twice.

1108
01:09:41,560 –> 01:09:45,240
Like, at least twice. Because you say no once and you get

1109
01:09:45,240 –> 01:09:48,600
pushed back from the leadership and you become defeated and walk away.

1110
01:09:49,320 –> 01:09:53,040
If you’re willing to say no twice and stand your ground on your moral

1111
01:09:53,040 –> 01:09:56,560
compass or whatever it is. We’re talking about

1112
01:09:56,560 –> 01:10:00,240
your thought process, your own research you did. Like, you went and

1113
01:10:00,240 –> 01:10:04,030
you found something for the company or whatever. If you

1114
01:10:04,030 –> 01:10:07,870
can say no twice. And it makes me now really check

1115
01:10:07,870 –> 01:10:11,430
my theory and philosophy about running the company, being a good leader.

1116
01:10:11,990 –> 01:10:15,430
If you say no the second time, I’m like, okay, hold on a

1117
01:10:15,430 –> 01:10:18,429
second now. All right, let me hear the whole thing now, because now

1118
01:10:18,429 –> 01:10:21,270
you got my attention. But leaders don’t.

1119
01:10:22,070 –> 01:10:25,710
It’s not that we don’t have the ability to do that. It’s just that as

1120
01:10:25,710 –> 01:10:29,520
soon as we say no, but, you know, or it’s

1121
01:10:29,520 –> 01:10:32,920
a great idea, but thanks for bringing that to my attention. But as soon as

1122
01:10:32,920 –> 01:10:36,640
we say the word “but,” it’s over. How do we say that to

1123
01:10:36,640 –> 01:10:39,320
them without saying the word “but,” so that we can tell them that we want

1124
01:10:39,320 –> 01:10:42,160
to engage? You say the word “and.”

1125
01:10:46,160 –> 01:10:49,920
Walk me through that. I like, I want to hear. I want to

1126
01:10:49,920 –> 01:10:52,920
hear how. Okay, so here’s how it would sound. So you’re going to bring me

1127
01:10:52,920 –> 01:10:56,740
this wild and crazy idea right tomorrow and you’re going to tell me something

1128
01:10:56,740 –> 01:11:00,500
that’s totally, completely the opposite of the vision or maybe goes

1129
01:11:00,500 –> 01:11:03,220
a different direction from the vision or the goals than what we

1130
01:11:03,220 –> 01:11:06,540
want to have or whatever. And I know what the vision and goals are. I

1131
01:11:06,540 –> 01:11:09,700
know that you don’t have all the information. You bring me the idea

1132
01:11:10,180 –> 01:11:13,300
and I let you talk. That’s the first thing. I created an environment where you

1133
01:11:13,300 –> 01:11:15,740
can actually talk and you feel comfortable to bring it to me in the first

1134
01:11:15,740 –> 01:11:19,300
place. So it’s not a fear based environment. It’s actually a growth environment.

1135
01:11:19,300 –> 01:11:23,150
Okay, so that’s sort of the table stakes,

1136
01:11:23,150 –> 01:11:26,910
right? To begin with, I eliminate “but” from my response.

1137
01:11:26,910 –> 01:11:30,670
I say, Tom, that is a great idea. And I would like you to go

1138
01:11:30,670 –> 01:11:34,350
do some more research on that, find me everything that’s possible about that

1139
01:11:34,350 –> 01:11:37,350
idea and come up with a plan for me for how we can execute on

1140
01:11:37,350 –> 01:11:38,830
this in the next six months.

1141
01:11:42,990 –> 01:11:46,250
I didn’t say I wouldn’t do it. I didn’t say it didn’t fit. I didn’t

1142
01:11:46,250 –> 01:11:49,930
say I wasn’t interested. Even my facial expression, my micro expressions are

1143
01:11:49,930 –> 01:11:53,570
ones of curiosity and are ones of interest. I’m not

1144
01:11:53,570 –> 01:11:57,410
telling you what I know. Because see, here’s the thing most

1145
01:11:57,410 –> 01:11:59,490
leaders fail to understand.

1146
01:12:01,890 –> 01:12:05,290
This is going to be a hard truth coming out of here. But I think

1147
01:12:05,290 –> 01:12:09,050
Hannah Arendt would appreciate this. Most leaders have to

1148
01:12:09,050 –> 01:12:12,770
understand that most people don’t really care what you

1149
01:12:12,770 –> 01:12:16,490
know, they care what they can bring

1150
01:12:16,490 –> 01:12:20,290
to the table. They care very much about how they feel about what they

1151
01:12:20,290 –> 01:12:23,610
can bring to the table. And they care very much about

1152
01:12:24,010 –> 01:12:27,850
advancing themselves by coming to the table. It

1153
01:12:27,850 –> 01:12:31,490
is rare that you are going to find someone who can handle the

1154
01:12:31,490 –> 01:12:35,050
“but” in that sentence and still go away undefeated.

1155
01:12:35,770 –> 01:12:39,410
So since that’s a rarity in the beginning, we have to create the

1156
01:12:39,410 –> 01:12:41,930
table stakes of the environment. This is what I would do if I was advising.

1157
01:12:41,930 –> 01:12:44,070
This is what I would say. We have to create the table stakes of the

1158
01:12:44,070 –> 01:12:47,910
environment where that trust building can begin. And then we have to

1159
01:12:47,910 –> 01:12:51,630
continually build on those building blocks with every single word that we say. And

1160
01:12:51,630 –> 01:12:55,350
so it has to be intentional. So we replace the but with

1161
01:12:55,350 –> 01:12:58,750
an and. We replace the but with an or.

1162
01:12:59,870 –> 01:13:03,710
And here’s the other thing we do when that person comes back

1163
01:13:03,710 –> 01:13:07,430
to us because they will. We

1164
01:13:07,430 –> 01:13:10,980
take seriously what they have brought to us. And we

1165
01:13:10,980 –> 01:13:14,460
say, thank you for bringing this to me and doing the research. You clearly have

1166
01:13:14,460 –> 01:13:17,780
looked at this. This is what I’ve seen. Where are the gaps?

1167
01:13:18,180 –> 01:13:21,820
And now we’re actually collaborating. Now we’re actually being innovative.

1168
01:13:21,820 –> 01:13:24,980
Now we’re actually moving the thing forward at

1169
01:13:25,220 –> 01:13:29,060
scale. This defeats us because at a certain

1170
01:13:29,060 –> 01:13:32,780
point we can’t know everybody and everybody’s personalities. By the

1171
01:13:32,780 –> 01:13:36,100
way, Dunbar’s number says that we can only keep track of about

1172
01:13:36,100 –> 01:13:39,820
150 people. 150 different relationships in our head at any

1173
01:13:39,820 –> 01:13:43,620
given point. Most humans can’t go past that.

1174
01:13:44,420 –> 01:13:48,260
And so scale defeats us, which is why we have to have good lieutenants

1175
01:13:48,500 –> 01:13:52,260
and good captains who are also trained in not saying

1176
01:13:52,740 –> 01:13:54,660
the word but.

1177
01:13:58,340 –> 01:14:00,900
And that’s really hard.

1178
01:14:04,180 –> 01:14:07,990
So I gave you the correct answer. And I also gave you the

1179
01:14:07,990 –> 01:14:10,750
answer that’s really hard, which is why it’s the correct one.

1180
01:14:12,910 –> 01:14:13,870
I get it. And

1181
01:14:16,990 –> 01:14:20,830
Tom’s gotta go. Tom’s got a hard stop here coming up in about three

1182
01:14:20,830 –> 01:14:23,790
minutes. And we didn’t really resolve anything today, but.

1183
01:14:25,870 –> 01:14:28,830
But we did. You said “but.” I’m out.

1184
01:14:30,830 –> 01:14:34,580
We did successfully discuss and talk about some of the

1185
01:14:34,580 –> 01:14:38,260
themes, a couple of the themes anyway, I think the core themes, actually, in

1186
01:14:38,260 –> 01:14:42,020
Eichmann in Jerusalem by Hannah Arendt. I would encourage you to pick this

1187
01:14:42,020 –> 01:14:45,700
book up if you are a leader and read it closely. I would also

1188
01:14:45,700 –> 01:14:49,500
encourage you to watch the video on Hannah Arendt and the

1189
01:14:49,500 –> 01:14:52,780
article that we are going to link to about AI from Matthew Crawford that are

1190
01:14:52,780 –> 01:14:55,780
going to be in the links in the show notes

1191
01:14:56,340 –> 01:15:00,160
below the player of whatever podcast player you are listening to

1192
01:15:00,160 –> 01:15:03,880
this podcast on. And I would like to thank

1193
01:15:03,880 –> 01:15:07,480
Tom Libby for coming on and joining us today. And with that, well,

1194
01:15:09,240 –> 01:15:09,880
we’re out.