Why Don’t We Learn From History by B.H. Liddell Hart w/Tom Libby & Jesan Sorrells
—
00:00 Welcome and Introduction – Why Don’t We Learn From History by B.H. Liddell Hart
01:00 “Cracks in the Human Condition”
09:17 Informal Committees Shape Decisions
11:07 “Psychology, Memory, and Bias”
18:29 “Majority Shapes Right and Wrong”
25:14 “Negotiating Reality in America”
30:30 “Systems, Democracy, and Global Chaos”
35:17 Historical Bias and Emotional Impact
39:29 AI Adoption: Reality vs Expectations
44:20 “AI Essential for Business Success”
48:27 “Pre-WWII Polish Guarantee Debate”
56:42 Post-COVID Education System Concerns
59:43 “Making Poetry and History Alive”
01:06:26 Teaching Complexity in History
01:11:23 “Drunk on Ideas, Not Rote”
01:16:12 “Perspective and Counterfactual Insights”
01:23:39 “The Perils of Intellectual Neglect”
01:24:54 “Emotion, Truth, and Progress”
01:31:28 “Misguided Solidarity and History”
01:39:23 Passion for History and Context
01:41:20 “Cycle of Pride and Conflict”
01:46:57 “Wealth Fails to Ensure Legacy”
—
Opening and closing themes composed by Brian Sanyshyn of Brian Sanyshyn Music.
—
- Pick up your copy of 12 Rules for Leaders: The Foundation of Intentional Leadership NOW on AMAZON!
- Check out the 2022 Leadership Lessons From the Great Books podcast reading list!
—
★ Support this podcast on Patreon ★
- Subscribe to the Leadership Lessons From The Great Books Podcast: https://bit.ly/LLFTGBSubscribe
- Check out HSCT Publishing at: https://www.hsctpublishing.com/.
- Check out LeadingKeys at: https://www.leadingkeys.com/
- Check out Leadership ToolBox at: https://leadershiptoolbox.us/
- Contact HSCT for more information at 1-833-216-8296 to schedule a full DEMO of LeadingKeys with one of our team members.
—
- Leadership ToolBox website: https://leadershiptoolbox.us/.
- Leadership ToolBox LinkedIn: https://www.linkedin.com/company/ldrshptlbx/.
- Leadership ToolBox YouTube: https://www.youtube.com/@leadershiptoolbox/videos
- Leadership ToolBox Twitter: https://twitter.com/ldrshptlbx.
- Leadership ToolBox IG: https://www.instagram.com/leadershiptoolboxus/.
- Leadership ToolBox FB: https://www.facebook.com/L
1
00:00:01,360 –> 00:00:05,000
Hello, my name is Jesan Sorrells, and
2
00:00:05,000 –> 00:00:08,480
this is the Leadership Lessons from the Great Books podcast,
3
00:00:08,720 –> 00:00:12,560
episode number 167,
4
00:00:13,760 –> 00:00:17,600
which puts us at about. Let’s see, how
5
00:00:17,600 –> 00:00:20,560
many episodes is that? Eight.
6
00:00:21,040 –> 00:00:24,640
Yeah. Eight episodes from our 175th
7
00:00:25,120 –> 00:00:28,910
episode. So we should get there by the end of this year. And thank you
8
00:00:28,910 –> 00:00:31,430
all for joining us on this journey. All right,
9
00:00:32,710 –> 00:00:35,990
so on this show, my co host
10
00:00:36,390 –> 00:00:39,670
who’s joining me today, Tom and I have
11
00:00:40,070 –> 00:00:43,830
circled around and around and around, coming back
12
00:00:44,070 –> 00:00:47,750
similar to the Nietzschean Myth of Return, to
13
00:00:47,990 –> 00:00:51,670
a core idea that is reflected in all of the books
14
00:00:51,800 –> 00:00:55,480
that we talk about on this show. And it is
15
00:00:55,480 –> 00:00:58,880
the idea that, and Tom is probably going to say it again today, the more
16
00:00:58,880 –> 00:01:02,680
things change, the more they stay the same.
17
00:01:03,560 –> 00:01:06,920
And this is true. It’s what people in Washington,
18
00:01:06,920 –> 00:01:10,760
D.C. pre Covid used to call a true fact.
19
00:01:10,920 –> 00:01:13,480
I don’t know what they call it now, but it is definitely true,
20
00:01:14,600 –> 00:01:18,430
which is why we keep returning to this idea repeatedly, no matter the
21
00:01:18,430 –> 00:01:21,630
book, no matter the genre, and no matter the author.
22
00:01:22,990 –> 00:01:26,790
In one way, if you’re listening to this, you could conclude that
23
00:01:26,790 –> 00:01:30,150
this fact of return reflects something inherent in the human condition
24
00:01:30,150 –> 00:01:33,989
itself. I think Richard Messing, who
25
00:01:33,989 –> 00:01:37,590
came on and talked with us about Man’s Search for Meaning and Viktor Frankl,
26
00:01:37,590 –> 00:01:41,070
might say that. Right. In another
27
00:01:41,070 –> 00:01:44,470
way, you could say that this fact of noticing and
28
00:01:44,470 –> 00:01:47,950
recognizing such a conclusion reflects the idea that we are
29
00:01:47,950 –> 00:01:51,430
aware, deeply so, of the broken parts and
30
00:01:51,430 –> 00:01:55,190
cracks in the facade of how we address the clearly
31
00:01:55,190 –> 00:01:57,590
broken parts of the human condition.
32
00:01:58,790 –> 00:02:02,590
And it may indicate that noticing the how opens up an
33
00:02:02,590 –> 00:02:06,310
opportunity for all of us as leaders to seek and to
34
00:02:06,310 –> 00:02:09,950
explore and to maybe try to get a glimpse of the
35
00:02:09,950 –> 00:02:13,670
light that lies behind the cracks in the human
36
00:02:13,910 –> 00:02:17,750
condition. So today on the show we are
37
00:02:17,750 –> 00:02:21,470
covering, we’re going to talk about topics from a book
38
00:02:21,470 –> 00:02:25,190
that does its best and the best that it can
39
00:02:25,670 –> 00:02:29,430
to engage in noticing the cracks in the human face of
40
00:02:29,430 –> 00:02:32,710
the problems and challenges of the human condition. And
41
00:02:33,110 –> 00:02:36,910
a book that tries to examine and really talk about what lies behind those cracks
42
00:02:36,910 –> 00:02:40,620
in a coherent fashion. So let’s start on
43
00:02:40,620 –> 00:02:44,380
our journey repeating probably the same things
44
00:02:44,380 –> 00:02:48,140
we’ve repeated before on this show, although there’s always new people joining us,
45
00:02:48,140 –> 00:02:51,820
so it’s always new to you. Today we will
46
00:02:51,820 –> 00:02:55,340
be covering the book and the question
47
00:02:55,500 –> 00:02:59,100
posed by it. Oh, I love that. Why Don’t We Learn
48
00:02:59,100 –> 00:03:02,780
From History by B.H. Liddell
49
00:03:03,180 –> 00:03:07,030
Hart? By the way, my book has a yellow cover.
50
00:03:07,590 –> 00:03:10,470
Your mileage will vary. I do like the yellow cover.
51
00:03:11,670 –> 00:03:15,350
Leaders. The penultimate question of all time, which
52
00:03:15,350 –> 00:03:19,110
is the question of why don’t we learn from history, has nothing
53
00:03:19,110 –> 00:03:21,910
but hard answers to it.
54
00:03:23,830 –> 00:03:27,590
And of course, on my journey through this book,
55
00:03:27,990 –> 00:03:31,420
we will be joined today by our regular co host, Tom Libby.
56
00:03:31,730 –> 00:03:35,490
How you doing today, Tom? Doing well, Jesan. And you know, I gotta be
57
00:03:35,490 –> 00:03:38,850
honest, I’m not surprised in the least that you selected me to help you with
58
00:03:38,850 –> 00:03:41,730
this book because to your point, I think I’ve said
59
00:03:42,530 –> 00:03:45,810
at least two dozen times on this podcast
60
00:03:46,130 –> 00:03:49,570
that phrase that the more things change, the more things stay the same. So I
61
00:03:49,570 –> 00:03:53,370
appreciate, I appreciate you tapping me in on this one because I think we’re
62
00:03:53,370 –> 00:03:57,130
gonna have a lot of fun with the, with this particular subject matter.
63
00:03:57,130 –> 00:04:00,810
So, so appreciate it. Yeah, absolutely. Well, I would encourage everybody to go and
64
00:04:00,810 –> 00:04:04,050
listen to the intro episode where we talk a little bit about
65
00:04:04,690 –> 00:04:06,850
B.H. Liddell Hart as a writer.
66
00:04:08,530 –> 00:04:12,210
He was, he was an interesting guy just
67
00:04:12,210 –> 00:04:15,730
in general. He was part of the generation of
68
00:04:15,730 –> 00:04:18,210
folks that was born in the,
69
00:04:19,570 –> 00:04:23,250
in the late 19th century, came of age
70
00:04:23,410 –> 00:04:26,130
in the early 20th century,
71
00:04:27,310 –> 00:04:30,990
fought in World War I. And just like C.S. Lewis and
72
00:04:30,990 –> 00:04:34,110
J.R.R. Tolkien and Erich Remarque
73
00:04:34,750 –> 00:04:38,550
and other notable names like T.E. Lawrence, Lawrence of
74
00:04:38,550 –> 00:04:41,150
Arabia, Winston Churchill,
75
00:04:42,589 –> 00:04:46,270
and of course, your friend and mine, Adolf Hitler,
76
00:04:48,590 –> 00:04:52,350
he learned all kinds of interesting lessons
77
00:04:52,350 –> 00:04:55,750
about history, B.H. Liddell Hart did, and about human
78
00:04:55,750 –> 00:04:59,590
nature in the trenches of the Somme
79
00:04:59,670 –> 00:05:03,150
and at Verdun. And we
80
00:05:03,150 –> 00:05:06,750
explore a lot of that history on that intro episode. So I would
81
00:05:06,750 –> 00:05:10,070
encourage you to, to go and take a listen to it. Now. We’re not going
82
00:05:10,070 –> 00:05:13,750
to dive so much into Liddell Hart as a person today
83
00:05:13,750 –> 00:05:16,710
on the show. Instead, we’re really going to focus
84
00:05:16,710 –> 00:05:20,190
on the ideas in the book. And when you open up Why Don’t We
85
00:05:20,190 –> 00:05:23,750
Learn From History, what you see. And I mentioned this in the intro
86
00:05:23,750 –> 00:05:27,350
episode as well. What you see is that the
87
00:05:27,350 –> 00:05:30,950
preface was written by his son, which is great. The version that I have,
88
00:05:30,950 –> 00:05:34,790
Adrian J. Liddell Hart, who actually made a name for himself in
89
00:05:34,790 –> 00:05:37,950
World War II and in the military as well.
90
00:05:38,510 –> 00:05:41,190
And when you open up the table of contents, you see that the book is
91
00:05:41,190 –> 00:05:44,270
divided into three parts, right? There’s
92
00:05:44,670 –> 00:05:48,030
History and Truth, which is a great way to start a book
93
00:05:48,480 –> 00:05:52,280
entitled Why Don’t We Learn From History; Government and Freedom; and then
94
00:05:52,280 –> 00:05:56,120
War and Peace. And so as a military historian, you would think that he would
95
00:05:56,120 –> 00:05:59,520
start with War and Peace first, but no, no, he saves that for the end.
96
00:06:00,160 –> 00:06:02,640
And each one of the
97
00:06:03,520 –> 00:06:07,120
sections features a short essay, usually no more than four
98
00:06:07,120 –> 00:06:10,800
pages, where he lays out his ideas very succinctly.
99
00:06:11,040 –> 00:06:14,080
And this book really, Tom, really
100
00:06:15,570 –> 00:06:19,370
sort of puts me in mind of the essay
101
00:06:19,370 –> 00:06:23,050
by George Orwell that we covered where he talked about English literature.
102
00:06:23,050 –> 00:06:26,890
Right. It’s, it’s sort of the clearest example of clear writing that I’ve
103
00:06:26,890 –> 00:06:30,649
seen in a while. And, and partially that’s because they’re both, you
104
00:06:30,649 –> 00:06:33,570
know, they’re both English. They both came out of the English, you know, writing,
105
00:06:35,570 –> 00:06:39,410
thinking, literature structure that was built in Europe.
106
00:06:39,410 –> 00:06:43,220
Right. So they’re products of that. They’re products of that European and
107
00:06:43,220 –> 00:06:46,860
English tradition that goes back, you know, well over
108
00:06:46,940 –> 00:06:50,300
1500 years of just getting clarity in writing.
109
00:06:52,220 –> 00:06:55,580
And as a military historian, Liddell Hart really pursued
110
00:06:56,140 –> 00:06:59,860
clarity in thinking and clarity in writing. And so when you read
111
00:06:59,860 –> 00:07:03,500
these essays, he doesn’t mince words. There’s no
112
00:07:03,500 –> 00:07:05,660
fat in anything in here.
113
00:07:07,670 –> 00:07:11,350
So I’m going to open up with
114
00:07:11,670 –> 00:07:15,150
the section Restraints of Democracy and Power
115
00:07:15,150 –> 00:07:18,990
Politics in a Democracy. And this essay is in Government
116
00:07:18,990 –> 00:07:22,390
and Freedom, which is in the second
117
00:07:22,710 –> 00:07:26,550
section of his essays. And each one of the
118
00:07:26,550 –> 00:07:29,350
essays does build on the other one. So we’re going to pull them out separately,
119
00:07:29,350 –> 00:07:32,390
we’re going to talk about them in relation to why don’t we learn from history?
120
00:07:32,550 –> 00:07:36,310
But there’s lessons for leaders in all of this, you know, whether you’re a
121
00:07:36,310 –> 00:07:40,060
civic leader, leader of a nonprofit, leader of a for profit, or
122
00:07:40,060 –> 00:07:42,100
even just a leader of your family or community.
123
00:07:43,700 –> 00:07:47,380
So let’s go to the book Restraints
124
00:07:47,380 –> 00:07:51,020
of Democracy, and I quote, we
125
00:07:51,020 –> 00:07:54,300
learn from history that democracy has commonly put a premium on
126
00:07:54,300 –> 00:07:58,100
conventionality. By its nature, it prefers those who keep
127
00:07:58,100 –> 00:08:01,940
in step with the slowest march of thought and frowns on
128
00:08:01,940 –> 00:08:05,540
those who may disturb, quote, unquote, the conspiracy for mutual
129
00:08:05,540 –> 00:08:09,260
inefficiency. Thereby, this system
130
00:08:09,260 –> 00:08:12,820
of government tends to result in the triumph of mediocrity and
131
00:08:12,820 –> 00:08:16,380
entails the exclusion of first-rate ability, if this is combined
132
00:08:16,380 –> 00:08:20,180
with honesty. But the alternative to it,
133
00:08:20,180 –> 00:08:23,580
despotism, almost inevitably means the triumph of
134
00:08:23,660 –> 00:08:27,420
stupidity. And of the two evils, the former
135
00:08:27,660 –> 00:08:31,420
is less. Hence it is better that ability should
136
00:08:31,420 –> 00:08:35,220
consent to its own sacrifice and subordination to the regime of
137
00:08:35,220 –> 00:08:38,660
mediocrity rather than assist in establishing a regime where,
138
00:08:38,900 –> 00:08:42,340
in the light of past experience, brute stupidity will be enthroned
139
00:08:42,820 –> 00:08:46,340
and ability may only preserve its footing at the price of
140
00:08:46,340 –> 00:08:49,780
dishonesty. What is of value
141
00:08:50,020 –> 00:08:53,700
in England and America, and what is worth defending, is
142
00:08:53,700 –> 00:08:57,300
its tradition of freedom, the
143
00:08:57,300 –> 00:09:01,020
guarantee of its vitality. Our civilization, like the Greek, has,
144
00:09:01,020 –> 00:09:04,620
for all its blundering way, taught the value of freedom, of criticism of authority,
145
00:09:04,620 –> 00:09:08,460
and of harmonizing this with order. Anyone who urges a different
146
00:09:08,460 –> 00:09:11,780
system for efficiency’s sake is betraying the vital
147
00:09:11,780 –> 00:09:15,580
tradition. Then he switches to Power
148
00:09:15,580 –> 00:09:19,020
Politics in a Democracy. And I want to point this out.
149
00:09:19,340 –> 00:09:23,070
He says this talking about how decisions get
150
00:09:23,070 –> 00:09:26,830
made in a democratic government: While committee meetings are
151
00:09:26,830 –> 00:09:30,390
not so frequently held in the late afternoon as in the morning, dinner
152
00:09:30,390 –> 00:09:34,070
itself provides both an opportunity and an atmosphere suited to the informal kind of committee
153
00:09:34,230 –> 00:09:37,749
that tends to be more influential than those which are formally constituted.
154
00:09:38,150 –> 00:09:41,430
The informal type is usually small, and the smaller it is, the more
155
00:09:41,430 –> 00:09:45,150
influential it may be. The two or three gathered
156
00:09:45,150 –> 00:09:48,950
together may outweigh a formal committee of 20 or 30 members, to which
157
00:09:48,950 –> 00:09:52,710
it may often be related under the blanket, where it is assembled by
158
00:09:52,710 –> 00:09:56,550
someone who has a leading voice in a larger official committee. For it
159
00:09:56,550 –> 00:10:00,310
will represent his personal selection in the way of consultants. And
160
00:10:00,310 –> 00:10:03,990
its members being chosen for their congeniality as well as for their advisory
161
00:10:03,990 –> 00:10:07,470
value is likely to reach clear cut conclusions, which in turn may be
162
00:10:07,470 –> 00:10:11,110
translated into the decisions of a formal committee. For at any
163
00:10:11,110 –> 00:10:14,590
gathering of 20 or 30 men there is likely to be so much diversity
164
00:10:14,830 –> 00:10:18,390
and nebulosity of views that the consent of the majority can
165
00:10:18,390 –> 00:10:21,730
generally be gained for any conclusion that is sufficiently definite and
166
00:10:21,880 –> 00:10:25,480
impressively backed by well considered arguments and sponsored by a heavyweight
167
00:10:25,480 –> 00:10:28,760
member, especially if the presentation is
168
00:10:28,760 –> 00:10:32,360
carefully stage managed.
169
00:10:36,840 –> 00:10:37,640
I love that
170
00:10:40,440 –> 00:10:43,280
we don’t have to talk about power politics because we, we don’t like to talk
171
00:10:43,280 –> 00:10:47,080
about that. But this still applies, especially in the
172
00:10:47,080 –> 00:10:50,720
current political landscape. What you just
173
00:10:50,720 –> 00:10:54,380
read could probably blow up half the people’s brains in the country right
174
00:10:54,380 –> 00:10:58,060
now. Well, so let’s open
175
00:10:58,060 –> 00:11:00,820
up with that. The very first question that I have from here,
176
00:11:01,860 –> 00:11:04,420
Tom, why don’t we learn from history?
177
00:11:05,940 –> 00:11:09,620
Get right into it. Good lord. So I,
178
00:11:09,620 –> 00:11:12,900
I think there’s so many factors here and I, I. One thing that I will
179
00:11:12,900 –> 00:11:16,580
say, and I know we’re not going to get into Liddell’s life per
180
00:11:16,580 –> 00:11:20,110
se, but I wonder how
181
00:11:20,110 –> 00:11:23,910
much more impact his
182
00:11:23,910 –> 00:11:27,430
book would have had if he had today’s access to
183
00:11:27,430 –> 00:11:31,070
psychology and like the psychological research, like some of the
184
00:11:31,070 –> 00:11:34,630
psychological research behind some of the stuff that he talks about is like,
185
00:11:35,190 –> 00:11:38,630
it’s actually because of his book, I’ve seen several
186
00:11:39,590 –> 00:11:40,950
research
187
00:11:44,870 –> 00:11:48,470
papers done basically because of this. Right. So in,
188
00:11:48,630 –> 00:11:52,430
so to go back to your, to your point and, and I don’t know what
189
00:11:52,430 –> 00:11:55,830
the technical terms for them are, but like there is something to be said about,
190
00:11:55,830 –> 00:11:57,510
like, about
191
00:11:59,670 –> 00:12:03,510
memory bias, right? So like, so we sometimes don’t
192
00:12:03,510 –> 00:12:07,190
learn from history because quite honestly we’re
193
00:12:07,190 –> 00:12:10,790
very biased to the history that we read, right? So take, I mean
194
00:12:10,790 –> 00:12:14,560
US History is a very good example of this when you go and you start
195
00:12:14,560 –> 00:12:16,080
reading, if you were to read.
196
00:12:18,240 –> 00:12:20,480
And the other part of it too, and I think he talks about it a
197
00:12:20,480 –> 00:12:24,000
little bit in the book where the, because the victor
198
00:12:24,000 –> 00:12:27,760
usually writes the majority of passages when it comes
199
00:12:27,760 –> 00:12:31,360
to whatever your subject matter is here, whether it’s
200
00:12:31,600 –> 00:12:35,040
World War One, World War Two, it could be the cola wars for all I care.
201
00:12:35,120 –> 00:12:38,880
It doesn’t matter. Like whatever, whatever conflict
202
00:12:38,880 –> 00:12:42,580
or situation that you’re talking about, it’s usually, to
203
00:12:42,580 –> 00:12:46,180
the victor goes the spoils, right? So somebody who wins that fight or
204
00:12:46,180 –> 00:12:50,020
wins that race or wins that whatever is going to write.
205
00:12:50,020 –> 00:12:53,860
You’re going to pay more attention to their writings. Therefore, you’re going
206
00:12:53,860 –> 00:12:57,460
to see the results of the victory and you’re not going to, you have a
207
00:12:57,460 –> 00:13:01,020
very, you have a very, very conscious bias of what
208
00:13:01,020 –> 00:13:04,820
history looks like. So you tend to, to not worry about.
209
00:13:04,980 –> 00:13:08,740
Good example. Again, you are talking about the current political landscape that
210
00:13:08,740 –> 00:13:12,220
we’re talking about. Everybody. You can go back in history
211
00:13:12,220 –> 00:13:16,060
and pinpoint times in history where you can say
212
00:13:16,300 –> 00:13:19,420
Hitler gained control amid
213
00:13:19,900 –> 00:13:23,380
the political catastrophe that
214
00:13:23,380 –> 00:13:27,140
happened in Germany. We view it now as
215
00:13:27,140 –> 00:13:30,780
a political catastrophe. Well, guess what? Because all the Allies
216
00:13:31,260 –> 00:13:34,780
wrote all the history, right? So but
217
00:13:35,350 –> 00:13:39,070
at the moment and in the time frame of the, in Germany
218
00:13:39,070 –> 00:13:42,670
when all that was happening, the, the people in the moment did not view this
219
00:13:42,670 –> 00:13:46,390
as necessarily a bad thing. Were there people that were
220
00:13:46,390 –> 00:13:50,150
like, hey, wait a second, should this really be happening? Maybe,
221
00:13:50,150 –> 00:13:53,990
but they, their voice was never heard because a vast
222
00:13:53,990 –> 00:13:57,670
majority of people, he was a charismatic speaker, people followed him,
223
00:13:57,670 –> 00:14:01,350
et cetera, et cetera. And nobody ever saw it coming, so to
224
00:14:01,350 –> 00:14:04,890
speak. Yeah, everybody thought they could do
225
00:14:05,050 –> 00:14:08,410
a deal with that guy. I mean even, even
226
00:14:09,850 –> 00:14:13,650
not Joseph Kennedy. Charles Lindbergh,
227
00:14:13,650 –> 00:14:17,410
who ran for President. Henry Ford, right. Who wanted
228
00:14:17,410 –> 00:14:21,130
to. Not wanted to, but helped. I
229
00:14:21,130 –> 00:14:24,930
believe it was either Mercedes or it
230
00:14:24,930 –> 00:14:28,210
might have been IG Farben. I can’t remember who he helped out, but he went
231
00:14:28,210 –> 00:14:30,890
over to, he did, he went over to Germany and he helped him set up
232
00:14:31,150 –> 00:14:34,950
factories, like, for the purposes of re-industrialization after
233
00:14:34,950 –> 00:14:38,110
the, to your point, the disaster of the Weimar Republic and inflation.
234
00:14:38,430 –> 00:14:42,190
Right. But Henry Ford was another guy who thought,
235
00:14:42,270 –> 00:14:45,950
yeah, okay, you know, we can, we can deal with this guy. Even Joe
236
00:14:45,950 –> 00:14:49,790
Kennedy, John Kennedy’s and Ted Kennedy’s dad,
237
00:14:50,270 –> 00:14:54,110
Joe Kennedy got in trouble. I believe he was the ambassador
238
00:14:54,190 –> 00:14:57,870
to England from America during
239
00:14:57,870 –> 00:15:00,830
the Roosevelt administration. He got in trouble
240
00:15:01,990 –> 00:15:05,710
just before the war kicked off by basically saying, hey, you know what?
241
00:15:05,710 –> 00:15:09,350
This Hitler guy, he’s not terrible. Like, we
242
00:15:09,350 –> 00:15:12,830
could probably do a deal with him. It’s fine. Hell, Stalin did a deal with
243
00:15:12,830 –> 00:15:16,510
Hitler. So, so what is the rest of the world looking at our
244
00:15:16,510 –> 00:15:20,270
current administration as? And there’s. There are people in our
245
00:15:20,270 –> 00:15:24,110
country that think that we’re watching history repeat itself as
246
00:15:24,110 –> 00:15:27,910
we speak now. There are. And I want to, I want
247
00:15:27,910 –> 00:15:30,710
to talk a little about that today, because reading this book in the context of
248
00:15:30,710 –> 00:15:34,540
the current political climate that we are in was extremely
249
00:15:34,540 –> 00:15:37,460
interesting. Yeah. There’s something that I, I think
250
00:15:38,260 –> 00:15:41,980
there’s stark differences that. Oh, yeah, what was in place? Like, we have
251
00:15:41,980 –> 00:15:45,540
some checks and balances and we have things that like that are, that our government.
252
00:15:45,940 –> 00:15:49,700
I, I don’t see, I don’t see our current
253
00:15:49,700 –> 00:15:53,460
administration turning into Hitler, but there are people in our country that think that
254
00:15:53,860 –> 00:15:57,580
there are. There are. Correct. Right. Because. Because some of the signs and stuff are
255
00:15:57,580 –> 00:16:01,150
there. But, but again, where Germany didn’t have checks and
256
00:16:01,150 –> 00:16:04,790
balances, we do. The signs can be there all they want as long
257
00:16:04,790 –> 00:16:08,110
as our government operates as the way that they’re supposed to
258
00:16:08,110 –> 00:16:11,750
operate. We don’t, we, we’re not going to have that. It’s, it’s,
259
00:16:11,750 –> 00:16:14,789
it’s also, whether you like him or not is not my point. I could care
260
00:16:14,789 –> 00:16:17,190
less whether you like him or not. That’s not, that’s not what I’m getting at
261
00:16:17,190 –> 00:16:20,190
here. Yeah, that’s not what we’re getting at here. What we’re talking about also. And
262
00:16:20,190 –> 00:16:24,030
this gets to, to what I just read there. So the
263
00:16:24,030 –> 00:16:26,990
ways in which most European
264
00:16:27,890 –> 00:16:31,570
governments were set up even after World War I,
265
00:16:31,570 –> 00:16:34,610
everything cracked apart after World War I, but
266
00:16:35,490 –> 00:16:38,930
certain ways still struggled on. Even 80 years
267
00:16:39,250 –> 00:16:42,770
later in our time, there’s still evidence of this.
268
00:16:43,010 –> 00:16:46,770
So Europe and England and specifically
269
00:16:47,010 –> 00:16:48,610
European countries like France,
270
00:16:51,250 –> 00:16:54,910
Russia, not so
271
00:16:54,910 –> 00:16:58,430
much Spain, although you could throw Spain in there. Italy for sure.
272
00:16:58,510 –> 00:17:02,270
And of course, Germany come out of a
273
00:17:02,350 –> 00:17:05,230
concept or have a, have an inbuilt concept of
274
00:17:05,470 –> 00:17:08,910
aristocratic rule that we don’t have.
275
00:17:10,350 –> 00:17:14,190
We explicitly rejected that. And so
276
00:17:14,350 –> 00:17:18,190
in the United States, our founding explicitly rejects aristocratic rule.
277
00:17:18,830 –> 00:17:22,450
This is where the, and these
278
00:17:22,450 –> 00:17:25,690
people, maybe they had a point, maybe they didn’t, but the protesters this summer
279
00:17:25,690 –> 00:17:29,250
were talking about no kings or whatever. Like. Well,
280
00:17:29,570 –> 00:17:33,330
I mean, if you look at the Constitution and if you look at the three,
281
00:17:34,610 –> 00:17:38,210
the three branches of government, like they’re
282
00:17:38,210 –> 00:17:42,010
functioning exactly constitutionally as they should be. You just don’t like
283
00:17:42,010 –> 00:17:45,770
the decisions that they’re making. Which is why you
284
00:17:45,770 –> 00:17:49,490
get to vote. That’s
285
00:17:49,490 –> 00:17:53,200
why, like we were talking about with Charlie Kirk’s assassination
286
00:17:53,200 –> 00:17:56,680
and around that, this is why freedom of speech matters,
287
00:17:56,920 –> 00:18:00,320
which by the way, we also don’t have a tradition of in a European context.
288
00:18:00,320 –> 00:18:03,400
And so how decisions get made in an
289
00:18:03,400 –> 00:18:07,240
aristocratic, with an aristocratic mindset, in an aristocratic
290
00:18:07,240 –> 00:18:11,080
manner is fundamentally different than how decisions get made, and this is
291
00:18:11,080 –> 00:18:14,720
the point that Liddell Hart’s making, in a more democratic mindset. So when he
292
00:18:14,720 –> 00:18:18,440
talks about a small cadre of people making decisions and then basically stage
293
00:18:18,440 –> 00:18:22,020
managing them for 20 or 30 other people, that comes, and
294
00:18:22,020 –> 00:18:25,300
Liddell Hart was English, out of a specific aristocratic
295
00:18:25,300 –> 00:18:28,780
mindset. Yeah, yeah. That we don’t have.
296
00:18:29,340 –> 00:18:32,860
So, so as I started this conversation where I said
297
00:18:33,100 –> 00:18:36,860
I wish Liddell had access to some of the research from
298
00:18:36,860 –> 00:18:40,580
psychology, which has come leaps and bounds over the last, you know,
299
00:18:40,580 –> 00:18:44,140
80 years since World War II. So there’s that,
300
00:18:44,220 –> 00:18:47,980
you know, that hindsight bias that we have
301
00:18:48,320 –> 00:18:52,160
because history is written by the victors most times.
302
00:18:52,560 –> 00:18:56,280
And again, we’re getting better at that, but not. Still not great. But then
303
00:18:56,280 –> 00:18:59,960
there you have the other. I, I was once told that the difference
304
00:18:59,960 –> 00:19:03,760
between right and wrong is the majority. And that
305
00:19:03,760 –> 00:19:07,440
really, that really hit me hard too because now you’re saying to me
306
00:19:07,440 –> 00:19:11,040
that I could know that one plus one, we go back to Orwell, right?
307
00:19:11,120 –> 00:19:14,080
I could know that one plus one or two plus two equals four. But you’re
308
00:19:14,080 –> 00:19:16,960
going to say that the majority of people say it’s five. So now it’s just
309
00:19:16,960 –> 00:19:20,180
right. Five is right from now on. But
310
00:19:20,740 –> 00:19:24,180
When it comes to something so linear as
311
00:19:24,180 –> 00:19:27,980
math, maybe you can make arguments against it. But when it comes
312
00:19:27,980 –> 00:19:31,220
to something that’s either opinion or, or
313
00:19:32,500 –> 00:19:36,300
consensus based or like, there’s a lot, there are a
314
00:19:36,300 –> 00:19:40,060
lot of situations where the right thing to do and this
315
00:19:40,060 –> 00:19:43,300
is where we get the whole democratic vote, right? So you’re going to vote, 100
316
00:19:43,300 –> 00:19:47,090
people vote or you know, 200 people vote and, or
317
00:19:47,090 –> 00:19:50,130
100 people vote and 51 of them say this. So we’re just going to do
318
00:19:50,130 –> 00:19:53,810
that. And 49 of them can know damn well that it’s the wrong thing to
319
00:19:53,810 –> 00:19:57,650
do. But the right. The difference between right and wrong is the
320
00:19:57,650 –> 00:20:01,490
majority right. Like, so there’s also some of that that happens throughout the
321
00:20:01,490 –> 00:20:04,810
course of history. And then there’s the final one that
322
00:20:05,290 –> 00:20:07,130
I think of, quite honestly is
323
00:20:09,050 –> 00:20:12,880
there’s a disassociation of time that happens. Right. The further
324
00:20:12,880 –> 00:20:16,160
away from something we get, we become more arrogant that
325
00:20:16,800 –> 00:20:19,800
we can see it, we know it’s happening. We’re not going to make the same
326
00:20:19,800 –> 00:20:23,520
mistakes because we know they’re there, but yet we do because we have a bias
327
00:20:23,920 –> 00:20:27,680
of disassociation of time. And I’ll give you an example of this one.
328
00:20:27,920 –> 00:20:31,720
Now, for those of you who can’t see me on the video here, I’m
329
00:20:31,720 –> 00:20:35,480
not a woman, so I’m not speaking as a woman. But childbirth, to me is
330
00:20:35,480 –> 00:20:39,320
a very good example of this on an individual basis. You’re a parent,
331
00:20:39,320 –> 00:20:42,200
I’m a parent. I don’t know if you spent time in the delivery room with
332
00:20:42,200 –> 00:20:45,640
your wife, but I did. Yeah. I watched what she went through
333
00:20:45,800 –> 00:20:49,440
and I went. Why would anybody do this more than once? Like,
334
00:20:49,440 –> 00:20:52,960
why? Like, honestly, like, the amount of the mental
335
00:20:52,960 –> 00:20:56,520
anguish, the physical anguish, the pain they go through. Like,
336
00:20:57,160 –> 00:21:00,880
Childbirth, to me, is one of the most fascinating things
337
00:21:00,880 –> 00:21:04,440
in the entire natural world. Because
338
00:21:05,530 –> 00:21:08,810
women decide to do this again. Like, Right. I know, I know.
339
00:21:09,210 –> 00:21:12,410
Let me just say this for, for the record. I know,
340
00:21:12,650 –> 00:21:16,330
guys, if we went through that once, we’d be like, hell, no, we’re
341
00:21:16,330 –> 00:21:19,970
done. Nope, because we wouldn’t do that
342
00:21:19,970 –> 00:21:22,810
again. Like, it’s like we don’t have the same mindset. But. But to get back
343
00:21:22,810 –> 00:21:25,810
to Liddell and this. The reason I say it this way is because there’s a
344
00:21:25,810 –> 00:21:29,530
disassociation with time. So when women have. If you ever notice, like,
345
00:21:30,090 –> 00:21:33,630
two years later, they didn’t remember the pain the same
346
00:21:33,630 –> 00:21:36,830
way that you, the observer, do. Right?
347
00:21:37,150 –> 00:21:40,790
Right. So, like. Right. By the way, guys, I’m totally kidding. Because we do
348
00:21:40,790 –> 00:21:44,550
stupid. We still do stupid stuff all the time. And we continue. You
349
00:21:44,550 –> 00:21:47,349
fall off a ladder, you still climb the ladder. I mean, you know, I’m just
350
00:21:47,349 –> 00:21:50,710
saying, like, as an observer, as an observer watching
351
00:21:50,710 –> 00:21:54,510
childbirth, and you think to yourself, why would anybody ever do this again? But
352
00:21:54,590 –> 00:21:57,870
as a woman watches her child grow up and she gets
353
00:21:57,870 –> 00:22:01,670
disassociated from the time of it, she decides to
354
00:22:01,670 –> 00:22:05,470
have another child. And I think that is another symptom of
355
00:22:05,470 –> 00:22:09,190
what we’re talking about here. The longer we go from this, from the.
356
00:22:09,190 –> 00:22:12,710
Again, take this, our current political landscape, to
357
00:22:12,710 –> 00:22:16,550
Hitler. You can make all the associations you want. We feel
358
00:22:16,550 –> 00:22:19,310
like it’s not going to happen again because of the checks and balances that we
359
00:22:19,310 –> 00:22:22,990
have in place. But, but if you’re just a simple observer looking
360
00:22:22,990 –> 00:22:26,590
in and you’re looking at this going, holy crap, it’s happening again
361
00:22:26,590 –> 00:22:30,380
because these people have had so much time in between that they didn’t realize
362
00:22:30,380 –> 00:22:33,580
it. And by the way, go backwards in time. And you can say the same
363
00:22:33,580 –> 00:22:37,300
thing about people like Napoleon, about Hannibal, about, like, just
364
00:22:37,300 –> 00:22:40,900
keep going. Like you. There’s. There’s plenty of instances where that
365
00:22:41,140 –> 00:22:44,980
singular person is that dynamic shift in the
366
00:22:44,980 –> 00:22:48,780
balance of power. So it’s
367
00:22:48,780 –> 00:22:52,620
happened several times. I, Again, I don’t think. I think
368
00:22:52,620 –> 00:22:55,380
we have. I think we figured out some checks and balances. But I could see
369
00:22:55,380 –> 00:22:59,030
this happening again. Look at the current situation in Russia.
370
00:22:59,030 –> 00:23:02,750
Putin. Oh, yeah, right. Like, he, he’s proving that this could
371
00:23:02,750 –> 00:23:06,350
potentially happen again in Russia for sure. Like that. Like, I don’t know if the
372
00:23:06,350 –> 00:23:10,150
Russian government has checks and balances to, to make sure that he doesn’t do that,
373
00:23:10,150 –> 00:23:13,630
but I don’t think they do. They don’t. He’s been. He’s been running things
374
00:23:13,790 –> 00:23:17,470
pretty well. Well, well, you know, pretty consistently, I would say. Well, pretty
375
00:23:17,470 –> 00:23:20,910
consistently. So I’m not claiming to know their political. But I’m just saying, like,
376
00:23:21,150 –> 00:23:24,390
he’s working. I think it’s fairly consistently for the last 30 years there. As an
377
00:23:24,390 –> 00:23:28,060
observer of history, I see this happening again.
378
00:23:28,060 –> 00:23:31,060
That’s, that’s, that’s the point. But it’s the observer. Yeah, but I think. I think.
379
00:23:31,060 –> 00:23:34,460
So between those three things, I think if Liddell had access
380
00:23:34,620 –> 00:23:38,300
to really deep research in those psychological profiles,
381
00:23:38,380 –> 00:23:41,980
I think his book would have been even more impactful. I, I’m not
382
00:23:41,980 –> 00:23:44,700
suggesting it’s not impactful. I. And I, I think it’s really.
383
00:23:45,740 –> 00:23:49,380
I think it’s a really good book. But. And of course, it answers a
384
00:23:49,380 –> 00:23:52,960
question. I think it answers the question pretty well for its time. Yeah.
385
00:23:52,960 –> 00:23:56,760
But think about a guy like Liddell having access to the current psychology
386
00:23:56,840 –> 00:24:00,000
research that we have and how much more impactful he could have been with that
387
00:24:00,000 –> 00:24:02,440
book. So a couple things there.
388
00:24:05,080 –> 00:24:08,600
So we use
389
00:24:08,760 –> 00:24:12,560
history to. And Liddell Hart talks about
390
00:24:12,560 –> 00:24:15,960
this in his book, too. A variation of this. And again, to your point,
391
00:24:16,280 –> 00:24:19,960
the depth of psychological research when he wrote this. And he wrote
392
00:24:19,960 –> 00:24:23,520
it in, I believe it was the. Yeah. Published in
393
00:24:23,520 –> 00:24:27,180
1944. Yeah, exactly. So, you
394
00:24:27,180 –> 00:24:30,460
know, the, the degree to which, to your point, the degree to which psychological research
395
00:24:30,460 –> 00:24:33,940
has come along since 1944 is. Is leaps and
396
00:24:33,940 –> 00:24:36,500
bounds ahead of what he. He had in his time.
397
00:24:38,260 –> 00:24:41,900
But even then he understood something about human nature, which gets
398
00:24:41,900 –> 00:24:45,420
to a couple of different things that we’ve talked about on this show. So we
399
00:24:45,420 –> 00:24:49,220
talked about it in our extra episode where we. Where we discussed the
400
00:24:49,220 –> 00:24:52,020
movie Oppenheimer, which interestingly enough, I watched again last night,
401
00:24:53,820 –> 00:24:57,660
kind of just weirdly lined up with this. And then.
402
00:24:58,300 –> 00:25:02,140
And then we also talked about this in the Orwell episode, not only
403
00:25:02,220 –> 00:25:05,580
on his. In his essay on literature in the English language,
404
00:25:06,140 –> 00:25:09,740
but also in his books, you know, Animal Farm and 1984.
405
00:25:09,740 –> 00:25:13,580
Right. There’s a through line here. Right.
406
00:25:14,060 –> 00:25:17,660
And even books that we’ve covered, a couple books we cover
407
00:25:18,460 –> 00:25:22,230
from Theodore Roosevelt. We covered the
408
00:25:22,230 –> 00:25:24,750
book that he wrote way back when he was a
409
00:25:26,430 –> 00:25:29,950
representative, I think, or a senator in New York State.
410
00:25:30,110 –> 00:25:33,070
He wrote a book about power politics. We talked about that with Libby Younger.
411
00:25:34,910 –> 00:25:37,990
And the through line that Tom is getting to, and I do agree with it,
412
00:25:37,990 –> 00:25:41,830
is this. We have to figure out, and
413
00:25:41,830 –> 00:25:45,430
this is part of the radical experiment of America, but we have to
414
00:25:45,430 –> 00:25:48,800
figure out how to negotiate
415
00:25:48,800 –> 00:25:51,880
reality with each other. So to get back to two plus two equals five
416
00:25:54,360 –> 00:25:58,040
and this, by the way, started happening in the last 10 years here,
417
00:25:58,040 –> 00:26:00,840
because I’m going to go ahead and step on this third rail, because why not?
418
00:26:07,000 –> 00:26:10,360
If you want to say that two plus two equals five in a dynamic
419
00:26:12,210 –> 00:26:15,970
environment where you’re not abrogating my speech, you’re just providing me
420
00:26:15,970 –> 00:26:19,730
consequences for that speech, which is a whole other kind of discussion.
421
00:26:20,770 –> 00:26:23,930
If you want to go ahead and say two plus two equals five, knock yourself
422
00:26:23,930 –> 00:26:26,930
right the hell out. Sure, go ahead.
423
00:26:28,210 –> 00:26:31,570
But I’m going to. I’m going to channel Ben Shapiro here
424
00:26:31,970 –> 00:26:35,010
and I’m going to say facts don’t care about your feelings.
425
00:26:36,690 –> 00:26:40,530
So, like, you could say two plus two equals five all day, but
426
00:26:40,530 –> 00:26:44,330
I have two things. Then I put together two more
427
00:26:44,330 –> 00:26:47,850
things and invariably I’m going to have four things.
428
00:26:48,170 –> 00:26:52,010
Sorry. Like, this is just. This is just reality. This is the
429
00:26:52,010 –> 00:26:55,809
ceiling of, like, your logic. Right. You could feel any way you
430
00:26:55,809 –> 00:26:58,570
want about it. You could feel that it’s. And here we go. I’m going to
431
00:26:58,570 –> 00:27:02,170
step on the. Step on the rail. You could feel that it’s white supremacist.
432
00:27:02,410 –> 00:27:06,180
You could feel that it’s a sign of the patriarchy. You can feel
433
00:27:06,180 –> 00:27:09,900
that it’s oppressive. You can feel that it’s a sign of
434
00:27:09,900 –> 00:27:13,620
systemic oppression. You could feel that it’s a sign of
435
00:27:13,620 –> 00:27:16,420
white fragility. You could feel all of these things.
436
00:27:19,540 –> 00:27:23,340
And I hold up two fingers, and then I
437
00:27:23,340 –> 00:27:26,740
hold up two more fingers and I still only have four
438
00:27:26,740 –> 00:27:30,460
fingers. That’s
439
00:27:30,460 –> 00:27:33,460
it. That’s. That’s it. That’s it. That’s the whole thing. There’s no, there’s no more
440
00:27:33,460 –> 00:27:36,850
argument that I need to. Right. And so this, this has started happening, this
441
00:27:36,850 –> 00:27:40,250
renegotiation of reality that you’re talking about, which is
442
00:27:40,490 –> 00:27:44,130
fine with history. I don’t have a problem with renegotiating
443
00:27:44,130 –> 00:27:47,530
reality with history. And, and Liddell Hart talks a little bit about this in his
444
00:27:47,530 –> 00:27:51,290
book. You know, when myths become stronger than the history, the myth becomes
445
00:27:51,290 –> 00:27:55,050
the history. He, he mentions this, right. And it’s fine. I
446
00:27:55,050 –> 00:27:58,810
have no problem taking a scientific approach. He does. He has a
447
00:27:58,810 –> 00:28:01,770
problem with taking a scientific approach to history. He talks about it in the first
448
00:28:01,770 –> 00:28:05,050
part of his book because he thinks that it drains and denudes it
449
00:28:06,630 –> 00:28:09,790
of all of its emotive value. And he doesn’t think that you should be able
450
00:28:09,790 –> 00:28:13,550
to do that. He thinks you should balance the emotive value with the scientific
451
00:28:13,550 –> 00:28:17,030
pursuit of the truth. I. Okay, I can see his
452
00:28:17,030 –> 00:28:20,830
argument. But I also think that human beings are going
453
00:28:20,830 –> 00:28:23,470
to look for that emotive value and they’re going to look for it in myths. You
454
00:28:23,470 –> 00:28:25,710
can’t take that out of the human because again, that’s one of those, like two
455
00:28:25,710 –> 00:28:29,270
plus two equals four things. It’s one of the things that’s built into human understanding
456
00:28:29,270 –> 00:28:32,970
of reality. But if we want to renegotiate history, sure,
457
00:28:32,970 –> 00:28:36,250
let’s renegotiate history. Let’s go ahead and dig into the
458
00:28:36,250 –> 00:28:39,010
Tuskegee Project. Let’s go ahead and dig into
459
00:28:40,370 –> 00:28:43,970
Black Wall Street in Oklahoma. Let’s go ahead and dig
460
00:28:43,970 –> 00:28:47,730
into the decimation of the Native American
461
00:28:47,730 –> 00:28:51,410
tribes in North America
462
00:28:51,570 –> 00:28:55,250
and in Mexico and in South America. Let’s go ahead and dig into that
463
00:28:55,810 –> 00:28:58,370
and understand that when you do that,
464
00:29:00,140 –> 00:29:03,780
and this is something that we also forget with history, I think, and
465
00:29:03,780 –> 00:29:07,540
it’s notorious in warfare. Understand that when you do
466
00:29:07,540 –> 00:29:11,100
that, the enemy you are fighting or you are opposing
467
00:29:11,820 –> 00:29:14,380
also gets a vote in a free speech society.
468
00:29:16,300 –> 00:29:19,900
So when you start pulling apart people’s myths, they’re going
469
00:29:19,900 –> 00:29:23,540
to have a reaction to that and a response to
470
00:29:23,540 –> 00:29:27,270
that. Now, we would like to keep that response and reaction in the space of
471
00:29:27,270 –> 00:29:30,670
speech. So there’s going to be an argument, there’s going to be a
472
00:29:30,670 –> 00:29:33,110
verbal conflict, there’s going to be a fight.
473
00:29:34,710 –> 00:29:37,950
And I think in the structure that we have in America, that’s fine. We should
474
00:29:37,950 –> 00:29:40,950
be doing that. Now you have to talk about the observer from the outside. If
475
00:29:40,950 –> 00:29:43,750
I’m an observer from the outside looking at this, all this looks like is just
476
00:29:43,750 –> 00:29:47,510
massive chaos, just endless, never ending massive chaos
477
00:29:47,830 –> 00:29:51,390
about things that should just be decided by an aristocratic
478
00:29:51,390 –> 00:29:55,210
cadre of individuals. Everybody gets told what the thing is
479
00:29:55,370 –> 00:29:57,970
and then you just move on. By the way, this is what they do in
480
00:29:57,970 –> 00:30:01,290
communist China. They have myths based on
481
00:30:01,290 –> 00:30:04,970
Confucianism and the Chinese communist government just says, we’ve decided
482
00:30:04,970 –> 00:30:06,970
this, you’re welcome.
483
00:30:09,370 –> 00:30:12,970
Everybody sort of goes, yeah, that’s fine. By the way, in those kinds of systems,
484
00:30:13,450 –> 00:30:17,170
to Orwell’s point, if you’re in a totalizing
485
00:30:17,170 –> 00:30:20,890
system like that, where dissension from something is
486
00:30:20,890 –> 00:30:24,540
not allowed, this is the point from 1984. Yeah, the two
487
00:30:24,540 –> 00:30:28,180
plus two does equal five, sure, in
488
00:30:28,180 –> 00:30:32,020
that system. But guess what? That
489
00:30:32,020 –> 00:30:35,540
system has to negotiate with the reality of other systems in a global environment.
490
00:30:35,940 –> 00:30:39,380
Which is part of the reason why the Internet’s like wreaking havoc
491
00:30:39,380 –> 00:30:43,220
everywhere right now and has for the last 30, almost 40 years
492
00:30:43,220 –> 00:30:47,060
and actually more than 40 years now and will continue to wreak havoc, by the
493
00:30:47,060 –> 00:30:49,820
way, for the next hundred years because we’re not going to be able to get
494
00:30:49,820 –> 00:30:53,370
over this. And then the other thing that you’re talking about in there, so there’s
495
00:30:53,370 –> 00:30:56,690
negotiating reality, then there’s the other thing you’re talking about. There is democracy.
496
00:30:56,930 –> 00:31:00,450
And fundamentally, as an aristocrat,
497
00:31:01,250 –> 00:31:04,970
B.H. Liddell Hart was opposed to democracy, and he’s joined
498
00:31:04,970 –> 00:31:08,650
in good company as a fellow traveler, or he walks as a fellow
499
00:31:08,650 –> 00:31:12,290
traveler alongside Thomas Jefferson, who was opposed to democracy,
500
00:31:12,770 –> 00:31:16,570
and George Washington, who was opposed to democracy, and Ben Franklin who was
501
00:31:16,570 –> 00:31:19,970
opposed to democracy, and John Adams who was opposed to democracy.
502
00:31:20,290 –> 00:31:24,030
These guys wanted, and they built, a republic
503
00:31:24,830 –> 00:31:28,590
because democracies are two wolves and a
504
00:31:28,590 –> 00:31:31,390
sheep voting on who gets to be dinner.
505
00:31:32,590 –> 00:31:36,430
That’s a democracy. Congratulations. 66% of the people,
506
00:31:36,590 –> 00:31:40,030
66% of the entities in that group just voted on dinner
507
00:31:40,750 –> 00:31:44,150
and your vote didn’t count. That’s democracy. Democracy
508
00:31:44,150 –> 00:31:47,990
inevitably degrades to mob rule. We saw this, we see this
509
00:31:47,990 –> 00:31:51,600
throughout history, literally. So you build a republic,
510
00:31:52,000 –> 00:31:55,680
and in a republic, the two wolves,
511
00:31:56,880 –> 00:32:00,560
the two wolves have to send one wolf as a representative and the sheep
512
00:32:00,560 –> 00:32:03,520
gets to send two sheep as representatives. And now we’re going to have a
513
00:32:03,520 –> 00:32:06,320
fight. And that’s how we keep balance in the system.
514
00:32:07,200 –> 00:32:10,880
And so we’ve got all these dynamics which impact how we view
515
00:32:10,880 –> 00:32:14,600
history in America. But to your point, the through line is
516
00:32:14,600 –> 00:32:18,290
about negotiating reality and who gets to negotiate that and who gets to
517
00:32:18,290 –> 00:32:21,730
say what. The other thing too. And through your little
518
00:32:21,730 –> 00:32:25,250
spiel there, I just wanted to jump in and say one
519
00:32:25,250 –> 00:32:28,850
thing before, but I obviously wanted to let you finish. But, but when
520
00:32:28,850 –> 00:32:32,489
you, when you’re referencing the two plus two, and I still have four
521
00:32:32,490 –> 00:32:36,090
and you know, fact. Listen, facts are facts. Right? Like, to your point,
522
00:32:36,410 –> 00:32:39,130
the problem with history is
523
00:32:40,090 –> 00:32:43,650
it’s almost never just the facts, ma’am. Right.
524
00:32:43,650 –> 00:32:47,410
Like, there has to. There’s always human emotion in it. There’s always
525
00:32:47,490 –> 00:32:51,250
like these degrees of champions and like,
526
00:32:51,250 –> 00:32:54,610
like there’s always these. There’s heroism and like we’re
527
00:32:54,610 –> 00:32:58,450
elevating things to make it feel more real or to have
528
00:32:58,450 –> 00:33:02,170
more impact. And it’s never like, if you were to ever
529
00:33:02,170 –> 00:33:05,970
read a history book that was literally just factual information, nobody would read the
530
00:33:05,970 –> 00:33:09,330
whole. Nobody would read it. No, like, no, it’s like, wait,
531
00:33:09,570 –> 00:33:13,330
okay, so World War II started in this date, ended in this date. The
532
00:33:13,330 –> 00:33:17,010
battle of this date was this date, this day, the battle. Like, like. But
533
00:33:17,650 –> 00:33:21,370
because we tie everything into some sort of
534
00:33:21,370 –> 00:33:25,210
emotion, we have to feel an emotion for anything that we’re
535
00:33:25,210 –> 00:33:28,850
involved in, whether it’s reading history, creating history,
536
00:33:29,250 –> 00:33:32,924
looking at the future, etc. Etc. So to your point about the 2
537
00:33:32,976 –> 00:33:36,540
+ 2 always equaling 4. Yes, because there’s no
538
00:33:36,540 –> 00:33:40,380
emotion in that. You can’t invoke emotion into
539
00:33:40,460 –> 00:33:41,980
math. Right.
540
00:33:44,380 –> 00:33:48,220
Like, I remember, I read
541
00:33:48,220 –> 00:33:51,900
recently, somebody was arguing some political.
542
00:33:52,060 –> 00:33:55,780
Blah, blah, blah, whatever. And the person, literally, the, the argument, what
543
00:33:55,780 –> 00:33:58,620
they were talking about was like, listen, you cannot
544
00:33:58,780 –> 00:34:02,590
distort math. And they literally said
545
00:34:02,590 –> 00:34:06,310
exactly what you said. 2 + 2 is always going to equal 4. So if you’re looking
546
00:34:06,310 –> 00:34:09,510
at statistical data that is verified
547
00:34:09,990 –> 00:34:12,870
statistical data, I’m not talking about stuff that we just make up.
548
00:34:13,750 –> 00:34:16,990
First time I ever heard this line, it made me laugh so hard. It’s like,
549
00:34:16,990 –> 00:34:20,310
hey, Tom, did you know that 86% of all statistics are made up? And I
550
00:34:20,310 –> 00:34:23,270
went, there you go, what? Like, I had to stop. And I was like,
551
00:34:23,990 –> 00:34:26,550
that’s made up. Exactly. Right. So, like,
552
00:34:28,630 –> 00:34:32,370
so, but if it’s verified statistical data and
553
00:34:32,370 –> 00:34:35,890
you can say that X number of people voted this way,
554
00:34:35,890 –> 00:34:39,370
X number of people voted that way. So 28% of people
555
00:34:39,370 –> 00:34:43,050
voted this way based on the information they were given. That is statistical.
556
00:34:43,050 –> 00:34:46,690
You cannot change that. It just is what it is,
557
00:34:46,770 –> 00:34:50,010
right? They were trying to make that argument and the guy was still fighting, like,
558
00:34:50,010 –> 00:34:53,810
pushing back, going, no, you can, you can, you can manipulate statistics.
559
00:34:54,130 –> 00:34:57,719
Well, yes, but then they’ve not. Then they’re not validated. They’re not
560
00:34:57,719 –> 00:35:01,359
verifiable statistics at that point. They’re part of the 86% of
561
00:35:01,359 –> 00:35:04,879
statistics that are made up. Like, you have to be able to validate it. If you
562
00:35:04,879 –> 00:35:08,479
can’t validate it, then it’s not, it’s not reasonable to see. So, so
563
00:35:08,479 –> 00:35:10,759
my, my point to all that is that,
564
00:35:11,879 –> 00:35:15,319
yes, mathematical information, things that you can
565
00:35:15,319 –> 00:35:19,039
literally see, two plus two equals four. I, if anybody
566
00:35:19,039 –> 00:35:22,479
tries to argue that, I just think they’re being silly, and all of us are
567
00:35:22,479 –> 00:35:26,110
going to think they’re being silly. But when they start debating on the
568
00:35:26,110 –> 00:35:29,870
merits of historical information that is written by people, that are
569
00:35:29,870 –> 00:35:32,910
written by the winners, so to speak, that are written by the
570
00:35:33,310 –> 00:35:37,030
majority, all of that stuff when
571
00:35:37,030 –> 00:35:40,790
you read that, you feel something when you read that. That’s
572
00:35:40,790 –> 00:35:44,230
another bias that we probably don’t talk about is the emotional
573
00:35:44,230 –> 00:35:47,710
bias that you get when you read some of those things. So therefore,
574
00:35:48,110 –> 00:35:51,830
if you see that the United States and its allies are going to
575
00:35:51,830 –> 00:35:55,390
win World War II based on what you’re reading, and you say, okay, so
576
00:35:55,630 –> 00:35:59,350
why are we trying to prevent World War
577
00:35:59,350 –> 00:36:02,550
III? We know the Allies are going to win. That’s what happens,
578
00:36:02,550 –> 00:36:05,990
right? So we look throughout history, but, like, can we stop for a second? And
579
00:36:05,990 –> 00:36:08,790
let’s, like, not lose the millions of lives that we’re going to lose because of
580
00:36:08,790 –> 00:36:12,550
it. Like, is there a way to circumvent? Right. Just go from point A to
581
00:36:12,550 –> 00:36:15,470
point C instead of, like, can we skip point B,
582
00:36:16,750 –> 00:36:20,570
skip point D. Yeah, don’t learn. That’s what we don’t learn. Right. Well,
583
00:36:20,570 –> 00:36:23,970
and this is, this is. I mean, again, I watched Oppenheimer last night, right?
584
00:36:24,290 –> 00:36:28,010
And you know, it’s interesting. Like, there’s a movie coming out, and we always
585
00:36:28,010 –> 00:36:31,410
end up talking about movies. This is the moment to do this. Right? Now there’s
586
00:36:31,410 –> 00:36:35,090
a movie coming out from Kathryn Bigelow, who directed The Hurt Locker
587
00:36:35,249 –> 00:36:37,810
and a couple other movies on Netflix
588
00:36:38,770 –> 00:36:42,490
about, like, Five Minutes to Nuclear Impact or
589
00:36:42,490 –> 00:36:45,570
something like that. Anyway, it reminds. It’s on. It’s going on Netflix. It’s coming out
590
00:36:45,570 –> 00:36:47,930
in December. I think it’s going to be in the theaters for, like, a brief
591
00:36:47,930 –> 00:36:50,250
minute, and then it’s going to. It’s called House of Fire, I think, or House
592
00:36:50,250 –> 00:36:54,030
of Dynamite or I can’t remember. Anyway, it doesn’t matter what the name
593
00:36:54,030 –> 00:36:57,550
of the movie is. It’s the latest entry into a
594
00:36:57,550 –> 00:37:01,070
genre of films that started, gosh,
595
00:37:01,150 –> 00:37:04,910
back in the 1950s with the movie directed by Stanley Kubrick,
596
00:37:04,910 –> 00:37:08,590
Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb,
597
00:37:09,390 –> 00:37:13,230
where artists who are
598
00:37:13,230 –> 00:37:16,430
attempting to, to your point about emotion, who are attempting to
599
00:37:16,510 –> 00:37:19,480
emote into history, because that’s what artists do do
600
00:37:21,640 –> 00:37:25,080
and are attempting to get the viewer or the observer to engage
601
00:37:25,080 –> 00:37:28,040
emotionally with something, an idea that they have.
602
00:37:29,640 –> 00:37:33,280
There’s a long genre of films that began there and
603
00:37:33,280 –> 00:37:37,080
continues on through the Kathryn Bigelow film, of which Oppenheimer is one of these films
604
00:37:39,000 –> 00:37:42,360
where to your point about World War 3,
605
00:37:42,840 –> 00:37:46,590
it’s that constant drumbeat of warning that
606
00:37:46,590 –> 00:37:49,790
thermonuclear destruction is the worst thing that can possibly ever happen.
607
00:37:50,190 –> 00:37:53,390
It will destroy all of humanity, all civilization, everywhere,
608
00:37:53,950 –> 00:37:57,310
period, full stop. And it is, it is a
609
00:37:57,470 –> 00:38:00,750
drumbeat that runs through history.
610
00:38:01,710 –> 00:38:04,750
And to your point about forgetting, and this is sort of the last thing I
611
00:38:04,750 –> 00:38:06,710
want to say, and then we can go back to the book. But your point
612
00:38:06,710 –> 00:38:10,350
about forgetting, we’ve forgotten how bad World War
613
00:38:10,350 –> 00:38:13,230
II really was because we are distanced from it.
614
00:38:14,160 –> 00:38:18,000
And my concern also is that we have forgotten how
615
00:38:18,000 –> 00:38:20,800
bad a nuclear bomb can be.
616
00:38:22,240 –> 00:38:26,080
Thus you get crazy talk like in the last
617
00:38:26,080 –> 00:38:29,760
administration from certain people about
618
00:38:30,000 –> 00:38:33,840
arming, arming
619
00:38:33,840 –> 00:38:37,520
bases in Europe with nuclear tipped
620
00:38:37,920 –> 00:38:41,640
whatever, and if Russia does this thing
621
00:38:41,640 –> 00:38:45,080
or that thing in the Ukraine, we’re going to use those
622
00:38:45,080 –> 00:38:48,880
nukes. People in the last administration running around saying, saying
623
00:38:49,120 –> 00:38:52,080
nonsense like this. And of course the Russians are responding with,
624
00:38:52,960 –> 00:38:56,800
listen, we lost 60 some odd million people in
625
00:38:56,800 –> 00:39:00,480
what we call the Great Patriotic War and we didn’t miss
626
00:39:00,480 –> 00:39:03,200
a beat. You really want to go ahead, you really want to go ahead and
627
00:39:03,200 –> 00:39:06,880
pull that smoke wagon, you go right on ahead. We got nukes
628
00:39:06,880 –> 00:39:10,680
too. Yeah, exactly. And this is, this is the hubris to your point. This
629
00:39:10,680 –> 00:39:14,240
is reflected even in just allowing that to be said. Number one, it
630
00:39:14,240 –> 00:39:18,080
moves the Overton window on talking about nuclear war. But it also,
631
00:39:19,280 –> 00:39:22,720
it also reflects a level of hubris and arrogance
632
00:39:22,960 –> 00:39:26,360
that can only come from being disassociated in time and
633
00:39:26,360 –> 00:39:30,080
forgetting the lessons of history. Right. Yeah. And by the
634
00:39:30,080 –> 00:39:33,390
way, but just to circle this back into the whole purpose of this
635
00:39:33,550 –> 00:39:36,430
podcast in general, this
636
00:39:37,070 –> 00:39:40,830
nothing that we’re talking about excludes businesses, by the way. No,
637
00:39:41,150 –> 00:39:44,270
we, we’ve seen the same thing. Again,
638
00:39:44,910 –> 00:39:48,710
current environment not excluded. If you think
639
00:39:48,710 –> 00:39:52,110
about, like, I was actually, I literally just read something
640
00:39:52,270 –> 00:39:55,790
earlier, earlier today about this, which I found fascinating because
641
00:39:56,190 –> 00:39:59,790
those of us in the sales and marketing world have been talking recently about
642
00:40:00,390 –> 00:40:04,030
AI kind of plateauing a little bit. Like there’s, there’s been, like
643
00:40:04,030 –> 00:40:07,630
there was this major rush and there was all this talk about
644
00:40:07,630 –> 00:40:10,950
AI replacing people and taking jobs and all this other stuff,
645
00:40:11,270 –> 00:40:15,070
and yet I just saw a statistic, statistic. I just saw
646
00:40:15,070 –> 00:40:18,510
a statistic this morning that said of the
647
00:40:18,510 –> 00:40:22,030
Fortune 500 companies, the average company is only
648
00:40:22,030 –> 00:40:25,030
deploying AI at about a 2% efficiency rate.
649
00:40:26,000 –> 00:40:29,360
How many people you think they’re replacing at IBM because of this?
650
00:40:29,680 –> 00:40:33,240
How many people do you think at Amazon do you think they’re replacing because of
651
00:40:33,240 –> 00:40:36,520
it? It’s just not happening the way we expected it. Now here’s. Let me back
652
00:40:36,520 –> 00:40:39,840
up to my point a second ago. We saw this once before,
653
00:40:40,160 –> 00:40:43,840
folks, the dot com in the late 90s where
654
00:40:44,400 –> 00:40:48,080
investors were just throwing money upon money upon money at
655
00:40:48,480 –> 00:40:51,680
anything that had dot com at the end of it. And some of them lost
656
00:40:51,680 –> 00:40:54,830
their shirts. Some of them did okay. Some of them did all right. Yeah. The
657
00:40:54,830 –> 00:40:58,630
dot com boom didn’t. It didn’t. Just
658
00:40:58,630 –> 00:41:01,750
because there were a few winners did not mean that a,
659
00:41:02,550 –> 00:41:06,190
an immense number of businesses didn’t fail. Yeah. Oh,
660
00:41:06,190 –> 00:41:09,910
yeah. So. So what did that teach us? Apparently
661
00:41:09,910 –> 00:41:13,630
nothing. Nothing. Because, because, because investors are
662
00:41:13,630 –> 00:41:17,030
starting to throw so much money at AI right now, I guarantee you
663
00:41:18,000 –> 00:41:20,880
a majority of them are going to lose their shirts and there are going to
664
00:41:20,880 –> 00:41:24,640
be a few, a select few that come out on top. And for some reason,
665
00:41:24,720 –> 00:41:28,360
we’re going to remember history as the dot com boom and the AI boom as
666
00:41:28,360 –> 00:41:32,120
being successful. How that happens to me is
667
00:41:32,120 –> 00:41:35,920
beyond. Like, again, business does the same stuff.
668
00:41:36,080 –> 00:41:39,800
Like, I feel, I, I’ve been telling people lately, I feel like
669
00:41:39,800 –> 00:41:43,280
I’m a veteran who’s been through like the fourth World War. Like, I feel like
670
00:41:43,280 –> 00:41:45,400
I’ve been, you know, I feel like I’ve been through four. I was going through
671
00:41:45,400 –> 00:41:48,400
the search revolution. Well, no, the first one was the Internet revolution to your point,
672
00:41:48,400 –> 00:41:51,500
with the dot com bubble. Oh, go back, go back one step further. The email.
673
00:41:51,580 –> 00:41:54,700
When email. Oh, God. Oh, yeah, email. Oh my God.
674
00:41:55,260 –> 00:41:58,020
Email’s gonna change the world. And we’re never gonna, we’re not gonna need a postal
675
00:41:58,020 –> 00:42:01,740
service anymore, right, Exactly. We have Amazon that delivers packages literally
676
00:42:01,740 –> 00:42:05,580
daily, like a day to people’s. Anyway, go ahead. Sorry. Yeah,
677
00:42:05,580 –> 00:42:08,860
yeah, sure. Amazon knows exactly where I live
678
00:42:08,940 –> 00:42:12,740
anyway. They can find me. They’ll have no problem
679
00:42:12,740 –> 00:42:16,100
finding me, by the way. So does UPS. Amazon knows where I live,
680
00:42:16,100 –> 00:42:19,780
UPS knows where I live, the postal service, everybody knows where I live now. Like,
681
00:42:19,780 –> 00:42:23,580
I can’t even get away from it at this point anyway. Good. But yeah, you
682
00:42:23,580 –> 00:42:26,540
got email. Take all that away. Email, Internet.
683
00:42:27,100 –> 00:42:30,780
Oh, God. Social media was going to revolutionize how we were going. Remember
684
00:42:30,780 –> 00:42:34,460
that. Like, we were all just going to live in these, like, these like Facebook
685
00:42:34,540 –> 00:42:37,980
pods and the Instagram Pod and the LinkedIn Pod, and we were never going to.
686
00:42:38,220 –> 00:42:41,900
And then, and then VR and AR, which was a brief.
687
00:42:42,140 –> 00:42:45,930
Before that, it was, it was a 3D printing. Oh, yeah. Oh, I haven’t even
688
00:42:45,930 –> 00:42:49,570
gotten to that one yet. 3D printing. Crypto. Oh, my God.
689
00:42:49,570 –> 00:42:53,410
Yeah, crypto. And, and now, now here
690
00:42:53,410 –> 00:42:56,490
we are with AI. And I got to admit
691
00:42:57,290 –> 00:43:00,850
to your point, Tom, I get a little
692
00:43:00,850 –> 00:43:04,570
grizzled and gray when I, when I hear
693
00:43:04,570 –> 00:43:08,250
about it because I’m like. On the one hand, it’s very exciting
694
00:43:08,250 –> 00:43:11,850
because it’s a, it’s, it is a gold rush, exciting kind of, kind of thing.
695
00:43:11,850 –> 00:43:15,310
You can, you could fall into the emotion of it. Right. Because it gets to
696
00:43:15,310 –> 00:43:18,910
be emotive. It’s the bright shiny object thing. It’s the bright shiny object thing.
697
00:43:18,910 –> 00:43:22,110
Exactly. You just talked about. That’s exactly what they were. Now
698
00:43:22,670 –> 00:43:26,190
are some of them. Yeah, I’m not suggesting that they quote, unquote, failed.
699
00:43:26,350 –> 00:43:30,110
We won’t know that for 15 years, though. We won’t know for 15 years. Business
700
00:43:30,110 –> 00:43:33,830
lessons. And then, and then being more calculated about how we project, how
701
00:43:33,830 –> 00:43:37,510
we go forward, business. We learned nothing from all of those things you just said.
702
00:43:37,510 –> 00:43:41,070
The email, the, the dot com boom, the search engine, the,
703
00:43:41,230 –> 00:43:44,830
the, the, the 3D printing, the crypto, the
704
00:43:44,990 –> 00:43:48,750
social media. In all of those, there were
705
00:43:49,310 –> 00:43:52,910
thousands of companies involved in those things, most of which failed.
706
00:43:53,150 –> 00:43:56,989
Right. Like, but there will.
707
00:43:56,990 –> 00:44:00,350
And there will be an AI apocalypse. There will. Yeah, there will be an LLM
708
00:44:00,350 –> 00:44:04,070
apocalypse everywhere. Like, I was reading something the other day in one of the
709
00:44:04,070 –> 00:44:07,830
startup. One of the startup newsletters that I read for the other project that you
710
00:44:07,830 –> 00:44:09,790
and I are on. And,
711
00:44:11,550 –> 00:44:15,190
and something like all of
712
00:44:15,190 –> 00:44:17,910
the. Oh yeah, I know what it was. All of the investor money, or the
713
00:44:17,910 –> 00:44:21,510
vast majority of investor money in Silicon Valley. If you don’t have something
714
00:44:21,510 –> 00:44:25,150
AI, you can’t get, you can’t even get into the door now.
715
00:44:25,470 –> 00:44:28,830
And I think about the event that we ran this week
716
00:44:29,070 –> 00:44:32,670
with the folks that we ran that with on that other
717
00:44:32,670 –> 00:44:36,080
project. And, and I mean,
718
00:44:36,800 –> 00:44:39,680
there’s way more things happening in the world. I Think we only had like what,
719
00:44:39,680 –> 00:44:43,400
two, maybe three out of the, out of the companies, out of the ones that
720
00:44:43,400 –> 00:44:47,160
we looked at that were. And I’m being purposely oblique
721
00:44:47,160 –> 00:44:50,400
about this folks, but like three companies that we looked at
722
00:44:50,720 –> 00:44:54,240
that even had an AI play. The vast majority of everybody else
723
00:44:54,560 –> 00:44:58,160
is still trying to do a business the way you do a
724
00:44:58,240 –> 00:45:01,860
business. Now is there going to be an AI play built into that?
725
00:45:01,860 –> 00:45:05,460
Yeah, maybe. Probably, because you got to get investors’ attention. But
726
00:45:05,540 –> 00:45:08,780
that’s insane to me because there’s just so many other businesses that are, that could
727
00:45:08,780 –> 00:45:12,460
operate in the world without, without an LLM. Yeah. And there’s so many other
728
00:45:12,460 –> 00:45:16,220
problems that we solve that LLMs can’t solve. So anyway, yeah, no,
729
00:45:16,220 –> 00:45:19,940
we haven’t learned. And there’ll be another bubble in 10 years. There’ll be another
730
00:45:19,940 –> 00:45:23,460
bubble, I guarantee. And it might be, honestly, it might be the humanoid robot bubble
731
00:45:23,460 –> 00:45:27,020
that I think might be the hardware bubble, that might be the hardware version, that
732
00:45:27,020 –> 00:45:30,660
might be the 3D printing version. The next bubble you.
733
00:45:31,040 –> 00:45:34,840
And so Elon. You know why, Jesan? Because the more things change, the more.
734
00:45:34,840 –> 00:45:38,560
They stay the same. There we go. There it
735
00:45:38,560 –> 00:45:42,320
is. There it is. Back to the book. Back
736
00:45:42,320 –> 00:45:45,920
to Why Don’t We Learn From History by B.H. Liddell Hart.
737
00:45:46,320 –> 00:45:50,040
This is an open source book, by the way. You can get it online for
738
00:45:50,040 –> 00:45:53,840
free. So. But the version that I’m reading has a
739
00:45:53,840 –> 00:45:57,520
yellow cover and it was edited and with an introduction by Giles
740
00:45:57,520 –> 00:46:01,070
Lauren. But you can grab this book anywhere
741
00:46:01,310 –> 00:46:04,430
online. This is, this is definitely an open source book and I would encourage you
742
00:46:04,510 –> 00:46:07,870
if you are in business or you’re in leadership
743
00:46:08,110 –> 00:46:09,790
or you are in tech,
744
00:46:11,790 –> 00:46:15,310
especially if you’re part of one of those LLM
745
00:46:15,310 –> 00:46:18,750
driven AI startups. I strongly recommend
746
00:46:18,990 –> 00:46:22,270
reading this book. All right, back to the book. So
747
00:46:22,670 –> 00:46:26,510
let’s look at the importance of keeping promises and the importance of care about making
748
00:46:26,510 –> 00:46:30,080
promises. And, and I want to talk about, with Tom, about something that he
749
00:46:30,080 –> 00:46:33,280
mentioned that ties into how we teach history.
750
00:46:34,880 –> 00:46:38,240
Back to the book on the importance of keeping promises.
751
00:46:38,480 –> 00:46:41,440
Civilization is built on the practice of keeping promises.
752
00:46:42,080 –> 00:46:45,840
It may not sound a high attainment, but if trust in its observance
753
00:46:45,840 –> 00:46:49,360
should be shaken, the whole structure cracks and sinks.
754
00:46:49,920 –> 00:46:52,960
Any constructive effort in all human relations, personal,
755
00:46:53,570 –> 00:46:57,370
political and commercial, depends on being able
756
00:46:57,370 –> 00:47:01,010
to depend on promises. By the way,
757
00:47:01,010 –> 00:47:04,570
pause for just a minute. That’s genius. I’ve never heard
758
00:47:04,570 –> 00:47:08,010
that. I’ve never heard an argument for high trust, a high trust
759
00:47:08,010 –> 00:47:11,850
society laid out as succinctly as is laid out in those
760
00:47:11,850 –> 00:47:15,170
three sentences. That’s brilliant. That’s brilliant writing.
761
00:47:15,810 –> 00:47:19,210
Back to the book. This truth has a reflection on the question of
762
00:47:19,210 –> 00:47:22,890
collective security among nations and on the lessons of history in regards to that
763
00:47:22,890 –> 00:47:26,640
subject. In the years before the war, the charge was constantly brought. And
764
00:47:26,640 –> 00:47:30,440
by the way, the war he’s talking about, just pause again, is World War II.
765
00:47:30,440 –> 00:47:33,440
But sometimes he’s also talking about World War I. So just you have to think
766
00:47:33,440 –> 00:47:36,880
about those both in concert with each other. All right. In the years before the
767
00:47:36,880 –> 00:47:40,360
war, the charge was constantly brought that its supporters were courting the risk of war
768
00:47:41,400 –> 00:47:45,160
by their exaggerated respect for covenants. Although
769
00:47:45,160 –> 00:47:48,680
they may have been fools in disregarding the conditions necessary for the
770
00:47:48,680 –> 00:47:52,430
effective fulfillment of pledges, they at least show themselves men
771
00:47:52,430 –> 00:47:56,150
of honor. I have that double underlined, by the way, and in the long
772
00:47:56,150 –> 00:47:59,910
view of more fundamental common sense than those who
773
00:47:59,910 –> 00:48:03,750
argued that we should give aggressors a free hand so long as they left us
774
00:48:03,750 –> 00:48:07,590
alone. History has shown repeatedly
775
00:48:07,750 –> 00:48:11,510
that the hope of buying safety in this way is the greatest of
776
00:48:11,510 –> 00:48:15,350
delusions. The importance of care about
777
00:48:15,350 –> 00:48:19,170
making promises. It is immoral to make promises
778
00:48:19,170 –> 00:48:22,610
that one cannot in practice fulfill in the sense that the
779
00:48:22,610 –> 00:48:26,130
recipient expects. On that ground, in
780
00:48:26,130 –> 00:48:29,850
1939 (this is ahead of World War II), I questioned the
781
00:48:29,850 –> 00:48:33,570
underlying morality of the Polish guarantee as well as its practicality. If
782
00:48:33,570 –> 00:48:37,250
the Poles had realized the military inability of Britain and France to save them from
783
00:48:37,250 –> 00:48:40,690
defeat and of what such a defeat would mean to them individually and
784
00:48:40,690 –> 00:48:44,320
collectively, it is unlikely that they would have shown such a stubborn opposition to
785
00:48:44,320 –> 00:48:48,120
Germany’s originally modest demands for Danzig and a
786
00:48:48,120 –> 00:48:51,680
passage through the Corridor, since it was obvious to me that they were bound to
787
00:48:51,680 –> 00:48:55,320
lose those points and even much more in the event of a conflict.
788
00:48:55,480 –> 00:48:59,320
It seemed to me wrong on our part to make promises
789
00:48:59,320 –> 00:49:03,040
that were bound to encourage false hopes. It
790
00:49:03,040 –> 00:49:05,920
also seemed to me that any such promises were the most certain way to produce
791
00:49:05,920 –> 00:49:09,690
war. Because the inevitable provocativeness of guaranteeing at such
792
00:49:09,690 –> 00:49:13,530
a moment of tension an area which we had hitherto treated as outside our
793
00:49:13,530 –> 00:49:17,210
sphere of interest, because of the manifest temptation which the
794
00:49:17,210 –> 00:49:20,850
guarantee offered to a military minded people like the Germans, to show how
795
00:49:20,850 –> 00:49:24,490
fatuously impractical our guarantee was, and because of its
796
00:49:24,490 –> 00:49:27,850
natural effect on stiffening the attitude of a people, the
797
00:49:27,850 –> 00:49:31,450
Poles who had always shown themselves exceptionally intractable in
798
00:49:31,450 –> 00:49:34,450
negotiating a reasonable settlement of any issue
799
00:49:38,220 –> 00:49:42,020
A historian could not help seeing certain parallels between the long-standing aspect of
800
00:49:42,020 –> 00:49:45,780
the Polish German situation and that between Britain and the Boer Republics 40
801
00:49:45,780 –> 00:49:49,460
years earlier, and remembering the effects on us of the attempts of the other European
802
00:49:49,460 –> 00:49:52,860
powers to induce or coerce us into negotiating a settlement with the Boers.
803
00:49:53,340 –> 00:49:57,020
If our own reaction then had been so violent, it could hardly be expected that
804
00:49:57,020 –> 00:50:00,460
the reaction of a nation filled with an even more bellicose spirit would be less
805
00:50:00,460 –> 00:50:04,200
violent, especially as the attempt to compel negotiation was backed by
806
00:50:04,200 –> 00:50:08,000
an actual promise of making war if Poland felt moved to
807
00:50:08,000 –> 00:50:11,040
resist the German conditions.
808
00:50:15,120 –> 00:50:18,600
That is a brilliant piece of analysis. That’s why I’m reading this. That is a
809
00:50:18,600 –> 00:50:22,080
brilliant piece of analysis of the psychology
810
00:50:22,160 –> 00:50:26,000
of nation states, which we don’t often talk about. Like, we try
811
00:50:26,000 –> 00:50:29,790
to pretend that nations are these, and
812
00:50:29,790 –> 00:50:33,390
maybe we do it less so now than we have in the past in history.
813
00:50:33,390 –> 00:50:37,070
We try to pretend that nation states are somehow this amorphous collection of
814
00:50:37,070 –> 00:50:40,790
ideas. But nation states actually do have their own psychology and
815
00:50:40,790 –> 00:50:44,510
their own character. And history reveals that. And
816
00:50:44,510 –> 00:50:47,190
Liddell Hart there is brilliant
817
00:50:48,150 –> 00:50:51,910
in basically saying, if you’re going to make a promise,
818
00:50:51,910 –> 00:50:55,620
keep it, but understand the character as a nation
819
00:50:55,620 –> 00:50:58,980
state of the nation state
820
00:50:59,220 –> 00:51:02,540
that you are making the promise to. Understand their
821
00:51:02,540 –> 00:51:06,020
character, study them, examine them,
822
00:51:06,660 –> 00:51:10,100
don’t just. And then this gets to our current geopolitical
823
00:51:10,580 –> 00:51:14,420
climate. Don’t just, like in the case of NATO, sign a piece
824
00:51:14,420 –> 00:51:18,180
of paper 80 years ago and then just sort of pretend like everything’s
825
00:51:18,180 –> 00:51:22,030
the same as it was 80 years ago. And nation
826
00:51:22,030 –> 00:51:25,150
states’ characters change just like people.
827
00:51:25,550 –> 00:51:28,710
France now is
828
00:51:28,710 –> 00:51:32,390
significantly different than they were at
829
00:51:32,390 –> 00:51:35,230
the back end of World War II after being
830
00:51:35,470 –> 00:51:38,670
humiliated by the Germans. They’re significantly different.
831
00:51:39,950 –> 00:51:42,990
They can pay for the protection of their own
832
00:51:43,150 –> 00:51:46,110
continent, by the way, this is all that Trump is saying, by the way. He’s
833
00:51:46,110 –> 00:51:49,890
been saying this since his first administration. Maybe, maybe you
834
00:51:49,890 –> 00:51:53,210
could pay for the protection of your own continent,
835
00:51:53,370 –> 00:51:57,010
because maybe the character of France as a nation state
836
00:51:57,010 –> 00:52:00,690
has changed. Maybe the character of Germany as a nation state has changed. Maybe the
837
00:52:00,690 –> 00:52:04,370
character of Sweden and Finland et al. has
838
00:52:04,370 –> 00:52:07,530
changed since World War II.
839
00:52:08,090 –> 00:52:11,850
Maybe we don’t just have to keep honoring these guarantees
840
00:52:11,850 –> 00:52:15,500
in perpetuity. Now, the other
841
00:52:15,500 –> 00:52:19,140
idea in there, which I find to be interesting, is this idea of being a
842
00:52:19,140 –> 00:52:22,900
person of honor. And this gets to diplomacy, right? And
843
00:52:22,900 –> 00:52:26,580
so if civilization is built on promises, and
844
00:52:27,140 –> 00:52:30,820
one of the big lessons that Liddell Hart learned from World War I
845
00:52:31,300 –> 00:52:32,260
was how,
846
00:52:35,060 –> 00:52:38,860
how fatuous, to a certain degree, diplomacy
847
00:52:38,860 –> 00:52:42,640
really was in the run up to that war. And so
848
00:52:42,640 –> 00:52:46,400
that mistake, interestingly enough, has been corrected; we now have more
849
00:52:46,400 –> 00:52:50,160
diplomatic venues to get more people to talk as leaders of
850
00:52:50,160 –> 00:52:53,520
nation states to talk than ever before in the history of the world. It’s kind
851
00:52:53,520 –> 00:52:56,760
of insane that we have a UN. That’s actually kind of nuts in the history
852
00:52:56,760 –> 00:53:00,600
of the world, and we don’t appreciate how bananas in pajamas that is.
853
00:53:00,600 –> 00:53:04,440
We just don’t. We just don’t. And the first place that we don’t appreciate
854
00:53:04,440 –> 00:53:08,190
that, I think, is in how we teach history to
855
00:53:08,190 –> 00:53:11,190
the next generation. So Tom hit on this,
856
00:53:12,630 –> 00:53:16,390
and this is an interesting point. The biggest challenge in teaching
857
00:53:16,390 –> 00:53:20,110
history is getting people to care
858
00:53:20,110 –> 00:53:23,030
about it who were born after all of that was done.
859
00:53:24,950 –> 00:53:28,550
And so that’s the question to Tom. How do we get people to care?
860
00:53:29,270 –> 00:53:32,870
How do we teach history to people who were born after
861
00:53:32,870 –> 00:53:36,530
all of that was over? Well, and before
862
00:53:36,530 –> 00:53:40,250
you. Before you get to that challenge, I think what happens even
863
00:53:40,250 –> 00:53:44,010
before that is even if you. Let’s say. Let’s say you don’t get
864
00:53:44,010 –> 00:53:47,290
them excited about it, but you at least get them to read it, there’s another
865
00:53:47,290 –> 00:53:51,090
bias that happens that psychology proves that there’s a. An
866
00:53:52,290 –> 00:53:56,130
in. What’s it. What the hell would they. I forget how they worded it. I
867
00:53:56,130 –> 00:53:59,730
don’t remember the title they gave it, but it’s like a. It’s like an obvious.
868
00:54:00,530 –> 00:54:04,050
An obvious inevitability factor. My point is,
869
00:54:04,450 –> 00:54:07,370
so when you. When you read. When you read the history of World War II
870
00:54:07,370 –> 00:54:11,090
and you see, oh, like, oh, you watched this guy come into power and
871
00:54:11,090 –> 00:54:14,770
nobody really wanted. And then it’s almost like you see the inevitability
872
00:54:15,170 –> 00:54:19,010
without somebody actually teaching it to you. Well, we’re not speaking
873
00:54:19,010 –> 00:54:22,170
German right now, so obviously we won, right? Like, so then, like, there’s like an
874
00:54:22,170 –> 00:54:26,010
obvious factor that happens in reading history that. That
875
00:54:26,010 –> 00:54:29,690
gives you a bias that you already know the outcome. So
876
00:54:29,920 –> 00:54:33,760
another reason why today that. That whole history repeats itself kind of
877
00:54:33,760 –> 00:54:37,400
thing is because we. We have this expectation of
878
00:54:37,400 –> 00:54:40,760
inevitability, right? That because history showed us this
879
00:54:40,760 –> 00:54:44,520
inevitability, now we’re going to expect this inevitability. And whether it happens or
880
00:54:44,520 –> 00:54:47,520
not, which, by the way, it usually does.
881
00:54:48,160 –> 00:54:51,360
That’s why we keep saying the more things stay the same,
882
00:54:51,760 –> 00:54:55,520
but. But we keep making the same mistakes over and over again
883
00:54:55,600 –> 00:54:58,560
because we take the inevitability factor into it
884
00:54:59,130 –> 00:55:02,570
subconsciously. So to your point about getting the next
885
00:55:02,570 –> 00:55:06,250
generation to learn to care
886
00:55:06,250 –> 00:55:09,610
about it, that’s another reason why you and I get
887
00:55:09,690 –> 00:55:13,370
involved so heavily in conversations about film. It’s a
888
00:55:13,370 –> 00:55:17,090
medium in which we can get the next generation to understand and learn from some
889
00:55:17,090 –> 00:55:20,930
of those historical events. You produce Band
890
00:55:20,930 –> 00:55:24,330
of Brothers or Saving Private Ryan or.
891
00:55:24,720 –> 00:55:27,920
And you get the next generation to watch that movie. And they go, oh
892
00:55:27,920 –> 00:55:31,080
my God. And they think it’s just cinematography. And then they realize it’s a real
893
00:55:31,080 –> 00:55:34,880
thing, it actually happened. They’re like, oh my God, I should go
894
00:55:34,880 –> 00:55:38,680
learn about World War. Like we have mechanisms that we can use
895
00:55:38,680 –> 00:55:41,920
and pull the lever on to get the next generation to care.
896
00:55:43,200 –> 00:55:47,000
Well, we did. I’m not sure they care anymore. They don’t watch it. They don’t
897
00:55:47,000 –> 00:55:50,160
watch all that much anymore. Maybe we should put them in TikTok videos. Maybe that’s
898
00:55:50,160 –> 00:55:53,830
what it is. Just short 30 second clips of what happened with
899
00:55:53,830 –> 00:55:57,390
World War II. And maybe then we’ll get some of the next generation to care about
900
00:55:57,390 –> 00:55:59,710
history the way that we do. But
901
00:56:00,910 –> 00:56:04,630
kidding aside though, but that, that’s. We, we’ve got to
902
00:56:04,630 –> 00:56:08,390
stop. We’ve got to get it out of the. We, we have to. We. If
903
00:56:08,390 –> 00:56:11,790
we can change the way the mechanisms in which we teach,
904
00:56:12,270 –> 00:56:16,110
we will be able to touch the hearts of the next generation. Because
905
00:56:16,770 –> 00:56:20,130
when you and I were growing up, it was books, it was
906
00:56:20,450 –> 00:56:24,050
our imagination that reading these books and, and how
907
00:56:24,050 –> 00:56:27,850
they impacted us. This generation doesn’t really care
908
00:56:27,850 –> 00:56:31,570
so much about books, but they care about media. And so like some of these,
909
00:56:31,570 –> 00:56:35,090
like video media and all that. So if we can use that,
910
00:56:36,210 –> 00:56:39,170
all we need to do is get their attention. Once we get their attention and
911
00:56:39,170 –> 00:56:42,520
they care about it, then they’ll go back and, and figure out the rest. Right.
912
00:56:42,520 –> 00:56:46,120
So there’s a lot of talk about teachers in this country, particularly the five years
913
00:56:46,120 –> 00:56:49,680
after Covid. Right. Because you know, we all went on lockdown and then,
914
00:56:50,320 –> 00:56:53,760
you know, everybody who’s a parent who had their kid in the public school system
915
00:56:54,480 –> 00:56:57,919
kind of looked over the shoulder. This is kind of what happened. Actually, not kind
916
00:56:57,919 –> 00:57:01,120
of. This is what happened. Everybody who had a kid in public school
917
00:57:01,680 –> 00:57:05,040
all of a sudden had their kids at home learning off of a laptop
918
00:57:05,360 –> 00:57:09,170
and looking over the shoulder and actually for the
919
00:57:09,170 –> 00:57:10,170
first time in
920
00:57:13,290 –> 00:57:16,890
a long time in America, actually seeing for
921
00:57:17,610 –> 00:57:21,450
four, six, eight hours a day what a teacher is actually teaching
922
00:57:21,450 –> 00:57:25,050
their kids. This is why things have started to crack apart with the K through
923
00:57:25,050 –> 00:57:28,850
12 system, which by the way, the unions were
924
00:57:28,850 –> 00:57:32,610
the ones that insisted on lockdowns and worked
925
00:57:32,610 –> 00:57:36,170
in concert with the government and insisted on and still insist on lock
926
00:57:36,170 –> 00:57:39,820
on, on. What are you teaching students
927
00:57:39,820 –> 00:57:43,660
from, from home in many areas, particularly urban areas of
928
00:57:43,660 –> 00:57:47,340
our country. Even though we know three point about statistical
929
00:57:47,340 –> 00:57:50,220
data, we have good research now that
930
00:57:51,340 –> 00:57:54,460
kids lose a step when they are taught virtually
931
00:57:55,820 –> 00:57:59,340
particularly if they are being switched from being taught in person to
932
00:58:00,060 –> 00:58:03,840
being taught virtually, we know this. This is a fact. Okay?
933
00:58:04,240 –> 00:58:08,080
It’s like that two plus two thing. It just is. This is
934
00:58:08,080 –> 00:58:11,840
the thing. Okay? We could argue about why. We could argue
935
00:58:11,840 –> 00:58:14,840
about what the inputs are. All that, that’s fine. But you can’t argue with the
936
00:58:14,840 –> 00:58:18,680
fact, okay? We talk a lot about
937
00:58:18,680 –> 00:58:21,880
teachers in this country and how much teachers get paid and we lament and we
938
00:58:21,880 –> 00:58:24,400
wring our hands and all of that. I don’t want to get into any of
939
00:58:24,400 –> 00:58:27,120
that. Instead, I want to talk about something that’s a little bit more
940
00:58:28,320 –> 00:58:31,960
egregious, I think, which is the
941
00:58:31,960 –> 00:58:35,600
fact that. And this goes to your point about students and TikTok,
942
00:58:36,320 –> 00:58:40,120
I would agree. However, I
943
00:58:40,120 –> 00:58:43,760
think students. I agree. And I think that students
944
00:58:43,760 –> 00:58:47,520
care about ideas the same way students always
945
00:58:47,600 –> 00:58:51,280
have. And what we lack are
946
00:58:51,280 –> 00:58:54,560
teachers willing to engage
947
00:58:54,640 –> 00:58:56,560
passionately with ideas
948
00:58:58,980 –> 00:59:02,780
about the subject matter they are hired to teach. And
949
00:59:02,780 –> 00:59:06,420
this is hugely important with history, right? So, for instance,
950
00:59:07,380 –> 00:59:11,220
if I am going to teach. No. If
951
00:59:11,220 –> 00:59:15,020
I’m going to attend a class on an area that
952
00:59:15,020 –> 00:59:18,780
you know a lot about in history, Native American history, it
953
00:59:18,780 –> 00:59:21,620
doesn’t matter whether I agree with your conclusions or not about that history.
954
00:59:22,420 –> 00:59:26,140
None of that matters. None of that matters. I’m
955
00:59:26,140 –> 00:59:29,700
going there to tie into your passion about
956
00:59:29,700 –> 00:59:33,300
that because your passion is going to
957
00:59:33,300 –> 00:59:37,060
either make me care more or it’s going to make
958
00:59:37,060 –> 00:59:40,580
me disagree. But either way, I will give you attention
959
00:59:40,820 –> 00:59:44,300
because of the passion that I see coming from you. This is the whole plot
960
00:59:44,300 –> 00:59:48,020
of the movie Dead Poets Society. How do you make poetry and Shakespeare
961
00:59:48,020 –> 00:59:51,460
interesting? Right? Well, you make poetry and Shakespeare interesting by having a
962
00:59:51,460 –> 00:59:55,290
dynamic person teaching poetry and Shakespeare. That’s how you make it interesting.
963
00:59:56,160 –> 00:59:58,800
Same thing with history. And yet
964
01:00:00,240 –> 01:00:04,000
when you hear teachers talk about teaching history, it’s
965
01:00:04,000 –> 01:00:07,360
either one of two poles. You either have a person who’s got a personal,
966
01:00:07,920 –> 01:00:11,360
as a teacher, got a personal bugaboo of a thing that they’re drilling in on
967
01:00:11,680 –> 01:00:15,240
that’s interesting to them, but it’s not interesting maybe to 99% of the people that
968
01:00:15,240 –> 01:00:19,080
they’re teaching. You see this in colleges a lot. Or you get
969
01:00:19,080 –> 01:00:21,440
a person who is in a K through 12 space where
970
01:00:23,190 –> 01:00:26,790
they would love to be passionate about history, but they’re fighting uphill
971
01:00:26,790 –> 01:00:30,590
against a lack of comprehension, a lack of preparation, kids
972
01:00:30,590 –> 01:00:34,230
falling behind, the policies of the system, the blah, blah, blah, blah,
973
01:00:34,230 –> 01:00:37,829
blah. And they would love to be passionate, but they’re not going to try to
974
01:00:37,829 –> 01:00:41,190
break the system because they’re getting paid $30,000 a year. They just want their pension
975
01:00:41,190 –> 01:00:43,430
at the end of it. They’re just not going to try to break the system.
976
01:00:43,510 –> 01:00:46,830
And by the way, they know that the teacher. Because I know teachers in K
977
01:00:46,830 –> 01:00:50,110
through 12 system over the course of my life. They know that the teacher down
978
01:00:50,110 –> 01:00:53,340
the hall who’s supposed to be teaching English comprehension or
979
01:00:53,580 –> 01:00:57,220
spelling or whatever, isn’t doing their job. They know
980
01:00:57,220 –> 01:01:00,900
this. They know, they know. And so by the time that
981
01:01:00,900 –> 01:01:03,900
kid shows up in sixth period for history,
982
01:01:05,260 –> 01:01:08,860
they’re fighting uphill against the previous, you know, the other previous five
983
01:01:08,860 –> 01:01:12,340
periods of nonsense. And so they’re just trying to get through the period and get
984
01:01:12,340 –> 01:01:16,020
the kid out. And those are the two poles
985
01:01:16,020 –> 01:01:17,660
the teachers are fighting uphill against.
986
01:01:19,790 –> 01:01:23,390
And history is so critical for our time. Now, I would argue that it’s
987
01:01:23,390 –> 01:01:26,910
probably as critical or more critical even than
988
01:01:26,910 –> 01:01:29,630
English, right? Even being able to speak and write well.
989
01:01:31,550 –> 01:01:34,269
Because history is the place,
990
01:01:35,790 –> 01:01:39,310
it’s the platform where our political battles are being fought.
991
01:01:39,870 –> 01:01:43,710
It’s the place where ideologies are being
992
01:01:43,710 –> 01:01:46,920
made and are being rendered. It’s even a place where, oddly enough,
993
01:01:47,000 –> 01:01:50,360
identities are being formed of all kinds.
994
01:01:51,160 –> 01:01:55,000
And teachers have a huge, huge responsibility, particularly
995
01:01:55,000 –> 01:01:57,960
in the K12 system. Have a huge responsibility. And I don’t know how you fix
996
01:01:57,960 –> 01:01:58,840
the two poles problem.
997
01:02:02,520 –> 01:02:05,000
Well, and to your point, I think,
998
01:02:07,960 –> 01:02:11,690
I think history is too big of a subject for
999
01:02:11,690 –> 01:02:15,210
that to be the, the thing that they use to try to break the poles.
1000
01:02:15,210 –> 01:02:17,890
Right? So. Because yeah, like
1001
01:02:19,170 –> 01:02:22,610
if you just think about. So I remember, I remember going through the K through
1002
01:02:22,610 –> 01:02:25,650
12 system, whatever in public schools. And I remember,
1003
01:02:26,690 –> 01:02:30,050
I remember like in. So we had to take four years when I got to
1004
01:02:30,050 –> 01:02:33,570
high school. There were four years of high school, right? 9, 10, 11, 12. Yeah.
1005
01:02:33,570 –> 01:02:37,370
And the, and they, they. This is not a, a
1006
01:02:37,370 –> 01:02:40,960
college environment where you get to select your, you know, your
1007
01:02:40,960 –> 01:02:44,080
course, your courses, right? So they, they dictated
1008
01:02:44,960 –> 01:02:48,800
year one. Year one was world history, year two was US
1009
01:02:48,800 –> 01:02:51,360
history, year three was.
1010
01:02:54,320 –> 01:02:57,960
US history post World War I, I think it was. So it was
1011
01:02:57,960 –> 01:03:01,440
US history one which was like from the beginning of the country,
1012
01:03:02,400 –> 01:03:05,720
you know, 1500s, uh, up until, you know,
1013
01:03:05,720 –> 01:03:09,160
1918, World War I. And then it was
1014
01:03:09,560 –> 01:03:12,760
US history two, which
1015
01:03:13,240 –> 01:03:16,880
was World War I to present day. And then the, the
1016
01:03:16,880 –> 01:03:20,720
four. The fourth year was, was some sort of like there
1017
01:03:20,720 –> 01:03:24,280
was some nonsense. I say nonsense, but it was because it was like, it was
1018
01:03:24,280 –> 01:03:28,040
like theoretical history. Like it was like a future thing. Like we were, we were
1019
01:03:28,040 –> 01:03:31,830
trying to predict like his history predictions or something. Some craziness like that.
1020
01:03:31,830 –> 01:03:34,630
Which by the way, not a single one of us was right about anything. But
1021
01:03:34,630 –> 01:03:38,310
whatever. But because we’re not driving
1022
01:03:38,310 –> 01:03:42,110
flying cars right now. Yeah. We literally just talked
1023
01:03:42,110 –> 01:03:45,590
about this before we hit the record button. We’re not flying cars. We’re not teleporting
1024
01:03:45,590 –> 01:03:49,430
anywhere. We don’t have the Galactic Empire, the Galactic Federation
1025
01:03:49,430 –> 01:03:53,070
of Planets. We don’t have any of that yet. And of my graduating class in,
1026
01:03:53,310 –> 01:03:56,910
in whatever, early nineties, whatever, the 19, whatever,
1027
01:03:56,910 –> 01:04:00,610
whatever. We all thought that was happening because we watched
1028
01:04:00,610 –> 01:04:04,290
the Jetsons growing up. Right. Whatever. Anyway, but, but
1029
01:04:04,290 –> 01:04:08,010
anyway, to your, to your point, the. When they, when they
1030
01:04:08,010 –> 01:04:11,810
said to us, like, you know, freshman year is
1031
01:04:11,810 –> 01:04:15,370
going to be world history. And we were like, okay, cool. Whose world?
1032
01:04:15,690 –> 01:04:18,890
What world? Like, because. And they, they went through these,
1033
01:04:19,370 –> 01:04:23,010
they went through these timelines so fast. Yeah. You
1034
01:04:23,010 –> 01:04:26,480
couldn’t get passionate about any of them even if you wanted to. Right. Like
1035
01:04:26,640 –> 01:04:30,320
I. Most of what I learned about in world history my freshman year, I had
1036
01:04:30,320 –> 01:04:33,160
to go back and I had to go back and revisit things that I thought
1037
01:04:33,160 –> 01:04:36,280
were even a little bit interesting to see if I was curious about them. The
1038
01:04:36,280 –> 01:04:40,040
Ming dynasty in China, the Samurai. All the, like,
1039
01:04:40,040 –> 01:04:43,120
we, and we talked about some of these things on your podcast, but I had
1040
01:04:43,120 –> 01:04:46,880
to go backwards and after the fact to see, like, was that
1041
01:04:46,880 –> 01:04:50,560
worth my time and effort to dig a little deeper because they had.
1042
01:04:50,640 –> 01:04:54,310
It’s so surface level and to your point, if I’m a history
1043
01:04:54,310 –> 01:04:57,630
teacher, for example. To your point. Exactly.
1044
01:04:58,510 –> 01:05:02,310
And I would make the argument that being a teacher in this country
1045
01:05:02,310 –> 01:05:06,070
is the easiest job on the planet. It is the easiest job on the planet
1046
01:05:06,070 –> 01:05:09,110
because number one, you’re not doing it at all unless you
1047
01:05:09,110 –> 01:05:12,750
absolutely love it and you really want to be there
1048
01:05:12,830 –> 01:05:16,550
because as you know and me being on this. But I love history and there’s
1049
01:05:16,550 –> 01:05:20,280
certain parts of history that I love way more than others. I could never
1050
01:05:20,280 –> 01:05:23,720
be a history teacher because I would want to teach just that. And they won’t
1051
01:05:23,720 –> 01:05:27,200
let me. Right. Right. Because I have to follow some syllabus and.
1052
01:05:27,200 –> 01:05:30,800
Etc. So I have to love teaching more than
1053
01:05:30,800 –> 01:05:33,960
anything else in my being in order to be a history teacher.
1054
01:05:34,200 –> 01:05:38,040
Yep. Which I don’t. Which is why I’m not a history teacher. So
1055
01:05:38,040 –> 01:05:41,080
if I, if I were, that’s why I say, and by the way, any teacher
1056
01:05:41,080 –> 01:05:43,400
out there, you don’t have to come find me to kill me. I’m not suggesting
1057
01:05:43,400 –> 01:05:47,010
you have an easy job. That is not what I said. I didn’t say that
1058
01:05:47,330 –> 01:05:50,250
the function of your job is easy. I’m just saying it’s the easiest job on
1059
01:05:50,250 –> 01:05:53,730
the planet because you’re not doing it unless you love it. And they always
1060
01:05:53,810 –> 01:05:56,450
say if you love what you do, you never work a day in your life.
1061
01:05:56,450 –> 01:06:00,290
So there you go. So you have the easiest job
1062
01:06:00,290 –> 01:06:03,050
if you, you’re doing it because you love it. And if you love it, then
1063
01:06:03,050 –> 01:06:06,650
it’s not really work. And if you, if it’s not really work, then stop complaining
1064
01:06:06,650 –> 01:06:10,010
about it anyway. But no, I’m kidding. I’m totally, totally
1065
01:06:10,010 –> 01:06:13,730
kidding. No, no, it’s fine. I, I already nuked. I already nuked teachers already.
1066
01:06:13,730 –> 01:06:17,390
I nuked the unions. I’m in far more trouble than you are. Oh,
1067
01:06:17,390 –> 01:06:19,870
yeah, because you went after the union. I’m just going after the individual. Oh, yeah,
1068
01:06:20,030 –> 01:06:23,790
yeah, yeah, yeah, right. More than me. They’re gonna. Yeah, Randi Weingarten’s gonna
1069
01:06:23,790 –> 01:06:27,590
find me in about 10 seconds. Anyway, sorry, go ahead, keep going. Yeah, no, but,
1070
01:06:27,590 –> 01:06:30,750
but anyway. But the point, like, to your point, it’s, it’s part of the problem
1071
01:06:30,750 –> 01:06:34,190
that we have with, with, especially with subject matter like this book,
1072
01:06:34,590 –> 01:06:37,150
is that the subject matter is so
1073
01:06:38,190 –> 01:06:41,270
intense, so wide, so varied, so
1074
01:06:41,270 –> 01:06:45,040
expansive, that there’s no way that a single individual is
1075
01:06:45,040 –> 01:06:48,160
going to be able to teach all of it. And if we focus on one
1076
01:06:48,160 –> 01:06:52,000
thing or another, it’s, it’s not going to be. It’s, it’s so isolating
1077
01:06:52,000 –> 01:06:55,360
that there. You’re not understanding all of the other factors involved. For
1078
01:06:55,360 –> 01:06:59,200
example, and to your, to your point about, like, if you really
1079
01:06:59,200 –> 01:07:02,280
go deep, if you could literally be a teacher about
1080
01:07:02,760 –> 01:07:06,600
the preamble to World War II, never mind the actual war, but
1081
01:07:06,600 –> 01:07:10,200
the, the events that led up to World War II, you could teach an entire
1082
01:07:10,280 –> 01:07:13,040
semester college syllabus on just that
1083
01:07:14,080 –> 01:07:17,840
if somebody was willing to take it. But they don’t
1084
01:07:17,840 –> 01:07:21,600
find that valuable enough to take that course, like that one course about
1085
01:07:21,920 –> 01:07:25,600
the, you know, the, the. The era between World War I and World
1086
01:07:25,600 –> 01:07:29,040
War II that basically caused World War II to happen in the first place.
1087
01:07:29,360 –> 01:07:33,120
All the geopolitical landscaping that happened, all the political power shifting that
1088
01:07:33,120 –> 01:07:36,800
happened, all that stuff in one instance. And by the
1089
01:07:36,800 –> 01:07:40,410
way, not just the time frame, forget about just the time frame,
1090
01:07:40,490 –> 01:07:44,250
but what was happening in Germany was different than what was happening
1091
01:07:44,250 –> 01:07:47,730
in the rest of Europe, that was different than what was happening in China. Did China
1092
01:07:47,730 –> 01:07:50,490
have anything to do with this? Did the United States being so,
1093
01:07:51,690 –> 01:07:55,490
so arrogant and egotistical, thinking that they were too strong to even
1094
01:07:55,490 –> 01:07:59,210
bother with it in the first place. What was that geopolitical landscape?
1095
01:07:59,210 –> 01:08:02,970
How was that economy working for you, by the way? Black. Black Friday was in
1096
01:08:02,970 –> 01:08:06,680
there. In. Involved in that as well. Like there. There. There’s so many different
1097
01:08:06,680 –> 01:08:10,480
components to what just brought us to World War One. Sorry.
1098
01:08:10,480 –> 01:08:14,040
To. To World War II, that you could not be an expert at all of
1099
01:08:14,040 –> 01:08:17,840
it. You just can’t. So. Right. That’s another reason why we don’t learn from the
1100
01:08:17,840 –> 01:08:21,480
mistakes of history. It’s too vast and we just don’t
1101
01:08:21,480 –> 01:08:25,200
get deeply involved in it enough to, like, actually pluck out
1102
01:08:25,200 –> 01:08:28,920
the real lessons. So this goes. But this goes
1103
01:08:28,920 –> 01:08:31,840
to. Okay, so this goes to a basic problem with. With the. With the structure
1104
01:08:31,840 –> 01:08:34,650
of the public school system, which we never actually talked about on this podcast.
1105
01:08:34,800 –> 01:08:38,640
It’s been a long time coming. The structure of K through 12
1106
01:08:38,720 –> 01:08:42,160
schooling is built, of course, on the
1107
01:08:42,160 –> 01:08:45,840
Henry Ford industrialization model,
1108
01:08:46,240 –> 01:08:50,080
which, by the way, takes its. Talk about history, takes
1109
01:08:50,080 –> 01:08:53,920
its cue, or took its cue from a Prussian model,
1110
01:08:53,920 –> 01:08:57,760
a German Prussian model of education, which was focused on
1111
01:08:58,160 –> 01:09:01,760
getting people just enough information to become
1112
01:09:02,000 –> 01:09:05,600
rote soldiers. Right. To be able
1113
01:09:05,600 –> 01:09:09,120
to. Because this is what. This is what not only Kaiser Wilhelm.
1114
01:09:09,680 –> 01:09:13,440
Well, every German leader from Kaiser Wilhelm all the way to Adolf
1115
01:09:13,440 –> 01:09:16,960
Hitler wanted. Right. Was
1116
01:09:17,040 –> 01:09:20,840
compliant troops that would do what they
1117
01:09:20,840 –> 01:09:24,560
were told. And Henry Ford and
1118
01:09:24,560 –> 01:09:28,280
John D. Rockefeller. And John. John Dewey. Rockefeller, sorry,
1119
01:09:28,280 –> 01:09:31,390
Dewey and Ford and all those other folks got together
1120
01:09:32,110 –> 01:09:34,870
and a bunch of other folks got together and they said, we will build an
1121
01:09:34,870 –> 01:09:37,470
American public education system to take these people from the farm
1122
01:09:38,350 –> 01:09:40,030
who were used to
1123
01:09:41,950 –> 01:09:45,750
walking behind a plow and doing whatever it is that they wanted. Right? And we
1124
01:09:45,750 –> 01:09:49,390
have to turn them into industrial cogs in a room
1125
01:09:49,630 –> 01:09:53,350
so that they don’t get up and leave. To turn a person into an
1126
01:09:53,350 –> 01:09:57,140
industrial cog, starting at the age of 4 or
1127
01:09:57,140 –> 01:10:00,940
5 or 6 until they are 17
1128
01:10:00,940 –> 01:10:04,620
or 18, you don’t need to teach
1129
01:10:04,620 –> 01:10:08,380
that person history. So, of course, history is taught as, to your
1130
01:10:08,380 –> 01:10:12,100
point, this dizzying array of names
1131
01:10:12,100 –> 01:10:15,020
and dates and nonsense that you just blast through
1132
01:10:15,660 –> 01:10:19,500
because you’re not actually teaching people ideas or how to think, because
1133
01:10:19,500 –> 01:10:22,620
God forbid you do that. You don’t want them to think. You want them to
1134
01:10:22,620 –> 01:10:25,880
shut up and go to the factory. And this is my
1135
01:10:25,960 –> 01:10:29,640
opposition, by the way, one of my core oppositions to the entire public
1136
01:10:29,640 –> 01:10:33,160
school system in America. We need to reform the
1137
01:10:33,160 –> 01:10:36,920
whole thing. I don’t know if that’s possible.
1138
01:10:37,080 –> 01:10:39,959
My wife is way more radical on this than I am, which is why we
1139
01:10:39,959 –> 01:10:43,600
homeschool our kids. Yes, I did just say that we homeschool our kids. My wife
1140
01:10:43,600 –> 01:10:47,280
is way more radical on this than I am. I think the
1141
01:10:47,280 –> 01:10:50,920
system probably has to be, not probably, has to be reformed
1142
01:10:52,040 –> 01:10:55,400
because we no longer live in a world where
1143
01:10:55,400 –> 01:10:58,520
rote factory work,
1144
01:10:59,000 –> 01:11:02,320
even if we bring all the factories back to America, it still won’t be rote
1145
01:11:02,320 –> 01:11:05,400
factory work. It will not be the rote factory work of
1146
01:11:06,120 –> 01:11:09,400
what Henry Ford was wanting his workers to be between
1147
01:11:09,960 –> 01:11:13,720
1910 and 1930.
1148
01:11:14,360 –> 01:11:17,850
It’s not going to be that. That worked really well in the 20th century.
1149
01:11:18,010 –> 01:11:20,650
Mass industrialization worked really well. I’m not knocking it.
1150
01:11:21,690 –> 01:11:23,930
And we don’t live in that world anymore. We don’t live in a world of
1151
01:11:23,930 –> 01:11:27,530
mass industrialization. We need people to think in terms of ideas,
1152
01:11:28,650 –> 01:11:32,250
not in terms of rote responses to the test.
1153
01:11:35,210 –> 01:11:38,890
And so history, which, to your point about
1154
01:11:39,050 –> 01:11:41,450
being an expert in what happened in the world between,
1155
01:11:42,780 –> 01:11:46,340
let’s say, let’s conserve, let’s take, let’s pick a 25 year period between like, you
1156
01:11:46,340 –> 01:11:50,140
know, 1915 and, and you know, our 15 year period,
1157
01:11:50,140 –> 01:11:52,860
1950. Oh, no, no, let’s put 25, 1915 to
1158
01:11:52,860 –> 01:11:56,060
1939. Being an expert just on that
1159
01:11:56,380 –> 01:11:59,980
requires you to be as, as someone once said,
1160
01:11:59,980 –> 01:12:02,980
Richard Dawkins, I think when he was talking to Jordan Peterson, used this phrase, which
1161
01:12:02,980 –> 01:12:06,340
I really like, be drunk on ideas. You have to be drunk on the
1162
01:12:06,340 –> 01:12:10,110
ideas of that period. Because to your point,
1163
01:12:10,110 –> 01:12:13,870
there’s a lot of ideas of that period, economic, social, cultural, and on
1164
01:12:13,870 –> 01:12:17,630
and on and on. I don’t know that you have anybody in
1165
01:12:17,630 –> 01:12:21,070
K through 12 who
1166
01:12:21,070 –> 01:12:24,470
approaches it that way, because they’re still doing the rote thing.
1167
01:12:24,710 –> 01:12:28,510
Right, the rote preparation thing. And are there some systems that have changed, some
1168
01:12:28,510 –> 01:12:31,590
parts of the school system, some parts of the country that are changing? Yes. I’m
1169
01:12:31,590 –> 01:12:33,470
sure that if you’re in the sound of my voice, you’re going to think of
1170
01:12:33,470 –> 01:12:37,280
Your K through 12 public education building or your
1171
01:12:37,280 –> 01:12:40,600
student and you’re going to say, well, not my kid. That’s not what’s happening. Blah,
1172
01:12:40,600 –> 01:12:43,880
blah, blah, blah, blah. It’s great. Okay, sure. No one raindrop blames itself for the
1173
01:12:43,880 –> 01:12:47,000
flood. Okay, cool. Yeah, you’re special. Got it. All right.
1174
01:12:48,120 –> 01:12:51,800
But you see the outcomes, by the way, when those
1175
01:12:51,800 –> 01:12:54,800
kids who have gone through the K through 12 system and some of them have
1176
01:12:54,800 –> 01:12:58,520
matriculated to college now come out and
1177
01:12:58,600 –> 01:13:01,720
not only do they not know history, but they know TikTok really well.
1178
01:13:02,930 –> 01:13:06,770
They don’t know how to write a sentence. And now I, as an employer,
1179
01:13:06,770 –> 01:13:10,450
as a leader, have to lead these people around who, who have
1180
01:13:10,450 –> 01:13:14,170
never been given the, whose thirst has
1181
01:13:14,170 –> 01:13:17,570
never been activated for even the history
1182
01:13:17,809 –> 01:13:21,450
of the business that they are in because their thirst was never
1183
01:13:21,450 –> 01:13:25,250
activated properly for the history of the world and
1184
01:13:25,250 –> 01:13:28,900
country that they live in. That’s a
1185
01:13:28,900 –> 01:13:32,540
tragedy. I said all of that. To say that that is an
1186
01:13:32,860 –> 01:13:35,500
absolute tragedy on how we teach history.
1187
01:13:36,620 –> 01:13:40,420
And as an amateur historian and a person who gets interested
1188
01:13:40,420 –> 01:13:44,140
in all of this stuff, it’s an absolute bugaboo for me. Like,
1189
01:13:45,020 –> 01:13:48,380
we do. We homeschool our kids and so, like, when my wife has a difficult
1190
01:13:48,380 –> 01:13:52,180
question in history that she does not know how to answer or how to propose
1191
01:13:52,180 –> 01:13:55,370
or a difficult idea that pops up in history, go ask.
1192
01:13:56,090 –> 01:13:59,730
She told me to go ask. Go ask your father. Go ask him. He’ll tell
1193
01:13:59,730 –> 01:14:02,690
you. And I’ll talk to you for four hours about it. Because I just, I
1194
01:14:02,690 –> 01:14:05,930
just know. I just know the stuff. Like, sure, ask me about the collapse of
1195
01:14:05,930 –> 01:14:09,370
the rise and fall of the Ottoman Empire. Like, why do I know about that?
1196
01:14:10,890 –> 01:14:13,530
Why do I need to know about that? Like the sultan here,
1197
01:14:14,490 –> 01:14:16,970
like, why do I need to know about that? Or why do I need to
1198
01:14:16,970 –> 01:14:20,290
know about the monetary system of, you know, the Greek
1199
01:14:20,290 –> 01:14:23,990
islands as described by Heraclitus?
1200
01:14:24,470 –> 01:14:27,710
Why do I need. Why, why do I know this? Because I’m passionate about the
1201
01:14:27,710 –> 01:14:31,270
ideas behind those kinds of things. And there’s always a core idea.
1202
01:14:31,350 –> 01:14:35,110
And of course the idea is of humanity advancing and building civilization
1203
01:14:35,670 –> 01:14:38,550
built on, to Liddell Hart’s idea, built on promises.
1204
01:14:39,430 –> 01:14:42,550
But at the end of the day, I’m drunk on ideas. And you need a
1205
01:14:42,550 –> 01:14:46,390
history teacher to be drunk on ideas, particularly at the K12 level, because
1206
01:14:46,390 –> 01:14:50,070
my, the kids will pick up on that passion and then they’ll lock onto it.
1207
01:14:53,110 –> 01:14:55,270
And by the way, you never want to ask a kid to predict the future.
1208
01:14:56,790 –> 01:15:00,590
Like, that’s ridiculous. That’s. That’s some nonsense. Did you,
1209
01:15:00,590 –> 01:15:03,069
when you were in high school, did you go to, like, not go to. But
1210
01:15:03,069 –> 01:15:05,310
were you like, in that? Did you. You don’t have to say the year you
1211
01:15:05,310 –> 01:15:09,030
graduated, but like, in the four years of high school, looking back,
1212
01:15:09,110 –> 01:15:12,910
was that part of like the years in this country where we were doing experimental
1213
01:15:12,910 –> 01:15:16,270
learning? Oh, God, yes. Yeah, yeah, yeah. Okay. All right, so you, you.
1214
01:15:16,510 –> 01:15:19,390
I’m sorry, I’m sorry.
1215
01:15:20,030 –> 01:15:23,710
Yeah, it was just. And again, it was like. But
1216
01:15:24,430 –> 01:15:27,910
there was. They were trying to teach us like,
1217
01:15:27,910 –> 01:15:31,750
predictive analytics. They shouldn’t have called it a history class, put it that
1218
01:15:31,750 –> 01:15:35,150
way. But, but it fell under, under the history department
1219
01:15:35,790 –> 01:15:39,360
because it was supposed to be historical data that was supposed to be predictive analytics.
1220
01:15:41,030 –> 01:15:43,990
I always thought that that class should have been under the math department, but that’s
1221
01:15:45,110 –> 01:15:48,950
whatever. So, but anyway, either way, nothing,
1222
01:15:48,950 –> 01:15:51,990
nothing that we had. Not a single one of us got a damn thing right.
1223
01:15:52,870 –> 01:15:56,630
But, but okay, so. Well, one last question.
1224
01:15:56,710 –> 01:15:59,590
So you, you brought up something that’s very interesting. So
1225
01:16:00,390 –> 01:16:04,070
I’ll ask, I’ll frame my observation in the form of a question without any
1226
01:16:04,070 –> 01:16:07,760
run-up. How about counterfactual history? Why
1227
01:16:07,760 –> 01:16:08,880
is that so popular?
1228
01:16:12,080 –> 01:16:13,680
Well, I, I, I think,
1229
01:16:18,320 –> 01:16:22,160
I, I think it gives you retrospect and, and I think that
1230
01:16:22,160 –> 01:16:25,760
the, we talked a lot about history invoking emotions here.
1231
01:16:26,240 –> 01:16:29,960
I think sometimes I, I think sometimes
1232
01:16:29,960 –> 01:16:33,670
that with counterfactual information you get a different sense,
1233
01:16:33,670 –> 01:16:37,430
you get a different emotion coming from it and then it’s up to you to
1234
01:16:37,430 –> 01:16:40,270
kind of balance that too. Again,
1235
01:16:41,150 –> 01:16:43,950
this is, this is a very, this to me is a very clear
1236
01:16:45,070 –> 01:16:48,510
example of. There’s always three sides to every story.
1237
01:16:48,670 –> 01:16:51,630
His, hers and the truth. That’s why to me,
1238
01:16:51,870 –> 01:16:55,510
counterfactual information can be very, very important when you
1239
01:16:55,510 –> 01:16:59,340
start seeing. Again, I, I’m not talking about
1240
01:16:59,340 –> 01:17:02,980
statistical, verifiable statistical data. I’m talking about
1241
01:17:03,700 –> 01:17:07,540
like, I’ll just take an example
1242
01:17:07,540 –> 01:17:10,980
from my own
1243
01:17:11,140 –> 01:17:14,580
repertoire. Here, right here in Massachusetts, out in the western part of
1244
01:17:14,580 –> 01:17:18,100
Massachusetts, there’s a small town that originally was a native community.
1245
01:17:18,820 –> 01:17:21,620
And the, the,
1246
01:17:23,710 –> 01:17:27,310
the facts of the case, the statistical data behind it
1247
01:17:27,310 –> 01:17:31,150
was a hundred armed men walked
1248
01:17:31,150 –> 01:17:34,590
into that village and killed 400 unarmed people.
1249
01:17:34,990 –> 01:17:38,750
Period. Okay. Okay. The factual versus
1250
01:17:38,750 –> 01:17:41,950
counterfactual details to it were
1251
01:17:42,830 –> 01:17:46,470
that the armed men were colonialists. The
1252
01:17:46,470 –> 01:17:50,160
native community was comprised at the time of the
1253
01:17:50,160 –> 01:17:53,840
massacre, or the incursion. The community at the time of the
1254
01:17:53,840 –> 01:17:56,120
event was mostly women and children.
1255
01:17:57,640 –> 01:18:01,480
So you had a hundred armed men going in and killing 400
1256
01:18:01,720 –> 01:18:05,120
unarmed women and children. Okay, the
1257
01:18:05,120 –> 01:18:08,960
counter. So that’s, that’s all factual information that’s supposed to invoke
1258
01:18:08,960 –> 01:18:12,680
a certain kind of feeling to you. And I’m, as I’m even saying it, I
1259
01:18:12,680 –> 01:18:16,090
get riled up a little bit inside. But so then the alternate, the
1260
01:18:16,090 –> 01:18:19,850
counterfactual information of that was the armed men had
1261
01:18:19,850 –> 01:18:23,690
intel that told them that warriors were going to be there and
1262
01:18:23,690 –> 01:18:27,210
that they should shoot on sight. And it didn’t matter whether they were
1263
01:18:27,370 –> 01:18:30,810
just shoot the people and we’ll, it’s like basically drop the bodies and we’ll figure
1264
01:18:30,810 –> 01:18:34,410
it out later. Most of those armed men,
1265
01:18:35,210 –> 01:18:39,050
years later had tremendous
1266
01:18:39,050 –> 01:18:42,810
mental breakdowns over it. These guys did not live
1267
01:18:42,810 –> 01:18:46,530
normal, healthy lives after the fact. So they did. They were
1268
01:18:46,530 –> 01:18:49,850
basically just taking orders. Let me, let me ask you if you’ve heard this story before.
1269
01:18:50,730 –> 01:18:54,170
We killed innocent people, but we were just taking orders. We’re a military
1270
01:18:54,890 –> 01:18:58,450
faction that we’re just, we’re just doing our job. We’re just taking orders. So you
1271
01:18:58,450 –> 01:19:01,370
shouldn’t vilify us over this. Right.
1272
01:19:02,330 –> 01:19:06,170
So again, when you talk about counterfactual information or,
1273
01:19:06,170 –> 01:19:09,670
or. And again, it’s not because the.
1274
01:19:09,750 –> 01:19:13,550
Again, facts are facts, you can’t really change them, but sometimes when
1275
01:19:13,550 –> 01:19:17,110
you hear only one side of the facts and you don’t hear both
1276
01:19:17,190 –> 01:19:20,950
sides of, again, factual information, albeit. And again,
1277
01:19:21,750 –> 01:19:24,990
the point is now you have his, hers and the truth. Right? So now you
1278
01:19:24,990 –> 01:19:28,590
have the. We also don’t
1279
01:19:28,590 –> 01:19:32,110
know, the warriors that were expected to be there, were they
1280
01:19:32,110 –> 01:19:35,720
deemed, you know, criminals
1281
01:19:35,720 –> 01:19:39,480
or some sort of. What is the
1282
01:19:39,480 –> 01:19:43,280
word that I’m thinking of here? Somebody that you’re
1283
01:19:43,280 –> 01:19:46,960
targeting very specifically in warfare. I can’t remember there’s a particular name for it.
1284
01:19:47,280 –> 01:19:51,080
Oh, insurgents. Yeah. Yeah. So. So though. And
1285
01:19:51,080 –> 01:19:54,000
they were known. They. We don’t know. There were certain facts that we don’t know.
1286
01:19:54,000 –> 01:19:57,440
Now after the fact, after the deed was done, after, you know, and this came
1287
01:19:57,440 –> 01:20:00,970
out probably 100 years later,
1288
01:20:01,450 –> 01:20:05,050
that those particular warriors were in fact involved
1289
01:20:05,050 –> 01:20:08,890
in raids on, you know, on the colonial. And this, this was
1290
01:20:09,290 –> 01:20:12,970
pre. Before the time of the state of Massachusetts even existed.
1291
01:20:13,370 –> 01:20:17,090
So this is like in the 1600s. And. But so that
1292
01:20:17,090 –> 01:20:20,610
you find out after the fact that those warriors were indeed, in fact part
1293
01:20:20,610 –> 01:20:24,330
of. Of raids that were on, you know, colonial
1294
01:20:24,330 –> 01:20:28,050
towns and they, they did do some stuff. They. That could have warranted them
1295
01:20:28,530 –> 01:20:32,330
being singled out and, and going after. So
1296
01:20:32,330 –> 01:20:36,010
does that justify the hundred armed men going in there and killing 400
1297
01:20:36,010 –> 01:20:39,570
unarmed and innocent women and children? Probably not that, but
1298
01:20:39,730 –> 01:20:41,810
let me say. Let me rephrase that. No.
1299
01:20:44,610 –> 01:20:48,050
Does that negate them from. From responsibility of quote,
1300
01:20:48,050 –> 01:20:51,770
unquote, just taking orders? No. But if you
1301
01:20:51,770 –> 01:20:55,170
were in that moment and you were one of those soldiers, would you have done
1302
01:20:55,170 –> 01:20:57,800
something different? My guess is no.
1303
01:20:58,760 –> 01:21:02,280
Oh, done something different? No. Everybody’s a hero after the fact
1304
01:21:03,800 –> 01:21:07,600
or. Or a villain after the fact. Everybody’s a hero or a villain. Right. But
1305
01:21:07,600 –> 01:21:11,320
in that moment, can you truly crucify those? Those
1306
01:21:11,320 –> 01:21:15,120
hundred guys? Some could say yes to a
1307
01:21:15,120 –> 01:21:18,760
degree. Because again, again, walking into a village of
1308
01:21:19,000 –> 01:21:22,560
400 innocent women and children, by the way, I keep saying the word innocent because
1309
01:21:22,560 –> 01:21:26,290
that’s the way it’s depicted. You can simply say 400 women
1310
01:21:26,290 –> 01:21:29,810
and children. Right. Yeah. You don’t. Yeah, you don’t need another adjective on there.
1311
01:21:30,770 –> 01:21:34,330
But. But again, but the word innocent is. Is purposeful.
1312
01:21:34,330 –> 01:21:38,050
It’s intended to invoke emotion to you. Right, Right.
1313
01:21:38,050 –> 01:21:41,650
So that. That’s. Again, that’s kind of what I’m. What I’m getting at here. And
1314
01:21:41,650 –> 01:21:44,850
when you’re describing the 100 men, it’s 100 armed
1315
01:21:45,090 –> 01:21:48,450
military men. Like, again, you’re supposed to get a feeling
1316
01:21:48,710 –> 01:21:52,070
of what you’re. What you’re seeing here. And you’re supposed to develop this
1317
01:21:52,230 –> 01:21:55,430
vision in your brain as to what was happening at the moment. Now, there are
1318
01:21:55,430 –> 01:21:59,230
people that will argue that at some point one of those
1319
01:21:59,230 –> 01:22:03,070
100 should have realized that they weren’t armed insurgents, that they were actually
1320
01:22:03,070 –> 01:22:06,710
women and children, and stopped the massacre. Maybe you killed
1321
01:22:06,710 –> 01:22:10,150
100 of them, but you’re not going to kill all 400.
1322
01:22:10,790 –> 01:22:14,510
Right. But that’s. That’s the argument that’s made in most of these cases
1323
01:22:14,510 –> 01:22:18,360
where you’re no longer just following orders. You are now just a
1324
01:22:18,360 –> 01:22:21,760
murderer because you’ve. You’ve walked into this village, you started shooting.
1325
01:22:21,920 –> 01:22:25,280
Nobody stopped to actually think or do or whatever. But again,
1326
01:22:25,840 –> 01:22:29,680
that’s the whole. Like, to your point about, or the question about, counterfactual
1327
01:22:29,680 –> 01:22:33,520
information is. To me, it’s always that.
1328
01:22:33,840 –> 01:22:37,600
That version of his, hers and the truth. Always. There’s always some semblance
1329
01:22:37,600 –> 01:22:41,360
of. You’re gonna get a story from the victor, a story from the loser.
1330
01:22:41,360 –> 01:22:45,130
And the. The actual story is probably somewhere in the middle. Somewhere in the
1331
01:22:45,130 –> 01:22:48,330
middle. Yeah. Yeah, yeah, yeah. Well. And
1332
01:22:52,650 –> 01:22:56,250
Liddell Hart has an answer for this. Back to the book. There’s a thought about
1333
01:22:56,250 –> 01:23:00,010
this. Back to the book. Back to Why Don’t We Learn From History?
1334
01:23:00,970 –> 01:23:04,330
So this is in the War and Peace section. The
1335
01:23:04,330 –> 01:23:06,650
dilemma of the Intellectual.
1336
01:23:10,900 –> 01:23:13,060
We did not plan this, folks, so this is good.
1337
01:23:15,940 –> 01:23:19,420
Neither intellectuals nor their critics appear to recognize the
1338
01:23:19,420 –> 01:23:23,060
inherent dilemma of the thinking man and its inevitability.
1339
01:23:24,020 –> 01:23:27,100
The dilemma should be faced, for it is a natural part of the growth of
1340
01:23:27,100 –> 01:23:30,860
any human mind. An intellectual ought to realize the extent to
1341
01:23:30,860 –> 01:23:34,580
which the world is shaped by human emotions, emotions uncontrolled
1342
01:23:34,580 –> 01:23:37,900
by reason. His thinking must have been shallow and his
1343
01:23:37,900 –> 01:23:41,620
observations narrow if he fails to realize that. Having once
1344
01:23:41,620 –> 01:23:45,460
learned to think and to use reason as a guide, however, he cannot possibly
1345
01:23:45,460 –> 01:23:49,139
float with the current of popular emotion and fluctuate with its violent
1346
01:23:49,139 –> 01:23:52,980
changes unless he himself ceases to think or is deliberately false in
1347
01:23:52,980 –> 01:23:56,780
his own thought. And in the latter case, it
1348
01:23:56,780 –> 01:24:00,520
is likely that he will commit intellectual suicide gradually, quote,
1349
01:24:00,520 –> 01:24:04,040
unquote, by the death of a thousand cuts. A deeper
1350
01:24:04,040 –> 01:24:07,840
diagnosis of the malady from which left wing intellectuals have suffered in the
1351
01:24:07,840 –> 01:24:11,520
past might suggest that their troubles have come not from following reason too far,
1352
01:24:11,920 –> 01:24:14,960
but from not following it far enough to realize the general power of
1353
01:24:14,960 –> 01:24:18,760
unreason. That’s an interesting point. Many
1354
01:24:18,760 –> 01:24:22,600
of them also seem to have suffered from failing to apply reason internally as
1355
01:24:22,600 –> 01:24:26,290
well as externally, through not using it with the control of their own
1356
01:24:26,290 –> 01:24:30,090
emotions. In that way, they unwittingly helped to get this country into the mess
1357
01:24:30,090 –> 01:24:33,770
of the last war and then found themselves in an intellectual mess as a
1358
01:24:33,770 –> 01:24:37,450
result. In one of the more penetrating criticisms
1359
01:24:37,450 –> 01:24:39,930
written on this subject, George Orwell
1360
01:24:41,690 –> 01:24:45,330
exposed a profound truth in saying that, quote, the energy that
1361
01:24:45,330 –> 01:24:49,170
actually shapes the world springs from emotions. He referred to the deep
1362
01:24:49,170 –> 01:24:53,010
seated and dynamic power of racial pride, leader worship, religious belief, love
1363
01:24:53,010 –> 01:24:56,580
of war. There are powerful emotions beyond these, however,
1364
01:24:56,740 –> 01:25:00,540
the energy of the intellectual himself springs from emotion, the love of
1365
01:25:00,540 –> 01:25:04,260
truth, the desire for wider knowledge and understanding. That emotion
1366
01:25:04,260 –> 01:25:06,980
has done quite a lot to shape the world, as a study of world history
1367
01:25:07,220 –> 01:25:10,900
amply shows. In the thinking man, that
1368
01:25:10,900 –> 01:25:14,540
source of energy dries up only when he ceases to believe
1369
01:25:14,540 –> 01:25:18,100
in the guiding power of thought and allows himself to become merely a
1370
01:25:18,100 –> 01:25:21,390
vehicle for the prevailing popular emotions of
1371
01:25:21,790 –> 01:25:25,590
the moment. I’m going to skip
1372
01:25:25,590 –> 01:25:28,990
down because he’s going to talk about Bertrand Russell, he says, and I’m going to
1373
01:25:28,990 –> 01:25:32,270
read this paragraph. History bears witness to the vital part that
1374
01:25:32,270 –> 01:25:36,110
prophets, quote, unquote, have played in human progress, which is evidence of
1375
01:25:36,110 –> 01:25:39,950
the ultimate practical value of expressing unreservedly the truth as
1376
01:25:39,950 –> 01:25:43,790
one sees it. Yet it also becomes clear that the acceptance and spreading
1377
01:25:43,790 –> 01:25:46,840
of their vision has always depended on another class of men, leaders
1378
01:25:47,560 –> 01:25:51,040
who had to be philosophical strategists striking a
1379
01:25:51,040 –> 01:25:54,280
compromise between truth. I think you’ve said this actually,
1380
01:25:54,760 –> 01:25:58,440
Tom, striking a compromise between truth and
1381
01:25:58,600 –> 01:26:02,199
men’s receptivity to it. Their effect
1382
01:26:02,199 –> 01:26:05,840
has often depended as much on their own limitations in perceiving the
1383
01:26:05,840 –> 01:26:09,160
truth as on their practical wisdom in proclaiming it.
1384
01:26:09,960 –> 01:26:13,800
The prophet must be stoned. That is their lot and the test of their self
1385
01:26:13,800 –> 01:26:17,440
fulfillment. A leader who is stoned, however, may merely prove that he has failed in
1386
01:26:17,440 –> 01:26:21,080
his function through a deficiency of wisdom or through confusing his
1387
01:26:21,080 –> 01:26:24,880
function with that of a prophet. Time alone can tell whether the effect of
1388
01:26:24,880 –> 01:26:28,680
such a sacrifice redeems the apparent failure as a leader that does honor to
1389
01:26:28,680 –> 01:26:32,480
him as a man. And then, finally, I’m going to skip
1390
01:26:32,480 –> 01:26:36,080
a couple paragraphs to go down. Opposition to the truth is inevitable,
1391
01:26:36,160 –> 01:26:39,400
especially if it takes the form of a new idea. But the degree of resistance
1392
01:26:39,400 –> 01:26:42,440
can be diminished by giving thought not only to the aim but to the method
1393
01:26:42,440 –> 01:26:45,470
of approach. This is an important tactic.
1394
01:26:46,190 –> 01:26:48,830
Avoid a frontal attack on a long established position.
1395
01:26:49,630 –> 01:26:53,470
Instead, seek to turn it by a flank movement so that a more
1396
01:26:53,470 –> 01:26:56,430
penetrable side is exposed to the thrust of truth.
1397
01:26:57,390 –> 01:27:01,070
But in any such indirect
1398
01:27:01,070 –> 01:27:04,830
approach, take care not to diverge from the truth. For nothing is
1399
01:27:04,830 –> 01:27:08,430
more fatal to its real advancement than to lapse
1400
01:27:09,060 –> 01:27:10,980
into untruth.
1401
01:27:16,420 –> 01:27:19,860
I wish this guy was still alive. I’d love to have him on the show.
1402
01:27:26,180 –> 01:27:29,700
The dilemma of the intellectual. Or,
1403
01:27:29,860 –> 01:27:33,140
you know, I mean, I. There’s a clip floating around from one of our episodes
1404
01:27:33,140 –> 01:27:36,950
of our show. We were talking about Sam Altman. If we’re so smart,
1405
01:27:36,950 –> 01:27:40,590
why aren’t we rich? Tom, you know. Yeah, I watch that a lot,
1406
01:27:41,630 –> 01:27:42,430
that clip.
1407
01:27:50,350 –> 01:27:54,190
I mean, the big
1408
01:27:54,190 –> 01:27:57,790
separation in life is between the thinker and the doer, right? And,
1409
01:27:59,230 –> 01:28:02,910
and the intellectual is hobbled by his mind and the working of his faculties.
1410
01:28:03,380 –> 01:28:06,580
He’s hobbled into not doing or not acting.
1411
01:28:08,980 –> 01:28:12,420
However, the theory of how people operate can only take you so far.
1412
01:28:12,820 –> 01:28:16,620
You actually have to go into practice. You
1413
01:28:16,620 –> 01:28:20,460
have to actually interact with people, right? And the dilemma of the intellectual,
1414
01:28:20,460 –> 01:28:24,020
the, the practical dilemma that Liddell Hart is talking about
1415
01:28:24,020 –> 01:28:27,780
here relates exactly to what we’ve been talking about here on,
1416
01:28:27,860 –> 01:28:30,520
on the show. When we over
1417
01:28:30,520 –> 01:28:34,240
intellectualize history and we lack
1418
01:28:34,240 –> 01:28:37,600
the emotive, or we deny the emotive power of it. Or even worse.
1419
01:28:37,600 –> 01:28:40,320
And I see a lot of this happening particularly in the last, I would say
1420
01:28:40,320 –> 01:28:43,720
10 to 15 years in our country, when we allow
1421
01:28:43,800 –> 01:28:47,440
history to fall into political or ideological
1422
01:28:47,440 –> 01:28:51,200
capture. Now we’re tying emotions
1423
01:28:51,200 –> 01:28:54,920
to ideology to wind people up because we don’t know
1424
01:28:55,770 –> 01:28:59,530
another way to get them to act like.
1425
01:28:59,530 –> 01:29:03,170
One of the knocks in our time is how few people, particularly
1426
01:29:03,170 –> 01:29:06,010
young people, protest
1427
01:29:06,410 –> 01:29:09,850
injustices like previous generations
1428
01:29:10,170 –> 01:29:13,930
did in the 60s and 70s. And there’s a lot of
1429
01:29:13,930 –> 01:29:17,770
different factors that go into why young people don’t protest injustices at
1430
01:29:17,770 –> 01:29:21,500
as high a rate as was
1431
01:29:21,500 –> 01:29:25,300
done by previous generations. There’s a lot of different factors that go into this.
1432
01:29:26,260 –> 01:29:30,020
But the biggest thing that I see is the protests from young people
1433
01:29:30,420 –> 01:29:34,100
that are about injustices that may or may not be
1434
01:29:34,100 –> 01:29:37,340
happening are probably some of the most ill
1435
01:29:37,340 –> 01:29:39,060
informed protests
1436
01:29:41,460 –> 01:29:44,260
in recent history because
1437
01:29:46,590 –> 01:29:50,350
they have not been given the intellectual foundation
1438
01:29:51,790 –> 01:29:54,990
to think through their positions.
1439
01:29:55,870 –> 01:29:58,310
So I’ll give you a case in point. This is a big
1440
01:29:58,310 –> 01:30:01,990
example and I can point to this when I see a protest on
1441
01:30:01,990 –> 01:30:05,830
an Ivy League college campus. Columbia is
1442
01:30:05,830 –> 01:30:07,630
too easy to pick on, so I’ll pick on Stanford.
1443
01:30:11,790 –> 01:30:15,620
And just for the record, Stanford is not an Ivy League school,
1444
01:30:15,620 –> 01:30:19,380
but go ahead. Well, they, they think they are, but anyway, it’s fine. I
1445
01:30:19,380 –> 01:30:23,220
know, I know they’re not, but they think they are anyway, so. Yeah,
1446
01:30:25,140 –> 01:30:28,460
and students unroll a banner in
1447
01:30:28,460 –> 01:30:32,260
protest, and the banner
1448
01:30:33,060 –> 01:30:35,700
says Gays for Gaza.
1449
01:30:39,950 –> 01:30:43,710
There’s a fundamental, there’s several fundamental. And I’m
1450
01:30:43,710 –> 01:30:47,310
not the only person to point this out. This is why it’s so easy. There’s
1451
01:30:47,310 –> 01:30:51,030
several fundamental steps in
1452
01:30:51,030 –> 01:30:54,749
the ladder to that thought that you just
1453
01:30:54,749 –> 01:30:58,510
haven’t been educated on. Gay people
1454
01:30:58,510 –> 01:31:01,550
in Gaza? Well, there are none
1455
01:31:03,390 –> 01:31:07,060
because in that part of the world, Islam has a
1456
01:31:07,380 –> 01:31:10,660
tradition as a religion
1457
01:31:12,420 –> 01:31:16,020
that is not as, shall we say, tolerant as
1458
01:31:16,020 –> 01:31:19,220
Western Christianity has been, and Western
1459
01:31:19,220 –> 01:31:21,460
secularism has been around
1460
01:31:23,300 –> 01:31:26,820
the public, open, out-of-the-closet practice of
1461
01:31:27,140 –> 01:31:30,700
homosexuality. And so if you
1462
01:31:30,700 –> 01:31:34,510
are unrolling a banner that proclaims that you are
1463
01:31:34,510 –> 01:31:38,230
gay and you, you are protesting against a
1464
01:31:38,230 –> 01:31:41,990
perceived injustice happening during a war in Gaza,
1465
01:31:42,470 –> 01:31:45,830
so thus you are pro the people in Gaza, you have to understand
1466
01:31:46,310 –> 01:31:50,150
that the history of the people that you claim to be in, in solidarity
1467
01:31:50,150 –> 01:31:53,990
with, those people don’t want to be in solidarity
1468
01:31:53,990 –> 01:31:57,070
with you. And by the way, this is not something that they’ve recently like come
1469
01:31:57,070 –> 01:32:00,590
to a conclusion about. They don’t want to be in
1470
01:32:00,590 –> 01:32:02,270
solidarity with you going back to like 600,
1471
01:32:04,510 –> 01:32:08,230
the 6th century. And by the way, they’ve been very clear about
1472
01:32:08,230 –> 01:32:10,110
this. They’re not hiding it.
1473
01:32:12,030 –> 01:32:15,470
So that reveals an intellectual poverty
1474
01:32:16,110 –> 01:32:19,870
that is incredibly troubling. That could be an intellectual poverty that
1475
01:32:19,870 –> 01:32:23,610
again could be solved through an actual understanding
1476
01:32:23,680 –> 01:32:27,360
of the facts of history. So would you
1477
01:32:27,360 –> 01:32:30,960
have been more understanding of the
1478
01:32:30,960 –> 01:32:34,520
rolling out of the banner if it said gays for Gaza and then in
1479
01:32:34,520 –> 01:32:36,960
parentheses it said we understand the irony?
1480
01:32:38,880 –> 01:32:41,960
Yes, I would have been. Yes, that, that would actually, that would actually have worked
1481
01:32:41,960 –> 01:32:45,240
for me. Yes, actually, that, that would, that would have worked for me. Absolutely. And
1482
01:32:45,240 –> 01:32:47,440
I picked that one because it’s a hot-button one, but you can see
1483
01:32:47,440 –> 01:32:50,130
it everywhere on social media. So I picked that one. It’s
1484
01:32:50,130 –> 01:32:53,930
also very blatant. Like, I knew exactly where you were going with
1485
01:32:53,930 –> 01:32:57,610
this as soon as you said what the banner said. Yeah, it’s very blatant.
1486
01:32:57,610 –> 01:33:00,450
I do, I do agree with you in, in a sense, like, so
1487
01:33:01,730 –> 01:33:05,570
there’s. Just a lack of, a lack of historical understanding behind that. Right, right.
1488
01:33:05,970 –> 01:33:09,730
And, and to my point, it’s not whether you are gay or
1489
01:33:09,730 –> 01:33:13,330
not. Shouldn’t have, like, who cares? I, I don’t. It’s not
1490
01:33:13,890 –> 01:33:16,530
if they’re gay or not. But if you’re going to write something like that,
1491
01:33:17,680 –> 01:33:21,480
to your point, or to my point a second ago, be clever
1492
01:33:21,480 –> 01:33:25,240
about it and make sure you let the world know you understand the irony
1493
01:33:25,240 –> 01:33:29,040
behind it. Like, we, we know that the people in Gaza don’t give a
1494
01:33:29,040 –> 01:33:32,880
crap about us, but we, we care about them. We care about the
1495
01:33:32,880 –> 01:33:36,680
right, we care about the X, Y, whatever, whatever. If you feel
1496
01:33:36,680 –> 01:33:40,400
it’s an atrocity or if you feel like it’s a humanitarian thing,
1497
01:33:40,640 –> 01:33:44,470
whatever, whatever you’re. But, but to your point, and I, I don’t disagree
1498
01:33:44,470 –> 01:33:47,990
with you, by the way, that there is a fundamental disconnect
1499
01:33:48,230 –> 01:33:51,430
with some of the ways in which people
1500
01:33:52,070 –> 01:33:55,670
project what they think versus what they actually think,
1501
01:33:55,670 –> 01:33:59,430
or, or why they think that way. There’s, there’s disconnect there. And I, I think,
1502
01:33:59,510 –> 01:34:02,390
yeah, when I saw, I, I saw something similar
1503
01:34:03,270 –> 01:34:06,590
several times. And it’s not just Stanford that has done it. There’s been. The
1504
01:34:06,590 –> 01:34:10,160
LGBTQ community has done several things like that.
1505
01:34:10,640 –> 01:34:14,400
And, and Right. I just go, but do you,
1506
01:34:15,040 –> 01:34:18,720
like, do you understand who you’re supporting? And by the way,
1507
01:34:18,800 –> 01:34:22,000
I’m not suggesting that we shouldn’t stop
1508
01:34:22,640 –> 01:34:26,280
this craziness that’s happening in the Middle East. I. No, no, no. I am for
1509
01:34:26,280 –> 01:34:30,080
stopping killing people. Believe me. I don’t care what country you’re in.
1510
01:34:30,080 –> 01:34:33,400
Just stop killing people. Like, why are we doing this? This doesn’t make any
1511
01:34:33,400 –> 01:34:36,940
sense. You, you referenced a few minutes ago or earlier in this
1512
01:34:36,940 –> 01:34:40,740
podcast, this, this podcast episode that we today have more
1513
01:34:40,820 –> 01:34:44,580
avenues for diplomacy than we ever have. Ever.
1514
01:34:44,900 –> 01:34:48,100
Yet we still have these things happen. Like,
1515
01:34:48,660 –> 01:34:52,380
absolutely. Because human nature will out.
1516
01:34:52,380 –> 01:34:55,980
Even Liddell Hart would say this: human nature will out. Like, it just, it just
1517
01:34:55,980 –> 01:34:59,780
will. People will go to war. I’m reading John Keegan’s, which we’re
1518
01:34:59,780 –> 01:35:03,220
going to cover that on the podcast too. Coming up here in an episode, John
1519
01:35:03,220 –> 01:35:06,230
Keegan’s history of the First World War. And it is
1520
01:35:07,510 –> 01:35:11,270
the thing, the reasons he talks about for going to war are
1521
01:35:11,270 –> 01:35:15,030
the same reasons that people go to war now. It has not changed.
1522
01:35:15,110 –> 01:35:18,790
National pride. A leader’s personal hurt feelings
1523
01:35:18,950 –> 01:35:22,750
and arrogance and ego. Right. They think they can get away with it or
1524
01:35:22,750 –> 01:35:26,590
they have hurt pride as a nation or as a leader. Arrogance,
1525
01:35:26,590 –> 01:35:30,310
ego, pride, power. Right, exactly.
1526
01:35:30,310 –> 01:35:33,950
And, and so they’re just going to do it. So I, I
1527
01:35:33,950 –> 01:35:37,470
would. And yes, if there were a banner that was rolled out
1528
01:35:37,470 –> 01:35:41,310
underneath there, absolutely. I would, I would be
1529
01:35:41,310 –> 01:35:44,470
like, oh, okay, so you, you know what the game is. Okay, cool. I can
1530
01:35:44,470 –> 01:35:47,670
leave that. But, but because there’s not a banner there
1531
01:35:48,150 –> 01:35:51,910
now, I have to treat you or I have to treat that message. I have
1532
01:35:51,910 –> 01:35:55,190
to deal with that message. I have to examine that message critically
1533
01:35:55,750 –> 01:35:59,600
in the realm of ideas and in the realm of ideas,
1534
01:36:00,080 –> 01:36:01,120
that message.
1535
01:36:04,240 –> 01:36:07,120
It may take a brain surgeon. Normally you say it doesn’t take a brain surgeon.
1536
01:36:07,440 –> 01:36:10,960
No, no. It may take a brain surgeon to, to
1537
01:36:11,280 –> 01:36:15,120
parse critically what’s happening inside of that
1538
01:36:15,120 –> 01:36:18,800
message and to realize, again, the poverty of idea that’s at the
1539
01:36:18,800 –> 01:36:21,200
bottom of it. And by the way, there’s a lot of these messages that are
1540
01:36:21,200 –> 01:36:24,680
floating around now, like, okay, you want to be in solidarity with the Palestinians for
1541
01:36:24,680 –> 01:36:28,480
whatever identity group you come from in America. Cool beans, that’s fine. You have the
1542
01:36:28,480 –> 01:36:32,320
freedom to do that. You live in America. You should be thanking whatever deity
1543
01:36:32,320 –> 01:36:35,920
you pray to that you live in America. Cool. Fine. Understand,
1544
01:36:36,480 –> 01:36:39,280
though, that your particular affinity group,
1545
01:36:41,840 –> 01:36:44,640
the history of your particular affinity group in this country
1546
01:36:45,920 –> 01:36:49,520
is a history that is, that is uniquely situated
1547
01:36:50,160 –> 01:36:52,900
in this historical context of this, this country.
1548
01:36:54,100 –> 01:36:56,980
So now I’m going to pick on the Columbia students. Now I’m going to go
1549
01:36:56,980 –> 01:37:00,660
for them. So when you’re talking about
1550
01:37:05,540 –> 01:37:08,100
bringing the Intifada to Columbia University,
1551
01:37:09,460 –> 01:37:13,260
you don’t know what you’re talking about. You
1552
01:37:13,260 –> 01:37:16,900
just don’t know what you’re talking about. You have no idea. You, you, you’ve now,
1553
01:37:17,300 –> 01:37:21,020
you’ve now taken a historical event that you may feel emotion
1554
01:37:21,020 –> 01:37:24,780
about or a series of historical events that you may feel emotion about
1555
01:37:24,780 –> 01:37:27,220
because of the way they were taught to you by a person who’s an expert
1556
01:37:27,220 –> 01:37:30,940
on those historical events. You’ve taken that emotion, you’ve
1557
01:37:30,940 –> 01:37:34,100
channeled it into political ideology, and you’ve come up with a
1558
01:37:34,100 –> 01:37:36,820
sloganeering idea that sounds good on paper.
1559
01:37:37,700 –> 01:37:41,140
Right. We’re going to internationalize or globalize the Intifada. Right.
1560
01:37:42,260 –> 01:37:45,910
And you don’t understand what
1561
01:37:45,910 –> 01:37:48,750
you’re asking for because someone like me
1562
01:37:49,630 –> 01:37:52,750
looks at that message and goes, okay.
1563
01:37:53,390 –> 01:37:56,830
And by the way, there’s no irony underneath that. You’re deadly sincere.
1564
01:37:57,070 –> 01:38:00,790
Okay, that’s cool. I’m deadly sincere about not having the
1565
01:38:00,790 –> 01:38:04,470
Intifada here because I know exactly what that word means. And the
1566
01:38:04,470 –> 01:38:08,030
history that that word represents inside of that historical
1567
01:38:08,030 –> 01:38:11,070
context over there. That has, by the way,
1568
01:38:11,950 –> 01:38:15,350
some of it does have something to do with us as America, but
1569
01:38:15,350 –> 01:38:19,190
the vast majority of that has more
1570
01:38:19,190 –> 01:38:22,750
to do with how the European colonial powers dealt with the
1571
01:38:22,750 –> 01:38:26,270
breakup of the Ottoman Empire and dealt with Turkey
1572
01:38:26,270 –> 01:38:29,830
going all the way back to the rise of the Ottoman Empire in like
1573
01:38:29,830 –> 01:38:31,310
1086 or something.
1574
01:38:33,870 –> 01:38:37,110
That’s history. That’s what I’m talking about. That poverty of
1575
01:38:37,110 –> 01:38:40,830
understanding, that inability to link ideas together and have second thoughts.
1576
01:38:41,250 –> 01:38:45,090
That’s the part that’s most troubling to me around why we don’t learn
1577
01:38:45,090 –> 01:38:47,970
from history. That’s the most troubling thing to me. And of course I’m going to
1578
01:38:47,970 –> 01:38:51,050
have people who are going to email me and say, well, hey, Jesan, yes, we
1579
01:38:51,050 –> 01:38:54,690
absolutely do understand that. It goes to 1086. And you have to understand that colonialism,
1580
01:38:54,770 –> 01:38:58,250
daba dabba, dabba, systemic oppression. And I’m going to say, well, that’s one
1581
01:38:58,250 –> 01:39:01,770
interpretation. That’s just like your interpretation,
1582
01:39:01,770 –> 01:39:03,890
man. I have a different interpretation.
1583
01:39:05,970 –> 01:39:09,290
I want to see that you have an actual depth to your
1584
01:39:09,290 –> 01:39:13,090
interpretation. And I don’t think you do, because I don’t think you studied
1585
01:39:13,090 –> 01:39:13,370
history.
1586
01:39:17,210 –> 01:39:21,010
Yeah. Oh, I’m going all in on this one. I’m going
1587
01:39:21,010 –> 01:39:24,690
all in on this episode. I got nothing on this one. I’m going all in
1588
01:39:24,690 –> 01:39:28,290
on this one. I am, because I’m passionate about history and I’m
1589
01:39:28,290 –> 01:39:31,770
passionate about people actually learning these things and actually caring
1590
01:39:32,170 –> 01:39:35,610
in sincere ways about what the words are that they say
1591
01:39:35,970 –> 01:39:39,570
and understanding the historical context that they exist
1592
01:39:39,570 –> 01:39:41,570
in. And also understanding that
1593
01:39:43,730 –> 01:39:47,530
the history of this country is not the history of everywhere else. And so
1594
01:39:47,530 –> 01:39:51,370
the kinds of quirks and things that you get to get away with here, if
1595
01:39:51,370 –> 01:39:54,890
I put you anywhere else, I mean, we’re talking about the Middle east right now,
1596
01:39:54,890 –> 01:39:58,130
but let’s talk about any, any country on the African continent.
1597
01:39:58,610 –> 01:40:00,530
If I put you there, you’re going
1598
01:40:00,530 –> 01:40:04,180
to have a problem because it’s a different thing there. And
1599
01:40:04,180 –> 01:40:07,860
so outsourcing that or
1600
01:40:07,860 –> 01:40:11,460
insourcing those ideas to here doesn’t. Doesn’t work. And
1601
01:40:11,460 –> 01:40:15,260
then the other, the other way doesn’t work either. Outsourcing those ideas to other
1602
01:40:15,260 –> 01:40:19,020
places. Like the conversations that we have in America about social justice
1603
01:40:19,100 –> 01:40:22,940
based on our particular historical understanding of what that means don’t work
1604
01:40:22,940 –> 01:40:26,740
in France and they don’t work in England because they come
1605
01:40:26,740 –> 01:40:30,300
from a different lineage and a different heritage and a different way of thinking about
1606
01:40:30,300 –> 01:40:34,100
both society and justice which gets
1607
01:40:34,100 –> 01:40:36,940
back to historical consciousness anyway.
1608
01:40:37,900 –> 01:40:41,700
And, and you know, so, so listen, this is, this is summed up very
1609
01:40:41,700 –> 01:40:44,900
easily, right? This is summed up very easily. Yeah, go ahead. We just did two
1610
01:40:44,900 –> 01:40:48,540
hours. We did. And, and the idea, the concept
1611
01:40:48,540 –> 01:40:52,340
here is very simply we’re never going to learn
1612
01:40:52,340 –> 01:40:55,500
from history. It’s never going to happen because
1613
01:40:55,980 –> 01:40:59,830
of the emotions behind pride, power,
1614
01:41:00,070 –> 01:41:03,670
ego. That, that’s part of, it’s,
1615
01:41:03,670 –> 01:41:06,750
it’s human nature. It’s who we are as a people,
1616
01:41:06,750 –> 01:41:10,590
as an entity, as a being. We get
1617
01:41:10,590 –> 01:41:14,350
our pride hurt, we want to fight. We
1618
01:41:14,350 –> 01:41:17,950
see something that we don’t have that somebody else has, we want it, we’re
1619
01:41:17,950 –> 01:41:21,270
gonna fight. Like it, it,
1620
01:41:21,670 –> 01:41:23,510
we’re never. Now that being said,
1621
01:41:25,520 –> 01:41:29,280
if we ever get to the point where, you know, there’s
1622
01:41:29,280 –> 01:41:32,080
been lots of talks about how, you know, over
1623
01:41:32,480 –> 01:41:36,120
evolution where like you know, we’re losing our pinky, we lost our
1624
01:41:36,120 –> 01:41:39,440
appendix because certain things happen and there’s some sci fi
1625
01:41:39,679 –> 01:41:43,240
theory out there that, that says that as we, as our
1626
01:41:43,240 –> 01:41:47,080
brains get bigger and we evolve, that
1627
01:41:47,080 –> 01:41:50,650
fight or flight mechanism will go away and
1628
01:41:50,810 –> 01:41:54,650
the urge to fight for power will go away because we’ll have the intellect
1629
01:41:54,970 –> 01:41:57,850
to not have to do that anymore. Until that happens,
1630
01:41:59,130 –> 01:42:02,170
we’re never going to learn from history because we are going to fight over things
1631
01:42:02,170 –> 01:42:05,810
that are prideful for us. We’re going to fight over things that we want power
1632
01:42:05,810 –> 01:42:09,650
over. We’re going to fight over things that you know, our leadership
1633
01:42:09,650 –> 01:42:13,290
has an ego trip over
1634
01:42:14,090 –> 01:42:17,480
because of their own self-interest or worse. Like
1635
01:42:17,640 –> 01:42:21,400
that’s going to continue to happen. So and, and
1636
01:42:21,400 –> 01:42:24,840
I don’t think any. There was a reference
1637
01:42:25,400 –> 01:42:29,160
to the
1638
01:42:29,160 –> 01:42:32,600
military strategy room that goes a little beyond
1639
01:42:32,840 –> 01:42:36,040
the Joint Chiefs of Staff in World War II in thinking about
1640
01:42:37,320 –> 01:42:41,000
Japan would never in a million years bomb Pearl Harbor because they just know how
1641
01:42:41,330 –> 01:42:43,970
big and strong the United States is. And there was like a,
1642
01:42:45,010 –> 01:42:48,850
a believability bias that happened in there where even if one person said
1643
01:42:48,930 –> 01:42:52,730
but wait, the data shows this and they
1644
01:42:52,730 –> 01:42:56,370
go yeah, I don’t care what the data says, Japan will never do it. Right?
1645
01:42:57,169 –> 01:42:59,930
Japan’s never going to do that. We’re too big, we’re too big, we’re too big,
1646
01:42:59,930 –> 01:43:03,610
etc. Etc. We’re too strong, we’re too powerful, etc. Whatever. And
1647
01:43:03,610 –> 01:43:07,110
there’s even a report saying that
1648
01:43:08,220 –> 01:43:12,060
when Japan’s planes were en route,
1649
01:43:12,700 –> 01:43:16,420
a radar technician went
1650
01:43:16,420 –> 01:43:20,140
to his superior officer and said hey, there’s a blip on this map. It looks
1651
01:43:20,140 –> 01:43:23,980
a little funny. And the superior officer said,
1652
01:43:23,980 –> 01:43:26,700
oh well, I think we’re expecting our
1653
01:43:26,700 –> 01:43:29,500
B-17s coming back. Don’t worry about it.
1654
01:43:31,020 –> 01:43:34,750
An hour later, Pearl Harbor doesn’t exist, or, you know, the rest is history.
1655
01:43:34,750 –> 01:43:38,590
So yeah. So even when there is a
1656
01:43:38,590 –> 01:43:40,670
singular voice of reason
1657
01:43:42,350 –> 01:43:46,150
because of, like, that consensus bias that happened,
1658
01:43:46,150 –> 01:43:49,990
we still don’t listen to it. So even if there is an
1659
01:43:49,990 –> 01:43:53,790
intellect that comes above and beyond, or goes beyond, our thinking, and
1660
01:43:54,350 –> 01:43:57,390
again all the things we talked about today, whether it’s military,
1661
01:43:57,390 –> 01:44:01,160
geopolitical, or whether it’s business related. Because you don’t think that
1662
01:44:01,240 –> 01:44:04,880
somebody out there, in, like, the
1663
01:44:04,880 –> 01:44:08,160
VC world, is going, hey, can we put a little bit of a brake on this,
1664
01:44:08,160 –> 01:44:11,680
on spending on these AI companies? Because there’s going to be a bubble somewhere. And
1665
01:44:11,680 –> 01:44:14,600
then the consensus bias still happens and goes
1666
01:44:15,080 –> 01:44:17,880
nah, you don’t know what you’re talking about. Nah, you don’t know what you’re talking
1667
01:44:17,880 –> 01:44:19,400
about. You don’t know what you’re talking about.
1668
01:44:22,040 –> 01:44:25,850
So, we’re
1669
01:44:25,850 –> 01:44:29,570
not solving this problem today, Jesan. No, we’re not solving this problem today.
1670
01:44:29,570 –> 01:44:33,090
But maybe. The fact that we can identify it
1671
01:44:33,170 –> 01:44:36,890
probably tells me that there are much smarter people than us who
1672
01:44:36,890 –> 01:44:39,650
have already identified it and they couldn’t do anything about it.
1673
01:44:41,890 –> 01:44:45,690
Well so at least, at least what we can say is this. I think we
1674
01:44:45,690 –> 01:44:49,490
can say this. I think we could sum up our two-hour-long conversation
1675
01:44:49,650 –> 01:44:53,290
in this two-hour-long episode around Why We Don’t Learn
1676
01:44:53,290 –> 01:44:55,810
from History by B.H. Liddell Hart with this:
1677
01:44:57,570 –> 01:45:01,290
I am a proponent, and I think probably you are as
1678
01:45:01,290 –> 01:45:04,930
well, Tom. I am a proponent of a more,
1679
01:45:06,050 –> 01:45:09,650
some would say tragic but maybe that’s not it. A more
1680
01:45:10,210 –> 01:45:13,810
small-c conservative view of history
1681
01:45:14,530 –> 01:45:18,330
rather than a capital-P progressive view of history
1682
01:45:18,730 –> 01:45:22,410
which is the kind of view of history that quite frankly
1683
01:45:22,730 –> 01:45:26,370
folks in the west have had since the Enlightenment going back
1684
01:45:26,370 –> 01:45:30,130
to the 17th century. This idea that everything will
1685
01:45:30,130 –> 01:45:33,970
just somehow consistently get better and that
1686
01:45:33,970 –> 01:45:36,250
history is merely the tale of
1687
01:45:37,610 –> 01:45:41,460
endless improvement going forward. I mean, you see this with the techno-futurist optimists
1688
01:45:41,690 –> 01:45:45,370
right now, like Marc Andreessen and others that are. To your point, they’re sitting around
1689
01:45:45,370 –> 01:45:49,210
in all those rooms dumping all this money into AI because like, the future
1690
01:45:49,210 –> 01:45:52,490
will just be better. Maybe it will, maybe it won’t.
1691
01:45:53,370 –> 01:45:56,570
The tragic view of human nature. They also serve another
1692
01:45:57,450 –> 01:46:01,210
very vital problem with the whole theory of history teaching us,
1693
01:46:01,290 –> 01:46:04,490
because they feel if they throw enough money at it, it will eventually get
1694
01:46:04,490 –> 01:46:08,340
better regardless. Right? You’re right. Exactly. No matter how tragic human nature is,
1695
01:46:08,490 –> 01:46:11,930
is right. Right. And. And the thing is you can’t.
1696
01:46:13,850 –> 01:46:17,530
In taking a small-c conservative view of human nature.
1697
01:46:17,530 –> 01:46:21,370
Not of human nature, sorry, of history, which I, I believe
1698
01:46:21,370 –> 01:46:25,130
I’m a proponent of.
1699
01:46:25,370 –> 01:46:29,210
You understand that no matter how many trillions of dollars you throw at human
1700
01:46:29,210 –> 01:46:32,410
nature. And yes, I did use trillions.
1701
01:46:34,180 –> 01:46:37,660
It doesn’t matter. Human nature is impervious to your
1702
01:46:37,660 –> 01:46:41,180
trillions. And human nature is a whack a
1703
01:46:41,180 –> 01:46:44,700
mole. It pops up in all kinds of odd
1704
01:46:44,700 –> 01:46:48,380
places and it. It pushes
1705
01:46:48,380 –> 01:46:52,220
and expands boundaries, usually against your
1706
01:46:52,220 –> 01:46:55,380
will and the will of your dollars. And by the way, the will of your
1707
01:46:55,380 –> 01:46:59,220
power. You know, you could say that
1708
01:46:59,870 –> 01:47:03,670
the trillions of dollars will do it, or rules and regulations will
1709
01:47:03,670 –> 01:47:07,110
do it, or laws will do it, or punishments and
1710
01:47:07,110 –> 01:47:10,030
consequences will do it. And the fact of the matter is,
1711
01:47:11,310 –> 01:47:14,990
a small-c conservative reading of history
1712
01:47:16,350 –> 01:47:18,510
says that you’re actually incorrect.
1713
01:47:19,950 –> 01:47:23,790
You’re wrong. And you’re wrong with
1714
01:47:24,510 –> 01:47:27,720
a. You’re wrong. And you’re. And you’re one more.
1715
01:47:29,240 –> 01:47:32,920
You’re one more wealthy person or wealthy system
1716
01:47:33,960 –> 01:47:35,560
or wealthy set of institutions
1717
01:47:37,720 –> 01:47:41,000
that will wind up on the ash heap of history
1718
01:47:41,800 –> 01:47:43,720
being read about by a bunch of people
1719
01:47:45,720 –> 01:47:49,320
four or five hundred years from now going, why didn’t those people get it?
1720
01:47:50,200 –> 01:47:50,680
Yeah.
1721
01:47:54,290 –> 01:47:58,130
And so with that, there’s no upper
1722
01:47:58,290 –> 01:48:01,650
on this episode of the podcast. There’s a way to end with this. I
1723
01:48:01,650 –> 01:48:04,210
think we’re done. I think we’re just. We’re just done. Wait, hold on. I think
1724
01:48:04,210 –> 01:48:04,770
we’re done here
1725
01:48:10,930 –> 01:48:14,210
with that. We’re. We’re out.











