Interview with Brian Morgan, Founder of Think Deeply, Write Clearly
—
00:00 Introduction to Brian Morgan, Founder of Think Deeply, Write Clearly.
08:44 The Kardashians vs. Meaningful Influence.
13:09 Craving Authentic, Thoughtful Content.
20:02 “Focus on Audience, Not Virality.”
26:41 Identification Is Not Power.
30:47 Everyday Unspoken Choices.
37:24 Education System-Conspiracy Theory Debate.
40:08 Privileged Upbringing’s Educational Impact.
44:21 “Globalism Education Gap.”
50:25 Human Success: Cognition-Enhanced Instincts.
58:50 Personal Rationality and Emotion.
01:00:09 Self as Reality’s Interpreter.
01:06:42 Novels, Noise, and Cultural Wisdom.
01:15:28 Human Intuition vs. AI Analysis.
01:21:01 “Deep Writing Program Intro Offer.”
01:21:54 Connect with Brian Morgan
—
Opening and closing themes composed by Brian Sanyshyn of Brian Sanyshyn Music.
—
- Pick up your copy of 12 Rules for Leaders: The Foundation of Intentional Leadership NOW on AMAZON!
- Check out the 2022 Leadership Lessons From the Great Books podcast reading list!
—
★ Support this podcast on Patreon ★
- Subscribe to the Leadership Lessons From The Great Books Podcast: https://bit.ly/LLFTGBSubscribe
- Check out HSCT Publishing at: https://www.hsctpublishing.com/
- Check out LeadingKeys at: https://www.leadingkeys.com/
- Check out Leadership ToolBox at: https://leadershiptoolbox.us/
- Contact HSCT for more information at 1-833-216-8296 to schedule a full DEMO of LeadingKeys with one of our team members.
—
- Leadership ToolBox website: https://leadershiptoolbox.us/
- Leadership ToolBox LinkedIn: https://www.linkedin.com/company/ldrshptlbx/
- Leadership ToolBox YouTube: https://www.youtube.com/@leadershiptoolbox/videos
- Leadership ToolBox Twitter: https://twitter.com/ldrshptlbx
- Leadership ToolBox IG: https://www.instagram.com/leadershiptoolboxus/
- Leadership ToolBox FB: https://www.facebook.com/LdrshpTl
—

Jesan Sorrells: Hello. My name is Jesan Sorrells, and this is the Leadership Lessons from the Great Books podcast, bonus. There's no book reading on these bonus episodes, or at least there's usually no book reading, although we've been breaking that rule lately. These are typically interviews, rants, raves, insights, and other audio musings and conversations about leadership. Because listening to me and an interesting guest talk about leadership for at least a couple of hours is still better than reading and trying to understand yet another business book, even that business book that's been sitting on your shelf for the last six months that you got for Christmas, even though that was, like, two months ago.

Our guest today, Brian Morgan, is a founder who has a system. Quoting directly from his website: Brian Morgan built the Think Deeply, Write Clearly system to address the gap between the requirements of the college education system, even at excellent schools, and the depth of thinking and writing needed in business. The system grew out of his 15 years of experience as managing editor at one of New York City's premier environmental planning and engineering firms, and also from his teaching work at the New Jersey Institute of Technology, among other New York City area schools.

But having a website that is touting a system is not the reason we are talking with him today on the show. It's a nice little extra thing, but it's not the real reason we're talking with him on the show. That would violate one of our core principles. We're talking with Brian on the show today because he thinks deeply about writing, education, reading, technology, and how all those areas intersect with, well, leadership. And he writes clearly about all of those areas.

And in a world of algorithmic, and I'm gonna use a term here, so if you've got kids in the car, mute me here, but in a world of algorithmic, quote, unquote, enshittification, to borrow a term from Cory Doctorow, which describes the gradual deterioration of online platforms but increasingly can be applied to the deterioration of communication in general, understanding in particular, and critical thinking most narrowly, Morgan's approach to deepening human thinking through writing might just be the revolution we need right now. And I haven't even gotten into the impact that the large language models are going to have on human cognition. We're already starting to see the signs, and we may talk about that today. But Brian has thought about all of this and more, and we're gonna talk with him, like I said, about all of it here on the podcast today. So welcome to the podcast, Brian. How are you doing?

Brian Morgan: Good. Thank you for the opportunity, my friend. Appreciate your time today as well.

Jesan Sorrells: Absolutely. So tell all our listeners who you are and what you do and all that.
Brian Morgan: Well, I mean, I think you've done a nice introduction. If people hang out on LinkedIn, they probably know me for the business that I run, which is both a corporate training and also a marketing and sales company. But those two bend together in one specific space, which is really, I would say, I help people put language to their thinking. And that begs the question, of course, because that's not particularly new or revolutionary in and of itself. It begs the question, really: what is the quality of the thought that is being expressed? And maybe that shouldn't be quite as revolutionary as it is. Right? Maybe we should think about the quality of the thoughts that we want to express, and that really goes in both directions. Right? Like, if you work for an institution where people make decisions based on your language, you work for the United States Federal Reserve or you work for a consulting company, so that people are gonna make decisions based on your language, then they need to know the quality of the thinking behind that language so they can make accurate decisions. And it's sort of the same thing, not "sort of," in the marketing and sales space. If we were to indicate to somebody that this is our view of the world, and that because of this view of the world the services that we provide are accurate reflections of what we see the world to be, then we want the quality of that thinking to be good. And, unfortunately, we live in a world, both on the institutional, let's just say corporate, side as well as the entrepreneurial sales, marketing, podcast-y side, where we care a lot about getting behind the microphone, and we care maybe not as much as we should about the quality of the thought that is reflected in the speech that we use behind that microphone. And I think that's where those two things meet together, if that makes sense to you.
Jesan Sorrells: Yeah. That makes sense. Matter of fact, that makes a lot of sense, and it particularly aligns with the work that I'm trying to do on this podcast by reading novels and essays and nonfiction and fiction and trying to pull leadership lessons from spaces that have not been watered down by business book jargon, or by logos, or by a lack of quality, and I did say logos, yes, or by a lack of quality of thought.

So let's start off with that sticky area, because listeners are going to hear this and they're gonna go: quality of thought? All my thoughts are quality. All of my thoughts deserve to be spat out on the Internet or on social media, or all of my thoughts deserve to be tweeted. I mean, we have an entire platform, Twitter, that is devoted to shortening the distance between my thought and its expression. And I am algorithmically rewarded for the hot take, and I am not algorithmically rewarded for the slow burn. So I guess maybe the core question here is this: what does quality of thought mean in a time such as this?
Brian Morgan: Yeah. Well, okay. I love this question, because it implies so many, I think, stunningly important things that we just don't get enough of a chance to discuss in the world. And so the first thing is, everyone has a right to their own thinking. Right? And we should. Even if I don't agree with someone's thinking, or I think it's shallow, or I think it's stupid or whatever, I would still say, please continue thinking. Like, you should do that, and the world should do that. And, like, don't stop now just because some idiot on your lovely podcast is gonna indicate that the quality of thinking out there on social media platforms or whatever is not particularly useful.

But it begs the next purpose of the question, which is, why would we want to share that thinking? Right? The fact that I have the thought, and the fact that I have the right to share it, doesn't necessarily answer the question: what is the purpose of sharing it? And this is, to me, where we get screwed up. And so if I had to put my initial thoughts together on that, it would go something like this. Most people want to share their thoughts because some very unthoughtful marketing adviser or something told them that if they were top of mind and they had a personal brand, other people would see them as credible and amazing, and they would make money, and they'd get better jobs, and they'd be promoted. And it's all bullshit, of course. Like, no. None of that actually happens, unless your name's Kardashian. But the Kardashians are sort of interesting here. Right? Because to me, this is sort of the point: the Kardashians do that, and I know who they are, but I wouldn't hire them to do anything. Right? And so the only thing the Kardashians have actually accomplished is that people know who they are and advertisers are halfway decent at making money from them. But certainly, no one benefits from the wisdom that they share or don't share. We don't see it.

And so if your business, or your approach to your business or your job or whatever, is based on how well you make sense of the world, then it begs the question: is a constant litany of all of the thoughts that you have the best way to display the credibility that you bring to your thinking? And the answer is almost certainly not. And so the minute we have this conversation, we end up saying, well, if the purpose of my writing is now no longer to build my personal brand, which is the dumb way of looking at it, but it is to help other people, meaning my job now is to put language into the world that helps other people live better lives, and from that I also make money, have a better brand, be seen as an expert in a space, then it begs the question: what is the quality of my thinking that is necessary to actually help somebody? At which point you realize, well, it's very rare that we live in that language space inside our own heads. It's very easy to say, I don't like what's happening with the government. It's very hard to say, I've been thinking about why I don't like what's happening with the government, and what I'd prefer to see done differently, and why, and the outcomes that I think the government wants, which are different than the outcomes that I might want, and this is the quality of thinking that I'm bringing to this discussion. And so you can build a personal brand with your complaints, but you won't build any trust for your thinking. But the minute we get to discussing not what I think, but why and how I think it, you can gain a lot of intimate trust for your thinking. And if language is tied immediately to that depth of thought, then all of a sudden the world moves in your direction pretty quickly. But I think we begin to see that without that reflection of why do I think that, and what is the process that I've gone through to think about why that's important to other people, then what we have is a whole bunch of noise on social media, among other places. And it's not that people don't have the right to that noise. They have the right. It's just not particularly useful to them or to anybody else. But does that make sense, or how do you hear that?
Jesan Sorrells: No, that makes a lot of sense. If we begin with: what is the purpose of sharing my thought? Right? What is the thing that I want people to do? Which, by the way, is good marketing. And I'm a fan of the writer Seth Godin. He's also a marketer. He does write deceptively simple sentences that have deep thoughts in them. I don't know that I agree with all of those sentences, but the thought is definitely there. And you can tell in the nature of his writing versus the writing of a person like, I'm gonna throw him under the bus because, what the heck, why not, he's not listening to the show: Gary Vaynerchuk. Right? Like, that guy is monetizing everything to the nth degree. He would even monetize his facial hair if he could get away with it. Right? And he's sort of an exaggeration of the worst examples, you mentioned the Kardashians, sort of the worst examples of fame culture, or the outcomes of fame culture, in a fragmented media environment where it seems as though shouting your purposeless thoughts louder in order to get attention in the public square seems to be the mode for most marketers.

And there is a growing category of people, because I do believe there's a tension here. There's a growing category of people, and this is why I have you on the show, and I put myself in that category. There's a growing category of people who are tired of garbage thinking and garbage writing. Now, they don't know where to go, because the systems haven't been built out for them over the last twenty years. Maybe they'll start being built out over the next twenty years. And so because they have nowhere to go, they're listening to podcasts, or they're writing on Substack, or they're, you know, in those spaces where long form, I hate this term, but long form content is the thing. Right?

And then the other thought that I have, and these were all half-formed thoughts, but this is probably the most controversial thought that I have. This is the most controversial thought that I have. What you're talking about is gatekeeping myself, and why would I wanna do that? I'm being rewarded for not gatekeeping myself. And the gatekeepers who used to stop me, and I'll use a perfect example for this: when there used to be newspapers and people used to write letters to the editor, there was always a crank file. And the editor of the newspaper, managing or otherwise, acted as a stop on that person's crank thinking. So, you mentioned the government: I have a problem with the government, and it's the aliens' fault. And the newspaper editor looked at that thought, which might have been fully thought out on mimeograph paper, and said, we're not publishing that. We're acting as a backstop on that. It seems as though, from my perspective, when you're asking people to gatekeep themselves, and yet they are being rewarded for not doing that, and there are no external gatekeepers on them, it seems as though you've selected a Mount Everest of a problem to solve. Or am I looking at it incorrectly?
Brian Morgan: I think you're looking at it the way the platforms would love for you to look at it, but it wouldn't be the way I would look at it.

Jesan Sorrells: Okay.

Brian Morgan: If I was a person who wanted to, for instance, display the credibility of my thinking for my business or something. And so I'll give you a good example of this. I was in, yeah, I'm pretty sure this is accurate, I think I was in Germany when, yeah, I was in Italy. I was in Italy. Same trip, different time period. When Bridgewater, the largest hedge fund in the world, put out an essay called "We're All Mercantilists Now," which, we're recording this on February 24, 2025, and that was probably December 15. Pretty good job, Bridgewater. Right? Pretty good job.

So you would think, that essay, written by Ray Dalio's replacement, I can't imagine that that thing did not go viral. Right? They paid. They sponsored it. I've seen that ad. They paid to put that in front of me. Facebook didn't just give it to them; they paid to put that in front of me. We're talking about it right now. How many tens of thousands of pieces of content have crossed both your life and mine since December 15, 2024, and we're talking about that content? And it's like, that wasn't up to the algorithm. Somebody wrote something amazing and useful and helpful, and then they paid to put it in front of us. And we're talking about it now because it was useful and helpful and brilliant. And I imagine they're gonna keep doing that, and their hedge fund will continue to grow and grow and grow. And they seem to be some of the wisest people in the space.

And so if you're the platform, you say, hey, what's the free stuff that I have to do to keep the people who are gonna see a lot of stuff on the platform? And then it's gonna be angry stuff, shallow stuff, stupid stuff. Right? It's gonna be all of that. And so the platform will look at it and say, I want the shallowest, angriest, most unthoughtful stuff all of the time, because it keeps other shallow, angry, unthoughtful people on the platform, and I can sell, you know, shoes to them or something. Right? Okay. And then there's the other layer. It's like, you and I are on there. We're looking at pictures of our friends' kids or something. Right? We're looking at each other's stuff, probably. And we go, this is a great essay. And the algorithm didn't give it to us. The ad space gave it to us. And so we get that out of that experience, and they know how to target it to us.

And so the question becomes, who are you aiming at, and then how do you get that material in front of that person? And the difficulty we have in the marketing space is that everybody says, in order to market, you have to understand the algorithm. Bullshit. In order to market, you have to understand human comprehension. And so if we were to say somebody has to comprehend this about something that really matters to them, then the only question is, how do we get that piece of information in front of that person? And it doesn't actually matter if there are only, let's just say, 10,000 people on there who are that person versus the millions of people on there. All you have to do is get that in front of the right 10,000. And so on LinkedIn, there are ways to do that. On Facebook, there are ways to do that. We do a lot of it in relationship building on LinkedIn, which is how you and I met, around people that we really think are interesting and thoughtful people. And so I think if we start looking at it like, who is it that we need to speak to, and who is it that this piece of content is going to be beneficial for, and how do I get this in front of them? That's an equation that makes sense to us, and it removes the question of what the algorithm sponsors, makes easy, or makes go viral. All of that doesn't matter. That's their problem, not mine. My question is, how do I use that service to get the right information in front of you? And as long as I'm in control of that, I could give a shit what they do. Right? Let them make pictures of bananas go viral. I don't care. Right? I just want my stuff in front of you so that we can have this relationship. And I think if we started looking at it that way, we'd stop looking at it like, oh, I understand if I put this and that in there, I'm gonna go viral. And I'm like, why the fuck would you wanna go viral? Say thoughtful things and get them in front of thoughtful people, and your world gets really simple, and much more lucrative, really quickly. So does that make sense, or how do you hear that?
Jesan Sorrells: Yeah. No, that makes a lot of sense. And I hear the core challenge in there of understanding human comprehension. So let's wander down that road a little bit. How do we, how do I, right? I'll use myself as an example. I've written three books. I'm getting ready to write a fourth one. I do this podcast. I do training and development, kind of the same as you do. I work with clients. I'm consulting and coaching in the leadership and organizational behavior space. You know, I'm trying to give people, and I try to push clients towards, meat, not milk. You know, one of the greatest compliments I've ever gotten from a client is that, you know, Jesan, you offer pragmatic solutions. Because things have to work. Because that's really what people care about. People care about things working. Right? So this essay that you were referring to, "We're All Mercantilists Now": if that's going to make me invest better as a member of that hedge fund, or as a part of that hedge fund, or as a person who's giving advice from that hedge fund to the hedge fund I'm running, great. It's been pragmatic. It's worked. Right?

But the rise of pragmatism, which I think, by the way, is the only escape hatch you have from the algorithm, or at least it's the escape hatch I found, the rise of pragmatism as a countervailing force, or a counterbalancing force, does require not only an understanding, I think, of human comprehension, but an understanding of human attitude and behavior. And so how do comprehension and behavior link together if I'm writing my fourth book? Right? Which I am, by the way. I'm writing my fourth book. So if you were advising me in writing my fourth book, which is not a business book, it's a cultural commentary book, a little bit of a polemic, a small book, you know, only about fifty pages. It's a book I feel compelled to write. That's why I'm writing it. I spent two years working on the ideas in it. Partially the podcast has influenced it; other things are the conversations I have with people. And, by the way, I write because I want to inflict my ideas on other people, because I think they're worth inflicting on other people, and I think the page should have them in book form, because I'm obsessed with books. Right? I'm drunk on ideas, as Richard Dawkins would say. Right? So how do I, as a person writing a book, putting an idea out in the world that I thought deeply about, how do I understand the link between comprehension and behavior in order to get somebody, not necessarily to pick up the book, but just to read my deep thoughts?
Brian Morgan: So, I just wanna make sure I understand the question.

Jesan Sorrells: Sure.

Brian Morgan: I understand the comprehension part. And I think you're asking, what if people don't have the right attitudes or behaviors to be open to the type of writing that you're discussing, and, therefore, how do we access them?

Jesan Sorrells: Yeah. And I think that's a huge problem for a lot of folks, because we're in a fragmented communication milieu. Well, we just saw this, you know, we're in 2025. Right? We just saw this with the last election. Right? There are many, many people who don't know people who voted for the opposite candidate on their social media platforms. Right? Because the platform does the thing that the platform does, and I loved your description of that. But in real life, they don't know. Forget the platforms. In real life, which is another area we can talk about, they don't know anybody, because we're self-selecting. Right? COVID really sort of accelerated this process, you know, as people literally, physically got up and moved around the country. Right? Because they could. Right? It was a unique opportunity to be able to do that. I know because I was one of those people. People self-selected into or out of groups. Right? Behavior groups that they wanted to be a part of. But when you do that, the group of people you're targeting with your ideas, if you're a business, does not expand. It becomes smaller. So how do we link, like I said, how do we link behavior and comprehension together? How do we do that?
Brian Morgan: I think the best answer I could give you is, we accept that we can't, but we can invite people into our own comprehension. And so, if you don't mind a slightly longer answer to the question, this is a very common thing that I do in my workshops. You'll see this coffee cup here; if anybody's on YouTube, you can see this. And it says "this is not a coffee cup" on it. And what that means is, we don't think of it, but we learned language as an act of manipulation. So a teacher held up a pen in a fourth grade classroom and said, this is a pen. And what is this? And the class went, it's a pen. What is it? It's a pen. What is it? It's a pen. And went, okay, great. You can identify things in the world. You have power. And to be very clear, in love and adoration to all fourth grade teachers, they needed to do that, because the kids have to function in the world. That's your vegetables. You have to eat them. Right? Like, kids have to function. But we never actually fix that underlying assumption, which is that identification is power, and I can't think of anything less actually accurate than that.

So if we take the coffee cup: we don't say, you know, that if I tell you this is a coffee cup, this is an act of manipulation. Right? If you're at my house and I say, is it alright with you if I put your coffee in this coffee cup, we don't say that's an act of manipulation. We don't say, I'm commanding you to see this as a coffee cup because I have learned it to be a coffee cup, and I demand that you see this as a coffee cup because it's a coffee cup. And you probably do. But there's a reason that this is effective for holding coffee. It's ceramic. It's got a handle. It's got a decent size to it. The ceramic structure is different than metal or glass, which would burn my hand if I put something warm in it. And so I could say, while you're at my house, do you mind if I put this in this cup, which is ceramic, and it's got a handle on it, and it's got looser molecules in it than, say, metal or glass, so it won't burn your hand? And for the duration of this conversation, is it alright with you if I call this a coffee cup and put it in there? And if you say, that was completely useless, I call that a coffee cup too, that's ridiculous, of course, you can do that. But what I've really done is invited you into my frame of the world. I haven't commanded you to take my own. And that's a very different thing than "this is a coffee cup." Right? Like, that's a very different thing. "This is my understanding of the structure of this thing, and I think it's gonna be useful for you, and do you mind if I put your coffee in it?" is a very different thing than "this is a coffee cup."

And no one says this matters. No one says we need to do this. This is just semantic philosophical bullshit, until someone says, Democrats are assholes, and you go, where the hell did you get that from? That's really mean. That's, like, you're saying, jerk, like, where did that come from? And it's like, wait a minute. If I can command you to see reality as I do with this, then I've learned that my view of reality is real. And it's not. It's a lie. It's bullshit. And so what I have to do is say, my understanding is that this functions in a way that is useful to hold coffee, and if it's alright with you, I'd like to put your coffee in it, because I think that's what's best for you. And you'll say, I agree with that. And I'll say, I have concerns about how the Democratic Party is messaging, and I think some of that messaging... And so what I'm doing is sharing my frame for information, and then you volunteer whether or not that frame for information is an accurate place to make the decision. Yes, put coffee there. Yes, I think that's where Democratic Party messaging needs to go, etcetera, etcetera, etcetera.

So all I can do is command my own frame of understanding, and I can invite you to my frame of understanding, and you have every right to say, that guy's an asshole, I'm not gonna take that frame. You have every right, and I can't demand that you do. I can only say, these are the things that I'm noticing in the world, this is why I think things are working this way, and these are my suggestions, because of these understandings, and I think the world will be better because of these understandings. And so, how do you hear that? And then you get a choice, and I have to give you that choice. I cannot command it of you. I have to give you that choice. And so are certain people going to be more open to your frame than other people? Yes. And the people who aren't, fuck them. It's not for them, and that's okay. Move on. Right? And that's a powerful thing. Right? Because I think a lot of times people go, like, well, you know, my neighbor down the street would never... Like, with that person, I've really just gotta say, this is the world and you'd better take it. And it's like, how many people are you not speaking to in order to speak that way to the most unthoughtful person on your street? And how many people are you losing and not inviting into the thoughtful observations of the world that you have? And so I think we go through this, I think we go through it every day; we just don't discuss it.

And, not for nothing, you and I both have an academic background. Schools are the basis of this, and we are terrible at it. Like, this conversation does not happen in freshman English, and it should. It's the core of what freshman English should be. But it doesn't happen in freshman English, and they say, well, you have an opinion. Write your opinion. Find 15 sources, roughly put it in MLA, and we'll give you an A. And then we never fix this misunderstanding, that I can command someone to see reality versus I can invite someone to see my own. And that's a huge change in the world. But does that make sense, or how do you hear that?
Jesan Sorrells: Oh, yeah. I mean, it makes sense. I don't necessarily agree with all parts of it, but I understand how you got to the end of the road with the cul-de-sac. It absolutely makes sense how you got there. And I would say that you are talking about, well, first off, you're right. It's not a coffee cup. It is a collection of atoms that just holds another collection of atoms.

Brian Morgan: Oh, really? Now we're closer.

Jesan Sorrells: Well, you know, yeah. So in first year art history class, we look at, you know, the Magritte painting that your cup comes from. You know? We look at The Treachery of Images, and we seek to understand. And I know this because I was a bachelor of fine arts major as an undergraduate in college. And so we understand how, yes, we understand how and why even philosophers like Plato had a problem with the artists. Because that manipulation in a sophistic way can be used, yes, for understanding and for breaking frames and for joining people together, but it can also be used for creating frames and blocking people off and creating, what do you call it, fake, what I call fake conflict, pretend conflict, around things that don't really matter. This is all what Plato was yelling about with the sophists. This is why he was yelling at those guys all the time. And sophistry has been raised to a high art and then given a platform and an algorithm these days.

The other thing that I think is that if we're inviting people into our thinking, right, we are social animals. Right? So, you know, we wanna invite as many people into our tribe as we possibly can. We know from Dunbar's number that once we get to about a hundred, we're basically done. We can't keep track of that many. That's the max limit on relationships. Right? And you even see this on those algorithmic platforms. You know, I'm only interacting with five or six people, because that's all the things that I can, like, handle, and I can't. You know? And if somebody pops out with something or whatever, and, by the way, to just use my own example, post election in The United States, one of the things that I've done is I've taken to snoozing people, because, just, you've gotta be snoozed for thirty days. Like, you need to go to sleep. Like, my wife's just like, just get off Facebook, period. Like, no. Just, you go to sleep. And, you know, I snooze and then I delete. Right? Because I wanna give people an opportunity to still, to your point, think their thoughts and bring those thoughts into my frame, because two things can be true at the same time. I don't wanna be assaulted inside of my frame by your thinking. And I think you're getting to this as well; this is what I'm hearing in the core idea. I don't wanna be assaulted into compliance. I don't wanna be forced to comply with your thinking. I want to be invited, for sure. But if you invite me, and then I've taken the invitation, and I've said, no, I don't want it, I want to leave, I should be allowed to leave. A Facebook post is not a suicide pact. A marketing post on LinkedIn is not a suicide pact with a brand. Like, I don't have to ride or die with you. You invited me in. I looked around. I saw what was going on. It's not for me.

I think people, because they are seeking connection, you used the term connection several times, and relationship, because that's the larger thing that we're going to. I think people are seeking the connection and relationship that comes from purposeful communication, but they don't know how to ask for it. They don't know how to ask for that purposeful connection. And I don't know if that starts in the family. You talk about being in the fourth grade, you know, holding up a pen. I think it starts way earlier than that. I think it starts when you're one and two years old in your house. It's way earlier than that. I think the educational system, and both my wife and I are educators, the educational system comes along way after a lot of that's already hardwired in, and then just doubles down and reinforces, you know, all the way through twelfth grade or, you know, if you're so blessed, college. You know? And it is all about compliance. We will get you to comply. The question, I guess, is who does that work for? Which gets us into some very Marxist territory. You know, does it work for the capitalist? Does it work for the people in power? Who has the power? And I don't wanna go down that road. That's a different kind of road than what I wanna go down. I want to focus on the writing piece of this, because I think you've hit on something, and the decline in writing among the K through 12 cohort is something I think we have to talk about. And so, who does it benefit if kids can't write and if kids can't comprehend? What kind of adults do they become?
Mhmm. Yeah. This is this is interesting, and and
604
00:37:27,225 –> 00:37:30,750
we may or may not have have the same point of view on this. I
605
00:37:30,750 –> 00:37:33,250
I know that that there’s there’s
606
00:37:34,589 –> 00:37:38,110
a, maybe a theory out there
607
00:37:38,110 –> 00:37:41,730
that that the education system
608
00:37:41,870 –> 00:37:45,575
has conspired, to make
609
00:37:45,575 –> 00:37:48,935
kids ineffectual and and
610
00:37:48,935 –> 00:37:52,455
soldier on for the for the powers that
611
00:37:52,455 –> 00:37:56,075
be. And I could see that argument. Like like, certainly,
612
00:37:56,135 –> 00:37:59,500
there’s a there’s a there’s a frame of
613
00:37:59,500 –> 00:38:03,180
understanding there that that is credible enough to
614
00:38:03,180 –> 00:38:06,940
consider at any rate, that that that it’s not it’s not it’s
615
00:38:06,940 –> 00:38:10,539
not outright dismissible. But I I’ve done enough work
616
00:38:10,539 –> 00:38:13,735
in government and and other places. I love I think I heard this on a
617
00:38:13,735 –> 00:38:17,015
Sunday show at one point. This is, you know, the thing about conspiracy theories is
618
00:38:17,015 –> 00:38:20,855
is that you’re making the assumption that the government is, wise
619
00:38:20,855 –> 00:38:24,640
enough, smart enough, and, committed enough to
620
00:38:24,640 –> 00:38:28,480
actually pull through on any of these things. It’s not any of them. And
621
00:38:28,480 –> 00:38:31,940
I know that that is more roughly
622
00:38:32,480 –> 00:38:36,100
my experience of the government. And so
623
00:38:36,160 –> 00:38:39,515
I don’t think it’s the answer you might
624
00:38:39,515 –> 00:38:42,395
be leaning for, but I’ll answer your question. Go ahead and give No. Give me
625
00:38:42,395 –> 00:38:45,355
the answer that is the answer. I’ll work with
626
00:38:45,355 –> 00:38:48,795
my own leaning for that. I don’t know. But if someone were asking
627
00:38:48,795 –> 00:38:52,460
me that question, I would say the person that it benefits is the
628
00:38:52,460 –> 00:38:55,900
school teacher and the professor and the
629
00:38:55,900 –> 00:38:59,740
person who does not have to take on the obligation of
630
00:38:59,740 –> 00:39:03,535
what it takes to be successful in life. Oh. And
631
00:39:03,755 –> 00:39:07,515
and so by being a person this
632
00:39:07,515 –> 00:39:10,975
is absolutely true. Like, I’ll just share this with you. Last week,
633
00:39:11,595 –> 00:39:15,355
I shared with my students, freshmen. I do not teach at
634
00:39:15,355 –> 00:39:19,150
a predominantly white institution. I teach a lot of first generation
635
00:39:20,330 –> 00:39:23,770
college kids, lots of them. And
636
00:39:23,770 –> 00:39:27,050
so I went through, and
637
00:39:27,530 –> 00:39:30,605
it’s like, I don’t care
638
00:39:31,225 –> 00:39:35,065
if you get wealthy in this country. Like,
639
00:39:35,065 –> 00:39:38,905
that’s up to you. I very much care that you
640
00:39:38,905 –> 00:39:42,680
know how. Like, I want you to know how to do
641
00:39:42,680 –> 00:39:46,359
it. And it’s important to me that you understand the difference
642
00:39:46,359 –> 00:39:50,119
between working for someone and having your own business and understand that is a
643
00:39:50,119 –> 00:39:53,720
choice. And it’s like, let’s talk
644
00:39:53,720 –> 00:39:57,475
about investing money, and that is a choice. Like,
645
00:39:57,475 –> 00:40:01,235
there are all of these choices that are available to you, and
646
00:40:01,235 –> 00:40:05,075
it’s like, well, why are we having that discussion
647
00:40:05,075 –> 00:40:08,595
in an English classroom? Because it isn’t happening anywhere else. It isn’t happening anywhere else.
648
00:40:08,595 –> 00:40:12,230
Yeah. And so the
649
00:40:12,230 –> 00:40:15,990
people who have it, and I don’t think this is a conspiracy, I think this
650
00:40:15,990 –> 00:40:19,430
is just life, are the people whose parents had
651
00:40:19,430 –> 00:40:23,100
it. And their parents had it, and their parents, and so on. And so
652
00:40:23,100 –> 00:40:26,485
someone’s
653
00:40:26,485 –> 00:40:30,025
educational outcome is essentially predetermined
654
00:40:30,325 –> 00:40:34,105
by the family that they’re brought into and the quality of conversations
655
00:40:34,245 –> 00:40:37,925
at their dinner table, and school tries not to get in the way and to
656
00:40:37,925 –> 00:40:41,670
help the rest of the people more or less the best
657
00:40:41,670 –> 00:40:45,110
they can. And what that does is alleviate the
658
00:40:45,110 –> 00:40:48,790
responsibility of the teachers of actually understanding the
659
00:40:48,790 –> 00:40:52,470
world very deeply and being able to explain the world in a very
660
00:40:52,470 –> 00:40:56,279
deep way as a matter of character, as a matter of finance,
661
00:40:56,279 –> 00:40:59,906
as a matter of economy, as a matter of geopolitics, as a matter
662
00:40:59,906 –> 00:41:03,532
of everything. It alleviates that responsibility. And so I think
663
00:41:03,532 –> 00:41:07,158
the person who benefits from kids not knowing that is the person who
664
00:41:07,158 –> 00:41:10,589
doesn’t have to take on the responsibility of, I have to now go
665
00:41:10,589 –> 00:41:14,190
investigate the world really, really well. And
666
00:41:14,190 –> 00:41:17,869
and listen, let’s face it. If you make $65,000 a
667
00:41:17,869 –> 00:41:21,470
year, maybe we’re not paying or finding the right people to do
668
00:41:21,470 –> 00:41:25,170
that. Like, if someone were to say, Brian, fix the world
669
00:41:25,434 –> 00:41:29,115
in a generation or less, I’d say everybody who might
670
00:41:29,115 –> 00:41:32,635
go into law Mhmm. Pay them enough to go into
671
00:41:32,635 –> 00:41:36,335
teaching. Bring all of the smartest educated
672
00:41:36,395 –> 00:41:39,595
people in the world, pay them all $150,000 a year to go
673
00:41:39,595 –> 00:41:43,110
teach and get that thinking, that
674
00:41:43,330 –> 00:41:45,990
wide, broad, thoughtful, amazing geopolitical,
675
00:41:46,690 –> 00:41:50,370
economic, etcetera, thinking into the classroom and do it from
676
00:41:50,370 –> 00:41:54,210
k to the time they graduate, but we don’t have those
677
00:41:54,210 –> 00:41:57,925
teachers there. And so to me, the system is built not by
678
00:41:57,925 –> 00:42:01,605
conspiracy, but just by default to make
679
00:42:01,605 –> 00:42:05,365
it easy to pass kids through. And the net effect of
680
00:42:05,365 –> 00:42:08,325
that is they’re not good in the world, but the only person
681
00:42:08,325 –> 00:42:11,980
who really benefits there are the professors. I wouldn’t say it’s necessarily the
682
00:42:11,980 –> 00:42:15,339
rich people or whatever. I think it’s probably the professors that get more benefit than
683
00:42:15,339 –> 00:42:19,180
that. But how do you hear that? So it’s interesting that you bring this up
684
00:42:19,180 –> 00:42:22,785
because, again, I don’t fully agree with you,
685
00:42:22,944 –> 00:42:26,625
and these two things can be true at once. And I have seen in
686
00:42:26,625 –> 00:42:30,145
my experience when I was working as an adjunct, at a
687
00:42:30,145 –> 00:42:33,184
business school and making significantly less than
688
00:42:33,184 –> 00:42:36,865
$65,000 a year. Let’s be real. I understand. Okay. I would have made more
689
00:42:36,865 –> 00:42:40,600
babysitting. And
690
00:42:40,600 –> 00:42:43,660
and some days, that’s what I feel like they expected me to do.
691
00:42:45,800 –> 00:42:49,560
Because of the nature of how I’m wired, and, yes, this does go to
692
00:42:49,560 –> 00:42:53,305
upbringing and all of that, I categorically refuse to
693
00:42:53,305 –> 00:42:57,145
play that game. Right? And intentionality, this is a word I use with
694
00:42:57,145 –> 00:43:00,505
leadership, and this is a word I use in organizational behavior. We have to lead
695
00:43:00,505 –> 00:43:02,985
with our brains on. We have to if we’re talking about teaching, we have to
696
00:43:02,985 –> 00:43:06,569
teach with our brains on. We have to write with our brains on. Right?
697
00:43:07,109 –> 00:43:10,890
Intentionality for me is huge. Right? Are you doing things on purpose,
698
00:43:11,270 –> 00:43:14,730
or are you just reactively responding by accident?
699
00:43:14,950 –> 00:43:18,630
Okay. When I
700
00:43:18,630 –> 00:43:22,315
was, you know, that adjunct, I would always do a
701
00:43:22,315 –> 00:43:25,535
lecture in my business class, and it would come usually
702
00:43:26,315 –> 00:43:30,154
spring semester. Actually, probably right about now. And it was a
703
00:43:30,154 –> 00:43:33,615
lecture about globalism because very few
704
00:43:34,395 –> 00:43:37,970
students in business schools who are going to go work
705
00:43:38,130 –> 00:43:41,890
60% of them are gonna go work for some multinational corporation
706
00:43:41,890 –> 00:43:44,450
that is not gonna fully care about them and is gonna burn them out in
707
00:43:44,450 –> 00:43:48,130
four years. Mhmm. And then they’re gonna be clamoring around
708
00:43:48,130 –> 00:43:49,995
trying to find a smaller place or whatever
709
00:43:52,635 –> 00:43:56,395
They don’t understand why it’s cheaper for a
710
00:43:56,395 –> 00:43:59,595
hedge fund, going back to a hedge fund for just a minute, to send them
711
00:43:59,595 –> 00:44:03,099
to Malaysia to live out of a laptop and
712
00:44:03,099 –> 00:44:06,480
look at an Excel spreadsheet and fire a bunch of people that they never met
713
00:44:06,700 –> 00:44:09,839
than it is for a hedge fund to keep them at home in a neighborhood
714
00:44:10,220 –> 00:44:13,740
actually engaging with people that they may be firing at a local plant. It’s
715
00:44:13,740 –> 00:44:16,480
cheaper to send them to Malaysia because of globalism.
716
00:44:18,125 –> 00:44:21,964
But business school students do not understand this. They don’t. To
717
00:44:21,964 –> 00:44:25,505
your point about it not being explained, at no point in high school,
718
00:44:25,805 –> 00:44:29,505
and I think I taught probably a thousand students
719
00:44:29,645 –> 00:44:33,290
in the course of five years, right, that I was an adjunct. I can’t
720
00:44:33,290 –> 00:44:36,970
remember one student coming up to me and saying, oh, yeah. This was all explored
721
00:44:36,970 –> 00:44:40,750
in, like, high school. The vast majority of folks
722
00:44:41,290 –> 00:44:44,810
came up to me and said, I never actually heard that explained. And by the
723
00:44:44,810 –> 00:44:48,515
way, I started globalism off with Bretton Woods and what happened after World War
724
00:44:48,515 –> 00:44:52,035
two, and then just a cascade, you know, down to Nixon and everything else.
725
00:44:52,035 –> 00:44:55,795
Right? Okay. And I draw the line for them. And I say, if you
726
00:44:55,795 –> 00:44:59,395
want to make this decision, this is the system you’re engaging in. I don’t care
727
00:44:59,395 –> 00:45:03,170
if you engage in the system. I am agnostic on your life decisions.
728
00:45:03,550 –> 00:45:07,390
That’s right. And I don’t care. But I don’t
729
00:45:07,390 –> 00:45:10,990
want you to be able to say that no one ever told you Yes. That
730
00:45:10,990 –> 00:45:14,430
this was the thing that was going to happen. And so Yes. I have seen
731
00:45:14,430 –> 00:45:18,125
what you’re talking about when as an instructor, as a
732
00:45:18,125 –> 00:45:21,725
teacher, I chose to, regardless of what I was getting
733
00:45:21,725 –> 00:45:25,485
paid, go to the system with a different
734
00:45:25,485 –> 00:45:27,985
idea. That was an active choice.
735
00:45:29,280 –> 00:45:32,880
And because I’m psychologically wired to be high in
736
00:45:32,880 –> 00:45:36,640
personal agency and I have a high
737
00:45:36,640 –> 00:45:39,780
internal locus of control rather than an external one,
738
00:45:41,355 –> 00:45:45,195
I’m not really too concerned about whether or not the system likes me. That doesn’t
739
00:45:45,195 –> 00:45:48,875
really Right. Like, concern me. Right? Right. What concerns me
740
00:45:48,875 –> 00:45:52,395
is, are the people who are going into any system, are they
741
00:45:52,395 –> 00:45:56,190
adequately prepared to operate and know what the rules are because no one
742
00:45:56,190 –> 00:45:58,990
is explaining it to them, a failure of leadership, which is the point of
743
00:45:58,990 –> 00:46:02,670
actually this podcast as well. K. And the failures of leadership are all over the
744
00:46:02,670 –> 00:46:06,030
place. You know? And so I think those
745
00:46:06,030 –> 00:46:09,790
thoughts at the same time, I also think of this as where I maybe disagree
746
00:46:09,790 –> 00:46:13,265
with you a little bit. I don’t think it’s a conspiracy so much as it
747
00:46:13,265 –> 00:46:16,785
is the inertia of things moving
748
00:46:16,785 –> 00:46:20,164
beneficially forward. Right? And by the way,
749
00:46:20,224 –> 00:46:23,605
benefiting, to your point, maybe teachers or
750
00:46:23,664 –> 00:46:26,880
principals or k through 12 administrators. Sure. Okay.
751
00:46:28,480 –> 00:46:32,000
I always ask the question, at what point does the benefit run out? Which I
752
00:46:32,000 –> 00:46:34,319
think the benefit is starting to run out now. My father always used to tell
753
00:46:34,319 –> 00:46:37,519
me you’re gonna pay the piper one way or another, and the bill always
754
00:46:37,519 –> 00:46:41,115
comes due. I agree. You know? I agree. And so I think we’re paying the
755
00:46:41,115 –> 00:46:44,315
piper now. Yeah. And I think we’re going to be paying the piper in the
756
00:46:44,315 –> 00:46:47,995
future, particularly as we outsource more and more of
757
00:46:47,995 –> 00:46:51,295
our cognition to these large language models and these
758
00:46:52,059 –> 00:46:55,420
more algorithmic publications, to
759
00:46:55,420 –> 00:46:59,099
paraphrase from Cory Doctorow, an AI
760
00:46:59,099 –> 00:47:02,940
slop that’s just gonna be laying around the Internet. Yep. And it’ll
761
00:47:02,940 –> 00:47:05,895
be our own fault. We will have done it to ourselves, but, of course, we
762
00:47:05,895 –> 00:47:09,275
will search for a leader who we can blame or who will save us,
763
00:47:09,495 –> 00:47:13,095
and we will never have realized that
764
00:47:13,095 –> 00:47:16,775
that saving piece was in our own hands the whole
765
00:47:16,775 –> 00:47:20,609
time. So I
766
00:47:20,609 –> 00:47:22,530
have a bunch of different thoughts in my head. I’m gonna have
767
00:47:22,530 –> 00:47:24,530
to go through this a little bit. I’m gonna have to cascade this and think
768
00:47:24,530 –> 00:47:26,150
about this a little bit because
769
00:47:28,849 –> 00:47:32,530
we don’t necessarily agree on that, by the way. I’ll
770
00:47:32,530 –> 00:47:35,535
take inertia
771
00:47:37,835 –> 00:47:40,895
as the process of
772
00:47:41,835 –> 00:47:44,735
false, or umbrellaed,
773
00:47:45,620 –> 00:47:49,300
conspiracy. I would take that word. I think that’s
774
00:47:49,300 –> 00:47:52,660
accurate-ish. Well, and I’m not willing to go full Marxist. I think
775
00:47:52,660 –> 00:47:54,040
Marxists don’t.
776
00:47:57,620 –> 00:48:01,205
The Marxist left and the anarchist right both share something in common. They’re both
777
00:48:01,205 –> 00:48:05,045
looking for a boogeyman under the bed. Yes. When in reality More than
778
00:48:05,045 –> 00:48:06,025
that. But yes.
779
00:48:08,725 –> 00:48:12,485
When in reality, the boogeyman is themselves the whole time. And they’re both, by
780
00:48:12,485 –> 00:48:15,380
the way, underneath the same bed. They’re both hiding out in the same bed looking
781
00:48:15,380 –> 00:48:18,360
for each other. Isn’t that funny? So Oh, it’s hilarious.
782
00:48:19,460 –> 00:48:23,140
Yeah. Yeah. Okay. Let’s turn the corner a little bit because
783
00:48:23,140 –> 00:48:26,980
we’ve talked about frames. We’ve talked about comprehension and a little bit about
784
00:48:26,980 –> 00:48:30,655
commission and the purpose of sharing thoughts, the education system, free
785
00:48:30,655 –> 00:48:34,255
stuff versus paid stuff, getting our language, making sure
786
00:48:34,255 –> 00:48:37,855
the quality of our language is high when we are
787
00:48:37,855 –> 00:48:41,214
expressing it, and that we are careful
788
00:48:41,214 –> 00:48:42,595
thinkers and speakers.
789
00:48:45,590 –> 00:48:49,110
For leaders, for people who have been
790
00:48:49,110 –> 00:48:52,550
positionally placed in charge and by the way, I’m not thinking
791
00:48:52,550 –> 00:48:56,390
about a leader of a major corporation. So this
792
00:48:56,390 –> 00:48:59,990
is not where I’m framing this question. I’m thinking of a leader in
793
00:48:59,990 –> 00:49:03,545
a small company, maybe 500
794
00:49:03,545 –> 00:49:07,165
people. Maybe his dad,
795
00:49:07,785 –> 00:49:11,465
or his granddad founded that company. Mhmm. And
796
00:49:11,465 –> 00:49:14,380
he grew up in it, and he just always assumed that he was gonna
797
00:49:14,380 –> 00:49:18,060
be the leader, and he got into the leadership position. And now we live
798
00:49:18,060 –> 00:49:20,480
in times like these where,
799
00:49:22,460 –> 00:49:26,234
he may not prioritize writing clearly. He
800
00:49:26,234 –> 00:49:29,454
may not prioritize writing at all. He may outsource it to somebody else.
801
00:49:29,835 –> 00:49:33,674
Mhmm. What do you say? What advice do you have? What thought,
802
00:49:33,674 –> 00:49:37,375
nay, advice. What information, that’s a better word, do you have
803
00:49:37,434 –> 00:49:41,250
for that individual, around agency, even
804
00:49:41,250 –> 00:49:44,930
around his own thoughts and putting them out there into the
805
00:49:44,930 –> 00:49:48,770
world? Yeah. So
806
00:49:48,770 –> 00:49:51,705
this is interesting.
807
00:49:52,825 –> 00:49:56,585
I’ll answer it in two ways just because one’s gonna make
808
00:49:56,585 –> 00:50:00,345
me laugh. That person, I never try
809
00:50:00,345 –> 00:50:04,125
to convince them of anything. Right? If somebody says, I’m gonna
810
00:50:04,740 –> 00:50:08,500
let AI do all my writing, and, I don’t need writing,
811
00:50:08,500 –> 00:50:11,540
and I’ve been writing since the fourth grade. I know what I do. I say,
812
00:50:11,540 –> 00:50:15,300
I wish you luck. Right. Right. Like, I actually don’t try to
813
00:50:15,300 –> 00:50:19,065
convince that person of anything. But I think what you’re hinting
814
00:50:19,224 –> 00:50:22,665
at is: what is it that writing is
815
00:50:22,665 –> 00:50:26,265
inferring about leadership Mhmm. Or quality of
816
00:50:26,265 –> 00:50:29,704
thinking or whatever that we don’t say out loud. And so taking
817
00:50:29,704 –> 00:50:33,500
that frame, here’s
818
00:50:33,500 –> 00:50:36,620
my understanding of it. The first thing is this:
819
00:50:37,180 –> 00:50:40,620
David Eagleman writes about this in his neuroscience book, which I absolutely love.
820
00:50:40,620 –> 00:50:43,660
I don’t know if you’ve discussed his books here on your podcast or not,
821
00:50:43,660 –> 00:50:47,494
but he’s got a couple which are great, but Incognito is
822
00:50:47,875 –> 00:50:51,234
kind of the one that gets the most press, and it’s worth
823
00:50:51,234 –> 00:50:55,075
it. He talks about how the human being is not
824
00:50:55,075 –> 00:50:58,915
a successful animal because we are more
825
00:50:58,915 –> 00:51:02,730
cognitive than other animals, because we
826
00:51:02,730 –> 00:51:06,490
think better. That’s a mistake. That the human
827
00:51:06,490 –> 00:51:10,250
being is not a successful animal because
828
00:51:10,250 –> 00:51:14,090
we have fewer instincts and more cognition, which is the
829
00:51:14,090 –> 00:51:17,435
story we tell ourselves. The human being is a successful
830
00:51:17,495 –> 00:51:20,855
animal because we have more instincts and
831
00:51:20,855 –> 00:51:24,155
better instincts honed by our cognition,
832
00:51:25,015 –> 00:51:28,375
which is a fundamentally different
833
00:51:28,375 –> 00:51:31,980
thing. And so now we’ll start with that when it comes to
834
00:51:31,980 –> 00:51:34,720
writing. So for instance, somebody says,
835
00:51:36,140 –> 00:51:39,900
we do this on Thursdays. If any one of your people wanna join
836
00:51:39,900 –> 00:51:43,705
us at some point, they’re welcome to, where we look at an essay
837
00:51:43,705 –> 00:51:47,385
from The Wall Street Journal or The New York Times or something. And we simply
838
00:51:47,385 –> 00:51:51,145
say, do we trust this author as being credible? Would we make
839
00:51:51,145 –> 00:51:54,985
decisions based on the information that’s in it? And the answer is almost always
840
00:51:54,985 –> 00:51:58,700
no in part because I try to find the worst essay that day. But Of
841
00:51:58,700 –> 00:52:02,060
course. But put my thumb on the scale a little bit
842
00:52:02,060 –> 00:52:05,600
there. Exactly. But
843
00:52:06,300 –> 00:52:09,580
when we listen to those conversations on
844
00:52:09,580 –> 00:52:13,365
Thursday, generally speaking, somebody says
845
00:52:13,365 –> 00:52:16,585
something in the first ten words.
846
00:52:17,045 –> 00:52:20,165
And I’m gonna paraphrase it. I’m just gonna make one up. But
847
00:52:20,165 –> 00:52:23,785
to give people an example, it might say something like
848
00:52:23,925 –> 00:52:27,680
the Trump administration is obviously incorrect
849
00:52:27,900 –> 00:52:31,200
on policy x. Right? And so
850
00:52:31,580 –> 00:52:35,340
there are even people in that room who
851
00:52:35,340 –> 00:52:38,925
would politically agree with that statement, but they
852
00:52:38,925 –> 00:52:42,225
would still say that undercuts the writer’s credibility.
853
00:52:42,765 –> 00:52:46,445
And that’s a feeling. It’s a feeling. We
854
00:52:46,445 –> 00:52:50,205
go, oh, why like, why am I having that feeling toward that
855
00:52:50,205 –> 00:52:53,789
frame of information? Why am I having that feeling? And
856
00:52:53,789 –> 00:52:57,569
so what I would say to the person who’s interested
857
00:52:57,630 –> 00:53:01,390
in what writing, what language, is gonna do is say, I’ve had an
858
00:53:01,390 –> 00:53:05,234
instinct, and the instinct is this is wrong. Mhmm. And now
859
00:53:05,234 –> 00:53:09,075
I wanna go through the process of saying, why am
860
00:53:09,075 –> 00:53:12,535
I having that instinct? Does that reflect
861
00:53:13,474 –> 00:53:17,160
the presentation of somebody’s information? Mhmm.
862
00:53:17,160 –> 00:53:20,780
Does that reflect my own biases? Does that reflect
863
00:53:21,160 –> 00:53:24,920
something that triggered me from when I was a kid? And so
864
00:53:24,920 –> 00:53:28,520
that’s coming up. Like, I have to now ask the question and be
865
00:53:28,520 –> 00:53:32,305
willing to answer the question, why am I having this instinct, this
866
00:53:32,305 –> 00:53:36,145
reaction, this feeling? And the minute I have that and so
867
00:53:36,145 –> 00:53:39,825
take your average piece of, let’s just
868
00:53:39,825 –> 00:53:43,650
say, email around, I don’t know, we
869
00:53:43,650 –> 00:53:46,849
gotta come back to the office. We’re no longer gonna work
870
00:53:46,849 –> 00:53:50,530
from home. Mhmm. And so a leader
871
00:53:50,530 –> 00:53:54,369
now is forced to present that information. Well,
872
00:53:54,369 –> 00:53:58,185
the easiest thing is just to write an email that says Mhmm. Come back to the
873
00:53:58,185 –> 00:54:01,305
office. We’re no longer gonna work from home, and then people are gonna get really
874
00:54:01,305 –> 00:54:05,065
angry. If you don’t like it, quit. Right? That’s Yep. So
875
00:54:05,065 –> 00:54:08,825
that’s one way of handling it. Right? And people
876
00:54:08,825 –> 00:54:12,540
go, man, the way this is presented has really concerned me.
877
00:54:12,540 –> 00:54:16,300
That says something. Right? And so then we say, well, why are people
878
00:54:16,300 –> 00:54:20,060
having that instinct? Can we anticipate that instinct? And
879
00:54:20,060 –> 00:54:23,520
can we say, I suspect that
880
00:54:24,055 –> 00:54:27,095
there are gonna be people who have difficulty with this, so I wanna be
881
00:54:27,095 –> 00:54:30,775
very transparent about why we’re doing what we’re doing. I wanna show you exactly our
882
00:54:30,775 –> 00:54:34,235
observations of the world and how we’re making sense of those observations
883
00:54:34,615 –> 00:54:38,369
so that you can understand the decisions we’ve come to. And those
884
00:54:38,369 –> 00:54:42,069
observations are, and those decisions are, and the reasons are.
885
00:54:42,289 –> 00:54:45,890
And people go, oh, okay. Now I see how you make
886
00:54:45,890 –> 00:54:49,190
that decision, not just the decision that you’ve made.
887
00:54:49,825 –> 00:54:53,505
Take the essay in the Wall Street Journal. Trump administration is
888
00:54:53,505 –> 00:54:57,185
obviously wrong. It’s not necessarily incorrect. It’s just
889
00:54:57,185 –> 00:55:00,945
inferred. Right? And as opposed to explained or explored, it’s
890
00:55:00,945 –> 00:55:04,085
an abstraction that infers concrete information
891
00:55:04,650 –> 00:55:08,170
versus details concrete information. And so if we
892
00:55:08,170 –> 00:55:11,530
own the difference between what concrete
893
00:55:11,530 –> 00:55:15,230
information are people going to agree on, this is a coffee cup,
894
00:55:15,530 –> 00:55:19,335
versus what concrete information do we need to state the
895
00:55:19,335 –> 00:55:22,635
inferences so they can agree on it or at least understand
896
00:55:22,775 –> 00:55:26,535
it. That’s the benefit of writing well. Does that make sense? This is
897
00:55:26,535 –> 00:55:30,135
the continuing battle of the enlightenment. Right? I mean, this is the battle going all
898
00:55:30,135 –> 00:55:33,910
the way back to the seventeenth century in the West. This is why our
899
00:55:33,910 –> 00:55:37,270
greatest fights, I’ve come to this conclusion in the last couple of
900
00:55:37,270 –> 00:55:40,950
years, are over who owns the
901
00:55:40,950 –> 00:55:44,710
dictionary and what words get to be in it. That’s where
902
00:55:44,710 –> 00:55:48,434
our greatest fights are. And it’s not really
903
00:55:48,434 –> 00:55:50,295
about politics. It’s about,
904
00:55:52,355 –> 00:55:56,115
the struggle in Western culture, and it is most notable in
905
00:55:56,115 –> 00:55:59,654
Western culture. The struggle in Western culture to ascend
906
00:55:59,875 –> 00:56:03,690
to the heights of reason without feelings. This is what all
907
00:56:03,690 –> 00:56:05,870
the technologists promise us. Right?
908
00:56:07,690 –> 00:56:10,830
And, you know, look, I’m also an amateur historian
909
00:56:11,450 –> 00:56:14,650
because I think history matters a whole
910
00:56:14,650 –> 00:56:18,125
lot in these kinds of conversations. I think, actually, history probably
911
00:56:18,185 –> 00:56:21,565
matters more than which generational cohort you happen to be in,
912
00:56:22,345 –> 00:56:25,965
because the historical events that are surrounding you
913
00:56:26,665 –> 00:56:30,090
mold your thinking even if you are not aware of them,
914
00:56:31,190 –> 00:56:34,470
because they molded your parents’ thinking. And then your parents behaved a certain way, and
915
00:56:34,470 –> 00:56:37,290
there we go. That’s the falling domino. Right?
916
00:56:37,830 –> 00:56:41,655
So I think the height
917
00:56:41,655 –> 00:56:44,315
of enlightenment reason was the atomic bomb,
918
00:56:45,015 –> 00:56:48,775
Hiroshima and Nagasaki. That was the height of enlightenment reasoning. And I think
919
00:56:48,775 –> 00:56:52,535
we’ve been pulling back in the West in horror from that over the
920
00:56:52,535 –> 00:56:56,240
last eighty years. And what you’re talking about is a triumph of
921
00:56:56,320 –> 00:56:59,200
at least what I’m hearing. And maybe I’m incorrect. Correct me if I’m wrong.
922
00:56:59,200 –> 00:57:01,920
And I have not read Eagleman’s book. I wrote it down. I’ll go ahead and
923
00:57:01,920 –> 00:57:05,140
take a look at that. I read some neuroscience stuff.
924
00:57:05,760 –> 00:57:09,585
Okay. Sounds good. Yeah. Probably. Because there’s some other
925
00:57:09,585 –> 00:57:13,105
things that happened to me in my life, relatives and my family and whatnot.
926
00:57:13,105 –> 00:57:16,705
I had to figure out what’s going on with them. But what I’m
927
00:57:16,705 –> 00:57:20,325
seeing over the last eighty years is that the triumph
928
00:57:20,465 –> 00:57:23,280
of or the, I mean, not the triumph of. I think of it like Star
929
00:57:23,280 –> 00:57:26,640
Wars. Right? The Empire Strikes Back. It’s feelings striking
930
00:57:26,640 –> 00:57:30,160
back. Right? And if I’m
931
00:57:30,160 –> 00:57:33,380
incorrect in thinking about this or analyzing this this way, let me know.
932
00:57:33,760 –> 00:57:37,335
I think the continuing
933
00:57:37,395 –> 00:57:41,015
struggle will be the tension between feelings and reason, but this is the enlightenment
934
00:57:41,155 –> 00:57:44,995
struggle. And the lie is that we
935
00:57:44,995 –> 00:57:47,475
can write our way out of it or we can reason our way out of
936
00:57:47,475 –> 00:57:49,415
it because writing feels,
937
00:57:51,150 –> 00:57:53,890
well, for lack of a better term, reasonable.
938
00:57:57,069 –> 00:58:00,910
But I even just said it there. It feels
939
00:58:00,910 –> 00:58:04,735
reasonable. There’s no rationality or logic to that. Right? And
940
00:58:04,735 –> 00:58:08,575
so this tension, I don’t think, is going
941
00:58:08,575 –> 00:58:11,155
to be going anywhere anytime soon.
942
00:58:12,415 –> 00:58:16,175
And our technology, of course, serves to wind up that tension to a higher and
943
00:58:16,175 –> 00:58:19,760
higher level because it benefits people, right, and benefits
944
00:58:19,760 –> 00:58:23,280
advertisers and whatever. Yeah. I
945
00:58:23,600 –> 00:58:27,120
oh, I’m sorry. Go ahead. No. Go ahead. You’re
946
00:58:27,840 –> 00:58:31,005
I know at one point you wanted to bring up LLMs and
947
00:58:31,005 –> 00:58:34,605
this Yeah. I’m wandering in that direction. Right? Yeah. This kind of
948
00:58:34,605 –> 00:58:38,204
leads there. But,
949
00:58:38,605 –> 00:58:40,785
I do not see
950
00:58:42,365 –> 00:58:45,530
feelings and reason as,
951
00:58:47,590 –> 00:58:51,190
in a war. Mhmm. Okay. I
952
00:58:51,590 –> 00:58:54,890
so if this makes sense to you,
953
00:58:55,750 –> 00:58:59,425
there is no entity
954
00:58:59,485 –> 00:59:00,145
out there
955
00:59:03,805 –> 00:59:07,485
rationalizing for you and me. You
956
00:59:07,485 –> 00:59:11,085
rationalize for you and I rationalize for me. And
957
00:59:11,085 –> 00:59:14,820
so
958
00:59:14,820 –> 00:59:18,280
I am going to be an
959
00:59:20,260 –> 00:59:23,860
experiential mix of the things that I
960
00:59:23,860 –> 00:59:27,460
have observed, how they have affected me,
961
00:59:29,255 –> 00:59:32,715
and given me instincts for reaction, let’s call that emotion,
962
00:59:33,495 –> 00:59:37,015
and my ability to reflect on
963
00:59:37,015 –> 00:59:40,155
that and try to make the most sense of it as possible
964
00:59:40,850 –> 00:59:44,530
before I create any behavior. So
965
00:59:44,530 –> 00:59:47,990
that’s of one piece to me. That’s all
966
00:59:48,210 –> 00:59:51,510
one thing that I have experiences,
967
00:59:52,450 –> 00:59:55,890
and then they are mine to emote about, and they are mine to
968
00:59:55,890 –> 00:59:59,384
reflect on. And so I get to question why am I having that
969
00:59:59,384 –> 01:00:03,164
emotional response or why am I not having that emotional response.
970
01:00:03,305 –> 01:00:06,664
But the mechanism of
971
01:00:06,664 –> 01:00:10,260
rationality is me. And
972
01:00:10,260 –> 01:00:13,960
so then we sort of go,
973
01:00:14,100 –> 01:00:17,540
well, does the world function where we have,
974
01:00:17,540 –> 01:00:21,000
whatever, 8,000,000,000 individual
975
01:00:21,380 –> 01:00:24,075
mechanisms of comprehension
976
01:00:24,935 –> 01:00:27,515
and not a rational reality,
977
01:00:28,615 –> 01:00:32,055
yes. Right? Like, that’s
978
01:00:32,215 –> 01:00:35,780
or at least that is our experience
979
01:00:36,320 –> 01:00:39,780
of the world. I would concede that there is a reality,
980
01:00:40,320 –> 01:00:44,080
but I wouldn’t concede that I know it. But
981
01:00:44,080 –> 01:00:47,920
I can have my own reflections on it
982
01:00:47,920 –> 01:00:51,045
and then make my own assumptions
983
01:00:51,425 –> 01:00:55,185
of it, and be curious about it and
984
01:00:55,185 –> 01:00:58,945
try to understand that reality as much as possible and then align my
985
01:00:58,945 –> 01:01:02,485
life to the way where I think that I can
986
01:01:02,790 –> 01:01:06,550
have the most productive and meaningful life inside of that reality? But
987
01:01:06,550 –> 01:01:10,390
the sense maker there is me. It’s not my church. It’s
988
01:01:10,390 –> 01:01:14,230
not my university. It’s not the sense
989
01:01:14,230 –> 01:01:17,690
maker there. It’s me. And so to me, they’re not really
990
01:01:18,275 –> 01:01:22,035
divorced from each other. They’re of the same piece because they’re internal to
991
01:01:22,035 –> 01:01:25,335
us, and they inform each other. But we may see that differently.
992
01:01:25,955 –> 01:01:29,795
Yeah. We do. I’m gonna sidestep that because we would unwind into,
993
01:01:29,795 –> 01:01:33,030
like, a four hour conversation. I don’t think we have that kind of time. And
994
01:01:33,030 –> 01:01:36,410
I would love to challenge the
995
01:01:38,870 –> 01:01:41,850
lack of a god inside of the machine, let’s frame it that way,
996
01:01:42,790 –> 01:01:46,425
idea that’s inherent in that, but not right now. Maybe we’ll have you on.
997
01:01:46,724 –> 01:01:49,325
Yeah. I mean, that would be another section. That’s a different kind
998
01:01:49,325 –> 01:01:52,964
of conversation. Yeah. But I will say
999
01:01:52,964 –> 01:01:56,665
this. I
1000
01:01:56,724 –> 01:02:00,190
I am of the thought that we can actually know
1001
01:02:00,190 –> 01:02:03,790
reality, but, here’s the but, it’s hard and it requires
1002
01:02:03,790 –> 01:02:06,130
effort from us Yeah. Which
1003
01:02:07,230 –> 01:02:10,905
we filter that effort through our
1004
01:02:10,905 –> 01:02:14,205
experiences and through our feelings and through our reason.
1005
01:02:15,785 –> 01:02:19,325
But the effort is the thing that matters.
1006
01:02:19,785 –> 01:02:23,545
This is why, the term Israel means we
1007
01:02:23,545 –> 01:02:26,870
who wrestle with or struggle with god. Right?
1008
01:02:27,970 –> 01:02:29,590
And I think that that is an appropriate
1009
01:02:32,130 –> 01:02:35,490
motto for our time. I’ll
1010
01:02:35,490 –> 01:02:39,030
just sort of partially lay my cards out.
1011
01:02:39,170 –> 01:02:42,954
I tend to not think that Nietzsche was that brilliant. He got a
1012
01:02:42,954 –> 01:02:46,714
couple of things correct, but he was
1013
01:02:46,714 –> 01:02:50,255
just calling the end of the Kantian enlightenment project,
1014
01:02:51,275 –> 01:02:54,875
and saying that it had reached its logical conclusion. But a lot of
1015
01:02:54,875 –> 01:02:58,420
people, a lot of philosophers in particular and also
1016
01:02:58,420 –> 01:03:01,860
writers, have leveraged his thoughts, I
1017
01:03:01,860 –> 01:03:05,540
think, incorrectly, throughout the twentieth century and
1018
01:03:05,540 –> 01:03:09,300
caused a lot of damage, actually. Mhmm. So I think,
1019
01:03:09,300 –> 01:03:12,955
talking about the neuroscience, there’s some neuroscience
1020
01:03:12,955 –> 01:03:15,215
and some research that shows that,
1021
01:03:16,635 –> 01:03:20,315
yeah, maybe we might have missed the mark a little bit on that.
1022
01:03:20,315 –> 01:03:24,130
So Mhmm. But, again, that’s way beyond where we
1023
01:03:24,130 –> 01:03:27,650
are. Listen. I don’t think that’s controversial. I don’t think you should be
1024
01:03:27,650 –> 01:03:31,250
worried. I’m like, hey. Say it. Like, I
1025
01:03:31,410 –> 01:03:34,290
you know? Like No. No. It’s not the controversy. Oh, I’m not worried about the
1026
01:03:34,290 –> 01:03:38,125
controversy piece. I wanna be cognizant of your time. It’s the only
1027
01:03:38,125 –> 01:03:41,645
thing that we yeah. Let us all live in a world where a hundred
1028
01:03:41,645 –> 01:03:45,484
years from now, people are looking back on these talks, and they go, you know,
1029
01:03:45,484 –> 01:03:48,365
they got some things right about the world, but not everything. I’m not gonna be
1030
01:03:48,365 –> 01:03:52,010
a person. I’ll sign up for that, man. There you go. That’s right.
1031
01:03:55,030 –> 01:03:58,630
Okay. So we’ve talked about well, okay. So this
1032
01:03:58,630 –> 01:04:02,470
leads into one of the things that sort of I’m obsessed with on this show.
1033
01:04:02,470 –> 01:04:06,090
Okay? I’m obsessed with with the transference of wisdom.
1034
01:04:06,525 –> 01:04:10,285
How do we get wisdom from one generation to another? The
1035
01:04:10,285 –> 01:04:14,125
best vehicles we’ve had for that have been stories. Stories, the
1036
01:04:14,125 –> 01:04:17,645
oral narrative. There’s an essayist named Walter
1037
01:04:17,645 –> 01:04:21,040
Benjamin, who wrote an essay called The Storyteller.
1038
01:04:21,900 –> 01:04:25,580
It was his critique of the writings of Nikolai
1039
01:04:25,580 –> 01:04:29,420
Leskov, back in the nineteen thirties. And we actually covered that on the podcast.
1040
01:04:29,660 –> 01:04:32,880
You should go listen to that episode. And his
1041
01:04:33,260 –> 01:04:36,925
critique, right, of the technology of the novel
1042
01:04:37,385 –> 01:04:40,365
was that it killed the ability to transfer wisdom.
1043
01:04:40,905 –> 01:04:44,665
Instead, it took wisdom that was in an oral narrative and turned it
1044
01:04:44,665 –> 01:04:48,470
into mere information. Mhmm. Okay.
1045
01:04:48,470 –> 01:04:51,830
And he was approaching it from a Marxian dialectic as well. So there’s
1046
01:04:51,830 –> 01:04:55,609
some other things underneath there. But also you’re going? Yeah.
1047
01:04:55,990 –> 01:04:59,670
Yeah. But also a Judaic mystic frame and a German,
1048
01:04:59,670 –> 01:05:03,425
Prussian sort of framing as he was writing in the nineteen thirties, you
1049
01:05:03,425 –> 01:05:07,265
know, in Germany, and trying to figure out what was
1050
01:05:07,265 –> 01:05:11,025
going on in the interregnum, you know, in that country. Right?
1051
01:05:11,025 –> 01:05:14,805
Why wasn’t the wisdom of avoiding
1052
01:05:14,865 –> 01:05:18,599
authoritarianism filtering down into people? Why were they going in the
1053
01:05:18,599 –> 01:05:20,380
particular direction that they were going?
1054
01:05:22,520 –> 01:05:25,420
And I think that’s a relevant question for our time as well.
1055
01:05:26,359 –> 01:05:29,880
So how does writing help us
1056
01:05:29,880 –> 01:05:33,575
transfer wisdom? Does it, or do we need the oral narrative? Is it better to
1057
01:05:33,575 –> 01:05:35,115
just do that through conversation?
1058
01:05:37,575 –> 01:05:41,195
I think writing is, generally
1059
01:05:41,415 –> 01:05:44,555
speaking, more effective than oral,
1060
01:05:46,859 –> 01:05:50,540
But the type of oral tradition you’re referring to is
1061
01:05:50,540 –> 01:05:53,599
a type of oral tradition
1062
01:05:54,460 –> 01:05:58,140
that functions as a piece of writing. Okay. And
1063
01:05:58,140 –> 01:06:01,605
so, to me. And
1064
01:06:01,605 –> 01:06:05,365
so if we think of writing, and this is
1065
01:06:05,365 –> 01:06:08,825
not all writing, to our Mhmm. social media conversation earlier.
1066
01:06:08,965 –> 01:06:12,805
Yeah. But if we think of writing as the
1067
01:06:12,805 –> 01:06:16,510
written expression of a thought that has
1068
01:06:16,510 –> 01:06:20,049
been, reflected upon enough
1069
01:06:20,589 –> 01:06:23,410
to be worthy of someone else’s time,
1070
01:06:24,990 –> 01:06:27,730
then writing is certainly a very useful
1071
01:06:28,765 –> 01:06:32,525
mechanism of sharing wisdom,
1072
01:06:32,525 –> 01:06:36,285
if that’s the definition of it. And I think, you know, that’s
1073
01:06:36,285 –> 01:06:39,725
really what traditional oral history
1074
01:06:39,725 –> 01:06:43,540
is. Right? It is
1075
01:06:43,940 –> 01:06:46,920
the process of language applied to a reflection
1076
01:06:47,619 –> 01:06:51,380
that informs the listener of
1077
01:06:51,380 –> 01:06:55,155
the world. And so, I think Benjamin
1078
01:06:55,155 –> 01:06:58,915
is onto something about do all novels do that?
1079
01:06:58,915 –> 01:07:02,535
No. Do even most novels do that? Probably not.
1080
01:07:02,675 –> 01:07:05,495
And then, if you wanna get controversial,
1081
01:07:07,500 –> 01:07:08,960
did novels in
1082
01:07:10,220 –> 01:07:13,900
1995 do
1083
01:07:13,900 –> 01:07:17,040
that better than novels
1084
01:07:17,260 –> 01:07:19,840
in 2015? Yes.
1085
01:07:21,165 –> 01:07:24,845
I would say the same difficulties we
1086
01:07:24,845 –> 01:07:28,605
have on social media, name your publishing house. They’ve
1087
01:07:28,605 –> 01:07:32,285
had those difficulties too, where, you
1088
01:07:32,285 –> 01:07:36,080
know, the loudest people, not necessarily the most
1089
01:07:36,080 –> 01:07:39,920
thoughtful, not necessarily the most reflective, the ones who make the
1090
01:07:39,920 –> 01:07:43,680
most noise out there with the biggest platforms are the ones who are getting the
1091
01:07:43,680 –> 01:07:46,740
book deals. And it’s like, well, what is that doing
1092
01:07:47,305 –> 01:07:51,120
to the wisdom
1093
01:07:51,120 –> 01:07:54,825
of the culture? You know, it’s not particularly adding to it. So
1094
01:07:55,065 –> 01:07:58,825
do I think that writing, if
1095
01:07:59,450 –> 01:08:02,650
we look at it in the same way we would in a
1096
01:08:02,650 –> 01:08:06,410
strong oral tradition, is the
1097
01:08:06,410 –> 01:08:10,170
verbalization or the written verbalization of
1098
01:08:10,170 –> 01:08:13,930
a reflection that is worthy of consideration of
1099
01:08:13,930 –> 01:08:17,645
someone else? And does writing function that way? And
1100
01:08:17,645 –> 01:08:21,024
can it function that way and create wisdom
1101
01:08:21,805 –> 01:08:25,425
for other people to grow and make decisions and add their own
1102
01:08:25,645 –> 01:08:29,325
understandings of it to that? I do think writing is highly effective and probably the
1103
01:08:29,325 –> 01:08:33,090
most effective tool we have for that still. Yeah. I mean, I
1104
01:08:33,090 –> 01:08:35,750
agree with Benjamin about the novel,
1105
01:08:37,010 –> 01:08:39,750
disintermediating, which is a word he did not know.
1106
01:08:40,850 –> 01:08:44,364
And the printing press, actually, is where he really goes back to, disintermediating
1107
01:08:44,904 –> 01:08:48,524
the oral narrative. And yet, there are books
1108
01:08:49,545 –> 01:08:53,005
that seem to resist the disintermediation
1109
01:08:53,385 –> 01:08:55,920
of the printing press, or they went along with it.
1110
01:08:57,240 –> 01:09:01,040
Stories that were then translated and became
1111
01:09:01,040 –> 01:09:04,640
parts of or transliterated, not translated, transliterated into
1112
01:09:04,640 –> 01:09:08,080
other forms in novels, movies,
1113
01:09:08,080 –> 01:09:11,779
film, of course, in the West.
1114
01:09:12,535 –> 01:09:15,734
And, of course, these books I’m in front of, like, in this shot.
1115
01:09:15,734 –> 01:09:19,175
Those books also seem to defy the algorithm. I
1116
01:09:19,175 –> 01:09:22,935
mean, if I am and the example that
1117
01:09:22,935 –> 01:09:26,635
I’ll use is Homer. Like, Christopher Nolan, who just directed Oppenheimer,
1118
01:09:26,854 –> 01:09:30,570
is directing The Odyssey. Mhmm.
1119
01:09:31,790 –> 01:09:35,470
Make of that whatever you will. Okay? And I’m I’m
1120
01:09:35,470 –> 01:09:39,229
gonna be here. Lots of cool things happen. I’m a huge
1121
01:09:39,229 –> 01:09:42,510
fan of Christopher Nolan as a director.
1122
01:09:42,990 –> 01:09:45,285
In Nolan, I trust, and I just leave it at
1123
01:09:45,285 –> 01:09:48,404
that. There you go. Okay. You know? He’s he’s made a few duds. Don’t get
1124
01:09:48,404 –> 01:09:51,604
me wrong. Interstellar was not great. Chris, we should have a conversation about that. That
1125
01:09:51,604 –> 01:09:55,145
movie was trash, and Tenet was self-referential
1126
01:09:55,205 –> 01:09:58,700
garbage. Stop it, sir. But Yes. The vast majority of the rest of it has
1127
01:09:58,700 –> 01:10:02,540
been excellent. I guess. A-plus
1128
01:10:02,540 –> 01:10:06,160
stuff. But this is a person who, again, understands how storytelling applies
1129
01:10:06,220 –> 01:10:10,045
to that medium, how ancient stories, again, Homer,
1130
01:10:10,045 –> 01:10:13,485
apply to that medium, how they, again, defy the algorithm. And I think our
1131
01:10:13,485 –> 01:10:17,105
most ancient stories that come out of an oral tradition, like the Bible,
1132
01:10:17,645 –> 01:10:21,060
like Greek mythology, are gonna just continue on
1133
01:10:21,460 –> 01:10:25,240
regardless of what the technology is that seeks to disintermediate them.
1134
01:10:26,740 –> 01:10:30,420
And that gets us to our last go around here. It gets us to the
1135
01:10:30,420 –> 01:10:30,920
LLMs.
1136
01:10:33,780 –> 01:10:34,840
Mhmm. So
1137
01:10:38,845 –> 01:10:42,285
as a person who writes, I’m not worried. Weirdly
1138
01:10:42,285 –> 01:10:45,725
enough, I’m not worried about large language models. I’m really
1139
01:10:45,725 –> 01:10:49,505
not. A, because
1140
01:10:51,570 –> 01:10:54,950
I personally, as an individual, can
1141
01:10:55,170 –> 01:10:58,930
outthink them no matter what they spew out. Right? I can find
1142
01:10:58,930 –> 01:11:02,370
the gaps and all of that. Number two, I don’t
1143
01:11:02,370 –> 01:11:06,210
anthropomorphize them. I don’t call them intelligence because they’re
1144
01:11:06,210 –> 01:11:08,925
not, and I refuse to play that,
1145
01:11:10,025 –> 01:11:13,865
word game with them. But then I also and this
1146
01:11:13,865 –> 01:11:17,545
is the third thing. Just like any technology, I am
1147
01:11:17,545 –> 01:11:20,045
expecting it to expose human failures,
1148
01:11:21,460 –> 01:11:24,599
but also to create human successes. Right?
1149
01:11:25,699 –> 01:11:29,300
And so I don’t buy into the hype of LLMs. I do see their
1150
01:11:29,300 –> 01:11:32,739
usefulness in certain situations or for certain
1151
01:11:32,739 –> 01:11:36,155
projects. But I think the challenge
1152
01:11:36,535 –> 01:11:40,295
that they provide is one of, and it’s kind of
1153
01:11:40,295 –> 01:11:43,335
one we’ve kind of been lazy at, at least in America over the last
1154
01:11:43,335 –> 01:11:45,835
twenty years, curation and aggregation.
1155
01:11:47,150 –> 01:11:50,750
And the people who figure out how to use these models and
1156
01:11:50,750 –> 01:11:54,350
then curate and aggregate the best of these models are
1157
01:11:54,350 –> 01:11:57,950
going to be fine. Other people are just
1158
01:11:57,950 –> 01:12:01,594
gonna continue to use Microsoft Copilot to write a crappy email that they don’t wanna
1159
01:12:01,594 –> 01:12:04,475
send so they could twirl around at their desk and eat a Snickers bar. And
1160
01:12:04,475 –> 01:12:08,015
that’s fine. That’s that’s fine. I mean, I guess.
1161
01:12:09,035 –> 01:12:12,795
Thoughts on LLMs? Thoughts on anything. I believe
1162
01:12:12,795 –> 01:12:15,910
that was a commercial, by the way. I was trying to say I believe that
1163
01:12:15,910 –> 01:12:17,990
was a commercial during the Super Bowl that I might have missed or might have
1164
01:12:17,990 –> 01:12:21,750
heard about later on. Thoughts on thoughts
1165
01:12:21,750 –> 01:12:25,590
on on the the the hype around LLMs versus
1166
01:12:25,590 –> 01:12:29,265
the reality of of human cognition? I I
1167
01:12:29,265 –> 01:12:33,045
think I couldn’t agree with you more. But, like,
1168
01:12:33,264 –> 01:12:36,885
let’s give LLMs
1169
01:12:36,945 –> 01:12:40,545
their due Their due. Yeah. At
1170
01:12:40,545 –> 01:12:42,485
first. And so,
1171
01:12:46,420 –> 01:12:50,179
are there tens of
1172
01:12:50,179 –> 01:12:53,239
thousands of photos of rare cancers
1173
01:12:53,780 –> 01:12:56,415
on the Internet? Probably. Yeah.
1174
01:12:58,235 –> 01:13:01,295
Could someone, in the foreseeable future or now,
1175
01:13:02,235 –> 01:13:05,215
take a photo of a spot on their arm,
1176
01:13:05,995 –> 01:13:09,215
which the doctor said it’s probably nothing,
1177
01:13:10,600 –> 01:13:14,040
And the artificial intelligence engine could say there’s
1178
01:13:14,040 –> 01:13:17,880
an eighty-two percent chance that it’s one of these
1179
01:13:17,880 –> 01:13:21,020
rare cancers. Possible.
1180
01:13:21,560 –> 01:13:25,265
And it might take the doctor six weeks to
1181
01:13:25,265 –> 01:13:28,705
test and whatever. And now that
1182
01:13:28,705 –> 01:13:32,385
process is accelerated by this
1183
01:13:32,385 –> 01:13:36,225
person going and saying, I’d like to be tested for these cancers, and
1184
01:13:36,225 –> 01:13:39,900
this is why. And I guess this part of the mole
1185
01:13:39,900 –> 01:13:43,740
looking this way is roughly approximating this rare
1186
01:13:43,740 –> 01:13:47,340
cancer, and I’d like to look for it. And now you’ve you’ve accelerated the
1187
01:13:47,340 –> 01:13:50,940
process six weeks. So Mhmm. So do I think that that is a
1188
01:13:50,940 –> 01:13:53,614
foreseeable and a useful and an amazing
1189
01:13:54,635 –> 01:13:58,235
achievement? I do, and we should use it, and God
1190
01:13:58,235 –> 01:14:01,835
bless humanity. And then the
1191
01:14:01,835 –> 01:14:05,215
question becomes, is it actually
1192
01:14:05,355 –> 01:14:09,080
intelligent? Because that isn’t actually intelligence. That
1193
01:14:09,080 –> 01:14:12,179
is that is something closer to
1194
01:14:12,480 –> 01:14:15,780
a massively high functioning
1195
01:14:15,920 –> 01:14:19,679
database. Mhmm. And and but it doesn’t
1196
01:14:19,679 –> 01:14:23,514
actually require new information. And
1197
01:14:23,514 –> 01:14:26,414
the minute we go, well,
1198
01:14:27,434 –> 01:14:30,875
what is
1199
01:14:30,875 –> 01:14:34,474
the next level of understanding that
1200
01:14:34,474 –> 01:14:37,790
humans do not have about something? Mhmm.
1201
01:14:38,010 –> 01:14:41,449
Can we ask the the computers to do
1202
01:14:41,449 –> 01:14:45,290
it? There’s two difficulties I have with that just as a
1203
01:14:45,290 –> 01:14:49,070
matter of structure. One is, to our point earlier,
1204
01:14:49,849 –> 01:14:53,415
we falsely think humans are
1205
01:14:53,415 –> 01:14:56,395
intelligent because we have
1206
01:14:57,095 –> 01:15:00,635
data. It’s not true. Humans
1207
01:15:00,775 –> 01:15:03,435
are intelligent because we have instincts
1208
01:15:04,375 –> 01:15:08,170
based on reflections of data. And so
1209
01:15:08,170 –> 01:15:11,070
and so that begs the question,
1210
01:15:11,930 –> 01:15:15,310
what instincts does this have? None. Mhmm.
1211
01:15:15,930 –> 01:15:19,605
And what reflective ability does it have?
1212
01:15:19,765 –> 01:15:23,605
None. So it’s highly limited in its
1213
01:15:23,605 –> 01:15:27,364
ability to create actual intelligence, and
1214
01:15:27,364 –> 01:15:30,965
you and I are not. Right? It’s actually a
1215
01:15:30,965 –> 01:15:34,740
fairly simple process, right, that, you know,
1216
01:15:34,740 –> 01:15:38,580
if my wife were to come in here right now and start yelling
1217
01:15:38,580 –> 01:15:42,340
at me, we would say, oh, jeez. Brian looks upset. And
1218
01:15:42,340 –> 01:15:46,180
then we’d be able to determine within a couple
1219
01:15:46,180 –> 01:15:49,364
of minutes, probably, why she’s upset.
1220
01:15:49,905 –> 01:15:53,665
Yeah. And it would be really hard for an LLM
1221
01:15:53,665 –> 01:15:57,505
to do that. Right? Like, you could feed it
1222
01:15:57,505 –> 01:16:01,344
my whole life, and it probably couldn’t do that. But you and
1223
01:16:01,344 –> 01:16:05,070
I could do it in about forty seconds. And
1224
01:16:05,070 –> 01:16:08,750
so and so and so it’s very like, it’s it and and so and so
1225
01:16:08,750 –> 01:16:12,510
we make the assumption that data is equal
1226
01:16:12,510 –> 01:16:16,350
to assessment of data and reflection on data, and they’re
1227
01:16:16,350 –> 01:16:19,765
not the same thing. And it’s not codable.
1228
01:16:20,065 –> 01:16:23,765
And it then begs the question:
1229
01:16:24,225 –> 01:16:28,005
who is the thinker? Not
1230
01:16:28,225 –> 01:16:31,825
what is the data to be
1231
01:16:31,825 –> 01:16:35,180
thought about, but who is the thinker.
1232
01:16:35,880 –> 01:16:39,660
And do these things actually have enough personality
1233
01:16:39,720 –> 01:16:43,420
and, therefore, the instincts, etcetera, to be thinking?
1234
01:16:43,880 –> 01:16:47,455
And my sense is we’re very far from
1235
01:16:47,455 –> 01:16:51,055
that right now, and I don’t know that we will ever get
1236
01:16:51,055 –> 01:16:53,875
there. And I’ll share with you
1237
01:16:54,575 –> 01:16:58,015
this. I was gonna ask this question. I was at the Wall Street
1238
01:16:58,015 –> 01:17:01,659
Journal Future of Everything conference, and I went to the guy who
1239
01:17:01,659 –> 01:17:05,500
runs DeepMind, went to his thing. Yeah. And so I’m sure this is
1240
01:17:05,500 –> 01:17:08,219
gonna end up on the Internet, and go ahead and feel free to clip this
1241
01:17:08,219 –> 01:17:11,920
and make me look like an asshole. But I was gonna ask him this question.
1242
01:17:12,825 –> 01:17:16,505
And number one, he didn’t take any questions. The guy who ran DeepMind
1243
01:17:16,505 –> 01:17:20,265
number one, he didn’t take any questions. Oh, no. It was
1244
01:17:20,265 –> 01:17:23,545
Google’s sorry. I should say this right because the DeepMind guy was there as
1245
01:17:23,545 –> 01:17:27,260
well. It was Google’s, like, head
1246
01:17:27,260 –> 01:17:31,020
of, like, moonshot projects or something. Oh, yeah. Yeah. Yeah. Yeah. Yeah. Mhmm.
1247
01:17:31,020 –> 01:17:34,320
And I was gonna ask this question, but
1248
01:17:34,620 –> 01:17:38,460
his presentation was
1249
01:17:38,460 –> 01:17:42,005
so pedestrian. It was so simple. It
1250
01:17:42,625 –> 01:17:46,305
was so, hey. Human beings have been replaced by technology for
1251
01:17:46,305 –> 01:17:49,905
forever, and I know you’re upset about it. But, like, it was so I was
1252
01:17:49,905 –> 01:17:53,125
like, I can’t ask. I’m not sure he could answer it.
1253
01:17:53,344 –> 01:17:56,350
Like, if this is the guy
1254
01:17:57,050 –> 01:18:00,030
doing moonshots, we’re nowhere. Like,
1255
01:18:01,130 –> 01:18:04,890
they’re driving. This is not even the right highway. They’re going
1256
01:18:04,890 –> 01:18:08,555
the wrong direction. Like, this person can’t
1257
01:18:08,555 –> 01:18:12,235
comprehend that question. Right. Right? And
1258
01:18:12,235 –> 01:18:15,915
so, like, I was like, this is so frustrating. Because
1259
01:18:15,915 –> 01:18:19,435
these are interesting things if they actually bring it up, but that
1260
01:18:19,435 –> 01:18:23,110
was awful. Right? It was like talk about the
1261
01:18:23,110 –> 01:18:26,869
quality of somebody’s reflections on the
1262
01:18:26,869 –> 01:18:29,929
experiences they’ve had. I’m like, if that’s the quality of
1263
01:18:30,389 –> 01:18:33,849
reflection based on the experiences of the people who run
1264
01:18:34,070 –> 01:18:37,844
Google moonshots, sell your stock. Right? Like
1265
01:18:37,844 –> 01:18:41,364
like, that ain’t gonna go well for people. And so
1266
01:18:41,525 –> 01:18:45,364
and so my sense is that we have this promise of
1267
01:18:45,364 –> 01:18:49,159
the thing. We think data and processing power is the way to
1268
01:18:49,159 –> 01:18:52,520
get through the promise of this thing, and we’re missing the
1269
01:18:52,520 –> 01:18:56,060
question, who is the thinker and how is the thinker
1270
01:18:56,119 –> 01:18:59,639
creating the instincts of
1271
01:18:59,639 –> 01:19:03,045
creation? And I don’t think we’re anywhere near there. Yeah.
1272
01:19:03,045 –> 01:19:06,885
It’s just not a threat to writers. No. No.
1273
01:19:06,885 –> 01:19:08,105
I agree. I think
1274
01:19:12,645 –> 01:19:16,405
human beings can do everything an LLM can’t do. Right? Which is a lot of
1275
01:19:16,405 –> 01:19:20,140
things. And these two
1276
01:19:20,140 –> 01:19:23,680
things can be true at the same time. LLMs can do a lot of things
1277
01:19:24,220 –> 01:19:28,060
that human beings don’t want to do when they are
1278
01:19:28,060 –> 01:19:31,840
employed to do them.
1279
01:19:32,515 –> 01:19:36,355
And the tragedy is the things that human beings don’t
1280
01:19:36,355 –> 01:19:39,955
wanna do. Say, for instance, I’ve got
1281
01:19:39,955 –> 01:19:43,715
to do my laundry. Because I live in a house with other
1282
01:19:43,715 –> 01:19:46,140
people, I get to do my laundry once a month.
1283
01:19:47,660 –> 01:19:50,380
That’s the only time that I can get in. Yes. I do have enough clean
1284
01:19:50,380 –> 01:19:51,760
clothes. Thank you for asking.
1285
01:19:54,220 –> 01:19:57,900
I make sure I do everything once a month, and then I just dominate. And
1286
01:19:57,900 –> 01:20:01,574
then I’m done, and I irritate everybody, and it’s fine. I don’t
1287
01:20:01,715 –> 01:20:05,415
want an LLM to send my email
1288
01:20:05,795 –> 01:20:09,235
to somebody. That’s not a problem. I want the
1289
01:20:09,235 –> 01:20:12,934
LLM to do my laundry. Yes. Yes.
1290
01:20:12,994 –> 01:20:16,600
To paraphrase Peter Thiel, you know, I don’t
1291
01:20:16,600 –> 01:20:20,360
wanna be promised moonshots and
1292
01:20:20,360 –> 01:20:23,740
get emails. And don’t overpromise
1293
01:20:24,680 –> 01:20:28,360
and then, specifically, don’t
1294
01:20:28,360 –> 01:20:31,580
underdeliver. Yeah. Yeah.
1295
01:20:31,975 –> 01:20:34,935
Alright, Brian. I think we’ve reached the end of our time together. This has been
1296
01:20:34,935 –> 01:20:38,775
a fascinating conversation. We’ve opened up doors in
1297
01:20:38,775 –> 01:20:41,655
the floor of my head. Hopefully, I’ve opened up some doors in the floor of
1298
01:20:41,655 –> 01:20:45,335
your head. Appreciate it. Hopefully, this has been an
1299
01:20:45,335 –> 01:20:49,140
enlightening and engaging conversation for our listeners as well, something to think
1300
01:20:49,140 –> 01:20:52,660
about. We haven’t really come to any conclusions, and I think that’s good, because these
1301
01:20:52,660 –> 01:20:56,500
are all still open questions. What would you like to
1302
01:20:56,500 –> 01:20:59,800
promote today, if anything? I’ll give you the last word here.
1303
01:21:01,405 –> 01:21:05,105
Well, first of all, if anybody’s interested in learning more about us,
1304
01:21:05,565 –> 01:21:09,265
thinkdeeplywriteclearly.com. There’s a little
1305
01:21:09,485 –> 01:21:13,005
button on there for a fifteen-minute call if anybody’s interested
1306
01:21:13,005 –> 01:21:16,790
in chatting. In
1307
01:21:16,790 –> 01:21:20,550
a very nonspecific way, that’s how a lot of
1308
01:21:20,550 –> 01:21:24,390
people find it interesting to start with my company, if this
1309
01:21:24,390 –> 01:21:28,230
conversation is of interest to you, and if writing
1310
01:21:28,230 –> 01:21:30,875
in a deeper, more observant
1311
01:21:32,375 –> 01:21:35,835
way in the world is of interest to you. We have
1312
01:21:36,934 –> 01:21:40,695
a program that’s $99 per quarter, and it’s about ten
1313
01:21:40,695 –> 01:21:44,235
minutes a week. People tend to love that, and I’d be happy to give anybody,
1314
01:21:44,590 –> 01:21:47,470
you know, a couple of months free to see if they like it.
1315
01:21:47,470 –> 01:21:51,070
So just email me or, you know, connect
1316
01:21:51,070 –> 01:21:54,670
somehow on that site, and I’d be happy to chat with you. Great. We will
1317
01:21:54,670 –> 01:21:57,970
have links to Brian Morgan’s site
1318
01:21:58,270 –> 01:22:01,915
at Think Deeply, Write Clearly. I would encourage
1319
01:22:01,915 –> 01:22:05,675
you to check that out and to click on all those links and get in
1320
01:22:05,675 –> 01:22:09,515
contact with Brian Morgan and, of course, follow him around in all the places on
1321
01:22:09,515 –> 01:22:13,135
social media where you may be able to follow him,
1322
01:22:13,250 –> 01:22:16,790
and make sure to connect with him widely
1323
01:22:17,330 –> 01:22:21,170
and clearly. Alright. I’d like to thank Brian
1324
01:22:21,170 –> 01:22:24,790
Morgan for coming on the Leadership Lessons From the Great Books podcast today. And
1325
01:22:24,929 –> 01:22:28,290
with that, well, we’re out. Thank you,
1326
01:22:28,290 –> 01:22:28,790
Fred.