Do Androids Dream of Electric Sheep? by Philip K. Dick w/ John Hill and Jesan Sorrells
—
00:00 Welcome and Introduction – Do Androids Dream of Electric Sheep? by Philip K. Dick.
00:10 The Promise of 100,000 Humanoid Robots by 2030.
05:59 New Podcast Format Introduction.
14:47 When Does Jesan Stop Reading a Book?
19:20 Infinite Games, Culture, and Perspective.
22:38 Hindsight and Decision-Making Insights from Playing Poker.
27:11 Sociocultural Cycle Dynamics Explained.
32:11 The Tech Bros. Learned All the Wrong Lessons from Philip K. Dick.
38:50 Rethinking Salesperson Stereotypes.
45:54 Having a Human-Centric Sales Strategy.
52:17 Government vs. Private Industry Venality.
52:57 Conformity vs. Cultural Individuality.
01:02:33 Reality Mirrors Fiction in Pandemics.
01:07:53 Embracing Human Connection.
01:12:20 Embracing AI Gatekeepers for Success in the Future.
01:14:42 Sales Authenticity vs. Biases.
01:19:51 Revisiting AI and Societal Costs.
01:27:01 The Coming Anthropomorphizing of Robots.
01:34:53 The Summer of Love in 2068.
01:38:48 Value of Genuine Human Connections.
01:42:07 Cyclical Spiritual Awakenings in History.
01:48:06 Staying on the Leadership Path with Do Androids Dream of Electric Sheep?
—
Opening and closing themes composed by Brian Sanyshyn of Brian Sanyshyn Music.
—
- Pick up your copy of 12 Rules for Leaders: The Foundation of Intentional Leadership NOW on AMAZON!
- Check out the 2022 Leadership Lessons From the Great Books podcast reading list!
—
★ Support this podcast on Patreon ★
- Subscribe to the Leadership Lessons From The Great Books Podcast: https://bit.ly/LLFTGBSubscribe
- Check out HSCT Publishing at: https://www.hsctpublishing.com/.
- Check out LeadingKeys at: https://www.leadingkeys.com/
- Check out Leadership ToolBox at: https://leadershiptoolbox.us/
- Contact HSCT for more information at 1-833-216-8296 to schedule a full DEMO of LeadingKeys with one of our team members.
—
- Leadership ToolBox website: https://leadershiptoolbox.us/.
- Leadership ToolBox LinkedIn: https://www.linkedin.com/company/ldrshptlbx/.
- Leadership ToolBox YouTube: https://www.youtube.com/@leadershiptoolbox/videos
- Leadership ToolBox Twitter: https://twitter.com/ldrshptlbx.
- Leadership ToolBox IG: https://www.instagram.com/leadershiptoolboxus/.
- Leadership ToolBox FB: https://www.facebook.com/
00:00:02,080 –> 00:00:04,560
All right, all right, all right, all right, all right. Let’s get the show on
2
00:00:04,560 –> 00:00:07,920
the road. All right. Leadership Lessons from the Great Books
3
00:00:07,920 –> 00:00:09,640
podcast episode number
4
00:00:09,640 –> 00:00:13,360
159. Do Androids Dream
5
00:00:13,600 –> 00:00:17,200
of Electric Sheep? by Philip K. Dick with John
6
00:00:17,360 –> 00:00:21,000
Hill aka Small Mountain in
7
00:00:21,000 –> 00:00:23,840
3, 2, 1.
8
00:00:26,570 –> 00:00:30,010
Hello, my name is Jesan Sorrells, and this is
9
00:00:30,170 –> 00:00:33,890
Leadership Lessons from the Great Books podcast, episode
10
00:00:33,890 –> 00:00:36,250
number 159.
11
00:00:38,250 –> 00:00:42,010
We will open our show today with a
12
00:00:42,010 –> 00:00:45,290
short yet iconic speech that resonates
13
00:00:45,770 –> 00:00:49,090
both as a naive warning and as a prescient
14
00:00:49,090 –> 00:00:52,830
prediction of humanity’s future. And I’m thinking that
15
00:00:52,830 –> 00:00:56,230
future is just about 10 minutes from now.
16
00:00:57,670 –> 00:01:01,350
And I quote, I’ve seen things you people
17
00:01:01,350 –> 00:01:04,870
wouldn’t believe. Attack ships on fire off the
18
00:01:04,870 –> 00:01:08,550
shoulder of Orion. I watched C-beams glitter
19
00:01:08,550 –> 00:01:12,150
in the dark near the Tannhauser Gate. All
20
00:01:12,150 –> 00:01:15,910
those moments will be lost in time like tears in
21
00:01:15,910 –> 00:01:18,960
the rain. Time to die.
22
00:01:20,000 –> 00:01:21,120
Close quote.
23
00:01:23,520 –> 00:01:27,360
Combine that from Roy Batty with this
24
00:01:27,360 –> 00:01:31,200
little tidbit I recently pulled from an email
25
00:01:31,280 –> 00:01:35,040
newsletter list that I’m a part of that tracks trends.
26
00:01:36,320 –> 00:01:40,080
It was published in January of 2025 from a
27
00:01:40,080 –> 00:01:43,750
story on Forbes.com, so take from this what you
28
00:01:43,750 –> 00:01:47,390
will. And I quote: the CEO of one of the leading
29
00:01:47,390 –> 00:01:50,990
manufacturers of humanoid robots says it has signed a
30
00:01:50,990 –> 00:01:54,710
second commercial customer that is, quote, one of the
31
00:01:54,710 –> 00:01:57,830
biggest US companies, close quote. Figure
32
00:01:57,910 –> 00:02:01,630
CEO Brett Adcock also said that he sees the potential
33
00:02:01,630 –> 00:02:05,390
to ship 100,000 humanoid robots over
34
00:02:05,390 –> 00:02:09,110
the next four years and said that figure is focused on two
35
00:02:09,110 –> 00:02:12,510
markets, commercial and home. And I quote
36
00:02:12,510 –> 00:02:16,230
directly from Forbes quoting Figure CEO
37
00:02:16,230 –> 00:02:20,070
Brett Adcock, our newest customer is one of the biggest US companies, Adcock says
38
00:02:20,070 –> 00:02:23,790
in an update on LinkedIn. It gives us potential to ship at high volumes,
39
00:02:23,790 –> 00:02:27,310
which will drive cost reduction and AI data collection between
40
00:02:27,310 –> 00:02:30,750
both customers. We believe there is a path to 100,000 robots
41
00:02:30,910 –> 00:02:34,590
over the next four years. That's
42
00:02:34,590 –> 00:02:38,050
as of January 2025, close quote.
43
00:02:39,970 –> 00:02:43,810
100,000 robots, humanoid robots no less, in
44
00:02:43,810 –> 00:02:47,490
commercial and home applications by 2029 or maybe 2030
45
00:02:47,490 –> 00:02:51,330
at the latest. Right? Oh, and by the way, these humanoid
46
00:02:51,330 –> 00:02:55,130
robots will be equipped with baseline large language model programming, in case any
47
00:02:55,130 –> 00:02:58,930
of you are wondering out there. Thus taking the next leap forward in
48
00:02:59,410 –> 00:03:03,210
seeking to solve the embodiment problem inherent in AI and
49
00:03:03,210 –> 00:03:06,960
robotics understanding since Alan Turing
50
00:03:07,040 –> 00:03:09,640
came up with the Turing Test back in the
51
00:03:09,640 –> 00:03:11,120
1940s.
52
00:03:14,240 –> 00:03:17,960
Our author today would only be surprised that it
53
00:03:17,960 –> 00:03:21,480
took all of us until about 2025 to arrive at the
54
00:03:21,480 –> 00:03:25,240
doorstep of our brave new world. With, of course, echoes of
55
00:03:25,240 –> 00:03:28,760
our cowardly old world. And that we got here without a major
56
00:03:28,760 –> 00:03:32,460
nuclear exchange and without colonies
57
00:03:32,460 –> 00:03:36,180
on the moon, without vacations to Mars or
58
00:03:36,340 –> 00:03:40,180
even interstellar exploration. No, no. We got here with
59
00:03:40,180 –> 00:03:43,820
140 characters and people endlessly hot-taking each
60
00:03:43,820 –> 00:03:45,940
other about things that don’t matter.
61
00:03:47,540 –> 00:03:50,580
We’ve seen things our author today would agree
62
00:03:51,380 –> 00:03:52,580
that he wouldn’t believe.
63
00:03:55,140 –> 00:03:58,940
Today on this episode of the podcast, we will be recommending
64
00:03:58,940 –> 00:04:02,610
what leaders can do and what leaders can think and how
65
00:04:02,610 –> 00:04:06,370
leaders should behave in times of cynically delivered technological
66
00:04:06,370 –> 00:04:10,210
fantasies, fatalistic opportunism and endless
67
00:04:10,210 –> 00:04:13,730
denuding cultural slop by exploring
68
00:04:13,730 –> 00:04:17,210
insights provided for us from the book Do
69
00:04:17,210 –> 00:04:21,010
Androids Dream of Electric Sheep? By
70
00:04:21,010 –> 00:04:24,250
Philip K. Dick. Leaders.
71
00:04:25,290 –> 00:04:28,090
We’re soon going to find out the answer
72
00:04:30,520 –> 00:04:31,640
to the title question
73
00:04:35,000 –> 00:04:38,800
and we are going to be joined on the show today by our co
74
00:04:38,800 –> 00:04:42,520
host rejoining us from episode number 122. By the
75
00:04:42,520 –> 00:04:46,160
way, I've looked through; it appears that he has been on at least seven
76
00:04:46,160 –> 00:04:49,560
episodes with us. So he’s officially sort of getting ready to arc into that
77
00:04:49,640 –> 00:04:53,320
regular guest co host slot, working his way
78
00:04:53,320 –> 00:04:56,710
up past some other competitors like Libby Unger and Richard
79
00:04:56,710 –> 00:05:00,470
Messing. So that’s good. He’s got a goal
80
00:05:00,470 –> 00:05:03,990
now. And in episode number
81
00:05:03,990 –> 00:05:07,390
122, we discussed Ray Bradbury's Fahrenheit 451.
82
00:05:08,750 –> 00:05:11,710
Our guest host is with us today from an
83
00:05:11,710 –> 00:05:14,670
undisclosed location deep,
84
00:05:15,550 –> 00:05:17,710
deep in the heart of the American West.
85
00:05:19,950 –> 00:05:23,790
John Hill, AKA Small Mountain. How you doing,
86
00:05:23,790 –> 00:05:26,350
John? I am
87
00:05:27,790 –> 00:05:31,550
loving my life right now. I’m on vacation, family road trip. We’re in
88
00:05:31,550 –> 00:05:35,310
New Mexico. So that’s as, that’s as disclosed as we’re going to get here.
89
00:05:35,310 –> 00:05:38,750
Right. Because I'm in the Land of Enchantment. So come find me if you want
90
00:05:38,750 –> 00:05:41,790
to come hang out and. Yeah, I
91
00:05:42,910 –> 00:05:46,350
can, I, can I talk about my, my unique perspective on this or do you
92
00:05:46,350 –> 00:05:49,990
want to talk a little more about the book? No, go ahead, jump in. One
93
00:05:49,990 –> 00:05:53,830
thing I will say is this:
94
00:05:53,830 –> 00:05:57,110
because this book is still under vicious copyright, we are not going to be reading
95
00:05:57,110 –> 00:06:00,830
directly from the book today. We will. Oh, okay. If you, if
96
00:06:00,830 –> 00:06:04,590
you, if you, if. We will be reading clips, pieces, little paragraphs here and there
97
00:06:04,590 –> 00:06:08,230
to make larger points, but we will be referencing and framing pieces
98
00:06:08,230 –> 00:06:11,510
of the book for folks just like we did on our introductory episode, episode number
99
00:06:11,510 –> 00:06:14,790
158. I recommend you go back and listen to that before you listen to this,
100
00:06:16,310 –> 00:06:19,230
which frames some of the themes, the larger themes that we’re going to be talking
101
00:06:19,230 –> 00:06:22,890
about today in our new, our new format where I do an
102
00:06:22,890 –> 00:06:26,050
intro episode by myself, and then I bring on a guest, and we sort of
103
00:06:26,050 –> 00:06:29,810
break this down a little bit larger level. Cool. So we’re doing that. This
104
00:06:29,810 –> 00:06:32,770
is part of the new format. Oh, yeah, that’s right. You haven’t, you haven’t been
105
00:06:32,770 –> 00:06:35,410
around since the new format. Yeah, this is, this is the new, this is the
106
00:06:35,410 –> 00:06:39,010
new thing, John. Okay. Doing, doing a little bit of, kind of a little bit
107
00:06:39,010 –> 00:06:42,610
different thing. That way we could do a deeper dive into, into books. Plus, it
108
00:06:42,610 –> 00:06:46,260
also lightens the, the reading load on, on
109
00:06:46,260 –> 00:06:50,060
me. Yeah. Yeah. I mean, I. Whenever we first
110
00:06:50,060 –> 00:06:53,420
started talking about shows and episodes and everything, and you were showing me your
111
00:06:53,420 –> 00:06:57,060
schedule, I had a moment of like,
112
00:06:58,260 –> 00:07:02,059
how, how, like, I read a lot, you know, and you go ask my
113
00:07:02,059 –> 00:07:05,900
friends who don’t read, and, like, in their perspective, I must be
114
00:07:05,900 –> 00:07:08,580
just, like, locked in a tomb 24 hours a day. Right? But then I see
115
00:07:08,580 –> 00:07:12,300
your schedule of just, like, the stuff that. And I was like, oh, there’s no
116
00:07:12,300 –> 00:07:14,500
way he keeps this. And we’ve known each other for a couple of years now,
117
00:07:14,500 –> 00:07:18,250
and you, like, this is the first year that you’ve had to, like,
118
00:07:18,250 –> 00:07:22,090
move anything, at least in, in my, in, you know, in
119
00:07:22,090 –> 00:07:25,690
relation to me, right? Well, I, I, I moved. So
120
00:07:25,850 –> 00:07:29,690
I will say this. I went from running eight
121
00:07:29,690 –> 00:07:32,730
books a month to running four books a month now.
122
00:07:33,850 –> 00:07:37,650
So I'm currently reading, just so that you know, that y'all know, because
123
00:07:37,650 –> 00:07:40,730
these are all upcoming on the podcast. I’m reading Empire of the Summer Moon,
124
00:07:41,770 –> 00:07:45,290
which is getting turned into a show written by Taylor Sheridan.
125
00:07:45,660 –> 00:07:47,700
So I’m kind of on a little bit of a little bit of a kick
126
00:07:47,700 –> 00:07:51,420
on that. Then I'm also reading B.H. Liddell Hart's
127
00:07:51,500 –> 00:07:54,460
Why Don't We Learn from History? That's going to be coming up in October.
128
00:07:55,900 –> 00:07:59,660
I’m about 75% through. We talked about this just before we hit record, about
129
00:07:59,660 –> 00:08:03,260
75% through Martian Chronicles. And
130
00:08:03,340 –> 00:08:06,700
I started. Interesting enough, I started Aldous Huxley’s Brave New World, which we’re going to
131
00:08:06,700 –> 00:08:10,500
read in June. And I got to a point in
132
00:08:10,500 –> 00:08:13,850
Huxley where I had to get off the train. And I won’t talk about why
133
00:08:13,850 –> 00:08:16,610
on this episode, but I had to get off the train with Huxley.
134
00:08:17,490 –> 00:08:21,250
I was just. I was done. I was like, I mean, I see where you’re
135
00:08:21,250 –> 00:08:24,290
going, Aldous, but we’re. We’re finished here. We’re good.
136
00:08:25,410 –> 00:08:29,250
We’re good. So I’m curious, right? As a guy who has
137
00:08:29,250 –> 00:08:32,850
this show, right, and promotes this idea that, like, you don’t need to
138
00:08:32,850 –> 00:08:36,690
reinvent the wheel. Someone has gone through something similar, right? And you. And you spend
139
00:08:36,690 –> 00:08:39,610
a lot of time looking back at this. And I’m also very aware that, like,
140
00:08:39,610 –> 00:08:43,010
we’re not on topic at all, but I’m in vacation mode, so you just have
141
00:08:43,010 –> 00:08:46,720
to, like, wrangle me. So. So how do you deal with
142
00:08:46,720 –> 00:08:50,440
a book like that?
143
00:08:50,520 –> 00:08:54,240
Right? Like, you. You see, like, is it. When you can see
144
00:08:54,240 –> 00:08:57,640
where it’s going, you’re like, okay, like, I’m not going with you.
145
00:08:57,960 –> 00:09:01,760
Is it like. Like, how. When does Jesan decide that a book is
146
00:09:01,760 –> 00:09:05,440
not worth it? How does Jesan quit a book? Like, how. How do you go
147
00:09:05,440 –> 00:09:09,160
through that process? I’m very fascinated with how people grade reading,
148
00:09:09,160 –> 00:09:12,960
think about reading, quitting points, when to put down a book, and stuff like that.
149
00:09:12,960 –> 00:09:16,070
Because life is just. I want to read everything. Life is just too short, though,
150
00:09:16,070 –> 00:09:19,510
you know? Yeah. Yeah. So for me,
151
00:09:20,150 –> 00:09:23,910
and this is just my. My perspective, your mileage will vary,
152
00:09:23,910 –> 00:09:27,630
right? You know, you’re asking me the question. I host the show. So here
153
00:09:27,630 –> 00:09:28,070
we go.
154
00:09:33,590 –> 00:09:37,430
If I reach a point in a book where it.
155
00:09:37,830 –> 00:09:41,120
The writer is postulating an idea
156
00:09:42,320 –> 00:09:45,760
where I can see the logical outcropping of that idea,
157
00:09:46,400 –> 00:09:50,000
and they’re not. They’re not. They’re telegraphing sort of where they’re going to be
158
00:09:50,000 –> 00:09:53,720
going. And I don’t. I’m not
159
00:09:53,720 –> 00:09:57,440
in agreement with that idea, or I find that idea. And really,
160
00:09:57,440 –> 00:10:00,560
agreement’s really a tenuous one because I can read a lot of things I do.
161
00:10:00,560 –> 00:10:03,360
I read a lot of things I don’t agree with. So let’s not really put
162
00:10:03,360 –> 00:10:07,040
too much stock into that. It’s the idea that I
163
00:10:07,040 –> 00:10:10,720
find morally repugnant that violates my. And we talked a little bit
164
00:10:10,720 –> 00:10:14,240
about morals and ethics before in Fahrenheit 451. And,
165
00:10:15,120 –> 00:10:18,560
you know, my morality is influenced.
166
00:10:20,160 –> 00:10:23,760
Good, bad, ugly, or different? It’s. It’s influenced by how I
167
00:10:23,760 –> 00:10:27,480
perceive, not how I perceive. No, it is influenced
168
00:10:27,480 –> 00:10:31,320
by how I practice my Christianity. Okay. It’s influenced by how I practice my religion.
169
00:10:31,320 –> 00:10:35,170
It just is. And I would be lying if I said it wasn't.
170
00:10:35,170 –> 00:10:38,410
I think most people do lie. You are being influenced by something that’s a higher
171
00:10:38,890 –> 00:10:42,370
thing. You just don’t know how to identify it. Right. And you don’t really know
172
00:10:42,370 –> 00:10:46,170
how to talk about it. So if I’m in a book and this is
173
00:10:46,170 –> 00:10:49,370
just me again, not everybody else, just me. If I’m in a book
174
00:10:49,930 –> 00:10:53,650
and. And I’m going along, I’m Bebopping along and I find
175
00:10:53,650 –> 00:10:57,130
something to be morally repugnant in there. And I can see that the author is
176
00:10:57,130 –> 00:10:59,610
sort of using this to make a larger point.
177
00:11:01,130 –> 00:11:04,090
I will say to myself and to the author,
178
00:11:04,970 –> 00:11:08,810
you know, I don’t have to go down that road with you. I’m
179
00:11:08,810 –> 00:11:11,650
not required to now. I will tell you, it’s taken me a long time to
180
00:11:11,650 –> 00:11:14,650
get to that point, you know, So I read Brave New World
181
00:11:15,370 –> 00:11:19,050
when I was probably 15 or 16 maybe,
182
00:11:19,290 –> 00:11:23,090
was the first time I interacted with that book. And I
183
00:11:23,090 –> 00:11:26,600
was in a totally different stage of life. I had, you know, didn’t have kids.
184
00:11:27,320 –> 00:11:31,080
I didn’t have responsibilities. I read
185
00:11:31,080 –> 00:11:34,880
it in. In parallel to or right around the same time that I
186
00:11:34,880 –> 00:11:38,360
also read. I was reading The Stand
187
00:11:38,840 –> 00:11:42,520
by Stephen King. Oh, wow. So there’s kind of a lot of things going on
188
00:11:42,520 –> 00:11:46,240
here together. I have not read either one of those
189
00:11:46,240 –> 00:11:50,080
books, but Melissa’s a huge Stephen
190
00:11:50,080 –> 00:11:53,460
King fan, and she’s like, hey, anytime you’re ready. And she’ll just kind of point
191
00:11:53,460 –> 00:11:56,980
to it. And I’m like, I know it’s there, but, like, I’m not ready for
192
00:11:56,980 –> 00:12:00,660
it, you know, so. So some of the worst excesses of COVID you will
193
00:12:00,660 –> 00:12:04,300
find in The Stand. Oh, that's interesting. Okay.
194
00:12:05,020 –> 00:12:08,700
It’s. It’s what, logically? No, it’s one of those
195
00:12:08,700 –> 00:12:12,220
books that the first time I read it, it. And I was already,
196
00:12:12,460 –> 00:12:15,940
as you can imagine, I was already walking down the road to being jaded at
197
00:12:15,940 –> 00:12:19,700
15. And that even led. I
198
00:12:19,700 –> 00:12:23,420
mean, that book even, you know, had the bottom dropping out of me,
199
00:12:23,500 –> 00:12:27,100
you know, because he
200
00:12:27,100 –> 00:12:30,780
nailed human patterns of behavior
201
00:12:31,020 –> 00:12:34,220
so accurately in that. And so
202
00:12:35,340 –> 00:12:39,140
particularly when all the rules are off, right? When there’s no more boundaries, no
203
00:12:39,140 –> 00:12:42,620
more restrictions, what are people actually going to do? How. How.
204
00:12:42,780 –> 00:12:46,620
What’s the pattern, the typical pattern of human behavior? Then let’s just logically
205
00:12:46,700 –> 00:12:49,920
push that forward. Huxley, on the other hand,
206
00:12:53,840 –> 00:12:57,600
Brave New World focuses around hedonic pleasure and
207
00:12:57,600 –> 00:13:01,440
around what hedonism looks like. And how do
208
00:13:01,440 –> 00:13:05,000
we. How is that
209
00:13:05,000 –> 00:13:08,760
used for utopian ends by people who want to be in
210
00:13:08,760 –> 00:13:12,440
control and be in power? And you can see a lot of that in our
211
00:13:12,440 –> 00:13:15,720
own culture right now. Okay. And
212
00:13:15,960 –> 00:13:19,800
so where I have wound up, you know, 30 years
213
00:13:19,800 –> 00:13:23,600
later with kids and with a perspective
214
00:13:23,600 –> 00:13:27,400
on the hedonic culture, that’s. It’s probably counterculture
215
00:13:28,360 –> 00:13:31,960
and sort of where I go with all of that. I. I
216
00:13:32,040 –> 00:13:35,840
can’t follow a quote unquote free thinker like Huxley. He would say,
217
00:13:35,840 –> 00:13:39,320
I'm captured by morality. Nietzsche would say a slave
218
00:13:39,320 –> 00:13:42,830
morality. I would disagree with Nietzsche too. I don’t think he really understood what he’s
219
00:13:42,830 –> 00:13:46,030
talking about there, but that’s what he would say. And that’s, and that’s sort of
220
00:13:46,030 –> 00:13:49,230
where I go with Huxley. And so I’m having this philosophical battle with the author.
221
00:13:49,230 –> 00:13:51,670
And I do this, by the way, in every book that I read. Am I
222
00:13:51,670 –> 00:13:53,870
having this philosophical battle with this author? So
223
00:13:55,630 –> 00:13:59,150
I mentioned Empire of the Summer Moon, and that’s another book there too, where I’m
224
00:13:59,150 –> 00:14:02,950
sort of having this philosophical battle with the author. Yeah. When you first
225
00:14:02,950 –> 00:14:05,750
started to talk about your way of thinking around this, I was kind of confused
226
00:14:05,750 –> 00:14:09,550
for a moment, right? And then I was like, okay, but to put this
227
00:14:09,550 –> 00:14:12,550
into a business book context, right, because everybody does business books and you do as
228
00:14:12,550 –> 00:14:16,270
well, right. So if I’m hearing you correctly, if, if,
229
00:14:16,270 –> 00:14:19,790
you know, even if it’s like a very highly touted business book, right.
230
00:14:20,030 –> 00:14:22,830
If they’re making the example use case off of,
231
00:14:25,470 –> 00:14:29,310
you know, Hugh Hefner or Larry Flynt or,
232
00:14:29,310 –> 00:14:33,030
you know, or, or anything in that realm, like, like, if that is
233
00:14:33,030 –> 00:14:36,470
the use case that’s forming your thinking. We did. Like, like, like that’s just where
234
00:14:36,470 –> 00:14:39,790
you called it is kind of what I’m hearing. Okay. Yeah, that’s what I’m going
235
00:14:39,790 –> 00:14:42,430
to call the game. And that’s for me, like, I’m not saying that that has
236
00:14:42,430 –> 00:14:45,230
to be for everybody. And you know what? Quite frankly, maybe the author gets to
237
00:14:45,230 –> 00:14:48,790
a different spot on the other end of it. And that’s kind of my, that’s
238
00:14:48,790 –> 00:14:52,270
kind of my, like, next question. Right? Because, like, I, you know,
239
00:14:52,270 –> 00:14:56,030
I’m like you in my, in my own way of, like,
240
00:14:56,030 –> 00:14:59,830
I. Sometimes it’s just too real. Like, sometimes it’s just I don’t, you
241
00:14:59,830 –> 00:15:02,430
know, I don’t need to subject myself to this, you know, and stuff like that.
242
00:15:02,840 –> 00:15:05,520
But then on the other side of it, I am concerned about building an echo
243
00:15:05,520 –> 00:15:09,280
chamber, right? Of just having, you know, people around me believe what I believe
244
00:15:09,280 –> 00:15:12,040
and the books on my shelves that I really like because they just, you know,
245
00:15:12,040 –> 00:15:15,760
validate all my thinking that’s already there in place and everything, you know, and it’s
246
00:15:15,760 –> 00:15:19,080
weird because I try to, I try to have
247
00:15:19,320 –> 00:15:23,040
some balance to this stuff, right? Like, I’ll, I’ll try to go finish
248
00:15:23,040 –> 00:15:26,600
a book that, like, I’m, like, struggling with or, you know, I’ll, I’ll go dive
249
00:15:26,600 –> 00:15:29,720
into a conversation with someone who I know is polar opposite of me just because,
250
00:15:29,720 –> 00:15:33,290
like, I can respect them enough that I can, you know, hear them out
251
00:15:33,290 –> 00:15:36,650
and leave it, you know. But you know,
252
00:15:36,970 –> 00:15:40,170
to me, I think that that’s, I think that that is also dangerous, right? Of,
253
00:15:40,170 –> 00:15:44,010
you know, you know, there's a line in Stoicism, like
254
00:15:44,010 –> 00:15:46,410
the worst place you can be before you know who you are is in a
255
00:15:46,410 –> 00:15:50,210
crowd, right? And then, and then the worst place to be after you know
256
00:15:50,210 –> 00:15:52,330
who you are is also in a crowd. It’s also in a crowd.
257
00:15:54,650 –> 00:15:58,370
So like, I don’t know, I like it’s Seneca or like Epictetus. I’m not sure
258
00:15:58,370 –> 00:16:00,370
which one it is. But like, I just think about that all the time because,
259
00:16:00,370 –> 00:16:02,810
like, man, I was hanging out in these crowds and I’m drinking from that Kool
260
00:16:02,810 –> 00:16:05,700
Aid and stuff like this, and now it’s just like, I know to avoid those,
261
00:16:05,700 –> 00:16:09,060
those circles that like just don’t serve me and everything else like that.
262
00:16:10,420 –> 00:16:13,500
But I also try to live by the idea that I should be challenged. My
263
00:16:13,500 –> 00:16:17,260
blind spots, my conventions, my thinking and stuff like that.
264
00:16:17,260 –> 00:16:21,100
Because I also think that it’s easy to get complacent. So
265
00:16:21,100 –> 00:16:24,740
I think that there’s multiple different ways to do that. And I think
266
00:16:24,980 –> 00:16:28,500
books, obviously, conversations with people who, to your point,
267
00:16:28,580 –> 00:16:32,340
are of opposing viewpoints or perspectives. I
268
00:16:32,580 –> 00:16:36,020
honestly, I don’t, I never,
269
00:16:36,740 –> 00:16:40,380
and I don’t want people listening to this to think that I’m a censorious person,
270
00:16:40,380 –> 00:16:44,220
because I’m not. I want everything published. Publish it all. That’s why
271
00:16:44,220 –> 00:16:47,620
we have free speech. Absolutely, all of it. The government should take no position.
272
00:16:48,820 –> 00:16:52,660
Don’t get me started on government schools and government school run
273
00:16:52,660 –> 00:16:55,780
libraries. Don’t get me started on that nonsense. Just don’t.
274
00:16:57,700 –> 00:17:01,490
Government should take no position. Libraries should actually be funded by the public,
275
00:17:01,730 –> 00:17:05,450
or, by the way, by private individuals. Put whatever you want to put in
276
00:17:05,450 –> 00:17:08,610
there and let’s move on. The
277
00:17:08,610 –> 00:17:12,290
responsibility for curating and
278
00:17:12,370 –> 00:17:15,330
quite frankly censoring ideas
279
00:17:16,370 –> 00:17:19,690
comes not from an overall
280
00:17:19,690 –> 00:17:23,410
institution but, quite frankly, and this is why you have free speech, from the
281
00:17:23,410 –> 00:17:26,980
culture. Yeah, the culture will figure it
282
00:17:26,980 –> 00:17:30,780
out. And what we see if we
283
00:17:30,780 –> 00:17:34,580
have robust free speech and robust book publishing,
284
00:17:34,660 –> 00:17:37,860
which is also why I’m opposed to some of the things going on with
285
00:17:37,860 –> 00:17:41,700
LLMs, I see a narrowing of
286
00:17:42,020 –> 00:17:45,740
view and a narrowing of thought that is supposed to be the
287
00:17:45,740 –> 00:17:48,820
majority. So I'll go back to Brave New World.
288
00:17:49,220 –> 00:17:52,690
Huxley’s perspective on hedonistic
289
00:17:52,690 –> 00:17:56,330
pleasure has become the dominant perspective. And I say this with
290
00:17:56,330 –> 00:17:59,490
absolutely no. What do you call it?
291
00:18:00,210 –> 00:18:03,290
I’m not. I’m not. I’m not taking a position on this one way or another,
292
00:18:03,290 –> 00:18:05,650
whether this is a good or bad. I’m just saying this is a fact. I
293
00:18:05,650 –> 00:18:09,010
think we can look around and see it. We also
294
00:18:09,170 –> 00:18:12,210
have enough examples
295
00:18:12,930 –> 00:18:16,570
of people of all different age brackets and
296
00:18:16,570 –> 00:18:19,950
ranges literally from cradle to grave being
297
00:18:19,950 –> 00:18:23,670
influenced around what their hedonic pleasures should be.
298
00:18:23,990 –> 00:18:26,150
And primarily, this is being driven by the cell phones.
299
00:18:27,750 –> 00:18:29,990
Okay. Yeah, okay.
300
00:18:30,870 –> 00:18:32,790
Huxley won the argument.
301
00:18:34,550 –> 00:18:38,350
Guys like me lost. We lost. It’s fine. We
302
00:18:38,350 –> 00:18:41,590
lost. And by the way. By the way, I’m not saying by the way that
303
00:18:41,590 –> 00:18:45,350
the argument is. It’s still ongoing. I’m merely saying that we lost the argument.
304
00:18:45,350 –> 00:18:48,700
Now, does that mean that we shouldn’t stop having it? No. Does that mean the
305
00:18:48,700 –> 00:18:52,460
discussion shouldn’t still be going on? No, it will be, because people can’t shut up
306
00:18:52,460 –> 00:18:56,300
about it. But. But we lost. And that’s. And
307
00:18:56,300 –> 00:18:59,860
by the way, that’s the thing that happens in culture. Culture decides who wins and
308
00:18:59,860 –> 00:19:03,700
loses, right? And that’s very tough for us to deal with
309
00:19:03,700 –> 00:19:07,260
either side of the argument, right? And then we want. And then we want to.
310
00:19:07,260 –> 00:19:09,500
And then we want to dump it off into politics and make it a political
311
00:19:09,500 –> 00:19:12,900
thing, when in reality it’s not. It’s a cultural thing. It is a cultural thing.
312
00:19:13,150 –> 00:19:16,870
Um, this is fascinating because I just read this other book, right? The. The.
313
00:19:16,870 –> 00:19:20,470
The Infinite Game by Simon Sinek, and it. Oh, yeah, really?
314
00:19:20,470 –> 00:19:23,990
Really? This is my second book by him that I’ve dove into. The first one
315
00:19:23,990 –> 00:19:27,430
was Start with Why, and a friend of mine recommended this one, and, man, it was
316
00:19:27,430 –> 00:19:30,750
so good, because I think it brings a lot of
317
00:19:31,790 –> 00:19:35,070
interesting perspective to the conversation you’re having, right? Like
318
00:19:36,030 –> 00:19:39,870
that, like, culture is an ongoing game. It’s not like a. Like a.
319
00:19:39,870 –> 00:19:42,870
You know, and if you’re trying to win at it, right, you’re taking it out
320
00:19:42,870 –> 00:19:46,690
of this finite, kind of dense, beautiful tapestry and stuff like that, to use
321
00:19:46,690 –> 00:19:50,010
all these very weird, crunchy labels, because I'm in New Mexico and I'm allowed. Come
322
00:19:50,010 –> 00:19:52,890
after me if you don’t like it, right? Versus
323
00:19:53,530 –> 00:19:57,290
this, like, okay, we lost the argument. Okay, well, like, hey,
324
00:19:57,290 –> 00:20:00,890
like, you know, but there’s also these. These other trends and
325
00:20:01,050 –> 00:20:04,410
encounters to this whole thing, right? Because, like, you know, I think
326
00:20:05,050 –> 00:20:08,730
that maybe some of this freewheeling, you know, that we’re dealing
327
00:20:08,730 –> 00:20:12,410
with now is. Is kind of this, like, right, You.
328
00:20:12,410 –> 00:20:16,170
You put. You run it through the machine, right? And the tightening machine happens, right?
329
00:20:16,170 –> 00:20:19,610
And. Right? It’s. It’s the. It’s the same reason why, like every
330
00:20:19,610 –> 00:20:23,290
preacher’s kid has like an amazing college experience, you know, kind of thing.
331
00:20:23,290 –> 00:20:23,610
Right.
332
00:20:27,050 –> 00:20:30,570
Well, to your point, and this is one of the reasons why I talk a
333
00:20:30,570 –> 00:20:34,290
little bit about the end of the fourth turning on this show. So I do
334
00:20:34,290 –> 00:20:38,040
think. I do believe more and more in historical
335
00:20:38,040 –> 00:20:41,280
cycles. Right, yeah, agreed, same.
336
00:20:41,920 –> 00:20:45,520
And so I do fundamentally believe that in 2025,
337
00:20:45,920 –> 00:20:49,760
when we’re recording this together, that you and
338
00:20:49,760 –> 00:20:53,480
I are at the end of a particular 25 year cycle.
339
00:20:53,480 –> 00:20:57,160
I do believe it is wrapping up. Oh, okay, interesting. And I
340
00:20:57,160 –> 00:21:00,800
think the next cycle is going to.
341
00:21:01,440 –> 00:21:04,880
The players will have similar names or similar
342
00:21:05,040 –> 00:21:08,560
functions. I should maybe say that: similar mindsets, but different labels on
343
00:21:08,940 –> 00:21:12,740
those mindsets. Yeah. Well, no, no, I think even the mindsets are
344
00:21:12,740 –> 00:21:15,900
shifting around. I’m seeing signs.
345
00:21:16,380 –> 00:21:20,060
Mindsets. Mindsets are shifting around because these things
346
00:21:20,060 –> 00:21:23,900
start. And I actually talked about this, interestingly enough, with Tom Libby
347
00:21:23,900 –> 00:21:27,580
on episode 157 about Sitting Bull as well, brought up this point.
348
00:21:28,300 –> 00:21:32,020
But I think mindsets start at the center. I used to think they started
349
00:21:32,020 –> 00:21:35,180
at the edges in the places that we typically associate with culture,
350
00:21:35,600 –> 00:21:39,280
cultural dominance in this country. So LA, New York,
351
00:21:39,440 –> 00:21:42,960
D.C. Right. I think.
352
00:21:43,440 –> 00:21:47,200
And we don't understand a vibe shift. We don't
353
00:21:47,200 –> 00:21:50,240
understand a thing. A thing that’s happened until.
354
00:21:51,680 –> 00:21:55,360
Right. I hate. Hate that term. But we. We don’t understand a thing that’s
355
00:21:55,360 –> 00:21:59,200
happened until we’re outside of it. And so it’s very hard to see
356
00:21:59,200 –> 00:22:02,360
it when we’re in it. Well, it’s just too easy to be like, this is
357
00:22:02,360 –> 00:22:05,560
the right way, because these are the decisions that I’m making and I’ve not died
358
00:22:05,560 –> 00:22:09,380
yet. Right. So it’s like weird. It’s this weird survivorship bias
359
00:22:09,380 –> 00:22:13,180
thing. Oh, man. Beautiful book on this whole idea by.
360
00:22:13,340 –> 00:22:17,180
Is it To Live? Oh, yeah. To Live, yeah. Oh, God, phenomenal book.
361
00:22:17,180 –> 00:22:20,980
Right. And like, as a poker player and like, I spent a whole
362
00:22:20,980 –> 00:22:24,260
lot of time having to get really comfortable with the idea of, like, just because
363
00:22:24,260 –> 00:22:27,700
I made the decision doesn't make it right. Just because
364
00:22:27,700 –> 00:22:30,620
I won the hand doesn’t mean that that’s the way that I should play this
365
00:22:30,620 –> 00:22:33,060
hand again next time and next time and next time and next time and next
366
00:22:33,060 –> 00:22:35,940
time. Right, well. And you’re not. And you’re not. Also not. You’re also trying to
367
00:22:35,940 –> 00:22:39,780
not be blinded by a hindsight heuristic. Well, well. But the first
368
00:22:39,780 –> 00:22:43,180
thing that you have to acknowledge is that, like, hindsight is not the 20/20 thing
369
00:22:43,180 –> 00:22:46,380
that everyone is just going around talking about, like, hey, you know how perfect that
370
00:22:46,380 –> 00:22:50,180
is? Like, let’s acknowledge the fact that, like, we got lucky, right?
371
00:22:50,180 –> 00:22:53,620
In most situations, you know, like most people, like when,
372
00:22:53,780 –> 00:22:57,100
when I talk about, like, poker with people that have never played poker before, and
373
00:22:57,100 –> 00:22:59,660
they’re like. And I’m like, oh, you’re just thinking that this is like, well, you
374
00:22:59,660 –> 00:23:03,020
either won or lost. And that's the only decision-making, right?
375
00:23:03,020 –> 00:23:06,720
Any hand of poker that plays out has, on average, 18
376
00:23:06,800 –> 00:23:10,240
different decision points in it, and every one of those decision points leads to a
377
00:23:10,240 –> 00:23:13,680
different reality at the end of the day. And sales conversations are remarkably similar.
378
00:23:13,920 –> 00:23:17,200
Let me just pitch you. Let me just see if you actually need any of
379
00:23:17,200 –> 00:23:20,800
these things that I do first. And your reality will change completely, right?
380
00:23:21,280 –> 00:23:24,440
You have people and they’re pushing their ideas and their agendas and their. But it’s
381
00:23:24,440 –> 00:23:28,280
the same kind of, like, thing. But you have to think about probabilistic thinking. And
382
00:23:28,280 –> 00:23:31,960
then when I do this, what could happen? What is most likely
383
00:23:31,960 –> 00:23:35,520
to happen and then what never happens because we actually go out and test it,
384
00:23:35,520 –> 00:23:38,720
right? And people don’t do this in sales, but I wish that they did because
385
00:23:38,720 –> 00:23:42,320
people would like salespeople a whole lot more. Right. If we actually, like, tested
386
00:23:42,560 –> 00:23:46,280
in, like, well, tested for benchmark, as opposed to just thinking that benchmarks were just
387
00:23:46,280 –> 00:23:49,840
pushy. But everybody’s sample size is too small. Oh,
388
00:23:50,160 –> 00:23:54,000
yeah. Oh, you know, we can’t talk about. Hold on. We can talk
389
00:23:54,000 –> 00:23:57,520
about that. Like, we need 17 books
390
00:23:57,840 –> 00:24:01,560
on just this topic. And you would still have every one of your
391
00:24:01,560 –> 00:24:05,400
listeners be like, well, hey, Jesan, I made three cold calls. Cold
392
00:24:05,400 –> 00:24:08,000
calling is dead because I didn’t talk to anyone to book an appointment to close
393
00:24:08,000 –> 00:24:11,120
a deal. Oh, did you make them all in the last 10 minutes? Like, I,
394
00:24:11,120 –> 00:24:14,960
I mean, come. I mean, like, sample size and time duration are these two things
395
00:24:14,960 –> 00:24:18,000
that people are not thinking about when they’re thinking about any of these. Any of
396
00:24:18,000 –> 00:24:21,480
these things. And it’s. Right. And so the question you have to ask is at
397
00:24:21,480 –> 00:24:25,280
a cultural level. So we talk about that in business. Yeah. At a
398
00:24:25,280 –> 00:24:28,320
cultural level. Let’s scale this up to culture, right? So
399
00:24:29,420 –> 00:24:33,220
just in the United States. Let's keep the sample size to
400
00:24:33,220 –> 00:24:36,500
the, to the third of the continent that we’re spread across. Let’s just keep it
401
00:24:36,500 –> 00:24:37,900
to that for just a minute. Okay.
402
00:24:39,820 –> 00:24:43,300
350. Approaching
403
00:24:43,300 –> 00:24:46,540
350 million people on this continent. Right.
404
00:24:46,620 –> 00:24:49,340
Okay. Okay.
405
00:24:50,940 –> 00:24:54,660
And it terrifies me, like, like in those moments, you know,
406
00:24:54,660 –> 00:24:58,450
like, It’s. It’s been so interesting to come up here because, like, I
407
00:24:58,450 –> 00:25:01,250
told some friends that I was coming to New Mexico on vacation. Like, oh, man,
408
00:25:01,250 –> 00:25:04,010
where are you going? I was like, well, we’re going. And they were like, why?
409
00:25:04,330 –> 00:25:07,810
And I was like, because I just want to go explore. I don’t know. I
410
00:25:07,810 –> 00:25:11,570
like the state. I think it’s cool. I like mountains, you know, and I like
411
00:25:11,570 –> 00:25:15,050
not having to go all the way to Colorado. So what? And everyone was like,
412
00:25:15,050 –> 00:25:18,610
man, you should just go to Colorado or you should go to the. Like, everyone
413
00:25:18,610 –> 00:25:21,290
has an opinion and an idea. Everything else like this. And I was just like,
414
00:25:21,290 –> 00:25:24,290
cool, I’m gonna go validate this for my own stuff. And you know what? We’re
415
00:25:24,290 –> 00:25:27,950
having an amazing time. Like, my family loves it. Like, my
416
00:25:27,950 –> 00:25:31,230
daughter’s like, you know what? I could switch schools. And my wife was like, we
417
00:25:31,230 –> 00:25:34,590
could be outdoors. Like, everything is going according to plan. And if I listen to,
418
00:25:34,590 –> 00:25:37,550
like, all my friends that are like, man, why don’t you just go to, like,
419
00:25:37,550 –> 00:25:41,390
LA or Vegas? Because I don’t want to, guys. You know, so we’re
420
00:25:41,390 –> 00:25:44,710
here and everyone is so delightfully weird in a way that
421
00:25:45,110 –> 00:25:48,910
doesn’t really give a whole lot of space for. Right? Yeah, just
422
00:25:48,910 –> 00:25:51,910
call that what it is. Sure. And. But then
423
00:25:52,470 –> 00:25:56,070
it’s just this whole new perspective of all of this stuff, like, coming
424
00:25:56,070 –> 00:25:59,730
in and, like, you’re allowed to do what you want, which is really nice and
425
00:25:59,730 –> 00:26:03,570
cool. But this is why I don’t. This is why I
426
00:26:03,570 –> 00:26:07,250
don’t work B2C, honestly. Right? There you go. Right? Because there’s
427
00:26:07,250 –> 00:26:10,970
too many Cs, and every C has
428
00:26:10,970 –> 00:26:14,490
their own opinion, and every C thinks that their thing is
429
00:26:14,490 –> 00:26:18,210
unique to them, which I guess it is to them, but it’s not.
430
00:26:19,170 –> 00:26:22,250
And the thing is, when you’re a consumer, right, and you’re just thinking about yourself
431
00:26:22,250 –> 00:26:24,610
and your own needs and everything else like this, it’s super easy to be like,
432
00:26:24,610 –> 00:26:28,240
you know what? Improvement can wait. Right? The diet. Oh, yeah. The change starts
433
00:26:28,240 –> 00:26:30,920
next week and everything else like that. And. Right. But like, in a business, there
434
00:26:30,920 –> 00:26:33,680
is. You’re supposed to be working on something, right?
435
00:26:34,480 –> 00:26:37,920
And that’s, that’s why I don’t, like, I don’t want to go rattle the cages
436
00:26:37,920 –> 00:26:41,440
of people that are not ready for it to be rattled. Right? And that’s a
437
00:26:41,440 –> 00:26:43,880
big thing of how I teach people of, like, if you’re trying to force your
438
00:26:43,880 –> 00:26:47,480
way in, what happens after they say yes? Now you got to
439
00:26:47,480 –> 00:26:51,240
deliver under duress in high pressure environments. And how excited
440
00:26:51,240 –> 00:26:53,880
are you to do that? Right? As opposed to going and finding someone and being
441
00:26:53,880 –> 00:26:57,360
like, hey, I had an idea. And they say, hey, that sounds like an interesting
442
00:26:57,360 –> 00:27:01,200
idea. I’d like to explore that with you. These are two different realities, right? And
443
00:27:01,200 –> 00:27:04,840
you can explore either one. And it’s going to lead to you hating your
444
00:27:04,840 –> 00:27:07,840
business, hating your life, hating sales or anything else like this or
445
00:27:08,640 –> 00:27:12,400
it being manageable. Well, and that gets to this.
446
00:27:12,560 –> 00:27:16,280
Well, that gets back to my point about cycles, because we don't see it at
447
00:27:16,280 –> 00:27:20,000
a sociocultural level, because it’s so complicated and the
448
00:27:20,000 –> 00:27:22,400
sample size is way the hell bigger than for the individual.
449
00:27:24,470 –> 00:27:28,230
Just on our small sample size of the continent of North America and
450
00:27:28,390 –> 00:27:32,030
the population of the United States, okay, that is a hyper gigantic
451
00:27:32,030 –> 00:27:35,430
sample size for most people. Right. And so
452
00:27:36,550 –> 00:27:40,310
the dynamic at a sociocultural level of cycle shifting, you
453
00:27:40,310 –> 00:27:44,150
don’t see that until you’re outside of it, on the other end of it.
454
00:27:44,230 –> 00:27:47,990
And so when I say things like the chaos
455
00:27:48,070 –> 00:27:51,510
that started with September 11th is almost over
456
00:27:51,750 –> 00:27:55,210
or is over and now we have to prepare for something else, I will tell
457
00:27:55,210 –> 00:27:59,010
you, 99% of people look at me like I’m crazy because
458
00:27:59,010 –> 00:28:02,850
I’m not saying, I’m not saying that Aldous Huxley’s
459
00:28:02,850 –> 00:28:06,610
Brave New World hedonistic perspective on whatever pleasure
460
00:28:07,170 –> 00:28:10,930
isn’t still going to be around. I’m also not saying that
461
00:28:10,930 –> 00:28:13,770
Philip K. Dick, to go back to the book for just a second, that Philip
462
00:28:13,770 –> 00:28:16,530
K. Dick’s perspective on
463
00:28:17,650 –> 00:28:21,250
robots, I mean, we just read from the Figure CEO who wants to have a
464
00:28:21,250 –> 00:28:24,890
hundred thousand humanoid robots walking around by 2030. I’m not
465
00:28:24,890 –> 00:28:28,630
saying that that isn't a possibility. And I'm not saying there won't be a whole
466
00:28:28,630 –> 00:28:32,150
parade of horrors that will open up from that. I think there will be. But
467
00:28:32,150 –> 00:28:35,550
I think that the ways in which those are going to be dealt with
468
00:28:36,270 –> 00:28:39,870
are not going to be addressed or are not. The ways in which those are
469
00:28:39,870 –> 00:28:42,070
going to be dealt with are not going to be the same ways with the
470
00:28:42,070 –> 00:28:45,550
same mindset as someone in the last historical cycle, the last
471
00:28:45,550 –> 00:28:49,030
historical spring in America, which was, you know, the
472
00:28:49,030 –> 00:28:52,870
1940s, the end of World War II, going into the mid-60s. It’s just not.
473
00:28:52,870 –> 00:28:56,630
You're not going to have Silent Generation people who saw their buddies' heads blown
474
00:28:56,630 –> 00:29:00,410
off in Okinawa, dealing with robots, bots. Yeah, it’s very.
475
00:29:02,330 –> 00:29:06,090
And that’s what we don’t understand about, about historical cycles. That’s what we don’t get.
476
00:29:06,090 –> 00:29:09,370
It’s going to be people like you and me and our kids dealing with
477
00:29:09,370 –> 00:29:12,890
humanoid robots. Well, like, man, there’s,
478
00:29:13,050 –> 00:29:16,730
I, I, I might have lived a whole life in the, in the time that
479
00:29:16,730 –> 00:29:19,250
you were talking through that, because I just started thinking about everything. You know, I
480
00:29:19,250 –> 00:29:20,010
started thinking about
481
00:29:23,050 –> 00:29:26,790
that, like, this all starts with, like, innovation, like, what can we
482
00:29:26,790 –> 00:29:30,630
do? What is there, right? And to be clear, I’d not
483
00:29:30,630 –> 00:29:34,390
read this book before, and I have not read the Asimov books either, right? So
484
00:29:34,390 –> 00:29:38,030
some of the stuff I’m getting to appreciate, kind of new, right? My path
485
00:29:38,030 –> 00:29:41,150
into science fiction fantasy was, like, through, like, Gerald, which is like,
486
00:29:41,630 –> 00:29:45,070
you know, in, like, Heinlein and stuff like this. Heinlein. Heinlein. However you want to
487
00:29:45,070 –> 00:29:48,070
say it, it doesn’t matter. But I was reading all that stuff, right? And so
488
00:29:48,070 –> 00:29:50,910
I go to my brother and who, who was an Asimov fan, and I was
489
00:29:50,910 –> 00:29:53,650
like, hey, should I be reading Asimov? He’s like, no, you’re too young. Enjoy all
490
00:29:53,650 –> 00:29:56,290
this other stuff. And then I just haven’t ever come back to it. But now
491
00:29:56,290 –> 00:29:59,650
I get to experience all this as, like, an adult who’s, like, also living in
492
00:29:59,650 –> 00:30:03,250
versions of this, right? And it’s delightful because it’s not
493
00:30:03,250 –> 00:30:06,890
exactly that. It’s not exactly what he’s talking about in the book. But,
494
00:30:06,890 –> 00:30:10,570
like, I could see that, right? A couple of decisions going different ways,
495
00:30:10,570 –> 00:30:14,170
right? Like in this, in this rich tapestry that we’re talking about, decisive
496
00:30:14,170 –> 00:30:18,010
moments and everything. A couple little things going the other way. Like, you know,
497
00:30:18,170 –> 00:30:21,810
it’s not crazy, right? But then what I think he does such a
498
00:30:21,810 –> 00:30:24,250
good job of in the book is just, like,
499
00:30:25,370 –> 00:30:29,170
how we would justify everything that happens in
500
00:30:29,170 –> 00:30:32,970
this book. Oh, yeah. You know what I’m saying? Like, that, to me, is the
501
00:30:32,970 –> 00:30:36,610
most fascinating part of it. And this, this
502
00:30:36,610 –> 00:30:39,450
very interesting thing. It’s like all the nerds are worried because the nerds are the
503
00:30:39,450 –> 00:30:42,970
people who read this book, these books, right? But they’re also the people that are
504
00:30:42,970 –> 00:30:45,610
going to be pushing to make them better and to make them improved and everything
505
00:30:45,610 –> 00:30:48,800
else like this. And then everyone else who’s like, oh, it’s not going to be
506
00:30:48,800 –> 00:30:52,480
a problem, are going to inherit this problem because
507
00:30:52,480 –> 00:30:55,000
they’re also going to be the people that are pushing for all the advancement. I
508
00:30:55,000 –> 00:30:56,920
want it better. I want it better. I want it to do more. I want
509
00:30:56,920 –> 00:30:59,720
it to do more. Okay, cool. And then we’re going to. And then inadvertently, it’s
510
00:30:59,720 –> 00:31:03,560
going to tip the scales, and then there’s money behind it, and then
511
00:31:03,719 –> 00:31:07,160
it’s just like, in the book to where they’re trying to, like, see, like, what
512
00:31:07,160 –> 00:31:10,360
can they get away with? And it’s such an interesting. And it’s going to happen.
513
00:31:10,440 –> 00:31:14,200
It’s absolutely going to happen. This is my first. This is my first
514
00:31:14,200 –> 00:31:17,940
beat on this book. The nerds
515
00:31:18,020 –> 00:31:21,740
want the technology. So the nerds all read Martian Chronicles when they
516
00:31:21,740 –> 00:31:25,300
were kids. The nerds all read. All
517
00:31:25,300 –> 00:31:29,020
read Philip K. Dick. And they all. Not just. Not just
518
00:31:29,020 –> 00:31:32,540
Do Androids Dream of Electric Sheep? I mean, they read The Three Stigmata of Palmer
519
00:31:32,540 –> 00:31:36,100
Eldritch. They read A Scanner Darkly. They read all of it, right?
520
00:31:37,140 –> 00:31:40,940
And they mixed all this together with, to your point, the Trillion
521
00:31:40,940 –> 00:31:44,540
Dollar Company. Like, even Elon has said he read science fiction. Peter Thiel,
522
00:31:44,540 –> 00:31:48,070
Sam Altman, all these guys, right? And they
523
00:31:48,070 –> 00:31:51,390
learned precisely the wrong lessons.
524
00:31:52,910 –> 00:31:56,710
And you can see this, and this is my first point. You can see
525
00:31:56,710 –> 00:31:58,910
this in the fact that there are no Black people in The Jetsons.
526
00:32:02,030 –> 00:32:04,910
So you had sent.
527
00:32:06,190 –> 00:32:09,190
Okay, so for those who listen to this, right, I'm going to provide a little
528
00:32:09,190 –> 00:32:13,030
bit of background context on that. Jesan sends
529
00:32:13,030 –> 00:32:15,910
over a script, right, with kind of his ideas and some of his thinking about
530
00:32:15,910 –> 00:32:19,620
the book and everything. And this is actually very helpful when you’re a guest, right,
531
00:32:19,620 –> 00:32:22,900
because you kind of know where he’s going and everything. I don’t always read these
532
00:32:22,900 –> 00:32:25,900
things the moment he sends them over because I like to be a little unprepared
533
00:32:25,900 –> 00:32:29,100
because I like to be real on these things whenever I can.
534
00:32:29,500 –> 00:32:31,900
And so I was reading this last night and I saw this line and I
535
00:32:31,900 –> 00:32:35,660
was like, interesting, because the day before
536
00:32:35,660 –> 00:32:39,380
yesterday we go to the local museum here, right? We’re big museum nerds.
537
00:32:39,380 –> 00:32:43,180
And so we’re hanging out in there and there’s this huge science fiction area, right?
538
00:32:43,180 –> 00:32:47,020
Because we’re in the land of nuclear warfare and everything else
539
00:32:47,020 –> 00:32:50,650
like this, you know, and they're talking about, does the science fiction
540
00:32:51,690 –> 00:32:55,530
inform the future or does the guessing inform
541
00:32:55,770 –> 00:32:59,490
the future kind of thing, right? Like, like chicken or the egg situation.
542
00:32:59,490 –> 00:33:02,410
And it was very interesting because he was talking about a lot of these ideas,
543
00:33:02,410 –> 00:33:05,290
right? And
544
00:33:09,130 –> 00:33:12,370
then the money just, like, ruins everything, which is like a very weird thing to
545
00:33:12,370 –> 00:33:15,770
say. Like as an entrepreneurial person and a guy running a business, and
546
00:33:15,930 –> 00:33:19,650
I’ve created this thing and I’m putting myself out there and everything. And I
547
00:33:19,650 –> 00:33:23,450
do have goals and aspirations of building this bigger, and I’ve got monetary goals and
548
00:33:23,450 –> 00:33:26,890
everything, but I’m also just hyper aware of how much money just
549
00:33:27,370 –> 00:33:31,130
blows everything out of proportion and just ruins things, right?
550
00:33:31,690 –> 00:33:35,369
And I also realize that it’s there, right? Like,
551
00:33:35,369 –> 00:33:38,410
you gotta, you gotta, you gotta. I mean, innovation.
552
00:33:38,890 –> 00:33:41,810
Innovation doesn’t happen if there’s not a way to monetize it. If there’s not those
553
00:33:41,810 –> 00:33:45,040
ways of like, okay, what, what are these things? Then it’s just like,
554
00:33:45,670 –> 00:33:49,070
then we have to have the really bad arguments of, like, does the science matter, right?
555
00:33:49,070 –> 00:33:52,150
At least, at least now we can, like, you know, make a big push about
556
00:33:52,150 –> 00:33:55,990
science and innovation because we can tie it to capitalism and growth and cool things
557
00:33:55,990 –> 00:33:59,430
like that. But then all the stuff gets in the way. That gets in the
558
00:33:59,430 –> 00:34:03,150
way. Your scientists were so worried about whether or not they could, they
559
00:34:03,150 –> 00:34:06,950
didn’t stop to think about whether or not they should. Yeah, great line
560
00:34:06,950 –> 00:34:10,150
from Jurassic Park. Freaking Jeff Goldblum. Jeff Goldblum.
561
00:34:11,600 –> 00:34:15,400
There’s no one. There’s no. Okay, there’s no one better for that part. No one
562
00:34:15,400 –> 00:34:18,640
better. You're never going to sell me on the idea that there's any other actor on
563
00:34:18,640 –> 00:34:21,320
the planet who would have done that part better. That’s why they brought it back,
564
00:34:21,320 –> 00:34:24,240
like five times in all of the ridiculous sequels.
565
00:34:25,280 –> 00:34:29,040
And they need to stop. Just, I’ve had enough of these
566
00:34:29,440 –> 00:34:32,880
mother effing dinosaurs on this mother effing plane. I’ve really had enough.
567
00:34:34,000 –> 00:34:37,480
The other day, and it was like. It was like while they, While they were
568
00:34:37,480 –> 00:34:40,440
actually kind of talking about the fact that they’re really good at recreating dinosaurs, what
569
00:34:40,440 –> 00:34:44,200
they’ve really gotten really good at is recreating the next worst
570
00:34:44,280 –> 00:34:46,040
dinosaur movie, of course,
571
00:34:51,160 –> 00:34:54,760
which is just true. Like, like, oh, maybe this one will bring it. But
572
00:34:54,760 –> 00:34:57,720
no, no, no. Chris Pratt again. No,
573
00:34:58,760 –> 00:35:02,280
not happening. It’s not happening. And we don’t need her running around in her, like,
574
00:35:02,280 –> 00:35:06,120
high heels and a somehow miraculously white dress that never
575
00:35:06,120 –> 00:35:09,400
got a drop. I can’t, I can’t even. I can’t even with it. I can’t
576
00:35:09,400 –> 00:35:11,500
even. Stop it, Hollywood. You go home, you're drunk.
577
00:35:15,820 –> 00:35:19,340
No, I think you’re. I think you’re onto something. Because the thing about it is.
578
00:35:19,340 –> 00:35:23,180
So when I go to the museums, right? Or when I look at
579
00:35:23,180 –> 00:35:26,700
the cultural landscape surrounding all of these innovations and I
580
00:35:26,780 –> 00:35:30,380
listen to the kinds of discussions we’re having now. Like, I read an article
581
00:35:30,380 –> 00:35:33,900
today on Substack from a guy who, who is a
582
00:35:33,900 –> 00:35:37,610
philosophy major, was a philosophy major in college, like, 40 years
583
00:35:37,610 –> 00:35:40,370
ago, and he’s bringing up.
584
00:35:41,970 –> 00:35:45,650
And his point is valid, he’s bringing up how AI
585
00:35:46,450 –> 00:35:49,650
and large language models don’t have ethics
586
00:35:50,210 –> 00:35:54,010
built in and are now beginning to behave in ways
587
00:35:54,010 –> 00:35:57,450
that, if human beings were behaving in that way, we would define as
588
00:35:57,450 –> 00:36:01,250
evil, quote unquote. Right, okay. Yeah. Now I’m
589
00:36:01,250 –> 00:36:05,010
not going to get into the whole, does an AI have autonomy? Does AI
590
00:36:05,010 –> 00:36:07,490
have free will? I’m not going to get into any of that. I don’t care
591
00:36:07,490 –> 00:36:11,130
about any of that. My point with bringing this up is
592
00:36:13,770 –> 00:36:16,170
we built the AI. That’s fine.
593
00:36:17,770 –> 00:36:21,210
The people who built it were influenced
594
00:36:21,370 –> 00:36:25,050
by Asimov and Bradbury and
595
00:36:25,210 –> 00:36:29,010
Gibson and Heinlein and Dick. They
596
00:36:29,010 –> 00:36:32,510
were influenced by all those guys. Great, Cool. They learned all the wrong
597
00:36:32,510 –> 00:36:36,070
lessons because they didn’t go to philosophy classes in
598
00:36:36,070 –> 00:36:39,590
college. They went to computer engineering classes in college and
599
00:36:39,590 –> 00:36:43,390
finance classes in college. Right, okay. Or if they did go to
600
00:36:43,390 –> 00:36:46,990
those philosophy classes, they immediately forgot everything the second
601
00:36:46,990 –> 00:36:50,030
they were out of there because it didn’t mean anything to them.
602
00:36:53,150 –> 00:36:56,910
And so when I walk into a museum or I read a
603
00:36:56,910 –> 00:37:00,720
book like Do Androids Dream of Electric
604
00:37:00,720 –> 00:37:04,560
Sheep? I look at it and I go, the first thing that hits me
605
00:37:04,560 –> 00:37:08,280
is, for all the money in the world, it doesn’t predict
606
00:37:08,280 –> 00:37:11,760
the civil rights movement. Oh, yeah,
607
00:37:12,080 –> 00:37:15,600
well, like. And so there is a counter. So I take the line from Gandalf
608
00:37:15,600 –> 00:37:19,160
in Lord of the Rings. There are other forces than just the evil ones in
609
00:37:19,160 –> 00:37:22,640
this world, 100%. And all those other forces
610
00:37:23,280 –> 00:37:26,960
are battling.
611
00:37:27,670 –> 00:37:31,430
And maybe this is a zero sum statement, I don’t know, but they’re battling
612
00:37:31,430 –> 00:37:34,550
in a cultural space for the minds and hearts of people.
613
00:37:34,950 –> 00:37:38,790
And in general, what wins the minds and hearts of people is not technology.
614
00:37:39,750 –> 00:37:42,550
In general, what wins the minds and hearts of people are
615
00:37:43,430 –> 00:37:46,590
challenges and issues, which is what Dick gets to, I think, in his book so
616
00:37:46,590 –> 00:37:49,510
brilliantly. Challenges and issues of identity.
617
00:37:50,150 –> 00:37:53,910
And that is something that the nerds have no answer for because they don’t
618
00:37:54,150 –> 00:37:57,260
care enough to think about it. Yeah,
619
00:37:57,500 –> 00:38:01,260
exactly. I, okay, so I,
620
00:38:01,260 –> 00:38:03,900
so I work in sales, right? You know this. But I don’t know if people
621
00:38:03,900 –> 00:38:06,500
who listen to your show do or not. So I work in sales. I'm a sales
622
00:38:06,500 –> 00:38:10,260
consultant and I get brought in to talk about sales improvement, sales coaching and sales
623
00:38:10,260 –> 00:38:14,020
training. And right now everyone is very enamored with the idea of an
624
00:38:14,020 –> 00:38:17,700
AI-driven salesperson, right? And if you're really into sales, then you might
625
00:38:17,700 –> 00:38:21,180
even be thinking about the term AI SDR. Right? And so an
626
00:38:21,180 –> 00:38:24,590
AI SDR is essentially AI-generated outbound
627
00:38:24,590 –> 00:38:28,430
outreach robocalls, right? And if we look at all
628
00:38:28,430 –> 00:38:32,110
the warranty calls that everyone just loves to get, right,
629
00:38:32,670 –> 00:38:36,270
what happens there? Right? Let’s look at what happens
630
00:38:36,350 –> 00:38:39,470
when it’s annoying enough because
631
00:38:39,870 –> 00:38:43,430
now all cell phones have the ability to route any kind of
632
00:38:43,430 –> 00:38:47,110
probable spam risk over to spam, right? So what happens
633
00:38:47,110 –> 00:38:50,910
is we don’t get. We don’t fix the problem. We just create
634
00:38:50,910 –> 00:38:54,510
a really bad trash bin to put the furthest
635
00:38:54,510 –> 00:38:58,350
versions of it, right? Because think about it this way. If salespeople were taught to
636
00:38:58,350 –> 00:39:01,990
just be normal, consultative, solution focused and
637
00:39:01,990 –> 00:39:05,750
not held to such stupid goals that it puts you into a place of emotional
638
00:39:05,750 –> 00:39:09,390
need, we wouldn’t have as many people trying to
639
00:39:09,390 –> 00:39:13,230
run game, trying to manipulate, obfuscate. You know, all these things,
640
00:39:13,230 –> 00:39:16,280
all the bad labels and everything else, and we would just have people going around,
641
00:39:16,440 –> 00:39:19,480
hey, do you need my help? Hey, we do these things. Hey, here’s how we
642
00:39:19,480 –> 00:39:21,960
serve our clients and customers. Do you need these things? But
643
00:39:23,240 –> 00:39:27,000
people don’t. People only recognize
644
00:39:27,000 –> 00:39:30,560
the bad aspects of selling, right? Every one of these people who hates the sales
645
00:39:30,560 –> 00:39:34,280
people, they have probably a financial advisor that they love working
646
00:39:34,280 –> 00:39:38,120
with, right? Who’s a salesperson. They probably have bought a home from a
647
00:39:38,120 –> 00:39:41,770
real estate agent who’s a salesperson and a doctor who’s a salesperson,
648
00:39:41,930 –> 00:39:45,530
right? And so what, they’re just thinking about, they’re thinking about
649
00:39:45,610 –> 00:39:49,290
the person in Walmart working the Charter table
650
00:39:49,290 –> 00:39:52,290
right now. Like, like this is the new version of it. Because now even the
651
00:39:52,290 –> 00:39:55,930
car lot guys are getting a broader, wider pass because of Carvana and everything. But,
652
00:39:55,930 –> 00:39:59,690
like, look at this, right? The technology’s coming along to build
653
00:39:59,850 –> 00:40:02,930
room to avoid the parts that you don't want or that you would
654
00:40:02,930 –> 00:40:06,540
hate. So you don’t want to deal with a salesperson. Cool. Carvana, right? But what
655
00:40:06,540 –> 00:40:10,340
happens is when you let the nerds try to engineer the path, right?
656
00:40:10,340 –> 00:40:13,700
And we stop thinking about the humans on the other end of this thing
657
00:40:14,260 –> 00:40:17,140
is when we start getting very misaligned on
658
00:40:19,060 –> 00:40:22,740
should versus what does work, right? And
659
00:40:22,820 –> 00:40:26,380
if you’re around this stuff, right, There’s a, there’s, there’s people talking about this
660
00:40:26,380 –> 00:40:28,900
idea that, like, you can have a sales process, which is the way that you
661
00:40:28,900 –> 00:40:32,560
want to sell. But, hey, let’s appreciate their journey as the buyer,
662
00:40:33,270 –> 00:40:36,430
right? You’ve never even heard of a CRM technology before. You don’t even know that
663
00:40:36,430 –> 00:40:39,390
you need one. Let me. Why would I show up on your doorstep and talk
664
00:40:39,390 –> 00:40:43,110
about the differences in five different top, very,
665
00:40:43,190 –> 00:40:46,830
very expensive CRMs in the space? Why would I? That doesn't make any
666
00:40:46,830 –> 00:40:50,350
sense. But this is what people are doing all the time, because they're very smart
667
00:40:50,350 –> 00:40:54,030
nerds and they’re just thinking about the math of the problem, but they’re not
668
00:40:54,030 –> 00:40:57,790
thinking about the individual chaos machine that is the human being and the range of
669
00:40:57,790 –> 00:41:01,250
how they’re going to respond and react. Well, they are in one way, and this
670
00:41:01,250 –> 00:41:04,570
is something that I’ve said. I’ve even posted it on Facebook. I think you might
671
00:41:04,570 –> 00:41:08,370
have commented on it sometime within the last year, maybe year and a
672
00:41:08,370 –> 00:41:12,050
half. Because it’s something that. It’s a
673
00:41:12,050 –> 00:41:15,529
major insight that I think we all miss. And let me be very clear. I
674
00:41:15,529 –> 00:41:19,210
like the nerds. The nerds are the reason I'm here. Same. Without the
675
00:41:19,210 –> 00:41:21,570
nerds, this podcast wouldn’t be the people. Right?
676
00:41:22,850 –> 00:41:26,290
Yeah. That’s the most important part of this thing, is that without
677
00:41:26,450 –> 00:41:29,890
sales training, for anyone who’s listening to this, I mean,
678
00:41:30,610 –> 00:41:34,330
I thought that I was going to sales training to learn everything
679
00:41:34,330 –> 00:41:37,810
that everybody hates about salespeople. Right? Right. I thought, I’m here
680
00:41:38,370 –> 00:41:41,330
and my coach is like, hey, what if you didn’t even try to play that
681
00:41:41,330 –> 00:41:44,290
game? What if you just found the people who wanted. So, like, I’m. I’m even
682
00:41:44,290 –> 00:41:47,970
learning from these things, but culturally, I was being fed this
683
00:41:47,970 –> 00:41:51,010
idea that I can do it because other people can do it, but I’m going
684
00:41:51,010 –> 00:41:54,770
to use my powers for good. Which makes you the tyrant, right? Exactly
685
00:41:55,090 –> 00:41:58,590
right. That’s right. You’re the worst person. These nerds who were like, well, you know
686
00:41:58,590 –> 00:42:02,430
what? I don’t want to do outreach. I don’t want to bother people. So
687
00:42:02,430 –> 00:42:06,230
I’m going to create an AI bot. And then because everyone hates talking
688
00:42:06,230 –> 00:42:09,190
to AI bots, it only works, like, one time in a million. So then I
689
00:42:09,190 –> 00:42:12,710
got to build a machine that’ll let me do this 7 million times a day.
690
00:42:13,030 –> 00:42:16,790
Right. Like, and we’re just working on the problem from the wrong ass
691
00:42:16,870 –> 00:42:20,630
end of the equation. As opposed to, like, what do you really like?
692
00:42:21,030 –> 00:42:23,950
Well, the reason. The reason we’re working on it from the wrong end of the
693
00:42:23,950 –> 00:42:25,270
equation is because.
694
00:42:28,850 –> 00:42:32,210
Marketers, Engineers, Marketers ruin everything. Well, marketers is one reason,
695
00:42:32,370 –> 00:42:35,490
but engineers, I don’t disagree with that. I’m a marketer myself. And we do. We
696
00:42:35,490 –> 00:42:37,490
ruin everything. Engineers
697
00:42:39,090 –> 00:42:42,930
view all of the friction points. This is the point I made years
698
00:42:42,930 –> 00:42:46,130
ago on Facebook. They view all of those friction points
699
00:42:46,530 –> 00:42:49,810
as a bug to be algorithmed away
700
00:42:50,690 –> 00:42:54,400
rather than as a feature. So here’s the thing. A human
701
00:42:54,400 –> 00:42:57,800
being wants to buy, to your point, about real estate or about
702
00:42:58,120 –> 00:43:01,840
charter cable or about cars. I don’t care what it is. I used to sell
703
00:43:01,840 –> 00:43:05,480
real estate. I’ve done a lot of those
704
00:43:05,480 –> 00:43:09,320
things. At the end of the day, you can automate the
705
00:43:09,320 –> 00:43:13,120
entire process if you would like. You can, you absolutely can. And human beings,
706
00:43:13,120 –> 00:43:16,000
we’re trying to figure this out for a while now. Oh, I know, they’re all
707
00:43:16,000 –> 00:43:19,850
clients. Right. But at the point
708
00:43:19,850 –> 00:43:23,690
of sale, a human being still wants to see
709
00:43:23,690 –> 00:43:27,410
another human being. Not a humanoid robot, not.
710
00:43:28,530 –> 00:43:32,290
Not an empathy box. They want to see an actual human being. You think? Okay,
711
00:43:32,290 –> 00:43:34,570
you think it’s two points? Okay, I think it’s two points. I think it’s the
712
00:43:34,570 –> 00:43:38,290
beginning. Right. Because we did. I was on a project a
713
00:43:38,290 –> 00:43:42,130
couple of years ago and this was, you know, B2B technology, all this other
714
00:43:42,130 –> 00:43:45,290
stuff. Right. You know, doesn’t really matter for this conversation, but we can talk about
715
00:43:45,290 –> 00:43:48,290
it if you want. Sure. And they were doing all automated
716
00:43:48,870 –> 00:43:52,630
outbound sequences. Everything is in a. Everything is done by machine and automations
717
00:43:52,630 –> 00:43:56,430
and everything. Nothing is done by hand, right. So I get brought in because
718
00:43:56,430 –> 00:43:59,870
they’re not hitting their goals. And so I’m like, okay, what is our process? What
719
00:43:59,870 –> 00:44:02,470
is our sequence? How do. How are we going to market? What are we doing
720
00:44:02,470 –> 00:44:06,150
here? And the phone call was like step seven,
721
00:44:06,470 –> 00:44:09,990
right? So there’s like three LinkedIn things and all this email and everything else like
722
00:44:09,990 –> 00:44:13,790
this. And the phone call is happening so late. Now I’m an introverted
723
00:44:13,790 –> 00:44:16,890
person, but I have to ask the question because I’m trained in sales, why is
724
00:44:16,890 –> 00:44:20,690
the phone conversation happening so late in the sequence? Why aren’t we leading with this?
725
00:44:20,690 –> 00:44:24,250
And they say, well John, it’s different for us. And I say,
726
00:44:24,250 –> 00:44:27,370
okay, please tell me more. Because everyone says this, please tell me why yours is
727
00:44:27,370 –> 00:44:30,930
different, right? And they say, well John, we work with major, major huge
728
00:44:30,930 –> 00:44:34,610
corporations. We’re in fin tech, financial technology, right?
729
00:44:34,610 –> 00:44:38,130
Everyone has a phone tree and everyone has a voicemail. And I’m like, cool.
730
00:44:38,370 –> 00:44:42,140
And you know. And they don’t have an answer for me. And
731
00:44:42,140 –> 00:44:45,660
I’m like, okay. I’m the nice coach. And I’m like, okay guys,
732
00:44:46,540 –> 00:44:50,380
so you guys think that if we do enough volume with your approach, we’re
733
00:44:50,380 –> 00:44:53,220
going to hit the goal? Absolutely, John. We just got to do enough volume. Cool.
734
00:44:53,220 –> 00:44:56,940
What does that volume look like? We have no idea. I’m like, okay, great.
735
00:44:56,940 –> 00:45:00,060
You have a month to find the volume. And then if we don’t find the
736
00:45:00,060 –> 00:45:03,620
volume, we’re going to do it my way. Because your leadership is
737
00:45:03,620 –> 00:45:06,020
concerned about y'all's way. Is that, does that sound fair? I'm going to give you
738
00:45:06,020 –> 00:45:08,840
some time. And they were like, yeah, of course they miss the goal. First thing,
739
00:45:08,840 –> 00:45:12,560
we do calls first. And they’re like, john, it’s just a voicemail.
740
00:45:12,560 –> 00:45:16,240
I’m like, cool. Bridge to another channel, right? Stop expecting
741
00:45:16,240 –> 00:45:19,680
a sale. Start a relationship, right? And so I’m like, hey,
742
00:45:19,840 –> 00:45:23,559
this is me. We’ve never had a conversation before. It was reaching out about these
743
00:45:23,559 –> 00:45:26,800
situations we hear about all the time. You’ll need to call me back. I’m going
744
00:45:26,800 –> 00:45:29,560
to send you an email to see if this is even worth the conversation or
745
00:45:29,560 –> 00:45:32,080
not. And you know what happened? It was like a
746
00:45:32,080 –> 00:45:35,720
1400% increase in their booking rate, which
747
00:45:35,720 –> 00:45:39,560
sounds really, really crazy, but like, when you're at one, you know, so
748
00:45:39,560 –> 00:45:43,360
like any growth is really big, you know, when you're starting from, like, nothing.
749
00:45:43,440 –> 00:45:47,160
But it was just like. And I talked to more people that want to
750
00:45:47,160 –> 00:45:50,920
take that approach. They only want to put their humanity in there when they, when
751
00:45:50,920 –> 00:45:54,760
they see that there’s an opening in interest in sales. If
752
00:45:54,760 –> 00:45:57,280
you’re going to do it right, is about creating that interest. And if you don’t
753
00:45:57,280 –> 00:46:01,080
know how to create interest the right way, you’re stuck running this weird volume timing
754
00:46:01,080 –> 00:46:04,200
game that all these guys are trying to do out over here. So
755
00:46:04,200 –> 00:46:07,950
human first, to show that you’re a human and have some
756
00:46:07,950 –> 00:46:11,750
human connection so you actually get seen as a human first before we get
757
00:46:11,750 –> 00:46:14,230
boxed in as a salesperson. Oh, you just want to sell me. You just want
758
00:46:14,230 –> 00:46:16,990
to close me. You’re just asking these questions because you don’t care if I show
759
00:46:16,990 –> 00:46:20,710
you that I’m human first, I then get seen as a human first. And
760
00:46:20,710 –> 00:46:24,310
then if I have something to show you, we can talk about it. But all
761
00:46:24,310 –> 00:46:26,870
these guys that are trying to be like, okay, let’s just keep this brass tax.
762
00:46:26,870 –> 00:46:30,590
I’m the wolf of not Wall street, but Electric Street. And
763
00:46:30,590 –> 00:46:33,030
we’re just going to do SEO for you and everything. You’re just going to buy
764
00:46:33,030 –> 00:46:35,320
these things from me. But just brass tacks, do you want this thing or not?
765
00:46:36,350 –> 00:46:39,990
You’re never gonna find the volume ever, ever, ever,
766
00:46:39,990 –> 00:46:43,750
ever, ever. And like, I gotta say this just one more time
767
00:46:43,750 –> 00:46:47,310
because I am the most systematic laboratory nerd on the
768
00:46:47,310 –> 00:46:50,950
planet. And I’m still telling you that like leaning into the conversation has a,
769
00:46:50,950 –> 00:46:54,670
an exponential, quantifiable impact on your lift. And
770
00:46:55,230 –> 00:46:58,790
you’re just scared, which is why you don’t want to do it. That’s why all
771
00:46:58,790 –> 00:47:01,950
these people are trying to build these massive engines, because they’re scared and they’re fearful
772
00:47:01,950 –> 00:47:04,750
and they don’t want to change. And that’s exactly what is going to be the
773
00:47:04,750 –> 00:47:07,800
problem that’s going to lead to all the problems with all these bots.
774
00:47:08,440 –> 00:47:11,680
Okay, let’s talk about the bots. I’m glad you said that at the end. Let’s
775
00:47:11,680 –> 00:47:15,160
talk about the bots. Talk about the bots. Let’s talk about the bots. So.
776
00:47:15,880 –> 00:47:18,480
And actually, we should probably talk about a little bit about the. A little bit
777
00:47:18,480 –> 00:47:21,080
about the books. Let’s talk a little about the book. So this is your first
778
00:47:21,080 –> 00:47:24,840
touch on this. I’ve seen
779
00:47:24,840 –> 00:47:28,680
both Blade Runner movies, both Blade Runner and Blade Runner 2049.
780
00:47:31,890 –> 00:47:35,570
Pick from that what you will, but I’ve seen both of them
781
00:47:36,770 –> 00:47:39,330
and I’ve. I’ve drawn certain conclusions
782
00:47:40,370 –> 00:47:44,130
from them. And so seeing the film and then
783
00:47:44,130 –> 00:47:47,850
touching on the Philip K. Dick book, I hadn’t really, really read
784
00:47:47,850 –> 00:47:50,970
it. Okay, so you had not read the book, but you had seen them. I
785
00:47:50,970 –> 00:47:54,490
had seen the movies. Right. What was your perspective on the movies? Just seeing the
786
00:47:54,490 –> 00:47:58,250
movies without reading the material. So just seeing the movies without reading the
787
00:47:58,250 –> 00:48:00,990
material. Blade Runner as a film.
788
00:48:01,790 –> 00:48:05,550
Well, two things. One, Blade Runner as a film could never be made today.
789
00:48:05,790 –> 00:48:09,430
And that’s why Blade Runner 2049. Well, number one, that’s why
790
00:48:09,430 –> 00:48:12,750
Blade Runner’s sequel took so long to get up off the. Off the deck.
791
00:48:13,150 –> 00:48:15,870
But number two, that’s also why,
792
00:48:16,910 –> 00:48:20,510
as a sequel, it kind of doesn’t work,
793
00:48:21,390 –> 00:48:24,990
because the shift that has occurred since the
794
00:48:24,990 –> 00:48:28,770
original Blade Runner in culture, not in
795
00:48:28,770 –> 00:48:32,250
technology, in culture, the shift that has
796
00:48:32,250 –> 00:48:36,010
occurred doesn’t support some of the idea. Right,
797
00:48:36,090 –> 00:48:38,570
right. Yeah. You know, I have a guess. I have a guess. I have a
798
00:48:38,570 –> 00:48:42,010
guess. This is where we’re going. So hold on. Go ahead. Okay.
799
00:48:42,170 –> 00:48:45,810
Okay. I think. Oh, okay. So
800
00:48:45,810 –> 00:48:49,650
is the newer movie so sensationalized and overblown and so big?
801
00:48:49,650 –> 00:48:52,770
Because we’re kind of used to the idea of, like, AIs and robots and everything,
802
00:48:52,770 –> 00:48:55,770
because culturally, it’s kind of just more interwoven. So they’ve got to make it a
803
00:48:55,770 –> 00:48:58,390
bigger conflict point. And it ruins the movie and the story.
804
00:49:01,510 –> 00:49:05,110
Yes and no. Interesting. So they’re, they’re, they’re trying
805
00:49:05,110 –> 00:49:08,630
to hold on to the aesthetic in the original Blade Runner.
806
00:49:08,710 –> 00:49:12,550
Okay. Yeah. Because that, I mean, it’s, It’s. I mean, it’s iconic. I mean, it’s
807
00:49:12,550 –> 00:49:16,390
iconic. It’s. Yeah. Iconic aesthetic. And the people who, you
808
00:49:16,390 –> 00:49:19,910
know, Denis Villeneuve. I cannot pronounce the guy's last name. Who
809
00:49:19,910 –> 00:49:23,670
directed Dune and Dune 2, and
810
00:49:23,670 –> 00:49:27,480
is directing Dune 3. That guy directed Sicario,
811
00:49:27,480 –> 00:49:31,080
which I just watched the other day. Great film. Great. Tough watch.
812
00:49:31,480 –> 00:49:34,840
Tough watch, but a great film. You know, you.
813
00:49:34,920 –> 00:49:38,680
Guillermo del Toro, like, underrated actor. Like, I liked
814
00:49:38,680 –> 00:49:42,280
him. Right? But watching that. Yeah. Excuse me, Benicio del Toro. Yes,
815
00:49:42,360 –> 00:49:45,960
Del Toro. Guillermo del Toro is the director. [Muffled] Yeah, but, like,
816
00:49:45,960 –> 00:49:49,080
watching that, because, you know, like, I’ve seen Snatch. I’ve seen this stuff, and I’m
817
00:49:49,080 –> 00:49:52,920
like, okay, cool. Like, he is that very interesting character. But then I’m
818
00:49:52,920 –> 00:49:56,730
like, oh, dude can be real dark. Okay. He. I loved it
819
00:49:56,730 –> 00:49:59,570
when he says to Emily Blunt, the things that you are going to see are
820
00:49:59,570 –> 00:50:03,410
going to offend your American eyes. And he just walks out of
821
00:50:03,410 –> 00:50:06,770
the room. And I was like, yep, yep, you’re exactly right.
822
00:50:07,090 –> 00:50:10,850
Get ready. And she couldn’t. She couldn’t. And then Josh
823
00:50:10,850 –> 00:50:14,170
Brolin is the CIA spook. Oh, yeah. He’s
824
00:50:14,170 –> 00:50:17,970
phenomenal in that role. It’s so good. Well, he actually. He actually. He
825
00:50:17,970 –> 00:50:21,570
might. I think he actually probably went and hung around some spooks because I’ve known
826
00:50:21,570 –> 00:50:25,100
some former CIA folks in my time, and they are
827
00:50:25,100 –> 00:50:28,820
literally exactly. That guy. Yeah. Just. That’s. That guy.
828
00:50:28,820 –> 00:50:32,380
Yeah. You’re just dead. It’s wild. You’re just. Just dead on. Dead on.
829
00:50:32,380 –> 00:50:35,940
Application of that. And then all the Delta guys who are what they are.
830
00:50:36,020 –> 00:50:39,780
Delta guys are what they are. Anyway. No. So you look at Blade Runner
831
00:50:39,780 –> 00:50:43,620
2049, and then you look at the original Blade Runner and see both. And I
832
00:50:43,620 –> 00:50:47,300
resisted, by the way, watching the sequel for a long time. I really did, because
833
00:50:47,540 –> 00:50:50,800
the other movie, to your point, is so iconic. But
834
00:50:51,280 –> 00:50:54,880
I had long ago come to the conclusion that we are
835
00:50:54,880 –> 00:50:58,400
already living in the future that Blade Runner promised,
836
00:50:59,280 –> 00:51:03,040
just with less pollution. Oh,
837
00:51:03,360 –> 00:51:07,080
so the Gray Dust, that’s in. That’s in. Do Robots Dream of
838
00:51:07,080 –> 00:51:10,920
Electric Sheep? Yeah. Okay. Like, we’re fighting each
839
00:51:10,920 –> 00:51:14,760
other viciously about global warming, while China, of course, produces
840
00:51:14,760 –> 00:51:18,420
one new coal plant a week. But we’re fighting viciously about global
841
00:51:18,420 –> 00:51:21,980
warming because, you know, we don’t want that.
842
00:51:21,980 –> 00:51:25,700
Okay, That’s. That’s a plus. I’m not saying that that’s a negative. That’s a plus.
843
00:51:25,700 –> 00:51:28,420
We should be having that vicious fight about what we do with the environment. Cool.
844
00:51:28,500 –> 00:51:32,020
Have the arguments. Let’s talk about it. But because of those
845
00:51:32,020 –> 00:51:35,860
arguments, because of how far that went, and by the way, Philip
846
00:51:35,860 –> 00:51:38,940
K. Dick died in 1982, so we didn’t have a chance to see sort of
847
00:51:38,940 –> 00:51:42,740
all this sort of come to fruition. But because of regulation.
848
00:51:43,060 –> 00:51:46,720
Because of, to your point about capitalism, capitalism
849
00:51:46,720 –> 00:51:49,960
being forced to be regulated. Right. By a
850
00:51:49,960 –> 00:51:53,640
governmental entity that doesn’t care about capitalism. Yep.
851
00:51:53,640 –> 00:51:57,440
Cleaned up the environment, but it had nothing to say.
852
00:51:57,760 –> 00:52:01,360
So you don’t have acid rainfall like in the original Blade Runner. Right. But
853
00:52:01,840 –> 00:52:05,640
nothing to say about. And this is.
854
00:52:05,640 –> 00:52:08,960
This is, again, the point that I think Philip K. Dick makes: nothing to say
855
00:52:08,960 –> 00:52:12,270
about the relationship between man and technology. Government
856
00:52:12,910 –> 00:52:16,750
is astonishingly silent on this, for the most part.
857
00:52:17,470 –> 00:52:20,590
And private industry is astonishingly, to our point earlier,
858
00:52:20,990 –> 00:52:24,750
venal on all these areas. And so the venality
859
00:52:25,710 –> 00:52:29,510
Dick didn’t. He got part of, but he thought the finality would
860
00:52:29,510 –> 00:52:33,190
come from government. That’s why Deckard works for the government. He thought it would come
861
00:52:33,190 –> 00:52:36,590
from the government because he came out of a time in the 50s and 60s
862
00:52:36,850 –> 00:52:40,690
when government was the biggest problem and private industry was highly regulated
863
00:52:40,690 –> 00:52:43,770
coming out of World War II. So when he was a kid, the world he
864
00:52:43,770 –> 00:52:47,450
grew up in was one where the CEO married their secretary
865
00:52:47,450 –> 00:52:51,090
and didn’t make more than X number of thousands of dollars a year. And yet,
866
00:52:51,330 –> 00:52:55,090
if adjusted for inflation, the CEO made just as much then as the CEO makes
867
00:52:55,090 –> 00:52:58,890
now. Except they just hid because of unions. They just hid the
868
00:52:58,890 –> 00:53:02,570
money in different places. Okay, fine. So everybody looked like they
869
00:53:02,570 –> 00:53:05,810
were conforming, and everybody looked like they were the same culturally,
870
00:53:06,350 –> 00:53:09,790
which certain political parties in our country would like to go back to
871
00:53:10,430 –> 00:53:13,950
that visual look of conformity and are
872
00:53:13,950 –> 00:53:17,710
hidebound to that idea. By the way, Ray Bradbury wrote about this. This is
873
00:53:17,710 –> 00:53:20,910
where the Martian Chronicles and everything else that he ever wrote, Fahrenheit 451, came out
874
00:53:20,910 –> 00:53:24,190
of that sense of being hamstrung by
875
00:53:24,190 –> 00:53:28,030
conformity during a time that, again,
876
00:53:28,270 –> 00:53:32,110
certain political parties in this country would like to get back to where everybody looked
877
00:53:32,110 –> 00:53:35,900
the same. Right. Okay, okay, okay, okay. Dick is
878
00:53:35,900 –> 00:53:39,740
writing as a countercultural voice, along with Heinlein, to a certain degree, writing
879
00:53:39,740 –> 00:53:43,140
as a countercultural opposition.
880
00:53:43,300 –> 00:53:46,980
Right. To that idea. So he could project that forward into. Into the book. And
881
00:53:46,980 –> 00:53:50,580
that gets picked up in the movie quite brilliantly. The original Blade Runner
882
00:53:50,660 –> 00:53:53,060
that gets picked up quite brilliantly. Now,
883
00:53:54,500 –> 00:53:58,340
as a movie, it kind of falls apart in the third
884
00:53:58,340 –> 00:54:01,770
act. The original Blade Runner does. Okay. And it
885
00:54:01,770 –> 00:54:05,610
descends into this action movie trope because it’s the 80s and
886
00:54:05,610 –> 00:54:09,130
they didn’t really know where to go with it. And so they had to. It’s
887
00:54:09,130 –> 00:54:12,450
funny because my wife is a huge Harrison Ford fan and big
888
00:54:12,690 –> 00:54:15,690
reader and stuff like this, and so she was giving me some of the knowledge
889
00:54:15,690 –> 00:54:19,170
about the backstory of the book. Apparently he didn't want to make the movie.
890
00:54:19,410 –> 00:54:23,210
He got. No, he didn’t. He got forced into a contract and apparently, like, was
891
00:54:23,210 –> 00:54:27,040
like a begrudging, like, unwilling hostage for most of the shooting
892
00:54:27,350 –> 00:54:30,710
in the. In the production of the movie and stuff like that, and has made
893
00:54:31,030 –> 00:54:34,630
no bones about it whenever anyone asks. Yeah, you know,
894
00:54:34,630 –> 00:54:37,910
him and Ridley Scott did not exactly see eye to eye on that set.
895
00:54:38,070 –> 00:54:41,910
Yeah. So, like, you know, and I can imagine, because the book
896
00:54:41,990 –> 00:54:45,670
itself, to me, like, I have a lot of questions after the book
897
00:54:45,670 –> 00:54:49,470
ends, you know, like a
898
00:54:49,470 –> 00:54:53,200
lot, honestly, you know, which in my mind makes it a good book.
899
00:54:53,270 –> 00:54:56,310
Right. Like, you know, it's supposed to have some momentum afterwards. If it's just
900
00:54:56,310 –> 00:54:59,830
like, okay, I’m done. I’m never gonna think about this again. You know, you’re probably
901
00:54:59,830 –> 00:55:02,710
lying to yourself if you tell yourself that you like it.
902
00:55:04,470 –> 00:55:07,990
But I, like, I’m excited to see it. I.
903
00:55:08,070 –> 00:55:11,350
I’m. I’m very interested to see how they talk about
904
00:55:11,830 –> 00:55:15,630
the chicken head stuff in the, in the movies, if they
905
00:55:15,630 –> 00:55:19,430
do it at all, because I think that that seems probably the
906
00:55:19,430 –> 00:55:22,560
hardest thing to nail in a film.
907
00:55:24,160 –> 00:55:25,600
Yes. So
908
00:55:29,200 –> 00:55:32,760
it is. It is visually referenced, but not
909
00:55:32,760 –> 00:55:36,560
verbally addressed. I’ll frame it that way. Okay. So
910
00:55:36,560 –> 00:55:40,000
if you know what to look for or if you’re primed to look for that
911
00:55:40,000 –> 00:55:43,760
in the visuals, you will see it in certain scenes, but it is
912
00:55:43,760 –> 00:55:47,480
not. And there’s like, like junkie homeless behavior kind of stuff.
913
00:55:47,480 –> 00:55:51,040
Yeah. Okay. It’s not directly referenced. Right. Because.
914
00:55:51,040 –> 00:55:54,890
Okay, so. So there were probably three things in, in the novel
915
00:55:54,970 –> 00:55:58,570
that were really hard to translate to film. So the first thing was, was the,
916
00:55:58,650 –> 00:56:02,370
was the, the destruction of people’s IQ by the gray
917
00:56:02,370 –> 00:56:06,010
dust. The chicken head piece. Yeah. The second
918
00:56:06,090 –> 00:56:09,610
piece was the lack of.
919
00:56:12,490 –> 00:56:14,810
Shall we say, lack of
920
00:56:15,130 –> 00:56:16,410
geopolitical
921
00:56:19,550 –> 00:56:23,390
anchoring that was in the book. Right. Because when Blade Runner came out, the
922
00:56:23,390 –> 00:56:27,110
movie came out in 1982, I mean, you’re knee
923
00:56:27,110 –> 00:56:30,430
deep in the Cold War at that point. You just. You’re just in it. Right.
924
00:56:31,070 –> 00:56:34,790
Gorbachev has not come along yet. I think that was at
925
00:56:34,790 –> 00:56:38,590
the time when. And so they’re all kind of. Oh, that’s
926
00:56:38,590 –> 00:56:42,430
fascinating. Right. Because like, whenever I was reading Heinlein in
927
00:56:42,750 –> 00:56:46,240
the late 80s, early 90s. Like, I read
928
00:56:46,240 –> 00:56:49,800
Heinlein too early, I would say. Like, I was probably 10,
929
00:56:49,880 –> 00:56:53,000
right. And my brother was like, oh, hey, stranger in a strange land. And, you
930
00:56:53,000 –> 00:56:56,800
know, here’s Body Dysmorphia. And transitioning consciousnesses into other sexes
931
00:56:56,800 –> 00:57:00,360
and everything, you know, so. But
932
00:57:00,920 –> 00:57:04,560
all of those books have kind of this weird vibe. It almost
933
00:57:04,560 –> 00:57:08,040
seems like I pick up the vibe that you're talking about, that like
934
00:57:08,040 –> 00:57:11,760
government is the bad guy, which then kind of forces the focus onto the
935
00:57:11,760 –> 00:57:15,220
individual. Right. And so then in all of these books, like the Heinlein books and the,
936
00:57:15,220 –> 00:57:18,420
Philip K. Dick books, it seems like there's like
937
00:57:18,420 –> 00:57:22,020
this big divide between like almost like
938
00:57:22,020 –> 00:57:25,340
the, like the aware and the unaware. Almost.
939
00:57:25,660 –> 00:57:29,380
Right. And you know, not for nothing, there’s a.
940
00:57:29,380 –> 00:57:33,140
There’s seems like there’s a pretty big gap between those
941
00:57:33,140 –> 00:57:36,860
two circles and buckets of the population today. Right.
942
00:57:36,860 –> 00:57:39,660
People that are just like sitting at home watching the news and on the Internet
943
00:57:39,660 –> 00:57:43,430
versus people that are actually going out and being around people and seeing for
944
00:57:43,430 –> 00:57:46,830
their own eyes what is happening. Because, like, the whole Buster part.
945
00:57:47,470 –> 00:57:51,310
Yeah, that I was like, right. But
946
00:57:51,310 –> 00:57:55,030
on some level that was redeeming a little bit of
947
00:57:55,030 –> 00:57:58,630
like, oh, okay, like, like it almost felt like in my head it was like
948
00:57:58,630 –> 00:58:01,110
too much of like a thing. So like I’m like. I’m like, okay, like this
949
00:58:01,110 –> 00:58:03,950
doesn’t make sense. So like in my head it was, it was, it was almost
950
00:58:03,950 –> 00:58:07,710
like hope that that is what it was. And then whenever it was, I
951
00:58:07,710 –> 00:58:11,110
was like, oh, okay. But also a little convenient.
952
00:58:11,590 –> 00:58:14,190
Well, and, well, and then the third thing in the book. Well, so this ties
953
00:58:14,190 –> 00:58:17,110
into the third thing in the book, which is Mercerism.
954
00:58:17,430 –> 00:58:21,150
Okay, so Mercerism does not. None of that is. I will
955
00:58:21,150 –> 00:58:24,590
just. Right now, none of that’s referenced in the movie at all. Oh, to me,
956
00:58:24,590 –> 00:58:27,670
that’s like the most interesting part. And.
957
00:58:29,990 –> 00:58:33,590
The thing that, that I think Philip K. Dick was trying to say with
958
00:58:33,590 –> 00:58:37,330
mercerism. Yes. He’s making a point
959
00:58:37,330 –> 00:58:41,050
about religion as the opiate of the masses and the Marxist point and blah,
960
00:58:41,050 –> 00:58:44,530
blah, blah. Yeah, okay, I’m uninterested in that. The Marxist framing of that. I
961
00:58:44,850 –> 00:58:48,610
don’t care. Or not that I don’t care,
962
00:58:48,610 –> 00:58:51,930
but I’m dismissive of that for a whole variety of reasons. I think that that’s.
963
00:58:51,930 –> 00:58:55,650
To surface an analysis of what he’s trying to do here. He was
964
00:58:55,650 –> 00:58:59,330
working through, to your point about the divide between
965
00:58:59,330 –> 00:59:03,010
the people who are paying attention and the people who aren’t. He was working
966
00:59:03,010 –> 00:59:06,850
through the divide of the masses versus the elite
967
00:59:06,850 –> 00:59:10,530
for sure in that. But he was also working through
968
00:59:11,330 –> 00:59:14,690
social control, the role of hallucinogenics
969
00:59:15,330 –> 00:59:19,090
and the role of LSD. His own personal experiences with
970
00:59:19,090 –> 00:59:22,690
drugs. I mean, he talked about it in an interview that was published.
971
00:59:22,690 –> 00:59:26,530
Gosh. In the 1970s or might have been a speech, I think was an interview
972
00:59:26,530 –> 00:59:30,300
published in the 1970s where he talked about how a lot
973
00:59:30,300 –> 00:59:33,700
of his friends used drugs and wound up screwed up on drugs.
974
00:59:35,300 –> 00:59:38,340
And so he was not interested in
975
00:59:38,660 –> 00:59:41,060
expanding his mind in that kind of way.
976
00:59:43,140 –> 00:59:46,660
Whereas you had. At the same time, folks,
977
00:59:46,660 –> 00:59:50,500
like, in 1982 is 10 years in, 10
978
00:59:50,500 –> 00:59:52,500
years in on the founding of Apple Computers,
979
00:59:54,510 –> 00:59:58,110
founded by a guy, Steve Jobs, who went
980
00:59:58,110 –> 01:00:01,750
to India and went traipsing
981
01:00:01,750 –> 01:00:04,910
around trying to expand his mind. And
982
01:00:04,910 –> 01:00:08,630
allegedly, according to him, anyway, went
983
01:00:08,630 –> 01:00:11,310
on an LSD trip and came up with Apple Computers.
984
01:00:12,590 –> 01:00:15,950
Oh, interesting. Okay. I’ve not heard this story.
985
01:00:16,350 –> 01:00:19,830
Oh, yeah, yeah, yeah. Well, the reason. You know why the logo for Apple
986
01:00:19,830 –> 01:00:23,390
is an apple, right? An apple with a bite out of it.
987
01:00:23,390 –> 01:00:25,430
Yeah. Oh, it’s because I’m Alan Turing.
988
01:00:26,950 –> 01:00:30,230
When he was. When he was
989
01:00:30,310 –> 01:00:33,990
prosecuted by the British government after World War II for being a
990
01:00:33,990 –> 01:00:37,790
homosexual, committed suicide by putting
991
01:00:37,790 –> 01:00:40,870
either cyanide or arsenic, I can’t remember which one. But it was some kind of
992
01:00:40,870 –> 01:00:44,510
poison in an apple. And his
993
01:00:44,510 –> 01:00:47,430
body was found with an apple next to it with a bite taken out of
994
01:00:47,430 –> 01:00:51,020
it. Oh, interesting. Okay.
995
01:00:51,260 –> 01:00:53,580
Huh. And that’s the reason for the Apple logo.
996
01:00:55,020 –> 01:00:58,820
Really? Yep. Man. Have I just been reading, like, the
997
01:00:58,820 –> 01:01:02,580
marketing-sanitized version of all of this stuff? You know,
998
01:01:02,580 –> 01:01:05,900
because, like, you gotta go back. You gotta go back and read what Steve actually
999
01:01:05,900 –> 01:01:09,580
said about it. Yeah. Not what the marketers said afterward, what he actually
1000
01:01:09,580 –> 01:01:13,180
said. Yeah, yeah. Because, like, I was like, I. I listen to all these business
1001
01:01:13,180 –> 01:01:16,750
books and read a bunch of business content and I just. And I just heard
1002
01:01:16,750 –> 01:01:20,110
a story that they were. They were shown this graphical
1003
01:01:20,110 –> 01:01:23,870
interface. Oh, yeah, that did happen. Yeah. They went to IBM and.
1004
01:01:23,870 –> 01:01:27,390
Yeah, oh, yeah, yeah, exactly. So, like. So like, all of that stuff is the
1005
01:01:27,390 –> 01:01:31,190
stuff that stands out in the business lore. Not. Not this
1006
01:01:31,190 –> 01:01:34,270
other. Okay, so that’s. Well, because. Because they don’t. Because everybody,
1007
01:01:34,990 –> 01:01:38,750
God bless folks in business, but they want to siphon off to the
1008
01:01:38,750 –> 01:01:42,370
point about finance guys, Right. Who don’t take philosophy classes. They want to
1009
01:01:42,370 –> 01:01:45,450
partition off all that other stuff and just focus on the money. Because the money
1010
01:01:45,450 –> 01:01:49,290
seems clean and it seems pure and it seems as though it’s unfettered.
1011
01:01:49,450 –> 01:01:53,050
And of course, that’s a lie and a deceit. And then it gets too big.
1012
01:01:53,050 –> 01:01:56,890
Right. You know, like. And the fascinating thing is,
1013
01:01:56,890 –> 01:01:59,530
like, why do we keep. Like,
1014
01:02:00,890 –> 01:02:04,570
it almost feels like, you have to be willfully ignorant to like, not
1015
01:02:04,650 –> 01:02:08,450
look at the 101 versions of this, like, getting in our way,
1016
01:02:08,450 –> 01:02:12,240
like, over and over and over and over and over again, right? Like,
1017
01:02:12,240 –> 01:02:16,080
I mean, like, it leads me to start thinking, and this is not
1018
01:02:16,080 –> 01:02:19,440
helpful. How many times do we have to go through this, like, version of this
1019
01:02:19,440 –> 01:02:23,120
thing before we can be like, oh, hey, hey, hey, hey. We
1020
01:02:23,120 –> 01:02:26,840
see that, we see that, right? Like, it was, it was like the weirdest moment
1021
01:02:26,840 –> 01:02:29,720
of like, I remember. Like,
1022
01:02:30,680 –> 01:02:34,440
I think anyone who’s an adult right now can probably appreciate this. Like, before COVID
1023
01:02:34,440 –> 01:02:38,260
right? You’d watch one of those movies like Contagion or, you know,
1024
01:02:38,260 –> 01:02:41,140
any of that stuff, right? And it always has the same montage on the front
1025
01:02:41,140 –> 01:02:44,580
end of it, right? Of like the news cycle, right? And I, and I, and
1026
01:02:44,580 –> 01:02:46,980
I could remember watching that as a kid and being like, hey, you know what?
1027
01:02:46,980 –> 01:02:50,780
Like, no, no, no, not
1028
01:02:50,780 –> 01:02:54,460
like, like that’s all just drama. Like, people aren’t really that dumb,
1029
01:02:54,620 –> 01:02:58,340
you know. But then we fast forward to like, Covid and I’m
1030
01:02:58,340 –> 01:03:01,900
just like, oh. Like, it’s like, it’s like this very
1031
01:03:01,900 –> 01:03:05,570
weird moment of like, oh my God, the books are real. And then
1032
01:03:05,570 –> 01:03:09,410
like, oh, wow, those people really had a really good grasp on humanity.
1033
01:03:09,410 –> 01:03:12,930
Like, wow, you know, like that to me is the mind
1034
01:03:12,930 –> 01:03:16,730
blowing part. Not. I made my peace with the fact that we can do this
1035
01:03:16,730 –> 01:03:19,410
and I do a version of this and I’m probably not even paying attention to.
1036
01:03:19,410 –> 01:03:22,210
Because we’re all humans and, you know, you can only be aware of so many
1037
01:03:22,210 –> 01:03:26,010
things. But like, the people who don’t even think that it
1038
01:03:26,010 –> 01:03:29,810
could be somewhat similar are. I’m like, how
1039
01:03:30,370 –> 01:03:33,840
They are people who are consistently and
1040
01:03:33,840 –> 01:03:37,560
repeatedly mugged by reality. I love this term. It’s a term the conservatives
1041
01:03:37,560 –> 01:03:41,240
sometimes use. Mugged by reality. To describe liberals
1042
01:03:41,240 –> 01:03:44,600
who switch from being liberal to being conservative. They say that a liberal is just
1043
01:03:44,600 –> 01:03:48,440
a conservative that was mugged by reality. Or conservative is just a liberal who was
1044
01:03:48,440 –> 01:03:51,680
mugged by reality. That’s it, right? And, and,
1045
01:03:51,920 –> 01:03:55,560
and I like that idea of being mugged by reality
1046
01:03:55,560 –> 01:03:59,410
because it is only. So I often say
1047
01:03:59,410 –> 01:04:03,170
this in relation to people who are,
1048
01:04:03,170 –> 01:04:06,930
or have been in war zones, right? Regardless
1049
01:04:06,930 –> 01:04:10,650
of how they got there, there’s certain things that you can
1050
01:04:10,650 –> 01:04:14,450
only learn when a bullet goes past your ear. And you could
1051
01:04:14,450 –> 01:04:17,050
read all the books about it, you can watch all the movies, it doesn’t matter.
1052
01:04:18,570 –> 01:04:21,370
All of that stuff fades into nonsense
1053
01:04:22,340 –> 01:04:25,940
the second you almost die. Or if you are in
1054
01:04:25,940 –> 01:04:29,780
a traumatic. Any kind of trauma basically in your
1055
01:04:29,780 –> 01:04:33,580
life. And I’m talking, like, real trauma. I’m not talking like, I didn’t get an
1056
01:04:33,580 –> 01:04:37,140
Uber to take me from the airport to my house. That isn’t trauma.
1057
01:04:37,540 –> 01:04:40,100
I’m not talking about, like, oh,
1058
01:04:41,140 –> 01:04:44,900
you know, I’ll even go stuff. I will for myself. I’m
1059
01:04:44,900 –> 01:04:47,500
not talking about. Someone said some nasty name to me when I was walking down
1060
01:04:47,500 –> 01:04:50,690
the street. Like, I’m not talking about that. That ain’t trauma. That’s just. Just whatever.
1061
01:04:50,690 –> 01:04:54,330
That’s just stupidity or ignorance or whatever. That’s being a human, dealing with other humans.
1062
01:04:54,330 –> 01:04:57,530
All right? That’s life. We used to call that life. No,
1063
01:04:58,410 –> 01:05:00,890
real trauma is. And by the way, real trauma is.
1064
01:05:02,330 –> 01:05:04,730
I went to the doctor because I found this weird thing on me, and I’ve
1065
01:05:04,730 –> 01:05:07,530
been diagnosed with cancer. Now I gotta do it. I gotta deal with this. I
1066
01:05:07,530 –> 01:05:10,170
was just talking to a lady in my town
1067
01:05:11,290 –> 01:05:15,010
who. Small anecdote here. Lady who lives in my town. She had an
1068
01:05:15,010 –> 01:05:18,830
office next door to mine in a previous location. I moved. I
1069
01:05:18,830 –> 01:05:22,150
hadn't seen her in a little over a year. I was sitting out
1070
01:05:22,390 –> 01:05:25,670
On the. On the side. Well, not on the sidewalk, but on a bench. On
1071
01:05:25,670 –> 01:05:29,390
the sidewalk, waiting for my food outside of a local restaurant. And she
1072
01:05:29,390 –> 01:05:31,790
saw me. She walked across the street. She’s like, hey, how you doing? And we
1073
01:05:31,790 –> 01:05:34,670
were talking. I said, what’s up with you? And she said, oh, I’ve been diagnosed
1074
01:05:34,670 –> 01:05:38,150
with a pretty aggressive form of cancer. And.
1075
01:05:39,190 –> 01:05:42,670
And I’m going to. She didn’t say I’m going to die, but she said, I’m
1076
01:05:42,670 –> 01:05:46,240
exploring treatment options for this right now and kind of going through this. I was
1077
01:05:46,240 –> 01:05:49,960
walking through this with her, right? And that’s
1078
01:05:49,960 –> 01:05:53,360
trauma. That’s trauma, yeah. Like,
1079
01:05:53,600 –> 01:05:57,040
friend of mine, very old friend of mine, he’s another sales nerd, and
1080
01:05:58,160 –> 01:06:01,680
he. We got connected over this nonprofit that was for, like,
1081
01:06:02,800 –> 01:06:06,560
they were driving awareness around this cancer nonprofit.
1082
01:06:06,880 –> 01:06:09,920
And we. We became friends after that. And one time he shared this thing with
1083
01:06:09,920 –> 01:06:13,040
me. He goes, every time I have an ache or pain,
1084
01:06:13,440 –> 01:06:17,040
my first thought is that I got cancer again, right? And, like,
1085
01:06:17,040 –> 01:06:20,400
imagine, you know, you can’t.
1086
01:06:20,640 –> 01:06:24,240
You can’t really know it until you go through it, right? But then also,
1087
01:06:24,240 –> 01:06:27,920
it’s going to change how you think about it, right? Like, I mean, sometimes you
1088
01:06:27,920 –> 01:06:31,520
can’t. This is why it’s so important that if you have
1089
01:06:31,520 –> 01:06:35,040
never been in sales, do not think you can lead sales,
1090
01:06:35,040 –> 01:06:38,840
because. Oh, yeah, no, you. You just you, you, you can’t. And the people are
1091
01:06:38,840 –> 01:06:41,440
just not going to believe you because they’re going to be able to see through
1092
01:06:41,440 –> 01:06:45,100
the facade, the fabrication, the fake-it-till-you-make-it stuff,
1093
01:06:45,100 –> 01:06:48,820
right? It. And just don’t try. Like we’re not, we’re not
1094
01:06:48,820 –> 01:06:52,660
great at it, you know, but also be open to going and
1095
01:06:52,660 –> 01:06:56,100
experiencing these things before we take these hard line decisions. Right? Like,
1096
01:06:56,260 –> 01:06:58,980
exactly. Going back to the thing that we were talking about. Like, you and I
1097
01:06:58,980 –> 01:07:02,660
have read all of these books, right? About everything that can go wrong.
1098
01:07:02,660 –> 01:07:05,860
Right. Like, and I’ve read way more books about it going wrong than it being
1099
01:07:05,860 –> 01:07:09,700
beneficial. Right. You know, kinda. Right. And
1100
01:07:10,170 –> 01:07:13,810
there’s also this, like, there’s a sec. There’s this kind of weird thing.
1101
01:07:13,810 –> 01:07:17,530
There’s this weird overlap with like martial arts thinking. Right.
1102
01:07:17,930 –> 01:07:20,930
I don’t train and I don’t practice and I didn’t put that time into that
1103
01:07:20,930 –> 01:07:24,530
thing so that I can go around with some weird inflated
1104
01:07:24,530 –> 01:07:28,370
ego and everything else like this, right. I go there so that
1105
01:07:28,370 –> 01:07:32,170
way I’m, I’m trained and I’m
1106
01:07:32,170 –> 01:07:35,650
practiced. But there’s this weird shift that happens with every martial artist that gets to
1107
01:07:35,650 –> 01:07:39,300
any kind of height, right? And you, and you’ve seen it, it, you have to
1108
01:07:39,300 –> 01:07:41,860
go from, I want to be a badass, to, like, this is a way
1109
01:07:41,860 –> 01:07:45,540
of life. This is how I approach my, my conflict, my thinking,
1110
01:07:45,540 –> 01:07:49,380
my, my judgment and everything. And like you don’t make it past the
1111
01:07:49,380 –> 01:07:53,020
mid-tier in anything without that shift happening.
1112
01:07:53,020 –> 01:07:56,700
Right. And so this is a weird thing. I don’t talk about this a whole
1113
01:07:56,700 –> 01:08:00,500
lot, but like now whenever there’s anything I want to go learn. It
1114
01:08:00,500 –> 01:08:03,460
used to be because I am this person, I am the nerd that thinks that
1115
01:08:03,460 –> 01:08:06,060
that, well, if I just know the basic fundamentals, I'm going to be able to
1116
01:08:06,060 –> 01:08:09,350
figure this thing out. Okay. And that’s my normal, normal way of approaching everything. But
1117
01:08:09,350 –> 01:08:12,630
I spent 10 years as a salesperson trying to think that I could find a
1118
01:08:12,630 –> 01:08:16,390
way around the human connection and dealing with the humans and everything. And
1119
01:08:16,390 –> 01:08:20,190
I only found success whenever I stopped doing that kind of thing. So in
1120
01:08:20,190 –> 01:08:23,950
every aspect, there’s the version of reality that you think matters and then
1121
01:08:23,950 –> 01:08:27,790
there’s the higher level of it that everyone else is thinking about. Right. As a
1122
01:08:27,790 –> 01:08:30,990
poker player, how do I win these pots? Okay, well, as a professional poker player,
1123
01:08:30,990 –> 01:08:34,469
how do I lose less? Right. As a
1124
01:08:34,469 –> 01:08:37,989
beginner martial arts student, how do I, like, never get hit?
1125
01:08:38,069 –> 01:08:41,509
Okay. As a master, how do I go home?
1126
01:08:41,749 –> 01:08:45,109
How do I make sure that I’m the one going home? Right? Yeah. And. And
1127
01:08:45,109 –> 01:08:48,389
more often than not, that means I don’t need anything at all.
1128
01:08:48,948 –> 01:08:52,709
You know what I’m saying? So the old enough to know
1129
01:08:52,709 –> 01:08:56,509
better kind of thing, Right? To use that, like very old folksy wisdom and
1130
01:08:56,509 –> 01:09:00,309
everything else like this. Right? Like, we’re only old enough to know better because
1131
01:09:00,309 –> 01:09:03,940
we’ve spent our time reading and consuming these ideas that everyone
1132
01:09:03,940 –> 01:09:07,620
else was like, oh, this is just hyper nerdy, stupid. Oh, but robots,
1133
01:09:07,860 –> 01:09:11,420
you guys can just like give me robots and they can do all this cold
1134
01:09:11,420 –> 01:09:14,620
calling that I don’t want to do. And we just need to do 3,000 of
1135
01:09:14,620 –> 01:09:17,020
these a day and we’re like out of
1136
01:09:17,020 –> 01:09:20,540
00,0001%. And John, like, you’re an
1137
01:09:20,540 –> 01:09:23,860
injury. There’s a better way to fix the problem. Guys,
1138
01:09:24,420 –> 01:09:28,220
let me ask you a question with all this. No,
1139
01:09:28,220 –> 01:09:30,820
because this is good. Let me ask you a question about all this. Please.
1140
01:09:32,570 –> 01:09:35,690
Cheer ship. I’ll allow it. Oh, thank you. I appreciate that.
1141
01:09:39,450 –> 01:09:40,570
Good on you. Thank you.
1142
01:09:43,530 –> 01:09:47,049
If the Figure CEO, Mr. Adcock,
1143
01:09:47,530 –> 01:09:50,930
is correct, right? If there will
1144
01:09:50,930 –> 01:09:53,850
indeed. If he can indeed. And I don’t know that he can. I think
1145
01:09:54,810 –> 01:09:58,490
I. Number one, I think building a robot is incredibly difficult. Number two, I think
1146
01:09:58,490 –> 01:10:02,340
embodying a robot that can even do just minimal
1147
01:10:02,340 –> 01:10:06,060
menial tasks is also incredibly difficult. Then I think your third
1148
01:10:06,060 –> 01:10:09,900
order level of difficulty is putting the object in the world and
1149
01:10:09,900 –> 01:10:13,500
having it be accepted by other human beings. With all of our years
1150
01:10:13,500 –> 01:10:16,980
of whether it’s evolution or creationism, I don’t care. We’ve got
1151
01:10:16,980 –> 01:10:20,580
stuff that that robot doesn’t have. I don’t care where you think it came
1152
01:10:20,580 –> 01:10:24,380
from. We’ve got it. They don’t. Okay? That’s the difference between us and
1153
01:10:24,380 –> 01:10:28,080
the monkeys. Okay? The Monk. We ain’t putting the monkeys on the
1154
01:10:28,080 –> 01:10:31,880
factory floor. Okay, so you’ve got three major
1155
01:10:31,880 –> 01:10:35,400
first order problems there. You have to solve three major ones.
1156
01:10:35,640 –> 01:10:39,480
And by the way, the AI does not solve for any of those three at
1157
01:10:39,480 –> 01:10:42,320
all. I don’t care how evil it is, I don’t care how much it hallucinates,
1158
01:10:42,320 –> 01:10:45,800
it doesn’t solve for those problems. It’s also other problems, but not those three. Okay,
1159
01:10:46,840 –> 01:10:49,640
let’s say those three problems could be solved in the next four years.
1160
01:10:53,650 –> 01:10:57,490
You and I both live in or near a major city where one of
1161
01:10:57,490 –> 01:11:01,250
these humanoid robots is probably going to show up. Yeah.
1162
01:11:02,370 –> 01:11:03,410
More likely than not.
1163
01:11:06,530 –> 01:11:10,130
First, as a sales guy, let’s start from here. And then let’s move down the
1164
01:11:10,130 –> 01:11:12,450
level. Okay. You want to put the worst version of him?
1165
01:11:14,770 –> 01:11:16,690
Absolutely, absolutely, absolutely.
1166
01:11:19,420 –> 01:11:23,220
How is sales going to sell to a robot? Let’s
1167
01:11:23,220 –> 01:11:26,700
start with that question. And by the way, not a bot.
1168
01:11:26,700 –> 01:11:30,380
Beautiful question. Not a bot. I want to be very clear. Not
1169
01:11:30,380 –> 01:11:34,219
a bot on a phone, disembodied somewhere. I mean, you
1170
01:11:34,219 –> 01:11:38,060
show up at the building and the robot
1171
01:11:38,060 –> 01:11:41,580
is the gatekeeper. Oh, well,
1172
01:11:41,660 –> 01:11:44,890
oh, that’s, that’s the body problem. There you go. The embodiment problem.
1173
01:11:45,920 –> 01:11:49,200
Well, that’s what everybody wants, allegedly.
1174
01:11:49,600 –> 01:11:52,880
Well, they, they think that they want that, right. Like, like one of the biggest
1175
01:11:52,880 –> 01:11:56,400
lessons that I, that I learned as a, as a stuck,
1176
01:11:56,960 –> 01:12:00,760
not high minded salesperson is very stuck, was that the idea is that
1177
01:12:00,760 –> 01:12:03,400
the gatekeeper is not really there to tell you. No, they’re there to protect the
1178
01:12:03,400 –> 01:12:06,560
time of the person who hired them. Right. Because
1179
01:12:07,120 –> 01:12:10,480
sometimes, and people will talk about this sometimes the reason why they have a
1180
01:12:10,480 –> 01:12:13,200
gatekeeper is because they’re so excited to talk to you that they got to have
1181
01:12:13,200 –> 01:12:16,910
someone there, you know, And I’m like, you know, and there. And that’s true.
1182
01:12:16,910 –> 01:12:20,430
Like some people are just so people oriented that they can’t help themselves. Right?
1183
01:12:20,510 –> 01:12:24,230
But I was like, gatekeepers are bad humans and all this
1184
01:12:24,230 –> 01:12:27,590
stuff, right? Because I wasn’t winning, you know. And then the
1185
01:12:27,590 –> 01:12:31,230
biggest change was whenever I started to make them like part of
1186
01:12:31,230 –> 01:12:34,950
my approach, right. Hey, you probably see a thousand
1187
01:12:34,950 –> 01:12:38,710
people looking exactly like me coming in here, trying to get time with a doctor
1188
01:12:38,710 –> 01:12:42,310
and stuff like that. And I don’t want to, I don’t want to waste their
1189
01:12:42,310 –> 01:12:45,990
time. Right? You probably got some pretty good insight into the doctor, what he
1190
01:12:45,990 –> 01:12:49,830
likes and what he doesn’t like, where his frustrations are with his outcomes,
1191
01:12:49,830 –> 01:12:53,390
with his patients, with his time on the table. I’m curious, like, like, what do
1192
01:12:53,390 –> 01:12:55,870
you think that that would look like? Because I don’t want to waste their time.
1193
01:12:56,030 –> 01:12:59,550
Oh, I don’t, I don’t, I don’t actually know. Or they would say like, okay,
1194
01:12:59,550 –> 01:13:03,350
he, he hates this, but he likes this, right. And things like that. So
1195
01:13:03,350 –> 01:13:07,070
whenever I made them part of my team, right. And treated them like a human,
1196
01:13:07,310 –> 01:13:10,320
which they are, I had better outcomes. Right.
1197
01:13:11,200 –> 01:13:14,920
And so now there is a very robotic version of
1198
01:13:14,920 –> 01:13:18,000
buying. It’s called an RFQ, right.
1199
01:13:18,480 –> 01:13:22,160
Hey, we’re going to go out to market and we don’t know what
1200
01:13:22,160 –> 01:13:26,000
in the hell we’re actually looking for. So we’re going
1201
01:13:26,000 –> 01:13:28,560
to, we’re going to do an RFQ and we’re going to, and we’re going to
1202
01:13:28,560 –> 01:13:32,160
go to ChatGPT. We’re going to have ChatGPT create this RFQ for us,
1203
01:13:32,160 –> 01:13:35,320
right? And because it’s AI, it’s going to know exactly what to do. And then
1204
01:13:35,320 –> 01:13:39,030
essentially it’s like, hey, invitation to waste your time. Oh yeah,
1205
01:13:39,350 –> 01:13:42,870
right. Like, like that’s what I get as a sales professional. I’m like, okay, cool.
1206
01:13:42,870 –> 01:13:45,910
Would love to ask you a couple questions. No questions. You must fill it out
1207
01:13:45,910 –> 01:13:48,910
and then we will select it. Okay, cool. I know where I’m at. You’re not gonna
1208
01:13:48,910 –> 01:13:52,350
ever make any decisions because you don’t even know what’s important to you. I’m gonna
1209
01:13:52,350 –> 01:13:56,150
go find someone who’s at least open to a conversation. And so then right there
1210
01:13:56,390 –> 01:14:00,230
I am making a judgment driven decision based upon my own experience
1211
01:14:00,550 –> 01:14:04,390
as my time as a professional. Okay, but if you’ve not spent
1212
01:14:04,390 –> 01:14:08,070
the time as a professional in that space, just hang
1213
01:14:08,070 –> 01:14:11,590
out, right? Like, like that’s just follow up. 17
1214
01:14:11,670 –> 01:14:15,270
touches of follow up and you know, start a newsletter, you know, and
1215
01:14:15,270 –> 01:14:17,750
everything else like this that everyone else is pushing.
1216
01:14:19,750 –> 01:14:23,510
But here’s the thing. All of this stuff, all of this, all the
1217
01:14:23,510 –> 01:14:26,310
marketing stuff is I think going to go out of the way because you know
1218
01:14:26,310 –> 01:14:29,790
what you’re not going to be able to do? Indoctrinate an AI into your way
1219
01:14:29,790 –> 01:14:32,630
of thinking. Right? So all these things of like, well, if we just, just, if
1220
01:14:32,630 –> 01:14:36,110
we just give them a lead magnet and then they opt into our stuff,
1221
01:14:36,510 –> 01:14:39,870
if we just keep sending the messaging, eventually they’re going to start thinking about it.
1222
01:14:40,190 –> 01:14:43,910
Yeah, okay, but here’s the thing. You’re not going to make the Apple guy
1223
01:14:43,910 –> 01:14:47,710
go to PC. You’re just not, right? Like in, in
1224
01:14:47,710 –> 01:14:51,470
reality. And if your gatekeeper is an AI that you teach and
1225
01:14:51,470 –> 01:14:54,510
tilt to, like, here’s what you let through and here’s what you don’t.
1226
01:14:55,870 –> 01:14:59,400
You don’t ever get in front of anybody unless they’re
1227
01:14:59,400 –> 01:15:03,080
confirming your own biases. Right? One of the things about salespeople that is so
1228
01:15:03,080 –> 01:15:06,720
difficult is you’re supposed to have an opinion, right? People hate us
1229
01:15:06,720 –> 01:15:09,480
because they all think that we’re going to do anything, be anything, say anything to
1230
01:15:09,480 –> 01:15:11,600
get a deal. But the only people that are doing that are the people who
1231
01:15:11,600 –> 01:15:14,520
have made a decision that they’re only going to do this for a short period
1232
01:15:14,520 –> 01:15:18,280
of time. So yeah, they only got to do it until like it really takes
1233
01:15:18,280 –> 01:15:21,080
off and then they’re going to go back to being their like normal regular selves.
1234
01:15:21,080 –> 01:15:23,960
But here’s the thing I am a normal, regular self, and you either like it
1235
01:15:23,960 –> 01:15:26,870
or you don’t. I’m going to go find other people to talk to, you know,
1236
01:15:27,110 –> 01:15:30,750
so it’s like all of those little bitty shifts that come
1237
01:15:30,750 –> 01:15:33,910
from time and grade of, like, really doing the work and realizing,
1238
01:15:34,310 –> 01:15:37,630
hey, this is. This is the weirdest way to make my peace with this. But,
1239
01:15:37,630 –> 01:15:41,190
like, I’ve lied to prospects and it still didn’t close,
1240
01:15:41,510 –> 01:15:44,390
so why would I just keep doing that to myself, right? Like, Like, I now
1241
01:15:44,390 –> 01:15:48,070
have these facts that making myself uncomfortable and lying and,
1242
01:15:48,230 –> 01:15:52,040
like, didn’t actually change anything. It just made me more uncomfortable. So, like, why would
1243
01:15:52,040 –> 01:15:55,800
I continue to do that, right? You have to go through
1244
01:15:55,800 –> 01:15:59,280
the experience and learn. And one of my favorite lines and, and this is one
1245
01:15:59,280 –> 01:16:02,480
of my biggest concerns about this whole motion, right? And I’m going to butcher this
1246
01:16:02,480 –> 01:16:06,200
line. You probably, you probably might know it in the. In the original form, but
1247
01:16:06,200 –> 01:16:10,040
it’s, show me the person who cares about the
1248
01:16:10,040 –> 01:16:13,720
one in the volume of numbers. And that’s a
1249
01:16:13,720 –> 01:16:17,520
real human, right? It’s around this idea that, like, you make a big enough
1250
01:16:17,520 –> 01:16:21,060
sample size, it’s impossible to care about the individuals, right?
1251
01:16:22,340 –> 01:16:25,980
There’s a whole lot of room to extrapolate on that whole
1252
01:16:25,980 –> 01:16:29,820
topic, right, when it comes to political and cultural conversations
1253
01:16:29,820 –> 01:16:33,380
and stuff like this, you know? But the same thing happens here, right? So if
1254
01:16:33,940 –> 01:16:37,580
all these people, the first time an AI sabotages their
1255
01:16:37,580 –> 01:16:40,980
business or does something that categorically harms their reality,
1256
01:16:41,300 –> 01:16:44,180
they’re then going to be a lot more concerned about it. But until then, you’re
1257
01:16:44,180 –> 01:16:47,990
just thinking about the positives. Okay, now, now, we saw this in chapter
1258
01:16:47,990 –> 01:16:51,350
10 of Do Androids Dream of Electric Sheep when
1259
01:16:51,590 –> 01:16:55,190
Rick Deckard fails to kill, retire,
1260
01:16:55,510 –> 01:16:59,310
the android Luba Luft, who’s posing as an
1261
01:16:59,310 –> 01:17:03,030
opera singer, right? Great little framing there by Philip K.
1262
01:17:03,030 –> 01:17:06,750
Dick also on that. But he, He. He’s
1263
01:17:06,750 –> 01:17:10,350
administering her the empathy test, the Voigt-Kampff test, and
1264
01:17:10,350 –> 01:17:14,090
she doesn’t like the questions. And so instead of answering the questions, she behaves
1265
01:17:14,090 –> 01:17:17,090
as a human would, which I think of this in sales. She doesn’t like the
1266
01:17:17,090 –> 01:17:20,730
questions the salesperson is asking. I was gonna bring up the same.
1267
01:17:20,730 –> 01:17:23,730
Thing, so you knew where I was going. And so.
1268
01:17:24,450 –> 01:17:28,210
And so she calls a cop, which the cop is a robot or the
1269
01:17:28,210 –> 01:17:31,490
cop is an android. The android cop arrests him,
1270
01:17:31,810 –> 01:17:35,570
takes him to this second police station in San Francisco
1271
01:17:35,570 –> 01:17:39,290
that he’s never seen before. And again, this is
1272
01:17:39,290 –> 01:17:42,970
for me, the moment where sort of the bottom drops out of the plane, right?
1273
01:17:42,970 –> 01:17:46,370
Yeah. And Deckard is having this existential
1274
01:17:46,370 –> 01:17:49,570
crisis in an Android police station
1275
01:17:50,050 –> 01:17:51,330
full of androids.
1276
01:17:54,130 –> 01:17:57,730
Maybe I’m the Android, right? And that maybe I’m the Android.
1277
01:17:57,730 –> 01:18:01,170
Maybe I’m not real. Maybe I’m the one with a problem. Now,
1278
01:18:02,130 –> 01:18:05,290
in our world today, many, many people,
1279
01:18:06,410 –> 01:18:09,410
and we could argue about the genesis of where this comes from. I think it
1280
01:18:09,410 –> 01:18:13,210
comes from the decline of meaning in Western society
1281
01:18:14,250 –> 01:18:17,690
generally, but
1282
01:18:17,690 –> 01:18:21,490
specifically in American culture, we’re struggling with meaning right now, and we
1283
01:18:21,490 –> 01:18:24,890
have been for the last 15 years. And this is tied into identity and a
1284
01:18:24,890 –> 01:18:28,650
whole bunch of other things. I mean, you see this in, you see this
1285
01:18:28,650 –> 01:18:32,350
in, in teenagers, most notoriously. Now, I was talking with a
1286
01:18:32,350 –> 01:18:35,630
friend of mine who’s a clinical
1287
01:18:35,630 –> 01:18:39,430
psychologist, and he said the clinical studies
1288
01:18:39,430 –> 01:18:42,790
are showing that a child forms their first political
1289
01:18:42,790 –> 01:18:46,630
identity in ninth grade. It’s starting as early as ninth grade now.
1290
01:18:46,870 –> 01:18:50,710
He said it didn’t used to start that early. He’s like, it started
1291
01:18:50,710 –> 01:18:54,510
way later than that previously. He said it’s because we’re having an identity problem.
1292
01:18:54,510 –> 01:18:57,310
We’re having a meaning problem in our country. Okay,
1293
01:18:58,190 –> 01:19:00,830
I’ll let you, based on that while I ask you this question.
1294
01:19:01,790 –> 01:19:04,190
Well, I’m going to let you based on this, hold on. Based on that for
1295
01:19:04,190 –> 01:19:07,870
just a minute. Hold, hold that thought in your head. So the question
1296
01:19:07,950 –> 01:19:08,670
becomes,
1297
01:19:11,710 –> 01:19:15,230
how long will it be with these 100,000 humanoid robots before
1298
01:19:15,390 –> 01:19:18,830
somebody says that robots have rights?
1299
01:19:21,400 –> 01:19:22,040
Well, in,
1300
01:19:25,640 –> 01:19:29,080
so A.I. the movie, the Spielberg movie with Jude Law.
1301
01:19:29,160 –> 01:19:33,000
Right. I haven’t seen it in 20 years. Yeah. And my daughter is,
1302
01:19:33,400 –> 01:19:37,240
you know, about to be 14 and we
1303
01:19:37,240 –> 01:19:41,040
let her read widely. Right. You know, if
1304
01:19:41,040 –> 01:19:44,600
we’re like, hey, you know, give this a couple of years. But you know, we,
1305
01:19:44,600 –> 01:19:47,160
we don’t ever say like, this is not allowed. We try to give her some
1306
01:19:47,160 –> 01:19:50,520
like, hey, when you’re going through this, this will be more meaningful to, to you
1307
01:19:50,520 –> 01:19:54,240
then. Yep. And you know, so part of me is very excited
1308
01:19:54,240 –> 01:19:57,800
now about going back and re-watching A.I. with her and watching the Will Smith
1309
01:19:57,800 –> 01:20:01,600
I, Robot. Right? Because like, I remember watching that movie and like there’s a scene
1310
01:20:01,600 –> 01:20:04,840
in the bar where like, everything is so ungodly expensive. It’s like, it’s like
1311
01:20:04,840 –> 01:20:08,480
$160 for like two beers and a pair of Chucks or something like this.
1312
01:20:08,480 –> 01:20:11,840
And I remember watching it being like, no way. Well, like, hey,
1313
01:20:13,120 –> 01:20:16,240
you know, I’m on vacation and the tourist traps aren’t the only things that are
1314
01:20:16,240 –> 01:20:20,050
expensive anymore, you know what I’m saying? Like, like, like we’re living in this
1315
01:20:20,290 –> 01:20:24,090
reality, right? And you can, I think it’s very easy
1316
01:20:24,090 –> 01:20:27,810
to like read the books
1317
01:20:27,810 –> 01:20:31,610
and then when it’s not perfect, completely dismiss any of
1318
01:20:31,610 –> 01:20:34,650
the thinking of the author of the book, of the story and
1319
01:20:34,650 –> 01:20:38,370
everything. And I think if you read enough, you see
1320
01:20:38,370 –> 01:20:42,210
enough of those trends, enough of those patterns, enough
1321
01:20:42,210 –> 01:20:45,820
of those things happening, and it’s not ever exactly the same
1322
01:20:45,820 –> 01:20:49,340
thing. But we can, if we’re being really honest about it, if we can
1323
01:20:49,340 –> 01:20:52,900
shelve our egos a little bit, man.
1324
01:20:53,060 –> 01:20:55,860
We, we have a lot of room to just
1325
01:20:56,340 –> 01:21:00,020
dismiss and go for the negative, right? And then you have people that
1326
01:21:00,020 –> 01:21:03,060
know that and they, and they have those levers around
1327
01:21:03,380 –> 01:21:07,100
negativity and hate and you know, creating one common opponent. So
1328
01:21:07,100 –> 01:21:10,860
then, you know, like the playbook is so well known by the people who shouldn’t
1329
01:21:10,860 –> 01:21:14,100
have access to it. Right. Like that’s, that’s the bigger concern
1330
01:21:14,740 –> 01:21:18,420
for me over that because I, I’m more concerned that someone would
1331
01:21:18,420 –> 01:21:21,380
start that whole thing just that way they have a battle to go fight against.
1332
01:21:21,860 –> 01:21:25,580
More than it being like a real reasonable kind of discussion of like,
1333
01:21:25,580 –> 01:21:29,140
hey, like, the question is not, do they have
1334
01:21:29,140 –> 01:21:32,100
rights in my opinion, it’s like at what stage
1335
01:21:33,620 –> 01:21:37,420
should they, at what stage would it make sense
1336
01:21:37,420 –> 01:21:40,720
for them to be in that situation? So that way we don’t do the,
1337
01:21:41,040 –> 01:21:44,880
well, they’re never going to have rights, ever, right? And then something
1338
01:21:44,880 –> 01:21:48,720
changes, right? And then I, I don’t, I don’t know. It’s a very interesting
1339
01:21:48,720 –> 01:21:52,280
thing. I’ve got, I’ve got a lot of concern about it, you know,
1340
01:21:52,280 –> 01:21:55,720
and I think that right now back to the earlier question. We
1341
01:21:55,720 –> 01:21:59,480
can’t, we can’t even get a software that like all
1342
01:21:59,480 –> 01:22:02,440
these softwares that market being the best at all the things, and not a one
1343
01:22:02,440 –> 01:22:05,540
of them is. They’re good at one thing and then they’re just packaging on, you
1344
01:22:05,540 –> 01:22:09,220
know, all this other use case stuff so that way they can bump
1345
01:22:09,220 –> 01:22:12,980
up their, their costs and everything. I think we’re going to have that,
1346
01:22:12,980 –> 01:22:16,580
I think we’re going to have potentially some androids
1347
01:22:16,580 –> 01:22:19,660
capable of doing very basic tasks that are going to be very robotic and very
1348
01:22:19,660 –> 01:22:23,340
clunky because all the people that are putting effort into empathy and
1349
01:22:23,340 –> 01:22:27,060
emotion and communication, I don’t think are really focused on like the body,
1350
01:22:27,220 –> 01:22:30,420
right? So I think we’re going to have these two very, very different tracks, right?
1351
01:22:30,420 –> 01:22:34,040
The body people and then the conversational stuff. The,
1352
01:22:34,040 –> 01:22:37,880
the Empathy stuff, the connection with the human. And because I think
1353
01:22:37,880 –> 01:22:40,880
that these will be very different paths or tracks, when they try to bridge them
1354
01:22:40,880 –> 01:22:43,760
together, it’s going to be a bit of a nightmare. That’s my thinking, on the.
1355
01:22:44,000 –> 01:22:47,480
On that specific thing. So I think you. You’re
1356
01:22:47,480 –> 01:22:50,240
probably on to something. I think
1357
01:22:51,520 –> 01:22:55,200
the. The reason I asked the
1358
01:22:55,200 –> 01:22:58,810
question is because of the explosion in
1359
01:22:58,810 –> 01:23:02,290
transgenderism, in people
1360
01:23:02,290 –> 01:23:06,090
identifying as transgender. Yep. And I’m not. I want to be very clear.
1361
01:23:06,090 –> 01:23:09,690
I’m not getting. I’m not. I’m using it as merely a
1362
01:23:09,690 –> 01:23:12,890
data point for extrapolating to a future pattern of behavior,
1363
01:23:13,210 –> 01:23:16,410
because past performance does sometimes indicate future results.
1364
01:23:16,890 –> 01:23:20,170
But fascinating thing here, right? Like, I. Like,
1365
01:23:21,130 –> 01:23:24,980
have you read Altered Carbon? Or have you seen the Netflix. I know the
1366
01:23:24,980 –> 01:23:27,540
Netflix show. I have not. I haven’t watched it, but I’m aware of what it
1367
01:23:27,540 –> 01:23:31,300
is. The book was solid. Right. And. And it kind of exposes this idea, like,
1368
01:23:31,300 –> 01:23:35,020
what if we can transfer consciousness into, like, other people? Right.
1369
01:23:35,020 –> 01:23:37,740
What if we could live for forever, but the body wasn’t meant to do so?
1370
01:23:37,740 –> 01:23:41,180
Okay, great. What does that whole thing look like? Well, if it’s not the body
1371
01:23:41,180 –> 01:23:44,900
and it’s just the consciousness, like, what happens if you can shift into another body
1372
01:23:44,900 –> 01:23:48,660
and it’s not your body? Right. And, you know, it’s. It’s wild
1373
01:23:48,660 –> 01:23:52,420
because it goes back to the whole main thing, like, do we
1374
01:23:52,420 –> 01:23:54,660
give access to this? Do we ban books? Do we keep it off to the
1375
01:23:54,660 –> 01:23:58,060
side? Do we give access to information? I was 11 years old when I read
1376
01:23:58,060 –> 01:24:01,300
a Heinlein book about a guy who was in a car accident and gets his
1377
01:24:01,380 –> 01:24:04,660
consciousness transferred into a female’s body.
1378
01:24:05,140 –> 01:24:07,860
Right, Right. I was reading other science fiction books about
1379
01:24:08,340 –> 01:24:12,060
telepaths that could take over bodies of other people, genderized or not, and
1380
01:24:12,060 –> 01:24:14,680
everything else like that. But so, because I was so, I guess,
1381
01:24:16,670 –> 01:24:20,110
open to the idea in the science fiction format. Right.
1382
01:24:20,350 –> 01:24:24,030
It doesn’t offend me the way that it offends, like.
1383
01:24:24,030 –> 01:24:27,870
Oh, and I’m not saying you. But there are people. Yeah, yeah.
1384
01:24:27,870 –> 01:24:31,590
That this. That this idea offends. Right? Yeah, yeah. And I’m not asking from
1385
01:24:31,590 –> 01:24:34,870
an offense perspective, because I think. Here’s the thing. I think. I think that the
1386
01:24:34,870 –> 01:24:38,670
bifurcation, the trifurcations will be along
1387
01:24:38,750 –> 01:24:39,550
these lines.
1388
01:24:42,680 –> 01:24:46,360
You will for sure have people who will insist
1389
01:24:46,360 –> 01:24:49,720
that a humanoid robot have rights, for sure.
1390
01:24:50,520 –> 01:24:54,040
Even though it’ll be very, very proven that it is all if this, then
1391
01:24:54,040 –> 01:24:57,760
that. And they won’t care. There’s no ingenuity if you will.
1392
01:24:57,760 –> 01:25:01,360
Right, right, right. They won’t care. And I think there’s going to be crossover between
1393
01:25:01,360 –> 01:25:05,040
those people who insist that the robot has rights and people who
1394
01:25:05,040 –> 01:25:08,690
are using. Let me be very blunt.
1395
01:25:08,690 –> 01:25:11,050
And, and, and because we have a lot of kids who listen to the show, so
1396
01:25:11,050 –> 01:25:14,370
let me be blunt and baseline on this. Who are using
1397
01:25:15,570 –> 01:25:19,250
objects for base physical pleasure. Okay. There’s gonna be
1398
01:25:19,250 –> 01:25:22,610
crossover between those two groups. Immense crossover.
1399
01:25:23,490 –> 01:25:26,290
Oh, okay. So, yeah, if I’m hearing you correctly,
1400
01:25:27,810 –> 01:25:31,530
all the red pill incel folks who are
1401
01:25:31,530 –> 01:25:34,130
going to see this as an outlet so that way they don’t have to deal
1402
01:25:34,130 –> 01:25:37,710
with. They’re going to be the people who are both pushing and very
1403
01:25:37,710 –> 01:25:41,310
much against. Yes. At the same time. That. Yeah, I.
1404
01:25:41,310 –> 01:25:44,190
Yeah, I can see all the same time. All the same time. So that’s one.
1405
01:25:44,190 –> 01:25:47,710
That’s one. This is my significant other. And this is my
1406
01:25:47,710 –> 01:25:50,710
toy. Exactly. Yeah. Yeah. So that’s gonna be one group of people.
1407
01:25:51,350 –> 01:25:55,150
Notoriously, the Joaquin Phoenix movie. Her. Right. Okay. It’s gonna be those
1408
01:25:55,150 –> 01:25:58,750
folks. I’ve not seen that movie. Do you recommend it? I’ve seen the
1409
01:25:58,750 –> 01:26:02,430
trailer for it and I, I
1410
01:26:02,430 –> 01:26:06,210
thought it was kind of like a book that I was like, oh, I
1411
01:26:06,290 –> 01:26:09,690
know where this is. That’s a reality I don’t want to, I don’t need
1412
01:26:09,690 –> 01:26:13,490
to explore. That’s fair. I don’t need to explore that. I. I’ve
1413
01:26:13,490 –> 01:26:16,010
had enough of my challenges in those spaces. I don’t need to go down that
1414
01:26:16,010 –> 01:26:19,570
road. Thank you. Then you’ll have another group of people.
1415
01:26:19,650 –> 01:26:22,850
This is the second group. The second group of people will be those
1416
01:26:23,170 –> 01:26:27,010
who are not supportive, but
1417
01:26:27,010 –> 01:26:30,760
they’re also not violently opposed. They’re the wait and see people. I think that’s. The
1418
01:26:30,760 –> 01:26:34,400
vast majority of folks want to wait and see. Because if it’s
1419
01:26:34,400 –> 01:26:37,760
not in my town of name your place here,
1420
01:26:38,320 –> 01:26:41,160
then it’s. Not that big of a deal. It ain’t that big a deal. That’s
1421
01:26:41,160 –> 01:26:44,560
something for those people in Chicago or Detroit or New York or LA or
1422
01:26:44,560 –> 01:26:47,840
wherever. It could be a brave new world over there.
1423
01:26:48,400 –> 01:26:52,240
I’m still hanging out. Yeah. And they’re taking the robot that’s, like, picking
1424
01:26:52,240 –> 01:26:54,720
the corn, you know, Like. Like there’s going to be.
1425
01:26:56,810 –> 01:26:59,690
I mean, like, I. I just don’t think it’s possible to not have a blind
1426
01:26:59,690 –> 01:27:03,450
spot. Like, I really don’t. And that’s your third group of folks. So
1427
01:27:03,450 –> 01:27:07,130
your third group of folks are going to be the people who are
1428
01:27:09,130 –> 01:27:12,210
all the way. Well, they’re all the way on the other end of the continuum.
1429
01:27:12,210 –> 01:27:15,410
I think it’s going to take a while for the reflective people to shake out. And
1430
01:27:15,410 –> 01:27:18,730
I think they’ll probably shake out from a weird combination of these three groups.
1431
01:27:18,890 –> 01:27:21,770
But you’ll have your people all the way over on the other end of the
1432
01:27:21,770 –> 01:27:22,890
spectrum who will say,
1433
01:27:26,210 –> 01:27:29,890
no, robots don’t deserve rights. What the hell are we doing?
1434
01:27:30,370 –> 01:27:34,050
And they’re going to sabotage stuff left and right. Case in point,
1435
01:27:34,370 –> 01:27:37,890
during the most recent LA riots, one of the more
1436
01:27:37,970 –> 01:27:40,370
fascinating videos that I saw
1437
01:27:41,490 –> 01:27:45,170
was of people who were rioting against ICE raids
1438
01:27:45,410 –> 01:27:49,010
in LA destroying self-driving
1439
01:27:49,090 –> 01:27:52,060
vehicles in downtown LA.
1440
01:27:53,580 –> 01:27:57,140
That’s your third group of people. And I feel like the third
1441
01:27:57,140 –> 01:28:00,700
group in. And I’m trying to,
1442
01:28:00,860 –> 01:28:03,580
I’m trying to look at trends and patterns and seeing if this like holds true
1443
01:28:03,580 –> 01:28:07,100
as I zoom out on it. I think that the third group is where
1444
01:28:08,620 –> 01:28:12,420
all the, all the, all the potential bad actors hide. Oh yeah,
1445
01:28:12,420 –> 01:28:16,140
for sure. So remember I said, remember the question I asked you, like, would you
1446
01:28:16,140 –> 01:28:19,450
walk up to a building with a robot gatekeeper? The vast majority of people, by
1447
01:28:19,450 –> 01:28:21,730
the way, will be in that middle group of. I don’t know, I don’t care.
1448
01:28:21,730 –> 01:28:25,130
It’s not an issue until it shows up to me. Right, right, right. When it
1449
01:28:25,130 –> 01:28:27,970
shows up to you and you have to walk up there, the third group of
1450
01:28:27,970 –> 01:28:31,450
people will be the people hiding in the car around the corner trying to get
1451
01:28:31,450 –> 01:28:34,130
into the robot via Wi-Fi to hack it.
1452
01:28:35,010 –> 01:28:38,610
Yeah, exactly, right, exactly. You know, or the 14
1453
01:28:38,610 –> 01:28:42,050
year old in Arkansas who has nothing to do but hack drones all day
1454
01:28:42,210 –> 01:28:45,570
because he’s bored. And by the way, I’m sorry ladies, it will be a he.
1455
01:28:46,290 –> 01:28:49,650
It always is. I’m sorry, it just is. Yeah,
1456
01:28:50,050 –> 01:28:52,450
because you know, there, there, there’s going to be.
1457
01:28:53,730 –> 01:28:56,890
Well, and then what’ll happen is like, it’s like, okay, cool, you have a robot
1458
01:28:56,890 –> 01:28:59,810
gatekeeper. Here’s our robot AI cold-email bot.
1459
01:29:00,130 –> 01:29:03,530
Bingo. And then, and then, and then there’ll be a
1460
01:29:03,530 –> 01:29:07,330
cultural answer of like harder, harder spam filters. Right?
1461
01:29:07,570 –> 01:29:11,410
And then. But like, what’s fascinating to me is like as a
1462
01:29:11,410 –> 01:29:15,140
salesperson, I analyze how this
1463
01:29:15,220 –> 01:29:18,940
weird tit for tat, you know, we’re gonna win. No, we’re gonna win.
1464
01:29:18,940 –> 01:29:22,660
We’re gonna win. No, we’re gonna win. Happens, you know, because
1465
01:29:22,660 –> 01:29:26,180
it’s like all the bad stuff that everyone hates about salespeople,
1466
01:29:26,260 –> 01:29:30,020
most of it happens because most consumers
1467
01:29:30,420 –> 01:29:34,180
are very uncomfortable being direct with salespeople. I’m not a lead for
1468
01:29:34,180 –> 01:29:37,380
you. I would never buy this. And you can waste your time if you want,
1469
01:29:37,380 –> 01:29:41,210
but I’m pretty set. Okay. And so if,
1470
01:29:41,370 –> 01:29:45,170
if, you know, if, if it seems like a maybe, I’m
1471
01:29:45,170 –> 01:29:48,730
supposed to follow up on a maybe. Right? But if you could tell me, right,
1472
01:29:48,970 –> 01:29:51,970
that, hey, and this is why I try to be very direct with people. Like
1473
01:29:51,970 –> 01:29:55,530
if you cold call me for like lead gen. Hey, no, I’m not ever buying
1474
01:29:55,530 –> 01:29:59,210
lead gen from anybody. Ever. Right? I, I don’t believe in the time
1475
01:29:59,210 –> 01:30:03,010
waste. I just don’t. Right. So you can keep following up.
1476
01:30:03,010 –> 01:30:06,860
I am telling you, I’m an Apple guy. You’re trying to sell me PC
1477
01:30:06,860 –> 01:30:10,420
products. It is a waste of time. Right, Right. I’m a vegan and you’re trying
1478
01:30:10,420 –> 01:30:13,940
to sell me on, on half a cow. Like, just stop. Like, like
1479
01:30:14,100 –> 01:30:17,860
go find a better opportunity as opposed to thinking that you can force
1480
01:30:18,020 –> 01:30:21,660
change minds. Because that’s just not how it works. But
1481
01:30:21,660 –> 01:30:24,860
the people who don’t do the job are not the people that are building the
1482
01:30:24,860 –> 01:30:28,540
motions and building the strings and trying to find the shortcuts. Correct? Right.
1483
01:30:28,540 –> 01:30:32,040
Well, and I mean, this is where Roy
1484
01:30:32,040 –> 01:30:35,760
Batty’s speech, the Tears in the Rain speech, comes in. I’ve
1485
01:30:35,760 –> 01:30:39,320
seen things you people wouldn’t believe. So we better get ready for that,
1486
01:30:39,320 –> 01:30:42,920
folks. I think, I think Philip K. Dick called it. There are going to be
1487
01:30:42,920 –> 01:30:46,039
things in this next historical cycle
1488
01:30:46,600 –> 01:30:50,440
around embodied. I
1489
01:30:50,680 –> 01:30:54,280
hesitate to use the word intelligence. Embodied computer.
1490
01:30:54,600 –> 01:30:58,210
Algorithmic algorithm. Embodied algorithms. That’s a better term,
1491
01:30:58,450 –> 01:31:02,090
embodied algorithms. That you’re not going to believe. And it’s not going to be the
1492
01:31:02,090 –> 01:31:05,810
embodied algorithms problem. This is sort of where I get to. It’s going to be
1493
01:31:05,810 –> 01:31:09,450
our problem. It’s going to be a human problem. Okay? So
1494
01:31:09,450 –> 01:31:12,250
let’s, so let’s just look at this reality for just a minute. I’m going to,
1495
01:31:12,250 –> 01:31:15,050
I’m going to try to extrapolate this as we’re talking about it, okay? You’re a
1496
01:31:15,050 –> 01:31:18,890
founder in this new world and there’s, and there’s robots, okay? Because this happens
1497
01:31:18,890 –> 01:31:21,650
now with salespeople. And they’re like, you know what? I have a good idea.
1498
01:31:22,780 –> 01:31:25,700
But like, I just can’t seem to get anyone to buy, so I must just
1499
01:31:25,700 –> 01:31:28,820
not be good at sales. So I’m gonna go hire a salesperson because salesperson can
1500
01:31:28,820 –> 01:31:32,380
do this. Okay? So already your perception is
1501
01:31:32,380 –> 01:31:35,660
screwed, right? That you were not enough versus
1502
01:31:35,740 –> 01:31:38,940
understanding that how you communicate about your product and services
1503
01:31:39,500 –> 01:31:42,740
is really the difference maker. Right? And also just basic
1504
01:31:42,740 –> 01:31:46,420
expectations. Some people just aren’t gonna buy. Right? And so
1505
01:31:46,420 –> 01:31:49,980
then what happens is they come along and then they’re like, okay, great. I’m gonna
1506
01:31:49,980 –> 01:31:53,280
hire a bunch of appointment setters, right? And like, okay, great. How many dials do
1507
01:31:53,280 –> 01:31:56,080
they have to do and everything, right? And they start doing these things. Now what
1508
01:31:56,080 –> 01:31:59,560
happens is, because humans are humans, you have these battle royale
1509
01:31:59,560 –> 01:32:03,360
situations, right? And some of these companies do, right? And financial services is
1510
01:32:03,440 –> 01:32:06,640
a major culprit of this, right? And SaaS is becoming that way as well. It’s kind
1511
01:32:06,640 –> 01:32:09,800
of shifting a little bit in SaaS, but not quick enough in my opinion of
1512
01:32:09,800 –> 01:32:12,800
like, hey, we’re going to hire 30. Most of you are going to quit, and
1513
01:32:12,800 –> 01:32:16,360
that’s fine because we just need the one and. And we’ll be good.
1514
01:32:16,360 –> 01:32:19,890
Okay? Imagine no one quits because they’re all
1515
01:32:19,890 –> 01:32:23,530
robots, right? And like, imagine your
1516
01:32:23,530 –> 01:32:27,130
brand. Imagine your brand capital whenever you go out and you
1517
01:32:27,130 –> 01:32:30,530
program 5,000, like, SDR AIs that are just going to
1518
01:32:30,530 –> 01:32:34,290
bludgeon everybody in your market. Like, the stakes are going to be
1519
01:32:34,290 –> 01:32:37,010
bigger for brands, I would say, right? Because
1520
01:32:38,530 –> 01:32:42,330
eventually, if enough people quit from your sales
1521
01:32:42,330 –> 01:32:45,810
team, you might be like, hey, maybe I’m part of the problem,
1522
01:32:46,470 –> 01:32:50,150
right? But if no one ever quits because they’re all like these like, AI,
1523
01:32:50,150 –> 01:32:53,310
yes men. Oh, yes, boss. Yes, boss. How much more can you do? Well, I
1524
01:32:53,310 –> 01:32:57,030
don’t know where. Okay, well, you know, can you. Can you make $3,000 a
1525
01:32:57,030 –> 01:33:00,470
day versus just the 300? We’ll figure out a way, boss. Right now it’s
1526
01:33:00,470 –> 01:33:04,310
duplicating itself, these things that are happening, you know, and so it’s like
1527
01:33:04,790 –> 01:33:08,390
that. That’s gonna get. That’s my, that’s my number one
1528
01:33:08,390 –> 01:33:12,240
concern as a sales professional. Not that I’m not going to have a job, but
1529
01:33:12,640 –> 01:33:16,480
that, like, no one is going to ever hear
1530
01:33:16,800 –> 01:33:20,600
anybody else out again. Because everyone
1531
01:33:20,600 –> 01:33:23,920
is just so thinking that, you know, oh, you’re just the
1532
01:33:23,920 –> 01:33:27,639
AI. You just want to sell me. All the rapport is
1533
01:33:27,639 –> 01:33:30,640
fake, John. You just want to network with me so that way you can sell
1534
01:33:30,640 –> 01:33:33,040
me. Actually, you’re kind of jumping ahead a little bit because I don’t know if
1535
01:33:33,040 –> 01:33:36,480
you’re tall enough to ride this ride. Well, and I, I think there will be
1536
01:33:36,480 –> 01:33:38,920
a fourth group in there, which I kind of hold back a little bit on
1537
01:33:38,920 –> 01:33:41,400
because I don’t know where this fourth group is going to come from. I think
1538
01:33:41,400 –> 01:33:45,070
there’ll be a fourth group of folks, right? So you’ll have your
1539
01:33:45,070 –> 01:33:48,430
crossover “these robots have rights” slash
1540
01:33:48,430 –> 01:33:51,990
baseline pleasure people of all kinds, by the way. Not just physical,
1541
01:33:51,990 –> 01:33:55,790
but, okay, tied into identity and all that. Because
1542
01:33:55,790 –> 01:33:59,110
all the, all the same entrepreneurs that are like, hey, why would I hire Americans
1543
01:33:59,110 –> 01:34:02,830
when I can just like get Filipinos for $3 an hour? And it’s okay
1544
01:34:02,830 –> 01:34:06,230
because the cost of living is just so different. Like, I mean, there is the
1545
01:34:06,230 –> 01:34:09,110
realm of cost of living differences and then there was the realm of you justifying
1546
01:34:09,110 –> 01:34:12,770
being a cheap asshole or just. Right. Just exploiting people for
1547
01:34:12,770 –> 01:34:15,930
whatever. But now you can exploit an object.
1548
01:34:17,290 –> 01:34:21,010
Right? So it’s all better. And they’re not
1549
01:34:21,010 –> 01:34:24,730
humans. Right, Because. Because that’s the other concern. Right, because since I am the
1550
01:34:24,730 –> 01:34:28,450
science fiction fantasy reader, right, like, is it
1551
01:34:28,450 –> 01:34:32,210
beyond the realm of possibility that eventually, like a. More of a
1552
01:34:32,210 –> 01:34:35,810
cyborg kind of thing of like, you know, like the Neuralink and things like
1553
01:34:35,810 –> 01:34:39,600
that, of it moving together and everything. Like, like at some point we’re gonna have
1554
01:34:39,600 –> 01:34:42,880
to have some really deep conversations about this and those
1555
01:34:42,880 –> 01:34:46,440
conversations are going to be easier
1556
01:34:46,840 –> 01:34:50,160
or harder. Kind of like the civil rights movement based upon some of the decisions
1557
01:34:50,160 –> 01:34:53,960
that we’re making, I think in these early stages of this thing. Yeah,
1558
01:34:53,960 –> 01:34:57,600
yeah. So I don’t, I, I think those, I think those kinds of
1559
01:34:57,600 –> 01:35:01,360
conversations are going to come up. I do not think they will come
1560
01:35:01,360 –> 01:35:04,600
up in this next cycle that’s going to start probably around
1561
01:35:04,600 –> 01:35:08,400
2030 or so, 2035 at the latest, and
1562
01:35:08,400 –> 01:35:11,680
then run out, run out 20 years to 2055 or even
1563
01:35:11,680 –> 01:35:15,360
2060. It won’t start until what I call
1564
01:35:15,360 –> 01:35:18,960
the 21st century’s version of the Summer of Love. It’ll be
1565
01:35:18,960 –> 01:35:22,760
my kids in middle age that’ll have. My youngest son, who’s 8, in
1566
01:35:22,760 –> 01:35:25,360
middle age, will have to deal with the answers to that question
1567
01:35:26,400 –> 01:35:30,220
because we still have. To your point, we’re right at the cusp of
1568
01:35:30,220 –> 01:35:34,020
all of the nonsense. And so there’s certain decisions that have to be
1569
01:35:34,020 –> 01:35:37,260
made here that are going to, to your point about poker, lead to one of
1570
01:35:37,260 –> 01:35:40,180
18 different outcomes or a multiplicity
1571
01:35:40,340 –> 01:35:44,140
downstream, you know, for, for. As pervasive as it is in my,
1572
01:35:44,140 –> 01:35:47,900
in my inbox and in my. And in my feeds and my, in
1573
01:35:47,900 –> 01:35:51,620
my world, if you will. Like, we’re still
1574
01:35:51,620 –> 01:35:55,270
very much on the bleeding edge of most of this stuff in conversation,
1575
01:35:55,660 –> 01:35:59,500
right? Oh, absolutely, yeah. Like most people don’t know the difference
1576
01:35:59,500 –> 01:36:03,300
between like an LLM and automation. No, they don’t. You know,
1577
01:36:03,300 –> 01:36:06,740
like, like at all. Well, well, when you walk around, this is that fourth group
1578
01:36:06,740 –> 01:36:09,780
of people. So the fourth group of people. And I think that I keep going
1579
01:36:09,780 –> 01:36:12,020
back to this idea that the fourth group of people are the ones that are
1580
01:36:12,020 –> 01:36:15,300
going to keep the other three groups anchored because those other three groups are going
1581
01:36:15,300 –> 01:36:18,380
to be, for lack of a better term, they’re going to be a minority report.
1582
01:36:19,020 –> 01:36:22,590
The group that’s going to. The group that’s going to keep
1583
01:36:23,230 –> 01:36:26,990
those other three groups anchored is going to be the same group of people that’s
1584
01:36:26,990 –> 01:36:30,550
always kept that other group of people anchored. And it’s going to be the people
1585
01:36:30,550 –> 01:36:34,150
that do not. I hate to. No, no, I don’t hate to say
1586
01:36:34,150 –> 01:36:37,230
this. All the people who live their lives online
1587
01:36:37,950 –> 01:36:41,070
miss this. The vast
1588
01:36:41,630 –> 01:36:45,350
majority of people that you need to engage with, you talk
1589
01:36:45,350 –> 01:36:49,030
about sales that you need to engage with, are not in
1590
01:36:49,030 –> 01:36:52,470
online spaces. And they do not care what
1591
01:36:52,470 –> 01:36:55,750
happens there because their lives
1592
01:36:56,790 –> 01:37:00,550
are still driven by human to human interaction
1593
01:37:01,190 –> 01:37:04,830
from literally the time they get up in the morning to the time they go
1594
01:37:04,830 –> 01:37:08,230
to bed at night. Case in point, your garbage man.
1595
01:37:09,510 –> 01:37:12,310
Case in point, your road crew.
1596
01:37:13,520 –> 01:37:17,040
Case in point, your, your.
1597
01:37:17,280 –> 01:37:21,120
Yeah, your, your. Well, to a certain degree, your academic educator in
1598
01:37:21,120 –> 01:37:24,760
certain types of schooling situations. Okay, hold on a second. And those people,
1599
01:37:24,760 –> 01:37:27,600
Those people are the ones. Those people are the people who will keep the other
1600
01:37:27,600 –> 01:37:30,080
three groups anchored because they. Empathy box.
1601
01:37:31,280 –> 01:37:34,560
There you go. That’s the empathy box, right? It’s that.
1602
01:37:34,640 –> 01:37:38,080
Okay, so this is fascinating, right? I spend a lot of time online. Most of
1603
01:37:38,080 –> 01:37:41,700
my business comes from my. From my online network,
1604
01:37:41,700 –> 01:37:44,420
if you will. I had a pretty good local network, but whenever I started my
1605
01:37:44,420 –> 01:37:47,460
business, I had this moment of realizing that, like, I didn’t want to be limited
1606
01:37:47,460 –> 01:37:51,260
to just Fort Worth, Texas, because there wasn’t enough business for
1607
01:37:51,260 –> 01:37:53,820
me, right. And I didn’t want to make compromises. I wanted to work with who
1608
01:37:53,820 –> 01:37:57,260
I wanted to work with. So I started. Okay, this worked
1609
01:37:57,260 –> 01:38:01,020
locally. Let’s see if it works digitally, right? And so I just ran the same
1610
01:38:01,020 –> 01:38:04,740
play in a digital format. And so most of a lot of my relationships are
1611
01:38:04,740 –> 01:38:07,590
with people that I’ve never met in person before, right. Which is kind of a
1612
01:38:07,590 –> 01:38:10,750
weird version of the future to live in, right? It feels kind of Heinlein-
1613
01:38:10,750 –> 01:38:14,590
esque, right? Or, or Demolition Man. Right, right. Video
1614
01:38:14,590 –> 01:38:18,110
calls like we’re living that reality. And so
1615
01:38:18,750 –> 01:38:21,950
it. There is this weird thing, right? So I bought this
1616
01:38:22,190 –> 01:38:25,590
lathe off a guy off of Facebook Marketplace. Right. And it was, you know, in
1617
01:38:25,590 –> 01:38:28,630
this, I mean, it was about an hour away. A friend of mine went, went
1618
01:38:28,630 –> 01:38:31,390
with me because it’s the real world and you know, you see the stories and
1619
01:38:31,390 –> 01:38:34,050
everything else like this, you know, and also it’s a big ass lathe. I’m gonna
1620
01:38:34,050 –> 01:38:37,090
have to put in the back of a truck and everything. So I take a
1621
01:38:37,090 –> 01:38:40,170
friend with me, the guy had it listed for like 300 and everything. And we
1622
01:38:40,170 –> 01:38:43,090
show up and I start talking about the fact that I’m new to this whole
1623
01:38:43,090 –> 01:38:45,690
thing and then I don’t know anything about this at all. And I’m brand new
1624
01:38:45,690 –> 01:38:49,369
and I got to learn. I’m asking him questions and everything. The guy goes, let
1625
01:38:49,369 –> 01:38:51,650
me knock a hundred dollars off of this thing because you’re going to have to go buy
1626
01:38:51,650 –> 01:38:55,410
some things that are essentially going to cost you about 100 bucks. And I just
1627
01:38:55,410 –> 01:38:59,250
want you to start, right? And for like three days I had this like half
1628
01:38:59,250 –> 01:39:02,990
life of like humanity of like, man, there’s some
1629
01:39:02,990 –> 01:39:06,390
genuinely nice people out here, you know, so, you know,
1630
01:39:06,630 –> 01:39:10,470
there’s this weird kind of thing for myself of like, I’m trying to do more
1631
01:39:10,470 –> 01:39:14,110
of that now of, you know, that’s part of the reason for the road trip.
1632
01:39:14,110 –> 01:39:16,910
Like let’s just go hang around and be around some people, like some real actual
1633
01:39:16,910 –> 01:39:20,430
people who probably have an agenda, but it doesn’t start
1634
01:39:20,430 –> 01:39:23,670
with how they’re messaging me, you know what I’m saying? And stuff like this. And
1635
01:39:23,670 –> 01:39:27,510
so it’s just been fascinating that as I have really great
1636
01:39:27,510 –> 01:39:31,100
relationships with people that I’ve never met before, you and I have never met in
1637
01:39:31,100 –> 01:39:34,900
person ever, right? Like, I have people that I’ve done five figure business
1638
01:39:34,900 –> 01:39:38,380
deals with that don’t, don’t live on the same continent as me, right? But by
1639
01:39:38,380 –> 01:39:41,860
that same token, I also know that like by, by choosing to spend time with
1640
01:39:41,860 –> 01:39:45,660
humans. Oh, that connection is faster, it’s richer, it’s deeper. There’s
1641
01:39:45,660 –> 01:39:48,340
so much more to it in this, right? So it’s one of those things to
1642
01:39:48,340 –> 01:39:51,380
where don’t get lured in by
1643
01:39:51,780 –> 01:39:55,460
scale, right? There is this messy aspect to it
1644
01:39:56,250 –> 01:39:59,890
that is going to fundamentally change what, what your
1645
01:39:59,890 –> 01:40:03,570
reality is. Right? And the nerds are the people who want to remove
1646
01:40:03,570 –> 01:40:07,330
all the human connection, right? And, and I say this as a nerd who
1647
01:40:07,330 –> 01:40:10,810
thought that in the beginning I could just brass-tacks my way through. Well, let
1648
01:40:10,810 –> 01:40:14,650
me just build something that you can’t say no to. Oh my God. Right?
1649
01:40:15,210 –> 01:40:18,090
But I go back to this thing of, one of my favorite things is if
1650
01:40:18,090 –> 01:40:20,890
you can ask the question, am I being narcissistic? You’re fundamentally not.
1651
01:40:21,780 –> 01:40:25,500
Right? And I think about this like, like with, with AI and automation
1652
01:40:25,500 –> 01:40:29,060
and everything else, like, why am I using this? Like am I
1653
01:40:29,220 –> 01:40:32,940
am, you know, and it’s weird in entrepreneurial circles, right? Because people
1654
01:40:32,940 –> 01:40:36,700
talk about automation and virtual assistants and delegation and
1655
01:40:36,700 –> 01:40:40,180
just delegate all the work that you don’t want to do. Okay,
1656
01:40:40,340 –> 01:40:43,220
like, at what standard? Because, like, I think.
1657
01:40:44,020 –> 01:40:47,350
I think that’s where the, where, where the line is going to be. What. What
1658
01:40:47,350 –> 01:40:50,990
is your standard? And you’re going to have all these people that are building
1659
01:40:50,990 –> 01:40:54,350
very, very salesy, pushy sales robots, who hate
1660
01:40:54,430 –> 01:40:57,870
whenever a sales robot reaches out to them, but they think that they’re completely
1661
01:40:57,870 –> 01:41:01,710
justified as the tyrant. Just like I did whenever I was thinking that, that my
1662
01:41:01,710 –> 01:41:04,990
way was okay because I’m not using it for nefarious ends. It’s all, it’s all
1663
01:41:04,990 –> 01:41:08,070
the. It’s all these other guys that are the problem. We’re going to have that
1664
01:41:08,070 –> 01:41:11,350
on a much bigger scale because it’s not going to be one person doing their
1665
01:41:11,350 –> 01:41:14,800
own reach. It’s one guy who owns 5,000 robots that are sending 3,000
1666
01:41:14,800 –> 01:41:18,280
messages a day. They’re talking to other
1667
01:41:18,280 –> 01:41:21,760
robots, sending him responses. Should we go into robot
1668
01:41:21,760 –> 01:41:23,880
shorthand? Jeez.
1669
01:41:26,680 –> 01:41:30,399
Well, I’ve got to turn the corner here. Yeah. This has been a
1670
01:41:30,399 –> 01:41:33,880
delightful conversation, though. This has been awesome. This is. This has been great.
1671
01:41:34,360 –> 01:41:37,800
No, I gotta. Let me turn the corner here. Let’s get ready to close.
1672
01:41:40,450 –> 01:41:44,290
Three things I think will absolutely be true in the upcoming fourth turning,
1673
01:41:44,290 –> 01:41:48,090
and to John’s point, or the first turning, we’re going
1674
01:41:48,090 –> 01:41:50,810
into a secular high. And I don’t know what it’s going to look like, I
1675
01:41:50,810 –> 01:41:54,650
have no idea. But it will look different than the last secular high that we
1676
01:41:54,650 –> 01:41:58,130
went through in the mid 20th century, that we all sort of hearken back to
1677
01:41:58,210 –> 01:42:01,930
the high that Philip K. Dick was born into and Ray Bradbury was born
1678
01:42:01,930 –> 01:42:05,330
into and Robert Heinlein was born into. And then that shaped them
1679
01:42:06,270 –> 01:42:09,590
as they went into an awakening period. And an awakening period always
1680
01:42:09,590 –> 01:42:13,390
involves, at least in our country,
1681
01:42:13,790 –> 01:42:17,430
it always involves an almost spiritual
1682
01:42:17,430 –> 01:42:21,030
like or religious pursuit of a higher consciousness. We saw that
1683
01:42:21,030 –> 01:42:23,310
during the transcendental movement
1684
01:42:24,830 –> 01:42:28,670
in the 19th century. And then we saw it 100 years later, which, by the
1685
01:42:28,670 –> 01:42:31,990
way, led into the Civil War. And then we also see this a little bit
1686
01:42:31,990 –> 01:42:35,350
later in the 1960s, which of course led into the civil rights movement in the
1687
01:42:35,350 –> 01:42:37,580
20th century and our own version of
1688
01:42:38,380 –> 01:42:42,060
internecine civil strife. And I think it will arrive 100
1689
01:42:42,060 –> 01:42:44,700
years from now, right on time, in about 2060.
1690
01:42:45,820 –> 01:42:49,420
I won’t be around maybe to see. Or if I will be around to see
1691
01:42:49,420 –> 01:42:51,700
it, I’ll be a very old man and no one will care what I have
1692
01:42:51,700 –> 01:42:55,340
to say, but it will
1693
01:42:55,420 –> 01:42:58,940
show up and how that war will be
1694
01:42:58,940 –> 01:43:02,620
fought and who will have thoughts on that war?
1695
01:43:04,040 –> 01:43:07,160
Well, those people are the sons and daughters of folks like
1696
01:43:07,880 –> 01:43:11,720
myself and John who are having these conversations and thinking about these things
1697
01:43:12,040 –> 01:43:15,720
right now. Based off of what we were reading from the last
1698
01:43:16,600 –> 01:43:20,200
time we all went through this. I do
1699
01:43:20,200 –> 01:43:24,000
believe fundamentally that human beings and. And work
1700
01:43:24,000 –> 01:43:27,560
done by humans, for humans, with other humans. I hate to be the bearer of
1701
01:43:27,560 –> 01:43:31,170
bad news, but that ain’t going nowhere. It’s almost built.
1702
01:43:31,170 –> 01:43:34,610
It’s almost as if it was built into the structure of reality itself. And so
1703
01:43:34,610 –> 01:43:37,370
far the engineers and the nerds have not been able to pull apart the structure
1704
01:43:37,370 –> 01:43:40,930
of reality itself. They could barely understand the structure of
1705
01:43:40,930 –> 01:43:44,649
reality itself, much less pull it apart. So
1706
01:43:44,649 –> 01:43:48,090
we’ll still have work. I’m not one of these people who believes that
1707
01:43:48,330 –> 01:43:51,690
we’ll have a bunch of people, surplus people floating around with nothing to do who
1708
01:43:51,690 –> 01:43:55,380
will engage in civil strife. I don’t believe that for a moment. People will
1709
01:43:55,380 –> 01:43:59,180
find something to do. Even with all our robot
1710
01:43:59,180 –> 01:44:02,500
helpers, I think we will also still have
1711
01:44:02,900 –> 01:44:06,260
Christianity, Islam and Judaism. I don’t think those are going away.
1712
01:44:06,500 –> 01:44:09,820
And the reason why is because we. We need to have an
1713
01:44:09,820 –> 01:44:12,820
eschatology of meaning. We need to have a
1714
01:44:13,220 –> 01:44:17,060
structure of meaning that goes beyond the inevitable ceiling. I
1715
01:44:17,060 –> 01:44:20,460
love this line that I wrote in the script, that goes beyond the inevitable ceiling that
1716
01:44:20,460 –> 01:44:24,280
an artificial transhuman paradigm will eventually hit. I
1717
01:44:24,280 –> 01:44:28,120
guarantee they will hit a ceiling. The cybernetics folks. To John’s
1718
01:44:28,120 –> 01:44:31,960
point, I do think there will be a Ray Kurzweil weird merging, or
1719
01:44:31,960 –> 01:44:35,000
at least a proposal for it of man and machine. And it will only look
1720
01:44:35,000 –> 01:44:38,440
weird because it’s not something that I’m used to. But that will be a
1721
01:44:38,440 –> 01:44:42,160
proposal that will be put forth. It’s already being put forth
1722
01:44:42,160 –> 01:44:45,440
by several people. And I don’t know if it will come in the form of
1723
01:44:45,440 –> 01:44:49,280
a downloading of quote unquote consciousness. We already understand very little about
1724
01:44:49,280 –> 01:44:52,810
consciousness, much less how it differs from an algorithm
1725
01:44:54,490 –> 01:44:58,250
or is the same. So the big
1726
01:44:58,250 –> 01:45:01,770
three. The big three religions aren’t going anywhere. And if you’re not a
1727
01:45:01,770 –> 01:45:05,530
religious person listening to this, I hate to tell you they’re not
1728
01:45:05,530 –> 01:45:09,370
going anywhere because they’re containers of belief that have been around
1729
01:45:09,450 –> 01:45:12,650
longer than the robots and longer than our current problems.
1730
01:45:13,130 –> 01:45:16,980
They’re sturdy for a reason. Then the third
1731
01:45:16,980 –> 01:45:19,940
thing that will be true is that
1732
01:45:22,020 –> 01:45:25,700
there will be resistance to all of these changes. For sure
1733
01:45:25,860 –> 01:45:28,820
that will be true. Some of it will come in the form
1734
01:45:29,620 –> 01:45:33,340
of loud, obstreperous, and
1735
01:45:33,340 –> 01:45:37,180
grating objection, like riots and
1736
01:45:37,180 –> 01:45:41,020
people destroying the Waymos in LA. And some will
1737
01:45:41,020 –> 01:45:44,600
be very quiet and subtle, sort of like the
1738
01:45:44,600 –> 01:45:48,440
Essenes back in the Old Testament intertestamental period,
1739
01:45:49,160 –> 01:45:51,640
who, in the Middle East, withdrew into the
1740
01:45:56,360 –> 01:46:00,200
mountains. And they created something called the Dead Sea Scrolls that
1741
01:46:00,200 –> 01:46:03,880
showed up, you know, 2,000 years later. You’ll have those
1742
01:46:03,880 –> 01:46:07,560
folks, you’ll have the preppers, what we would call in our time in America, the
1743
01:46:07,560 –> 01:46:11,230
preppers. You’ll have those folks who will say, no,
1744
01:46:11,390 –> 01:46:14,630
I don’t want to be involved in your world. I don’t want to roll the
1745
01:46:14,630 –> 01:46:16,190
dice and I don’t want to play.
1746
01:46:18,270 –> 01:46:21,390
Those are going to be the things that are going to be
1747
01:46:21,390 –> 01:46:24,750
consistent even through the next cycle with
1748
01:46:25,310 –> 01:46:28,750
whatever is coming down the pike.
1749
01:46:30,830 –> 01:46:34,030
And so I remember the words from Ecclesiastes 1:9:
1750
01:46:35,000 –> 01:46:38,720
the thing that has been is a thing which shall be, and that which is
1751
01:46:38,720 –> 01:46:42,440
done is that which shall be done. And of course, there is no new thing,
1752
01:46:43,640 –> 01:46:46,760
no genuinely new thing under the sun.
1753
01:46:50,920 –> 01:46:54,760
Final thoughts, John, today on Do Androids
1754
01:46:54,760 –> 01:46:58,320
Dream of Electric Sheep? Actually, maybe that’s the final question.
1755
01:46:58,320 –> 01:47:00,200
Do androids dream of electric sheep?
1756
01:47:06,130 –> 01:47:09,930
I think that when they do, we need to be ready to
1757
01:47:09,930 –> 01:47:13,570
have all those connecting conversations, because I think it’s just a matter
1758
01:47:13,570 –> 01:47:17,250
of time before we get to go do that.
1759
01:47:17,330 –> 01:47:21,090
There. There was this very weird thing yesterday at the.
1760
01:47:21,090 –> 01:47:24,090
At the museum, and I would love to just kind of talk about this. My
1761
01:47:24,090 –> 01:47:27,050
daughter was going through it and she. And it was. It’s called. It’s called the
1762
01:47:27,050 –> 01:47:30,870
Drake formula. Are you familiar with the Drake formula? Okay,
1763
01:47:30,870 –> 01:47:33,470
so it’s a science fiction formula. I don’t know if it comes from science fiction
1764
01:47:33,470 –> 01:47:37,150
books or just science and theory, but it’s around the idea of. It’s a formula
1765
01:47:37,150 –> 01:47:40,990
that you can kind of run yourself through, which kind of extrapolates
1766
01:47:40,990 –> 01:47:44,710
whether or not you’re optimistic or pessimistic about other intelligent life
1767
01:47:44,710 –> 01:47:48,390
in the universe. Right. Are we alone or not? Right. And it was
1768
01:47:48,390 –> 01:47:51,950
interesting because my daughter was going through it and it’s telling her, like,
1769
01:47:51,950 –> 01:47:54,990
hey, on average, we usually see one planet
1770
01:47:55,850 –> 01:47:59,170
capable of bearing life in a system. Right? And then it asks the
1771
01:47:59,170 –> 01:48:02,170
question, what do you think? Just like
1772
01:48:02,490 –> 01:48:06,250
0.01%, 0.01%, 0.01% on
1773
01:48:06,330 –> 01:48:10,090
all these things. Right. And I appreciated how, how it was framing the question.
1774
01:48:10,170 –> 01:48:14,010
Right. It was talking about like, hey, you know, science shows
1775
01:48:14,090 –> 01:48:17,890
that as soon as the environment was right on Earth, life
1776
01:48:17,890 –> 01:48:21,370
started to move in that direction, scientifically speaking. Right.
1777
01:48:21,940 –> 01:48:25,180
So bearing that in mind. Right? What do you think that if you have the
1778
01:48:25,180 –> 01:48:28,980
right components for intelligent life just from a scientific perspective
1779
01:48:29,620 –> 01:48:33,340
that you end up with this as a, as a product just like
1780
01:48:33,340 –> 01:48:37,100
0.01%, right? And I’m like, you’re so fixated
1781
01:48:37,100 –> 01:48:40,660
on humans as humans, in our
1782
01:48:40,660 –> 01:48:43,780
specialness, our, our
1783
01:48:43,780 –> 01:48:47,420
preciousness that it’s leading you to like I
1784
01:48:47,420 –> 01:48:51,110
think gauge a little low and of course this is a conversation that’s very hard
1785
01:48:51,110 –> 01:48:54,950
to have with a, you know, 13-about-to-be-14-year-old girl, but
1786
01:48:54,950 –> 01:48:58,670
it was just such a beautiful thing. Like, I am so optimistic about these
1787
01:48:58,670 –> 01:49:02,510
things that it also makes me thoughtful and considerate about
1788
01:49:02,510 –> 01:49:05,869
the conversations that are necessary around it as opposed to it’s never going to happen,
1789
01:49:05,869 –> 01:49:09,590
never going to happen. Oh crap. How do I deal with
1790
01:49:09,590 –> 01:49:10,910
this kind of situation?
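—
For reference, the Drake formula John describes here is usually written as the Drake equation, a back-of-the-envelope product for estimating N, the number of detectable civilizations in our galaxy. A standard statement of it:

\[
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
\]

where R* is the average rate of star formation, f_p the fraction of stars with planets, n_e the number of potentially habitable planets per such system, f_l, f_i, and f_c the fractions of those on which life, intelligence, and detectable technology emerge, and L the average lifetime of a detectable civilization. The 0.01% figures mentioned above are guesses plugged into those fractions at the museum exhibit, not established values.
—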
1791
01:49:17,080 –> 01:49:20,800
And with that I’d like to thank you for listening to the Leadership
1792
01:49:20,800 –> 01:49:23,640
Lessons From the Great Books podcast today. And with that, well hey,
1793
01:49:25,640 –> 01:49:26,360
we’re out.









