The Light & the Dark Side of AI for Business Growth with David G. Ewing, CEO of MOTIV
This week, Lee Murray hosts a deep dive into the future of business alongside David G. Ewing, CEO of Motiv. Together, they explore how AI is reshaping customer experiences and job roles, emphasizing the importance of visionary thinking in an ever-evolving landscape. David advocates for stringent data privacy protections and delves into the potential of AI-powered marketing strategies.
Have a guest recommendation, question, or just want to connect?
Go here: https://www.harvardmurray.com/exploring-growth-podcast
Connect on LinkedIn:
Lee - https://www.linkedin.com/in/leehmurray
David - https://www.linkedin.com/in/davidgewing/
#ExploringGrowthPodcast #BusinessInsights #EntrepreneurialJourney #BusinessExpansion #EmbracingGrowth #MarketingTalk #DavidGEwing #MOTIV #AI #FutureofAI
1
00:00:00,060 --> 00:00:02,490
I promise I won't get too political
or controversial in your show,
2
00:00:02,490 --> 00:00:05,340
but I'm thinking like,
how long before AI President? Right.
3
00:00:05,370 --> 00:00:08,010
Yeah, here's the thing.
Like, we're all decent.
4
00:00:08,250 --> 00:00:11,130
I'm not going to say anything about
either candidate, but most people are
5
00:00:11,130 --> 00:00:14,730
dissatisfied with their choices. Yes.
And I was thinking about this
6
00:00:14,730 --> 00:00:16,140
and I was joking with my son.
I'm like,
7
00:00:16,140 --> 00:00:18,600
what would an AI president be like?
And then I realized, well,
8
00:00:18,600 --> 00:00:21,270
you know what that would be like.
Let's suppose we trained an AI
9
00:00:21,300 --> 00:00:24,300
to be like Lincoln, right?
And grab some FDR and some Washington
10
00:00:24,300 --> 00:00:27,930
and all your favorites and throw them
all into the AI soup and say,
11
00:00:27,930 --> 00:00:34,620
here's our president. Okay.
Welcome back to Exploring Growth.
12
00:00:34,620 --> 00:00:38,220
I'm very excited about this episode,
and I'll tell you why in
13
00:00:38,220 --> 00:00:40,950
a minute.
But today we have the opportunity
14
00:00:40,950 --> 00:00:43,980
to hear from David G.
Ewing, CEO of Motiv.
15
00:00:44,010 --> 00:00:47,130
Welcome to the show, David.
Hey, Lee. Thanks for having me.
16
00:00:47,130 --> 00:00:51,240
It's good to be here. Yeah.
So I'm not sure exactly what I'm
17
00:00:51,240 --> 00:00:54,410
going to title this episode,
but I could title it The Future
18
00:00:54,410 --> 00:00:56,780
of Business Growth.
You know, that's probably not that
19
00:00:56,780 --> 00:00:59,540
catchy, but that's essentially
what I think that we're going to
20
00:00:59,540 --> 00:01:04,160
be talking about today. All right.
I think that the topic of our
21
00:01:04,160 --> 00:01:08,360
discussion is going to kind
of hit, you know, right on that,
22
00:01:08,720 --> 00:01:11,030
because I think that we're
going to end up really in the
23
00:01:11,030 --> 00:01:16,400
meat of the convergence of AI
and BI or business intelligence.
24
00:01:16,910 --> 00:01:19,580
But before we get into all
of that, first tell us about who
25
00:01:19,580 --> 00:01:24,290
you are and what you do. Sure.
So, my name is David G.
26
00:01:24,290 --> 00:01:26,810
Ewing, and I'm the CEO of a
company called Motiv.
27
00:01:26,810 --> 00:01:32,720
And at Motiv, we believe that the
best customer experience wins.
28
00:01:32,720 --> 00:01:35,930
And customer experience to us is
everything.
29
00:01:35,930 --> 00:01:39,680
From the very first moment
someone hears about your company,
30
00:01:39,680 --> 00:01:43,270
checks out your website or a social
media page all the way through
31
00:01:43,270 --> 00:01:47,410
the the seven phases of getting
to engage with your company.
32
00:01:47,410 --> 00:01:52,390
And finally, you know, repeat buying.
It's all of that put together and
33
00:01:52,390 --> 00:01:56,080
it's the systems and the success
strategy that you have to make
34
00:01:56,080 --> 00:01:59,710
that work ultimately, so that you
can get your customer lifetime
35
00:01:59,710 --> 00:02:04,450
value somewhere close to what its
max potential actually is. Right?
36
00:02:04,450 --> 00:02:07,960
I think every company realizes that, you know, there's so much more
37
00:02:07,960 --> 00:02:10,390
of the wallet that they could be
getting from their customers.
38
00:02:10,600 --> 00:02:14,350
And customer experience is that
emotional glue that brings that
39
00:02:14,350 --> 00:02:17,770
customer back so that they look
for reasons to do business with
40
00:02:17,770 --> 00:02:21,520
you and not with someone else.
And so that's what we focus
41
00:02:21,520 --> 00:02:27,040
on: making our customers
successful across the spectrum. Yeah.
42
00:02:27,040 --> 00:02:31,840
And so help me understand the
technical connection to Oracle.
43
00:02:31,840 --> 00:02:36,310
What's your relationship with Oracle?
So Oracle is the world's largest
44
00:02:36,310 --> 00:02:38,170
software company.
There's never been another
45
00:02:38,170 --> 00:02:40,750
software company for enterprise
business like Oracle. I mean,
46
00:02:40,750 --> 00:02:44,050
Google is huge, Microsoft is huge.
They have a lot of consumer side
47
00:02:44,050 --> 00:02:45,940
stuff.
But but Oracle is the biggest in
48
00:02:45,940 --> 00:02:47,530
business.
And so,
49
00:02:47,530 --> 00:02:52,420
what we do is we typically bring
Oracle technology wherever possible
50
00:02:52,420 --> 00:02:56,370
in to solve problems for small,
medium and enterprise companies.
51
00:02:56,370 --> 00:02:59,280
And you'd be surprised how affordable
it is when someone owns the
52
00:02:59,280 --> 00:03:03,270
entire stack from top to bottom.
So, what we like about Oracle is
53
00:03:03,270 --> 00:03:06,900
that it, it brings the security,
it brings the scalability,
54
00:03:06,900 --> 00:03:10,260
and it allows us to set up a
customer experience strategy for
55
00:03:10,260 --> 00:03:13,890
a company and then just scale
the heck out of it so that,
56
00:03:13,890 --> 00:03:16,800
they can continue to use it when
they add zeros to their,
57
00:03:16,800 --> 00:03:20,220
top line and bottom line. Love that.
Yeah, that's awesome.
58
00:03:20,220 --> 00:03:23,190
Speaking my language. And that's really kind of
59
00:03:23,190 --> 00:03:27,060
how we got connected is I think
I saw a post on LinkedIn,
60
00:03:27,060 --> 00:03:31,920
where you were talking about Oracle, getting I don't remember exactly
61
00:03:31,920 --> 00:03:37,080
how I got to it, but Oracle becoming
a partner with, OpenAI.
62
00:03:37,080 --> 00:03:41,820
And so when I saw that you were
involved in that, I was like, hey,
63
00:03:41,820 --> 00:03:45,270
this would be a great discussion
because everyone's talking about AI,
64
00:03:45,420 --> 00:03:49,980
and a lot of people don't really
understand what AI is and how it
65
00:03:49,980 --> 00:03:53,490
operates and how it works.
I've recently had a
66
00:03:53,490 --> 00:03:56,760
software developer on who
kind of explained a little bit about
67
00:03:56,760 --> 00:04:00,420
what AI is and how it works.
So we don't need to get into that.
68
00:04:00,420 --> 00:04:05,580
But I really like, you know,
this is very current as of what,
69
00:04:05,580 --> 00:04:08,180
a couple of weeks ago, maybe,
or a week ago that this happened.
70
00:04:08,180 --> 00:04:11,840
Yeah, they just announced it. Yeah.
That OpenAI decided to expand
71
00:04:11,840 --> 00:04:15,110
all of their infrastructure into
what's called the Oracle
72
00:04:15,110 --> 00:04:17,300
Cloud infrastructure.
So everybody's heard of Google
73
00:04:17,300 --> 00:04:20,120
or Amazon.
They all, everybody has these
74
00:04:20,120 --> 00:04:22,040
clouds.
And Oracle has been putting
75
00:04:22,070 --> 00:04:25,820
together a cloud for years.
And you know they would argue that
76
00:04:25,820 --> 00:04:30,800
they have the cheapest, most scalable
cloud that's highly secure. So.
77
00:04:30,980 --> 00:04:34,790
Seeing that validation come from
OpenAI was a big move for them.
78
00:04:35,150 --> 00:04:38,000
Yeah. Yeah, for sure.
And, as I understand it,
79
00:04:38,000 --> 00:04:40,100
I'm very much on the outside of
this conversation because I'm
80
00:04:40,100 --> 00:04:42,350
just observing things happening
in the headlines.
81
00:04:42,350 --> 00:04:45,440
But,
it's really hard to get a connection
82
00:04:45,440 --> 00:04:49,340
to OpenAI in that capacity.
Like, it's really a I mean,
83
00:04:49,340 --> 00:04:50,450
Oracle's up there.
I mean, they're going to
84
00:04:50,450 --> 00:04:52,790
probably get one of the first
seats to have the discussion.
85
00:04:52,790 --> 00:04:56,020
But I mean,
they're not just giving these out.
86
00:04:56,920 --> 00:05:00,700
No, no, but, you know,
the thing that I found is that,
87
00:05:00,700 --> 00:05:04,750
Oracle is hungry because it's
kind of like, you remember those old
88
00:05:04,750 --> 00:05:07,630
Avis commercials where they're like,
we're number two, so we try harder.
89
00:05:07,630 --> 00:05:10,240
I kind of think you see the same
thing with Oracle. They're.
90
00:05:10,270 --> 00:05:12,250
They're behind.
They're not the name that you
91
00:05:12,250 --> 00:05:14,200
think of when you think of cloud,
you think of Google,
92
00:05:14,200 --> 00:05:17,350
you think of Microsoft Azure cloud.
You think of all these other clouds.
93
00:05:17,350 --> 00:05:19,030
Yeah.
So Google or sorry,
94
00:05:19,030 --> 00:05:23,710
Oracle is just so scrappy right now.
And so it's easy actually to get
95
00:05:23,710 --> 00:05:27,100
an account and get started.
And, and so if some of the
96
00:05:27,100 --> 00:05:29,500
fish want to jump in the boat,
they, they're gonna,
97
00:05:29,530 --> 00:05:32,350
they're gonna bring them in.
But, but what's nice about
98
00:05:32,350 --> 00:05:35,980
their cloud is, yeah, you can set
it up and, it is pretty cheap.
99
00:05:35,980 --> 00:05:40,030
And, and at the same time, it's
got the enterprise scalability and
100
00:05:40,030 --> 00:05:43,180
horsepower that they're looking for.
So I think they're going to rapidly
101
00:05:43,180 --> 00:05:46,780
make up some ground in this space
is my prediction. I like that.
102
00:05:46,780 --> 00:05:51,490
So now how does that affect the
growth and offerings for your
103
00:05:51,490 --> 00:05:54,160
company?
Well, you know it's really
104
00:05:54,160 --> 00:05:58,300
interesting because last year we had
the idea that we would build an AI
105
00:05:58,300 --> 00:06:02,110
product, and we built one.
And what it was is anyone who was a
106
00:06:02,110 --> 00:06:06,550
marketer and got sick and tired of
writing all of their marketing email,
107
00:06:06,570 --> 00:06:09,390
which most digital marketers are
sick and tired of writing.
108
00:06:09,420 --> 00:06:12,660
Yeah, what you could do instead
is you could just pop open this
109
00:06:12,660 --> 00:06:16,620
little ChatGPT sort of window and go
through a little chat bot and say,
110
00:06:16,620 --> 00:06:20,460
I want this style of marketing email,
this is my target.
111
00:06:20,460 --> 00:06:22,710
This is, you know,
the three value propositions and
112
00:06:22,710 --> 00:06:25,860
this is the persona I want.
And then bang, we'd fire that off
113
00:06:25,860 --> 00:06:30,060
to ChatGPT, and it would write this
beautiful email, you know, that you
114
00:06:30,060 --> 00:06:32,640
needed and it would bring it right
into your marketing application.
115
00:06:32,640 --> 00:06:34,650
And you could just, you know,
looks good.
116
00:06:34,650 --> 00:06:37,950
Press send and away you go.
Right. Saves you a lot of time.
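[For readers curious what a flow like the one David describes might look like, here is a minimal sketch using the OpenAI Python SDK. It is not Motiv's actual product; the prompt fields, model name, and function name are illustrative assumptions.]

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def draft_marketing_email(style, target, value_props, persona):
        # Assemble the brief the way the little chat window collects it:
        # email style, target audience, value propositions, and persona.
        brief = (
            f"Write a {style} marketing email aimed at {target}. "
            f"Speak to this persona: {persona}. "
            f"Work in these value propositions: {', '.join(value_props)}."
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            messages=[
                {"role": "system", "content": "You are a marketing copywriter."},
                {"role": "user", "content": brief},
            ],
        )
        # The generated email would then be dropped into the marketing
        # application for review before anyone presses send.
        return response.choices[0].message.content

    print(draft_marketing_email(
        style="friendly product update",
        target="existing mid-market customers",
        value_props=["saves time", "cuts cost", "scales with growth"],
        persona="a busy operations manager",
    ))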
117
00:06:38,220 --> 00:06:42,420
And nobody bought it.
And it was really fascinating to say,
118
00:06:42,420 --> 00:06:45,780
well, why did this product fail?
And so as we went and we interviewed
119
00:06:45,780 --> 00:06:48,960
all the people who we thought
would just love this thing,
120
00:06:48,960 --> 00:06:52,530
it all came down to data privacy
and security and the fact that
121
00:06:52,530 --> 00:06:55,320
most people don't understand it.
Because even if all I'm saying to
122
00:06:55,320 --> 00:06:59,490
ChatGPT is I have this customer base,
which is so public, right.
123
00:06:59,490 --> 00:07:02,790
And I'd like to do this email,
which is so public, and I'd like
124
00:07:02,790 --> 00:07:05,460
to accentuate these benefits,
which will be in the email.
125
00:07:05,460 --> 00:07:09,360
This will all be public.
The fear of oh my gosh, like,
126
00:07:09,360 --> 00:07:13,020
I can't imagine putting this
into the public domain.
127
00:07:13,560 --> 00:07:17,400
Even though that doesn't
make any rational sense,
128
00:07:17,730 --> 00:07:20,840
stopped that product cold in its tracks.
And what we realized is that,
129
00:07:20,840 --> 00:07:25,310
yes, it's it's going to be about
trust and privacy.
130
00:07:25,310 --> 00:07:27,470
And, you know,
what are you sharing with an AI?
131
00:07:27,470 --> 00:07:30,920
And who else can see that?
Because the AI is going to, you know,
132
00:07:30,920 --> 00:07:33,920
bring that into its model anytime
you interact with it. Yeah.
133
00:07:33,920 --> 00:07:38,240
So yeah, which is why I think,
you know, I tell my clients, and
134
00:07:38,240 --> 00:07:41,930
my clients are very, you know,
sort of part of their DNA is to
135
00:07:41,930 --> 00:07:44,750
not put any private,
actual private information into
136
00:07:44,750 --> 00:07:48,890
ChatGPT, because then it's out there,
right, you don't want that.
137
00:07:49,190 --> 00:07:52,460
The privacy thing is really
interesting because I think,
138
00:07:52,700 --> 00:07:56,120
I think people are very aware of
keeping their stuff private,
139
00:07:56,120 --> 00:07:59,480
but this kind of goes to this
other conversation we were having
140
00:07:59,480 --> 00:08:05,360
about privately held data and then
data as a whole for a company.
141
00:08:05,840 --> 00:08:08,140
And so I, you know,
I've been talking to some clients
142
00:08:08,140 --> 00:08:11,380
and colleagues about this for some time now.
143
00:08:11,380 --> 00:08:14,800
And, you know,
when we were talking about this in
144
00:08:14,800 --> 00:08:18,220
our intro call to decide what
are we going to talk about today?
145
00:08:18,790 --> 00:08:23,290
I got really excited because
this is the place where I really
146
00:08:23,290 --> 00:08:27,340
feel like there's a huge
convergence between what data we
147
00:08:27,340 --> 00:08:29,980
have in our organizations.
Obviously,
148
00:08:29,980 --> 00:08:32,140
the bigger the organization,
the more data you're going to have.
149
00:08:32,470 --> 00:08:36,520
So it potentially could benefit you
more, to be able to call on it.
150
00:08:36,520 --> 00:08:40,360
And the way we interact with that
data so that you have AI, right.
151
00:08:40,360 --> 00:08:42,640
So I don't know, I just had this
sort of shower moment,
152
00:08:42,640 --> 00:08:45,490
shower thought the other day,
and it happened before we got on
153
00:08:45,490 --> 00:08:48,970
our call, which gave me even more
clarity for this discussion.
154
00:08:48,970 --> 00:08:53,470
But I thought maybe, maybe I'm just
dumb and maybe just coming to this.
155
00:08:53,470 --> 00:08:57,280
But, you know, the internet gives
us all of this information, and
156
00:08:57,280 --> 00:09:01,030
99% of the things we need to
know or want to know are out there.
157
00:09:01,030 --> 00:09:03,820
And people are continually
creating new information, a lot of
158
00:09:03,820 --> 00:09:07,300
misinformation, too. There's that.
But at this point,
159
00:09:07,480 --> 00:09:12,490
the problem has been accessing the
information in a low labor way,
160
00:09:12,910 --> 00:09:19,030
in a way that serves our need pretty
immediately and, and effectively.
161
00:09:19,030 --> 00:09:23,160
And now that we have ChatGPT and
we have other AI tools,
162
00:09:23,760 --> 00:09:28,230
we're able to essentially have a
conversation with the data. Right.
163
00:09:28,230 --> 00:09:32,100
And so it came to me, I was like,
this is really where we're headed is
164
00:09:32,340 --> 00:09:34,980
and, you know, when I said it
to myself,
165
00:09:34,980 --> 00:09:38,670
I was like, well, obviously,
I mean, that's what ChatGPT is.
166
00:09:38,670 --> 00:09:42,090
But it wasn't so obvious
to me at first, you know.
167
00:09:42,090 --> 00:09:46,740
So I'm curious your thoughts.
So you've hit an important nail
168
00:09:46,740 --> 00:09:50,940
right on the head.
And that is when it comes to AI.
169
00:09:50,970 --> 00:09:55,830
It is all about the data.
And yes, you know,
170
00:09:55,830 --> 00:09:58,350
there's this wealth of information,
the whole internet out there.
171
00:09:58,350 --> 00:10:02,010
And, you know, the idea or promise
of an AI is that it can
172
00:10:02,010 --> 00:10:05,100
somehow get your arms around it.
But, you know, when we think about
173
00:10:05,100 --> 00:10:08,670
it, what's going to happen?
The next thing that happens in AI
174
00:10:08,670 --> 00:10:11,550
is we've now gone to public AI.
And the way I like to think of
175
00:10:11,550 --> 00:10:14,670
this is, you know, if you've ever
been to a karaoke bar, right,
176
00:10:14,670 --> 00:10:19,440
you go to a karaoke bar, it's public.
And if you want to get up and sing,
177
00:10:19,440 --> 00:10:21,570
you don't get to choose who
you get to sing in front of.
178
00:10:21,570 --> 00:10:23,400
You're singing in front of everyone.
Yeah. That's right. Right. Yeah.
179
00:10:23,400 --> 00:10:26,730
And whether that's your boss or
whether it's to a bunch of strangers
180
00:10:26,730 --> 00:10:29,970
you don't have total control over,
you're just singing. Right?
181
00:10:29,970 --> 00:10:32,690
And then, you know,
the next generation of karaoke was,
182
00:10:32,690 --> 00:10:35,840
oh, we can go rent like a private
room with just me and my friends,
183
00:10:35,840 --> 00:10:38,420
and I can sing in here,
and it's sort of safe because now I'm
184
00:10:38,420 --> 00:10:41,210
just singing with my friends, right?
And that's kind of what's going
185
00:10:41,210 --> 00:10:44,450
to happen to AI.
We're going from the big open karaoke
186
00:10:44,450 --> 00:10:47,420
bar to my private little AI that,
you know,
187
00:10:47,420 --> 00:10:50,090
I still have all the songs, right?
I can still take the internet
188
00:10:50,090 --> 00:10:52,760
and download it into my AI and
train it somehow.
189
00:10:52,760 --> 00:10:55,670
But now I'm going to throw my own
private stuff in here, and I'm
190
00:10:55,670 --> 00:10:57,650
going to keep it private for me.
And I'm gonna have my own
191
00:10:57,650 --> 00:11:02,300
private little AI karaoke room.
And that's that's what I think
192
00:11:02,300 --> 00:11:05,180
happens next.
And so that's what, you know,
193
00:11:05,180 --> 00:11:08,270
this whole thing that,
this partnership between Oracle and
194
00:11:08,690 --> 00:11:12,260
OpenAI, I think it really makes that easy and possible so that
195
00:11:12,260 --> 00:11:15,740
everybody can have their private
karaoke room of AI and,
196
00:11:15,740 --> 00:11:19,220
and now we can do all sorts of
interesting things because to your
197
00:11:19,220 --> 00:11:22,480
point, it's all about the data.
So what kind of data are we
198
00:11:22,480 --> 00:11:24,100
going to feed in there that
wouldn't be public?
199
00:11:24,100 --> 00:11:26,680
Well, you know,
if you're a company that has,
200
00:11:26,680 --> 00:11:30,820
you know, wearable technology or
large equipment with IoT sensors
201
00:11:30,820 --> 00:11:35,080
on it, Internet of Things sensors,
this stream of information is
202
00:11:35,080 --> 00:11:37,630
going to feed in, and you don't
want that in public, right?
203
00:11:37,630 --> 00:11:41,590
Those are your customers',
you know, sensors or whatever.
204
00:11:41,590 --> 00:11:44,590
You know your equipment,
but you can throw that into your AI.
205
00:11:44,590 --> 00:11:47,620
And what does that enable?
Well, one of the coolest things that
206
00:11:47,620 --> 00:11:52,450
I'm excited about is, what if
service went the opposite direction?
207
00:11:52,450 --> 00:11:55,540
Right now we have a problem,
and we call our customer support
208
00:11:55,540 --> 00:11:58,660
agent at our vendor and say,
I have a problem, right?
209
00:11:58,660 --> 00:12:01,600
And then they fix it.
What if they called you?
210
00:12:01,780 --> 00:12:04,210
You know what if because they're
looking at the IoT stuff,
211
00:12:04,210 --> 00:12:08,440
they go, whoa, you know,
Lee's whatever is out of whack.
212
00:12:08,440 --> 00:12:11,410
You know, we gotta
let them know that it's a problem
213
00:12:11,410 --> 00:12:13,420
and that we're already on it,
and that we're going to send the
214
00:12:13,420 --> 00:12:16,180
replacement part doodad,
check up whatever it is that we're
215
00:12:16,180 --> 00:12:19,960
gonna do. AI can do that.
Whereas we could never have service
216
00:12:19,960 --> 00:12:23,350
agents being that proactive and
scaling like that. Right, right.
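[A rough, hypothetical sketch of that "service goes the other direction" idea: a private AI watches the stream of IoT readings and opens a proactive ticket when something drifts out of range. The device names, metrics, and thresholds below are made up for illustration.]

    # Hypothetical example: flag out-of-range IoT readings and notify the
    # customer before they ever have to call support.
    NORMAL_RANGE = {"vibration_mm_s": (0.0, 7.1), "temp_c": (10.0, 85.0)}

    def open_proactive_ticket(device_id, metric, value):
        # In a real system this would call the vendor's CRM/service API and
        # tell the customer a checkup or replacement part is already on the way.
        print(f"[proactive service] {device_id}: {metric}={value} out of range; "
              f"ticket opened and customer notified.")

    def check_reading(device_id, metric, value):
        low, high = NORMAL_RANGE[metric]
        if not (low <= value <= high):
            open_proactive_ticket(device_id, metric, value)

    # A small stream of readings; the second one is "out of whack".
    for device_id, metric, value in [
        ("press-42", "vibration_mm_s", 3.2),
        ("press-42", "vibration_mm_s", 9.8),
    ]:
        check_reading(device_id, metric, value)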
217
00:12:23,350 --> 00:12:27,640
So private AI is going to make
that happen.
218
00:12:27,640 --> 00:12:29,740
You know,
and so I think it's things like
219
00:12:29,740 --> 00:12:33,090
that get me really excited. Yeah.
Even to the point where take it to
220
00:12:33,090 --> 00:12:36,510
the next step where they just go
ahead and solve it and, you know,
221
00:12:36,510 --> 00:12:39,420
either report on it or don't report
on it, but they're solving stuff
222
00:12:39,420 --> 00:12:43,950
before even having to wait for it to
be prompted to give service.
223
00:12:43,950 --> 00:12:48,600
I think you could have the next
level of serving your customer come,
224
00:12:48,600 --> 00:12:51,120
come to fruition there.
And I think you could probably even
225
00:12:51,120 --> 00:12:55,050
take it one more step, in that I
think what's going to be natural is
226
00:12:55,050 --> 00:12:58,380
that the customer can solve their own
problems so much quicker than having
227
00:12:58,380 --> 00:13:02,220
to go to someone else to solve it.
You know, I, I have this I have this
228
00:13:02,220 --> 00:13:07,830
sort of, train of thought that
I think where it's going is,
229
00:13:07,830 --> 00:13:13,260
you'll end up having this,
this person that works at a company,
230
00:13:13,260 --> 00:13:15,210
potentially internally at the
company,
231
00:13:15,210 --> 00:13:18,570
depending on the relationship of what
they're buying and or the vendors,
232
00:13:18,570 --> 00:13:21,600
the partnership or whatever,
and they're called the implementer
233
00:13:21,600 --> 00:13:25,620
or the knowledge worker or the
integrator or, you know, there's
234
00:13:25,620 --> 00:13:27,540
lots of different words you could
use to describe this person, but
235
00:13:27,540 --> 00:13:32,070
they're the person that understands
fully how to utilize the tools.
236
00:13:32,340 --> 00:13:39,120
with AI baked in, in that domain.
And they're the person that is
237
00:13:39,120 --> 00:13:44,580
really could be, if they're internal,
the key to customer success, customer
238
00:13:44,580 --> 00:13:49,130
service, you know, and so on.
And so what I think it...
239
00:13:49,550 --> 00:13:52,400
So let's stop there.
Let me get your ideas on that.
240
00:13:52,640 --> 00:13:56,450
So I like to think of this as kind
of how's it going to like roll
241
00:13:56,450 --> 00:13:59,660
out the next couple of years.
So I think 2025 these are
242
00:13:59,840 --> 00:14:04,310
realistic scenarios, right,
where people are having proactive
243
00:14:04,310 --> 00:14:10,460
AIs contacting humans right now.
Let's go to 2026 or maybe 2027.
244
00:14:10,460 --> 00:14:13,820
I think that those proactive AI
services aren't contacting
245
00:14:13,820 --> 00:14:16,280
humans anymore.
I think they're contacting AIs on
246
00:14:16,280 --> 00:14:20,570
the other side to, you know,
handle that because when we can go
247
00:14:20,600 --> 00:14:25,880
AI to AI, suddenly a lot of things
speed up and, you know, and then
248
00:14:25,880 --> 00:14:28,910
that brings up another one.
So, you know,
249
00:14:28,910 --> 00:14:32,930
just thinking a couple of years out,
not next year, I think by 2027.
250
00:14:32,930 --> 00:14:35,590
Here's another thing that is likely
to happen is that if marketers
251
00:14:35,590 --> 00:14:41,950
are really good with AI, then vendor AI will be searching
252
00:14:41,950 --> 00:14:45,310
for information and interacting
with the selling AI.
253
00:14:45,340 --> 00:14:48,550
I'm pretty sure that all decisions
will still be made by humans,
254
00:14:48,550 --> 00:14:51,790
and it'll still be a human to
human contact, but the information
255
00:14:51,790 --> 00:14:55,090
gathering, comparison,
needs analysis, all of that stuff,
256
00:14:55,270 --> 00:14:58,120
you know, a lot of that what I
would just call grunt work.
257
00:14:58,270 --> 00:15:03,340
Yeah, I think in 2027 that goes AI to
AI. So service goes AI to AI.
258
00:15:03,340 --> 00:15:07,600
And I think selling goes AI to AI, which
is going to be kind of interesting.
259
00:15:07,930 --> 00:15:12,160
Interesting. Yeah.
So we'll see how that goes.
260
00:15:12,160 --> 00:15:14,890
But I think that's, that's kind of
what we're preparing for right now.
261
00:15:15,130 --> 00:15:21,070
So, you know, if I extrapolate
this in my mind out to, okay,
262
00:15:21,070 --> 00:15:25,990
there are a lot of jobs or roles
inside these companies that are going
263
00:15:25,990 --> 00:15:28,930
to be what you might call flattened,
right, where a lot of the
264
00:15:28,930 --> 00:15:32,200
time that they spent doing all
these things is going to go away.
265
00:15:32,200 --> 00:15:35,620
They're still going to be needed
for 20, 30, 40% of the work.
266
00:15:36,550 --> 00:15:39,130
And I was just talking to
somebody about this just this morning.
267
00:15:39,130 --> 00:15:43,360
It's like if you have
true domain expertise in
268
00:15:43,360 --> 00:15:47,640
people inside your organization,
let's say CFO, CEO.
269
00:15:47,670 --> 00:15:52,470
They're C-suite leaders, VPs, depending on the size and
270
00:15:52,470 --> 00:15:56,550
nature of your organization,
then those people automatically
271
00:15:56,550 --> 00:16:00,090
have to become so much more
specialized in what they do.
272
00:16:00,090 --> 00:16:03,330
And they're essentially
going to say, okay, well,
273
00:16:03,330 --> 00:16:06,180
I'm not spending time on
any of this other stuff anymore.
274
00:16:06,180 --> 00:16:09,360
It's gone away over the course
of weeks or months.
275
00:16:09,360 --> 00:16:11,730
It's just going to happen.
So now what are they going to do
276
00:16:11,730 --> 00:16:15,420
with their time?
And my thought goes to okay, they'd
277
00:16:15,420 --> 00:16:21,960
go deeper into their specialty, CFO,
CEO, their role inside the domain.
278
00:16:21,960 --> 00:16:28,530
They have long-standing expertise.
I think it starts to create an
279
00:16:28,530 --> 00:16:32,700
environment where the
visionary thrives.
280
00:16:32,700 --> 00:16:37,470
So if you have a visionary that
sees what's happening,
281
00:16:37,470 --> 00:16:41,310
adapts to it quickly because other
people are going to just be forced to
282
00:16:41,310 --> 00:16:45,120
adapt, then you're going to have,
you know, generational issues and
283
00:16:45,120 --> 00:16:48,900
there's lots of factors at play.
But if you get the right person,
284
00:16:48,900 --> 00:16:52,170
that has a specialty in a domain.
They realize what's happening.
285
00:16:52,170 --> 00:16:56,520
They adapt to it.
That organization can essentially
286
00:16:56,520 --> 00:17:02,630
be the Starlink,
the Tesla type of organization.
287
00:17:02,630 --> 00:17:06,800
They can grow rapidly and
they create such a gap between
288
00:17:06,800 --> 00:17:10,280
them and their competitors.
So, Lee, I think
289
00:17:10,280 --> 00:17:12,830
I've got an idea for what you
should call this conversation.
290
00:17:12,860 --> 00:17:16,790
You call this the light and the
dark of the future of AI?
291
00:17:16,790 --> 00:17:21,020
I think, I love it because,
because, yes, on the dark side
292
00:17:21,620 --> 00:17:24,710
there are going to be some people
who are just swept away by the
293
00:17:24,710 --> 00:17:28,280
rapid amount of change that we're
going to see here, right? Yeah.
294
00:17:28,310 --> 00:17:32,720
And it's going to be scary, you know,
even if the AI is
295
00:17:32,720 --> 00:17:35,150
totally benevolent and everybody
loves to talk about how it might not
296
00:17:35,150 --> 00:17:38,600
be, but I think it will be, and I
think it will still be terrifying
297
00:17:38,630 --> 00:17:43,220
to see how fast everything goes,
because that is the light side.
298
00:17:43,250 --> 00:17:46,840
The light side of this is it's
going to go so much faster than
299
00:17:46,840 --> 00:17:50,530
it's gone before.
I mean, it was funny.
300
00:17:50,530 --> 00:17:53,110
I was just reading a,
some narrative about,
301
00:17:53,110 --> 00:17:56,590
Ulysses S Grant, and I was reading
the Ron Chernow book on this and just
302
00:17:56,590 --> 00:17:59,440
watching the pace of how things move.
You know,
303
00:17:59,860 --> 00:18:03,280
Ulysses S Grant is taking over,
and he's beginning the Civil War,
304
00:18:03,280 --> 00:18:06,460
and he's, you know, starting to march
down the Mississippi and just the
305
00:18:06,460 --> 00:18:09,760
ponderous speed at which everything
is going compared with now.
306
00:18:09,760 --> 00:18:12,490
And he's, of course,
lightning fast for the day. Yes.
307
00:18:12,490 --> 00:18:16,720
You know, you just you compare that
to now and it's, it's just hilarious.
308
00:18:16,720 --> 00:18:19,090
Like how it seems quaint,
you know that.
309
00:18:19,090 --> 00:18:21,370
And yet it's this horrendous
civil war, right.
310
00:18:21,580 --> 00:18:25,810
The thing is, is that we are
about to go into a phase where things
311
00:18:25,810 --> 00:18:30,340
can go so much faster that even
the things we did in 2015 or 2020
312
00:18:30,340 --> 00:18:34,540
will seem hilariously slow and,
and, and that's really the
313
00:18:34,540 --> 00:18:37,210
light side of this is we just
will go so much faster.
314
00:18:37,210 --> 00:18:39,790
And to your point about everyone
will become a visionary.
315
00:18:39,790 --> 00:18:44,170
I totally think that's true,
because what I've noticed with AI
316
00:18:44,200 --> 00:18:48,910
is, until further notice,
I can't imagine how it flips the
317
00:18:48,910 --> 00:18:52,840
switch and becomes proactive and
self-directed in a way that's
318
00:18:52,840 --> 00:18:56,920
going to really matter to me, where, you know,
319
00:18:56,920 --> 00:18:59,430
I can get how any entity could
be self-directed for itself,
320
00:18:59,430 --> 00:19:02,280
but how can it be self-directed
for me? That part I can't see.
321
00:19:02,280 --> 00:19:05,100
So I think that what's going to
happen is, yes,
322
00:19:05,100 --> 00:19:09,330
most people are going to become far
more visionary than they are now.
323
00:19:09,330 --> 00:19:11,490
I think younger people will find
that easier.
324
00:19:11,490 --> 00:19:14,970
Older people will be flexing,
you know, pathways that they're
325
00:19:14,970 --> 00:19:18,060
not used to flexing.
But I do believe that all of us
326
00:19:18,060 --> 00:19:21,780
could be more visionary than we are.
And I think it will be
327
00:19:21,780 --> 00:19:25,320
a freeing feeling for most
people once they've gotten over
328
00:19:25,320 --> 00:19:27,930
the trepidation and fear.
And I think
329
00:19:27,930 --> 00:19:31,530
that'll be the light side of
this: speed and vision. Yes.
330
00:19:31,770 --> 00:19:37,980
And I think that even though it
seems super clear, like what
331
00:19:37,980 --> 00:19:40,080
we're laying out here, right.
This thesis of what's going to
332
00:19:40,080 --> 00:19:45,030
happen, it seems super clear,
I think there still will be a
333
00:19:45,030 --> 00:19:49,860
lot of trepidation getting to that
clarity, because you have businesses
334
00:19:49,860 --> 00:19:54,900
that are still operating in 1990,
and somehow they're still in business
335
00:19:54,900 --> 00:19:58,950
and they still have customers. Right, and you have generational,
336
00:19:59,070 --> 00:20:01,860
type of thinking,
where someone comes out of high
337
00:20:01,860 --> 00:20:05,070
school and they don't like the way
it was done for the last 50 years,
338
00:20:05,070 --> 00:20:07,980
they're going to go opposite.
So you have that coming into the
339
00:20:07,980 --> 00:20:12,170
market,
seeing the upside and the downside.
340
00:20:12,170 --> 00:20:15,020
Like you were saying,
the dark and the light of AI, there's
341
00:20:15,020 --> 00:20:18,170
a whole generation or generations of
people that are going to see the
342
00:20:18,170 --> 00:20:21,950
dark side or what they think is
the dark side, how they've been.
343
00:20:21,950 --> 00:20:25,160
It's either been propagandized to
them legitimately,
344
00:20:25,160 --> 00:20:28,760
or they've propagandized themselves and
they're going to go away from it,
345
00:20:28,760 --> 00:20:33,380
you know, so it's this
sort of convergence of
346
00:20:33,380 --> 00:20:37,100
oceanic waters at the same time,
you know, it's not all going to just
347
00:20:37,100 --> 00:20:40,850
be up into the right or, you know,
a straight line from A to B.
348
00:20:40,850 --> 00:20:46,190
So I think it has the
potential to do what we're saying
349
00:20:46,190 --> 00:20:51,110
it's going to do, but I think it's
going to have the water so much
350
00:20:51,110 --> 00:20:55,430
more muddy getting there because
you have so many factors at play.
351
00:20:55,670 --> 00:21:00,460
It's not just a few factors that
have to change. I agree with that.
352
00:21:00,460 --> 00:21:04,540
And, you know, one thing that I heard not too long ago was
353
00:21:04,540 --> 00:21:10,210
the age that we are at really has
so much to do with our reaction
354
00:21:10,210 --> 00:21:13,420
to this technology.
And the description I got is, look,
355
00:21:13,420 --> 00:21:16,720
anything that was invented between
the time you were born and the
356
00:21:16,720 --> 00:21:20,860
time you're like, about 14,
you just assume that's just like,
357
00:21:20,860 --> 00:21:23,860
that's just part of your life.
Like, you know, that's just as
358
00:21:23,860 --> 00:21:26,290
far as, like most kids are aware.
Like it's just been here forever.
359
00:21:26,290 --> 00:21:27,640
Like they didn't even realize it,
right?
360
00:21:27,880 --> 00:21:30,730
Anything that gets invented
after you were about 14 years
361
00:21:30,730 --> 00:21:34,720
old till about 24 or 25,
it's kind of this exciting new
362
00:21:34,720 --> 00:21:38,230
technology full of opportunity.
Yeah, yeah. Oh, how can I seize this?
363
00:21:38,230 --> 00:21:42,610
I can be part of this.
Yeah, from 25 to about 40.
364
00:21:42,640 --> 00:21:45,190
It's kind of like this.
This is a curveball.
365
00:21:45,190 --> 00:21:47,080
It's like uncomfortable,
but it's okay.
366
00:21:47,650 --> 00:21:50,830
And when you're over 40,
in general the knee-jerk
367
00:21:50,830 --> 00:21:53,620
reaction is this is something
that's interrupting my status quo.
368
00:21:53,620 --> 00:21:57,370
It needs to die right now,
you know? And it doesn't matter.
369
00:21:57,370 --> 00:22:01,030
You can go back through any portion
and realize that's everyone's knee
370
00:22:01,030 --> 00:22:04,780
jerk reaction, to anything.
So yes, I think as these
371
00:22:04,780 --> 00:22:07,840
sweeping changes occur,
we all have to be mindful.
372
00:22:07,840 --> 00:22:12,750
I am over 40 and I am realizing that,
like my initial reaction of like,
373
00:22:12,750 --> 00:22:15,510
shut this down and kill it because
it's interrupting the status quo,
374
00:22:15,510 --> 00:22:20,640
I have to resist that temptation
and figure out how
375
00:22:20,640 --> 00:22:23,160
do I go back to being in my
early 20s and think, oh,
376
00:22:23,160 --> 00:22:26,340
what is this exciting technology?
Even if it could be, you know,
377
00:22:26,340 --> 00:22:29,910
kind of disruptive to even, you know,
how I run my own company and.
378
00:22:30,030 --> 00:22:33,600
Yeah. And it's a lot.
So yeah, I think for Gen X and
379
00:22:33,600 --> 00:22:37,530
millennials, it will depend on
how adaptable they are to this.
380
00:22:37,530 --> 00:22:40,950
And then for your Gen Z and
Alpha coming up,
381
00:22:40,950 --> 00:22:44,640
how excited they get about it.
And if they don't get as excited
382
00:22:44,640 --> 00:22:49,470
about it, it will die.
Because ultimately the market
383
00:22:49,470 --> 00:22:53,400
decides on whatever happens, so if they want, you know,
384
00:22:53,400 --> 00:22:55,890
it makes sense.
And a lot of people are on board and
385
00:22:56,160 --> 00:22:58,650
we're going to move forward with it.
It really doesn't matter how
386
00:22:58,650 --> 00:23:01,620
much money is behind it
or even how much, you know,
387
00:23:01,620 --> 00:23:04,080
political clout is behind it.
If it's not going to happen,
388
00:23:04,080 --> 00:23:06,810
it's not going to happen.
So that's still out there.
389
00:23:06,810 --> 00:23:11,310
They're still there. Yeah, yeah.
But you know, I think the general
390
00:23:11,310 --> 00:23:14,820
direction is anything that
lowers cost usually wins. Yes.
391
00:23:14,820 --> 00:23:18,210
And you're right about that, because
the time savings and the cost
392
00:23:18,210 --> 00:23:22,320
savings that are coming our way are
so astronomical that, you know,
393
00:23:22,320 --> 00:23:26,180
they're going to win. Yes. We'll see.
I think the opposite of this
394
00:23:26,180 --> 00:23:29,660
conversation is, you know,
the human element,
395
00:23:29,660 --> 00:23:34,310
because a lot of what we're talking
about is becoming more specialized,
396
00:23:34,310 --> 00:23:38,150
doing less of, like,
if we went from paper to excel,
397
00:23:38,150 --> 00:23:41,450
you know, that's a big shift.
And you could do tremendously
398
00:23:41,450 --> 00:23:44,390
more with Excel than you could
on paper or with a calculator.
399
00:23:44,930 --> 00:23:49,370
And I think it's similar to that.
it's not the same, but,
400
00:23:49,370 --> 00:23:52,520
potentially an exponential shift
versus incremental.
401
00:23:53,030 --> 00:23:56,510
But the mentality is still there.
The mentality is still the same
402
00:23:56,510 --> 00:24:01,160
that the shift has to happen, and I think that the human side
403
00:24:01,160 --> 00:24:05,210
of it still is in play. Right?
So as a marketer, you know,
404
00:24:05,210 --> 00:24:10,100
in having a content marketing agency,
you know, the shift I see happening
405
00:24:10,100 --> 00:24:16,570
there in real time is the shift
to more personalized content,
406
00:24:16,570 --> 00:24:21,850
more community oriented, where, you know, even B2B brands
407
00:24:21,850 --> 00:24:26,260
are trying to create communities
around what they do and
408
00:24:26,260 --> 00:24:30,550
add value to their customer
before they become a customer.
409
00:24:30,550 --> 00:24:35,170
There's a lot more value that has to
be added up the chain versus right
410
00:24:35,170 --> 00:24:39,940
at the point of sale and beyond, you know, people want to feel
411
00:24:39,940 --> 00:24:41,770
like they're a part of something.
They want to.
412
00:24:42,160 --> 00:24:46,000
Even to the point of, you know, does
this mesh with who I am as a person?
413
00:24:46,000 --> 00:24:49,720
That's how I'm making my decisions.
So there's a huge, huge,
414
00:24:49,720 --> 00:24:54,520
human element that's happening
in parallel to this technical,
415
00:24:55,120 --> 00:24:59,530
you know, hyper efficient element
that's kind of machine driven.
416
00:25:00,220 --> 00:25:04,060
That makes it super
interesting for me. Yeah.
417
00:25:04,060 --> 00:25:07,720
And, you know, there was this
study that was done on physicians
418
00:25:07,960 --> 00:25:12,280
with AI not too long ago.
And what they found was that
419
00:25:12,280 --> 00:25:16,840
when physicians were diagnosing
patients with problems,
420
00:25:16,840 --> 00:25:20,770
and they compared their diagnosis
versus the AI's diagnosis, guess what?
421
00:25:20,770 --> 00:25:23,380
They actually held up about
422
00:25:23,470 --> 00:25:24,930
You know,
they were right in line.
423
00:25:24,930 --> 00:25:28,020
And so we're doing just fine there.
You know,
424
00:25:28,020 --> 00:25:31,770
what's really interesting is they
tested bedside manner through,
425
00:25:31,770 --> 00:25:35,910
you know, telemedicine and,
without the doctor being present.
426
00:25:35,910 --> 00:25:42,000
And the AIs had way better bedside
manner than the physicians because,
427
00:25:42,000 --> 00:25:45,510
you know, they just didn't have that.
They never got tired. Right?
428
00:25:45,510 --> 00:25:47,130
They never dealt with
any of this thing.
429
00:25:47,130 --> 00:25:49,500
So when you think about the human
element of this, I think the one
430
00:25:49,500 --> 00:25:53,820
thing that softens the blow for
all of us is that no matter how
431
00:25:53,820 --> 00:25:57,720
frustrated or angry you are or,
you know, mad at AI, you're going
432
00:25:57,720 --> 00:26:00,600
to get some benefit from it.
And it pretty much is indestructible
433
00:26:00,600 --> 00:26:04,560
in terms of its, yes, willingness to.
It's never going to, you know, recoil
434
00:26:04,560 --> 00:26:07,800
from you. Can't hurt its feelings.
You cannot hurt its feelings, you
435
00:26:07,800 --> 00:26:10,890
know, and I think that's really
funny. You know, it's funny.
436
00:26:10,890 --> 00:26:14,580
I wear one of these
wearable health devices, and
437
00:26:14,580 --> 00:26:18,930
it's got an AI component now,
where I can say, hey, how am I doing?
438
00:26:18,930 --> 00:26:20,880
What's, you know,
am I working out enough and whatnot?
439
00:26:20,880 --> 00:26:23,100
And sometimes the information it
comes back with,
440
00:26:23,100 --> 00:26:25,680
I am not happy with it.
I've been a little salty and
441
00:26:25,680 --> 00:26:27,690
it's just amazing to see it just
snap back.
442
00:26:27,690 --> 00:26:31,260
It's still my cheerful little coach,
you know, and it's, you know,
443
00:26:31,260 --> 00:26:34,170
telling me that I can still do
it, and I should listen to my body.
444
00:26:34,170 --> 00:26:38,120
And it's like, absolutely.
I can help you with that. Yeah.
445
00:26:38,390 --> 00:26:40,880
You can do it, David.
And I'm like, no, I can't. Stop.
446
00:26:40,910 --> 00:26:44,960
Leave me alone. So.
Well, you know,
447
00:26:44,960 --> 00:26:48,440
this conversation could go, you know,
probably five different directions
448
00:26:48,440 --> 00:26:52,430
from here, but I think,
just to sit on this for a
449
00:26:52,430 --> 00:26:55,400
minute, the human element.
You're right about that.
450
00:26:55,400 --> 00:27:02,360
Like the efficiency factor and even
the effectiveness factor is huge.
451
00:27:02,750 --> 00:27:06,350
for AI coming into all
of these outer worlds.
452
00:27:06,770 --> 00:27:11,870
And it will replace
the human element to some degree.
453
00:27:11,870 --> 00:27:14,660
There's a
fast food restaurant that I
454
00:27:14,660 --> 00:27:16,160
go through every once in a while,
455
00:27:16,160 --> 00:27:19,610
just because it's just dead simple.
It's like right at the end of my
456
00:27:19,610 --> 00:27:21,680
street, I won't name their name
457
00:27:21,680 --> 00:27:23,360
because I've reached out to them,
tried to work with them, and they
458
00:27:23,360 --> 00:27:25,600
haven't responded. I don't like that.
I want to work with them and
459
00:27:25,600 --> 00:27:29,170
help them. It's not Chick-fil-A,
460
00:27:29,200 --> 00:27:31,630
because Chick-fil-A is amazing.
And we actually do work with Chick-
461
00:27:31,630 --> 00:27:35,260
fil-A, and, you know, but it's
another one, right.
462
00:27:35,500 --> 00:27:39,850
And every blue moon I'll go through
there when it's like nothing,
463
00:27:39,970 --> 00:27:42,820
nothing in the refrigerator.
Like we just have nothing like
464
00:27:42,820 --> 00:27:45,310
the kids are they.
We've had pizza three nights in
465
00:27:45,310 --> 00:27:46,810
a row.
Let's just go down there and
466
00:27:46,810 --> 00:27:49,540
grab some burgers.
Well, they started implementing AI
467
00:27:49,540 --> 00:27:54,190
at the little,
you know, radio thing where you pull
468
00:27:54,190 --> 00:27:58,840
up, and it takes your order.
This was like a year ago. Maybe.
469
00:27:58,840 --> 00:28:02,800
Maybe longer.
I pull up and it starts to take
470
00:28:02,800 --> 00:28:04,270
my order.
Tell me what you want,
471
00:28:04,270 --> 00:28:06,520
and I can, you know,
I can speak in complete sentences.
472
00:28:06,520 --> 00:28:09,940
It kind of tells you the whole thing.
I was like, no way.
473
00:28:09,940 --> 00:28:13,720
I'm not doing this at all.
And so I just I didn't even give
474
00:28:13,720 --> 00:28:15,850
it my order.
I just drove up to the window
475
00:28:15,850 --> 00:28:19,000
and I said, really?
You're using AI now, like, or
476
00:28:19,000 --> 00:28:21,280
whatever this automated bot,
they're like, yeah,
477
00:28:21,280 --> 00:28:24,160
that's what they're having us do.
And I said, okay, well, I'm not
478
00:28:24,160 --> 00:28:27,760
doing that. I'll give you my order.
And she's like, okay,
479
00:28:27,790 --> 00:28:30,160
takes my order.
So, you know, the next blue moon
480
00:28:30,160 --> 00:28:33,100
comes around and we drive up
there for hamburgers and I go up
481
00:28:33,100 --> 00:28:35,350
and it's still there.
They're still using it.
482
00:28:35,350 --> 00:28:37,260
I thought they would have gotten
rid of it.
483
00:28:37,410 --> 00:28:40,260
And I was like, okay, whatever,
I'm going to give it
484
00:28:40,260 --> 00:28:43,320
the hardest order that I can think
of and see if it gets it right.
485
00:28:43,500 --> 00:28:46,590
Well, it got it perfectly right.
And at the end I said,
486
00:28:46,590 --> 00:28:49,620
can you tell me back my order to
make sure I got everything right?
487
00:28:49,620 --> 00:28:52,830
It told it back perfectly.
And I was sold.
488
00:28:52,830 --> 00:28:58,110
I was like, wow, this is crazy.
So every time I go through there,
489
00:28:58,110 --> 00:29:00,840
I'm now talking about the human
element of it.
490
00:29:00,840 --> 00:29:06,630
I'm actually kind of excited
to talk to this thing.
491
00:29:06,630 --> 00:29:09,810
Like, what? What is going on?
What is happening? Right.
492
00:29:10,020 --> 00:29:13,440
So I think that there is an
aspect to it that is for real,
493
00:29:13,440 --> 00:29:15,840
because I've experienced it.
And I think more and
494
00:29:15,840 --> 00:29:19,290
more of that is going to come,
and we'll enjoy certain parts of
495
00:29:19,290 --> 00:29:23,400
customer service at the Walmarts of
the world or wherever, that we don't
496
00:29:23,400 --> 00:29:26,790
have to interact with the human who
doesn't want to be there, right?
497
00:29:26,820 --> 00:29:28,980
But then on the other side,
I think that there's a
498
00:29:28,980 --> 00:29:33,240
human element that won't ever go
away because everyone like Covid
499
00:29:33,240 --> 00:29:35,910
is the perfect example.
We go through Covid and,
500
00:29:35,910 --> 00:29:38,730
you know, three months in,
we're all thinking, we're all going
501
00:29:38,730 --> 00:29:42,360
to stop shaking hands forever.
Like, because we just don't do
502
00:29:42,360 --> 00:29:43,860
that anymore.
Like, no,
503
00:29:43,860 --> 00:29:47,100
we're going to still shake hands with
people because this is what we do.
504
00:29:47,100 --> 00:29:52,430
This is a colloquialism at worst.
Like, this is what we do.
505
00:29:52,430 --> 00:29:54,500
We want to connect with people.
We want to hug people,
506
00:29:54,500 --> 00:29:57,410
want to touch people and tell me
something that I don't know.
507
00:29:57,410 --> 00:29:59,570
And I want to be in your world
and you be in mine.
508
00:29:59,570 --> 00:30:03,260
So I think there's that element
of it that's going to stay
509
00:30:03,290 --> 00:30:06,170
heavily in business.
And in fact, going back to the
510
00:30:06,170 --> 00:30:10,520
marketing side, building communities,
I think is going to become even
511
00:30:10,520 --> 00:30:13,670
more so, because if you look at
these younger generations,
512
00:30:13,910 --> 00:30:16,370
that's what they're looking for,
is they're looking for getting
513
00:30:16,370 --> 00:30:20,540
off of devices. Even my kids in
Gen Z and Alpha are looking to
514
00:30:20,540 --> 00:30:22,760
get off the device and go
connect with their friends.
515
00:30:23,090 --> 00:30:27,170
Go play pickleball, right, so it's like this weird
516
00:30:27,170 --> 00:30:32,480
mashup that's coming. Yeah.
And, you know, so I have a
517
00:30:32,480 --> 00:30:35,990
couple of thoughts on that.
So one is that your adoption
518
00:30:35,990 --> 00:30:41,770
of the drive-through AI
is really interesting, right?
519
00:30:41,770 --> 00:30:46,900
It is a microcosm of what's going to
happen as we adopt this, because
520
00:30:47,230 --> 00:30:50,350
you went from fear and rejection,
right?
521
00:30:50,440 --> 00:30:56,050
That was your opening move, to a
skeptical try, to confidence
522
00:30:56,050 --> 00:30:59,710
completely. All right.
Now it's what you want.
523
00:30:59,710 --> 00:31:01,900
Yeah.
So, I promise I won't get too
524
00:31:01,900 --> 00:31:04,300
political or controversial in
your show, but I'm thinking like,
525
00:31:04,300 --> 00:31:07,720
how long before AI President? Right.
Yeah. Here's the thing.
526
00:31:07,720 --> 00:31:10,090
Like, we're all decent.
I'm not going to say anything
527
00:31:10,090 --> 00:31:12,580
about either candidate,
but most people are dissatisfied
528
00:31:12,580 --> 00:31:15,580
with their choices. Yes.
And I was thinking about this
529
00:31:15,580 --> 00:31:16,960
and I was joking with my son.
I'm like,
530
00:31:16,960 --> 00:31:19,450
what would an AI president be like?
And then I realized, well,
531
00:31:19,450 --> 00:31:22,090
you know what that would be like.
Let's suppose we trained an AI
532
00:31:22,120 --> 00:31:25,120
to be like Lincoln, right?
And grab some FDR and some Washington
533
00:31:25,120 --> 00:31:28,750
and all your favorites and throw
them all into the AI soup and say,
534
00:31:28,750 --> 00:31:32,020
here's our president, you could talk to him every day,
535
00:31:32,020 --> 00:31:34,420
right?
You and I could each have a
536
00:31:34,420 --> 00:31:37,150
conversation. Every person could.
Think of the
537
00:31:37,150 --> 00:31:39,940
democracy of that.
Yeah, it could be synthesizing:
538
00:31:39,940 --> 00:31:41,710
Hey, you know. Thanks, Lee.
That's a great idea.
539
00:31:41,710 --> 00:31:43,660
I'll think about it, you know?
And, you know,
540
00:31:43,660 --> 00:31:46,540
bedside manner would be wonderful.
And I'm like, hey,
541
00:31:46,540 --> 00:31:49,810
an AI president's not as scary.
So yes, we're all at the fear
542
00:31:49,840 --> 00:31:52,590
stage of that right now. Right.
What, the nuclear codes? Oh no.
543
00:31:52,590 --> 00:31:56,070
You know, but,
I kind of think that, you know,
544
00:31:56,250 --> 00:31:59,340
how long is that going to take?
Is that a 50 year thing? Is it a.
545
00:31:59,580 --> 00:32:03,030
I think we should get hats printed
up that say, ChatGPT 2024.
546
00:32:03,060 --> 00:32:07,140
Yeah, yeah. For president. Yeah.
Probably sell for you. Yeah.
547
00:32:07,140 --> 00:32:10,260
So so we'll see how long that,
that takes to, to go.
548
00:32:10,260 --> 00:32:13,320
But, you know,
549
00:32:13,320 --> 00:32:15,660
a couple of the other things that
happen about this is, you know,
550
00:32:15,660 --> 00:32:19,740
the one thing about technologies that
I think is the best rule of thumb,
551
00:32:20,010 --> 00:32:22,980
came from Nassim Taleb,
who has this rule of thumb that says
552
00:32:22,980 --> 00:32:26,340
you should assume that every book,
every idea,
553
00:32:26,340 --> 00:32:31,380
every technology is about halfway
through its usefulness in life cycle.
554
00:32:31,380 --> 00:32:33,570
And here's what I mean by that.
Like, we all have chairs.
555
00:32:33,570 --> 00:32:36,660
How long have chairs been around?
I don't know, 50,000 years.
556
00:32:36,660 --> 00:32:38,550
I mean, they've been around
probably since we could sit
557
00:32:38,550 --> 00:32:41,430
around a campfire. Right after.
Rocks. Right after rocks.
558
00:32:41,430 --> 00:32:43,860
It's chairs, you know, so we're
probably halfway through that.
559
00:32:43,860 --> 00:32:46,200
We're probably going to have
chairs for the next 50,000 years.
560
00:32:46,200 --> 00:32:49,110
Versus a book
that came out last year.
561
00:32:49,110 --> 00:32:51,900
Most books that have come out,
they have, you know,
562
00:32:51,900 --> 00:32:54,900
they're at about the halfway point.
So any book that people are talking
563
00:32:54,900 --> 00:32:56,910
about last year, it's probably
done by next year. And that's
564
00:32:57,030 --> 00:33:01,140
literally where we get the phrase
it has a shelf life. Right? Yeah.
565
00:33:01,140 --> 00:33:04,550
You know, and so,
everything that we're talking
566
00:33:04,550 --> 00:33:06,950
about is new.
So you got to realize that a lot of
567
00:33:06,950 --> 00:33:09,140
this stuff is going to come and drop,
come and drop, and it's going to
568
00:33:09,140 --> 00:33:12,110
get replaced by something else.
And so hopefully when people are
569
00:33:12,110 --> 00:33:14,720
snickering about this conversation,
if they're listening to it in ten
570
00:33:14,720 --> 00:33:17,660
years and they're like, how wrong
were these two fools, you know.
571
00:33:17,660 --> 00:33:20,690
Yeah, it's because, yeah,
we're aware that, like,
572
00:33:20,690 --> 00:33:23,630
it's going to shift and change
in so many unimaginable ways.
573
00:33:23,990 --> 00:33:27,710
But yeah, I think to the,
to the better is what I'm thinking.
574
00:33:27,710 --> 00:33:29,690
But yes, a lot of these things
will come and go.
575
00:33:29,690 --> 00:33:32,180
We'll see how long, you know,
576
00:33:32,180 --> 00:33:37,010
private jets go before they get
replaced by something else. So. Yeah.
577
00:33:37,010 --> 00:33:40,880
Well, I think that AI for president
is a kind of interesting idea.
578
00:33:40,880 --> 00:33:44,030
I don't think we could ever get
there because none of us could
579
00:33:44,030 --> 00:33:48,620
ever agree on what that president,
the ideal president, would look like.
580
00:33:48,620 --> 00:33:51,340
Like, so we would never even get
to the standard of what we feed
581
00:33:51,340 --> 00:33:54,340
it to make it.
Well, imagine if our democracy
582
00:33:54,340 --> 00:33:57,700
was around voting for which
philosophy we wanted. Right?
583
00:33:57,700 --> 00:34:00,520
Do you want something that's a little
bit more Athenian and a little
584
00:34:00,520 --> 00:34:03,610
bit more, you know, Aristotle,
or do you want something platonic
585
00:34:03,610 --> 00:34:06,010
and a little bit more touchy feely?
And, I don't know, you could
586
00:34:06,010 --> 00:34:08,890
probably think about some different
philosophies around the direction
587
00:34:08,890 --> 00:34:11,920
we'd want the country to go.
And then that just comes back to
588
00:34:11,920 --> 00:34:15,400
the question that we have
to answer anyways, and that is
589
00:34:15,400 --> 00:34:19,210
who controls the AI. Exactly.
Because whoever
590
00:34:19,210 --> 00:34:23,170
controls it wins. Yep, yep.
That's right.
591
00:34:23,200 --> 00:34:27,370
So that's the dark side.
The very dark side. Yes.
592
00:34:27,640 --> 00:34:31,990
So yes, I'm not ready to surrender
my life to an overlord just yet.
593
00:34:34,000 --> 00:34:37,630
I don't know, I mean, given my,
you know, fast food experience,
594
00:34:37,630 --> 00:34:40,360
we might have fear at first and,
you know, trepidation.
595
00:34:40,360 --> 00:34:42,640
And then we'll be all, hey,
take it. Yeah. You know.
596
00:34:42,640 --> 00:34:44,920
And then, you know, the funny
thing is, is 100 years from now,
597
00:34:44,920 --> 00:34:47,650
they couldn't imagine the opposite
of, like, you trusted a human
598
00:34:47,650 --> 00:34:50,110
being with the nuclear codes.
Like, what is the human being?
599
00:34:50,110 --> 00:34:52,690
A human being is so fallible.
Like, they could just get mad
600
00:34:52,690 --> 00:34:55,960
one day and nuke everyone.
And so, you know, much better to
601
00:34:55,960 --> 00:35:00,280
have a stable AI, you know?
So we'll see. Amazing, okay.
602
00:35:00,280 --> 00:35:02,830
Well, we're kind of getting
to the end of the time frame I
603
00:35:02,850 --> 00:35:07,470
usually have for these episodes.
So let's kind of ramp that
604
00:35:07,470 --> 00:35:10,050
conversation down,
and I want to just quickly touch on something else.
605
00:35:10,050 --> 00:35:14,370
You're out in Austin, Texas,
and
606
00:35:14,370 --> 00:35:17,160
I hear a lot of great things about
Austin and what people are doing
607
00:35:17,160 --> 00:35:20,820
out there, and you are a part of an
organization that you just stepped
608
00:35:20,820 --> 00:35:24,960
down as president from, which is
Entrepreneurs' Organization, I think.
609
00:35:24,960 --> 00:35:29,130
Tell us a little bit about that.
Yeah. So EO is a global organization.
610
00:35:29,130 --> 00:35:32,730
So there's actually a chapter in
the city near you, Lee. Cool. All right.
611
00:35:32,730 --> 00:35:36,690
I didn't know that. Yeah. Yeah.
And so, each chapter it's a
612
00:35:36,690 --> 00:35:39,660
very fractal organization.
So the global organization is
613
00:35:39,660 --> 00:35:43,200
broken down into citywide chapters.
I was the president of the Austin
614
00:35:43,200 --> 00:35:47,790
chapter, which is about 235
entrepreneurs in Austin, Texas.
615
00:35:48,120 --> 00:35:50,580
And I've got to know all the
presidents of all the other
616
00:35:50,580 --> 00:35:54,000
chapters in Detroit and Chicago
and Phoenix and L.A. and so forth.
617
00:35:54,270 --> 00:35:57,720
And really, the mission of
the organization is to unlock the
618
00:35:57,720 --> 00:36:02,310
potential of every entrepreneur.
So what we do, we have all kinds
619
00:36:02,310 --> 00:36:05,220
of programs that really are
meant to help entrepreneurs deal
620
00:36:05,220 --> 00:36:08,940
with the three things that
will get in their way. Right.
621
00:36:08,940 --> 00:36:11,940
And the three things that will get in
their way: number one is themselves.
622
00:36:12,120 --> 00:36:15,270
And so trying to, number one,
deal with... Yeah.
623
00:36:15,270 --> 00:36:19,580
Like your fear and anxiety, your
change, your leadership foibles,
624
00:36:19,580 --> 00:36:23,270
all of the flaws that we all have as
individuals. That's the first thing.
625
00:36:23,270 --> 00:36:27,020
The second thing is trying to be
an entrepreneur and have a family.
626
00:36:27,020 --> 00:36:31,520
Right? Which is also so difficult.
It doesn't matter if you're the
627
00:36:31,520 --> 00:36:34,160
man or the woman or what kind of
relationship you have.
628
00:36:34,310 --> 00:36:36,860
If you bring entrepreneurship
into a family, you are going to
629
00:36:36,860 --> 00:36:40,400
have stresses in that. 100%.
And so that's the second
630
00:36:40,400 --> 00:36:42,290
thing that we deal with.
And then the third thing is the
631
00:36:42,290 --> 00:36:44,540
nuts and bolts of actually
running that darn business.
632
00:36:44,720 --> 00:36:48,440
And so how do you deal with like
AI and all sorts of stuff.
633
00:36:48,440 --> 00:36:53,930
In fact, you know, my AI ramp up
involved a lot of me reaching out
634
00:36:53,930 --> 00:36:56,540
to other entrepreneurs through the
network or going to different,
635
00:36:56,540 --> 00:36:58,340
different sessions.
We had a guy named Scott
636
00:36:58,340 --> 00:37:02,420
Everingham come in last September
and give a signature talk to us,
637
00:37:02,420 --> 00:37:06,580
and what he did is he sat us all
down and in 90 minutes he gave us
638
00:37:06,580 --> 00:37:12,460
the prompts so that we could go
into ChatGPT and get a $90,000
639
00:37:12,460 --> 00:37:16,270
marketing plan by the time we left.
And so in 90 minutes,
640
00:37:16,270 --> 00:37:19,000
we all had all of our personas,
we had all of our segments,
641
00:37:19,000 --> 00:37:22,480
we had marketing strategies,
we had emails, we had everything.
642
00:37:22,480 --> 00:37:24,490
Which is what gave me the idea
for that product I mentioned
643
00:37:24,490 --> 00:37:28,120
earlier in our conversation.
And it all came from that.
644
00:37:28,150 --> 00:37:31,030
Hanging out with Scott for 90
minutes. And it was amazing.
645
00:37:31,030 --> 00:37:33,400
So that's awesome.
So all of our programs are
646
00:37:33,400 --> 00:37:35,770
designed to do that.
Some of them are very confidential
647
00:37:35,770 --> 00:37:38,350
and private and very small groups.
So we get people together in groups
648
00:37:38,350 --> 00:37:41,530
of like 6 to 8, where they can deal
with some of those personal things
649
00:37:41,530 --> 00:37:43,690
that you're really not willing to
talk about with everyone else,
650
00:37:43,690 --> 00:37:46,450
and other things are wider,
and some of them are even global.
651
00:37:46,450 --> 00:37:48,880
So, you know,
there's been a lot of stuff.
652
00:37:48,880 --> 00:37:51,250
I've met all kinds of crazy,
crazy people.
653
00:37:51,250 --> 00:37:53,620
And last year I sat down
and had lunch with,
654
00:37:53,620 --> 00:37:56,890
Vicente Fox, the former president
of Mexico, came and visited us.
655
00:37:56,890 --> 00:37:59,560
That was great. That's cool.
I got to,
656
00:37:59,560 --> 00:38:03,130
pass a basketball with Magic Johnson.
That was really fun, you know,
657
00:38:03,130 --> 00:38:05,650
and, talk to him about all
these investments and all the
658
00:38:05,650 --> 00:38:07,990
things he's doing and several
other people on the way.
659
00:38:07,990 --> 00:38:10,090
So lots of once-in-a-lifetime
experiences,
660
00:38:10,090 --> 00:38:13,360
but also lots of personal growth
and education along the way.
661
00:38:13,360 --> 00:38:16,620
It's a great community and a lot
of fun. That's awesome.
662
00:38:16,620 --> 00:38:18,690
So you're stepping down as
president. Now what?
663
00:38:19,080 --> 00:38:21,360
How are you going to fill that time?
Because, you know, you have to, right?
664
00:38:21,450 --> 00:38:23,310
Well,
that's a great question.
665
00:38:23,310 --> 00:38:25,470
So, you know,
the great thing about EO is it's
666
00:38:25,470 --> 00:38:29,310
an all volunteer organization.
So, so my time was our fiscal
667
00:38:29,310 --> 00:38:33,330
year from July 1st to June 30th.
So I, you know, did
668
00:38:33,330 --> 00:38:36,210
my one year. And it was wonderful.
I had a great experience and
669
00:38:36,210 --> 00:38:39,510
handed it off to a really terrific
person for the next year.
670
00:38:39,690 --> 00:38:42,090
But you know,
what's really interesting is EO
671
00:38:42,090 --> 00:38:45,930
has a path of leadership.
And so now that I've run a local
672
00:38:45,930 --> 00:38:48,450
chapter, you know,
there's opportunities to work with
673
00:38:48,450 --> 00:38:51,450
the region and then ultimately
with global. So that's one avenue.
674
00:38:51,450 --> 00:38:53,640
But the avenue I'm going,
at least in this next year, is I'm
675
00:38:53,640 --> 00:38:56,910
going to be launching a new company, with my copious free time.
676
00:38:56,910 --> 00:39:00,270
So there'll be Motiv and then, you know, heard it here
677
00:39:00,270 --> 00:39:03,510
first on your podcast, Lee.
Yeah. Launching Content
678
00:39:03,510 --> 00:39:10,080
Lion, a software company designed to
help you create AI-driven content and
679
00:39:10,080 --> 00:39:14,250
to take your content that you have
and, put in AI searching and
680
00:39:14,250 --> 00:39:17,940
tagging so that you can disseminate
it everywhere it belongs.
681
00:39:17,940 --> 00:39:21,450
So Content Lion is going
to be out by the end of the year.
682
00:39:21,450 --> 00:39:23,550
So, according to our plans.
That's awesome.
683
00:39:23,550 --> 00:39:25,950
Well, you know,
once you get to the point of
684
00:39:25,950 --> 00:39:28,820
launching or pre-launch or whatever,
we'll have to have you back on and
685
00:39:28,820 --> 00:39:34,190
talk about that and dig into it.
Awesome. Count on it. Well, cool.
686
00:39:34,190 --> 00:39:36,470
This has been a great
conversation as I knew it would.
687
00:39:36,470 --> 00:39:39,920
And I definitely want to have
you back. We'll talk about that.
688
00:39:39,920 --> 00:39:42,020
We'll talk about Lion.
I think, you know,
689
00:39:42,020 --> 00:39:45,140
as this evolves, there are
several guests I've had on that
690
00:39:45,140 --> 00:39:48,530
we've gotten into this conversation with,
or some surrounding it, and we've
691
00:39:48,530 --> 00:39:51,350
got to have repeat discussions to
keep up with what's happening.
692
00:39:52,670 --> 00:39:56,120
Oh, absolutely. Yeah.
This, I think, is an unfolding
693
00:39:56,120 --> 00:39:59,240
day by day, week by week.
And, it's going to be wild
694
00:39:59,240 --> 00:40:01,130
to see what happens over the
next couple of years.
695
00:40:01,130 --> 00:40:05,090
So the only thing I'm
sure of is that we will look back
696
00:40:05,090 --> 00:40:08,240
and laugh at this conversation in a
very short period of time because
697
00:40:08,240 --> 00:40:12,950
it's changing so fast. Yes it is.
So if I want to send people your way,
698
00:40:12,950 --> 00:40:17,170
where do I send them? DavidGEwing.com
699
00:40:17,170 --> 00:40:19,450
is my main site.
I have a newsletter there that I
700
00:40:19,450 --> 00:40:22,030
send out to people about wondrous
things that are going on in
701
00:40:22,030 --> 00:40:25,630
customer experience and AI, and my
other thoughts on leadership.
702
00:40:25,630 --> 00:40:28,420
So that's great. Also on LinkedIn,
703
00:40:28,420 --> 00:40:31,810
always love to link in with people.
So, David G. Ewing, you can find me
704
00:40:31,810 --> 00:40:34,690
on LinkedIn, and it's
/in/davidgewing.
705
00:40:34,780 --> 00:40:37,420
Love it. Hey thanks again.
This has been great.
706
00:40:37,420 --> 00:40:40,360
And we'll definitely have you back.
Great. Thanks, Lee.
707
00:40:40,360 --> 00:40:41,320
It's been a pleasure