WEBVTT
1
00:00:00.080 --> 00:00:02.279
The topics and opinions expressed on the following show are
2
00:00:02.319 --> 00:00:04.200
solely those of the hosts and their guests and not
3
00:00:04.240 --> 00:00:07.200
those of W four WN Radio, its employees, or affiliates.
4
00:00:07.280 --> 00:00:10.880
We make no recommendations or endorsements for radio show programs, services,
5
00:00:10.919 --> 00:00:13.960
or products mentioned on air or on our website. No liability,
6
00:00:14.080 --> 00:00:17.320
explicit or implied, shall be extended to W four WN Radio,
7
00:00:17.440 --> 00:00:20.160
its employees, or affiliates. Any questions or comments should be
8
00:00:20.199 --> 00:00:22.519
directed to the show hosts. Thank you for choosing W
9
00:00:22.640 --> 00:00:23.719
four WN Radio.
10
00:00:25.679 --> 00:00:29.120
This is Beyond Confidence with your host, Divya Parekh. Do
11
00:00:29.160 --> 00:00:31.280
you want to live a more fulfilling life? Do you
12
00:00:31.320 --> 00:00:34.200
want to live your legacy and achieve your personal, professional,
13
00:00:34.320 --> 00:00:35.479
and financial goals?
14
00:00:35.759 --> 00:00:36.079
Well?
15
00:00:36.240 --> 00:00:39.000
Coming up on Divya Parekh's Beyond Confidence, you will hear
16
00:00:39.079 --> 00:00:42.880
real stories of leaders, entrepreneurs, and achievers who have stepped
17
00:00:42.880 --> 00:00:46.320
into discomfort, shattered their status quo, and are living the
18
00:00:46.359 --> 00:00:49.079
life they want. You will learn how relationships are the
19
00:00:49.159 --> 00:00:52.799
key to achieving your aspirations and financial goals. Moving your
20
00:00:52.799 --> 00:00:55.240
career or business forward does not have to happen at the
21
00:00:55.280 --> 00:00:58.520
expense of your personal or family life or vice versa.
22
00:00:58.679 --> 00:01:02.320
Learn more at www dot divyaparekh dot com and you
23
00:01:02.320 --> 00:01:06.239
can connect with Divya at contact at divyaparekh dot com.
24
00:01:06.560 --> 00:01:10.359
This is Beyond Confidence and now here's your host, Divya Parekh.
25
00:01:11.840 --> 00:01:15.239
Good morning listeners, It's Tuesday. I'm so thrilled to be
26
00:01:15.319 --> 00:01:18.519
here with you all. So today we are going to
27
00:01:18.560 --> 00:01:23.519
be talking about how we can scale with AI. How
28
00:01:23.560 --> 00:01:27.680
can leaders bring AI into their organizations? If you're
29
00:01:27.680 --> 00:01:30.680
an entrepreneur, how can you bring AI into your life?
30
00:01:31.400 --> 00:01:35.920
How can you dance with AI without losing your soul?
31
00:01:36.239 --> 00:01:39.359
Because the key thing to remember is that at the
32
00:01:39.400 --> 00:01:45.280
intersection of AI and business, it is very important to
33
00:01:45.400 --> 00:01:51.959
have humans at the helm, because without humans, without us,
34
00:01:52.959 --> 00:02:00.599
AI is nothing. So as you scale or as
35
00:02:00.680 --> 00:02:06.519
you adopt, it's really crucial to keep that in mind. So
36
00:02:06.680 --> 00:02:09.960
first of all, let's tackle and talk about the feelings
37
00:02:09.960 --> 00:02:14.960
and emotions that nobody talks about. Because when you're going
38
00:02:15.039 --> 00:02:20.400
for AI adoption, people are going to be thinking yay,
39
00:02:20.719 --> 00:02:25.000
especially early adopters. There's a lot of excitement, enthusiasm, what
40
00:02:25.199 --> 00:02:29.039
more can be done? This could change everything? Then there
41
00:02:29.080 --> 00:02:35.479
could be curiosity: what's possible here? Because in twenty twenty
42
00:02:35.520 --> 00:02:40.800
two, when OpenAI came in, people were just kind
43
00:02:40.800 --> 00:02:46.360
of getting to know it and thinking, huh, this is
44
00:02:46.400 --> 00:02:50.280
something new. But I want you to travel back a
45
00:02:50.319 --> 00:02:54.080
few years. You know, we did not have GPS.
46
00:02:54.479 --> 00:02:58.199
We had those maps, and I still remember taking those
47
00:02:58.199 --> 00:03:02.960
big maps and going on trips and having fun. And
48
00:03:03.000 --> 00:03:09.439
then came GPS, then came email, then came automated voice messages.
49
00:03:09.960 --> 00:03:14.280
So AI has been around, just not so much in
50
00:03:14.360 --> 00:03:19.120
your face. So the first thing is to take a
51
00:03:19.159 --> 00:03:23.159
step back and look at AI for what it is,
52
00:03:24.960 --> 00:03:30.080
coming to learn about AI, checking it out with curiosity
53
00:03:30.400 --> 00:03:37.639
what's possible here while knowing that it is not perfect.
54
00:03:39.080 --> 00:03:44.199
That is the critical piece to remember. And then sometimes
55
00:03:44.240 --> 00:03:47.560
what happens, especially if you're just kind of getting to
56
00:03:47.759 --> 00:03:51.800
know AI, or if you're a small organization, or even
57
00:03:51.840 --> 00:03:57.879
if you're a big organization, you bring in AI and
58
00:03:57.919 --> 00:04:02.280
then you have a starter tool, you're doing a pilot, and while
59
00:04:02.319 --> 00:04:07.400
you're updating some features, you are developing training programs. Every
60
00:04:07.479 --> 00:04:10.919
twenty seconds, there is that change, there is that 'oh man' moment.
61
00:04:11.039 --> 00:04:13.919
And then what happens is, okay, the first question could
62
00:04:13.919 --> 00:04:17.319
be where do I even start? Another could be I've
63
00:04:17.319 --> 00:04:21.439
already started, I've already implemented, and now there's this whole upgrade.
64
00:04:21.519 --> 00:04:26.360
What am I going to do? And as a result,
65
00:04:26.439 --> 00:04:33.639
of that, it can create that uncertainty, that feeling of
66
00:04:33.800 --> 00:04:38.000
doubt as to, am I behind? And that's always going
67
00:04:38.040 --> 00:04:41.480
to be the case. What's going to happen is that
68
00:04:42.040 --> 00:04:49.199
there's always going to be new things. So regardless, if
69
00:04:49.240 --> 00:04:53.480
you're feeling all of that, it's okay because that's where
70
00:04:53.480 --> 00:04:57.759
you are supposed to be. And now the key thing
71
00:04:57.879 --> 00:05:04.040
is it's important to recognize the feelings, because whether you are
72
00:05:04.360 --> 00:05:09.160
bringing in AI just for yourself, or whether you're bringing
73
00:05:09.199 --> 00:05:12.160
in AI for the organization, or you're bringing it in for
74
00:05:12.240 --> 00:05:15.120
your business, or you are helping somebody else adopt AI,
75
00:05:16.240 --> 00:05:21.680
these are the questions to ask. And I just kind
76
00:05:21.680 --> 00:05:24.920
of tell a story. So one of my clients was
77
00:05:25.000 --> 00:05:30.560
leading a whole AI department in a very big, reputable company.
78
00:05:30.759 --> 00:05:34.920
He thought his job was secure, his team was secure, because
79
00:05:34.959 --> 00:05:38.560
they were the ones who had launched it. But what
80
00:05:38.639 --> 00:05:44.360
happened was, and the reality is, that they had made
81
00:05:44.439 --> 00:05:49.279
it so well that they could be eliminated. So he
82
00:05:49.480 --> 00:05:54.079
and his whole team were just let go. Who would
83
00:05:54.120 --> 00:06:03.040
have thought that an expert in artificial intelligence, that expert
84
00:06:03.399 --> 00:06:06.759
who had built the whole structure for the company, he
85
00:06:06.839 --> 00:06:09.680
and his team would be gone. So yes, they were
86
00:06:09.759 --> 00:06:14.920
laid off and guess what, because there were new updates,
87
00:06:14.920 --> 00:06:19.399
there were new changes. What happened was other people didn't
88
00:06:19.439 --> 00:06:23.839
know, and then things started crashing. So that is just one story.
89
00:06:24.240 --> 00:06:28.959
And some companies are bringing in AI without looking
90
00:06:28.959 --> 00:06:32.759
at the process. In some companies they have built it
91
00:06:32.800 --> 00:06:35.759
to a very good extent, but have taken the humans
92
00:06:35.800 --> 00:06:40.439
out of the equation. And that's where these are just
93
00:06:40.439 --> 00:06:45.680
a couple of examples where problems happened and AI adoption failed.
94
00:06:46.720 --> 00:06:50.199
I mean some parts were successful, some parts were failing.
95
00:06:52.759 --> 00:06:55.360
And then when my client came to me and said, Divya,
96
00:06:55.839 --> 00:07:03.040
what do I do over here? He felt so disappointed.
97
00:07:03.600 --> 00:07:10.240
He felt disillusioned and just that feeling of emptiness and
98
00:07:10.279 --> 00:07:13.319
he was saying to me, I feel like I'm looking
99
00:07:13.360 --> 00:07:17.439
into the abyss. I was so excited about it. So the
100
00:07:17.560 --> 00:07:22.639
key is knowing how to deal with AI. And that's
101
00:07:22.680 --> 00:07:25.959
where not only this client, let's call him John, but
102
00:07:26.000 --> 00:07:28.079
a lot of my other clients where I've helped them
103
00:07:29.680 --> 00:07:34.759
work through the era of AI to make themselves future proof,
104
00:07:35.399 --> 00:07:42.199
to make the AI adoption, integration, and governance in such
105
00:07:42.199 --> 00:07:49.240
a way that it is our partner, not something that
106
00:07:49.279 --> 00:07:53.199
can take away jobs. That's why I wrote my book
107
00:07:53.439 --> 00:07:59.199
The AI Agency. And the thing is, I want
108
00:07:59.319 --> 00:08:02.120
us to kind of look at some
109
00:08:02.240 --> 00:08:07.079
of these failures that I talked about. Companies are chasing
110
00:08:07.199 --> 00:08:12.000
tools instead of outcomes; there are too many disconnected pilots. There's
111
00:08:12.079 --> 00:08:17.160
no clear ownership and there's no definition of success. So
112
00:08:17.199 --> 00:08:24.800
it's not about what AI we can use, it's not
113
00:08:25.079 --> 00:08:30.439
about which tool we can use. It's important that we
114
00:08:30.560 --> 00:08:34.919
stop talking about it over there. The key is what
115
00:08:35.120 --> 00:08:41.960
problem are we solving? And even before we go into
116
00:08:42.080 --> 00:08:45.120
what problem we are solving, it's important to talk about
117
00:08:45.480 --> 00:08:49.200
the AI governance. And when we talk about AI governance,
118
00:08:49.799 --> 00:08:52.080
it's about what it's going to look like. Who's going
119
00:08:52.120 --> 00:08:55.240
to own the decisions? And yes, there's so much to
120
00:08:55.279 --> 00:08:57.320
talk about that today. We're not going to go into
121
00:08:57.360 --> 00:09:01.519
all those details, but these are the things to remember,
122
00:09:03.279 --> 00:09:10.039
so think about it. Let's talk about, like, okay,
123
00:09:10.120 --> 00:09:12.639
what is the problem I want to solve? Whether you're
124
00:09:12.679 --> 00:09:16.559
an individual entrepreneur, an individual leader, or an organization, that's
125
00:09:16.639 --> 00:09:21.519
like the first thing we talk about. So let's say
126
00:09:21.559 --> 00:09:24.000
one of the problems is I'm just going to share
127
00:09:24.039 --> 00:09:28.480
a story where there's this team who had to do
128
00:09:28.639 --> 00:09:32.440
a lot of testing, and because they were testing the
129
00:09:32.480 --> 00:09:35.519
same thing, a lot of QA testing is involved. It
130
00:09:35.679 --> 00:09:40.720
is an iterative process, it is mechanical, and it was
131
00:09:40.759 --> 00:09:45.759
taking people's motivation away. So what they did, and this
132
00:09:45.919 --> 00:09:49.279
is just one piece of the process I'm talking about.
133
00:09:49.519 --> 00:09:52.840
So as I was helping this team, what we did
134
00:09:53.080 --> 00:09:57.240
was we took all the processes that they had and
135
00:09:57.360 --> 00:10:00.879
out of that we narrowed down what was the most
136
00:10:01.039 --> 00:10:06.840
painful process for them. And from there, the painful process
137
00:10:07.080 --> 00:10:11.679
was where there was a lot of grind every day,
138
00:10:11.840 --> 00:10:15.000
day to day mechanical work, where people were not being challenged,
139
00:10:15.039 --> 00:10:20.440
there were no growth opportunities, no meaning, no satisfaction. And
140
00:10:20.480 --> 00:10:23.519
then we broke it down into chunks. And when you
141
00:10:23.600 --> 00:10:28.000
take this process and break it down into chunks, then
142
00:10:28.080 --> 00:10:30.320
you take a look at it. You sit down and
143
00:10:30.360 --> 00:10:33.919
you assess how much time does it take for a
144
00:10:34.000 --> 00:10:39.600
human being to do it? Where will the automation help?
145
00:10:40.799 --> 00:10:44.480
And then the key is starting out with a small
146
00:10:44.559 --> 00:10:50.679
pilot where it is a focused use case. Another important
147
00:10:50.720 --> 00:10:56.240
thing to remember is having clear metrics. And when you
148
00:10:56.279 --> 00:11:02.879
start out with a small use case, work through the problem,
149
00:11:03.159 --> 00:11:08.720
fix one thing, and you find that the results are good,
150
00:11:09.399 --> 00:11:11.879
then you move to the next space. And of course
151
00:11:11.919 --> 00:11:14.320
you can do multiple things at the same time too,
152
00:11:15.120 --> 00:11:19.559
but then you need to have multiple teams doing small pilots.
153
00:11:21.039 --> 00:11:25.000
And the reason, the key thing, is that one thing
154
00:11:25.000 --> 00:11:28.639
that I want to introduce in this is a business
155
00:11:28.679 --> 00:11:36.639
technology ethics framework. I call it the three things to
156
00:11:36.759 --> 00:11:40.159
work on. Business: why does it matter that we are
157
00:11:40.200 --> 00:11:45.559
working on this problem? Super important. Okay, going back
158
00:11:45.600 --> 00:11:49.480
to that process: not only was it taking them
159
00:11:49.960 --> 00:11:53.960
fifty hours, but why it matters is that,
160
00:11:53.960 --> 00:11:56.399
if you take a look at it, it was using
161
00:11:56.399 --> 00:11:58.519
a lot of manpower, and of course when it
162
00:11:58.639 --> 00:12:02.120
uses a lot of manpower, it impacts the profitability too.
163
00:12:02.440 --> 00:12:05.360
So that was the business side. Now we look at
164
00:12:05.440 --> 00:12:09.600
the second piece of the framework, technology: what enables it?
165
00:12:10.080 --> 00:12:12.679
Do we need to automate, do we need to create workflows,
166
00:12:12.720 --> 00:12:16.679
do we need to have agentic AI? What is it
167
00:12:16.720 --> 00:12:18.840
that we want to include. So then we took that
168
00:12:18.919 --> 00:12:21.879
part of the process and what they did was they
169
00:12:21.919 --> 00:12:25.799
took only the painful pieces of the process to automate.
170
00:12:27.080 --> 00:12:33.200
They built in a lot of humans in the loop. Yes,
171
00:12:33.759 --> 00:12:38.240
that's exactly what I mean. Because artificial intelligence will hallucinate,
172
00:12:39.120 --> 00:12:49.200
it has the confidence of a person
173
00:12:50.240 --> 00:12:53.320
who may be wrong but thinks that they're right. So yes,
174
00:12:53.720 --> 00:12:57.840
artificial intelligence can make mistakes, whether it's ChatGPT, Claude,
175
00:12:57.879 --> 00:13:01.200
which is Anthropic's, or even if you have OpenClaw,
176
00:13:01.360 --> 00:13:03.919
or if you have Gemini, or if you have
177
00:13:04.200 --> 00:13:09.000
Cowork, which has been released by Claude for doing
178
00:13:09.080 --> 00:13:12.159
agentic work. You can have any of that and it
179
00:13:12.200 --> 00:13:15.960
can still make mistakes. So that is the key to remember.
180
00:13:16.919 --> 00:13:23.000
So ethics is what protects the data, what protects the process,
181
00:13:23.480 --> 00:13:26.279
and what protects the people. So these are the key
182
00:13:26.320 --> 00:13:31.399
things to remember because remember if you don't have ethics,
183
00:13:31.679 --> 00:13:34.840
like when I'm talking about this framework, business, technology, and ethics,
184
00:13:34.919 --> 00:13:39.080
if you don't have it, the risk is that your
185
00:13:39.120 --> 00:13:43.200
employees can get impacted. And I'm going to go back
186
00:13:43.200 --> 00:13:46.720
to that story. So this one is from Sean and Chantelle.
187
00:13:46.799 --> 00:13:50.279
I'm just going to share about these two people, you know,
188
00:13:50.360 --> 00:13:56.080
Sean and Chantelle are characters in my book, and one
189
00:13:56.200 --> 00:14:00.240
time they are doing this pilot and what they're doing
190
00:14:00.399 --> 00:14:07.039
is that at every step wherever there's quality assurance testing,
191
00:14:07.720 --> 00:14:11.519
they're putting in a developer's check where they're making sure
192
00:14:12.320 --> 00:14:19.840
that the AI is checking on itself and also humans
193
00:14:19.840 --> 00:14:25.360
are checking at certain levels. And why is the human check important?
194
00:14:26.480 --> 00:14:30.720
So let's say if a decision comes in that is wrong,
195
00:14:30.799 --> 00:14:34.039
which AI may have hallucinated or skipped over a step,
196
00:14:36.360 --> 00:14:40.519
then a human being, let's say Sean, who had really good
197
00:14:40.559 --> 00:14:43.840
experience in quality assurance testing. He knows
198
00:14:43.840 --> 00:14:46.200
it in and out, and when something is wrong he
199
00:14:46.240 --> 00:14:48.600
can put a finger on it. He'll be able to
200
00:14:48.639 --> 00:14:52.159
identify it. And when he's able to identify it, guess
201
00:14:52.159 --> 00:14:55.759
what's going to happen. They are going to be able
202
00:14:55.799 --> 00:14:58.840
to fix it. Otherwise it can keep on looping and
203
00:14:58.919 --> 00:15:04.120
churning out wrong outcomes and wrong results. So going back
204
00:15:04.159 --> 00:15:08.240
to that process, what they did was put
205
00:15:08.279 --> 00:15:12.120
in a lot of checks and they ran the pilot
206
00:15:12.639 --> 00:15:18.639
for a quick twenty days, worked through a lot of failures,
207
00:15:18.759 --> 00:15:21.039
removed a lot of kinks, but at the same time
208
00:15:21.279 --> 00:15:25.720
they reduced the human checks. So initially the human checks
209
00:15:25.720 --> 00:15:30.919
were, let's say, at ten checkpoints. They kept it
210
00:15:30.960 --> 00:15:34.879
now at three checkpoints. But knowing that working with
211
00:15:35.000 --> 00:15:39.759
AI is iterative, you make it and then you find
212
00:15:39.840 --> 00:15:42.840
where it is not doing it right, where it
213
00:15:42.879 --> 00:15:46.600
can do better, just like any human being. So think
214
00:15:46.639 --> 00:15:48.759
about it like, you know, if you were
215
00:15:48.759 --> 00:15:52.000
a human resource professional or a human resource leader or
216
00:15:52.039 --> 00:15:57.440
a chief people officer, you are going to invest in that training.
217
00:15:58.440 --> 00:16:01.879
You are going to invest in those employees, in your
218
00:16:02.039 --> 00:16:05.559
succession planning, in your emerging leaders. You are going to
219
00:16:05.639 --> 00:16:08.960
allow them. You are going to give them the space
220
00:16:09.440 --> 00:16:14.039
to grow, to make mistakes. So same thing. Artificial intelligence
221
00:16:14.120 --> 00:16:18.639
is no different. So the key is like having those
222
00:16:18.679 --> 00:16:22.080
three elements because when you bring in ethics, when you
223
00:16:22.200 --> 00:16:25.639
bring in that governance, that okay, how are we going
224
00:16:25.679 --> 00:16:28.879
to deploy, how are we going to make sure that
225
00:16:29.120 --> 00:16:37.120
we keep our employees because their experience matters, the experience
226
00:16:37.200 --> 00:16:42.639
that's sitting in people's hearts and brains, the knowledge that's
227
00:16:42.840 --> 00:16:46.759
not on the web. So remember, AI does not
228
00:16:46.879 --> 00:16:51.879
have access to everything. And when you are bringing in ethics,
229
00:16:51.960 --> 00:16:56.320
the risk to your employees, your top talent leaving your company,
230
00:16:57.399 --> 00:17:02.480
or your customers getting unhappy, or your brand getting stigmatized
231
00:17:02.600 --> 00:17:06.599
or your brand getting a bad name, goes down significantly
232
00:17:07.240 --> 00:17:10.839
because you've tested it, you've tried it. The key thing
233
00:17:10.880 --> 00:17:14.160
to remember with artificial intelligence is that there will be
234
00:17:14.240 --> 00:17:17.720
at certain points, certain segments which you have tested, where
235
00:17:17.799 --> 00:17:21.440
you will trust it, because there's already automation in the
236
00:17:21.480 --> 00:17:25.640
whole world. Yes, developers are doing the testing and all that,
237
00:17:25.799 --> 00:17:29.119
so it's no different than artificial intelligence doing it. So
238
00:17:29.279 --> 00:17:33.400
when you apply the same principles that have gone in before,
239
00:17:35.000 --> 00:17:40.519
you will find that interacting with artificial intelligence will be successful.