Oct. 28, 2025

How to Build a Business That Runs Itself and Wins

Drawing from their upcoming book AUTONOMOUS, guests Vala Afshar and Henry King lay out a compelling vision of the AI-powered enterprise—not just faster or more efficient, but truly fit to thrive in an unpredictable world. Together, they challenge leaders to question old assumptions, break through operational blockages, and design organizations that can learn, adapt, and scale at machine speed.

Beyond Confidence is broadcast live Tuesdays at 10AM ET on W4WN Radio - Women 4 Women Network (www.w4wn.com) part of Talk 4 Radio (www.talk4radio.com) on the Talk 4 Media Network (www.talk4media.com). Beyond Confidence TV Show is viewed on Talk 4 TV (www.talk4tv.com).

Beyond Confidence Podcast is also available on Talk 4 Media (www.talk4media.com), Talk 4 Podcasting (www.talk4podcasting.com), iHeartRadio, Amazon Music, Pandora, Spotify, Audible, and over 100 other podcast outlets.

Become a supporter of this podcast: https://www.spreaker.com/podcast/beyond-confidence--1885197/support.

WEBVTT

1
00:00:00.040 --> 00:00:02.240
The topics and opinions expressed on the following show are

2
00:00:02.279 --> 00:00:04.160
solely those of the hosts and their guests and not

3
00:00:04.200 --> 00:00:07.000
those of W4WN Radio, its employees, or affiliates.

4
00:00:07.240 --> 00:00:10.800
We make no recommendations or endorsements for radio show programs, services,

5
00:00:10.880 --> 00:00:13.919
or products mentioned on air or on our website. No liability,

6
00:00:14.039 --> 00:00:17.239
explicit or implied, shall be extended to W4WN Radio,

7
00:00:17.399 --> 00:00:20.120
its employees, or affiliates. Any questions or comments should be

8
00:00:20.120 --> 00:00:22.480
directed to the show hosts. Thank you for choosing

9
00:00:22.600 --> 00:00:24.199
W4WN Radio.

10
00:00:25.640 --> 00:00:28.640
This is Beyond Confidence with your host, Divya Parekh.

11
00:00:29.000 --> 00:00:31.120
Do you want to live a more fulfilling life? Do

12
00:00:31.199 --> 00:00:34.159
you want to live your legacy and achieve your personal, professional,

13
00:00:34.280 --> 00:00:38.439
and financial goals? Well, coming up on Divya Parekh's Beyond Confidence,

14
00:00:38.560 --> 00:00:42.119
you will hear real stories of leaders, entrepreneurs, and achievers

15
00:00:42.159 --> 00:00:45.560
who have stepped into discomfort, shattered their status quo, and

16
00:00:45.640 --> 00:00:48.119
are living the life they want. You will learn how

17
00:00:48.159 --> 00:00:52.000
relationships are the key to achieving your aspirations and financial goals.

18
00:00:52.280 --> 00:00:54.920
Moving your career or business forward does not have to happen

19
00:00:54.960 --> 00:00:57.560
at the expense of your personal or family life or

20
00:00:57.640 --> 00:01:01.920
vice versa. Learn more at www.divyaparekh.com.

21
00:01:02.000 --> 00:01:05.480
And you can connect with Divya at contact@divyaparekh.com.

22
00:01:05.560 --> 00:01:09.359
This is Beyond Confidence, and now here's your host,

23
00:01:09.560 --> 00:01:10.439
Divya Parekh.

24
00:01:11.840 --> 00:01:15.120
Good morning, listeners. It's Tuesday morning and I'm so thrilled

25
00:01:15.159 --> 00:01:19.079
to be with you. So today we are going to

26
00:01:19.120 --> 00:01:23.799
be talking about something that is very close to my

27
00:01:23.879 --> 00:01:27.959
heart and close to your heart as well. It's about

28
00:01:28.040 --> 00:01:31.560
leadership in the age of AI, which is artificial intelligence.

29
00:01:31.719 --> 00:01:35.439
And artificial intelligence has been in our lives since the

30
00:01:35.519 --> 00:01:38.280
advent of the GPS and the emails and like we

31
00:01:38.319 --> 00:01:40.760
talk about it as if AI is new, it is not.

32
00:01:41.239 --> 00:01:45.000
It has been there. Think about Alexa or the first

33
00:01:45.000 --> 00:01:49.120
time that GPS came out and you left those maps behind.

34
00:01:50.200 --> 00:01:53.840
So we are going to bring in our guests today

35
00:01:54.280 --> 00:02:00.920
without further ado and jump straight into it. Welcome Henry,

36
00:02:01.000 --> 00:02:02.400
Welcome, Vala.

37
00:02:02.319 --> 00:02:04.439
Thank you, Divya. It's great to be here, great to

38
00:02:04.439 --> 00:02:04.799
be here.

39
00:02:04.840 --> 00:02:10.400
Thank you absolutely. So here's what I want to like,

40
00:02:10.560 --> 00:02:12.879
jump straight into it. You know, so many like two

41
00:02:12.879 --> 00:02:15.960
and a half years back or maybe it was I

42
00:02:15.960 --> 00:02:18.560
think it was twenty twenty. I don't remember exactly, like

43
00:02:18.560 --> 00:02:23.199
probably somewhere around there. Yeah, I don't remember the exact date.

44
00:02:23.199 --> 00:02:25.479
I was just reading a few days back that it's been a thousand

45
00:02:25.560 --> 00:02:30.199
days plus, a thousand plus and some change. People

46
00:02:30.319 --> 00:02:31.919
used to talk like, oh, we want to be

47
00:02:31.960 --> 00:02:34.280
AI first, we want to be AI powered. Now we

48
00:02:34.360 --> 00:02:37.360
have gone beyond that. The stats that are coming out

49
00:02:37.439 --> 00:02:40.000
are that more and more people are using large language

50
00:02:40.039 --> 00:02:45.159
models throughout the country and throughout the world for that matter.

51
00:02:46.280 --> 00:02:51.280
So now those days are gone when you could just tout

52
00:02:51.319 --> 00:02:53.759
being AI powered first. So where do you think

53
00:02:53.840 --> 00:02:58.520
a company stands, or an organization stands, in today's twenty

54
00:02:58.560 --> 00:03:01.719
twenty five, and as we move into twenty twenty

55
00:03:01.560 --> 00:03:05.520
six? Do you want me to start?

56
00:03:05.960 --> 00:03:06.360
Sure?

57
00:03:08.000 --> 00:03:10.479
Well, Divya. First of all, I think, well again,

58
00:03:10.560 --> 00:03:12.240
thank you for having us on your show. We really

59
00:03:12.240 --> 00:03:16.080
appreciate being here. And I think you're absolutely right you

60
00:03:16.120 --> 00:03:18.919
know you're talking about I think it was November twenty

61
00:03:19.080 --> 00:03:22.039
twenty two when ChatGPT was first launched to the

62
00:03:22.039 --> 00:03:27.719
public and we were still in the midst of writing

63
00:03:27.719 --> 00:03:30.960
our first book, and our first book was called Boundless,

64
00:03:31.319 --> 00:03:34.879
and there it is in fact, and Boundless was really

65
00:03:34.919 --> 00:03:40.159
kind of about a mindset for leadership in this kind

66
00:03:40.159 --> 00:03:43.879
of new era that we find ourselves in. And I

67
00:03:43.919 --> 00:03:48.159
think it's fair to say that we were partially prepared

68
00:03:48.159 --> 00:03:53.479
but by no means fully prepared for AI, and having

69
00:03:53.560 --> 00:03:57.439
written Boundless and then it being published in October twenty three,

70
00:03:58.240 --> 00:04:01.479
we kept asking ourselves and each other, is AI

71
00:04:01.680 --> 00:04:06.000
going to render our book irrelevant? Or is it still

72
00:04:06.000 --> 00:04:08.400
going to be relevant? And we were sort of holding

73
00:04:08.400 --> 00:04:11.280
our breath for a little while. And after months,

74
00:04:11.280 --> 00:04:13.680
we're like, well, thank goodness, it proves actually that it's

75
00:04:13.680 --> 00:04:17.680
still very relevant. And so this is why we went

76
00:04:17.720 --> 00:04:22.240
ahead and then started to write Autonomous because in direct

77
00:04:22.279 --> 00:04:27.639
answer to your question, generative AI is not the start,

78
00:04:27.720 --> 00:04:29.920
although it was really kind of the start of AI

79
00:04:30.000 --> 00:04:33.319
from a very public perspective. You know, we've had predictive

80
00:04:33.360 --> 00:04:36.920
AI for a long long time, and that has been used,

81
00:04:38.680 --> 00:04:41.839
you know, to one extent or another across organizations of

82
00:04:41.879 --> 00:04:45.639
all type. But we're really now going beyond predictive. We're

83
00:04:45.680 --> 00:04:48.959
going beyond generative and now we're starting to enter the

84
00:04:49.000 --> 00:04:53.240
age of agentic AI as well. And I think where

85
00:04:53.319 --> 00:04:58.600
most companies are currently is I think the individuals within

86
00:04:58.680 --> 00:05:02.879
most companies are pretty comfortable with generative AI. We

87
00:05:02.959 --> 00:05:05.439
all use it on a daily basis. My whole family

88
00:05:05.560 --> 00:05:10.839
uses it, everyone I know uses ChatGPT or Claude

89
00:05:10.920 --> 00:05:14.720
or Gemini or any one or more than one of

90
00:05:14.800 --> 00:05:18.279
the models that are around. But where we really are

91
00:05:18.319 --> 00:05:22.199
now organizationally, is how do we start to take advantage

92
00:05:22.199 --> 00:05:31.360
of agentic AI, where agentic AI is qualitatively different to generative AI.

93
00:05:31.759 --> 00:05:36.120
This is now AI that can actually do work independently

94
00:05:36.360 --> 00:05:39.399
of a human and this, we think, puts us into

95
00:05:39.399 --> 00:05:42.480
an entirely new place. And this is where I think

96
00:05:42.560 --> 00:05:47.079
that organizations are right now. We're right at the beginning

97
00:05:47.120 --> 00:05:50.720
of this period, this new period where we have

98
00:05:50.959 --> 00:05:53.800
AI that's now a member of the workforce. It's no

99
00:05:53.920 --> 00:05:56.480
longer a tool, it's a member of the workforce, and that's

100
00:05:56.560 --> 00:06:00.920
categorically different. And so most companies are

101
00:06:01.000 --> 00:06:04.120
right at the beginning of this era. And I think

102
00:06:04.160 --> 00:06:08.120
we're all asking ourselves what does that mean? What does

103
00:06:08.120 --> 00:06:11.040
it mean to have a new member of the workforce?

104
00:06:11.079 --> 00:06:14.800
What does it mean to be a human colleague? But

105
00:06:15.000 --> 00:06:19.040
knowing that as a human we may have a digital colleague,

106
00:06:19.079 --> 00:06:21.959
we may have a digital advisor, we may have a digital boss,

107
00:06:22.680 --> 00:06:25.800
an AI boss, what does it mean for human leaders

108
00:06:26.199 --> 00:06:29.240
thinking about the way that they manage their teams? What

109
00:06:29.279 --> 00:06:31.600
does it mean to have teams which are hybrid teams

110
00:06:31.600 --> 00:06:35.160
of humans and machines or humans and AI. What does

111
00:06:35.199 --> 00:06:41.120
it mean to hand over some of the responsibility for

112
00:06:41.240 --> 00:06:45.680
leadership of an organization to AI. So I think that's

113
00:06:46.480 --> 00:06:48.360
that's kind of where we are right now. We're

114
00:06:48.360 --> 00:06:51.560
really at the beginning of this new age

115
00:06:51.600 --> 00:06:54.120
of agentic AI, and I think that's what we've been

116
00:06:54.120 --> 00:06:56.000
focused on in the book, and that's what we're very

117
00:06:56.000 --> 00:06:56.759
excited about.

118
00:06:58.000 --> 00:07:02.879
Absolutely. And you're right on the mark, Henry, because there

119
00:07:02.920 --> 00:07:06.120
are agents all over, but there's still so much in

120
00:07:06.199 --> 00:07:12.319
their nascency that the current agents that are out there

121
00:07:12.319 --> 00:07:15.439
in the market, yes they're performing, but they're still stalling.

122
00:07:15.519 --> 00:07:18.959
They're still a bit robotic; they're developing personality, they're learning,

123
00:07:19.000 --> 00:07:22.040
and they're reasoning. So if a company like you know,

124
00:07:22.040 --> 00:07:26.279
if an organization we're thinking like okay, I want to

125
00:07:26.319 --> 00:07:30.040
bring in that agentic AI, the key is like having that governance.

126
00:07:30.120 --> 00:07:32.639
Like you know, so many companies do not talk about

127
00:07:32.680 --> 00:07:39.120
having the AI governance. So what would you say, Vala and Henry, like,

128
00:07:39.240 --> 00:07:42.480
what are some top three to five things before they

129
00:07:42.480 --> 00:07:44.720
even think about bringing in the agentic AI.

130
00:07:46.600 --> 00:07:50.160
Well, as you mentioned, you know, governance is critically important.

131
00:07:50.920 --> 00:07:54.879
My company's number one core value since its inception,

132
00:07:55.480 --> 00:08:00.079
so for twenty six years has been trust. And you

133
00:08:00.120 --> 00:08:02.360
know in the book we talk about trust is competence

134
00:08:02.399 --> 00:08:08.160
plus character. Competence is capability and reliability; character is integrity and benevolence.

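As an aside, the trust equation Vala describes above can be written compactly; this is only a summary of that sentence, not notation taken from the book:

```latex
\text{Trust} = \text{Competence} + \text{Character}, \quad
\text{Competence} = \text{Capability} + \text{Reliability}, \quad
\text{Character} = \text{Integrity} + \text{Benevolence}
```
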
135
00:08:08.600 --> 00:08:12.839
I love the word benevolence. Your intentionality, your thoughts, your words,

136
00:08:12.839 --> 00:08:16.120
your actions are aligned, and so you have to be

137
00:08:16.199 --> 00:08:20.000
clear with your intentions when you leverage powerful technology that

138
00:08:20.160 --> 00:08:24.519
understands natural language. AI has evolved to where it can

139
00:08:24.600 --> 00:08:31.160
understand language incredibly precisely. It has reasoning capabilities, lots of

140
00:08:31.879 --> 00:08:38.159
combinations of capabilities: retrieval-augmented generation, vector databases, semantic search.

141
00:08:38.200 --> 00:08:42.919
You've got all the scientific work that, combined, has created

142
00:08:43.399 --> 00:08:46.559
software that can think and can reason and can be

143
00:08:46.679 --> 00:08:50.559
embedded in complex workflows, and as Henry mentioned, take action.

144
00:08:51.360 --> 00:08:55.159
Where you're no longer talking about a tool that humans use.

145
00:08:55.639 --> 00:08:58.399
This is now software that can work on your behalf

146
00:08:59.600 --> 00:09:03.519
and it constantly learns with every iteration. So the combination

147
00:09:03.600 --> 00:09:10.120
of natural language processing and multimodality: it can understand images, voice, text, videos,

148
00:09:10.639 --> 00:09:14.399
it can reason, it can take action, and it can improve.

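As an aside, the retrieval pattern Vala mentions above (retrieval-augmented generation over a vector database with semantic search) can be sketched minimally. This toy example uses bag-of-words vectors and an in-memory list in place of a real embedding model and vector database; every name in it is illustrative and not taken from any specific product:

```python
# Toy illustration of retrieval-augmented generation (RAG):
# embed documents, store vectors, retrieve by semantic similarity,
# then hand the retrieved context to a language model for generation.
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' (a real system would use a learned embedding model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# In-memory stand-in for a vector database: (document, vector) pairs.
documents = [
    "Agents can act on a customer's behalf inside a workflow.",
    "Data silos prevent AI projects from reaching their potential.",
    "Trust is competence plus character.",
]
vector_store = [(doc, embed(doc)) for doc in documents]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Semantic search: return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(vector_store, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

if __name__ == "__main__":
    question = "Why do data silos matter for AI?"
    context = retrieve(question)
    # A real RAG system would now prompt an LLM with the question plus this context;
    # here we just show what would be passed along.
    print("Context handed to the model:", context)
```
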
149
00:09:15.039 --> 00:09:17.600
This is the most important technology of the twenty first century.

150
00:09:17.759 --> 00:09:21.159
I've been in tech thirty years, and I view agentic

151
00:09:21.200 --> 00:09:24.759
AI as electricity for the twenty first century. Without it,

152
00:09:25.360 --> 00:09:30.240
you will not be able to compete in this hyperconnected

153
00:09:30.799 --> 00:09:35.960
knowledge sharing economy powered by AI. And as Henry mentioned,

154
00:09:36.200 --> 00:09:40.399
when we wrote Boundless, the goal was to discuss the

155
00:09:40.440 --> 00:09:47.279
negative impact of silos, silo mentality, silo structures, data silos.

156
00:09:47.440 --> 00:09:49.879
All AI projects are data projects, So if you have

157
00:09:49.960 --> 00:09:53.360
data silos, you cannot reach your boundless potential when it

158
00:09:53.360 --> 00:09:57.480
comes to AI algorithms. And we decided that based on

159
00:09:57.759 --> 00:10:01.120
five years of research looking at companies that are growing

160
00:10:01.120 --> 00:10:04.200
at double-digit compound annual growth rates, what were some

161
00:10:04.240 --> 00:10:06.120
of the muscles, what were some of the attributes and

162
00:10:06.200 --> 00:10:10.240
characteristics they had, and we identified and converged down to

163
00:10:10.360 --> 00:10:14.240
seven design principles, and one of the seven design principles

164
00:10:14.360 --> 00:10:17.519
was autonomy. So as much as Henry says we were

165
00:10:17.519 --> 00:10:22.000
nervous about this launch of agentic AI, for five years,

166
00:10:22.039 --> 00:10:25.919
we were researching and identified autonomous capabilities as one of

167
00:10:25.960 --> 00:10:31.840
the key elements of eliminating silos. And AI agents are

168
00:10:31.879 --> 00:10:36.720
autonomous agents. And as you mentioned, fifty nine days after

169
00:10:36.799 --> 00:10:39.840
launch of ChatGPT, you got one hundred million people using

170
00:10:39.919 --> 00:10:44.200
language models to generate content, but agentic goes beyond that,

171
00:10:44.720 --> 00:10:48.320
and so trust is critically important. Fundamentally, in my career

172
00:10:48.440 --> 00:10:51.200
in tech. When I look at the most successful companies,

173
00:10:51.200 --> 00:10:54.639
including my own, I would put technology at the bottom

174
00:10:54.720 --> 00:10:57.440
of the key success factors. I would say, it's culture.

175
00:10:58.159 --> 00:11:00.840
You have to have a good, healthy culture, and in this

176
00:11:01.480 --> 00:11:05.000
situation that we're in, it's a culture of experimentation, culture

177
00:11:05.000 --> 00:11:09.159
of trust, culture of collaboration, culture that removes silos. You

178
00:11:09.200 --> 00:11:12.120
have to have strong talent, you have to hire people

179
00:11:12.240 --> 00:11:15.159
with a high rate of learning and good judgment. This is

180
00:11:15.200 --> 00:11:18.240
why trust is so important. It's very difficult to maintain

181
00:11:18.279 --> 00:11:21.919
good judgment, especially if you're working in an environment where

182
00:11:22.000 --> 00:11:24.799
velocity is not like anything we've ever seen: innovation velocity,

183
00:11:24.879 --> 00:11:28.480
speed and direction, and you have to have good processes.

184
00:11:28.720 --> 00:11:31.440
You have to look for waste, you have to look

185
00:11:31.440 --> 00:11:35.279
for bottlenecks, you have to create lean processes, and then

186
00:11:35.360 --> 00:11:39.759
lastly technology, So culture, people, process, and technology, and if

187
00:11:39.799 --> 00:11:44.559
you are guided by the north star of trust and customer success,

188
00:11:44.720 --> 00:11:47.320
you know ultimately you have to have shared success, which

189
00:11:47.399 --> 00:11:51.519
was one of the seven principles in Boundless. We keep

190
00:11:51.600 --> 00:11:53.519
going back to the book we wrote two years ago,

191
00:11:54.679 --> 00:11:57.120
not the one we're trying to talk about now because

192
00:11:57.159 --> 00:12:00.799
it was foundational to the book Autonomous. So I think

193
00:12:00.840 --> 00:12:04.559
if you have those elements as top of mind, you

194
00:12:04.600 --> 00:12:07.440
position yourself to be successful. And the last thing I

195
00:12:07.480 --> 00:12:12.320
would say is stay teachable, Stay teachable. As I get older,

196
00:12:12.360 --> 00:12:15.879
I realize how hard it is for me to stay teachable.

197
00:12:15.879 --> 00:12:18.000
This is why I you know, I love being on

198
00:12:18.080 --> 00:12:21.799
your podcast because you know, it allows me to learn

199
00:12:21.840 --> 00:12:23.799
from you, Henry, and I love to learn from you.

200
00:12:24.240 --> 00:12:26.480
It helps us identify blind spots that we may have,

201
00:12:27.279 --> 00:12:29.919
and it reminds us that, you know, to have a

202
00:12:29.919 --> 00:12:34.759
beginner's mindset is critically important. And as much as your

203
00:12:34.799 --> 00:12:37.440
commentary is absolutely correct in terms of this is the

204
00:12:37.480 --> 00:12:41.120
beginning when we're talking about agentic AI. I just came

205
00:12:41.120 --> 00:12:43.559
back from San Francisco. When I stand on the streets

206
00:12:43.559 --> 00:12:46.360
of San Francisco, I can count to ten and I'll

207
00:12:46.360 --> 00:12:49.480
see a car drive by me without a driver. I

208
00:12:49.480 --> 00:12:51.799
could see Waymos, eight hundred of them in San Francisco.

209
00:12:51.919 --> 00:12:54.840
I can see Zoox from Amazon. I will see Cybercabs

210
00:12:54.840 --> 00:13:00.720
from Tesla. I'll see MOIA from Volkswagen. We've got eight

211
00:13:00.759 --> 00:13:03.679
hundred Waymos that have surpassed forty five thousand Lyft

212
00:13:03.759 --> 00:13:05.919
drivers in San Francisco in terms of number of rides.

213
00:13:06.360 --> 00:13:09.879
So anyone who doubts that AI is here and what

214
00:13:09.919 --> 00:13:13.399
are we talking about when we talk about self driving cars?

215
00:13:13.960 --> 00:13:18.159
AI has replaced a driver in a major city when

216
00:13:18.159 --> 00:13:20.360
life and death is on the line. And by the way,

217
00:13:20.399 --> 00:13:23.519
there have been zero fatalities with Waymos since twenty seventeen. So

218
00:13:23.559 --> 00:13:27.480
it's a safer drive and it's less costly. And so it's

219
00:13:27.519 --> 00:13:31.440
a realization that, yes, the term was coined in nineteen

220
00:13:31.480 --> 00:13:34.879
fifty seven at Dartmouth College and it's been around a

221
00:13:34.879 --> 00:13:37.159
long time, but what we can do with it today

222
00:13:37.279 --> 00:13:41.159
is unlike anything we've ever experienced in our lifetime. And

223
00:13:41.200 --> 00:13:44.200
we have to be aware that if we don't maintain

224
00:13:44.279 --> 00:13:48.120
that beginner's mindset, if you don't challenge dominant logic, if

225
00:13:48.159 --> 00:13:51.960
we're not comfortable with software guiding our careers, we may

226
00:13:52.000 --> 00:13:55.919
be positioning ourselves in a place that's harder to compete

227
00:13:55.960 --> 00:13:56.320
and win.

228
00:13:58.240 --> 00:14:03.120
Absolutely, you hit the nail on the head, because unless you're looking to

229
00:14:03.159 --> 00:14:07.639
become obsolete, you can no longer afford not to embrace it.

230
00:14:08.759 --> 00:14:12.679
And as you mentioned, agentic AI is still, it's

231
00:14:12.679 --> 00:14:15.279
in its beginnings, and I really like what you mentioned

232
00:14:15.279 --> 00:14:19.000
about having that beginner's mindset, being teachable, having that humility,

233
00:14:19.759 --> 00:14:23.759
because AI can replace everything, but it cannot replace that

234
00:14:23.879 --> 00:14:28.720
human connection and that human touch. So given that you mentioned that,

235
00:14:28.799 --> 00:14:34.799
it's really essential to have that edge, like to have

236
00:14:34.840 --> 00:14:37.639
the competitive edge, you have to keep on depending on

237
00:14:37.720 --> 00:14:42.720
the automation and the technology. So as companies are looking

238
00:14:42.840 --> 00:14:46.399
to implement it, and as they're going for AI adoption,

239
00:14:47.240 --> 00:14:50.639
what are some of the guardrails that they can think

240
00:14:50.679 --> 00:14:52.960
about? What is the right place to do some

241
00:14:53.080 --> 00:14:55.879
of the cost cutting or for that matter, like you know,

242
00:14:56.000 --> 00:14:59.919
move people or lay off people? And that's one of the

243
00:15:00.080 --> 00:15:04.720
tough, challenging questions that organizations face today.

244
00:15:05.320 --> 00:15:10.840
Yeah, when it comes to cost cutting, Henry and I absolutely don't like to go

245
00:15:10.919 --> 00:15:13.879
straight there, but we don't think that that's the right approach,

246
00:15:13.960 --> 00:15:18.039
and I'll want Henry to weigh in, but we think

247
00:15:18.120 --> 00:15:21.600
that it's short sighted and it's absolutely the wrong way

248
00:15:21.840 --> 00:15:26.080
if you're guided by cost cutting. And Henry, take it away.

249
00:15:26.759 --> 00:15:30.159
Sure, yes, So thank you for even mentioning that and

250
00:15:30.200 --> 00:15:34.120
bringing a smile to our faces there, Divya. Of course,

251
00:15:34.159 --> 00:15:36.879
cost cutting everyone you know, we know that we are

252
00:15:36.919 --> 00:15:40.200
in this part of the market cycle where everyone

253
00:15:40.279 --> 00:15:43.279
is thinking about that, and you know, over the last

254
00:15:43.320 --> 00:15:47.320
few years the expression do more with less has become

255
00:15:47.600 --> 00:15:50.360
kind of rather common, and it all kind of comes

256
00:15:50.399 --> 00:15:55.440
down to the same thing of cutting costs. We think

257
00:15:56.200 --> 00:16:00.840
that there are some challenges associated with taking

258
00:16:00.840 --> 00:16:07.960
a cost cutting approach to business. One is that there's

259
00:16:07.960 --> 00:16:10.360
a question about whether you should be fair or not.

260
00:16:10.799 --> 00:16:13.879
You know, a lot of leaders are concerned about do

261
00:16:14.000 --> 00:16:18.679
we cut costs evenly across the entire place. Do we

262
00:16:18.720 --> 00:16:22.000
say everyone needs to take five percent out of you know,

263
00:16:22.080 --> 00:16:24.519
out of their budgets or ten percent or whatever happens

264
00:16:24.559 --> 00:16:28.720
to be, or do we identify specific areas of the business.

265
00:16:28.799 --> 00:16:32.080
And if we identify specific areas of the business, does

266
00:16:32.120 --> 00:16:35.840
that really mean that we are criticizing that part of

267
00:16:35.879 --> 00:16:41.360
the business, And you know, risk demotivating them because you know,

268
00:16:41.440 --> 00:16:43.879
they think that we think that they are failing in

269
00:16:43.919 --> 00:16:49.120
some way. So it's difficult whether you actually take a

270
00:16:49.200 --> 00:16:51.919
targeted approach to cost cutting or whether you take a

271
00:16:51.960 --> 00:16:56.960
peanut butter, you know, approach to cost cutting. Neither one

272
00:16:57.320 --> 00:17:01.320
really kind of works particularly effectively, and it's always

273
00:17:01.320 --> 00:17:05.920
demotivating for people and it's not something that you're building

274
00:17:06.039 --> 00:17:09.440
to use, to borrow Vala's term, muscles. It's not

275
00:17:09.640 --> 00:17:13.480
a muscle that you're going to continue to use every

276
00:17:13.559 --> 00:17:16.839
day across all different types or all different parts of

277
00:17:16.519 --> 00:17:21.279
your market cycle. When you're in growth mode, you're not

278
00:17:21.319 --> 00:17:24.359
necessarily going to be using that cost cutting muscle, right,

279
00:17:24.440 --> 00:17:27.680
And so that kind of muscle can atrophy in parts

280
00:17:27.680 --> 00:17:31.160
of the economic cycle and then kind of grow again,

281
00:17:31.279 --> 00:17:35.079
and then atrophy and then grow again, and it means

282
00:17:35.079 --> 00:17:40.240
that cost cutting is not actually a core business discipline.

283
00:17:40.279 --> 00:17:44.119
It's not a core business competence. Whereas if you take

284
00:17:44.240 --> 00:17:48.160
the approach that waste is never a good thing to have,

285
00:17:48.720 --> 00:17:52.759
you always want to be able to reduce or eliminate

286
00:17:52.839 --> 00:17:56.920
waste from your operations, then you can build that to

287
00:17:57.039 --> 00:18:00.599
be an organizational muscle that you use all the time.

288
00:18:00.960 --> 00:18:03.160
And it doesn't matter whether you're in growth mode or

289
00:18:03.200 --> 00:18:10.039
whether you're in you know, a slower part of your cycle.

290
00:18:10.559 --> 00:18:13.759
It's always relevant. You can always be looking for waste.

291
00:18:14.079 --> 00:18:18.279
And what we're thinking about with waste is, because when most

292
00:18:18.359 --> 00:18:21.599
people hear the word waste, they're thinking about materials largely,

293
00:18:21.759 --> 00:18:24.119
but they're thinking about what do you do with excess

294
00:18:24.160 --> 00:18:27.039
materials that you actually don't use, that are kind of

295
00:18:27.359 --> 00:18:33.720
or byproducts of manufacturing and production operations. When we're talking

296
00:18:33.720 --> 00:18:37.960
about waste, we're talking about any resource. So waste for

297
00:18:38.119 --> 00:18:42.759
us is the misuse, the overuse or the underuse of

298
00:18:42.880 --> 00:18:46.039
a resource in order to do a job. And so

299
00:18:46.079 --> 00:18:49.279
that can be a human I mean, just generally, that

300
00:18:49.359 --> 00:18:52.359
can be you shouldn't be using this person to do

301
00:18:52.400 --> 00:18:56.680
this job, because they're much more qualified than you need

302
00:18:56.720 --> 00:18:59.480
to do this particular job. You could be using an

303
00:18:59.559 --> 00:19:03.559
agent at this point, and you could be honoring all

304
00:19:03.599 --> 00:19:05.759
of the time and all of the energy and all

305
00:19:05.799 --> 00:19:08.599
of the effort that this person has put in to

306
00:19:08.720 --> 00:19:12.519
making them uniquely qualified to do a job, and you

307
00:19:12.559 --> 00:19:16.119
could be having them do that job. Right, So you

308
00:19:16.160 --> 00:19:18.680
can have waste at all sorts of levels. They can

309
00:19:18.720 --> 00:19:21.519
be materials, they can be humans, they can be data,

310
00:19:21.640 --> 00:19:24.680
they can be money, they can be any resource that

311
00:19:24.759 --> 00:19:28.640
you have. And basically what we think should be happening

312
00:19:28.720 --> 00:19:31.680
is we think businesses should be looking all the time

313
00:19:32.400 --> 00:19:38.440
to remove, reduce, and eliminate waste. And the kind of

314
00:19:38.400 --> 00:19:41.960
a punchline there, if you like, is that all waste

315
00:19:42.319 --> 00:19:45.480
is costly, right, All waste has a cost associated with it,

316
00:19:45.599 --> 00:19:48.440
but not all costs are wasteful. So it means that

317
00:19:48.480 --> 00:19:52.559
whenever you're focused on waste, you're always cutting costs anyway,

318
00:19:53.240 --> 00:19:56.400
you're just cutting the right kind of costs, because anything

319
00:19:56.440 --> 00:20:00.200
that's wasteful is anything that does not advantage you

320
00:20:00.319 --> 00:20:02.640
and in particular does not add value to your customers.

321
00:20:02.680 --> 00:20:05.880
So that's where we're really thinking about waste

322
00:20:06.160 --> 00:20:08.960
rather than cost, and where we think about value

323
00:20:09.279 --> 00:20:10.400
rather than just cost.

324
00:20:10.920 --> 00:20:15.920
And some examples in business research shows that when you're

325
00:20:15.960 --> 00:20:19.480
a sales professional, you're only spending twenty five percent of

326
00:20:19.480 --> 00:20:24.839
your time actually selling, becoming a trusted advisor guiding your customers.

327
00:20:24.920 --> 00:20:29.440
Seventy five percent of the time, sales professionals are doing research and administrative

328
00:20:29.519 --> 00:20:32.160
tasks and not in front of a client or a

329
00:20:32.200 --> 00:20:36.559
prospect actually helping them achieve their success. Eighty percent of

330
00:20:36.680 --> 00:20:41.799
marketing leads are never transitioned to sales organizations. They sit

331
00:20:41.880 --> 00:20:45.200
in the funnel and they decay to become irrelevant. So

332
00:20:45.240 --> 00:20:49.240
there's a ton of research that speaks to wasteful activity

333
00:20:49.279 --> 00:20:52.119
in your call centers, wasteful activity in your sales and

334
00:20:52.160 --> 00:20:56.559
marketing human resources. And then when you look at pure

335
00:20:56.599 --> 00:21:00.400
research and you look at people's sentiment towards work, many

336
00:21:00.440 --> 00:21:03.279
workers are looking for their next gig. It's harder to

337
00:21:03.359 --> 00:21:06.480
maintain talent in your company, and we believe part of

338
00:21:06.480 --> 00:21:10.920
that is we're assigning wasteful activity to our human resources,

339
00:21:11.799 --> 00:21:14.680
things that are repetitive, things that are boring, boring. In

340
00:21:14.720 --> 00:21:17.160
a blink of an eye, you realize you spent ten

341
00:21:17.240 --> 00:21:20.640
years of your career pushing a bunch of buttons and

342
00:21:21.519 --> 00:21:24.720
doing work that, frankly, software could have done on your behalf,

343
00:21:25.119 --> 00:21:28.960
freeing you up to listen to podcasts, to go to conferences,

344
00:21:29.000 --> 00:21:31.880
to read books, to go back to school and become

345
00:21:31.960 --> 00:21:37.319
lifelong learners. Frankly, I believe this technology can free us

346
00:21:38.119 --> 00:21:41.559
and give us joy and not just sense of belonging.

347
00:21:41.599 --> 00:21:45.039
Because we carry a badge. We have a badge on

348
00:21:45.079 --> 00:21:47.640
the cover of our book. It's a reminder that you

349
00:21:47.680 --> 00:21:51.319
can belong to an organization and a company, but that

350
00:21:51.359 --> 00:21:55.559
doesn't guarantee a sense of mattering. And I believe that

351
00:21:55.680 --> 00:21:58.839
if you use technology in the right way, you can

352
00:21:58.839 --> 00:22:01.559
not only have a sense of belonging, but you actually

353
00:22:01.559 --> 00:22:04.359
believe your work matters. And I don't know how you

354
00:22:04.400 --> 00:22:07.039
can achieve your boundless potential if you don't think your

355
00:22:07.039 --> 00:22:11.359
work matters. And so I'm very optimistic. We're very optimistic,

356
00:22:11.759 --> 00:22:14.599
and I know the pessimists can sound smarter, but in

357
00:22:14.640 --> 00:22:18.839
our experience, the future is built by optimists. So you know,

358
00:22:18.920 --> 00:22:21.880
regardless of your point of view with the technology, and

359
00:22:21.920 --> 00:22:24.240
it can be scary. Trust me, when I get a

360
00:22:24.279 --> 00:22:27.480
ride in these autonomous vehicles, I put my seatbelt on.

361
00:22:27.960 --> 00:22:31.079
You know, you know it's scary, but it's also exciting.

362
00:22:31.599 --> 00:22:36.359
And we're witnessing companies really achieve a level of creativity

363
00:22:37.119 --> 00:22:41.559
and value creation unlike anytime in our lifetime. You just

364
00:22:41.640 --> 00:22:45.319
have to be mindful and you have to put you know,

365
00:22:45.680 --> 00:22:49.759
our colleagues, you know, on the edge, in the center

366
00:22:49.839 --> 00:22:52.920
and everywhere when you're making decisions on how to invest

367
00:22:52.920 --> 00:22:55.920
in technology so you can elevate your people at the

368
00:22:56.000 --> 00:22:59.119
end of the day. It should not be a replacement strategy.

369
00:22:59.160 --> 00:23:02.640
This should be, as Henry said, a waste removal strategy,

370
00:23:03.279 --> 00:23:06.920
freeing your human talent to reach their boundless potential.

371
00:23:09.039 --> 00:23:13.759
Yes, definitely very powerful, both Henry and Vala. Like

372
00:23:13.839 --> 00:23:19.200
your point is, cost cutting is something that I do

373
00:23:19.279 --> 00:23:24.000
not like. And, you know, no one

374
00:23:24.119 --> 00:23:28.279
likes it. Coming from a Lean Six Sigma background, I

375
00:23:28.279 --> 00:23:33.119
can definitely kind of tell you that redundancies can be

376
00:23:33.240 --> 00:23:36.079
waste looking at the processes, and so many times what

377
00:23:36.119 --> 00:23:39.559
happens I go to the doctor's office, or I am

378
00:23:39.640 --> 00:23:42.960
at the car dealership and I'm like looking, oh my goodness,

379
00:23:43.039 --> 00:23:46.480
like there's so much waste over here and like duplicates.

380
00:23:46.559 --> 00:23:49.799
So I completely agree with you all that you know,

381
00:23:49.839 --> 00:23:52.480
it's important to look at where the waste is happening.

382
00:23:52.759 --> 00:23:56.359
And I'm sure y'all have also seen that. But with

383
00:23:56.440 --> 00:24:00.519
so many of my clients, the one thing that repeatedly

384
00:24:00.559 --> 00:24:04.799
crops up is that people are tired of doing the research.

385
00:24:04.880 --> 00:24:06.640
Like you know, even if you're an account executive as

386
00:24:06.640 --> 00:24:10.079
you are talking about sales or marketing or for that matter,

387
00:24:10.359 --> 00:24:14.079
any like, you know, most of these jobs, they do

388
00:24:14.160 --> 00:24:16.559
spend all that time in doing the research, like okay,

389
00:24:16.559 --> 00:24:19.319
what is the competition doing? So if you have

390
00:24:19.440 --> 00:24:25.480
agents for that, then you're freeing up time for employees

391
00:24:25.640 --> 00:24:30.039
to do the work that they like. So definitely, that's

392
00:24:30.279 --> 00:24:32.640
that's a good way of looking at it. Now, when

393
00:24:32.640 --> 00:24:35.920
we say this, how would you say this is going

394
00:24:35.960 --> 00:24:39.240
to impact the changing role of the CIO and the CHRO?

395
00:24:39.119 --> 00:24:47.519
Great, great question. Okay, Henry's smiling here. Go ahead, Henry. Well,

396
00:24:48.119 --> 00:24:49.799
I love the fact that you just kind of jumped

397
00:24:49.839 --> 00:24:53.759
straight into that. I think that's I think that's super important.

398
00:24:55.240 --> 00:25:00.319
You know, so Vala talked about the siloed world, right,

399
00:25:00.599 --> 00:25:04.240
and I mean to be clear when we talk about silos.

400
00:25:04.680 --> 00:25:08.160
You know, first of all, silos are everywhere. No one

401
00:25:08.319 --> 00:25:11.920
likes them, right, everyone complains about that, but they're but

402
00:25:11.920 --> 00:25:16.119
but they're in every single industry that we've come across,

403
00:25:16.160 --> 00:25:20.240
every single industry. You know, we were giving a keynote

404
00:25:20.319 --> 00:25:23.160
at an education conference at

405
00:25:23.400 --> 00:25:26.920
Arizona State University last week, and you know, they're complaining

406
00:25:27.000 --> 00:25:32.319
about their silos in higher education, they complain about

407
00:25:32.359 --> 00:25:35.200
them in agriculture, they complain about them in manufacturing and

408
00:25:35.640 --> 00:25:39.640
in retail and finance, and they're just everywhere and

409
00:25:39.680 --> 00:25:41.559
no one likes them. And so really kind of the

410
00:25:41.599 --> 00:25:44.880
question is why does that happen? And and it turns

411
00:25:44.920 --> 00:25:48.400
out that silos are actually just the way we manage resources,

412
00:25:48.559 --> 00:25:53.440
or the way that we conventionally have managed resources.

413
00:25:53.480 --> 00:25:55.799
So we put all of our marketing expertise into a

414
00:25:55.880 --> 00:26:00.799
marketing bucket or a marketing silo, a marketing department. We do

415
00:26:00.880 --> 00:26:02.559
the same with all of our IT: all of our

416
00:26:03.279 --> 00:26:06.680
technological resources we put into the IT department, and

417
00:26:06.720 --> 00:26:08.880
so on and so forth, and this is how we

418
00:26:09.000 --> 00:26:13.519
end up with silos. And this is not a criticism of

419
00:26:13.559 --> 00:26:18.279
any individual organization, it's just how we have developed over decades,

420
00:26:18.359 --> 00:26:21.480
maybe even over hundreds of years. This is how we

421
00:26:21.559 --> 00:26:25.279
come to have a CIO or a CTO who is

422
00:26:25.480 --> 00:26:28.960
seen as being the leader of all of the technological resources,

423
00:26:29.400 --> 00:26:32.279
and we have a CHRO who is seen as being

424
00:26:32.319 --> 00:26:35.240
the leader of all of the human resources. And the

425
00:26:35.359 --> 00:26:37.720
question is, and this is I think very much what

426
00:26:37.799 --> 00:26:40.440
we're getting at is, so what does the future.

427
00:26:40.079 --> 00:26:45.039
look like? Digital labor, where it's not just human resources,

428
00:26:45.759 --> 00:26:49.200
it's human and, you know, agentic AI

429
00:26:49.119 --> 00:26:55.960
resources. Go ahead. Yeah, that's right. So there are discussions

430
00:26:55.960 --> 00:26:58.519
going on about, you know, whether the CIO now kind

431
00:26:58.559 --> 00:27:02.240
of takes a closer position next to the CEO because

432
00:27:02.279 --> 00:27:06.640
they're responsible for more important resources, or, you know,

433
00:27:06.640 --> 00:27:10.799
what does it look like in our mind? Just as

434
00:27:10.880 --> 00:27:14.359
Vala said, we now need to be not only cognizant of,

435
00:27:14.480 --> 00:27:18.039
but we need to be actively working in a world

436
00:27:18.119 --> 00:27:21.079
where we know that we have both digital and human

437
00:27:22.200 --> 00:27:27.079
labor, working together on some tasks, working individually on other tasks.

438
00:27:27.759 --> 00:27:31.880
And we need to be able to see them collectively,

439
00:27:32.440 --> 00:27:36.920
as you know, our most important resource in business, and

440
00:27:37.079 --> 00:27:39.799
we need to be able to see the relationships between

441
00:27:39.880 --> 00:27:42.759
them as the most important thing that we have in

442
00:27:42.799 --> 00:27:45.519
our business. Right at the beginning of your podcast, Divya,

443
00:27:45.640 --> 00:27:47.839
you've got, I think, a statement that

444
00:27:47.880 --> 00:27:52.599
says something like, relationships are essential to achieving

445
00:27:52.640 --> 00:27:55.720
your potential. Right, I think that's right in the first

446
00:27:55.799 --> 00:27:59.759
thirty seconds of the introduction to your podcast. And of course,

447
00:27:59.799 --> 00:28:01.960
as you can tell, both of us are like, she's

448
00:28:02.039 --> 00:28:07.160
so right, that's exactly right. And so we think that

449
00:28:07.240 --> 00:28:11.640
the future of the CIO and the CHRO, however that's

450
00:28:11.759 --> 00:28:16.880
kind of organized, needs to be about developing

451
00:28:17.000 --> 00:28:24.680
new relationships between humans and agentic AI such that they

452
00:28:24.839 --> 00:28:30.480
all individually and collectively achieve their potential. When we talk

453
00:28:30.480 --> 00:28:34.599
about Autonomous, let me get it here, here's the book.

454
00:28:35.440 --> 00:28:38.200
When we talk about Autonomous, we're not just talking about

455
00:28:38.279 --> 00:28:42.839
agentic autonomy. We talk about human autonomy as well. And

456
00:28:43.000 --> 00:28:46.880
the difference that we're really talking about between the two is

457
00:28:47.599 --> 00:28:51.160
that we think that the agentic AI can take on

458
00:28:51.319 --> 00:28:54.880
more of the operational autonomy. They can do more and

459
00:28:55.000 --> 00:28:59.519
more tasks without humans, but humans take on the mission autonomy.

460
00:29:00.079 --> 00:29:03.240
You know, we're the ones that actually set the direction.

461
00:29:04.160 --> 00:29:07.279
We set all of the guardrails. We've already kind of

462
00:29:07.279 --> 00:29:09.920
talked about all of that. We set the governance, but

463
00:29:10.000 --> 00:29:13.200
we set the direction, We set the north star, as

464
00:29:13.240 --> 00:29:15.960
we talked about, as Vala just talked about, when you're

465
00:29:15.960 --> 00:29:20.119
in a Waymo, now you're not driving the car anymore.

466
00:29:19.920 --> 00:29:25.279
You're not a machine operator, which is

467
00:29:25.319 --> 00:29:28.599
what we've always been as car drivers before. But we're

468
00:29:28.680 --> 00:29:32.240
still saying here's where we want to go and here's

469
00:29:32.279 --> 00:29:36.400
how we want to get there, and then having set

470
00:29:36.400 --> 00:29:39.640
the mission, then we're free. And that's what we see

471
00:29:39.720 --> 00:29:44.799
in business. We're seeing that humans set the direction, they

472
00:29:44.839 --> 00:29:48.119
set the vision, they set the goals, they set the culture,

473
00:29:48.680 --> 00:29:52.920
they set all of the conditions for success, and AI

474
00:29:53.000 --> 00:29:57.160
takes more of the operational autonomy. So, again, in answer

475
00:29:57.200 --> 00:29:59.240
to your question, where does this lead the CIO and

476
00:29:59.279 --> 00:30:03.599
the CHRO. There's a danger that if they stay in

477
00:30:03.640 --> 00:30:07.119
their existing buckets, if they stay in IT and if

478
00:30:07.119 --> 00:30:10.440
they stay in HR, then either one or both of

479
00:30:10.480 --> 00:30:13.279
them are going to become redundant and obsolete, to use

480
00:30:13.319 --> 00:30:17.000
a word that you used earlier. If they combine forces

481
00:30:17.799 --> 00:30:22.720
and they look over each other's silos and they recognize

482
00:30:22.720 --> 00:30:25.599
that what we're really looking now at is a future

483
00:30:25.759 --> 00:30:29.559
of a hybrid workforce, then I think some very very

484
00:30:29.599 --> 00:30:31.960
strong things can be done that have never been done before.

485
00:30:32.839 --> 00:30:37.559
My advice to CHROs: the longest study on happiness

486
00:30:37.559 --> 00:30:41.559
and success, the Harvard study that, I believe, you know,

487
00:30:41.559 --> 00:30:45.920
monitored one thousand people for over seventy years, found that

488
00:30:46.400 --> 00:30:53.200
the number one success factor for happiness and success was relationships,

489
00:30:53.480 --> 00:30:56.599
not fame, not fortune. The people who had the healthiest relationships

490
00:30:56.880 --> 00:30:59.839
were the happiest and most successful. Again, the longest study

491
00:30:59.880 --> 00:31:03.279
on happiness by Harvard. And so, in fact, I work

492
00:31:03.319 --> 00:31:07.160
for a company, CRM, where that R in CRM is relationships,

493
00:31:07.960 --> 00:31:12.079
and CHROs need to recognize that as much as we

494
00:31:12.160 --> 00:31:17.480
all value relationships, none of us have taken formal courses

495
00:31:18.039 --> 00:31:21.799
on designing healthy relationships. I've never had one in my career.

496
00:31:22.599 --> 00:31:26.279
And now you're talking about we're the last generation of

497
00:31:26.359 --> 00:31:29.680
business leaders that are only going to manage humans. We're

498
00:31:29.680 --> 00:31:32.160
now going to be tasked with managing humans and agents.

499
00:31:33.000 --> 00:31:37.039
So how do you reach your full potential, your boundless potential,

500
00:31:37.440 --> 00:31:43.680
in the absence of purposeful frameworks that ensure human to human,

501
00:31:44.160 --> 00:31:50.319
human to software, software to software relationships that help

502
00:31:50.400 --> 00:31:53.880
us be happy and successful? We need to now think

503
00:31:53.920 --> 00:31:58.160
about strongly designing for relationships. And then it's no longer

504
00:31:58.279 --> 00:32:01.920
human resources, it's intelligent resources, and it could be human,

505
00:32:01.920 --> 00:32:04.880
it could be software, and so the paradigm has changed.

506
00:32:04.960 --> 00:32:07.559
We are now at an inflection point where my company

507
00:32:07.559 --> 00:32:11.680
has thirteen thousand companies that have deployed agents, which is

508
00:32:11.920 --> 00:32:16.720
unlike any other adoption curve in my lifetime. So CHROs

509
00:32:16.799 --> 00:32:21.079
get in the game; design for healthy relationships. Recognize that

510
00:32:21.200 --> 00:32:24.480
all companies are tech companies. Meaning this is why we

511
00:32:24.559 --> 00:32:27.759
say AI first strategy and digital labor, and we use

512
00:32:27.799 --> 00:32:31.079
the word fittest because Henry and I can't imagine a

513
00:32:31.160 --> 00:32:36.240
company that can compete and win today and tomorrow unless

514
00:32:36.279 --> 00:32:40.640
they have digital labor, because digital labor gives you limitless

515
00:32:40.680 --> 00:32:44.759
resources for you to create value at speed, scale and

516
00:32:44.799 --> 00:32:48.599
intelligence unlike anything we've had in the past. So I

517
00:32:48.640 --> 00:32:51.440
think it's Batman and Robin when I think about CIOs

518
00:32:51.440 --> 00:32:55.480
and CHROs, or Batman and Wonder Woman. Let's make sure

519
00:32:57.880 --> 00:33:02.319
that diversity is actually a competitive advantage, especially in a

520
00:33:02.319 --> 00:33:05.480
world where our thoughts are becoming algorithms. We have to

521
00:33:05.519 --> 00:33:08.559
have diversity because if you don't have diversity, you get

522
00:33:08.599 --> 00:33:12.200
biases of a common cohort creating algorithms that don't represent

523
00:33:12.599 --> 00:33:17.119
society or customers, your employees, your partners. So these two

524
00:33:18.440 --> 00:33:23.440
lines of business leaders have never had more opportunity to

525
00:33:23.480 --> 00:33:28.279
help shape their companies' future success. And hopefully Batman and

526
00:33:28.319 --> 00:33:31.039
Wonder Woman get in the game and think about this

527
00:33:31.079 --> 00:33:34.920
whole new world that we're building, because it's incredibly exciting

528
00:33:35.279 --> 00:33:37.480
but at the same time a little bit scary as well.

529
00:33:38.359 --> 00:33:42.519
And this is, you know, one of the

530
00:33:42.599 --> 00:33:45.519
unknowns as well. Just as Vala said,

531
00:33:45.559 --> 00:33:49.480
you know, he's never taken a course in relationship building,

532
00:33:49.839 --> 00:33:53.200
you know, a professional course in relationship building, because there

533
00:33:53.200 --> 00:33:56.400
aren't any And not only are there are no courses,

534
00:33:56.720 --> 00:33:59.519
but there's no design that he talked to again about

535
00:33:59.599 --> 00:34:04.519
design and designing intentionally, there is no design practice or

536
00:34:04.559 --> 00:34:09.800
no design discipline for relationships. We do have, and we

537
00:34:09.920 --> 00:34:12.239
have had for the last twenty, twenty-five years or so,

538
00:34:12.920 --> 00:34:21.760
design practices for experiences, but never for relationships.

539
00:34:22.400 --> 00:34:25.320
And so this is something that we think is going

540
00:34:25.360 --> 00:34:27.519
to be another of these new things. It's nothing to

541
00:34:27.519 --> 00:34:30.079
do with the technology. This is nothing to do with

542
00:34:30.079 --> 00:34:32.960
the technology. It's like we've been in a world of

543
00:34:33.000 --> 00:34:38.760
relationships for tens of thousands or hundreds of thousands of years,

544
00:34:39.760 --> 00:34:42.760
and in the business world, we don't really take them

545
00:34:42.800 --> 00:34:45.440
as seriously as we should, even though everyone knows they're

546
00:34:45.480 --> 00:34:48.559
the most important thing. Again at the beginning of your podcast,

547
00:34:48.639 --> 00:34:51.400
the most important thing. Everyone knows it, and yet we

548
00:34:51.400 --> 00:34:53.760
don't design for them. There is no way to design

549
00:34:53.760 --> 00:34:54.519
for them currently.

550
00:34:55.840 --> 00:34:58.159
To use my analogy, we assume if you deliver a

551
00:34:58.159 --> 00:35:01.159
good experience, we're likely to have a good relationship, and

552
00:35:01.199 --> 00:35:03.320
if you don't hear from the other side, we assume

553
00:35:03.320 --> 00:35:05.840
they're satisfied. In fact, it could be indifference, which is

554
00:35:05.880 --> 00:35:09.480
the greatest enemy to your success. So you know, we

555
00:35:09.559 --> 00:35:14.079
can't assume that. And, you know, advocacy

556
00:35:14.119 --> 00:35:17.280
and loyalty are based on your last experience, your first experience,

557
00:35:17.320 --> 00:35:21.880
and all the experiences in between. But that's not specific

558
00:35:22.000 --> 00:35:26.440
design thinking in terms of reciprocity of value and trust,

559
00:35:26.800 --> 00:35:30.760
which is foundational to healthy relationships. And now you're talking

560
00:35:30.760 --> 00:35:35.599
about humans and software. You're talking about speed, scale, intelligence,

561
00:35:36.000 --> 00:35:39.079
and personalization. That's not like anything we've ever experienced in

562
00:35:39.119 --> 00:35:42.079
the past. So we can no longer afford to just

563
00:35:42.159 --> 00:35:46.320
assume a good experience means a good relationship. It really needs to

564
00:35:46.320 --> 00:35:47.519
be more deliberate and precise.

565
00:35:48.199 --> 00:35:50.519
You're absolutely right, and at the core of it, it's

566
00:35:50.559 --> 00:35:53.599
going to be the human relationships. And you mentioned something

567
00:35:53.760 --> 00:35:57.519
very powerful that it's not just now between human and human,

568
00:35:57.639 --> 00:36:01.920
it is between software and software. Between let's just call

569
00:36:01.960 --> 00:36:04.639
them bots, because most people, like you know in the audience,

570
00:36:04.719 --> 00:36:07.280
like, understand. So like a bot talking to a bot, a bot talking

571
00:36:07.280 --> 00:36:11.039
to a human, a human having different bosses. So given all

572
00:36:11.039 --> 00:36:14.239
of that, you know what, as we keep on building

573
00:36:14.280 --> 00:36:19.159
these UIs or the user interface and user experiences, whether

574
00:36:19.159 --> 00:36:21.800
it's with the human being or with the software, what

575
00:36:21.920 --> 00:36:25.119
are the seven levels? If you can mention that in

576
00:36:25.199 --> 00:36:27.320
brief, as we are kind of coming to a close of

577
00:36:27.360 --> 00:36:31.760
the show, what would those seven autonomous levels be?

578
00:36:33.159 --> 00:36:37.119
Yeah, in the book, we've included even a chart to summarize. Again.

579
00:36:38.280 --> 00:36:42.280
You know, you take any job, and a job consists

580
00:36:42.320 --> 00:36:45.920
of multiple tasks. So in order to complete a job,

581
00:36:46.199 --> 00:36:48.840
it may be one task, it maybe twenty tasks, but

582
00:36:49.000 --> 00:36:52.639
ultimately you can break it into very composable levels of tasks.

583
00:36:53.159 --> 00:36:55.679
So the first thing for an agent is, once you

584
00:36:55.760 --> 00:37:00.440
define the task, you allow the agent actions that

585
00:37:00.480 --> 00:37:04.719
you sanction, actions you want the agent to take; guardrails,

586
00:37:04.760 --> 00:37:07.239
things you never want the agent to do. So if

587
00:37:07.280 --> 00:37:09.519
you're on an e-commerce site, you don't want the

588
00:37:09.559 --> 00:37:12.760
agent to offer up discounts; you know, you always want

589
00:37:12.760 --> 00:37:16.280
a human involved, or offer up promotions. So those are

590
00:37:16.639 --> 00:37:20.599
the guardrails, and then the channels where you want the

591
00:37:20.639 --> 00:37:23.760
intelligent software to operate. It could be WhatsApp, it could

592
00:37:23.800 --> 00:37:28.159
be your app, your website, your social network. So jobs, actions,

593
00:37:28.199 --> 00:37:32.960
guardrails, and channels, in a very precise, very composable way. So

594
00:37:33.360 --> 00:37:35.679
the first level of autonomy is a task that can

595
00:37:35.719 --> 00:37:39.719
be delegated to intelligent software. If you can do collections

596
00:37:39.760 --> 00:37:44.159
of tasks, it could be potentially a job or a role.

597
00:37:44.679 --> 00:37:48.960
So an individual within your line of business, entire set

598
00:37:49.039 --> 00:37:52.320
of tasks, and the job could be delegated to one

599
00:37:52.480 --> 00:37:56.840
or multiple agents. Eventually you can spread that capability where

600
00:37:56.880 --> 00:38:00.119
it could be your marketing department, your sales department. You

601
00:38:00.159 --> 00:38:03.039
can go from an individual to a role, to a

602
00:38:03.159 --> 00:38:07.960
team to a line of business. You have agentic orchestration,

603
00:38:08.119 --> 00:38:10.679
So your sales agent is talking to your marketing agent,

604
00:38:10.800 --> 00:38:14.239
your customer service agent, your commerce or human resources agent.

605
00:38:14.599 --> 00:38:18.119
Your entire enterprise could have an agentic layer. Now you

606
00:38:18.119 --> 00:38:22.840
have people and software across your entire business that can

607
00:38:22.840 --> 00:38:26.159
co create value. And then ultimately every business has an

608
00:38:26.199 --> 00:38:30.400
ecosystem of partners, so you have, you know, your tech stack,

609
00:38:30.440 --> 00:38:34.639
and an enterprise could represent forty fifty sixty different partners

610
00:38:35.119 --> 00:38:37.960
and every company will have its own agents. And at

611
00:38:37.960 --> 00:38:40.480
that point you need to be able to orchestrate and

612
00:38:40.519 --> 00:38:44.960
create more of a choreography of agentic capabilities that's beyond

613
00:38:45.039 --> 00:38:49.079
just your enterprise, but your ecosystem. So the last level

614
00:38:49.360 --> 00:38:54.679
is ecosystem orchestration of agentic capabilities. And those are the levels.

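As an aside, the composition Vala describes above (a delegated job with sanctioned actions, guardrails, and channels) and the widening scope of autonomy can be sketched roughly as follows. The class and level names here are hypothetical, not any vendor's actual API; the book's own chart breaks this into seven levels, and this sketch only captures the stages named in the conversation:

```python
# Illustrative sketch only: one way to model the "jobs, actions,
# guardrails, channels" composition and the widening scope of autonomy
# described above. All names are hypothetical, not a real product API.
from dataclasses import dataclass
from enum import IntEnum

class AutonomyScope(IntEnum):
    TASK = 1              # a single task delegated to intelligent software
    JOB = 2               # a collection of tasks: a job or role
    TEAM = 3              # a team's work spread across one or more agents
    LINE_OF_BUSINESS = 4  # e.g. the marketing or sales department
    ENTERPRISE = 5        # an agentic layer across the whole business
    ECOSYSTEM = 6         # orchestration across partners' agents

@dataclass
class AgentConfig:
    job: str                       # the task or job being delegated
    sanctioned_actions: list[str]  # actions the agent is allowed to take
    guardrails: list[str]          # things it must never do
    channels: list[str]            # where it is allowed to operate
    scope: AutonomyScope = AutonomyScope.TASK

# Example: the e-commerce scenario mentioned in the conversation.
checkout_agent = AgentConfig(
    job="answer product questions during checkout",
    sanctioned_actions=["look up inventory", "summarize reviews"],
    guardrails=["never offer discounts or promotions without a human involved"],
    channels=["website", "WhatsApp"],
)
print(checkout_agent.scope.name)  # TASK
```
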
615
00:38:54.760 --> 00:38:57.920
And in an autonomous vehicle that we've been mentioning, you know,

616
00:38:57.920 --> 00:39:01.360
there are also six levels of autonomy, level zero to level five.

617
00:39:01.519 --> 00:39:04.840
Zero means no autonomy. Level two or three could be

618
00:39:04.960 --> 00:39:07.559
like parking assist or you have to have your hand

619
00:39:07.559 --> 00:39:11.159
on the wheel while the car is operating. But when

620
00:39:11.159 --> 00:39:13.360
we talk about Zoox and we talk about Waymo

621
00:39:13.400 --> 00:39:18.639
and we talk about the Cybercab, there is no human involvement.

622
00:39:18.719 --> 00:39:23.920
The car is designed for passengers, not drivers. And when

623
00:39:23.920 --> 00:39:28.079
we talk about levels of autonomy in business, the scary

624
00:39:28.119 --> 00:39:30.639
part is, when you get into a Zoox or a Cybercab

625
00:39:30.639 --> 00:39:33.880
from Tesla, the first thing you notice, and

626
00:39:33.960 --> 00:39:37.679
you mentioned user interface, is that all the human user interfaces have been removed.

627
00:39:38.079 --> 00:39:40.880
There is no steering wheel in a Cybercab, there is

628
00:39:40.920 --> 00:39:45.199
no brake or gas pedal, there is no mirror, there

629
00:39:45.239 --> 00:39:49.239
is no window in the back. So that AI-first

630
00:39:49.480 --> 00:39:53.960
design means some courageous engineer raised her hand,

631
00:39:54.039 --> 00:39:56.400
probably five or six years ago, in

632
00:39:56.440 --> 00:40:00.719
a meeting at Amazon or at Tesla, and said, I have

633
00:40:00.760 --> 00:40:03.480
an idea. Let's design cars where we don't need a

634
00:40:03.480 --> 00:40:07.719
steering wheel, where we don't need gas or brake pedals.

635
00:40:08.800 --> 00:40:12.199
I'm sure the room was very quiet as she courageously

636
00:40:12.280 --> 00:40:17.519
raised her hand. But the realization was, let's build cars

637
00:40:17.519 --> 00:40:20.239
for riders, so that I don't have to take the

638
00:40:20.320 --> 00:40:22.199
keys from my eighty-five-year-old dad, which I

639
00:40:22.199 --> 00:40:25.440
had to do this year, so that the five million people who

640
00:40:25.559 --> 00:40:28.440
lose their vision every year can still

641
00:40:28.480 --> 00:40:31.239
get from A to B and own a vehicle. If

642
00:40:31.239 --> 00:40:34.199
you're dealing with seizures, if you're dealing with aging, if

643
00:40:34.239 --> 00:40:39.519
you're dealing with sight loss. And ultimately we'll see positive

644
00:40:39.559 --> 00:40:42.440
externalities with all of this, because these cars are constantly

645
00:40:42.480 --> 00:40:45.000
in motion, so we don't need as many parking spots.

646
00:40:45.239 --> 00:40:48.639
We can have libraries, we can have museums, we can

647
00:40:48.639 --> 00:40:51.840
have more schools, so there'll be societal impact as a

648
00:40:51.840 --> 00:40:55.639
result of this AI-first strategy, and most importantly,

649
00:40:55.840 --> 00:40:58.800
you know it'll be safer to operate and cheaper to produce.

650
00:40:59.079 --> 00:41:03.719
So the seven levels are a reminder that the worst

651
00:41:03.760 --> 00:41:07.519
AI we'll ever use is what we're using today. No matter what complaints you

652
00:41:07.559 --> 00:41:11.320
have about the algorithm, it's slow, it hallucinates. Just wait

653
00:41:11.360 --> 00:41:14.639
a week, wait a month, wait a quarter. Certainly by

654
00:41:14.679 --> 00:41:17.079
this time next year, I mean, just look at the

655
00:41:17.119 --> 00:41:19.360
improvements in the tools that we have on our phones,

656
00:41:19.400 --> 00:41:21.159
whether it's Gemini, Perplexity, or ChatGPT.

657
00:41:21.920 --> 00:41:24.239
Yeah, you know, from day one, when I

658
00:41:24.360 --> 00:41:27.639
jumped in when ChatGPT came out, because I'm usually

659
00:41:27.639 --> 00:41:31.119
an early adopter for any technology, it was

660
00:41:31.159 --> 00:41:34.039
that clunky, you know, ChatGPT-3. And now you have

661
00:41:34.119 --> 00:41:36.239
got the reasoning models, so you can even kind

662
00:41:36.239 --> 00:41:39.920
of see how it is thinking things through; it

663
00:41:40.079 --> 00:41:43.719
literally kind of shows you, and it's just fascinating. So

664
00:41:44.079 --> 00:41:47.039
not to detract from what you're saying, but the one

665
00:41:47.039 --> 00:41:49.719
thing that really hit me hard was that

666
00:41:50.360 --> 00:41:54.639
taking the keys away from your parents is taking that independence away.

667
00:41:55.199 --> 00:41:59.599
So AI is something that will provide that agency to

668
00:41:59.719 --> 00:42:02.760
people to live their lives with dignity, and to bring

669
00:42:03.119 --> 00:42:06.880
out the full potential of workers, you know, making that

670
00:42:06.960 --> 00:42:10.119
work matter. And it will also allow people to

671
00:42:10.239 --> 00:42:14.519
take leadership, giving them permission; anybody can take

672
00:42:14.519 --> 00:42:17.920
leadership and look at what could be the next frontier.

673
00:42:20.599 --> 00:42:24.199
And as Henry mentioned, you know, the average adult in

674
00:42:24.199 --> 00:42:26.559
the US can have up to an hour of commute.

675
00:42:27.119 --> 00:42:29.960
You know what's wasteful? Driving a car for an hour

676
00:42:30.559 --> 00:42:32.960
when you could have a vehicle that's autonomous. So now

677
00:42:33.039 --> 00:42:38.639
you can read your favorite book, you can watch Netflix,

678
00:42:39.039 --> 00:42:42.559
you can work on your Salesforce dashboards, you can listen

679
00:42:42.599 --> 00:42:47.679
to a podcast comfortably. And so you have this nonlinear

680
00:42:47.800 --> 00:42:52.000
optionality when you have autonomous vehicles. But today, the three of us,

681
00:42:52.039 --> 00:42:53.840
when we get in our car, we have one job,

682
00:42:54.079 --> 00:42:56.800
drive the car. There's nothing else we can do. But

683
00:42:57.000 --> 00:42:59.159
my son is fifteen years old, and I can

684
00:42:59.199 --> 00:43:03.000
imagine, when he leaves graduate school and he's commuting

685
00:43:03.039 --> 00:43:05.960
to work, all the options he'll have in his vehicle.

686
00:43:06.440 --> 00:43:10.400
He can really invest in himself and self-educate and train.

687
00:43:10.960 --> 00:43:13.119
And I used to have a two-hour commute early in

688
00:43:13.119 --> 00:43:15.800
my career. I mean, I would drive from Massachusetts to New

689
00:43:15.840 --> 00:43:19.400
Hampshire, one hour each way. Think about it: for ten

690
00:43:19.519 --> 00:43:22.159
years I drove two hours, five days a week. You

691
00:43:22.159 --> 00:43:24.119
know what I could have done in those two hours

692
00:43:24.119 --> 00:43:26.360
over a decade? First of all, I'd have a PhD,

693
00:43:26.440 --> 00:43:30.599
which would make my mom happy. So

694
00:43:30.679 --> 00:43:32.199
no matter what I do, I still need to get

695
00:43:32.239 --> 00:43:35.239
my doctorate. And also, Henry and I would have been

696
00:43:35.239 --> 00:43:39.280
on our seventh book, not our second. So think

697
00:43:39.360 --> 00:43:44.480
about wastefulness and lean into technology. We believe that ultimately

698
00:43:44.960 --> 00:43:48.559
we're going to see a graduation from autonomous products, to

699
00:43:48.599 --> 00:43:53.280
autonomous factories, to autonomous companies, and this great amount of

700
00:43:53.360 --> 00:43:56.559
freedom to gain and waste that we can remove on

701
00:43:56.599 --> 00:43:58.719
this journey. And this journey is another ten, twenty years

702
00:43:58.760 --> 00:44:01.440
and ongoing. This is not about next week, next month,

703
00:44:01.480 --> 00:44:04.159
next year. We're absolutely at the beginning.

704
00:44:04.920 --> 00:44:09.159
Absolutely. Well, thank you Henry, and thank you Vala. How

705
00:44:09.199 --> 00:44:12.559
can people connect with you? And

706
00:44:12.760 --> 00:44:15.800
where can they find your book? I know it's

707
00:44:15.800 --> 00:44:18.400
on Amazon. Is it anywhere else?

708
00:44:20.039 --> 00:44:24.800
Yes. So it's on Barnes & Noble online and

709
00:44:25.000 --> 00:44:30.800
in physical stores. It's on all major e-commerce platforms.

710
00:44:30.960 --> 00:44:37.360
It's in all major physical bookstores across the US. It's

711
00:44:37.519 --> 00:44:42.760
available on Amazon in all major European countries as well.

712
00:44:45.679 --> 00:44:49.400
And so the best way to get in touch

713
00:44:49.440 --> 00:44:51.280
with us is to buy the book, if

714
00:44:51.320 --> 00:44:55.000
you will; we greatly appreciate your support, of course. And

715
00:44:55.079 --> 00:44:59.920
we're both on LinkedIn, and so Vala is on as Vala Afshar

716
00:45:00.159 --> 00:45:03.400
and I'm on as Henry King. And Vala is also

717
00:45:06.280 --> 00:45:09.679
a major thought leader on X as well, and so

718
00:45:09.920 --> 00:45:13.719
he's available there at ValaAfshar on X.

719
00:45:15.599 --> 00:45:15.800
Well.

720
00:45:15.840 --> 00:45:20.519
Fantastic. Thank you for sharing so openly, and I

721
00:45:20.679 --> 00:45:24.440
am positive that our audience is definitely going to head

722
00:45:24.440 --> 00:45:26.719
out to Amazon, Barnes & Noble, you know, wherever

723
00:45:26.760 --> 00:45:30.519
your favorite places are, and get the book, because they're not

724
00:45:30.559 --> 00:45:34.199
only bringing the head, they're also bringing the heart, and

725
00:45:34.760 --> 00:45:39.519
those are the next levels of leadership that we want

726
00:45:39.519 --> 00:45:43.440
to see, not only in our organizations, but wherever you are,

727
00:45:44.039 --> 00:45:47.599
it doesn't matter whether you're a student, a professional, or

728
00:45:47.639 --> 00:45:51.239
a rising star. Wherever you are, remember that human

729
00:45:51.280 --> 00:45:56.800
relationships and human connection are what will help

730
00:45:56.840 --> 00:46:01.360
us tap the potential all of us have.

731
00:46:01.559 --> 00:46:04.760
So thank you Vala, thank you Henry, for joining us.

732
00:46:04.800 --> 00:46:08.519
We appreciate you. Thank you, we're really pleased

733
00:46:08.519 --> 00:46:10.079
to be on and thank you for inviting us.

734
00:46:10.519 --> 00:46:13.800
Thank you so much, truly, and thank you, wonderful audience,

735
00:46:13.840 --> 00:46:16.159
for being part of the show, because without you the

736
00:46:16.199 --> 00:46:18.519
show would not be possible. Reach out to us like

737
00:46:18.559 --> 00:46:21.280
you always do. I so appreciate each and every one

738
00:46:21.320 --> 00:46:25.360
of you, and let us know how we can help

739
00:46:25.440 --> 00:46:28.679
you live the life you want to live and deserve.

740
00:46:29.440 --> 00:46:33.039
And thank you one a tech visit for making the

741
00:46:33.039 --> 00:46:35.880
show possible. So be well and take care until next time.

742
00:46:36.920 --> 00:46:39.360
Thank you for being part of Beyond Confidence. With your

743
00:46:39.360 --> 00:46:41.679
host d v Park, we hope you have learned more

744
00:46:41.719 --> 00:46:44.400
about how to start living the life you want. Each

745
00:46:44.400 --> 00:46:47.320
week on Beyond Confidence, you hear stories of real people

746
00:46:47.440 --> 00:46:51.920
who've experienced growth by overcoming their fears and building meaningful relationships.

747
00:46:52.280 --> 00:46:55.679
During Beyond Confidence, DV Park shares what happened to her when

748
00:46:55.679 --> 00:46:58.119
she stepped out of her comfort zone to work directly

749
00:46:58.159 --> 00:47:01.400
with people across the globe. She not only coaches people on how

750
00:47:01.440 --> 00:47:05.199
to form heart connections, but also how to transform relationships into mutually

751
00:47:05.239 --> 00:47:08.719
beneficial partnerships as they strive to live the life they want.

752
00:47:09.119 --> 00:47:11.000
If you are ready to live the life you want

753
00:47:11.199 --> 00:47:15.559
and leverage your strengths, learn more at www dot dvpark

754
00:47:15.599 --> 00:47:19.280
dot com, and you can connect with her at contact at

755
00:47:19.360 --> 00:47:22.519
dvpark dot com. We look forward to you joining us

756
00:47:22.519 --> 00:47:23.119
next week.