00:00:00,100 --> 00:00:02,033
Yeah, it's not a little different.
00:00:02,033 --> 00:00:05,066
You know, I think even if you took out
90% of the hype,
00:00:05,766 --> 00:00:09,266
you've got a capability that is
00:00:10,266 --> 00:00:12,566
twice as big as electricity.
00:00:12,566 --> 00:00:14,500
If you believe
we can change the narrative.
00:00:14,500 --> 00:00:16,700
If you believe,
we can change our communities.
00:00:16,700 --> 00:00:20,066
If you believe we can change the outcomes,
then we can change the world.
00:00:20,800 --> 00:00:22,366
I'm Rob Richardson.
00:00:22,366 --> 00:00:24,266
Welcome to Disruption Now.
00:00:24,266 --> 00:00:25,466
Welcome to Disruption Now.
00:00:25,466 --> 00:00:27,700
I'm your host and moderator,
Rob Richardson.
00:00:27,700 --> 00:00:29,666
As always, please like subscribe.
00:00:29,666 --> 00:00:30,966
We're here at 1819.
00:00:30,966 --> 00:00:34,033
We're honored to have them as a sponsor
with me, a special guest,
00:00:34,200 --> 00:00:37,466
Pete Blackshaw,
who is the founder of BrandRank.AI.
00:00:37,566 --> 00:00:40,200
Also,
he was the former leader of Cintrifuse,
00:00:40,200 --> 00:00:43,433
which is the central ecosystem
here in Cincinnati.
00:00:43,700 --> 00:00:45,800
And before that,
I think he did stuff with Nestle
00:00:45,800 --> 00:00:48,000
and was just doing things
in digital innovation.
00:00:48,000 --> 00:00:50,366
So he's been innovating
and disrupting for a while.
00:00:50,366 --> 00:00:52,100
Pete, great to have you.
00:00:52,100 --> 00:00:53,433
Yeah, I mean.
00:00:53,433 --> 00:00:57,866
So, Pete, you know, you
this is your second startup, right?
00:00:58,966 --> 00:01:00,866
Like, what made you
00:01:00,866 --> 00:01:03,866
jump into the startup world again
at this point?
00:01:04,733 --> 00:01:07,733
In both instances,
you know, I was in a moment.
00:01:08,100 --> 00:01:08,366
Yeah.
00:01:08,366 --> 00:01:10,333
kind of an inflection point
00:01:10,333 --> 00:01:14,300
in the digital world
where you just knew that a disruption.
00:01:14,366 --> 00:01:14,900
Yeah.
00:01:14,900 --> 00:01:17,733
You know, as you would put
it, was upon us.
00:01:17,733 --> 00:01:19,933
And, you know, I struggled on the,
00:01:19,933 --> 00:01:23,866
I think, trying to convince myself
that it was hype, shiny new object.
00:01:23,900 --> 00:01:26,900
And as I
00:01:27,166 --> 00:01:30,433
use the tool myself,
which has been personally transformative,
00:01:30,700 --> 00:01:35,300
I concluded that this was the time
to take the plunge.
00:01:35,300 --> 00:01:36,333
And it's a little bit scary.
00:01:36,333 --> 00:01:39,333
You know, my age,
two kids going to college, but
00:01:39,866 --> 00:01:42,866
I concluded that
I would never forgive myself if I didn't.
00:01:43,933 --> 00:01:45,766
You know,
00:01:45,766 --> 00:01:47,266
take a shot at it. Right.
00:01:47,266 --> 00:01:51,266
And I'm working on an idea
that I'm particularly passionate about.
00:01:51,266 --> 00:01:54,566
So you can kind of align a historic moment
with the personal passion.
00:01:54,866 --> 00:01:56,066
You know why not.
00:01:56,066 --> 00:01:58,933
And then the other thing
I was just thinking about was
00:01:58,933 --> 00:02:02,800
I just think, you know, AI
expertise is going to be in such demand.
00:02:02,833 --> 00:02:06,566
There's
no there's no downside if you jump into it
00:02:06,566 --> 00:02:09,000
because you're going to learn ten times
faster in startup mode.
00:02:09,000 --> 00:02:11,566
You may come out with no cash.
You may come out with a failure.
00:02:11,566 --> 00:02:12,600
It happens technically.
00:02:12,600 --> 00:02:17,666
But there's going to be a lot
of opportunities, post that experience.
00:02:17,666 --> 00:02:19,600
And so for me, it was an acceptable risk.
00:02:19,600 --> 00:02:22,566
And but I'm really excited
about kind of shaping the space.
00:02:22,566 --> 00:02:26,066
So, hey, you've been around for a long time
like, you know, so
00:02:26,066 --> 00:02:29,900
but it was obviously generative
AI that really kind of sparked you.
00:02:30,700 --> 00:02:33,500
What do you see as this moment?
00:02:33,500 --> 00:02:36,866
Like, I know a lot of people always say
this is going to change everything.
00:02:37,133 --> 00:02:39,133
And at many moments in time.
00:02:39,133 --> 00:02:40,966
And there's been a couple
times where that's been true.
00:02:42,500 --> 00:02:43,833
but this time feels a little different.
00:02:43,833 --> 00:02:47,000
I want to get your perspective
on how you think AI
00:02:47,000 --> 00:02:50,000
is going to shape and change business
in our world in general.
00:02:50,200 --> 00:02:52,133
Yeah, it's not a little different.
00:02:52,133 --> 00:02:55,166
You know, I think even if you took out
90% of the hype,
00:02:55,866 --> 00:02:59,366
you've got a capability that is
00:03:00,366 --> 00:03:02,266
twice as big as electricity.
00:03:02,266 --> 00:03:04,466
And I often say this, it's as much as twice
00:03:04,466 --> 00:03:05,500
as big as electricity.
00:03:05,500 --> 00:03:07,733
Yeah, I really believe that.
00:03:07,733 --> 00:03:11,233
and I've often said that, you know,
I think one of the biggest historic
00:03:11,233 --> 00:03:13,700
moments was the printing press. Yeah.
00:03:13,700 --> 00:03:16,700
You know, all the knowledge was kind
of stored in the hills with the monks,
00:03:16,700 --> 00:03:19,500
you know, and that liberated everything.
00:03:19,500 --> 00:03:21,366
It was almost like social media
on steroids.
00:03:21,366 --> 00:03:22,166
Right.
00:03:22,166 --> 00:03:25,500
And, you know, this is, I mean,
00:03:25,500 --> 00:03:29,566
this is like intelligence
that's highly accessible, right?
00:03:29,700 --> 00:03:32,700
It is iterating at a scary pace.
00:03:32,700 --> 00:03:35,700
so fast that policy can't keep up with it.
00:03:36,033 --> 00:03:38,666
none of us can really kind of
keep up with it.
00:03:38,666 --> 00:03:40,500
And we're not really sure
where it's going to lead.
00:03:40,500 --> 00:03:43,833
And there's always a dark side of
technology that we need to be on top of.
00:03:43,833 --> 00:03:47,733
That's why you and I are taking,
a proactive and important lead
00:03:47,733 --> 00:03:50,133
on responsible AI.
00:03:50,133 --> 00:03:52,733
but yeah, this is absolutely huge.
00:03:52,733 --> 00:03:54,433
I mean, just think about it from a,
00:03:55,433 --> 00:03:57,466
you know,
businesses are going to be transformed.
00:03:57,466 --> 00:04:02,533
there will, in fact be a lot of job
dislocation.
00:04:02,533 --> 00:04:04,700
That doesn't mean net jobs will go down.
00:04:04,700 --> 00:04:07,000
That's a nice way of saying jobs lost.
00:04:07,000 --> 00:04:09,600
Yeah, but I think as leaders
00:04:09,600 --> 00:04:13,933
we have to get really in front of what
the workforce could look like.
00:04:13,933 --> 00:04:14,333
Yeah.
00:04:14,333 --> 00:04:18,533
You know, what does it mean
to copilot with this new capability?
00:04:18,533 --> 00:04:21,433
And what does it mean to say
bye bye to a certain industry?
00:04:21,433 --> 00:04:23,433
But hello to a new one.
00:04:23,433 --> 00:04:25,500
This is where universities play
a really big role.
00:04:25,500 --> 00:04:28,566
Universities can also be part of
the problem if they don't kind of get in.
00:04:28,866 --> 00:04:29,633
If they don't. Evolve.
00:04:29,633 --> 00:04:29,833
Yeah.
00:04:29,833 --> 00:04:35,033
That's why I think this kind of island of the future,
1819, Digital Futures,
00:04:35,033 --> 00:04:38,400
is really, really important
at this critical moment in time.
00:04:38,400 --> 00:04:41,200
But it's big.
It's going to change everything.
00:04:41,200 --> 00:04:41,366
Yeah.
00:04:41,366 --> 00:04:45,866
I'm obviously pro-innovation
across the board.
00:04:45,866 --> 00:04:49,133
You know that
of course I do have concerns because,
00:04:49,133 --> 00:04:52,766
I think this I think this is different
than past technologies.
00:04:52,966 --> 00:04:55,600
So past
technologies have been sometimes industry
00:04:56,666 --> 00:04:57,500
specific.
00:04:57,500 --> 00:05:02,033
We've had very little data
on, you know, general purpose technology.
00:05:02,066 --> 00:05:04,333
The internet was a general
purpose technology.
00:05:04,333 --> 00:05:06,866
so that's one example. There's not many.
00:05:06,866 --> 00:05:11,933
I think AI is a general purpose technology
where it's not getting rid of
00:05:12,466 --> 00:05:15,000
or modernizing one industry.
00:05:15,000 --> 00:05:17,433
It's not the Luddites
where it's just like across the board.
00:05:17,433 --> 00:05:18,900
It's across the board. Right?
00:05:18,900 --> 00:05:22,300
It's like we've gotten to the point
where, I think Mark Cuban said this,
00:05:22,300 --> 00:05:24,833
We're automating automation,
00:05:24,833 --> 00:05:27,266
and we don't know what that looks like
in terms of the job,
00:05:27,266 --> 00:05:30,266
in terms of the workforce,
in terms of what it looks like.
00:05:30,333 --> 00:05:33,133
It's a to your point, it's ever changing.
00:05:33,133 --> 00:05:37,566
And systems and, businesses
and everyone else have to adapt quickly.
00:05:37,833 --> 00:05:42,066
So I'd like to hear from you:
what is your biggest concern
00:05:42,533 --> 00:05:47,333
that businesses, that people, aren't
realizing about AI right now?
00:05:47,400 --> 00:05:49,400
And then we can talk
about the biggest opportunity. Yeah.
00:05:49,400 --> 00:05:52,466
Well, the biggest concern is that
people are going to kind of miss the boat,
00:05:52,466 --> 00:05:56,100
and when you're not kind of in
front of it,
00:05:56,366 --> 00:06:00,066
you know, you get a little bit
brash and reckless and maybe
00:06:00,066 --> 00:06:03,866
you push the wrong policies
out of fear and paranoia.
00:06:03,866 --> 00:06:06,866
And that's understandable
because there's a lot of unknown. Yep.
00:06:08,433 --> 00:06:10,766
I've been using it since the very day it came out.
00:06:10,766 --> 00:06:15,900
I think this can cut in a couple different
ways on the access and equity side.
00:06:15,900 --> 00:06:19,866
There's a part of me that believes that
knows that this is a revolution in access.
00:06:20,233 --> 00:06:23,233
And so therefore,
if we get ahead of the curve, we could
00:06:23,266 --> 00:06:27,033
redefine the rules of access
for entrepreneurs.
00:06:27,333 --> 00:06:31,066
I've co-led workshops
with minority small businesses where
00:06:31,566 --> 00:06:35,866
these tools have allowed these businesses,
which have not had access
00:06:35,866 --> 00:06:40,033
to a lot of resources, tools and the like,
to kind of get a leg up.
00:06:40,366 --> 00:06:43,366
The job issue is almost moot
at the small business level.
00:06:43,366 --> 00:06:46,766
AI gets complicated at the big level.
00:06:46,766 --> 00:06:47,100
It gets
00:06:47,100 --> 00:06:51,466
complicated at the white collar level,
it gets complicated at the service level.
00:06:51,466 --> 00:06:53,666
And that's one area
that I think is disproportionately
00:06:53,666 --> 00:06:57,933
represented by minorities, and that's where
we've just got to get in front of it.
00:06:57,966 --> 00:06:59,866
Yeah. You know,
and this is where I think leadership is.
00:06:59,866 --> 00:07:01,866
I agree with that. Important, I think.
00:07:01,866 --> 00:07:05,600
And I would say
I would give credit to the state of Ohio,
00:07:05,600 --> 00:07:07,233
kind of both sides of the aisle, I agree.
00:07:07,233 --> 00:07:10,566
for, you know, thinking ahead. Again,
I think that's why they're creating
00:07:10,566 --> 00:07:12,700
Absolutely. These innovation hubs
and the like.
00:07:12,700 --> 00:07:17,366
But we have to shape the dialog
and we need to yeah, we need to.
00:07:17,366 --> 00:07:19,533
You want a little bit of paranoia, right?
00:07:19,533 --> 00:07:22,366
I think that keeps people on guard.
00:07:22,366 --> 00:07:25,066
But we also have to help people
see possibilities.
00:07:25,066 --> 00:07:26,233
I mean, what makes me
00:07:27,200 --> 00:07:27,700
different?
00:07:27,700 --> 00:07:28,400
Well, not different.
00:07:28,400 --> 00:07:30,766
I think you're I,
I probably put you in the same camp.
00:07:30,766 --> 00:07:32,633
Yeah.
You know, I'm an urgent optimist. Yeah.
00:07:32,633 --> 00:07:33,433
So I too.
00:07:33,433 --> 00:07:35,966
I see possibilities.
I'm not a total cynic.
00:07:35,966 --> 00:07:39,100
I'm not on the dark side of Armageddon,
you know? But.
00:07:40,400 --> 00:07:43,766
But I recognize that if you want
the optimism, you have to get it.
00:07:44,033 --> 00:07:45,666
But the urgency, you cannot ignore.
00:07:45,666 --> 00:07:48,000
And then this is what we're kind of doing
right now. Yeah.
00:07:48,000 --> 00:07:50,966
What happens if we get it wrong?
00:07:50,966 --> 00:07:52,900
Listen, I think it's going to be.
00:07:52,900 --> 00:07:55,700
It'll probably change
the political environment.
00:07:55,700 --> 00:07:59,333
You know,
you've just got a lot of polar extremes
00:07:59,433 --> 00:08:02,566
in the marketplace and the political
environment on both sides.
00:08:02,566 --> 00:08:07,600
And I think where there's fear, there's
division, and where there's the unknown, there's fear.
00:08:07,633 --> 00:08:10,433
and so, yeah, this could get really ugly.
00:08:10,433 --> 00:08:15,466
And, I think, you know, we've dodged
00:08:15,466 --> 00:08:19,366
a few bullets with a couple of the strikes
that have taken place.
00:08:19,400 --> 00:08:23,466
You know, I think the Hollywood thing
landed, you know, it landed.
00:08:23,466 --> 00:08:24,066
You know? Yeah.
00:08:24,066 --> 00:08:28,800
You know, and I think they kind of found
the hybrid model, but,
00:08:29,133 --> 00:08:32,800
yeah, there could be
some industries that really create
00:08:33,233 --> 00:08:38,266
a lot of political instability, and, absolutely,
we've just got to get in front of it.
00:08:38,433 --> 00:08:42,233
And I think there are potentially
some, you know,
00:08:42,233 --> 00:08:44,100
some racial issues. We're going
00:08:44,100 --> 00:08:47,866
to have to be really sensitive,
like who is helped and who is not.
00:08:47,866 --> 00:08:48,600
And this is where we need
00:08:48,600 --> 00:08:52,333
to be really attentive to,
you know, the different sectors.
00:08:52,366 --> 00:08:57,200
I thought McKinsey did a really good job
about four months ago, looking at
00:08:57,233 --> 00:09:00,666
which groups will be impacted or not,
but they also had
00:09:01,333 --> 00:09:02,900
kind of the same thing we said earlier.
00:09:02,900 --> 00:09:05,833
Like if we're proactive, when we lead,
we can get in front of it.
00:09:05,833 --> 00:09:08,233
Yeah. So I agree with that completely.
00:09:08,233 --> 00:09:12,166
And we already have data on the issue,
like, we know
00:09:13,066 --> 00:09:15,400
what algorithms do without
00:09:15,400 --> 00:09:18,566
any type of thought process guardrails.
00:09:18,866 --> 00:09:20,666
That's what social media became.
00:09:20,666 --> 00:09:22,833
And that's how we got here.
00:09:23,800 --> 00:09:24,533
These are
00:09:24,533 --> 00:09:27,600
part of our human nature, for one.
We have these things as part of our human nature.
00:09:27,600 --> 00:09:29,866
To be prejudiced against others
is part of our human nature.
00:09:29,866 --> 00:09:33,666
To de-emphasize these things is sometimes hard to do.
00:09:33,666 --> 00:09:36,666
These things, it depends upon
00:09:37,600 --> 00:09:39,466
what we want to emphasize
and unfortunately.
00:09:39,466 --> 00:09:40,166
Right.
00:09:40,166 --> 00:09:44,200
You know, a lot of the algorithms
focus on outrage and conspiracy.
00:09:44,500 --> 00:09:45,866
So those things tend to divide.
00:09:45,866 --> 00:09:47,366
And they do that
not because they're trying
00:09:47,366 --> 00:09:50,800
to focus on those things, but
because their goal is singular.
00:09:51,366 --> 00:09:52,833
Algorithms are rules.
00:09:52,833 --> 00:09:56,266
And all rules,
are written in different ways.
00:09:56,266 --> 00:09:58,966
Correct.
All rules manifest themselves in a dial
00:09:58,966 --> 00:10:02,066
That may be correct a little bit
to the right or a little bit to the left.
00:10:02,066 --> 00:10:04,500
In fact, this is kind of the heart of what
we're trying to measure.
00:10:04,500 --> 00:10:06,433
As part of my new startup.
00:10:06,433 --> 00:10:06,900
And we are
00:10:06,900 --> 00:10:11,666
we just need to be hypersensitive
to where we think the dial needs to.
00:10:11,733 --> 00:10:15,800
Yeah, to sit in a way
that it, positively impacts,
00:10:15,833 --> 00:10:18,433
the economy, the consumer.
You just have to be aware,
00:10:18,433 --> 00:10:20,866
you have to be aware
that there could be issues.
00:10:20,866 --> 00:10:21,700
Oh no question.
00:10:21,700 --> 00:10:22,033
Right.
00:10:22,033 --> 00:10:25,333
Like, and that's what it gets to, because
algorithms are rules.
00:10:25,333 --> 00:10:29,366
But, you know, AI is also about
the data you're training on,
00:10:29,800 --> 00:10:33,066
and it's going to reflect
what you input to it.
00:10:33,066 --> 00:10:34,800
This is why it's so important,
00:10:34,800 --> 00:10:36,566
which is why it's so important
to be inclusive.
00:10:36,566 --> 00:10:41,433
So tell us, okay, you entered into
AI wanting to solve a specific problem.
00:10:41,433 --> 00:10:43,566
What? What got you up at night?
00:10:43,566 --> 00:10:47,366
What problem are you trying to solve
with AI as a tool?
00:10:47,366 --> 00:10:50,366
Because I tell people it's not the
technology, it's the problem you're solving.
00:10:50,666 --> 00:10:52,533
So what problem are you solving?
00:10:52,533 --> 00:10:54,266
Well, let me take a step back. So I
00:10:55,666 --> 00:10:56,133
came to
00:10:56,133 --> 00:10:59,966
Cincinnati after eight years
in Switzerland, leading
00:11:00,500 --> 00:11:05,666
digital work for Nestlé, and had a chance
to, go all over the world and really kind
00:11:05,666 --> 00:11:09,066
of see the digital revolution
take place in different ways.
00:11:09,066 --> 00:11:12,066
And so I came back to Cincinnati to lead
00:11:12,300 --> 00:11:15,366
this startup accelerator
00:11:16,033 --> 00:11:20,066
and fund of funds, to really kind of grow
the local economy.
00:11:21,200 --> 00:11:25,566
And in the process
of trying to figure out, like where
00:11:26,433 --> 00:11:32,133
we could lead. I'm a big believer that innovation
economies where there's a concentration
00:11:32,133 --> 00:11:36,166
of talent, technology and capital,
yeah, have kind of an outsize advantage.
00:11:36,166 --> 00:11:40,000
And so the areas that have been
not just keeping me up at night,
00:11:40,000 --> 00:11:41,766
but actually making me want to go to work.
00:11:41,766 --> 00:11:42,066
Yeah.
00:11:42,066 --> 00:11:44,700
are areas like sustainability. Sure.
00:11:44,700 --> 00:11:47,166
Or resilient supply chains.
00:11:47,166 --> 00:11:49,833
That was a big theme
that came out of Covid.
00:11:49,833 --> 00:11:54,433
a lot of things on the health front,
But then,
00:11:54,766 --> 00:11:58,866
as the AI stuff started to develop,
especially in the last 18 months,
00:11:59,400 --> 00:12:02,766
thinking a lot
about the power of responsible AI,
00:12:02,766 --> 00:12:07,566
could a region that has really good
consumer credentials.
00:12:07,600 --> 00:12:10,766
Again, you got the world's largest
advertiser here, the world's third largest
00:12:10,766 --> 00:12:13,800
retailer here, and the whole ecosystem
that supports that.
00:12:14,366 --> 00:12:17,100
So could we leverage
00:12:17,100 --> 00:12:20,266
those consumer chops, those credentials, to
00:12:21,433 --> 00:12:23,700
get in
front of these very complicated issues?
00:12:23,700 --> 00:12:27,633
And moreover,
could we cultivate a lot of entrepreneurs
00:12:27,633 --> 00:12:30,633
that would want to be the problem solvers
or the heroes in that area?
00:12:31,433 --> 00:12:35,366
And I found myself as a pretty loud
cheerleader.
00:12:35,366 --> 00:12:36,333
Let's do it.
00:12:36,333 --> 00:12:39,533
And then and then I kind of got to
that moment of truth.
00:12:39,533 --> 00:12:42,533
And I just said, I'm just going to do it
myself,
00:12:42,566 --> 00:12:45,566
you know, give up this really big
salary and,
00:12:45,666 --> 00:12:49,633
be part of the solution. And there's a whole
other element of wealth creation
00:12:49,633 --> 00:12:51,100
If you're successful.
00:12:51,100 --> 00:12:54,866
And what I've been thinking
about, you know,
00:12:54,866 --> 00:12:58,000
the areas that I get excited about
is really the collision of
00:12:59,133 --> 00:13:01,200
AI and sustainability.
00:13:01,200 --> 00:13:06,133
So I actually do lose sleep about the world
00:13:06,133 --> 00:13:09,133
not kind of keeping up with its,
its carbon commitments.
00:13:09,333 --> 00:13:13,866
And I think AI is a huge, opportunity
to kind of move that ahead.
00:13:13,866 --> 00:13:17,066
And some of my inspiration has come
from local companies like 80 Acres,
00:13:17,066 --> 00:13:20,700
where they use 99% less water
because of the way they use
00:13:20,700 --> 00:13:24,833
artificial intelligence, you know,
and so there's some inspiration.
00:13:24,833 --> 00:13:29,466
So I think as we get smarter with
the use of AI, AI plus sensors,
00:13:29,466 --> 00:13:32,633
I mean, we could dramatically change
the green landscape.
00:13:32,633 --> 00:13:35,900
And I would like to see it cultivated,
you know, here in our backyard.
00:13:36,700 --> 00:13:36,933
Yeah.
00:13:36,933 --> 00:13:39,933
So it sounds like you were working
to be an entrepreneur,
00:13:40,166 --> 00:13:42,933
but, like,
you decided the best way to do that.
00:13:42,933 --> 00:13:45,733
you had to go... Actually,
I've always kind of worked.
00:13:45,733 --> 00:13:46,666
And I was thinking about this
the other day.
00:13:46,666 --> 00:13:49,666
I've always worked in like,
five-year cycles. So,
00:13:49,700 --> 00:13:53,133
everybody thinks of me as a P&G veteran, but
I was only there for like 5 or 6 years.
00:13:53,133 --> 00:13:56,766
And then, you know,
after we got Interactive Marketer of the Year,
00:13:56,766 --> 00:14:00,066
I left to do a startup, and then
the startup was a bit of a slog.
00:14:00,066 --> 00:14:00,666
It took us a while
00:14:00,666 --> 00:14:03,866
before we got sold to Nielsen,
but then I went to Nestlé, and then I went to Cintrifuse.
00:14:03,866 --> 00:14:05,333
Startups are usually a bit of a thing.
00:14:05,333 --> 00:14:08,666
It's just kind of a hybrid
between big company and startup.
00:14:08,666 --> 00:14:11,166
There are elements of it that
were kind of pretty big and bureaucratic,
00:14:11,166 --> 00:14:13,366
and other parts that were kind of startup
nimble. Yeah.
00:14:13,366 --> 00:14:14,133
And now I feel.
00:14:14,133 --> 00:14:16,766
Very... like, this probably sounds bad,
but it feels very corporate sometimes.
00:14:16,766 --> 00:14:17,933
But go ahead. That's my opinion.
00:14:17,933 --> 00:14:19,966
Yeah. I mean it sparks a big debate.
00:14:19,966 --> 00:14:22,000
I mean, listen,
we have a store front for services,
00:14:22,000 --> 00:14:25,300
but we also work with corporates
and we're trying to figure out how to,
00:14:26,500 --> 00:14:26,733
you know,
00:14:26,733 --> 00:14:30,600
how to unlock that startup
corporate connection.
00:14:30,600 --> 00:14:33,266
And it's a lot of hard challenges.
00:14:33,266 --> 00:14:34,366
Yeah I know and it's hard.
00:14:34,366 --> 00:14:35,966
Because there's different cultures.
00:14:35,966 --> 00:14:36,866
Yeah. Exactly.
00:14:36,866 --> 00:14:38,933
Yeah. Now what's different today.
00:14:38,933 --> 00:14:40,900
It's like again
you have to exploit the moment.
00:14:40,900 --> 00:14:44,133
There's always moments
where the organization needs to be...
00:14:44,133 --> 00:14:45,266
It needs to be stimulated.
00:14:45,266 --> 00:14:48,433
And I think AI is a moment
where startups might have a chance
00:14:48,433 --> 00:14:49,633
to kind of get in the door,
00:14:49,633 --> 00:14:53,766
because there's a lot the big companies
don't really fully understand.
00:14:54,500 --> 00:14:56,900
And I'm hoping
that I'll be one of those folks that does
00:14:56,900 --> 00:14:58,900
get in the door,
because I'm bringing an outside
00:14:58,900 --> 00:15:02,866
in perspective on a very,
very fast evolving, world.
00:15:02,866 --> 00:15:05,700
It's just very, very hard to move fast
in an organization.
00:15:05,700 --> 00:15:08,900
And organizations are really good
at like bottling up their learning
00:15:08,900 --> 00:15:11,900
but not as good at disseminating
and executing against the learning.
00:15:11,933 --> 00:15:13,733
But now, in the world of generative AI,
00:15:13,733 --> 00:15:16,733
you got a whole different playbook
that's being developed. Yep.
00:15:16,800 --> 00:15:19,400
So, BrandRank.AI, yeah.
Let's get to this. Like, what?
Trade this. Like what?
00:15:20,633 --> 00:15:24,500
Again, I can go back to that question
like what is brand rank seeking to do?
00:15:25,333 --> 00:15:27,700
So I'll tell you, all great ideas
00:15:27,700 --> 00:15:29,733
Start with a big insight. Yes.
00:15:29,733 --> 00:15:31,466
And it's like that.
00:15:31,466 --> 00:15:34,066
aha-like moment.
00:15:34,066 --> 00:15:37,066
And for me, I was like
00:15:37,066 --> 00:15:40,266
you and everybody
just playing around with generative AI.
00:15:41,366 --> 00:15:42,266
And I started to do
00:15:42,266 --> 00:15:45,266
searches on sustainability.
00:15:45,366 --> 00:15:49,466
And I would ask questions
like, are Pampers diapers sustainable,
00:15:49,933 --> 00:15:53,766
or is Unilever's Dove brand sustainable?
00:15:54,500 --> 00:15:57,500
And I was shocked how
00:15:58,166 --> 00:16:01,466
granular and precise the responses were.
00:16:02,533 --> 00:16:06,433
And in some cases, some of that data
would come from the brand websites.
00:16:06,433 --> 00:16:08,600
But, you know, generative
AI works in a weird way.
00:16:08,600 --> 00:16:12,233
Imagine like a blender that just kind of
takes best available knowledge,
00:16:12,233 --> 00:16:15,400
puts it in a blender, and then pours it
in a mold that you then access.
00:16:15,400 --> 00:16:16,566
Absolutely. Yeah. Okay. Yeah.
00:16:16,566 --> 00:16:19,100
And that's a really good metaphor.
The what doesn't really change.
00:16:19,100 --> 00:16:19,933
Now you can
change the how. You can say,
you can change the how you can say,
give it to me as a Jay-Z rap
00:16:22,900 --> 00:16:25,400
or give it to me as haiku
or to me as a sonnet.
00:16:25,400 --> 00:16:26,966
But the what?
00:16:26,966 --> 00:16:29,100
It's like one shot. What? Yeah.
00:16:29,100 --> 00:16:32,666
And I thought to myself,
and it was funny because I typed in
00:16:33,300 --> 00:16:34,266
are Pampers sustainable?
00:16:34,266 --> 00:16:38,766
And it was interesting because they said
they're trying but they're not.
00:16:38,766 --> 00:16:40,966
And here's the reasons
and here are some alternatives.
00:16:40,966 --> 00:16:44,500
And I thought, wow, that's,
like, implicating for brands.
00:16:44,600 --> 00:16:46,266
Yeah. That's like a whole different level.
00:16:46,266 --> 00:16:48,766
So the light bulb went off.
00:16:48,766 --> 00:16:51,833
I thought, you know, maybe
there's a business model in creating kind
00:16:51,833 --> 00:16:55,866
of a Nielsen ratings of AI search
results, helping
00:16:55,866 --> 00:16:59,866
brand builders understand
where they stand in this world.
00:17:00,233 --> 00:17:05,133
And then, could you use that... I'm really
passionate about digital marketing, right?
00:17:05,133 --> 00:17:08,133
I kind of missed the role of being chief
digital officer.
00:17:08,166 --> 00:17:09,300
You know. You do that well.
00:17:09,300 --> 00:17:09,866
Yeah. Yeah.
00:17:09,866 --> 00:17:12,266
But I think that's
at the forefront of change.
00:17:12,266 --> 00:17:13,633
And what I've noticed
00:17:13,633 --> 00:17:17,566
is that a lot of brands
have gotten really lazy, well, not lazy.
00:17:17,566 --> 00:17:20,966
It's just I don't think
they are even aware that they're not
00:17:21,466 --> 00:17:25,233
serving consumers at the level
that they should in this environment.
00:17:25,233 --> 00:17:27,966
So for example, like what?
What is stagnant?
00:17:27,966 --> 00:17:29,066
What has generative
00:17:29,066 --> 00:17:32,933
AI really done?
It's raised the bar for giving you answers
00:17:33,600 --> 00:17:35,600
with minimal friction, correct.
00:17:35,600 --> 00:17:40,066
And with shocking levels of detail
and a willingness to continue the dialog.
00:17:40,066 --> 00:17:41,900
Yeah, right.
I mean, you can just keep going.
00:17:41,900 --> 00:17:42,466
You can.
00:17:42,466 --> 00:17:45,633
And it feels like you're talking
to a really smart best friend.
00:17:46,300 --> 00:17:48,633
And brands are really bad at that. Yeah.
00:17:48,633 --> 00:17:53,500
I mean, you can go to most brands from
you know, whether it's Unilever.
00:17:53,900 --> 00:17:56,266
I mean, there's a bunch of them.
I won't give names.
00:17:56,266 --> 00:17:59,633
I don't want to piss anybody off locally,
but yeah, if you go into their
00:18:00,133 --> 00:18:02,666
their FAQs or their brand sites, you know,
you're
00:18:02,666 --> 00:18:04,766
not going to get a lot of information
very, very little.
00:18:04,766 --> 00:18:08,433
So you're more likely to say, I'm going
to learn about Pampers on generative AI.
00:18:08,433 --> 00:18:09,266
The problem with that
00:18:09,266 --> 00:18:13,666
is that that narrative often is counter
to the brand's desired narrative.
00:18:13,700 --> 00:18:18,233
So what we're doing is we're kind of
metering it and we've run thousands
00:18:18,233 --> 00:18:24,233
of searches, to really help brands
understand, you know, how they stack up.
00:18:24,233 --> 00:18:26,833
And then we're benchmarking
it versus competition,
00:18:26,833 --> 00:18:29,300
and then we're putting a lot of consulting
on top of it.
00:18:29,300 --> 00:18:32,433
Like it's not enough to just say,
we metered your brand.
00:18:32,433 --> 00:18:34,166
It's like, here's what you need to do.
00:18:34,166 --> 00:18:36,500
And this gets tricky. Like, to really win
00:18:36,500 --> 00:18:39,500
In a world of generative AI,
you have to market to algorithms.
00:18:39,733 --> 00:18:41,733
You have to understand how.
00:18:41,733 --> 00:18:43,166
How do you market to the algorithm?
00:18:43,166 --> 00:18:44,400
Well this is important.
00:18:44,400 --> 00:18:45,300
So we just put out
00:18:45,300 --> 00:18:49,200
a white paper called Brand Vulnerability
in the age of AI search.
00:18:49,200 --> 00:18:52,066
And there's something that's called
algorithmic anchors okay.
00:18:52,066 --> 00:18:55,033
And so it has some similarities
to search 1.0.
00:18:55,033 --> 00:18:58,566
But in generative AI
there are clear anchors
00:18:58,566 --> 00:19:01,966
that disproportionately influence
what is being said about you.
00:19:01,966 --> 00:19:04,100
Okay.
So if I'm looking at Rob Richardson, okay.
00:19:04,100 --> 00:19:04,933
It's probably going
00:19:04,933 --> 00:19:08,200
to be... one of your algorithmic anchors
is probably going to be UC content.
00:19:08,766 --> 00:19:09,400
Okay. Yeah.
00:19:09,400 --> 00:19:11,500
The legacy of leadership here.
00:19:11,500 --> 00:19:13,200
It's shaped your brand.
00:19:13,200 --> 00:19:16,466
And I would say UC,
on a scale of
00:19:16,466 --> 00:19:19,633
1 to 10, I'd probably give them a 7 or 8
in terms of marketing to algorithms.
00:19:19,633 --> 00:19:23,633
So your content has a good chance
of getting to those algorithms.
00:19:23,633 --> 00:19:24,366
Okay.
00:19:24,366 --> 00:19:27,166
Brand websites are very,
very if they're done
00:19:27,166 --> 00:19:30,833
right, right are very important
algorithmic anchors.
00:19:30,833 --> 00:19:33,500
But it's not just the websites, it's
how you build it.
00:19:33,500 --> 00:19:35,833
So for example,
I was meeting with someone from,
00:19:36,866 --> 00:19:38,000
Kroger the other day.
00:19:38,000 --> 00:19:38,466
Okay.
00:19:38,466 --> 00:19:41,366
And Kroger's
actually doing some pretty good work on
00:19:41,366 --> 00:19:44,400
zero waste, zero hunger.
00:19:44,400 --> 00:19:47,633
I really I, I've read all their reports,
but I noticed
00:19:47,633 --> 00:19:50,733
they weren't getting enough credit
in like the AI algorithms.
00:19:50,733 --> 00:19:54,500
And it suddenly dawned on me that they put
most of their best data in PDFs.
00:19:54,933 --> 00:19:55,766
Oh, well, that'll do it.
00:19:55,766 --> 00:19:56,400
Okay.
00:19:56,400 --> 00:19:56,966
And so, you know,
00:19:56,966 --> 00:20:00,500
it's not a, you know, like a fake engine
can get at it; it's like in a PDF.
00:20:00,500 --> 00:20:01,466
And I'm thinking, oh my gosh.
00:20:01,466 --> 00:20:04,766
They're only getting
a fraction of the ROI on all that work.
00:20:04,766 --> 00:20:05,966
Think of all that work.
00:20:05,966 --> 00:20:07,500
Think of all of Rodney's speeches.
00:20:07,500 --> 00:20:09,700
Think about all those hundreds
of employees
00:20:09,700 --> 00:20:13,166
that have kind of toiled to kind of
get those statistics on the table.
00:20:13,166 --> 00:20:16,966
And then when you got to the last mile
of helping the consumer to understand it,
00:20:17,066 --> 00:20:20,333
you put it in a file
that the algorithms don't understand.
00:20:20,333 --> 00:20:21,433
And so I think,
00:20:21,433 --> 00:20:23,066
you know,
we're almost trying to figure out
00:20:23,066 --> 00:20:25,533
scorecards
that make it easy for brands to see,
00:20:25,533 --> 00:20:28,033
this is what I need to do
to get full credit. Okay.
00:20:28,033 --> 00:20:31,866
Now, in some cases, you won't get full
credit because you're not sustainable.
00:20:32,200 --> 00:20:34,100
You might be dishonest, you know?
00:20:34,100 --> 00:20:37,566
And again, the algorithms
will go to town on you in that respect.
00:20:37,566 --> 00:20:38,700
But I think generally.
00:20:38,700 --> 00:20:40,400
speaking... How are they going to know
if you're dishonest?
00:20:40,400 --> 00:20:41,266
Like, how would they.
00:20:41,266 --> 00:20:43,500
Oh, well, because they're looking
at a lot of sources.
00:20:43,500 --> 00:20:44,066
I mean, I'd say
00:20:44,066 --> 00:20:47,900
the brand websites are clearly
the biggest cheerleaders for themselves.
00:20:47,966 --> 00:20:49,133
Well, they're biased, you could say. Okay.
00:20:49,133 --> 00:20:50,433
Right. Yeah. I mean.
00:20:50,433 --> 00:20:52,166
It's an advertising vehicle. Exactly.
00:20:52,166 --> 00:20:57,266
But I mean, why is the New York Times
suing OpenAI for 20, $40 billion?
00:20:57,600 --> 00:20:57,866
Yeah.
00:20:57,866 --> 00:21:01,333
It's taking...
they know that their source of truth
00:21:01,500 --> 00:21:03,966
is being thrown into the blender.
00:21:03,966 --> 00:21:07,200
It's very hard for people
to see the attribution, but they know
00:21:07,800 --> 00:21:12,033
it is weaving in old New York Times
articles that are pretty harsh about brands.
00:21:12,033 --> 00:21:13,900
I mean, they just kind of speak
the truth, right? Right.
00:21:13,900 --> 00:21:16,500
And there's other sites.
It could be Wikipedia.
00:21:16,500 --> 00:21:20,066
Google just spent,
what, $40 million to license Reddit.
00:21:20,366 --> 00:21:22,366
Yeah. So that stuff's getting weaved in.
00:21:22,366 --> 00:21:25,366
Ratings and reviews are creeping into it
as well. So.
00:21:25,466 --> 00:21:29,800
So one thing that we're, working
on, and I can't get too specific,
00:21:29,800 --> 00:21:32,766
but I will say that this institution
is helping our thinking.
00:21:32,766 --> 00:21:33,666
Right. A lot of it
00:21:34,633 --> 00:21:36,066
is really
00:21:36,066 --> 00:21:39,866
trying to develop some science
around those sources of attribution,
00:21:40,100 --> 00:21:43,300
like what are those anchors
and how do you get really precise?
00:21:43,300 --> 00:21:47,033
And I would love for, Greater
00:21:47,033 --> 00:21:51,266
Cincinnati to be a true hub for search
analytics, AI, search analytics.
00:21:51,266 --> 00:21:54,633
And it's going to take some work,
but I think we can figure it out.
00:21:54,633 --> 00:21:57,133
And I think the universities are
a big are a big part of that.
00:21:57,133 --> 00:21:59,666
But I think every brand needs
to understand
00:21:59,666 --> 00:22:01,566
what is disproportionately impacting
00:22:01,566 --> 00:22:03,100
their search results.
00:22:03,100 --> 00:22:07,033
What would you say is your top advice,
just the top three things
00:22:07,033 --> 00:22:12,566
brands need to do in the age of algorithms
to protect their brand identity?
00:22:12,566 --> 00:22:14,233
Some of it is old fashioned guidance.
00:22:14,233 --> 00:22:17,666
I mean, you've got to listen,
and these are different signals.
00:22:18,266 --> 00:22:21,333
So in the same way that you would listen
to a consumer in a focus group
00:22:21,333 --> 00:22:24,666
or listen to website data
or listen to social media conversation,
00:22:24,666 --> 00:22:28,233
you got to listen
to these new search dynamics.
00:22:28,266 --> 00:22:28,566
Yeah.
00:22:28,566 --> 00:22:31,633
And that's where we're kind of providing a tool
that helps them listen.
00:22:32,200 --> 00:22:34,500
And then they need to act on it.
00:22:34,500 --> 00:22:36,033
I mean they need to disrupt.
00:22:36,033 --> 00:22:36,266
Yeah.
00:22:36,266 --> 00:22:39,566
And I mean, we're trying to provide a
data set that makes it easier to disrupt.
00:22:39,900 --> 00:22:42,866
In most cases,
data leads you to incrementalism.
00:22:42,866 --> 00:22:43,266
Exactly.
00:22:43,266 --> 00:22:45,566
We're kind of saying
you've got a long way to go.
00:22:45,566 --> 00:22:47,033
So brands need to move faster.
00:22:47,033 --> 00:22:48,033
Your website sucks.
00:22:48,033 --> 00:22:50,433
It doesn't. You know, the websites
don't pay attention.
00:22:50,433 --> 00:22:54,900
The algorithms don't pay attention to it
or you know, and yeah, so we're trying to
00:22:55,266 --> 00:22:58,566
and then and then you got to follow up
and you got to be really iterative.
00:22:58,566 --> 00:23:01,966
Like, I would say, you know,
if you look at
00:23:02,066 --> 00:23:07,066
our online site, we say today brands
need to think and act like algorithms.
00:23:07,533 --> 00:23:11,233
It doesn't mean we're all going
to become bots, but we have to understand
00:23:11,666 --> 00:23:14,933
how these work. Our whole future
is going to get dictated by this stuff.
00:23:15,166 --> 00:23:17,333
But we can get in front of it,
we can partner with it.
00:23:17,333 --> 00:23:18,900
But we got to understand what it is.
00:23:18,900 --> 00:23:20,866
And I think that's critical for everyone.
00:23:22,066 --> 00:23:22,666
Tough question here.
00:23:22,666 --> 00:23:25,333
What happens when the algorithms know us
better than we know ourselves?
00:23:25,333 --> 00:23:28,666
We're going to...
Well, they will, they will. I mean, you know,
00:23:28,800 --> 00:23:32,666
next year you're probably going to be
interviewing yourself as a joke, right?
00:23:32,900 --> 00:23:34,766
You know two faces of Rob.
00:23:34,766 --> 00:23:36,500
That'll be interesting. Now that's interesting.
00:23:36,500 --> 00:23:36,700
Yeah.
00:23:36,700 --> 00:23:39,700
That might be, so
00:23:39,866 --> 00:23:40,466
I don't know.
00:23:40,466 --> 00:23:41,966
It's going to get really tricky.
00:23:41,966 --> 00:23:46,433
And we're going to have to really reboot
00:23:48,266 --> 00:23:49,033
not only brands.
00:23:49,033 --> 00:23:51,266
We got to reboot training. Yep.
00:23:51,266 --> 00:23:52,266
We're going to have to.
00:23:52,266 --> 00:23:55,366
I personally think we're gearing up to shake up
the universities to think differently.
00:23:55,366 --> 00:23:55,600
Okay.
00:23:55,600 --> 00:23:58,633
Through all of it, I mean, think about it
like with generative AI,
00:23:58,666 --> 00:24:00,566
I got Socrates in my pocket. Yeah.
00:24:00,566 --> 00:24:02,900
I mean, it's like an incredible learning
experience.
00:24:02,900 --> 00:24:04,033
It teaches math.
00:24:04,033 --> 00:24:05,800
It never forgets anything. Yeah.
00:24:05,800 --> 00:24:09,800
It's like and so the potential
for personalized education is huge,
00:24:09,800 --> 00:24:12,233
but there's gonna have to be give
and take at the university level, right?
00:24:12,233 --> 00:24:13,766
Yeah, absolutely.
I mean. It's going to be. Tricky.
00:24:13,766 --> 00:24:15,566
It is, like, how will Neville
Pinto lead?
00:24:15,566 --> 00:24:17,700
I mean, you know,
I even wrote a letter to him once.
00:24:17,700 --> 00:24:22,000
I said, dude, like, this will
shape your legacy more than anything.
00:24:22,000 --> 00:24:22,466
Full stop.
00:24:22,466 --> 00:24:26,400
I'd say the same thing to Jon Moeller
at P&G or Rodney at Kroger.
00:24:26,400 --> 00:24:29,533
This is, like, how you manage
this is critical. I agree.
00:24:29,566 --> 00:24:30,066
You know.
00:24:30,066 --> 00:24:32,066
And if you know that there's going to be,
00:24:32,066 --> 00:24:34,600
you know, job shakeups,
how do you get in front of it?
00:24:34,600 --> 00:24:36,166
How do you start
thinking about the new jobs?
00:24:36,166 --> 00:24:38,033
How do you manage expectations?
00:24:38,033 --> 00:24:39,533
How do you turn it into a positive?
00:24:39,533 --> 00:24:42,533
And it sounds romantic,
but it's going to be tricky.
00:24:42,633 --> 00:24:43,733
It's going to be really, really tricky.
00:24:43,733 --> 00:24:46,800
It's, you know, before we go on
to our other points, there's
00:24:47,200 --> 00:24:51,300
one line I read from you about customer
service, about customer satisfaction.
00:24:51,300 --> 00:24:54,100
It's very important,
I think, even more with algorithms,
00:24:54,100 --> 00:24:57,566
because I think you said this is that
if you have one successful customer,
00:24:57,566 --> 00:25:01,200
they tell three people; if you have one
bad customer, they tell 3,000.
00:25:01,666 --> 00:25:04,233
And now, algorithms...
That was my book, right?
00:25:04,233 --> 00:25:05,133
Right. Yes. That's your book.
00:25:05,133 --> 00:25:08,066
Yeah, I see. And algorithms,
now you can add to that.
00:25:08,066 --> 00:25:10,233
You have one good customer,
maybe you'll get six.
00:25:10,233 --> 00:25:14,500
You have one bad customer
and you get 3 million because information
00:25:14,500 --> 00:25:16,966
can really just iterate on itself
much faster.
00:25:16,966 --> 00:25:20,166
Well, I think the key, Rob, is
00:25:21,166 --> 00:25:21,633
you need to
00:25:21,633 --> 00:25:24,633
understand how to balance automation
and intimacy.
00:25:25,033 --> 00:25:25,700
oh. That's good.
00:25:25,700 --> 00:25:27,600
Okay.
Why is that important? So I'll say more.
00:25:27,600 --> 00:25:31,566
So,
when I was running digital at Nestlé,
00:25:31,633 --> 00:25:33,333
you know, one of my number one
00:25:33,333 --> 00:25:37,433
initiatives was implementing Service
Cloud across all the markets.
00:25:37,666 --> 00:25:42,600
You know, and the market
I most admired was in China,
00:25:43,200 --> 00:25:48,566
and it was the,
I think it was the Wyeth nutrition brand.
00:25:48,566 --> 00:25:51,333
And when I'd go to China, I would just
spend time in the call center.
00:25:51,333 --> 00:25:51,566
Yeah.
00:25:51,566 --> 00:25:55,300
And it was interesting
because they were really good at
00:25:57,000 --> 00:25:59,366
balancing
automation and intimacy.
00:25:59,366 --> 00:25:59,666
In fact,
00:25:59,666 --> 00:26:03,766
they were very good at converting,
like, moms to e-commerce opportunities.
00:26:03,766 --> 00:26:06,833
And they used WeChat,
which is like their version of.
00:26:07,166 --> 00:26:07,966
Facebook.
00:26:07,966 --> 00:26:11,233
Instagram. And they would kind of
00:26:11,233 --> 00:26:14,900
do it at scale, and they
wanted to automate a lot of it.
00:26:14,900 --> 00:26:17,666
But other times, if they knew
there was an upsell opportunity,
00:26:17,666 --> 00:26:20,666
they would, you know,
they would have folks, you know,
00:26:20,666 --> 00:26:24,433
doing the chat and kind of working it.
Or if a mom had a complaint
00:26:24,466 --> 00:26:28,666
about their child, then
you automatically defaulted to a human.
00:26:29,100 --> 00:26:29,866
And so I think there's
00:26:29,866 --> 00:26:33,366
just a balancing act on
how do you get the benefits of automation.
00:26:33,666 --> 00:26:35,400
And I think some of
that could get really good.
00:26:35,400 --> 00:26:38,766
I mean, if you can automate
a friendly bot,
00:26:38,766 --> 00:26:41,766
and they're just answering questions,
but
00:26:41,833 --> 00:26:44,333
with the happy face,
I think that can be a big benefit.
00:26:44,333 --> 00:26:46,133
But if there's anything that's on
00:26:46,133 --> 00:26:51,400
the negative side or the sensitive side
or the product has poison or whatever,
00:26:51,400 --> 00:26:53,033
then you want to immediately
go to a human.
00:26:53,033 --> 00:26:55,066
And I think it's going to be
really important or.
00:26:55,066 --> 00:26:57,000
Helpful for lots of other stuff. Yep.
Yeah.
00:26:57,000 --> 00:27:01,166
So I don't think people
should over-romanticize 100% automation.
00:27:01,166 --> 00:27:05,066
I think you just need
to find the right balance to do it right.
00:27:05,066 --> 00:27:06,533
And it's the same thing with the
00:27:06,533 --> 00:27:10,266
airlines. Certain things we like,
where we can kind of get automatic alerts
00:27:10,266 --> 00:27:13,500
and in other cases, just get on
the damn phone, for crying out loud.
00:27:13,633 --> 00:27:14,300
And they know.
00:27:14,300 --> 00:27:15,533
And there's a calculus
00:27:15,533 --> 00:27:19,166
that says if they completely ignore Pete
when he really needs us,
00:27:19,366 --> 00:27:21,600
he's going to go from Delta to American.
00:27:21,600 --> 00:27:23,366
Exactly. You know, so.
00:27:23,366 --> 00:27:27,433
Now, what would you tell young
people nowadays, in this state we're in?
00:27:27,433 --> 00:27:30,466
there's been more tech layoffs,
I think, than we've ever seen since the
00:27:31,133 --> 00:27:33,266
well, you and I went through it ourselves,
00:27:33,266 --> 00:27:38,433
the dot-com bubble burst.
There's a lot of tech layoffs.
00:27:38,433 --> 00:27:40,233
It's
a very shaky time for people
00:27:40,233 --> 00:27:42,766
that are in tech,
that have been in careers
00:27:42,766 --> 00:27:45,133
that have always been
thought to be untouchable.
00:27:45,133 --> 00:27:47,200
Right. Yeah. Oh big time.
00:27:47,200 --> 00:27:51,000
What do you say to those
emerging in their career at this time?
00:27:51,000 --> 00:27:53,333
Like, what advice
would you give them at this moment?
00:27:53,333 --> 00:27:55,733
I mean, number one,
just eyes wide open. Yeah.
00:27:55,733 --> 00:27:58,766
And it's funny,
my daughters, an incredible
00:27:59,166 --> 00:28:00,866
one of my daughters
is this incredible artist.
00:28:00,866 --> 00:28:02,333
And when you're talking about
she's only a sophomore
00:28:02,333 --> 00:28:06,666
and we were talking about college
and, and I was actually rattling
00:28:06,666 --> 00:28:09,666
off a number of design schools,
including UC's DAAP.
00:28:10,033 --> 00:28:12,866
And it was funny, like she was like,
00:28:12,866 --> 00:28:15,266
you know,
hey, is all that going to be real?
00:28:15,266 --> 00:28:16,300
It was very perceptive.
00:28:16,300 --> 00:28:18,766
It's sort of like,
you know, isn't that going to get kind of
00:28:18,766 --> 00:28:22,600
automated by AI? And, you know, some of it
will, but a lot of it won't.
00:28:22,600 --> 00:28:24,400
And I think we're going to have to
00:28:25,466 --> 00:28:26,433
remind our young
00:28:26,433 --> 00:28:29,433
people that we're also in a renaissance
of creative confidence.
00:28:29,733 --> 00:28:32,000
Yeah. Like, I'm
sure you feel the same thing.
00:28:32,000 --> 00:28:35,066
Like my creative confidence
quotient has gone up dramatically.
00:28:35,066 --> 00:28:37,466
Absolutely. Because I've got all this
help. Absolutely.
00:28:37,466 --> 00:28:41,466
I mean, you know, I'm writing better
with a little bit of help.
00:28:41,766 --> 00:28:46,166
I'm taking sometimes
scatterbrained concepts and turning them
00:28:46,166 --> 00:28:49,566
into, like, beautiful concepts
that are kind of, like...
00:28:49,566 --> 00:28:51,100
I definitely concur with that.
00:28:51,100 --> 00:28:54,633
And I think everyone's got artistic
qualities that are probably severely
00:28:54,633 --> 00:28:55,666
under leveraged. Yeah.
00:28:55,666 --> 00:28:58,600
So we need to figure out like how do we
00:28:58,600 --> 00:29:01,833
almost like bring more creativity
into the world using those tools.
00:29:01,833 --> 00:29:04,133
But we're going to have to lead
that thinking.
00:29:04,133 --> 00:29:06,633
But at the same time,
we may have to tell the programmer
00:29:06,633 --> 00:29:10,500
that programming as we knew it
before is not going to be as critical.
00:29:10,633 --> 00:29:10,833
Yeah.
00:29:10,833 --> 00:29:16,133
I mean, you've got the CEO of Nvidia
basically saying,
00:29:16,133 --> 00:29:19,900
all this obsession with teaching
every kid to be a coder is nuts.
00:29:20,366 --> 00:29:22,033
And he's the head of Nvidia. Yeah.
00:29:22,033 --> 00:29:26,266
And so now it's prompts, experimentation.
00:29:26,666 --> 00:29:28,866
How do you turn rocks over?
00:29:28,866 --> 00:29:30,000
How do you test and learn?
00:29:30,000 --> 00:29:31,866
I mean,
I think we're in a renaissance, right?
00:29:31,866 --> 00:29:35,933
I mean, I have written, I kid you not,
maybe I'm a bit of a freak,
00:29:36,333 --> 00:29:39,866
but since I discovered generative AI,
I've probably written 100 business concepts
00:29:40,066 --> 00:29:42,366
all the way down to, like, images. Yeah.
00:29:42,366 --> 00:29:46,333
I think my last team thought I was crazy,
but I am like a real entrepreneur.
00:29:46,333 --> 00:29:47,600
And it's like.
00:29:47,600 --> 00:29:49,433
And like the things
that when I was at P&G,
00:29:49,433 --> 00:29:51,133
that would take like months to kind of
00:29:51,133 --> 00:29:53,766
get done, I'm literally
doing overnight.
00:29:53,766 --> 00:29:54,266
Yeah.
00:29:54,266 --> 00:29:57,333
You know, I wrote a concept this morning
before I came here for a new product.
00:29:58,166 --> 00:30:03,000
And so, to some extent,
the ideas
00:30:03,000 --> 00:30:07,166
and the concepts that we have in
our mind can be formalized and laid out.
00:30:07,166 --> 00:30:09,600
And I think we just got to teach
the young people to do this.
00:30:09,600 --> 00:30:12,600
Like all the great businesses
I know, I actually.
00:30:12,866 --> 00:30:14,566
I agree, I have a bit of a slight...
00:30:14,566 --> 00:30:16,900
yeah, a different take on that.
00:30:16,900 --> 00:30:20,233
I would say
that you can do those things well
00:30:20,766 --> 00:30:24,466
because you also have some
training and experience, like...
00:30:25,566 --> 00:30:25,966
Right.
00:30:25,966 --> 00:30:29,066
Like, say, like you, you have these.
So I want to...
00:30:29,633 --> 00:30:32,633
and I still think there are skill sets
to learning how to program,
00:30:32,933 --> 00:30:35,933
but I
completely agree with this:
00:30:35,966 --> 00:30:37,833
Creativity
has to be a part of what you do.
00:30:37,833 --> 00:30:40,900
You can't be one-dimensional in
anything you do anymore.
00:30:41,066 --> 00:30:43,266
Like it is not. It is not going to be.
00:30:43,266 --> 00:30:46,133
So my advice would be
you don't be one dimensional, right?
00:30:46,133 --> 00:30:50,033
You can't... people want to go in
sometimes and say, I just want to do this.
00:30:50,466 --> 00:30:52,766
That world is done. Like, it won't work.
00:30:52,766 --> 00:30:53,333
To your point.
00:30:53,333 --> 00:30:54,566
I mean, think about some of the things
00:30:54,566 --> 00:30:57,533
that you typically learn in university,
like critical thinking, right?
00:30:57,533 --> 00:30:58,733
How to interrogate.
00:30:58,733 --> 00:31:00,100
Now that's as relevant as ever.
00:31:00,100 --> 00:31:04,166
You're going to be constantly
interrogating the bots. Yep.
00:31:04,166 --> 00:31:08,100
You know and it's going to be a long time
before the deep fakes go away or
00:31:08,133 --> 00:31:09,400
the hallucinations.
00:31:09,400 --> 00:31:14,533
And what is real, what is not,
what is a logical sequence of dialog.
00:31:14,533 --> 00:31:18,500
And I think these, you know, we may find
that the things we always wanted
00:31:18,500 --> 00:31:23,033
kids to learn at universities
are going to be amplified in this world.
00:31:23,033 --> 00:31:23,866
Exactly. Highly.
00:31:23,866 --> 00:31:28,000
And when I say,
like, I've got Plato in my pocket, loosely,
00:31:28,000 --> 00:31:31,266
I mean, this is a highly dialectical
process, right?
00:31:31,566 --> 00:31:34,566
I mean, you're going back and forth
like 20 times more than you would.
00:31:34,566 --> 00:31:36,433
But you have to know,
but you got to know how to do that.
00:31:36,433 --> 00:31:38,766
Well, but that's a skill.
And I think that's a skill.
00:31:38,766 --> 00:31:40,666
You might find that
some of the universities
00:31:40,666 --> 00:31:42,566
go back to some of the basics, right.
00:31:42,566 --> 00:31:44,966
Just to kind of get something. But like,
oh, communication.
00:31:44,966 --> 00:31:46,500
Not everybody knows how to communicate;
00:31:46,500 --> 00:31:48,233
more people are going to have
to really know how to communicate.
00:31:48,233 --> 00:31:49,333
It's very important.
00:31:49,333 --> 00:31:52,600
I also believe that creativity
is going to be highly important still.
00:31:52,966 --> 00:31:55,966
you know, I think we're both at a
00:31:56,366 --> 00:31:57,966
more creative moment,
but at the same time,
00:31:57,966 --> 00:32:00,966
I think we're, as a society
less creative than we've ever been.
00:32:01,100 --> 00:32:01,666
Right?
00:32:01,666 --> 00:32:03,800
I think about like musicians, for example.
00:32:03,800 --> 00:32:05,300
Like, I just like it.
00:32:05,300 --> 00:32:09,233
And this might make me sound old,
but I look at the music now
00:32:09,866 --> 00:32:12,200
and there's nowhere near the,
musical input.
00:32:12,200 --> 00:32:13,433
And I think this started a while ago.
00:32:13,433 --> 00:32:13,700
Right.
00:32:13,700 --> 00:32:17,700
But, like, think about what
Michael Jackson or even the original rappers did.
00:32:17,800 --> 00:32:19,500
They told stories,
they put things into it.
00:32:19,500 --> 00:32:23,466
Now you can automate things,
but it's not as good quality.
00:32:23,833 --> 00:32:25,600
People still like originality.
00:32:25,600 --> 00:32:26,433
So I think
00:32:26,433 --> 00:32:30,733
the age of artificial intelligence, like,
one of my friends, Lisa Fancy
00:32:30,733 --> 00:32:33,533
Flowers, she's actually in the crypto world
and in the AI world.
00:32:33,533 --> 00:32:35,633
She said, in
a world of artificial intelligence,
00:32:35,633 --> 00:32:37,433
you need authentic intelligence.
00:32:37,433 --> 00:32:38,300
We are one of one.
00:32:38,300 --> 00:32:41,266
You can only fake talent
so far, right? Right.
00:32:41,266 --> 00:32:45,700
And sometimes I feel artificially
more musical than I really am.
00:32:45,700 --> 00:32:48,900
But, you know, you still have to have
the foundation of talent.
00:32:48,933 --> 00:32:51,300
You do seem like it.
They help you amplify, it's not...
00:32:51,300 --> 00:32:52,000
That's my point.
00:32:52,000 --> 00:32:55,000
Like the tool should be seen that way.
00:32:55,233 --> 00:32:57,066
And I think people are still.
00:32:57,066 --> 00:32:59,366
To your point, you mentioned earlier
when you talked about
00:32:59,366 --> 00:33:01,066
I never heard it's phrased this way,
00:33:01,066 --> 00:33:05,033
you know, making sure you had a mixture
between automation and intimacy.
00:33:05,366 --> 00:33:08,933
I look at it the same way, like authenticity
and artificial intelligence.
00:33:08,933 --> 00:33:11,033
You need to have authenticity. Absolutely.
00:33:11,033 --> 00:33:12,266
And that's what's going to differentiate.
00:33:12,266 --> 00:33:15,000
What I love about what
Kendra and, and Helen are doing.
00:33:15,000 --> 00:33:15,333
Absolutely.
00:33:15,333 --> 00:33:17,900
I mean, that whole human in the loop,
it's a big part of our model.
00:33:17,900 --> 00:33:19,500
Like it's interesting, like I am.
00:33:19,500 --> 00:33:21,166
I'll give you a good example. So we're,
00:33:22,500 --> 00:33:25,433
putting
out reports that are very data based,
00:33:25,433 --> 00:33:29,666
but throughout the report, you can click
a button and there's Pete in the studio
00:33:29,666 --> 00:33:33,466
explaining how to use it
with all of Pete's passion and authority.
00:33:33,466 --> 00:33:34,966
And it's really called data theater.
00:33:34,966 --> 00:33:36,866
Yeah, I've got to get myself
an avatar too.
00:33:36,866 --> 00:33:37,866
I got to do that. But, you know.
00:33:37,866 --> 00:33:38,600
But it's real.
00:33:38,600 --> 00:33:41,833
It's the real me because,
you know, data alone is not good enough.
00:33:41,833 --> 00:33:42,366
Sometimes.
00:33:42,366 --> 00:33:45,366
I'm always trying to channel my,
you know, guilt-tripping
00:33:45,366 --> 00:33:49,000
Italian mother to kind of get people
to, like, act on the data.
00:33:49,000 --> 00:33:50,500
Data alone won't get
00:33:50,500 --> 00:33:50,866
acted on.
00:33:50,866 --> 00:33:53,733
And again, it's, it's that balancing act
that we need to perfect.
00:33:53,733 --> 00:33:54,600
And it's the human in the loop.
00:33:54,600 --> 00:33:56,566
Yeah. Data is
guidance; it's not determinative.
00:33:56,566 --> 00:33:58,766
It's something
you can use as a map.
00:33:58,766 --> 00:33:59,700
But it won't. It won't.
00:33:59,700 --> 00:34:01,433
It won't mean anything
if you don't use it.
00:34:01,433 --> 00:34:01,833
Yeah.
00:34:01,833 --> 00:34:04,833
It's not like we do the things we've been told
to do just because rationally it's right.
00:34:04,866 --> 00:34:05,800
Yeah. Eat your peas.
00:34:05,800 --> 00:34:06,300
Yeah.
00:34:06,300 --> 00:34:08,700
It's like, you know,
it's like every study will tell you
00:34:08,700 --> 00:34:09,933
it's right, but you need, like.
00:34:09,933 --> 00:34:14,100
Yeah, Mom to say, goddammit,
or you're not going to do it, you know.
00:34:14,566 --> 00:34:16,766
Yeah. I mean, we're not
rational people, so.
00:34:16,766 --> 00:34:17,433
We're good to go.
00:34:19,300 --> 00:34:22,900
So, I want to get to a few kind of,
like, questions that I ask people.
00:34:22,900 --> 00:34:26,533
So you have a committee of three
to advise you on
00:34:26,866 --> 00:34:28,833
life, business, whatever you want.
00:34:28,833 --> 00:34:31,833
Tell me who these three people are
and why.
00:34:34,500 --> 00:34:37,766
I really admire
John Pepper, the former CEO of P&G.
00:34:38,066 --> 00:34:41,066
I think he is just, I was
00:34:41,166 --> 00:34:44,500
emailing with him this morning,
and he's the type of person,
00:34:44,500 --> 00:34:47,566
if he asked me to jump,
I'd say, how high?
00:34:47,566 --> 00:34:50,666
And he's consistently there
and just always
00:34:50,666 --> 00:34:53,666
kind of wise, ethical,
00:34:54,500 --> 00:34:56,900
kind of,
00:34:56,900 --> 00:34:57,700
a statesman.
00:34:57,700 --> 00:35:00,600
You know, there's some,
00:35:00,600 --> 00:35:01,466
you know,
00:35:01,466 --> 00:35:04,033
I won't isolate it to one individual,
but there's some folks
00:35:04,033 --> 00:35:05,233
from my Nestlé experience.
00:35:05,233 --> 00:35:08,233
I would just really like to bring them
to the table.
00:35:08,533 --> 00:35:09,033
And I think
00:35:10,100 --> 00:35:10,400
a lot
00:35:10,400 --> 00:35:13,400
of, you
know, family members are always there.
00:35:13,500 --> 00:35:14,933
In so many respects.
00:35:14,933 --> 00:35:16,233
So you've got one. Who else?
00:35:16,233 --> 00:35:17,600
You know what? It used.
00:35:17,600 --> 00:35:20,200
Oh, I need to think a little bit harder
about,
00:35:20,200 --> 00:35:23,200
you know,
just because you might say I'm leaning on.
00:35:23,200 --> 00:35:26,466
For certain things,
I lean on my kids.
00:35:26,900 --> 00:35:27,300
That's fine.
00:35:27,300 --> 00:35:30,333
And that's a good one. My daughter Layla
is like a source of incredible insight.
00:35:30,333 --> 00:35:33,600
So my daughter Sophia, my son Liam, and,
you know,
00:35:33,600 --> 00:35:37,466
you have to, like,
be humble to know what you don't know.
00:35:37,466 --> 00:35:39,400
And there's almost
a certain innocence in
00:35:39,400 --> 00:35:40,866
Yeah, the world that they see.
00:35:40,866 --> 00:35:46,000
And so, but I, you know, I generally try
to, you know, listen to a lot of signals.
00:35:46,533 --> 00:35:48,500
About what you don't know. That's great.
00:35:48,500 --> 00:35:50,966
May I ask you the opposite
of that question? Yeah.
00:35:50,966 --> 00:35:52,966
What do you know
for sure? The Oprah question.
00:35:56,400 --> 00:35:57,400
And I know that I don't
00:35:57,400 --> 00:36:00,866
know. I'm kind of stealing from Plato
or Aristotle there, but,
00:36:01,266 --> 00:36:03,900
you know, I'm pretty humble about
00:36:03,900 --> 00:36:06,900
the fact that there's still a lot I don't know,
00:36:07,266 --> 00:36:09,900
and,
00:36:09,900 --> 00:36:13,566
and that, you know, knowledge
is a real quest.
00:36:13,566 --> 00:36:18,233
It's never entirely
just kind of an end point.
00:36:19,633 --> 00:36:23,666
I'm really fanatical about
00:36:25,166 --> 00:36:28,166
understanding.
00:36:29,233 --> 00:36:30,666
That's kind of guided everything I've done.
00:36:30,666 --> 00:36:33,666
And to some extent,
a lot of that is in the
00:36:33,733 --> 00:36:36,733
you know, trust your inner consumer.
00:36:37,300 --> 00:36:38,733
Yeah,
00:36:38,733 --> 00:36:39,566
that you're probably going
00:36:39,566 --> 00:36:42,566
to find all sorts
of great business ideas and the like.
00:36:42,566 --> 00:36:43,433
All right.
00:36:43,433 --> 00:36:46,533
What's an important truth you have that
very few people agree with you on?
00:36:47,066 --> 00:36:50,066
It's always a fun one for me.
00:36:50,133 --> 00:36:52,900
And I would say it goes back to what I said
before, I think, you know, the
00:36:52,900 --> 00:36:55,766
trust your inner consumer.
I gave a TED talk on it.
00:36:55,766 --> 00:36:59,900
I think, I think we lose that sometimes.
00:36:59,900 --> 00:37:02,900
And what specifically is
that, the belief that people don't
00:37:03,166 --> 00:37:04,666
agree with you on? I don't understand.
00:37:04,666 --> 00:37:06,266
Tell me,
what do you mean?
00:37:06,266 --> 00:37:09,400
Like what's an important truth you have
that most people may disagree with?
00:37:10,200 --> 00:37:11,700
Some people may disagree with that.
00:37:11,700 --> 00:37:13,866
It's almost like
because I've always believed that
00:37:13,866 --> 00:37:17,033
if you trust your inner consumer,
you're almost bringing more of an outside
00:37:17,033 --> 00:37:18,466
in activist perspective.
00:37:18,466 --> 00:37:20,100
Okay. I think in business sometimes.
00:37:20,100 --> 00:37:21,866
What does that mean
to trust your inner consumer?
00:37:21,866 --> 00:37:25,133
So it basically means, so I've always been,
00:37:25,966 --> 00:37:28,133
I gave a TED talk
called The Work-Life Advantage,
00:37:28,133 --> 00:37:31,766
and it was all about how the things we do
in our personal lives
00:37:32,233 --> 00:37:36,533
and in digital really inform
how we become business leaders.
00:37:36,866 --> 00:37:39,300
So you've got to do social media yourself.
00:37:39,300 --> 00:37:41,866
It'll kind of teach you
about the power of iteration.
00:37:41,866 --> 00:37:46,333
You need to walk the talk, you need to,
Yeah.
00:37:46,333 --> 00:37:50,766
You just, you know, you need to become
a good e-commerce shopper to really speak
00:37:50,766 --> 00:37:54,200
with authority when you're managing a team
about winning with Amazon.
00:37:54,200 --> 00:37:55,200
And so it's like
00:37:55,200 --> 00:37:58,500
try it. And you also, you know, as
consumers, we're pretty damn critical.
00:37:58,500 --> 00:37:58,833
Right.
00:37:58,833 --> 00:38:00,333
And sometimes we let it out.
00:38:00,333 --> 00:38:02,933
We're kind of, you know, we get
we get really angry.
00:38:02,933 --> 00:38:09,200
We tell 3,000 people. And that journey is
what makes us really sharp.
00:38:09,200 --> 00:38:13,166
Now why do I feel like
I'm qualified to go do a startup
00:38:13,466 --> 00:38:15,500
when a million startups are doing AI?
00:38:15,500 --> 00:38:18,866
Honestly, I've been trusting
my inner consumer.
00:38:18,866 --> 00:38:21,733
I'm using this stuff like 100 times a day.
00:38:21,733 --> 00:38:25,700
I see a future that others don't
because I'm channeling my consumer side.
00:38:25,700 --> 00:38:31,166
And I think that's always going to be my
source of truth, my source of motivation.
00:38:31,600 --> 00:38:35,566
And I don't think that
that truth will ever go away.
00:38:35,900 --> 00:38:36,566
All right.
00:38:36,566 --> 00:38:37,166
Final question.
00:38:37,166 --> 00:38:40,866
What does success look like for you or
your business or yourself in five years?
00:38:41,733 --> 00:38:45,300
And I want to make a big impact.
It's funny, I want
00:38:46,766 --> 00:38:50,100
monetary
wealth, but I'm not motivated by it.
00:38:50,100 --> 00:38:53,100
I'm very motivated by
00:38:53,366 --> 00:38:56,366
feeling that I did something
00:38:57,100 --> 00:38:59,333
that came from me,
00:38:59,333 --> 00:39:00,533
out of a positive place.
00:39:00,533 --> 00:39:05,166
Sounds lofty and idealistic,
but it's just where I am these days.
00:39:05,166 --> 00:39:09,066
It's like,
if people read my epitaph or
00:39:09,066 --> 00:39:12,366
give a speech at the funeral
and they don't say that I really moved
00:39:14,100 --> 00:39:16,566
the needle, did it very thoughtfully,
00:39:16,566 --> 00:39:19,966
I'll be yeah, I'll be really disappointed
wherever I am.
00:39:20,366 --> 00:39:22,166
And, yeah.
00:39:22,166 --> 00:39:25,400
So I just, I want
I want to drive real impact.
00:39:25,900 --> 00:39:26,966
I've always been that way.
00:39:26,966 --> 00:39:30,133
I mean, I grew up,
I wanted to be governor of California,
00:39:31,000 --> 00:39:33,166
and I, I started,
00:39:33,166 --> 00:39:34,600
and to some extent I've almost.
00:39:34,600 --> 00:39:36,166
You and I have that in common.
00:39:36,166 --> 00:39:36,966
Yeah. Yeah.
00:39:36,966 --> 00:39:37,766
PlanetFeedback.
00:39:37,766 --> 00:39:40,733
My first business was a bit of a
stick-it-to-the-man play.
00:39:40,733 --> 00:39:42,600
Yeah. It also had a business side to it.
00:39:42,600 --> 00:39:46,100
So I've always like blended
that, you know, drive impact,
00:39:46,400 --> 00:39:47,366
build a great business.
00:39:47,366 --> 00:39:48,000
And to some extent
00:39:48,000 --> 00:39:51,666
I feel like I'm doing that now
because I like the idea of accountability.
00:39:52,333 --> 00:39:55,333
Like, don't talk about sustainability
and not do it right.
00:39:55,433 --> 00:39:55,766
You know.
00:39:55,766 --> 00:39:56,900
So they might kind of blame it on the
00:39:56,900 --> 00:40:00,600
AI bot that says you're lying
or you're greenwashing, but I like that.
00:40:00,600 --> 00:40:04,266
It's like, you know, because I think
it also drives better outcomes.
00:40:04,266 --> 00:40:07,333
It might speed up companies
to actually meet
00:40:07,333 --> 00:40:10,300
their green commitments, or realize
they can't go slow.
00:40:10,300 --> 00:40:12,566
Right. You know,
so those are the things I really like.
00:40:12,566 --> 00:40:14,266
Yeah. Pete, it's been a pleasure,
man. Yeah.
00:40:14,266 --> 00:40:16,566
Thanks for all your leadership.
Thank you.
HOSTED BY
ROB RICHARDSON
In this episode, Rob Richardson interviews entrepreneur Pete Blackshaw, who shares his pioneering journey of using AI to protect brands. Blackshaw discusses how artificial intelligence is revolutionizing the industry, providing a glimpse into the future and practical insights on how businesses can harness this technology for brand protection. He highlights the necessity for companies to adapt and the transformative potential of AI in managing brand integrity. This episode is essential listening for business leaders and marketers keen to navigate the intersection of AI and brand management effectively.
DISRUPTION NOW LINKS:
Watch the episode: https://www.youtube.com/@Disruptnow
Listen to the podcast episode: https://share.transistor.fm/s/4389eb03/
CONNECT WITH THE HOST
ROB RICHARDSON
Entrepreneur & Keynote Speaker
Rob Richardson is the host of the Disruption Now Podcast and the owner of DN Media Agency, a full-service digital marketing and research company. He has appeared on MSNBC and America This Week, and is a weekly contributor to Roland Martin Unfiltered.