00:00:00,000 --> 00:00:03,433
the digital world, it scales differently.
00:00:04,100 --> 00:00:08,633
So you don't need to call one latchkey
service at a time
00:00:08,900 --> 00:00:12,500
with AI, with being online,
you can replicate things very,
00:00:12,533 --> 00:00:15,900
very quickly
and it can be a one man shop or operation.
00:00:15,900 --> 00:00:20,033
So there's a huge difference in scale
when we're looking at this online.
00:00:20,566 --> 00:00:22,500
If you believe
we can change the narrative.
00:00:22,500 --> 00:00:24,700
If you believe,
we can change our communities.
00:00:24,700 --> 00:00:28,066
If you believe we can change the outcomes,
then we can change the world.
00:00:28,800 --> 00:00:30,366
I'm Rob Richardson.
00:00:30,366 --> 00:00:32,933
Welcome to Disruption Now.
00:00:32,933 --> 00:00:34,300
Welcome to Disruption Now.
00:00:34,300 --> 00:00:36,866
I'm your host and moderator,
Rob Richardson.
00:00:36,866 --> 00:00:39,700
Have you ever thought
about how your data is used?
00:00:39,700 --> 00:00:44,100
Or better yet, have you thought about
how data of your kids is being used?
00:00:44,833 --> 00:00:49,033
Do you have any idea about
how much is happening behind the scenes,
00:00:49,400 --> 00:00:52,400
how that can be manipulated,
how it's being manipulated?
00:00:53,000 --> 00:00:54,633
You know,
I don't think we thought much about it.
00:00:54,633 --> 00:00:56,366
We entered, and I'm going to tell my age here.
00:00:56,366 --> 00:00:59,333
We entered the social media era
and just everything was free.
00:00:59,333 --> 00:01:00,166
We were able to connect.
00:01:00,166 --> 00:01:02,566
It was all free. It was all great.
00:01:02,566 --> 00:01:04,200
But it's not free.
00:01:04,200 --> 00:01:05,633
Nothing's free in this world.
00:01:05,633 --> 00:01:07,733
That data is not free.
00:01:07,733 --> 00:01:09,466
You are the product.
00:01:09,466 --> 00:01:11,200
You and your kids are the product.
00:01:11,200 --> 00:01:13,133
The question is,
are you comfortable with that?
00:01:13,133 --> 00:01:14,366
Does that matter to you?
00:01:14,366 --> 00:01:19,066
Well, we're here to actually talk about
why data privacy matters. It's just not,
00:01:19,400 --> 00:01:20,133
you know,
00:01:20,133 --> 00:01:20,833
it's just not people
00:01:20,833 --> 00:01:23,833
that are just liberals out here
saying we need to protect our data.
00:01:24,133 --> 00:01:27,133
It's actually affecting all of us because
00:01:27,266 --> 00:01:30,833
whether you like it or not,
you are a part of the digital economy.
00:01:30,966 --> 00:01:33,966
And that's just going to amplify
now with artificial intelligence.
00:01:34,100 --> 00:01:37,866
So with me to talk about
how we can have a future that's actually,
00:01:38,100 --> 00:01:41,100
more about transparency and freedom
00:01:41,100 --> 00:01:44,400
and understanding how you can protect
yourself is John Kavanagh.
00:01:44,400 --> 00:01:46,733
He is the founder of the Plunk Foundation.
00:01:46,733 --> 00:01:48,066
And, he's gonna
00:01:48,066 --> 00:01:51,366
tell you a story about his journey
because it's a very interesting one.
00:01:51,533 --> 00:01:54,766
But I want you to understand
that you have the ability
00:01:54,766 --> 00:01:59,833
to protect yourself, to protect your kids,
but you have to know where to start.
00:01:59,833 --> 00:02:02,833
And we hope that by the end of this
episode, you'll actually have that.
00:02:02,866 --> 00:02:06,700
But before we start, make sure you like
and make sure you subscribe.
00:02:06,733 --> 00:02:08,433
That's how we're going
to keep the disruption growing.
00:02:08,433 --> 00:02:10,000
We appreciate you listening.
00:02:10,000 --> 00:02:13,333
And now I have the pleasure of introducing
John Kavanagh.
00:02:13,366 --> 00:02:14,366
John, how you doing, brother?
00:02:14,366 --> 00:02:15,166
Good. Rob.
00:02:15,166 --> 00:02:17,033
Thank you so much for the introduction.
Appreciate it.
00:02:17,033 --> 00:02:18,566
Hey, thank you so much.
00:02:18,566 --> 00:02:23,600
So you had an interesting journey
to get to data privacy.
00:02:23,600 --> 00:02:26,900
I like to start like this. This is such a—
00:02:26,900 --> 00:02:29,900
we met not so long ago.
00:02:29,966 --> 00:02:33,066
And to me, it was clear
that you had a clear mission and focus.
00:02:33,433 --> 00:02:35,966
And I have to say,
I consider myself pretty informed.
00:02:35,966 --> 00:02:39,666
But after I left our conversation,
I became even more concerned
00:02:39,700 --> 00:02:41,200
about data privacy.
00:02:41,200 --> 00:02:43,000
I was like,
oh, this is worse than I thought.
00:02:43,000 --> 00:02:48,000
And, I'm just curious, how did you get
into the world of data privacy?
00:02:48,000 --> 00:02:51,000
Like, how did this become
your kind of central why?
00:02:51,066 --> 00:02:51,433
Yeah.
00:02:51,433 --> 00:02:56,166
So everybody in privacy,
if you ask, they all have a wild story.
00:02:56,500 --> 00:02:59,300
Very few people just started out
passionate about this.
00:02:59,300 --> 00:03:01,300
And like you mentioned in the introduction,
00:03:01,300 --> 00:03:04,966
We all thought the products we were using
online were free and great.
00:03:05,200 --> 00:03:07,200
And yeah,
there were some advertising behind it.
00:03:07,200 --> 00:03:07,900
No problem.
00:03:07,900 --> 00:03:10,366
Where you see that
with television and stuff,
00:03:10,366 --> 00:03:14,400
but there's a dark side to it
and people learn that at their own pace.
00:03:14,700 --> 00:03:16,633
So a few people on my board, one was,
00:03:16,633 --> 00:03:20,366
a lawyer,
and he was just doing contract law,
00:03:20,533 --> 00:03:24,366
and he saw a drone go over his house,
and he was with his kids and family.
00:03:24,366 --> 00:03:27,733
And it started that question like,
you know, whose drone is that?
00:03:27,733 --> 00:03:28,033
Right?
00:03:28,033 --> 00:03:31,100
Why do they now have video of me
and my family?
00:03:31,233 --> 00:03:31,733
Right.
00:03:31,733 --> 00:03:34,633
So we all have those interesting ways.
00:03:34,633 --> 00:03:38,933
For me, it started back in undergrad
where I made a website.
00:03:38,933 --> 00:03:40,833
It was called Slate Up,
00:03:40,833 --> 00:03:43,633
and it was just a place
as a pre-med student
00:03:43,633 --> 00:03:46,700
where I wanted to meet other students
that were taking
00:03:46,700 --> 00:03:49,866
the MCAT or already in med school,
because it's a huge decision.
00:03:49,966 --> 00:03:50,866
It's like half
00:03:50,866 --> 00:03:54,066
a million dollars in debt to figure out
if you like being a doctor or not.
00:03:54,100 --> 00:03:54,533
Yeah.
00:03:54,533 --> 00:03:57,166
So it was a lot of cost
to figure that out.
00:03:57,166 --> 00:03:59,600
So I built a website
to connect people similar
00:03:59,600 --> 00:04:02,066
to like the old school version
of Facebook. Yeah.
00:04:02,066 --> 00:04:03,833
And that was their mission:
connect the world.
00:04:03,833 --> 00:04:04,333
Yeah.
00:04:04,333 --> 00:04:08,300
So I wanted to meet and connect
with people, especially professionally
00:04:08,300 --> 00:04:10,000
and within school.
00:04:10,000 --> 00:04:11,366
And it grew very fast.
00:04:11,366 --> 00:04:15,166
within the first three months,
we had one-fourth of UC on it.
00:04:15,166 --> 00:04:15,866
Yeah.
00:04:15,866 --> 00:04:20,533
And it grew in the Midwest,
Ohio State, Middle Tennessee,
00:04:20,533 --> 00:04:25,100
Clayton State University, lots of schools
in Ohio, Kentucky, Tennessee, Indiana.
00:04:25,566 --> 00:04:29,766
And as we were growing,
we got, research grants
00:04:29,766 --> 00:04:33,433
to keep it running, but eventually
we needed a series A funding.
00:04:33,433 --> 00:04:34,300
Sure.
00:04:34,300 --> 00:04:38,166
And we weren't
we weren't keeping any data of our users.
00:04:38,400 --> 00:04:40,233
We were like, oh, let's just,
00:04:41,500 --> 00:04:42,066
make sure
00:04:42,066 --> 00:04:45,066
that we're taking care of them
and that everything's protected.
00:04:45,133 --> 00:04:48,833
But we just want to make sure that
the people that are using our products,
00:04:49,200 --> 00:04:52,333
feel good about
you know, the environment that they're in.
00:04:52,366 --> 00:04:55,266
Right? And our privacy policy
is, hey, we're not selling anything.
00:04:55,266 --> 00:04:56,866
We're not pushing any ads.
00:04:56,866 --> 00:05:01,033
And we developed, a way to make money
by selling to universities.
00:05:01,200 --> 00:05:04,933
So we figured that would be our pipeline
and our revenue to make money,
00:05:05,400 --> 00:05:07,133
which everybody agreed upon.
00:05:07,133 --> 00:05:10,133
But when we were looking at that series
A funding
00:05:10,633 --> 00:05:14,166
every single investor, 50-plus, that we talked to,
00:05:14,166 --> 00:05:17,200
every single one was saying,
hey, what are you doing to sell the data?
00:05:17,200 --> 00:05:19,333
Because that's another revenue
stream, right?
00:05:19,333 --> 00:05:22,933
And you can't ignore it to be competitive
in this day and age with tech.
00:05:23,433 --> 00:05:25,866
And I started learning
about what that really meant.
00:05:25,866 --> 00:05:30,533
So when looking through the details
of what data mining was, how it's used,
00:05:30,666 --> 00:05:34,966
who it's sold to, you enter this huge dark
00:05:34,966 --> 00:05:38,833
side of the internet
that people are vaguely familiar with.
00:05:39,300 --> 00:05:42,300
But when you really dive
deep down about it,
00:05:42,366 --> 00:05:45,133
you understand how they're using various
00:05:46,266 --> 00:05:47,166
parameters about
00:05:47,166 --> 00:05:50,933
your life to nudge you to a decision.
00:05:50,933 --> 00:05:52,600
It can be a political decision, right?
00:05:52,600 --> 00:05:55,600
It can be in buying products
at a certain time.
00:05:55,666 --> 00:05:58,033
It can be understanding
your mental health state.
00:05:58,033 --> 00:05:58,900
Are you depressed?
00:05:58,900 --> 00:05:59,733
Are you sad?
00:05:59,733 --> 00:06:02,533
Well,
you tend to buy more in this direction.
00:06:02,533 --> 00:06:05,200
and we were primarily
focused on college students
00:06:05,200 --> 00:06:08,300
who have, by default, pretty bad literacy.
00:06:08,366 --> 00:06:10,433
financial literacy skills. Yes.
00:06:10,433 --> 00:06:14,466
So that was one thing was like, hey,
a lot of these students are getting a lump
00:06:14,466 --> 00:06:18,666
sum that is debt, essentially,
and spending it on beer and pizza
00:06:18,700 --> 00:06:19,766
right on the weekends.
00:06:19,766 --> 00:06:23,033
But if we have an opportunity
to slip our products in there as well,
00:06:23,533 --> 00:06:26,366
where they're paying 12% interest
on whatever
00:06:26,366 --> 00:06:30,200
T-shirt company it is,
then that's good for business.
00:06:30,200 --> 00:06:33,433
But in the long run, it's
not sustainable for our society.
00:06:33,733 --> 00:06:35,533
And I had a lot of objections.
00:06:35,533 --> 00:06:39,733
So, me and the core group,
we decided to close down the organization
00:06:39,733 --> 00:06:41,233
and build a nonprofit.
00:06:41,233 --> 00:06:43,466
So you mentioned me as the founder,
but there are.
00:06:43,466 --> 00:06:46,166
And I want to say—back
up for a minute, okay?
00:06:46,166 --> 00:06:49,600
People just have to I want to make sure
they absorb what you just said.
00:06:50,133 --> 00:06:52,700
A Series A funding is a big funding round.
00:06:52,700 --> 00:06:53,300
Yes, right.
00:06:53,300 --> 00:06:57,066
There is pre-seed, seed, Series
A. You're talking
00:06:57,066 --> 00:06:59,300
minimum 5 million dollars, probably.
00:06:59,300 --> 00:07:02,733
I don't know what it was then,
but I'm guessing it's around that like so
00:07:03,733 --> 00:07:06,200
very clear. Like
00:07:06,200 --> 00:07:08,800
millions of dollars was offered. Yes.
00:07:08,800 --> 00:07:11,566
And because you weren't comfortable
with the direction
00:07:11,566 --> 00:07:14,566
of where the investors wanted
you to go with the data.
00:07:14,666 --> 00:07:17,433
You turned it down
and you started a nonprofit.
00:07:17,433 --> 00:07:18,500
Just make sure people understand that.
00:07:18,500 --> 00:07:21,200
So like it's one thing to say
that you have these principles.
00:07:21,200 --> 00:07:23,200
It's another thing
to actually have done it.
00:07:23,200 --> 00:07:25,700
I have massive respect for that. Right.
Thank you.
00:07:25,700 --> 00:07:29,566
Couldn't have been easy, by the way, but,
like, it's,
00:07:30,200 --> 00:07:33,200
So I want to talk about
I want you to go down the nonprofit path.
00:07:34,166 --> 00:07:37,166
but I'd like to just deep dive into,
like, what
00:07:37,200 --> 00:07:40,466
really kind
of sparked you to say no to that?
00:07:40,466 --> 00:07:44,000
Because it sounds like you weren't against
the idea of ads in general.
00:07:44,233 --> 00:07:46,000
It sounds like there was something there.
00:07:46,000 --> 00:07:49,533
Were there other things that deeply
disturbed you to make you go from?
00:07:50,000 --> 00:07:53,533
It's one extreme to say, no, no,
I won't do this, to starting a nonprofit.
00:07:54,100 --> 00:07:57,800
Like, what were, like, the top things
that really just, like, stuck out.
00:07:57,800 --> 00:07:59,800
There
have to be some things,
like things that
00:07:59,800 --> 00:08:01,300
the investors,
the people, wanted to do
00:08:01,300 --> 00:08:03,066
that just didn't sit right with you.
00:08:03,066 --> 00:08:05,200
Like, what? What were those things?
00:08:05,200 --> 00:08:05,533
Yeah.
00:08:05,533 --> 00:08:09,733
So our primary focus was understanding
how students could feel at home
00:08:09,733 --> 00:08:10,933
at a university. Yes.
00:08:10,933 --> 00:08:15,133
And we were helping with,
reducing melt rate where people said,
00:08:15,133 --> 00:08:17,500
hey, I want to go to University
of Cincinnati, for example.
00:08:17,500 --> 00:08:21,600
But then, at the
beginning of the school year,
00:08:21,800 --> 00:08:25,533
they wouldn't come because they went
to Ohio State or something as an example.
00:08:26,233 --> 00:08:28,633
a lot of that is because they just didn't
feel a sense of community.
00:08:28,633 --> 00:08:32,533
So we were helping facilitate
that sense where the resident advisors,
00:08:32,566 --> 00:08:35,733
the people that live in the dorms,
and the orientation
00:08:35,733 --> 00:08:38,966
leaders
would connect in a better facilitated way.
00:08:39,000 --> 00:08:41,433
Right. so people felt more at home.
00:08:41,433 --> 00:08:44,100
and then, you know,
00:08:44,100 --> 00:08:47,933
to be honest, I was against ads
because the business model from.
00:08:48,066 --> 00:08:49,800
Yeah, the business model that we had.
00:08:49,800 --> 00:08:51,433
Right. Didn't need it. Right.
00:08:51,433 --> 00:08:55,933
So everything else was like,
hey, these are slightly—it was a steep,
00:08:56,400 --> 00:08:59,433
like a curve of more
and more invasive maneuvers.
00:08:59,466 --> 00:09:01,133
Right. Which was against the philosophy.
00:09:01,133 --> 00:09:03,333
It's like, hey,
we have a good revenue stream.
00:09:03,333 --> 00:09:05,066
universities are willing to pay for this.
00:09:05,066 --> 00:09:09,733
We had letters of intent to buy
and everything, so we were good to go.
00:09:09,966 --> 00:09:12,033
And, they wanted you to be.
00:09:12,033 --> 00:09:15,200
I feel where you're going with this. Yes.
00:09:15,200 --> 00:09:18,266
You didn't like the ads, but I also think
it was the level of what you would have
00:09:18,266 --> 00:09:22,866
had to compromise
it feels like, for selling.
00:09:22,866 --> 00:09:24,966
That would have meant that you would
have gone away from your mission.
00:09:24,966 --> 00:09:26,500
It feels like. Exactly.
00:09:26,500 --> 00:09:29,833
And here's another thing
is, I'm convinced in the tech world,
00:09:29,833 --> 00:09:32,833
if you are a for profit tech company,
unless you're a billionaire
00:09:32,833 --> 00:09:35,966
and you can fund everything yourself, at one
point you're going to be cash-strapped.
00:09:35,966 --> 00:09:39,133
So if it's not the series A
that we compromise on,
00:09:39,133 --> 00:09:42,600
it's going to be a Series B,
100 million or something like that.
00:09:43,000 --> 00:09:47,600
And what's going to happen is companies
or investment firms,
00:09:47,600 --> 00:09:50,600
they want somebody on the board
that makes decisions.
00:09:50,833 --> 00:09:53,566
And unless you are a super galactic
00:09:53,566 --> 00:09:56,866
unicorn like Zuckerberg,
who has all of the voting rights.
00:09:56,866 --> 00:09:59,700
But even—
but he still had to do that.
00:09:59,700 --> 00:10:02,433
And I don't think he has any problem
with it, actually.
00:10:02,433 --> 00:10:06,066
You know, like, I don't even think he
shares your moral dilemma.
00:10:06,300 --> 00:10:08,166
It's not—
I just think we've got to be honest. Right.
00:10:08,166 --> 00:10:10,800
But he's—like, he did share that.
00:10:10,800 --> 00:10:14,400
But you're right. But also
he's following the model that they want.
00:10:14,933 --> 00:10:15,733
Yeah.
00:10:15,733 --> 00:10:20,833
And, you know, he does it too. Like, builders
have a tough job. But I don't—
00:10:20,833 --> 00:10:24,366
But I do think that they don't do enough
for policy to protect people.
00:10:24,800 --> 00:10:26,233
And what you're saying is,
00:10:26,233 --> 00:10:27,966
if you would have done
that, you'd have had no ability
00:10:27,966 --> 00:10:30,000
to protect the people
that you're fighting for. Yes.
00:10:30,000 --> 00:10:31,466
And it can only get worse.
00:10:31,466 --> 00:10:33,500
Yeah. And I totally respect that.
00:10:33,500 --> 00:10:36,133
Like, to be able to do that—amazing.
This is why this is amazing.
00:10:36,133 --> 00:10:38,000
This is why I was moved to have you on.
00:10:38,000 --> 00:10:40,000
So sorry to interrupt you,
but go ahead. Good to go.
00:10:40,000 --> 00:10:44,200
Thank you, I appreciate it. But,
if I died tomorrow, let's say I'm running
00:10:44,200 --> 00:10:47,900
it, and I have decision rights, like, even
best case scenario, I have full decision rights.
00:10:48,300 --> 00:10:49,266
I die tomorrow.
00:10:49,266 --> 00:10:52,400
Whoever's taking my place,
there's nothing, there's
00:10:52,400 --> 00:10:55,733
no law protecting the original mission
that we have.
00:10:55,733 --> 00:10:58,733
Somebody can come in and say,
this is our business strategy.
00:10:58,900 --> 00:11:00,200
whereas a nonprofit has
00:11:00,200 --> 00:11:03,600
articles of incorporation
that the government enforces.
00:11:03,600 --> 00:11:03,966
Right.
00:11:03,966 --> 00:11:08,133
So if our founding team, by the way,
our founding team is amazing.
00:11:08,133 --> 00:11:08,866
There's four of us.
00:11:08,866 --> 00:11:12,033
If our founding team all dies tomorrow,
00:11:12,333 --> 00:11:16,466
whoever upholds it is legally bound
to uphold our Constitution.
00:11:16,700 --> 00:11:19,700
And to change that is extremely difficult.
00:11:19,966 --> 00:11:23,533
and the people that have to change
it have to look at our, our mission
00:11:23,533 --> 00:11:24,833
and see if that follows our mission.
00:11:24,833 --> 00:11:29,133
So, that's why I made the shift to
if we're doing something
00:11:29,133 --> 00:11:31,600
in the tech space,
it needs to be a nonprofit.
00:11:31,600 --> 00:11:33,233
Okay. That's amazing.
00:11:33,233 --> 00:11:36,233
So, you started the Plunk Foundation.
00:11:36,866 --> 00:11:39,066
Now, tell us. You moved from,
00:11:39,066 --> 00:11:42,066
you know, working to help the community
of college students.
00:11:42,533 --> 00:11:44,800
What is the foundation doing now?
00:11:44,800 --> 00:11:46,033
What is its mission?
00:11:46,033 --> 00:11:46,333
Yeah.
00:11:46,333 --> 00:11:49,666
So, the mission is to wake up
as many people in the same way
00:11:49,666 --> 00:11:53,933
that I was woken up when,
when I did this deep dive, it was like
00:11:54,166 --> 00:11:57,266
pulling back the Matrix and, like,
oh, this is really what's going on?
00:11:57,800 --> 00:12:00,233
and it's to do it
in a way to create awareness.
00:12:00,233 --> 00:12:03,366
So the four main principles
that we're working on: the first is creating awareness.
00:12:03,666 --> 00:12:05,900
We want to build education off of that.
00:12:05,900 --> 00:12:09,233
we want to create technology
that solves for it.
00:12:09,233 --> 00:12:11,366
So it's great to say, hey, guess what?
00:12:11,366 --> 00:12:14,000
You're screwed, Rob.
I don't have any solutions for you. Right?
00:12:14,000 --> 00:12:15,866
Okay. That doesn't help anybody. Yeah.
00:12:15,866 --> 00:12:18,133
If we have tools and technology.
00:12:18,133 --> 00:12:19,166
So technology and tools.
00:12:19,166 --> 00:12:21,300
So the second piece second half.
00:12:21,300 --> 00:12:24,300
So awareness,
education, technology, and tools, okay.
00:12:24,633 --> 00:12:29,133
To give people the steps
that they need to protect their privacy
00:12:30,533 --> 00:12:30,933
okay.
00:12:30,933 --> 00:12:34,900
The second part of that is that we are
also not only focused on privacy,
00:12:35,166 --> 00:12:37,400
but we're focused on digital safety, okay.
00:12:37,400 --> 00:12:38,566
And that's more encompassing.
00:12:38,566 --> 00:12:41,833
So similar to when we were kids,
our parents would say,
00:12:41,833 --> 00:12:44,833
don't talk to strangers or look both ways
before you cross the street.
00:12:45,100 --> 00:12:48,100
What does that mean in an ever
changing online landscape?
00:12:48,333 --> 00:12:50,200
What does it mean
when it comes to virtual reality?
00:12:50,200 --> 00:12:53,833
When it comes to AI, when it comes
to just using our phones or our laptops?
00:12:54,133 --> 00:12:59,233
And how do we inform our children,
how do we inform families, and keep that
00:12:59,233 --> 00:13:04,300
as a thread that we are on the pulse
for whatever advancing technologies come.
00:13:04,533 --> 00:13:04,933
Yeah.
00:13:04,933 --> 00:13:07,733
Give some examples.
Like so people think digital safety.
00:13:07,733 --> 00:13:10,066
and you gave me
some really great examples.
00:13:10,066 --> 00:13:13,366
But, like, people don't
appreciate what that means.
00:13:13,366 --> 00:13:16,066
Tell me, why should people care
about digital safety?
00:13:16,066 --> 00:13:18,766
Yeah,
I think people have some idea of privacy
00:13:18,766 --> 00:13:22,333
and that feels easier because it's like,
all right, you can, you can.
00:13:22,333 --> 00:13:24,533
You get to choose
what level you want to do.
00:13:24,533 --> 00:13:26,966
Do you want to have your information
there? Do you not.
00:13:26,966 --> 00:13:29,266
And that needs to be in an informed
consent way
00:13:29,266 --> 00:13:31,033
where we actually understand
what you're telling us. Yeah.
00:13:31,033 --> 00:13:33,233
Not a 10,000-page terms and conditions
00:13:33,233 --> 00:13:35,666
thing where you can sell your information,
it can be sold to anybody.
00:13:35,666 --> 00:13:37,666
Are you comfortable with that
or not? Yeah.
00:13:37,666 --> 00:13:40,433
And then people need to really have
that versus whatever we have now.
00:13:40,433 --> 00:13:43,833
It's, like, not informed consent,
I believe. That's pretty dumb.
00:13:44,233 --> 00:13:47,866
I think overall people understand what
privacy means and why that's important.
00:13:48,366 --> 00:13:49,800
What do you mean by digital safety.
00:13:49,800 --> 00:13:50,133
Yeah.
00:13:50,133 --> 00:13:53,633
So digital safety has a lot of,
general threads.
00:13:53,833 --> 00:13:59,166
So it can be a mix of cybersecurity,
such as when you are creating a password,
00:13:59,166 --> 00:14:01,733
are you creating a long password
or a short password?
00:14:01,733 --> 00:14:05,466
do you have authentication, multi-factor
authentication
00:14:05,633 --> 00:14:10,066
so that if somebody finds your password,
you get sent something on your phone
00:14:10,066 --> 00:14:12,466
saying, did you approve of this password
being sent?
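For anyone curious how that second factor works under the hood, here is a minimal sketch of a standard time-based one-time code (TOTP, RFC 6238), which is what most authenticator apps generate. The example secret and 30-second window are illustrative assumptions, not details from the show.

```python
# Minimal TOTP sketch (RFC 6238): the app and the server share a secret and
# both derive the same short-lived 6-digit code from the current time window.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval              # current time window
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

if __name__ == "__main__":
    # Hypothetical shared secret created when the user enrolls in MFA.
    print(totp("JBSWY3DPEHPK3PXP"))
```

Because the code changes every interval and is derived from a secret only you and the service hold, a stolen password alone is not enough to sign in.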
00:14:12,466 --> 00:14:14,966
There are little things,
like very basic things.
00:14:14,966 --> 00:14:17,566
Are you updating your devices regularly?
00:14:17,566 --> 00:14:18,600
there are things
00:14:18,600 --> 00:14:22,500
such as what are the next scams
that are happening on TikTok or Facebook?
00:14:22,733 --> 00:14:26,233
we now know with AI there have been
phone calls that are replicating
00:14:26,233 --> 00:14:29,233
somebody's voice because all you need
is a six second sample.
00:14:29,266 --> 00:14:33,033
So those types of things
that I need to understand,
00:14:33,033 --> 00:14:36,766
that if I'm a mom
and my kid calls me from a phone
00:14:36,766 --> 00:14:39,333
and it sounds like them,
but it's not registered, I don't know,
00:14:39,333 --> 00:14:40,533
it's their actual phone number.
00:14:40,533 --> 00:14:42,666
And they're saying, hey, mom,
I need money, right?
00:14:42,666 --> 00:14:44,733
For you to have some street smarts.
00:14:44,733 --> 00:14:46,500
Yeah, some digital street smarts.
00:14:46,500 --> 00:14:48,633
Yeah, that's a good way to think about it.
Yeah, exactly.
00:14:48,633 --> 00:14:49,633
That's a good that's a really good.
00:14:49,633 --> 00:14:50,700
That's a really good example.
00:14:50,700 --> 00:14:54,466
Like, I remember growing up,
my parents and I had,
00:14:54,466 --> 00:14:58,200
we had a password between the family
that only 3 or 4 people knew.
00:14:58,433 --> 00:15:02,100
That was so you would know, actually,
if somebody said
00:15:02,300 --> 00:15:05,633
that they were trying to pick you up
because this is going to age me again.
00:15:05,633 --> 00:15:06,800
But we were latchkey kids, right?
00:15:06,800 --> 00:15:10,166
So, like, sometimes we were,
which never happens now.
00:15:10,166 --> 00:15:14,166
Like, you get picked up by people
and aunts and uncles,
00:15:14,166 --> 00:15:17,700
and we would only know
it was okay if those people knew the password.
00:15:18,066 --> 00:15:19,433
Hardly
anybody ever picked me up, by the way.
00:15:19,433 --> 00:15:21,700
Usually I just went home myself.
00:15:21,700 --> 00:15:27,133
But they would tell us that
because kids were getting kidnapped, right?
00:15:27,133 --> 00:15:29,600
Where people were saying,
oh, your father—
00:15:29,600 --> 00:15:32,466
I'm a friend of your father's. Right?
And he told me to take you home.
00:15:32,466 --> 00:15:32,700
Yeah.
00:15:32,700 --> 00:15:35,766
And like, so kids
sometimes wouldn't know the difference.
00:15:35,766 --> 00:15:36,700
They'd know the father's name.
00:15:36,700 --> 00:15:38,900
They'd say, you know this,
I know this about your family.
00:15:38,900 --> 00:15:41,000
So you assume that kids are more trusting
00:15:41,000 --> 00:15:45,133
that haven't been through the level of,
exposure and experience that adults have.
00:15:45,233 --> 00:15:48,133
So we had, like,
passwords that we all had to know.
00:15:48,133 --> 00:15:51,100
And it's funny, like,
I feel like that needs to come back now.
00:15:51,100 --> 00:15:52,333
Like people. People didn't.
00:15:52,333 --> 00:15:54,833
People don't think about that as much now,
but it's it's
00:15:54,833 --> 00:15:56,133
because people don't get strangers.
00:15:56,133 --> 00:15:58,633
Don't pick up the kids. Most,
we have helicopter people now.
00:15:58,633 --> 00:16:01,666
Our moms and dads, like,
everything is like, encapsulated,
00:16:02,400 --> 00:16:05,033
but it's not really
like that with the digital world, is it?
00:16:05,033 --> 00:16:05,800
It's not.
00:16:05,800 --> 00:16:09,966
And the digital world,
it scales differently.
00:16:10,633 --> 00:16:15,166
So you don't need to call one latchkey
service at a time
00:16:15,433 --> 00:16:19,033
with AI, with being online,
you can replicate things very,
00:16:19,066 --> 00:16:22,433
very quickly
and it can be a one man shop or operation.
00:16:22,433 --> 00:16:26,566
So there's a huge difference in scale
when we're looking at this online.
00:16:26,666 --> 00:16:27,033
Yeah.
00:16:27,033 --> 00:16:30,133
I also remember an example
that we talked about when we first met.
00:16:30,133 --> 00:16:34,900
Yeah, about how digital safety affects
the most vulnerable populations.
00:16:35,100 --> 00:16:37,166
Like, I'd love for you
to talk through that example.
00:16:37,166 --> 00:16:41,733
We talked about, for example, women
00:16:41,733 --> 00:16:46,266
that are recovering from,
violent acts and domestic abuse.
00:16:46,266 --> 00:16:46,633
Yeah.
00:16:46,633 --> 00:16:49,400
Like that go to organizations like Women
Helping Women.
00:16:49,400 --> 00:16:52,400
You know, you really talked about
how digital safety can affect,
00:16:52,700 --> 00:16:55,833
people like that, that I never thought
about, talk about, like,
00:16:56,033 --> 00:16:59,200
why even the
most vulnerable populations
00:17:00,266 --> 00:17:02,800
could actually be, more at risk as well.
00:17:02,800 --> 00:17:03,166
Yeah.
00:17:03,166 --> 00:17:07,333
So what we're seeing a lot
now in the landscape through our research
00:17:07,333 --> 00:17:11,266
is that there's various
areas, there's human trafficking,
00:17:11,266 --> 00:17:14,000
there's intimate partner
violence and domestic abuse,
00:17:14,000 --> 00:17:15,966
that we're seeing
a lot of these things happening.
00:17:15,966 --> 00:17:19,033
And then there's school
bullying and sextortion.
00:17:19,033 --> 00:17:25,033
So, let's go to the first example:
in women's shelters,
00:17:25,266 --> 00:17:31,033
there's research that's been done
in Europe that found that 79% of women
00:17:31,233 --> 00:17:35,933
who went to a shelter for domestic abuse
were tracked in some way or form online.
00:17:36,133 --> 00:17:37,300
Wow. There.
00:17:37,300 --> 00:17:40,366
But the National Network to End
Domestic Violence, which is here in,
00:17:40,366 --> 00:17:45,800
in the US, you know,
they found that it was 100% of people,
00:17:46,266 --> 00:17:49,333
and now they surveyed,
probably like a couple thousand less.
00:17:49,333 --> 00:17:52,733
but either way, the survey was over
3000 people.
00:17:52,933 --> 00:17:56,600
So it is a significant amount of people
who are saying that online,
00:17:56,600 --> 00:17:57,433
I'm being tracked.
00:17:57,433 --> 00:18:00,400
And this could be
if my Snapchat location is on
00:18:00,400 --> 00:18:02,666
for those of you that use Snapchat. It could be
my car.
00:18:02,666 --> 00:18:05,500
that is being tracked. It could be my phone.
00:18:05,500 --> 00:18:07,066
It could be more,
00:18:07,066 --> 00:18:11,100
like the Apple AirTags,
which are very common to slip into a car.
00:18:11,100 --> 00:18:13,266
I just talked to a young woman
who had that happened to her.
00:18:13,266 --> 00:18:14,933
She has no idea who it's from.
00:18:14,933 --> 00:18:17,466
So there are various ways
of tracking and stalking.
00:18:17,466 --> 00:18:20,366
That's done.
So how does Snapchat tracking work?
00:18:20,366 --> 00:18:21,700
You can turn your location on.
00:18:21,700 --> 00:18:25,533
They have a piece
that's called Snapchat Maps.
00:18:25,566 --> 00:18:27,066
Right. And you can turn your location on.
00:18:27,066 --> 00:18:29,233
You can see where people are at at all
times. Yeah.
00:18:29,233 --> 00:18:30,866
And it's pretty common for people
00:18:30,866 --> 00:18:33,966
to have that on.
Some are very selective about
00:18:33,966 --> 00:18:37,000
which friends have it,
but some partners may demand
00:18:37,000 --> 00:18:40,933
that you have these types of things on,
and if you don't, you will get in trouble.
00:18:40,933 --> 00:18:41,566
As an example.
00:18:41,566 --> 00:18:44,433
And what type of tools do they have
in place, like you said, like this is
00:18:45,433 --> 00:18:46,866
like Snapchat should probably do
00:18:46,866 --> 00:18:51,166
a lot more to warn kids, or
maybe turn that on and off to say, do you?
00:18:51,466 --> 00:18:53,800
I know, like, Apple
at least started doing that.
00:18:53,800 --> 00:18:55,900
Like, do you want this to stay on,
or do you want to turn it off?
00:18:55,900 --> 00:18:57,966
But I think with kids
there probably needs to be
00:18:57,966 --> 00:19:00,566
another level of, protection and safety.
Absolutely.
00:19:00,566 --> 00:19:02,100
And kids are using it
all the time to figure out,
00:19:02,100 --> 00:19:04,933
like who's at what party
or where they're going with things to.
00:19:04,933 --> 00:19:07,100
That's that's tough. Like,
00:19:07,100 --> 00:19:10,100
so how do you balance this
when you think about like,
00:19:11,533 --> 00:19:16,700
how do we balance, obviously,
the innovation and having the right
00:19:16,700 --> 00:19:20,700
balance between what, I guess,
policy or regulation needs to look like?
00:19:20,933 --> 00:19:25,366
obviously you and I are both believers
in policy and regulation. Some will hear
00:19:25,366 --> 00:19:29,233
that and say, like, you create regulation,
it's going to kill innovation.
00:19:29,333 --> 00:19:30,133
Yeah, right.
00:19:30,133 --> 00:19:32,466
That's the line.
What's your response to that.
00:19:32,466 --> 00:19:34,700
Yeah, that's a that's a good question.
00:19:34,700 --> 00:19:39,933
The way that I think about it
is that most organizations
00:19:39,933 --> 00:19:43,933
that are, that are doing
this, have good intentions,
00:19:43,933 --> 00:19:47,900
they're trying to make good products,
and I don't blame them at all.
00:19:48,200 --> 00:19:51,200
The areas that I'm really looking toward,
00:19:51,466 --> 00:19:54,233
is when it comes to bad actors.
00:19:54,233 --> 00:19:56,500
So when it comes to corporate
policy and privacy,
00:19:56,500 --> 00:19:59,333
there is this culture of surveillance
capitalism. Yes.
00:19:59,333 --> 00:20:01,333
The Plunk Foundation doesn't
focus on that,
00:20:01,333 --> 00:20:04,833
although where that happens,
we're more focused on how we can help
00:20:04,833 --> 00:20:07,933
with bad actors, but, help
fight against bad actors.
00:20:07,933 --> 00:20:08,366
Sorry.
00:20:08,366 --> 00:20:11,633
But when we look
at the corporate area,
00:20:13,066 --> 00:20:14,366
the way that I see it
00:20:14,366 --> 00:20:17,466
and this is a more personal take,
but the way that I see it
00:20:17,466 --> 00:20:22,700
is that they have been using a technology
before regulation.
00:20:22,700 --> 00:20:26,633
So everything before regulation
is sort of a gravy train for them.
00:20:26,633 --> 00:20:27,600
But there are.
00:20:27,600 --> 00:20:30,433
But then the consequences that they have
00:20:30,433 --> 00:20:33,433
will not allow for what they're doing
to be sustainable in the long run.
00:20:33,433 --> 00:20:33,800
Right?
00:20:33,800 --> 00:20:37,866
If they're taking everything
and bastardizing it and sucking it in
00:20:38,066 --> 00:20:38,900
and churning it out,
00:20:38,900 --> 00:20:42,766
you're going to have a lot of problems
with the quality of data that exists.
00:20:43,033 --> 00:20:45,966
You're going to have a lot of problems of,
00:20:45,966 --> 00:20:50,533
just having massive amounts of siloed data
00:20:50,533 --> 00:20:53,266
that when cyber attacks happen,
now you have data
00:20:53,266 --> 00:20:56,033
you shouldn't have collected to begin with
or you should have purged.
00:20:56,033 --> 00:20:59,033
And now it's affecting us
as a national security risk.
00:20:59,200 --> 00:21:03,100
So, the Biden administration
actually released,
00:21:03,100 --> 00:21:07,233
around October,
a bill for cybersecurity
00:21:07,233 --> 00:21:11,200
where they're funding
people to get jobs in cybersecurity.
00:21:11,200 --> 00:21:14,133
And this does tie into privacy
and digital safety,
00:21:14,133 --> 00:21:18,166
but they want to because all of the top
businesses are not equipped for,
00:21:18,400 --> 00:21:21,400
for example, what China has
00:21:21,566 --> 00:21:25,400
when it comes to their, ability to hack.
00:21:25,400 --> 00:21:27,666
As an example,
when it comes to missile launching,
00:21:27,666 --> 00:21:30,100
when it comes
to, let's say, if P&G shuts down,
00:21:30,100 --> 00:21:34,233
you know, because of cyber, cyber
attacks or something along those lines,
00:21:34,366 --> 00:21:37,933
or a coordinated effort
that shuts down multitudes of hospitals.
00:21:38,066 --> 00:21:42,166
So we're looking at this from a national
security and defense, type of thing.
00:21:42,166 --> 00:21:45,666
And businesses in the United States
have a lot of freedom compared
00:21:45,666 --> 00:21:51,866
to a lot of other countries to navigate,
but we need some type of unity on
00:21:51,866 --> 00:21:55,800
how we are protecting our citizens,
which in turn protects the country.
00:21:55,933 --> 00:21:59,566
So it is a national security risk,
according to the Biden administration.
00:22:00,000 --> 00:22:01,200
that's very, very interesting.
00:22:01,200 --> 00:22:03,100
It's a national security risk.
00:22:03,100 --> 00:22:07,033
And so you kind of answered
one of my questions like, how do you see
00:22:07,033 --> 00:22:07,900
navigating this?
00:22:07,900 --> 00:22:12,133
If you were,
I guess, president and ruler for a day
00:22:12,133 --> 00:22:16,166
where you had control of Congress
and the presidency and the Senate,
00:22:17,500 --> 00:22:19,733
what laws would you pass
00:22:19,733 --> 00:22:23,433
to help us with both privacy
and digital safety?
00:22:23,733 --> 00:22:24,066
Yeah.
00:22:24,066 --> 00:22:29,166
So I think mimicking the GDPR is probably
the best place to start in America.
00:22:29,166 --> 00:22:30,433
Tell me what the GDPR is. Yeah.
00:22:30,433 --> 00:22:34,500
So that's the
European Union's privacy law.
00:22:34,700 --> 00:22:37,766
And it's, basically it's global.
00:22:37,766 --> 00:22:40,200
So it's like, imagine it
being a federal law here.
00:22:40,200 --> 00:22:43,200
And, it controls,
00:22:43,300 --> 00:22:46,266
what amount of data
a company can collect, the limits.
00:22:46,266 --> 00:22:49,866
And,
it's very complicated and very thorough.
00:22:50,533 --> 00:22:53,733
but the main thing is
that your privacy is a fundamental right.
00:22:54,166 --> 00:22:57,766
And to have access as a company
or as an organization to that private data,
00:22:57,766 --> 00:23:00,766
the person needs very clear
consent saying, yes.
00:23:00,833 --> 00:23:03,266
Exactly.
And they can revoke that at any time.
00:23:03,266 --> 00:23:06,200
And they say, hey, never mind.
I don't want you to have my data.
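To make that concrete, here is a minimal sketch of the consent-and-revoke pattern a GDPR-style rule implies. The ConsentRecord class and its fields are hypothetical, for illustration only, not anything from the GDPR text itself.

```python
# Hypothetical consent record: processing is allowed only while consent is active,
# and the person can withdraw it at any time.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                          # e.g. "newsletter", "analytics"
    granted_at: datetime
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        """Record the withdrawal; downstream processing should stop."""
        self.revoked_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.revoked_at is None

consent = ConsentRecord("user-42", "analytics", datetime.now(timezone.utc))
assert consent.active
consent.revoke()          # "Hey, never mind. I don't want you to have my data."
assert not consent.active
```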
00:23:06,200 --> 00:23:08,333
The only exceptions,
there's a few exceptions
00:23:08,333 --> 00:23:11,333
and I'm not an expert on GDPR,
but there are a few exceptions
00:23:11,700 --> 00:23:14,033
that when it comes to like a court case
as an example.
00:23:14,033 --> 00:23:16,933
Yeah. So fairly reasonable exceptions.
00:23:16,933 --> 00:23:17,300
Right.
00:23:17,300 --> 00:23:22,966
But the most important philosophical
thing, the difference, is that in
00:23:22,966 --> 00:23:26,733
the United States, privacy is an asset
that you trade for services.
00:23:27,766 --> 00:23:28,433
That's good.
00:23:28,433 --> 00:23:28,633
Yeah.
00:23:28,633 --> 00:23:31,600
And that's bad,
but it's a good line.
00:23:31,600 --> 00:23:33,433
so yeah it's a bar.
00:23:33,433 --> 00:23:34,233
Yeah. It's a bar.
00:23:34,233 --> 00:23:37,233
It's a bar. But, like, that is bad. Yes.
00:23:37,500 --> 00:23:40,066
Say that again one more time
because I think that's important.
00:23:40,066 --> 00:23:41,400
Drop the bar again. Okay.
00:23:41,400 --> 00:23:45,933
So in the United States privacy
is an asset that you trade for services.
00:23:45,933 --> 00:23:47,866
Convenience, whatever that is.
00:23:47,866 --> 00:23:50,466
In Europe, it's a fundamental human right.
00:23:50,466 --> 00:23:54,333
And the United Nations, actually,
they have a bill of fundamental
00:23:54,333 --> 00:23:54,933
human rights.
00:23:54,933 --> 00:23:58,666
Article 12 is about privacy
being a fundamental human right.
00:23:59,100 --> 00:24:04,333
And to be frank, in the developed world,
the US is very behind on this. Yes.
00:24:04,400 --> 00:24:06,133
and we have patchwork laws.
00:24:06,133 --> 00:24:10,200
So, like, California has the CCPA,
which is the California Consumer Privacy Act,
00:24:11,033 --> 00:24:14,400
and it's emulated from the European Union's
GDPR.
00:24:14,566 --> 00:24:16,633
Right. So they're they're using that.
00:24:16,633 --> 00:24:18,166
But that's only in California.
00:24:18,166 --> 00:24:18,766
Right.
00:24:18,766 --> 00:24:21,766
And Colorado has a few things
Virginia has a few things.
00:24:21,800 --> 00:24:23,933
Kentucky is passing a bill.
00:24:23,933 --> 00:24:27,233
And it's like a slap on the wrist
for organizations that violate it.
00:24:27,900 --> 00:24:29,000
But it's a start. Right?
00:24:29,000 --> 00:24:32,800
So there are various
we call it a patchwork framework.
00:24:32,966 --> 00:24:33,200
Right.
00:24:33,200 --> 00:24:36,433
but there's nothing federal
that's, that's happening.
00:24:36,433 --> 00:24:41,000
And that's something that would be
ideally the best place to start.
00:24:41,466 --> 00:24:44,900
Oh, no, that's that's,
very well said. So.
00:24:45,233 --> 00:24:47,900
All right. So we don't have policy now.
00:24:47,900 --> 00:24:50,800
Obviously, the US is the wild,
wild West.
00:24:50,800 --> 00:24:52,133
Privacy is an asset.
00:24:52,133 --> 00:24:55,133
It's not a right in the US,
unfortunately.
00:24:55,466 --> 00:24:57,833
What are some steps people can take?
00:24:57,833 --> 00:24:59,800
Like, what are some practical steps?
00:24:59,800 --> 00:25:04,200
You talked about having multiple
forms of authentication.
00:25:05,066 --> 00:25:08,433
What are some things that you think are
some basic things people can do
00:25:08,666 --> 00:25:12,200
or places they can go to learn about steps
they can take to protect themselves?
00:25:12,500 --> 00:25:15,600
Yeah. So I think that when it comes
00:25:15,600 --> 00:25:18,733
to individuals,
just being aware is half the battle.
00:25:18,933 --> 00:25:19,266
Yeah.
00:25:19,266 --> 00:25:22,300
Because when you're aware of
what's actually happening. There are a few
00:25:22,300 --> 00:25:22,933
resources.
00:25:22,933 --> 00:25:25,400
The Plunk Foundation's
website, it's plunkfoundation.org.
00:25:25,400 --> 00:25:26,933
Plunk.
00:25:26,933 --> 00:25:27,800
Yes. By the way, it's
00:25:27,800 --> 00:25:31,900
an acronym for peaceful, loving,
uplifting, nurturing and kind.
00:25:31,900 --> 00:25:33,700
So that's why Plunk. There you go.
00:25:33,700 --> 00:25:36,566
So plunkfoundation.org
has resources on this.
00:25:36,566 --> 00:25:39,566
There are other things
such as the Center for Humane Technology,
00:25:39,733 --> 00:25:43,266
the Mozilla Foundation,
DuckDuckGo has a blog.
00:25:43,266 --> 00:25:46,300
It's like a Google search engine
that's privacy-centric.
00:25:46,633 --> 00:25:50,100
There's the International Association
00:25:50,100 --> 00:25:53,100
of Privacy Professionals, the IAPP.
00:25:53,333 --> 00:25:55,300
Those are all really good resources.
00:25:55,300 --> 00:26:00,333
The EFF has really, really good
write-ups on things such as, like,
00:26:00,333 --> 00:26:04,866
police surveillance and how that's
being used and things you can do about it.
00:26:05,166 --> 00:26:08,166
And they also have a ton of lawyers
on their team
00:26:08,333 --> 00:26:13,100
that actually go to battle in court and
try to win cases when it comes to privacy,
00:26:13,500 --> 00:26:17,566
those are all really good things
in terms of practical, actionable steps.
00:26:18,033 --> 00:26:21,100
we are building a curriculum
at Plunk to do this.
00:26:22,100 --> 00:26:22,633
and to put it
00:26:22,633 --> 00:26:25,900
online and make it available,
but extremely basic.
00:26:25,933 --> 00:26:27,333
I'll give you a few,
like right off the bat.
00:26:27,333 --> 00:26:29,000
Yeah. extremely basic.
00:26:29,000 --> 00:26:32,133
one is
make sure your devices are updated.
00:26:32,633 --> 00:26:36,233
Even if you have good privacy
in place, if the device isn't updated,
00:26:36,233 --> 00:26:37,633
it's easy to get into.
00:26:37,633 --> 00:26:40,200
So make sure you update your device
hands down.
00:26:40,200 --> 00:26:44,033
That's the best thing. Two
is that you want to make sure
00:26:44,033 --> 00:26:48,866
that you have at least a junk
email address. That way, if you
00:26:49,433 --> 00:26:51,500
go to a store
and they ask for your email address
00:26:51,500 --> 00:26:55,933
for 20% off, 30% off, whatever that is,
typically I would say don't do it.
00:26:55,933 --> 00:26:57,700
But if you want to make it easy,
just have a junk
00:26:57,700 --> 00:27:02,066
email address, with information
that's not relevant to you.
00:27:02,066 --> 00:27:04,800
it doesn't have your address, stuff
like that.
00:27:04,800 --> 00:27:06,233
That's a good way to go about it.
00:27:06,233 --> 00:27:10,266
And here's a really interesting one that,
if you go to the doctor
00:27:10,500 --> 00:27:14,566
or if you go pretty much anywhere
and they want to scan your ID,
00:27:15,933 --> 00:27:16,300
they don't
00:27:16,300 --> 00:27:20,133
need to scan your ID, they can verify that
it's you and you can say, hey,
00:27:20,133 --> 00:27:23,466
I would like for you to verify
this is me, but you don't need to scan it.
00:27:23,666 --> 00:27:23,966
Right?
00:27:23,966 --> 00:27:26,133
Because a lot of companies take that,
they scan it,
00:27:26,133 --> 00:27:29,466
and now they have a ton of information
about you that they don't need,
00:27:29,633 --> 00:27:33,933
and they store it, or they resell it or
share it with other third party members.
00:27:34,266 --> 00:27:35,733
And you don't need to do that.
00:27:35,733 --> 00:27:36,466
So if you have a kid
00:27:36,466 --> 00:27:39,900
or anything along those lines too,
they don't need to do any of that scanning.
00:27:40,600 --> 00:27:42,833
you don't need to give your social
at a lot of places.
00:27:42,833 --> 00:27:45,733
So it's always good when somebody says,
May I have your social?
00:27:45,733 --> 00:27:49,200
You can ask, is that required for us
to continue this transaction?
00:27:49,200 --> 00:27:51,366
Right.
even at a doctor's office? It's not.
00:27:51,366 --> 00:27:54,466
It's often not, so
they don't need any of that information.
00:27:54,466 --> 00:27:56,833
So doctors will sell your stuff too.
00:27:56,833 --> 00:27:59,233
So yeah.
00:27:59,233 --> 00:28:01,366
I know
this isn't going to be a clickbait moment,
00:28:01,366 --> 00:28:05,166
but there's a law called,
or there's an act called HIPAA,
00:28:05,200 --> 00:28:07,133
which is the health information
privacy act. Yes.
00:28:07,133 --> 00:28:08,133
So they can do that.
00:28:08,133 --> 00:28:11,233
So what happens is
there's a barrier that you have to meet.
00:28:11,533 --> 00:28:15,733
You have to be some type of person
00:28:15,733 --> 00:28:20,166
or entity that crosses this
barrier of health information.
00:28:20,533 --> 00:28:22,200
Once you do,
00:28:22,200 --> 00:28:25,100
which means you meet
a lot of the federal regulations,
00:28:25,100 --> 00:28:27,200
you have to meet
all these federal regulations.
00:28:27,200 --> 00:28:29,633
Once you do,
you have the ability to transact
00:28:29,633 --> 00:28:32,666
with people
who have reached that barrier.
00:28:32,666 --> 00:28:33,166
Right? Yeah.
00:28:33,166 --> 00:28:37,066
So you can share
or they can share information.
00:28:37,066 --> 00:28:40,066
And some of that is like the part of me
that is conflicted on that.
00:28:41,933 --> 00:28:44,433
Where I'm not conflicted is, I understand
even when things are used for good
00:28:44,433 --> 00:28:47,600
purposes, people find ways to use them
for nefarious purposes, I get it.
00:28:48,366 --> 00:28:51,866
The other part of that, though,
is of course, the sharing of information
00:28:51,866 --> 00:28:56,233
helps to prevent, diseases, helps
you learn about causes.
00:28:56,533 --> 00:28:58,666
So, I mean,
00:28:58,666 --> 00:29:00,633
figuring out that balance of what that is,
00:29:00,633 --> 00:29:04,166
because I do think that's important
and I don't have the answers is important.
00:29:04,166 --> 00:29:05,766
I mean, I
00:29:05,766 --> 00:29:08,900
so I do think it has to be like,
because the sharing of that data,
00:29:08,900 --> 00:29:11,066
the problem is people don't trust
that you'll do something
00:29:11,066 --> 00:29:14,033
like what shouldn't be shared is
if somebody has a,
00:29:14,033 --> 00:29:17,000
a condition that shouldn't be shared
with the insurance company
00:29:17,000 --> 00:29:18,366
to figure out ways, right.
00:29:18,366 --> 00:29:20,266
so they can maybe not,
00:29:20,266 --> 00:29:22,033
or they're not supposed to be able
to not give you insurance.
00:29:22,033 --> 00:29:25,133
But we all know they can find ways
to make it difficult for things
00:29:25,133 --> 00:29:26,733
to be covered still.
00:29:26,733 --> 00:29:29,066
yes, that's the worry I have.
00:29:29,066 --> 00:29:31,033
But I'm like,
I also know the sharing
00:29:31,033 --> 00:29:34,900
of the information for things
like understanding health patterns.
00:29:34,900 --> 00:29:37,166
You introduce things like doppelgangers
where people
00:29:37,166 --> 00:29:39,133
have similar profiles to you.
00:29:39,133 --> 00:29:40,900
Like what do you think is the balance
there
00:29:40,900 --> 00:29:43,300
in terms of figuring out how we share data?
00:29:43,300 --> 00:29:44,000
But it's not.
00:29:44,000 --> 00:29:46,733
It's probably regulation to find out
the answer to my question on that. But.
00:29:46,733 --> 00:29:49,333
Well, it's
actually a pretty simple solution.
00:29:49,333 --> 00:29:51,566
Okay. And Facebook can do this
with advertising.
00:29:51,566 --> 00:29:53,133
Lots of organizations can do this.
00:29:53,133 --> 00:29:57,033
It's just that don't attach
personally identifiable information.
00:29:57,133 --> 00:29:59,666
Oh yeah. To it. Anonymize the data.
00:29:59,666 --> 00:29:59,900
Right.
00:29:59,900 --> 00:30:05,433
Like, hey, you know, maybe there's 7,000
cases in Cincinnati of this thing.
00:30:05,633 --> 00:30:08,600
We don't need to be able to trace it back
to who those people are.
00:30:08,600 --> 00:30:10,033
It's anonymized data.
00:30:10,033 --> 00:30:13,933
And, I know for the nerds out there,
they're going to say, well,
00:30:13,933 --> 00:30:17,533
AI has the ability to de-anonymize data
right through that.
00:30:17,833 --> 00:30:21,366
There are techniques that brilliant people
are coming up with, even in the universities.
00:30:22,033 --> 00:30:23,033
Yes. Yes.
00:30:23,033 --> 00:30:28,333
And, well, they add noise, so you can't
identify exactly who has what condition.
00:30:28,333 --> 00:30:31,266
And so there are brilliant people
coming up with these solutions.
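As a concrete illustration of the add-noise idea, here is a minimal sketch of a Laplace mechanism, the technique used in differential privacy to publish aggregate counts without exposing individuals. The epsilon value and the noisy_count helper are illustrative assumptions, not a description of any specific system mentioned in this episode.

```python
# Publish an aggregate count with Laplace noise so no single record is exposed.
import random

def noisy_count(true_count: int, epsilon: float = 1.0) -> float:
    # The difference of two independent Exp(1) draws is a standard Laplace sample;
    # scaling by 1/epsilon gives Laplace noise sized for a count (sensitivity 1).
    scale = 1.0 / epsilon
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_count + noise

# Example: report roughly "7,000 cases in Cincinnati" without exact, traceable numbers.
print(round(noisy_count(7000, epsilon=0.5)))
```

Smaller epsilon means more noise and stronger privacy; larger epsilon means a more accurate but less private count.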
00:30:31,266 --> 00:30:34,466
And to go to your question about,
how do we have,
00:30:34,466 --> 00:30:37,700
innovation, you know, and privacy.
00:30:37,700 --> 00:30:41,266
There's so much innovation happening
in this realm that protects us, too,
00:30:41,433 --> 00:30:43,566
and there's so much innovation
that's going on.
00:30:43,566 --> 00:30:47,133
I think, to answer my own question out
loud, I think if we do have policy
00:30:47,700 --> 00:30:50,800
and we start incentivizing investment
the same way we started doing,
00:30:50,966 --> 00:30:53,100
sustainability. Right, right.
00:30:53,100 --> 00:30:54,766
So now you're seeing all of this happen
00:30:54,766 --> 00:30:57,766
and all this happened all across Ohio,
but really all across the country,
00:30:57,900 --> 00:31:00,900
that there's just solar power farms
being built everywhere.
00:31:00,933 --> 00:31:02,400
Of course, there are people
that are trying to be against it
00:31:02,400 --> 00:31:05,233
because they want to keep the money train
the same way it's always been.
00:31:05,233 --> 00:31:07,966
I'm sure there will be people
that want to keep data
00:31:07,966 --> 00:31:09,700
the way it's always been,
because that's how
00:31:09,700 --> 00:31:10,700
they've always made their money.
00:31:10,700 --> 00:31:14,100
Yeah, but the point is, there's lots of
jobs being created
00:31:14,100 --> 00:31:16,466
that way, lots of opportunities
being created that way.
00:31:16,466 --> 00:31:17,300
Yeah.
00:31:17,300 --> 00:31:20,666
If we start having a renaissance towards
okay, we can protect people's privacy
00:31:21,133 --> 00:31:21,900
and still innovate.
00:31:21,900 --> 00:31:24,433
And we want to incentivize
that type of activity.
00:31:24,433 --> 00:31:27,033
That to me
is one of the main purposes of policy.
00:31:27,033 --> 00:31:27,200
Yeah.
00:31:27,200 --> 00:31:30,933
Policy is actually
a guiding principle to say
00:31:30,933 --> 00:31:33,933
this is the type of society we want to be,
and we never get it perfect.
00:31:33,933 --> 00:31:35,266
We can't overdo it.
00:31:35,266 --> 00:31:39,700
But I like the concept of
how do we incentivize people to innovate
00:31:39,733 --> 00:31:40,666
in ways that protect privacy.
00:31:40,666 --> 00:31:43,800
We have the
you talked about what you do in terms of,
00:31:43,800 --> 00:31:46,800
the solution to scrub out the noise
using AI.
00:31:46,933 --> 00:31:49,966
I mean, there's also blockchains
involved, and zero-knowledge proofs.
00:31:49,966 --> 00:31:53,433
You're able to hold data, like collect
data.
00:31:53,866 --> 00:31:56,300
That's totally private.
00:31:56,300 --> 00:32:00,600
But then you can still know about the data
without knowing the data.
00:32:00,600 --> 00:32:03,866
Like, I know it sounds weird, but it's the
same type of thing you're talking about.
00:32:03,866 --> 00:32:07,033
Like it's a way of proving things
without knowing the details of a person,
00:32:07,533 --> 00:32:10,000
which I think though
we have the technology to do.
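For a taste of the "prove it without showing it" flavor being described, here is a toy hash-commitment sketch. It is not a real zero-knowledge proof, which takes far more machinery; the commit and verify helpers are hypothetical and only illustrate committing to data now and proving later that it has not changed, without publishing it up front.

```python
# Toy hash commitment: publish a fingerprint of the data now, reveal the data
# later only if and when you choose, and anyone can check that it matches.
import hashlib
import secrets

def commit(value: str) -> tuple[str, str]:
    nonce = secrets.token_hex(16)                         # keeps the commitment unguessable
    digest = hashlib.sha256((nonce + value).encode()).hexdigest()
    return digest, nonce                                  # share digest; keep value + nonce

def verify(commitment: str, value: str, nonce: str) -> bool:
    return hashlib.sha256((nonce + value).encode()).hexdigest() == commitment

c, n = commit("patient-record-123")
assert verify(c, "patient-record-123", n)       # honest reveal checks out
assert not verify(c, "patient-record-456", n)   # a different value fails
```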
00:32:10,000 --> 00:32:12,800
This is just what problem
are we trying to solve as a society?
00:32:12,800 --> 00:32:16,733
And right now we've just said nothing's
more important than just how fast we can
00:32:16,733 --> 00:32:20,666
just, make profits without any type of,
thought about,
00:32:20,966 --> 00:32:23,866
should we do all of these things
and how should we do them?
00:32:23,866 --> 00:32:24,500
Yeah, totally.
00:32:24,500 --> 00:32:27,600
And one thing,
you know, that this is an issue
00:32:27,966 --> 00:32:32,633
that is sort of bipartisan,
you know, because you will talk to,
00:32:32,633 --> 00:32:36,100
a ton of different people
from both sides of the aisle.
00:32:36,433 --> 00:32:39,366
And, you know, it's
something that needs regulation
00:32:39,366 --> 00:32:43,100
when people on the right are saying, hey,
I don't even believe in regulation.
00:32:43,100 --> 00:32:46,533
But this, I know, we're in too deep.
00:32:46,533 --> 00:32:47,600
I know that there's.
00:32:47,600 --> 00:32:49,633
Something that they know
their kids are vulnerable.
00:32:49,633 --> 00:32:51,133
Yes. And another thing too,
00:32:51,133 --> 00:32:55,166
is that a lot of things that we're seeing
is that you could be an affluent family,
00:32:55,733 --> 00:32:59,933
but this could still happen to your kid
and all it takes is just one moment
00:32:59,933 --> 00:33:01,400
where, hey, mom. Hey, dad, I hate you.
00:33:01,400 --> 00:33:03,900
You don't understand me.
And then they put this online.
00:33:03,900 --> 00:33:08,400
I used to talk crap about my parents
under a tree house, right?
00:33:08,400 --> 00:33:09,166
Right.
00:33:09,166 --> 00:33:11,133
But the digital tree house exists.
00:33:11,133 --> 00:33:11,966
Yeah, where you're
00:33:11,966 --> 00:33:15,233
spewing that. And it just takes somebody
who can identify
00:33:15,233 --> 00:33:19,233
through sentiment analysis,
through using AI, and pinpointing
00:33:19,233 --> 00:33:22,700
exact locations of who may be having fits
with their parents at this moment.
00:33:23,033 --> 00:33:24,233
to understand.
00:33:25,333 --> 00:33:25,633
Okay.
00:33:25,633 --> 00:33:27,333
These are vulnerable children. Yeah.
00:33:27,333 --> 00:33:29,100
And it just takes one connection.
00:33:29,100 --> 00:33:32,100
You can buy their data for five bucks
to understand what their interests are.
00:33:32,466 --> 00:33:35,100
You can say, okay,
it just takes this one connection
00:33:35,100 --> 00:33:38,366
with this child who's having a bad moment
with their parents online.
00:33:39,000 --> 00:33:42,000
And that's all it takes to start
something like that.
00:33:42,166 --> 00:33:45,400
So people are understanding
like both sides of the aisle,
00:33:45,666 --> 00:33:49,933
all sides of the economic ladder,
there are people that are affected by this.
00:33:50,333 --> 00:33:53,100
What happens if we don't get this right?
00:33:53,100 --> 00:33:57,333
Well, I think it's going to be sort of
like climate change.
00:33:57,333 --> 00:33:59,966
It's going to get a lot worse
before it gets better.
00:33:59,966 --> 00:34:03,666
But I do see hope.
00:34:03,900 --> 00:34:09,366
So it's hard to predict
how AI will make this worse.
00:34:09,600 --> 00:34:12,666
It's hard to predict how bad actors
will use it, because they're very smart.
00:34:12,666 --> 00:34:13,666
I'm asking you to predict.
00:34:13,666 --> 00:34:16,366
What do you think that looks like?
If we don't get it right.
00:34:16,366 --> 00:34:17,366
I think
00:34:18,966 --> 00:34:21,833
it's going
to be a lot more like Brave New World.
00:34:21,833 --> 00:34:22,966
Have you seen Brave New World? Yes.
00:34:22,966 --> 00:34:25,100
Or. I mean, read the book.
00:34:25,100 --> 00:34:28,900
It's essentially
where people kind of understand
00:34:28,900 --> 00:34:31,900
that this is happening,
but they don't care that much.
00:34:32,033 --> 00:34:36,300
And there will be large
corporations or large governments
00:34:36,300 --> 00:34:40,333
that are using that to understand
everything about you and nudging
00:34:40,333 --> 00:34:41,800
how you think and what you think.
00:34:41,800 --> 00:34:44,533
Over time, there will be bad actors
00:34:44,533 --> 00:34:48,400
who are exploiting this to get their way
with whatever they want.
00:34:48,400 --> 00:34:51,733
and this could be not even
just from a kids and,
00:34:51,833 --> 00:34:54,766
you know, vulnerable population
perspective, but from policy.
00:34:54,766 --> 00:34:55,166
Absolutely.
00:34:55,166 --> 00:34:58,433
so it looks very bleak,
to be honest with you.
00:34:58,433 --> 00:34:59,000
So, yeah,
00:34:59,000 --> 00:35:03,366
it's a world where the algorithms
know us better than we know ourselves,
00:35:03,666 --> 00:35:05,266
and we don't even know
that that's happening.
00:35:05,266 --> 00:35:06,933
Like you said, and I.
00:35:06,933 --> 00:35:08,500
And to take it further,
00:35:08,500 --> 00:35:12,100
like we can get to the point where,
it sounds really freaky, right,
00:35:13,033 --> 00:35:14,200
that algorithms
00:35:14,200 --> 00:35:18,066
own most of what people and corporations
and nations own now.
00:35:18,066 --> 00:35:21,066
Now, that sounds weird until you realize
00:35:21,733 --> 00:35:24,733
that most holders of land aren't
00:35:25,133 --> 00:35:27,933
people, they're organizations.
00:35:27,933 --> 00:35:32,100
So imagine if
we're okay with algorithms,
00:35:32,200 --> 00:35:36,400
because this is very possible
that algorithms can eventually determine,
00:35:36,966 --> 00:35:40,633
who owns land, how policy is written,
all those things, without us even knowing.
00:35:40,766 --> 00:35:42,833
Like,
I know it sounds like science fiction,
00:35:42,833 --> 00:35:46,333
but it's not science fiction
when you understand that
00:35:46,333 --> 00:35:49,833
we don't have a guiding map
for what we want in society,
00:35:50,200 --> 00:35:54,466
how we're using artificial intelligence
to augment us, right?
00:35:54,733 --> 00:35:57,566
Augmented intelligence,
not artificial intelligence.
00:35:57,566 --> 00:35:59,833
And AI, everybody, is not new.
00:35:59,833 --> 00:36:02,700
This is algorithms
and machine learning.
00:36:02,700 --> 00:36:04,700
All that has been going on
for a long time now.
00:36:04,700 --> 00:36:07,600
We have another level of generative AI.
00:36:07,600 --> 00:36:10,866
And so like I'm with you, I think there's
a lot of hope and potential.
00:36:11,266 --> 00:36:14,266
But there's there's huge reasons
to be concerned.
00:36:14,600 --> 00:36:14,833
Yeah.
00:36:14,833 --> 00:36:18,433
It's hard to overstate
how important it is,
00:36:18,433 --> 00:36:21,400
but it always sounds like tinfoil
hat in a way.
00:36:21,400 --> 00:36:24,166
And it's impossible to predict
00:36:24,166 --> 00:36:27,133
because this is the thing I struggle with.
00:36:27,133 --> 00:36:30,800
We are generating so many billions of data
points,
00:36:31,000 --> 00:36:34,633
just like per second with AI,
and how that transforms
00:36:34,633 --> 00:36:38,000
without much regulation
could go any direction.
00:36:38,000 --> 00:36:41,266
And it's hard to predict
just how bad that could be.
00:36:41,266 --> 00:36:41,700
Yeah.
00:36:41,700 --> 00:36:46,100
and, you know,
if we don't look out for this
00:36:46,366 --> 00:36:49,600
at all,
we don't know how fast it can affect us.
00:36:49,600 --> 00:36:51,200
It'll go faster than we can imagine.
00:36:51,200 --> 00:36:52,900
Yeah, it's going to be exponential.
00:36:52,900 --> 00:36:55,333
It already is. It already is. Right.
Like we're growing.
00:36:55,333 --> 00:36:55,800
That's hard for
00:36:55,800 --> 00:36:56,833
people to understand. Yeah.
00:36:56,833 --> 00:36:57,266
As you know,
00:36:57,266 --> 00:37:01,733
exponential growth is not something
the human brain can actually understand.
00:37:02,033 --> 00:37:04,400
So people say, well, like,
what does that mean?
00:37:04,400 --> 00:37:06,633
It literally means it's growing so fast
you don't understand.
00:37:06,633 --> 00:37:07,300
Yeah. Right.
00:37:07,300 --> 00:37:10,833
From the human side, like, we understand as nerds
what exponential growth looks like.
00:37:10,833 --> 00:37:11,700
We can see it on a chart.
00:37:11,700 --> 00:37:14,833
Yeah, but we really can't visualize
exponential growth.
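As a rough illustration of why that intuition fails (these numbers are ours, not figures cited in the conversation), here is what plain doubling does in Python:

```python
# Doubling a quantity 30 times takes it from 1 to over a billion --
# the kind of curve a chart can show but intuition tends to flatten out.
value = 1
for step in range(1, 31):
    value *= 2
    if step % 10 == 0:
        print(f"after {step} doublings: {value:,}")
# after 10 doublings: 1,024
# after 20 doublings: 1,048,576
# after 30 doublings: 1,073,741,824
```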
00:37:15,133 --> 00:37:18,733
So it's moving so fast that, like,
we need to act. This is why, you know,
00:37:19,333 --> 00:37:21,033
you're going to be at MidwestCon.
00:37:21,033 --> 00:37:25,166
And this is why like we're really focused
on what does policy innovation look like.
00:37:25,166 --> 00:37:28,100
Because we think,
I think you agree with that,
00:37:28,100 --> 00:37:31,066
that having policy doesn't
mean that we don't have innovation.
00:37:31,066 --> 00:37:35,366
It means that we are building trust
so we can, build better innovation,
00:37:35,600 --> 00:37:39,133
have a flexible enough policy,
but a policy that sets clear rules
00:37:39,133 --> 00:37:42,400
so we understand
how we're operating with one another.
00:37:42,400 --> 00:37:43,833
Like I just think it's very important.
Agreed.
00:37:43,833 --> 00:37:46,833
Like we even
need to get to deepfakes
00:37:47,266 --> 00:37:51,233
and all those things that are really,
really concerning people.
00:37:51,233 --> 00:37:52,566
So like before we leave,
00:37:52,566 --> 00:37:54,566
I got a couple of lightning
round questions. Before I do this, like,
00:37:56,300 --> 00:37:58,333
if you had
to say something that will surprise people
00:37:58,333 --> 00:38:01,433
the most about the lack of privacy
or digital safety,
00:38:01,766 --> 00:38:04,466
can you think of a story or an example
that exemplifies
00:38:04,466 --> 00:38:07,466
that?
00:38:07,533 --> 00:38:09,133
What surprised you the most?
00:38:09,133 --> 00:38:15,233
Yes. On average,
there are 3,192 data points about you.
00:38:15,466 --> 00:38:18,233
And this is much more than what people
who are close to you, your friends,
00:38:18,233 --> 00:38:19,233
may know about you.
00:38:19,233 --> 00:38:24,133
So this is being sold
and, traded every single day
00:38:24,133 --> 00:38:27,400
and being aggregated every time
you accept terms and conditions,
00:38:28,000 --> 00:38:30,100
right, that invade your privacy.
00:38:30,100 --> 00:38:32,466
And I remember
the one thing that you told me about,
00:38:32,466 --> 00:38:35,466
when you send out a picture, the metadata.
Please tell people about that.
00:38:35,466 --> 00:38:36,033
White part.
00:38:36,033 --> 00:38:39,033
When you send out a picture
and the metadata behind it, what that
00:38:39,033 --> 00:38:41,866
what you can learn just from that, oh, yeah,
what one picture can do. Yeah. True.
00:38:41,866 --> 00:38:46,033
If you're uploading to Instagram
or Facebook or, various sites,
00:38:46,033 --> 00:38:49,166
you can right click on a computer
and inspect the element
00:38:49,433 --> 00:38:51,300
and it shows
what time of day this was taken.
00:38:51,300 --> 00:38:54,133
what camera it was taken from,
GPS coordinates.
00:38:54,133 --> 00:38:57,300
All of that stuff is not obfuscated,
for a lot of websites.
00:38:57,300 --> 00:39:00,900
So you're able to track people
based on where they took a picture.
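To make that concrete, here is a minimal Python sketch using the Pillow library (our choice of tool, not one named in the episode; the file names are placeholders). It prints the EXIF tags embedded in a photo, then writes a copy with the metadata stripped before sharing.

```python
# Inspect and strip photo metadata before sharing. Requires Pillow
# (pip install Pillow); "photo.jpg" is a hypothetical file name.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("photo.jpg")

# Print whatever EXIF the camera embedded: timestamps, camera model,
# and a GPSInfo entry (GPS details sit in a nested IFD on newer Pillow).
for tag_id, value in img.getexif().items():
    print(TAGS.get(tag_id, tag_id), value)

# Copy only the pixels into a fresh image and save that; the copy
# carries none of the original EXIF block.
clean = Image.new(img.mode, img.size)
clean.paste(img)
clean.save("photo_clean.jpg")
```

A quick spot check like this on your own camera roll is usually enough to show how many of your photos carry exact coordinates.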
00:39:01,200 --> 00:39:03,833
Yeah. So very scary stuff.
So just think about this.
00:39:03,833 --> 00:39:04,066
All right?
00:39:04,066 --> 00:39:05,500
So I want to get to a couple of lightning
00:39:05,500 --> 00:39:08,500
round,
lightning round questions about you.
00:39:08,566 --> 00:39:08,966
All right.
00:39:08,966 --> 00:39:11,966
So you have a committee of three,
living or dead,
00:39:12,066 --> 00:39:16,466
to advise you on business life, digital
privacy, safety, whatever you want.
00:39:16,500 --> 00:39:19,033
Tell me who these three people are
and why.
00:39:19,033 --> 00:39:22,066
Moxie Marlinspike
would be the first one who created
00:39:22,066 --> 00:39:25,400
the end-to-end encryption
that's used on WhatsApp and Signal.
00:39:25,733 --> 00:39:31,333
So that would be one. My grandpa,
who was a judge here in Cincinnati.
00:39:31,866 --> 00:39:32,733
Fantastic.
00:39:32,733 --> 00:39:35,733
and really levelheaded person.
00:39:35,733 --> 00:39:36,600
What's his name?
00:39:36,600 --> 00:39:37,700
Moxie Marlinspike.
00:39:37,700 --> 00:39:40,733
No, no. Your grandpa.
Norman Murdock.
00:39:40,733 --> 00:39:42,333
Okay. Yeah, he was a county commissioner.
00:39:42,333 --> 00:39:44,066
Judge for a while. Wow. Yeah.
00:39:44,066 --> 00:39:46,900
And then, you know, the third one,
00:39:49,466 --> 00:39:52,466
I would probably.
00:39:53,066 --> 00:39:55,800
And so many good people,
00:39:55,800 --> 00:39:58,800
I would probably look toward,
00:39:59,500 --> 00:40:02,200
somebody named Steve Shahan,
00:40:02,200 --> 00:40:05,700
who is a very interesting musician.
00:40:05,966 --> 00:40:12,000
Yeah, but he went to pretty much every
single country and learned about cultures,
00:40:12,233 --> 00:40:15,033
and it would be just so good
to get that perspective.
00:40:15,033 --> 00:40:19,200
because he's still alive now
and has seen how cultures have evolved.
00:40:19,400 --> 00:40:21,533
And he's extremely intellectual
in that way.
00:40:21,533 --> 00:40:23,633
And I think you're going
to need that perspective.
00:40:23,633 --> 00:40:26,233
But I think those are the three
that I would go for.
00:40:26,233 --> 00:40:28,766
All right.
00:40:28,766 --> 00:40:32,900
What's an important truth you hold that
very few people agree with you on?
00:40:32,900 --> 00:40:35,300
I like a hot take.
00:40:35,300 --> 00:40:37,266
Yeah.
00:40:37,266 --> 00:40:39,600
I don't like the Otter
00:40:39,600 --> 00:40:42,133
that is in every meeting.
00:40:42,133 --> 00:40:44,500
And I'm gonna write something about this.
00:40:44,500 --> 00:40:46,533
But you're guilty. You.
00:40:46,533 --> 00:40:47,800
That's what I. But.
00:40:48,766 --> 00:40:52,033
Yeah, but, I get emails about it
all the time.
00:40:52,033 --> 00:40:53,166
Every meeting I show up to.
00:40:53,166 --> 00:40:55,100
And then there are people
that don't show up to the meeting
00:40:55,100 --> 00:40:57,200
but still have it,
and then I'm not in charge of the meeting.
00:40:57,200 --> 00:40:59,633
And now I see, like,
all these things over.
00:40:59,633 --> 00:41:03,400
I've always order I not going to be a plug
for validation meetings.
00:41:03,800 --> 00:41:08,200
No, no, I was talking to John
Salisbury about it, yeah.
00:41:08,433 --> 00:41:12,033
Even just like when we have a meeting
or a presentation.
00:41:13,033 --> 00:41:16,266
making people turn off their phones, like,
if it's in person.
00:41:16,266 --> 00:41:16,966
Yeah, like.
00:41:16,966 --> 00:41:20,633
Demanding people turn off their phones
and just, like, creating attention.
00:41:20,633 --> 00:41:25,300
I also think Otter, on one hand, excuses
people from really paying attention
00:41:25,300 --> 00:41:29,166
because they think they can go back to
something. Where it's like, I want, like,
00:41:29,166 --> 00:41:31,200
if we're having a meeting,
I want full focus,
00:41:31,200 --> 00:41:33,266
I want good participation, and I don't.
00:41:33,266 --> 00:41:36,400
And, you know, this is a fleeting moment,
so you got to pay attention.
00:41:36,400 --> 00:41:37,600
So I like that a lot.
00:41:37,600 --> 00:41:39,533
No, I think that's really important.
00:41:39,533 --> 00:41:41,200
And I'm guilty of what you said.
00:41:41,200 --> 00:41:41,533
Right.
00:41:41,533 --> 00:41:44,566
But
I respect it and I know it's true.
00:41:44,933 --> 00:41:45,166
Right.
00:41:45,166 --> 00:41:47,833
Because, in the last interview
I had right before you,
00:41:47,833 --> 00:41:50,833
we were talking about the need
to make sure people understand that
00:41:50,966 --> 00:41:53,866
artificial intelligence
is not going to replace intelligence.
00:41:53,866 --> 00:41:55,633
You need to still be intelligent.
00:41:55,633 --> 00:41:58,100
You need to understand how to do this
and have focus.
00:41:58,100 --> 00:41:58,333
Right?
00:41:58,333 --> 00:42:01,333
Because those who are able to
00:42:01,466 --> 00:42:05,033
communicate their authentic intelligence,
they will be the winners.
00:42:05,033 --> 00:42:07,933
And in the age of artificial intelligence.
Totally. Right. Yeah.
00:42:07,933 --> 00:42:11,666
That takes, though,
focus; that takes presence.
00:42:11,666 --> 00:42:11,966
Yeah.
00:42:11,966 --> 00:42:15,133
That takes still grinding it out
and learning those hard parts.
00:42:16,033 --> 00:42:19,800
and no amount of artificial intelligence
will replace the need for authenticity.
00:42:19,833 --> 00:42:20,533
Totally.
00:42:20,533 --> 00:42:22,900
Like so I completely agree
with you on that.
00:42:22,900 --> 00:42:23,233
All right.
00:42:23,233 --> 00:42:26,566
So, a time
00:42:26,566 --> 00:42:29,566
you failed in your life
and how that made you better.
00:42:29,800 --> 00:42:31,633
I fail literally every single day.
00:42:31,633 --> 00:42:33,266
Every single day.
Welcome to the club. Yeah.
00:42:35,633 --> 00:42:36,900
it's always made me better.
00:42:36,900 --> 00:42:42,900
I think there was one point where, back
when I was first starting the startup
00:42:42,900 --> 00:42:46,466
you know, and related stuff,
where I wanted to quit.
00:42:46,800 --> 00:42:52,166
And I talked to my team about it,
and they gave me a week, and then
00:42:52,166 --> 00:42:55,900
I saw how much work my team was doing,
and I wasn't there to lead them.
00:42:55,900 --> 00:42:59,666
And I felt really bad about,
you know, wavering.
00:42:59,666 --> 00:43:00,033
Yeah.
00:43:00,033 --> 00:43:03,433
At that moment.
And I realized that leaders need
00:43:04,100 --> 00:43:06,633
extreme control of their emotions,
00:43:06,633 --> 00:43:10,300
and they need to step up
and not be the person that wavers.
00:43:10,300 --> 00:43:11,133
So it was either
00:43:11,133 --> 00:43:14,566
that I should not be a leader
or I need to step up in that context.
00:43:14,566 --> 00:43:17,666
And that was a good path
for me to like, really understand that.
00:43:17,733 --> 00:43:18,500
That's a great point.
00:43:18,500 --> 00:43:21,400
What I'll say, from one leader
to another, because I've felt it too:
00:43:21,400 --> 00:43:24,100
You also need a safe,
00:43:24,100 --> 00:43:26,833
a safe place to be vulnerable
00:43:26,833 --> 00:43:30,000
and tell it
because you're going to have doubts.
00:43:30,000 --> 00:43:32,833
You're going to have, difficult times.
00:43:32,833 --> 00:43:35,100
Now, that mentor may not be your team.
00:43:35,100 --> 00:43:38,200
Maybe, maybe, at some point,
maybe it can be.
00:43:38,200 --> 00:43:40,633
But having people
that have been through it,
00:43:40,633 --> 00:43:43,566
because it's always hard,
like it's always, always hard.
00:43:43,566 --> 00:43:48,333
Like even even when you get to where you
where you think you want to be,
00:43:48,566 --> 00:43:50,333
the new problems come about.
00:43:50,333 --> 00:43:50,933
Yeah. Right.
00:43:50,933 --> 00:43:52,933
And it's very hard
for people to relate
00:43:52,933 --> 00:43:55,033
to what you're doing.
Like, your family can't.
00:43:55,033 --> 00:43:55,400
They can't.
00:43:55,400 --> 00:43:56,866
That's why
you need to talk to other leaders as well.
00:43:56,866 --> 00:43:59,133
So I'm telling you, totally,
you need to get support
00:43:59,133 --> 00:44:02,566
and advice from important people,
and have them pour into you.
00:44:02,966 --> 00:44:03,833
You'll help them too,
00:44:03,833 --> 00:44:06,033
because you're going to,
you need to be vulnerable,
00:44:06,033 --> 00:44:08,433
because if you feel like
you always have to take it on,
00:44:08,433 --> 00:44:11,033
yeah, you might break too,
which also hurts your team.
00:44:11,033 --> 00:44:11,433
True.
00:44:11,433 --> 00:44:14,100
But I agree with you
in terms of regulating your emotion.
00:44:14,100 --> 00:44:16,033
You know, I've had to learn that too.
00:44:16,033 --> 00:44:16,900
And I'm still learning
00:44:16,900 --> 00:44:20,033
that it's a constant process
of regulating your emotions.
00:44:20,033 --> 00:44:23,533
But you have to be in a place
where you can also be vulnerable.
00:44:23,833 --> 00:44:26,766
and sometimes you have to show your team
vulnerability, too.
00:44:26,766 --> 00:44:30,533
I think all of this stuff is balance,
and I think your team showed you
00:44:30,766 --> 00:44:33,866
that they had your back,
and that sounds like that inspired you.
00:44:33,866 --> 00:44:35,366
Yeah, definitely. Yeah.
00:44:35,366 --> 00:44:36,100
All right.
00:44:36,100 --> 00:44:38,600
Final, final final lightning
round question.
00:44:38,600 --> 00:44:39,733
What's your slogan?
00:44:39,733 --> 00:44:43,400
The one that'll be
on your grave.
00:44:43,400 --> 00:44:43,833
What is it?
00:44:47,000 --> 00:44:49,466
Probably something like: we're all going to die.
00:44:49,466 --> 00:44:52,366
Enjoy it.
00:44:52,366 --> 00:44:53,066
That's true.
00:44:53,066 --> 00:44:55,166
You know, don't take it too seriously.
00:44:55,166 --> 00:44:57,733
All right, brother, good to see you. Good
to see you, too. Pleasure having you on.
Step into the digital arena with John Cavanaugh, a renowned digital privacy and safety expert, on this episode of "Encrypting Freedom." Uncover the hidden pitfalls in your digital practices and learn how to fortify your data against the encroaching eyes of AI technologies. John doesn't just outline the problems; he arms you with robust, actionable strategies to take control of your digital identity.
This episode isn't just about listening—it's about engaging. Discover resources, communities, and tools that will empower you to join the vanguard of digital defenders. "Encrypting Freedom: AI & The Data Battle" isn't just a conversation—it's a call to action. Tune in to transform your understanding and approach to digital privacy in the age of AI.
DISRUPTION NOW LINKS:
Watch the episode: https://www.youtube.com/@Disruptnow
Listen to the podcast episode: https://share.transistor.fm/s/419487c0/
CONNECT WITH THE HOST
ROB RICHARDSON
Entrepreneur & Keynote Speaker
Rob Richardson is the host of the Disruption Now Podcast and the owner of DN Media Agency, a full-service digital marketing and research company. He has appeared on MSNBC and America This Week, and is a weekly contributor to Roland Martin Unfiltered.