
00:00:00,208 --> 00:00:03,000
I can now produce
something that's pretty damn accurate,

00:00:03,000 --> 00:00:07,875
and there's no way of officially verifying
what's true and what's not true.

00:00:08,000 --> 00:00:10,791
And we really need policy around that.

00:00:10,791 --> 00:00:12,708
If you believe
we can change the narrative,

00:00:12,708 --> 00:00:14,875
if you believe
we can change our communities,

00:00:14,875 --> 00:00:18,916
if you believe we can change the outcomes,
then we can change the world.

00:00:19,000 --> 00:00:20,541
I'm Rob Richardson.

00:00:20,541 --> 00:00:23,083
Welcome to Disruption Now.

00:00:23,083 --> 00:00:25,291
Hey, fellow disruptors, happy 2024.

00:00:25,291 --> 00:00:27,250
Welcome to Disruption Now. I'm your host,

00:00:27,250 --> 00:00:29,041
the moderator, Rob Richardson.

00:00:29,041 --> 00:00:32,500
With me, the original crew is back:
the man,

00:00:32,583 --> 00:00:37,375
the legend, James Keys; the professor,
the connoisseur, Tunde Ogunlana.

00:00:37,375 --> 00:00:42,000
How are you guys doing?

00:00:42,083 --> 00:00:43,041
Good to see you both.

00:00:43,041 --> 00:00:46,291
And of course, for all of our subscribers,
please follow and like.

00:00:46,291 --> 00:00:51,333
It helps us reach more people
when you like and subscribe, so please do,

00:00:51,541 --> 00:00:54,666
and also sign up for Disruption Now
so you can learn about the latest events

00:00:54,875 --> 00:00:58,333
and some of the cool disruptive shit
that we're going to do in the future.

00:00:58,416 --> 00:01:00,708
Talking about disruptive shit: TikTok.

00:01:00,708 --> 00:01:01,875
TikTok is on

00:01:01,875 --> 00:01:05,458
everybody's brain, in everybody's mouth,
with people

00:01:05,625 --> 00:01:06,875
saying this should be banned.

00:01:06,875 --> 00:01:08,916
Obviously,
there's a lot of conversation going on

00:01:08,916 --> 00:01:11,666
given
the climate of what's happening right now.

00:01:11,666 --> 00:01:14,125
China has a very different position
than the United States

00:01:14,125 --> 00:01:16,083
on the Israel-Hamas war.

00:01:16,083 --> 00:01:18,541
China has different interests
from most of the Western nations.

00:01:18,541 --> 00:01:22,375
So of course, there's concern that this
the most popular social media app

00:01:22,375 --> 00:01:24,916
probably in the world,
TikTok, is owned by China

00:01:24,916 --> 00:01:28,208
and all that data is going to be used
for, I don't know what.

00:01:28,291 --> 00:01:29,625
Chinese drones, I don't know.

00:01:29,625 --> 00:01:33,250
But anyhow, that's the concern of many
that this

00:01:33,250 --> 00:01:37,500
that this tool, social media,
is being weaponized against us.

00:01:37,500 --> 00:01:41,416
So I want to start this conversation
and really talk about: is it

00:01:41,416 --> 00:01:43,750
a national security threat?
Is it threatening our kids?

00:01:43,750 --> 00:01:45,625
These are all the arguments we're hearing.

00:01:45,625 --> 00:01:48,708
And I want to make sure we're having a
different take than the rest of the world.

00:01:48,708 --> 00:01:49,875
We can hear what they say.

00:01:49,875 --> 00:01:51,666
But for us, you know,
we got the best minds

00:01:51,666 --> 00:01:53,708
as far as I'm concerned,
and the most disruptive.

00:01:53,708 --> 00:01:55,000
So let's get right to it.

00:01:55,000 --> 00:01:58,458
What do you think, starting off,
James: does

00:01:58,458 --> 00:02:42,333
TikTok present a national security threat,
or are people overhyping it?

00:02:42,416 --> 00:03:12,041
Clearly,

00:03:12,125 --> 00:03:12,541
yeah.

00:03:12,541 --> 00:03:13,291
What are your thoughts, Tunde?

00:03:13,291 --> 00:03:15,708
And do you agree with that, that this is

00:03:15,708 --> 00:03:18,708
pretty much just them replicating what's

00:03:18,708 --> 00:03:21,625
already happening,
and people are overhyping it?

00:03:21,625 --> 00:03:24,875
Or is there any point,
particularly within government,

00:03:25,000 --> 00:03:27,500
that people maybe shouldn't use it?
Because, you know,

00:03:27,500 --> 00:03:30,208
there are a lot of governments
right now, governments across the world.

00:03:30,208 --> 00:03:31,750
And really, I think federal employees

00:03:31,750 --> 00:03:36,458
cannot have TikTok
on their federal devices.

00:03:36,666 --> 00:03:40,500
So do you think there's any
any type of legitimate concern there

00:03:40,500 --> 00:03:54,083
or is it just something that is across
the board about social media?

00:03:54,166 --> 00:05:03,333
It's a good point that

00:05:03,416 --> 00:05:06,250
I don't find enough

00:05:06,250 --> 00:05:10,125
information any.

00:05:10,208 --> 00:05:21,416
So I just have

00:05:21,500 --> 00:05:56,083
James

00:05:56,166 --> 00:05:56,958
no, that never happened.

00:05:56,958 --> 00:06:36,708
Yeah there's that

00:06:36,750 --> 00:06:53,875
right

00:06:53,958 --> 00:07:10,666
right.

00:07:10,750 --> 00:07:16,041
You have all of that

00:07:16,125 --> 00:07:25,000
and all that.

00:07:25,083 --> 00:07:28,458
And at that time when

00:07:28,500 --> 00:07:34,708
I went to the public,
I want to do something, but

00:07:34,791 --> 00:07:37,000
every single aspect of that.

00:07:37,000 --> 00:07:41,291
So pretty much every single device.

00:07:41,375 --> 00:07:43,166
And there was an article,

00:07:43,166 --> 00:07:47,875
you had a government

00:07:47,958 --> 00:07:52,166
guy, right?

00:07:52,250 --> 00:08:02,416
And I

00:08:02,500 --> 00:08:03,416
so that's why

00:08:03,416 --> 00:08:10,708
I think that we have this now,

00:08:10,750 --> 00:08:13,625
that's
what all nations do, by the way.

00:08:13,625 --> 00:08:15,125
But go ahead.

00:08:15,125 --> 00:08:23,458
Yeah. Yes.

00:08:23,500 --> 00:08:33,750
So all

00:08:33,833 --> 00:08:37,125
one thing.

00:08:37,208 --> 00:08:38,250
Yeah.

00:08:38,250 --> 00:08:41,541
Yes, absolutely.

00:08:41,625 --> 00:08:43,250
Yeah.

00:08:43,250 --> 00:08:57,166
You can add let me say that.

00:08:57,250 --> 00:08:59,875
So look,
there's definitely the big five that

00:08:59,875 --> 00:09:04,708
see this as a threat, and it's not only now
a threat to social media as a whole.

00:09:04,750 --> 00:09:08,583
TikTok is now really
getting into Amazon's market,

00:09:08,583 --> 00:09:11,916
like people are now ordering stuff off it.
Like I noticed Mario, my son,

00:09:11,916 --> 00:09:12,750
he went on there

00:09:12,750 --> 00:09:15,458
and was getting something off of it. I was like,
why are you getting something off there?

00:09:15,458 --> 00:09:16,625
Who gets stuff off of TikTok?

00:09:16,625 --> 00:09:19,000
But apparently people get stuff
over TikTok.

00:09:19,000 --> 00:09:22,041
So now Amazon is worried, because they
figured out something:

00:09:22,250 --> 00:09:23,666
they could capture people's attention.

00:09:23,666 --> 00:09:26,750
For a lot of people now, it's
the second most used

00:09:26,791 --> 00:09:30,333
search engine, right behind YouTube.

00:09:30,500 --> 00:09:34,500
So now it's a search engine, and it's
now looking into manufacturing,

00:09:34,500 --> 00:09:35,625
sending out products.

00:09:35,625 --> 00:09:38,500
And I think what they're worried about
is that it

00:09:38,500 --> 00:09:42,333
creates a unique advantage
because China backs it,

00:09:42,416 --> 00:09:45,041
because China backs its businesses and,
of course, the U.S.

00:09:45,041 --> 00:09:46,291
government and business,
we don't work that way.

00:09:46,291 --> 00:09:49,291
So they're screaming and hollering,
but they didn't have any problem with that

00:09:49,416 --> 00:09:53,083
when they shipped
all the jobs overseas in order to have

00:09:53,083 --> 00:09:55,625
people get paid lower wages
so they could be more competitive.

00:09:55,625 --> 00:09:55,958
Right.

00:09:55,958 --> 00:09:58,916
So this ship has kind of sailed, right?

00:09:58,916 --> 00:10:01,916
And so, if the policy,
and I want to move on,

00:10:02,000 --> 00:10:06,291
if the issue is about data privacy,
let's have a real honest,

00:10:06,291 --> 00:10:11,125
transparent conversation, because I know
everybody on this panel believes the U.S.

00:10:11,250 --> 00:10:15,041
is way behind and needs to have
a transparent data policy,

00:10:15,125 --> 00:10:17,791
particularly for
how we do data across the board.

00:10:17,791 --> 00:10:18,583
Right.

00:10:18,583 --> 00:10:22,875
It needs to be done, because right
now social media companies have

00:10:22,875 --> 00:10:27,750
algorithms that they can't even fully understand,
and they just know they work, right?

00:10:27,750 --> 00:10:33,166
So we need to have transparency in
how these things are being used.

00:10:33,250 --> 00:10:36,375
And AI is going
to amplify that even more.

00:10:36,625 --> 00:10:38,958
By the way, artificial intelligence,
if you guys didn't know

00:10:38,958 --> 00:10:41,750
just for the audience,
artificial intelligence is not new.

00:10:41,750 --> 00:10:44,708
Facebook and others have been using it
for a very long time now.

00:10:44,708 --> 00:10:46,666
It's just amplifying its ability.

00:10:46,666 --> 00:10:51,000
And so now we really need to know
how the algorithms are working,

00:10:51,000 --> 00:10:53,333
and have it presented
in a transparent way.

00:10:53,333 --> 00:10:55,208
So if what this is about is that
we want to see

00:10:55,208 --> 00:10:58,500
was transparent with ticktock,
I want that applied to Facebook,

00:10:58,500 --> 00:11:02,583
I want that applied to Amazon,
I want that applied to everybody.

00:11:02,583 --> 00:11:07,458
So we understand what's happening
and what our basic data rights are.

00:11:07,500 --> 00:11:36,583
That's where I am.
I don't know. What do you guys think?

00:11:36,666 --> 00:11:37,250
That's right.

00:11:37,250 --> 00:11:39,333
That's right.

00:11:39,333 --> 00:11:47,083
Yeah, exactly.

00:11:47,166 --> 00:12:06,250
No, no.

00:12:06,250 --> 00:12:07,583
I mean, that never happens, right?

00:12:07,583 --> 00:12:09,333
They just make up narratives
and propaganda,

00:12:09,333 --> 00:12:10,916
which I think is a part of it, too. Right.

00:12:10,916 --> 00:12:13,666
So it's definitely bigger corporations,

00:12:13,666 --> 00:12:17,500
and they see a threat to what's happening,
really a few corporations,

00:12:17,500 --> 00:12:18,250
because there are not that many.

00:12:18,250 --> 00:12:21,250
We're talking about the big five,
but beyond that,

00:12:21,458 --> 00:12:25,250
I believe there's also, as we talked
about, the influence on young voters.

00:12:25,250 --> 00:12:25,458
Right.

00:12:25,458 --> 00:12:30,416
Moving to that point, the argument
I hear from some folks

00:12:30,416 --> 00:12:33,875
is that you have different things being presented

00:12:33,875 --> 00:12:37,500
to the Chinese young people than you do
to young people in America.

00:12:37,500 --> 00:12:40,875
So they're presenting, you know, I think

00:12:40,875 --> 00:12:44,166
what Americans usually like to consume,
which are things that aren't

00:12:44,166 --> 00:12:48,333
necessarily about education;
they're about entertainment, sex, drugs.

00:12:48,333 --> 00:12:51,375
That's what people gravitate towards.

00:12:51,375 --> 00:12:54,250
Right. And so, like, that's just a truth.
Otherwise it wouldn't be out there.

00:12:54,250 --> 00:12:57,125
And some say that the Chinese government
is doing more to empower their youth.

00:12:57,125 --> 00:12:58,083
And that's worrying.

00:12:58,083 --> 00:13:01,250
It's worrying people
that this is programming us, I guess,

00:13:01,250 --> 00:13:05,416
into something that China wants us to be.

00:13:05,500 --> 00:13:07,833
And people don't understand
what's happening.

00:13:07,833 --> 00:13:08,500
I don't know.

00:13:08,500 --> 00:13:16,500
I don't know
if we should give that credit or not.

00:13:16,583 --> 00:13:19,208
That's what I think
the argument is.

00:13:19,208 --> 00:13:21,875
They're not going to say it.

00:13:21,875 --> 00:13:36,250
I think that's what it is, though.

00:13:36,333 --> 00:14:11,375
That's how I feel, too.

00:14:11,458 --> 00:14:14,333
Yeah,

00:14:14,333 --> 00:14:24,833
absolutely.

00:14:24,916 --> 00:14:25,208
Yeah.

00:14:25,208 --> 00:14:29,166
When we think about it, like you and I, James,
have conversations about,

00:14:29,250 --> 00:14:33,500
how it's just so different for our kids
now because of social media,

00:14:33,541 --> 00:14:37,125
it's really changed
how they think, how they absorb the world.

00:14:37,166 --> 00:14:40,916
And I think we as parents,

00:14:41,000 --> 00:14:44,250
we have to be more intentional
about how they use these devices

00:14:44,250 --> 00:14:47,500
and making sure that they're having time
to understand that

00:14:47,708 --> 00:14:48,958
they have to put that down.

00:14:48,958 --> 00:14:51,541
They have to be in thought just to observe
what's around them.

00:14:51,541 --> 00:14:54,541
They have to work to not be influenced.

00:14:54,541 --> 00:14:55,625
It's really interesting.

00:14:55,625 --> 00:14:58,500
Like, there used to be dance
parties and stuff like that.

00:14:58,500 --> 00:15:00,125
You never see things like that.

00:15:00,125 --> 00:15:02,791
And the reason why is because people
don't want to get embarrassed.

00:15:02,791 --> 00:15:03,583
Somebody might have a

00:15:03,583 --> 00:15:04,916
TikTok about them

00:15:04,916 --> 00:15:08,791
24/7. You actually see,
and I've heard kids say this, like

00:15:08,791 --> 00:15:09,125
they will

00:15:09,125 --> 00:15:10,458
actually be in the middle

00:15:10,458 --> 00:15:13,041
of a basketball game
and, rather than try to play defense,

00:15:13,041 --> 00:15:15,708
they will be scared to get dunked on
because they don't want that to be used

00:15:15,708 --> 00:15:18,625
against them over and over and over again.
So it's very interesting.

00:15:18,625 --> 00:15:19,750
I think we've got to have

00:15:19,750 --> 00:15:23,875
a policy conversation,
but we also need a perspective as people

00:15:23,875 --> 00:15:25,958
how are we watching our kids?

00:15:25,958 --> 00:15:27,333
This is not a TikTok thing.

00:15:27,333 --> 00:15:29,708
TikTok is just the latest trend.
Something else is going to replace it.

00:15:29,708 --> 00:15:34,083
We have to figure out how we think
about social media and how it interacts

00:15:34,083 --> 00:15:35,958
with us, particularly with A.I. coming.

00:15:35,958 --> 00:15:36,625
What are your thoughts

00:15:36,625 --> 00:15:40,708
in terms of what we need to do as parents
and the perspective we should have there?

00:15:40,708 --> 00:16:48,875
as we think about social media
and how it influences our kids.

00:16:48,958 --> 00:16:50,000
the whole world's going to end.

00:16:50,000 --> 00:16:56,916
Yes, you guys are horrible.

00:16:57,000 --> 00:17:01,958
Yeah, it was too black.

00:17:02,041 --> 00:17:18,916
Yeah.

00:17:19,000 --> 00:17:21,041
Yep. Yeah.

00:17:21,041 --> 00:17:22,500
People try to do bad dancing.

00:17:22,500 --> 00:18:21,166
All that TV go through it.

00:18:21,250 --> 00:18:28,875
Yes, Yes. Yep.

00:18:28,958 --> 00:18:30,583
And that's a good point on that.

00:18:30,583 --> 00:18:31,875
And we're not going to go
into a deep dive.

00:18:31,875 --> 00:18:33,000
We have a whole nother show on this.

00:18:33,000 --> 00:18:55,250
But go ahead.

00:18:55,333 --> 00:19:41,833
Yeah.

00:19:41,916 --> 00:19:54,083
Yeah,

00:19:54,166 --> 00:19:55,666
yeah,

00:19:55,666 --> 00:19:57,875
yeah, yeah. Go ahead.

00:19:57,875 --> 00:20:17,083
Go ahead.

00:20:17,166 --> 00:20:24,208
Yes, exactly.

00:20:24,291 --> 00:20:43,708
Yep, exactly.

00:20:43,791 --> 00:21:07,125
Yep. It's real.

00:21:07,125 --> 00:21:10,291
And another good point
is that yes, beyond

00:21:10,291 --> 00:21:14,291
letting the whole world
into your living room,

00:21:14,375 --> 00:21:18,083
it also presents a challenge in that
when you and I played, and I don't know

00:21:18,083 --> 00:21:20,708
if you remember this, we played Contra.
That shows how really old we are.

00:21:20,708 --> 00:21:23,458
So this was, like, a long time ago.

00:21:23,458 --> 00:21:27,500
But we used to play this on the original
Nintendo, Contra. Yes.

00:21:27,500 --> 00:21:28,750
super old.

00:21:28,750 --> 00:21:30,250
You've got to go get that in a retro game.

00:21:30,250 --> 00:21:32,583
So I'm really dating myself,
but I'm making a point here.

00:21:32,583 --> 00:21:37,208
There used to be a beginning
and an end to the game, right?

00:21:37,291 --> 00:21:37,833
Yeah, it was.

00:21:37,833 --> 00:21:40,416
There was
a beginning, a middle, and an end.

00:21:40,416 --> 00:21:41,166
You beat the game.

00:21:41,166 --> 00:21:43,625
It might take you 20 hours,
but you beat it.

00:21:43,625 --> 00:21:46,041
Right now the games are different.

00:21:46,041 --> 00:21:47,875
The games are more like a casino, right?

00:21:47,875 --> 00:21:49,666
So you also have something else
you got to worry about.

00:21:49,666 --> 00:21:51,958
The level of addiction
that it could create

00:21:51,958 --> 00:21:54,125
is that now it's not just about,

00:21:54,125 --> 00:21:57,125
you know, just playing the game
and just learning the basics of the game.

00:21:57,208 --> 00:22:01,083
Now it's like a never ending loop
and a dopamine hit

00:22:01,291 --> 00:22:04,666
that they're getting, and kids,
you know, can't handle that.

00:22:04,708 --> 00:22:05,791
Adults can barely handle that.

00:22:05,791 --> 00:22:08,083
Little kids can't handle it at all.

00:22:08,083 --> 00:22:10,375
So this has really,
I think, presented a challenge.

00:22:10,375 --> 00:22:13,250
And so TikTok has just been the latest

00:22:13,250 --> 00:22:16,416
trend in figuring out how to capture
the attention.

00:22:16,416 --> 00:22:17,833
Somebody else is coming. Right.

00:22:17,833 --> 00:22:20,458
So it's not like
this is a unique China type of thing.

00:22:20,458 --> 00:22:22,500
This is not China threatening us. This is

00:22:22,500 --> 00:22:26,041
we have to understand this technology
and with artificial intelligence.

00:22:26,208 --> 00:22:29,166
And by the way, I'm
a fan of innovation.

00:22:29,166 --> 00:22:31,291
I'm not anti artificial intelligence.

00:22:31,291 --> 00:22:35,916
I'm pro transparency, I'm pro regulation,
and then I'm pro innovation.

00:22:35,916 --> 00:22:38,708
Go crazy with it
because we need to understand

00:22:38,708 --> 00:22:43,208
how these things are working, how they're
making decisions and what they're doing.

00:22:43,208 --> 00:22:45,125
Because if we don't
have any understanding,

00:22:45,125 --> 00:22:47,250
it makes it that much harder to parent.

00:22:47,250 --> 00:22:49,458
So that's my perspective on parenting,
on that

00:22:49,458 --> 00:23:05,625
more than anything
else on that part. Go ahead.

00:23:05,708 --> 00:23:08,375
Yeah,

00:23:08,375 --> 00:23:10,208
I think a large percentage of parents
can't pay attention all the time.

00:23:10,208 --> 00:23:46,708
And that's also part of the problem.

00:23:46,791 --> 00:24:18,000
Yes. Yep,

00:24:18,083 --> 00:25:06,208
yep, yep.

00:25:06,291 --> 00:25:31,000
No, no. Yep.

00:25:31,083 --> 00:25:38,708
That's right.

00:25:38,791 --> 00:25:47,916
Yeah.

00:25:48,000 --> 00:26:19,375
Yep. No.

00:26:19,458 --> 00:26:29,958
Yep, yep.

00:26:30,000 --> 00:26:32,791
That actually makes me want to go the other way
and do this real quick today.

00:26:32,791 --> 00:26:34,708
Don't let me go this way.

00:26:34,708 --> 00:28:10,000
But I get every yep.

00:28:10,083 --> 00:28:14,166
There are companies.

00:28:14,250 --> 00:28:32,958
Yeah,

00:28:33,000 --> 00:28:56,708
right.

00:28:56,791 --> 00:29:02,375
Well no, no, they are,
they just...

00:29:02,458 --> 00:29:05,458
Well, I actually want to transition.

00:29:05,541 --> 00:29:07,791
This is a good transition to talk about.

00:29:07,791 --> 00:29:10,083
But as James says,
they're hacking your brain.

00:29:10,083 --> 00:29:12,708
But this is

00:29:12,708 --> 00:29:14,250
an old science, right?

00:29:14,250 --> 00:29:16,416
So I also recently went to Europe as well.

00:29:16,416 --> 00:29:19,416
So we're on, I guess,
this Europe disruption trip,

00:29:19,541 --> 00:29:23,125
and one thing that was very interesting

00:29:23,208 --> 00:29:27,333
was when I went to the Colosseum, right,
and I got this tour and the woman

00:29:27,333 --> 00:29:31,333
explained to me it was actually never
called the Colosseum in Rome.

00:29:31,333 --> 00:29:31,958
Right.

00:29:31,958 --> 00:29:36,375
In Rome it was called the Flavian Amphitheatre. Why?
Because it was about propaganda.

00:29:36,375 --> 00:29:40,291
It was about promoting Rome
and then putting fear in those

00:29:40,291 --> 00:29:41,291
who would be against Rome.

00:29:41,291 --> 00:29:43,458
So everything they did was a performance.

00:29:43,458 --> 00:29:46,083
Every time they executed
somebody, there was a message to be sent.

00:29:46,083 --> 00:29:47,833
They would let in everybody,

00:29:47,833 --> 00:29:50,833
basically poor people from all around
who didn't have anything else to do.

00:29:51,041 --> 00:29:52,458
It was their original football stadium.

00:29:52,458 --> 00:29:56,166
They entertained people
and got them excited about Rome

00:29:56,375 --> 00:29:57,416
and what Rome was doing.

00:29:57,416 --> 00:29:59,416
Even if what Rome was doing was horrible.

00:29:59,416 --> 00:30:01,375
They made people feel as if this is great.

00:30:01,375 --> 00:30:03,375
I'm glad to be a part of Rome.

00:30:03,375 --> 00:30:06,375
I believe that misinformation

00:30:06,458 --> 00:30:09,750
is an issue that's been a problem,
of course, in America for a long time.

00:30:09,875 --> 00:30:12,250
We've talked about propaganda
on the show before

00:30:12,250 --> 00:30:15,625
and how it's been
the most effective thing that's been used

00:30:15,625 --> 00:30:19,708
to really prop up parts of our economy
that don't help people.

00:30:19,708 --> 00:30:23,791
But propaganda is so strong
in this country that it works.

00:30:23,875 --> 00:30:27,708
And I believe that people see TikTok
as a propaganda machine

00:30:27,875 --> 00:30:31,000
the other way, in terms of it being
able to get out more information

00:30:31,125 --> 00:30:34,041
that may not always be aligned
with our interests.

00:30:34,041 --> 00:30:35,166
That may not always be.

00:30:35,166 --> 00:30:36,666
Sometimes
they may be right too, by the way,

00:30:36,666 --> 00:30:41,000
but it may not be something
that we want to hear in terms of the US

00:30:41,041 --> 00:30:43,625
government, in terms of many
that are in power,

00:30:43,625 --> 00:30:46,333
because we've always had one
dominant perspective

00:30:46,333 --> 00:30:49,166
and now we get to hear other voices
and that threatens people.

00:30:49,166 --> 00:30:51,583
And TikTok is one of those places.

00:30:51,583 --> 00:30:54,916
But so is every place. Where TikTok,
I think, has been more effective

00:30:54,916 --> 00:30:58,375
is at getting out this short-form content,
and they don't have any control over that.

00:30:58,375 --> 00:31:00,250
So they are hacking our brains

00:31:00,250 --> 00:31:02,041
in their own ways
and now everybody's getting this

00:31:02,041 --> 00:31:04,541
moral righteousness about them,
but all of it is wrong.

00:31:04,541 --> 00:31:05,916
So let's have a transparent approach

00:31:05,916 --> 00:31:09,958
to how we talk about misinformation,
particularly with the use of technology.

00:31:09,958 --> 00:31:11,208
I'm all for it.

00:31:11,208 --> 00:31:13,208
But as James says, I'm not for this idea that

00:31:13,208 --> 00:31:40,041
we're just going to selectively ban TikTok
like it's not a problem elsewhere.

00:31:40,125 --> 00:31:43,625
It's definitely true.

00:31:43,708 --> 00:32:04,541
Exactly.

00:32:04,625 --> 00:32:20,500
Yeah,

00:32:20,583 --> 00:32:50,333
that's right.

00:32:50,416 --> 00:32:53,416
Yeah.

00:32:53,458 --> 00:32:55,458
For example, Trump
lost the election,

00:32:55,458 --> 00:32:59,333
but there are still about 40% of people
that believe otherwise,

00:32:59,416 --> 00:32:59,708
right?

00:32:59,708 --> 00:33:05,416
I'd be like,

00:33:05,500 --> 00:33:08,958
yeah, I'd be like, Yeah,
but nobody talks about that.

00:33:08,958 --> 00:33:17,291
Like, it's not a it's

00:33:17,375 --> 00:33:25,666
to prove their point.

00:33:25,750 --> 00:33:29,583
Yeah.

00:33:29,666 --> 00:34:03,416
Yes, exactly. Yes.

00:34:03,500 --> 00:34:06,708
And they are.
I mean, that's a true concern.

00:34:06,750 --> 00:34:16,208
Yeah that's

00:34:16,208 --> 00:34:33,041
right.

00:34:33,125 --> 00:34:44,875
Of course they did.

00:34:44,958 --> 00:34:46,208
That's right.

00:34:46,208 --> 00:34:49,083
And the algorithms have made
that problem worse,

00:34:49,083 --> 00:34:51,000
which is why we don't need to ban TikTok.

00:34:51,000 --> 00:34:54,000
We need to have regulation
that's transparent,

00:34:54,125 --> 00:34:55,583
that's going to help everybody.

00:34:55,583 --> 00:35:00,916
And so I hope for an honest conversation.

00:35:01,000 --> 00:35:01,750
Well, how it like

00:35:01,750 --> 00:35:05,375
I mean, as I said,
we should just make sure we are.

00:35:05,458 --> 00:35:13,750
yeah, yeah, yeah. I forgot. Yeah,

00:35:13,833 --> 00:35:14,291
yeah.

00:35:14,291 --> 00:35:16,625
But like,
I want you to really think about this.

00:35:16,625 --> 00:35:21,250
So when the president of China
came here to visit and he

00:35:21,250 --> 00:35:25,208
of course, met with the president,
but you know who else was in that meeting?

00:35:25,291 --> 00:35:26,666
Apple.

00:35:26,666 --> 00:35:29,458
The head of Apple, Elon
Musk, and Blackstone,

00:35:29,458 --> 00:35:31,708
one of the largest investment
firms in the world.

00:35:31,708 --> 00:35:32,791
They were there.

00:35:32,791 --> 00:35:37,000
So when I say this, like, I just say,

00:35:37,083 --> 00:35:38,625
there you go, and ask why.

00:35:38,625 --> 00:35:40,708
And that's why I also call bullshit.

00:35:40,708 --> 00:35:43,708
When people say, like, we need to ban
TikTok because it's a national security threat.

00:35:43,875 --> 00:35:45,875
Like, we're tied in with China.

00:35:45,875 --> 00:35:49,208
That ship sailed 20 or 30 years ago
in a lot of different ways.

00:35:49,416 --> 00:35:53,458
And so when people say this, I think,
and this is my perspective,

00:35:53,458 --> 00:35:57,875
it's to get the masses riled up about China
because it's something easy to see.

00:35:57,875 --> 00:35:59,750
They see their kids dancing
on TikTok,

00:35:59,750 --> 00:36:02,458
and they say that the Chinese company
influences them.

00:36:02,458 --> 00:36:06,333
It's not the Chinese company.
It's the fucking technology

00:36:06,333 --> 00:36:07,416
and how it's being used.

00:36:07,416 --> 00:36:07,833
All right.

00:36:07,833 --> 00:36:09,541
And so what you've got to do is figure out

00:36:09,541 --> 00:36:11,875
how we have transparent
use of the technology.

00:36:11,875 --> 00:36:17,708
Sorry. Go ahead. Go ahead.

00:36:17,791 --> 00:36:19,583
That's even better.

00:36:19,583 --> 00:36:37,750
That's a mic drop moment. Yes.

00:36:37,833 --> 00:36:43,875
Yeah,

00:36:43,958 --> 00:36:44,166
that's

00:36:44,166 --> 00:39:11,958
right. No.

00:39:12,041 --> 00:39:12,625
So, you know,

00:39:12,625 --> 00:39:16,250
when I think about this
in terms of where we're at, like

00:39:16,333 --> 00:39:19,541
the problem with the hacking of the brain,
too, is this.

00:39:19,625 --> 00:39:22,666
Now with artificial intelligence,
we have to also understand

00:39:22,666 --> 00:39:26,041
it's going to be harder and harder
to tell what's real and what's not real.

00:39:26,041 --> 00:39:28,666
It's already pretty difficult right now.

00:39:28,666 --> 00:39:30,291
You add in artificial intelligence.

00:39:30,291 --> 00:39:33,375
Now, you look at some stuff. Right now,
I can produce something

00:39:33,625 --> 00:39:36,041
with James out there
saying that he loves Trump

00:39:36,041 --> 00:39:40,166
and saying you should vote for Trump
every single time. I can.

00:39:40,250 --> 00:39:42,166
Yeah. All right. You do that.

00:39:42,166 --> 00:39:44,083
You like that, too?

00:39:44,083 --> 00:39:46,000
Tunde hates communism.

00:39:46,000 --> 00:39:47,875
I could show him saying, you know, he loves communism,

00:39:47,875 --> 00:39:49,583
He's against capitalism, right?

00:39:49,583 --> 00:39:51,583
He's a financial planner.
That wouldn't be good for his career.

00:39:51,583 --> 00:39:54,625
But I can now produce
something that's pretty damn accurate,

00:39:54,875 --> 00:39:59,750
and there's no way of officially verifying
what's true and what's not true.

00:39:59,875 --> 00:40:03,000
And we really need policy around that
that's transparent,

00:40:03,208 --> 00:40:04,666
because now

00:40:04,666 --> 00:40:06,500
we're not only talking
about entertainment.

00:40:06,500 --> 00:40:10,000
It's one thing to not be able to tell
what's true or what's not true

00:40:10,083 --> 00:40:13,000
when I'm in the middle of entertainment,
because I know I'm being entertained.

00:40:13,000 --> 00:40:16,750
It's a whole nother thing
when you're talking about, you know,

00:40:16,750 --> 00:40:17,958
political discourse,

00:40:17,958 --> 00:40:21,500
when you're talking about taking someone's
image and voice and likeness.

00:40:21,750 --> 00:40:24,708
And none of these things
are actually regulated in any type of way.

00:40:24,708 --> 00:40:28,125
And there seems to be no rush
for the U.S.

00:40:28,125 --> 00:40:28,875
to do this.

00:40:28,875 --> 00:40:29,625
And so that's why

00:40:29,625 --> 00:40:33,000
I really get kind of pissed off with this
TikTok argument, because it's all fake,

00:40:33,083 --> 00:40:34,208
it's all phony.

00:40:34,208 --> 00:40:38,333
It's not about the real issue, which is
actually having transparent policy.

00:40:38,416 --> 00:40:39,375
That's going to actually help

00:40:39,375 --> 00:40:42,791
protect us, move innovation forward,
and most of all help people.

00:40:43,041 --> 00:40:45,166
And so that's the conversation
I want to have.

00:40:45,166 --> 00:40:46,333
I don't want to have the conversation

00:40:46,333 --> 00:40:48,500
about banning TikTok,
because it's not a real conversation.

00:40:48,500 --> 00:40:50,208
It's a straw man conversation.

00:40:50,208 --> 00:40:53,750
I want the conversation to be
how are we actually making sure

00:40:53,750 --> 00:40:56,833
that we have good policy
to help us out in the long run?

00:40:57,000 --> 00:40:57,916
That's what I want to have.

00:40:57,916 --> 00:41:00,958
And so as we get to this,
this feels like a political agenda.

00:41:00,958 --> 00:41:03,583
This is the final point
I want to talk about here.

00:41:03,583 --> 00:41:07,083
We know,
when the president of China came here,

00:41:07,083 --> 00:41:09,875
what he did: he met with the president,
and he met with the real presidents.

00:41:09,875 --> 00:41:10,208
Right.

00:41:10,208 --> 00:41:14,875
He met with the leaders of the biggest
companies, tech companies.

00:41:14,958 --> 00:41:18,500
And they rely on China
quite a bit to do quite a bit of business.

00:41:18,666 --> 00:41:21,958
And so, as I said earlier, we're tied in.

00:41:22,000 --> 00:41:25,416
And what really concerns me
is that there's also control

00:41:25,416 --> 00:41:29,208
by companies here about what we can see,
even about China.

00:41:29,208 --> 00:41:31,666
That's why
I also think this is not real, right?

00:41:31,666 --> 00:41:33,500
So, like I said.

00:41:33,500 --> 00:41:38,333
So what I also think about
is this: Jon Stewart

00:41:38,416 --> 00:41:41,208
had a great show on Apple TV, right?

00:41:41,208 --> 00:41:45,500
But he actually got canceled because
one of the topics he wanted to talk about

00:41:45,583 --> 00:41:50,250
is that
he wanted to talk about AI and China.

00:41:50,416 --> 00:41:51,750
He wanted to have a conversation.

00:41:51,750 --> 00:42:04,208
Essentially, Apple was like, nah, bro.

00:42:04,291 --> 00:43:09,583
Yes. Yeah.

00:43:09,750 --> 00:43:12,750
I actually think part
of the reason why, and sorry to interrupt you,

00:43:12,750 --> 00:43:15,250
but I think it's an important point,
what you just said.

00:43:15,250 --> 00:43:19,333
I think part of the reason
why corporations and others are also

00:43:19,333 --> 00:43:25,333
worried about social media, with TikTok,
is that it actually is not TikTok by itself.

00:43:25,333 --> 00:43:27,000
This generation tends to be a lot

00:43:27,000 --> 00:43:30,500
less like, okay,
we just believe whatever corporations tell us.

00:43:30,500 --> 00:43:33,583
In fact, they're much more skeptical
because they've seen

00:43:33,583 --> 00:43:36,750
what's happened, frankly, to us
and to other generations.

00:43:36,833 --> 00:43:38,333
And I think people
are trying to figure out a way

00:43:38,333 --> 00:43:41,333
how do we control the message
to get them back in our corner?

00:43:41,583 --> 00:43:44,291
And I think they fear that,
because it's not working,

00:43:44,291 --> 00:43:46,916
and TikTok allows them
to share their perspective.

00:43:46,916 --> 00:43:49,750
And this is outside of the
Chinese government.

00:43:49,750 --> 00:43:50,875
People are sharing their perspectives

00:43:50,875 --> 00:43:53,166
about how they feel, like
people talk about it

00:43:53,166 --> 00:43:54,916
and people could never talk about things.
Like, quiet

00:43:54,916 --> 00:43:57,541
quitting
is one of the most trending things ever

00:43:57,541 --> 00:43:59,375
and people are like,
What the hell is going on?

00:43:59,375 --> 00:43:59,833
It's that,

00:43:59,833 --> 00:44:02,125
you know, young people are seeing the gig is up,
they're seeing it,

00:44:02,125 --> 00:44:03,583
they're seeing that
things aren't working for them.

00:44:03,583 --> 00:44:06,916
So I actually think there's some hope
with them.

00:44:06,916 --> 00:44:10,541
But there are issues; they've got to keep at it,
stay and evolve and not get discouraged.

00:44:10,541 --> 00:44:12,166
But they're really, like,

00:44:12,166 --> 00:44:14,875
more than any generation
I've seen really taking a stance.

00:44:14,875 --> 00:44:18,750
You've seen it with the intense
unionization, like you've seen

00:44:18,833 --> 00:44:21,791
workers at Microsoft
actually just form a union.

00:44:21,791 --> 00:44:23,500
In some places,
people are actually doing things.

00:44:23,500 --> 00:44:26,625
The unions
have a very high approval rating.

00:44:26,791 --> 00:44:29,375
They're much higher with that generation
than with any other generation.

00:44:29,375 --> 00:44:32,750
So I do believe that's part of the fear,
is that now this information

00:44:32,750 --> 00:44:35,833
is being shared in a way where people
are like using it for activation.

00:44:35,833 --> 00:44:38,833
They're like, no, no, we want you
to use social media

00:44:39,041 --> 00:44:40,250
to entertain and be distracted.

00:44:40,250 --> 00:44:42,666
But we don't want you talking about things
like quiet quitting.

00:44:42,666 --> 00:45:37,041
We won't let you talk about unionization.

00:45:37,125 --> 00:45:55,458
Yep. Yeah, we were.

00:45:55,458 --> 00:46:19,166
I agree.

00:46:19,250 --> 00:46:20,500
Yeah.

00:46:20,500 --> 00:46:22,375
People used to smoke on airplanes, smoke in schools.

00:46:22,375 --> 00:46:23,250
It'd be nothing.

00:46:23,250 --> 00:46:24,500
Smoke in bars, all that stuff.

00:46:24,500 --> 00:46:29,291
Nobody does that anywhere.

00:46:29,375 --> 00:47:17,625
That's a great point.

00:47:17,708 --> 00:47:45,458
Yeah. Yes,

00:47:45,541 --> 00:48:10,458
they would have thought.

00:48:10,541 --> 00:48:11,541
Yep. It's

00:48:11,541 --> 00:48:58,500
a misdirection. Yep.

00:48:58,541 --> 00:48:59,083
Yeah, I agree.

00:48:59,083 --> 00:49:01,541
I think, as my final point,
I would tell people, don't

00:49:01,541 --> 00:49:02,791
get distracted.

00:49:02,791 --> 00:49:05,458
Understand the goal is to distract you.

00:49:05,458 --> 00:49:07,125
We all get distracted.

00:49:07,125 --> 00:49:10,958
The goal is to make sure that
you understand that we all are irrational.

00:49:10,958 --> 00:49:13,333
As Robert
Greene, who's been on the show, said.

00:49:13,333 --> 00:49:16,375
And the first step to being rational
is to understand that we're irrational.

00:49:16,375 --> 00:49:17,708
We can come from that point.

00:49:17,708 --> 00:49:21,416
Then you can step off of social media and
realize, like, what am I taking in here?

00:49:21,416 --> 00:49:23,083
And we got to tell our kids that too.

00:49:23,083 --> 00:49:24,458
And critical thinking

00:49:24,458 --> 00:49:28,041
and being really just sober minded about
this is going to be really our only hope.

00:49:28,125 --> 00:49:29,625
And I have some faith, Tunde.

00:49:29,625 --> 00:49:32,833
I know you're kind of a pessimist here,
but I actually think beyond smoking,

00:49:33,041 --> 00:49:35,916
we've actually changed some
with how people eat.

00:49:35,916 --> 00:49:39,833
People know a lot more about what they're
putting in their body, what's happening.

00:49:39,916 --> 00:49:42,875
There's still more work to be done,
but a lot more people are a lot

00:49:42,875 --> 00:49:47,583
more informed than they used to be
and that's the reason why I like it.

00:49:47,666 --> 00:49:48,041
Right?

00:49:48,041 --> 00:49:50,416
Yeah. Yes, there's that does that. Yeah.

00:49:50,416 --> 00:49:53,625
Like it's the reason.

00:49:53,708 --> 00:49:56,541
Yeah,

00:49:56,541 --> 00:49:57,125
yeah, yeah.

00:49:57,125 --> 00:50:08,125
It that too.

00:50:08,208 --> 00:50:10,333
Yeah.

00:50:10,333 --> 00:50:11,625
I got a good drug for them too.

00:50:11,625 --> 00:50:13,875
It's called exercising and eating less.

00:50:13,875 --> 00:50:36,833
That works, but

00:50:36,916 --> 00:50:42,541
yeah,

00:50:42,625 --> 00:50:45,500
I agree.

00:50:45,500 --> 00:50:45,958
I agree.

00:50:45,958 --> 00:51:35,458
I mean, I agree.

00:51:35,541 --> 00:51:36,458
No, no, no.

00:51:36,458 --> 00:51:39,541
Because it becomes,
it becomes a habit in you.

00:51:39,583 --> 00:51:41,541
What you have to do is work
your way out to a new habit.

00:51:41,541 --> 00:51:44,000
But that's a great conversation that we'll have another time.

00:51:44,000 --> 00:52:10,625
Yeah. Yes.

00:52:10,708 --> 00:52:13,583
Yeah.

00:52:13,583 --> 00:52:15,708
So, let's wrap

00:52:15,708 --> 00:52:18,708
this up with a final point, because I know
we've been talking about this for a while.

00:52:18,916 --> 00:52:20,041
This is the point.

00:52:20,041 --> 00:52:22,250
This is a whole other one.
This is a whole other conversation. Right.

00:52:22,250 --> 00:52:26,250
But here's the major point
that I want the listeners to understand.

00:52:26,291 --> 00:52:30,875
The European Union is not perfect, but
they're ahead of us on actually regulating

00:52:31,125 --> 00:52:32,291
not only our food and our bodies,

00:52:32,291 --> 00:52:33,041
but now

00:52:33,041 --> 00:52:36,041
I think what's even more consequential,
because I think it affects all of that, is

00:52:36,166 --> 00:52:41,208
really technology: A.I., blockchain,
all that stuff, they have policies for.

00:52:41,208 --> 00:52:42,583
They have a whole A.I.

00:52:42,583 --> 00:52:44,833
policy
for how it can be used with surveillance,

00:52:44,833 --> 00:52:47,833
how it could be used with social media,
all those things, because it matters.

00:52:47,958 --> 00:52:49,000
And so we have to wake up.

00:52:49,000 --> 00:52:52,416
The real thing is not China,
the real thing is having clear policy

00:52:52,416 --> 00:52:56,583
to protect us and protect others
from hacking our brain.

00:52:56,583 --> 00:52:57,916
Because you can be hacked.

00:52:57,916 --> 00:52:59,791
We can all be hacked because we're human.

00:52:59,791 --> 00:53:02,250
I hope that's
what the takeaway is from this.

00:53:02,250 --> 00:53:04,500
But until next time.

00:53:04,583 --> 00:53:20,458
Yes. Yes.

00:53:20,541 --> 00:53:30,750
No. Yeah, just the thought.

00:53:30,750 --> 00:53:33,750
But it's good.

00:53:33,916 --> 00:53:34,875
You're never that.

00:53:34,875 --> 00:53:52,166
Go ahead.

00:53:52,250 --> 00:53:55,250
Well,

00:53:55,416 --> 00:53:57,250
yeah, well, that's the easy way to go.

00:53:57,250 --> 00:53:58,875
And then, you know.

00:53:58,875 --> 00:54:01,958
Yeah, but meanwhile,
all the money is being made on the inside.

00:54:02,125 --> 00:54:02,916
Don't believe the hype.

00:54:02,916 --> 00:54:05,541
So until the next time, we'll see you,

00:54:05,541 --> 00:54:05,958
disruptors.

00:54:05,958 --> 00:54:09,208
Appreciate you

00:54:09,291 --> 00:54:10,083
You know. Yes, please,

00:54:10,083 --> 00:54:12,541
follow the Call It Like I See It podcast.

00:54:12,541 --> 00:54:17,791
Make sure you go join and subscribe. As always,
they always have good topics.

00:54:17,791 --> 00:54:26,000
And I have to come back on
the show as well.

00:54:26,083 --> 00:54:26,583
Thank you.

HOSTED BY

ROB RICHARDSON

CONNECT WITH THE HOST

ROB RICHARDSON

Entrepreneur & Keynote Speaker

Rob Richardson is the host of the Disruption Now Podcast and the owner of DN Media Agency, a full-service digital marketing and research company. He has appeared on MSNBC and America This Week, and is a weekly contributor to Roland Martin Unfiltered.

MORE WAYS TO WATCH

Serious about change? Subscribe to our podcasts.