
00:00:00,542 --> 00:00:02,544
welcome to Midwest Con 2023.

00:00:02,544 --> 00:00:05,088
I'm Rob Richardson, the CEO of

00:00:05,088 --> 00:00:08,299
Disrupt Art and also, of course,
the host of Disruption now podcast.

00:00:08,341 --> 00:00:10,760
Honored to be here with you.
Thank you for joining us.

00:00:10,760 --> 00:00:14,180
We are here at the Digital Futures
Building at the University

00:00:14,180 --> 00:00:17,267
of Cincinnati
taping live for Midwest Con 2023.

00:00:17,475 --> 00:00:21,938
And with me is Kalista Zacharias,
who is the CEO of Spark Seeker.

00:00:22,147 --> 00:00:25,275
And really the goal of Spark Seeker,
and I'm going to summarize it to the best

00:00:25,275 --> 00:00:26,067
I can.

00:00:26,067 --> 00:00:29,904
The goal is to really make sure
that we can have intentional engagement

00:00:29,988 --> 00:00:34,075
using technology, and it's through an app
that she does that, but she also cares

00:00:34,075 --> 00:00:37,662
very deeply about the future of technology
and how we're intentional about

00:00:37,912 --> 00:00:41,666
using that for humanity
and for good and for transparency.

00:00:41,666 --> 00:00:56,431
And that's why it's an honor
to have Kalista on the show.

00:00:56,514 --> 00:00:57,932
Kalista, welcome.

00:00:57,932 --> 00:00:59,684
Aw. You had me. You're hired.

00:00:59,684 --> 00:01:00,894
Yes. Hi.

00:01:00,894 --> 00:01:02,145
Hello.

00:01:02,145 --> 00:01:02,854
That's good.

00:01:02,854 --> 00:01:04,522
You know, we tried. We tried.

00:01:04,522 --> 00:01:06,107
Thank you for having me, you know.

00:01:06,107 --> 00:01:07,317
Appreciate having you on.

00:01:07,317 --> 00:01:10,028
When we met,
there was an instant connection.

00:01:10,028 --> 00:01:13,406
I think we were very similar
in our pursuits of

00:01:13,406 --> 00:01:16,367
why we're in technology,
why we're doing this.

00:01:16,367 --> 00:01:18,953
But, you know,
the audience needs to learn it, too.

00:01:18,953 --> 00:01:22,207
So tell me if you can,
tell me what motivates

00:01:22,248 --> 00:01:25,251
Kalista day in and day out to get up.

00:01:25,460 --> 00:01:27,253
It's not just about entrepreneurship,
but generally.

00:01:27,253 --> 00:01:30,256
I think I told you a little bit
about my background and history,

00:01:30,256 --> 00:01:31,007
which I don't know

00:01:31,007 --> 00:01:34,010
how much of it can go into here,
but I'm just going to go with it.

00:01:34,094 --> 00:01:35,303
I think,

00:01:35,303 --> 00:01:39,474
you know, I grew up homeless, on
and off the streets, and I think there was

00:01:39,474 --> 00:01:42,393
a part of me that always wanted to do good
when I could,

00:01:42,393 --> 00:01:45,271
and I never knew
what that opportunity was going to be.

00:01:45,355 --> 00:01:48,441
And I was
looking to really have a shift in my life.

00:01:48,441 --> 00:01:51,277
I wanted to get out of that rat race,
that hamster wheel,

00:01:51,277 --> 00:01:54,531
and create something
where I could be more intentional

00:01:54,531 --> 00:01:56,866
with my time and resources
and find a way to give back.

00:01:56,866 --> 00:02:01,371
And I kind of stumbled upon this issue
because I had actually gone to these

00:02:01,371 --> 00:02:05,041
social platforms looking for connection,
looking for community,

00:02:05,125 --> 00:02:08,044
looking for ways
that I can join existing organizations

00:02:08,044 --> 00:02:10,880
and start lending my time
and my energy to give back. Right?

00:02:10,880 --> 00:02:14,467
And I saw the disinformation
and the hate speech

00:02:14,467 --> 00:02:18,012
and the systemic divide
and the disinformation,

00:02:18,096 --> 00:02:20,181
the behavior, manipulation
and all that stuff.

00:02:20,181 --> 00:02:22,725
And it just kind of
took it on its own life.

00:02:22,725 --> 00:02:26,396
And what started out as a frustration
eventually became something

00:02:26,396 --> 00:02:31,693
where it was like, Hey, my my whole
passion is about uniting people

00:02:31,693 --> 00:02:35,864
and bringing the power of people together
to do social good.

00:02:35,947 --> 00:02:39,909
And how about we do that
while solving a problem, right?

00:02:39,993 --> 00:02:42,203
And so we were able to marry the two.

00:02:42,203 --> 00:02:45,290
So before we dive in, we're going to talk a lot
about Spark Seeker and your vision

00:02:45,290 --> 00:02:48,418
with technology,
but I want to know what motivates you.

00:02:48,418 --> 00:02:51,337
Because I think the most important

00:02:51,379 --> 00:02:52,589
ingredient in any

00:02:52,589 --> 00:02:56,467
organization and startup is the founder,
at least initially.

00:02:56,634 --> 00:03:00,013
And it's tied to their story,
their experience and

00:03:00,096 --> 00:03:01,431
their why.

00:03:01,431 --> 00:03:05,977
So I would ask: what is your why?

00:03:06,019 --> 00:03:07,604
Freedom. Freedom.

00:03:07,604 --> 00:03:09,439
What does that mean

00:03:09,439 --> 00:03:10,273
to you?

00:03:10,273 --> 00:03:12,108
To me,

00:03:12,108 --> 00:03:15,695
freedom means that as human beings,
we get to maintain a sense of autonomy

00:03:15,737 --> 00:03:21,409
over the way
we think, over what we get to do.

00:03:21,492 --> 00:03:26,164
And I mean, obviously
within legal parameters, but,

00:03:26,247 --> 00:03:29,209
you know, you want to have that.

00:03:29,250 --> 00:03:31,377
I think, you know, thanks to our ancestors,

00:03:31,377 --> 00:03:34,172
these are hard-won freedoms we have,

00:03:34,172 --> 00:03:36,049
and I think technology

00:03:36,049 --> 00:03:39,052
has the ability
to really enhance our lives.

00:03:39,260 --> 00:03:41,930
I don't believe that technology
is good or bad;

00:03:41,930 --> 00:03:44,182
I believe it to be very neutral.

00:03:44,182 --> 00:03:45,016
Interesting.

00:03:45,016 --> 00:03:48,686
But I think its current trajectory

00:03:48,770 --> 00:03:51,064
can do more harm than good.

00:03:51,064 --> 00:03:53,566
And so I think
that we are at a pivotal moment.

00:03:53,566 --> 00:03:55,401
And I think

00:03:55,401 --> 00:03:58,529
we need to have more
and more of these conversations.

00:03:58,529 --> 00:04:00,490
We need to really
look at what we're building,

00:04:00,490 --> 00:04:03,159
where we're going
and why we're going there.

00:04:03,159 --> 00:04:06,579
And if it is going to

00:04:06,663 --> 00:04:11,042
make us lose that sense of autonomy.

00:04:11,125 --> 00:04:14,629
I think it's an injustice
to all those who came

00:04:14,629 --> 00:04:19,759
before us who have helped us
get to this place of hard-won freedoms.

00:04:19,884 --> 00:04:22,303
Absolutely.

00:04:22,303 --> 00:04:23,513
So much to dive into there.

00:04:23,513 --> 00:04:26,516
I'm going to back up a little bit.

00:04:26,516 --> 00:04:28,268
Why are you so motivated by this?

00:04:28,268 --> 00:04:33,356
I want to know, you know,
what makes you... freedom is important,

00:04:33,439 --> 00:04:37,193
but it comes from a personal reason
or your experience

00:04:37,193 --> 00:04:42,490
because freedom is relative
to the person, experience and time.

00:04:42,490 --> 00:04:45,535
And so it must be something.

00:04:45,535 --> 00:04:49,455
You talked
a little bit about being homeless.

00:04:49,539 --> 00:04:50,748
If you could just speak to

00:04:50,748 --> 00:04:56,546
personally why this mission
became your passion to help other people.

00:04:56,629 --> 00:04:57,380
Great question.

00:04:57,380 --> 00:04:58,631
Thanks for asking.

00:04:58,631 --> 00:05:01,843
I think very early on in life

00:05:01,843 --> 00:05:06,723
I understood the pain of disconnection.

00:05:06,806 --> 00:05:12,186
I understood the pain of abandonment, and

00:05:12,270 --> 00:05:14,522
I understood how hard it was

00:05:14,522 --> 00:05:17,442
to navigate this world.

00:05:17,442 --> 00:05:19,736
And then as I grew

00:05:19,736 --> 00:05:23,906
older, I realized how fortunate
I had been. Interesting.

00:05:23,990 --> 00:05:27,618
and that I was one of

00:05:27,702 --> 00:05:30,038
the luckier ones

00:05:30,038 --> 00:05:33,041
and that I was not an anomaly,

00:05:33,166 --> 00:05:36,169
that I was an everyday story.

00:05:36,210 --> 00:05:41,090
And it became very personal to me.

00:05:41,174 --> 00:05:44,510
And I remember being a little girl
and thinking like, How does one just walk

00:05:44,510 --> 00:05:46,929
by a child who's shivering in the cold?

00:05:46,929 --> 00:05:50,266
How desensitized have we become?

00:05:50,350 --> 00:05:54,562
How does one not worry
that somebody else hasn't had food?

00:05:54,645 --> 00:05:55,813
I mean

00:05:55,813 --> 00:06:00,318
one might say I'm naive, or that
my mind lives in a utopian world.

00:06:00,485 --> 00:06:00,902
Right.

00:06:00,902 --> 00:06:04,364
But it's just the basics
of our human existence.

00:06:04,364 --> 00:06:06,741
And I couldn't understand that.

00:06:06,741 --> 00:06:11,579
And as I realized
that I was not this anomaly

00:06:11,579 --> 00:06:16,584
and there were millions
more like me, billions more like me,

00:06:16,667 --> 00:06:18,169
you know, I just want to do my part.

00:06:18,169 --> 00:06:20,254
I was like,
you know what? I've been very lucky.

00:06:20,254 --> 00:06:23,007
And I mean, the same circumstances,

00:06:23,007 --> 00:06:26,511
had they been imposed on my life
and had I been

00:06:26,511 --> 00:06:29,972
in a third world country or in a country
where women don't have rights,

00:06:30,056 --> 00:06:34,102
I can't even fathom what the outcome of
my life would have been.

00:06:34,185 --> 00:06:37,980
And so with all my hardships,
I am very lucky, right?

00:06:38,064 --> 00:06:43,403
And with all the things I've had
to overcome, I've been very privileged.

00:06:43,486 --> 00:06:45,571
So just because I was

00:06:45,571 --> 00:06:49,242
raised here in Canada, in the U.S.

00:06:49,325 --> 00:06:52,203
And so that's a very different story
for someone in India

00:06:52,203 --> 00:06:53,162
as well.

00:06:53,162 --> 00:06:55,832
It is. Absolutely.
It is. It definitely is.

00:06:55,832 --> 00:07:00,002
And that's a great perspective: as

00:07:00,086 --> 00:07:02,630
my mother says, to not live with the scars,
whom you've met.

00:07:02,630 --> 00:07:04,006
You call her Mama Richardson.

00:07:04,006 --> 00:07:07,260
But that's what we're saying, is to,
you know,

00:07:07,427 --> 00:07:11,764
to not live with your scars,
because it's

00:07:11,806 --> 00:07:15,810
when you go through trauma
and you go through hardship,

00:07:15,893 --> 00:07:19,856
the temptation is to keep
that perspective and never come out of it.

00:07:19,856 --> 00:07:23,860
And it's very difficult
for folks to do that. So

00:07:23,943 --> 00:07:26,487
how were you able to do that

00:07:26,487 --> 00:07:28,948
and what message do you give to others
when they're going

00:07:28,948 --> 00:07:31,576
through their own hardships?

00:07:31,659 --> 00:07:32,660
I'm able to do that

00:07:32,660 --> 00:07:36,831
because God graced me
with two beautiful sons.

00:07:36,914 --> 00:07:39,083
I just dropped my youngest off
to college yesterday.

00:07:39,083 --> 00:07:40,585
Congratulations. I am.

00:07:40,585 --> 00:07:41,711
I don't like the term empty nester.

00:07:41,711 --> 00:07:45,006
I'm officially an open nester,

00:07:45,089 --> 00:07:47,300
but I think that

00:07:47,300 --> 00:07:50,845
when you create life
and I think it's this wonderful journey

00:07:50,928 --> 00:07:54,182
that is there for both moms and dads,

00:07:54,182 --> 00:07:57,185
but really very unique to motherhood,

00:07:57,226 --> 00:08:01,147
is that you want to leave the world

00:08:01,230 --> 00:08:03,858
in better shape than you found it.

00:08:03,858 --> 00:08:07,653
You want to make sure that you...
I mean, I think I've always taught my kids

00:08:07,737 --> 00:08:10,990
until all of us are okay,
none of us are okay.

00:08:11,073 --> 00:08:14,577
And I think, you know,

00:08:14,660 --> 00:08:20,374
I imagine, like, what do I want the world
to be like for somebody that was my child?

00:08:20,416 --> 00:08:22,919
And I
think that's really what motivates me.

00:08:22,919 --> 00:08:25,755
I really want to know.

00:08:25,755 --> 00:08:29,592
You can't... it's not necessary for one person
to go and try to save the world.

00:08:29,592 --> 00:08:30,426
But it is important

00:08:30,426 --> 00:08:33,429
that every one of us takes responsibility
to some degree and tries.

00:08:33,596 --> 00:08:34,472
Absolutely.

00:08:34,472 --> 00:08:36,933
Absolutely.
You talked about privilege.

00:08:36,933 --> 00:08:39,894
And when folks discuss
privilege, it's often

00:08:39,894 --> 00:08:42,897
discussed from a negative point of view.

00:08:42,897 --> 00:08:47,527
But I would ask us to look at it
from another perspective.

00:08:47,568 --> 00:08:49,654
Privilege
is something that you have, right?

00:08:49,654 --> 00:08:53,449
And it can be male privilege,
it can be white privilege,

00:08:53,449 --> 00:08:56,452
and people
automatically get defensive about it.

00:08:56,661 --> 00:09:00,915
And the first time I understood
that was in law school, right.

00:09:01,082 --> 00:09:05,002
In terms of getting defensive
about it, because I went to a

00:09:05,086 --> 00:09:05,795
women in the law

00:09:05,795 --> 00:09:09,590
class and I think I was the only man

00:09:09,674 --> 00:09:13,469
and they were like, Oh, what's that?

00:09:13,511 --> 00:09:14,303
What's up with you?

00:09:14,303 --> 00:09:18,057
Oh, yes,
but you're allowed to. You've got older sisters.

00:09:18,057 --> 00:09:20,393
I grew up with strong sisters
and a strong mother. Right.

00:09:20,393 --> 00:09:23,980
But that's clearly.

00:09:24,063 --> 00:09:26,065
But I,

00:09:26,065 --> 00:09:30,236
but I remember being in the class,
and there was this conversation

00:09:30,236 --> 00:09:35,157
about essentially misogyny,
rape culture, men.

00:09:35,157 --> 00:09:37,535
And then I felt a little attacked.

00:09:37,535 --> 00:09:39,036
I never had that feeling before.

00:09:39,036 --> 00:09:39,370
Like, well,

00:09:39,370 --> 00:09:43,249
I feel like all men are responsible
for this or have some privilege in this

00:09:43,249 --> 00:09:45,793
and that was basically
what the conversation was about.

00:09:45,793 --> 00:09:48,296
And I was kind of like,
I don't really understand that.

00:09:48,296 --> 00:09:48,713
Right?

00:09:48,713 --> 00:09:52,341
And then somebody just flipped it over
very quickly and said,

00:09:52,466 --> 00:09:55,469
you know, if people talk about racism,

00:09:55,553 --> 00:09:56,304
it's the same thing.

00:09:56,304 --> 00:09:58,139
And the light bulb went off for me.

00:09:58,139 --> 00:09:59,807
Right. And I got that.

00:09:59,807 --> 00:10:01,809
And so when people think about privilege,

00:10:01,809 --> 00:10:05,271
privilege is not necessarily
inherently a bad thing.

00:10:05,354 --> 00:10:09,400
It's only bad if you abuse it

00:10:09,483 --> 00:10:12,528
and you ignore the fact that you have it
right.

00:10:12,778 --> 00:10:15,197
It's not bad in itself.
It's not bad itself.

00:10:15,197 --> 00:10:19,285
And even if, you know, somebody could say,
and I've actually been asked

00:10:19,285 --> 00:10:22,747
this question, you've been homeless,
you've gone through abuse, you've got

00:10:22,830 --> 00:10:25,708
and there's a list,
there's a long list out to that things

00:10:25,708 --> 00:10:27,835
we won't get into all of those things
now, too.

00:10:27,835 --> 00:10:31,547
How can you say you were privileged?
Simply because, even with

00:10:31,547 --> 00:10:35,885
all those horrific experiences,
all those challenges and adversities,

00:10:35,926 --> 00:10:39,430
I got to experience those adversities

00:10:39,430 --> 00:10:43,851
in a place like the United States
or in a place like Canada, right?

00:10:43,934 --> 00:10:46,312
Where a woman can go to a shelter.

00:10:46,312 --> 00:10:49,315
Yes, absolutely. It's not.

00:10:49,315 --> 00:10:51,734
I mean, that's
a different level of privilege

00:10:51,734 --> 00:10:53,444
that most people don't associate. No.

00:10:53,444 --> 00:10:54,695
Because you could never know

00:10:54,695 --> 00:10:57,990
because the reason why they don't
is that the mind is relative.

00:10:58,199 --> 00:10:58,532
Right.

00:10:58,532 --> 00:11:01,535
You can only compare to
what you've seen and understood,

00:11:01,702 --> 00:11:04,497
which is why on another podcast
we talked about, the most

00:11:04,497 --> 00:11:08,376
beneficial thing one can do is to travel

00:11:08,417 --> 00:11:12,129
and understand other cultures
because that really helps you understand.

00:11:12,129 --> 00:11:13,255
It'll help you understand a lot.

00:11:13,255 --> 00:11:15,049
It'll also help you understand privilege.

00:11:15,049 --> 00:11:16,550
Absolutely, quite a bit.

00:11:16,550 --> 00:11:19,387
If you go down to the individual
level, was my life challenging?

00:11:19,387 --> 00:11:20,179
Absolutely.

00:11:20,179 --> 00:11:23,933
But would I have liked to have done that
life in a third world country?

00:11:24,016 --> 00:11:25,685
Hell no.

00:11:25,726 --> 00:11:28,729
Yeah, there's levels of challenge
and people are still able to find a way.

00:11:28,938 --> 00:11:30,856
But let's go to some of the challenges

00:11:30,856 --> 00:11:34,235
that you're dealing with
in your business,

00:11:34,318 --> 00:11:36,320
Spark Seeker.

00:11:36,320 --> 00:11:39,115
What problem
are you trying to solve with Spark Seeker?

00:11:39,115 --> 00:11:41,992
Well, where do we begin now?

00:11:41,992 --> 00:11:44,412
Well, we're solving quite a few things
you know. Like,

00:11:44,412 --> 00:11:47,039
Online engagement hasn't changed much.

00:11:47,039 --> 00:11:50,543
You know,
there might be a new fad or a new filter

00:11:50,543 --> 00:11:54,547
or a new this or new that,
but the whole like follow

00:11:54,588 --> 00:11:59,427
how engagement is metricized, how
these platforms are monetized,

00:11:59,510 --> 00:12:02,513
not much has changed, right,
in a long, long time.

00:12:02,722 --> 00:12:07,143
And really that's where, you know,
whether you are

00:12:07,226 --> 00:12:10,312
a nonprofit organization
who has a global community

00:12:10,312 --> 00:12:13,399
or a nationwide community and you're
looking to mobilize that community,

00:12:13,399 --> 00:12:16,402
get that community engaged,
or you're an individual, whether you are

00:12:16,402 --> 00:12:19,697
an influencer,
or an individual user of these platforms,

00:12:19,697 --> 00:12:23,451
like, how do you measure engagement?
What does engagement mean to you?

00:12:23,659 --> 00:12:27,538
Like,
do the likes and follows even work anymore?

00:12:27,621 --> 00:12:31,917
That's the question with deepfakes, bots,
the explosion of all these things.

00:12:31,917 --> 00:12:34,879
What do a lot of the metrics
that we get from these platforms mean?

00:12:35,087 --> 00:12:36,964
What do they really mean to a marketer?

00:12:36,964 --> 00:12:39,008
Yeah,
What do they mean to an organization?

00:12:39,008 --> 00:12:42,011
What does it mean, actually,
if we're looking at, like, a business-

00:12:42,011 --> 00:12:47,475
to-business interaction, or,
you know, you're catering to a community

00:12:47,475 --> 00:12:51,103
that has... let's take a hypothetical,
say the YMCA, right?

00:12:51,145 --> 00:12:54,982
If they've got a huge global community,
how are they really engaging

00:12:54,982 --> 00:12:57,526
their audience,
how are they really staying connected?

00:12:57,526 --> 00:13:01,489
How are we creating community
and how are we mobilizing

00:13:01,489 --> 00:13:02,615
that community into action?

00:13:02,615 --> 00:13:04,158
So that's really what we focus on.

00:13:04,158 --> 00:13:07,036
Okay. So give that to me,
because I like to get it,

00:13:07,036 --> 00:13:10,039
as

00:13:10,080 --> 00:13:13,083
Joe Madison says, where the goats can get it.
Right? Make it very easy.

00:13:13,209 --> 00:13:17,087
What would you say if you had to explain
that in less than two sentences?

00:13:17,087 --> 00:13:20,382
What problem are you solving?

00:13:20,466 --> 00:13:23,594
We're solving engagement and action.

00:13:23,677 --> 00:13:24,887
Okay.

00:13:24,887 --> 00:13:25,137
All right.

00:13:25,137 --> 00:13:28,974
And what would that look like
when the problem is solved with Spark Seeker?

00:13:29,058 --> 00:13:33,020
It would look like
we had a more human connection

00:13:33,020 --> 00:13:35,898
that was more meaningful and personal.

00:13:35,898 --> 00:13:38,776
We felt a true sense of community

00:13:38,776 --> 00:13:41,570
and we felt empowered to take action

00:13:41,570 --> 00:13:44,073
within our communities
or on topics that matter to us.

00:13:44,073 --> 00:13:44,907
Right?

00:13:44,907 --> 00:13:48,786
How would we measure that?

00:13:48,869 --> 00:13:50,412
You could actually measure that through.

00:13:50,412 --> 00:13:55,459
We have a lot of... we don't do invasive
tracking, but we have

00:13:55,543 --> 00:13:59,421
aggregated and anonymized data in terms of...
What does that mean?

00:13:59,505 --> 00:14:02,299
It means that we don't associate things,
like behavior, to...

00:14:02,299 --> 00:14:03,551
Rob Richardson.

00:14:03,551 --> 00:14:05,928
Okay, we might get some. Okay:

00:14:05,928 --> 00:14:08,514
with this many actions
taken on this platform,

00:14:08,514 --> 00:14:10,432
this is the amount of good
we were able to do.

00:14:10,432 --> 00:14:13,269
There are certain basic metrics
that we can pull,

00:14:13,269 --> 00:14:16,355
but it's just not associated to you
because the second it becomes

00:14:16,355 --> 00:14:18,190
associated to you,
we're building a profile

00:14:18,190 --> 00:14:21,026
and now we're looking to see how we can
manipulate.

00:14:21,026 --> 00:14:21,902
Yes, exactly.

00:14:21,902 --> 00:14:23,863
So we're not looking to do that.

00:14:23,863 --> 00:14:25,072
So we're very mindful about that.

00:14:25,072 --> 00:14:27,157
So we do have a way of measuring that.

00:14:27,157 --> 00:14:30,953
And then, of course,
how the platform

00:14:30,953 --> 00:14:35,499
is positioned for each organization is a
little bit different on the B2B side.

00:14:35,583 --> 00:14:38,168
Okay. So we actually go in
and do a pre-assessment.

00:14:38,168 --> 00:14:41,547
We look at what the overarching goals are
and then we look to see, first of all,

00:14:41,547 --> 00:14:43,173
if we're the right
fit, in the way we metricize,

00:14:43,173 --> 00:14:45,551
the way we can give them the results
that they're looking for.

00:14:45,551 --> 00:14:46,552
Is it possible?

00:14:46,552 --> 00:14:50,222
So if we do it, then it's a good fit
and we move forward.

00:14:50,306 --> 00:14:53,017
So you work with organizations
essentially that are looking for ways

00:14:53,017 --> 00:14:57,021
to connect with their communities.

00:14:57,104 --> 00:14:59,273
How do you...
so you work mostly on the B2B side?

00:14:59,273 --> 00:15:03,360
It sounds like it. At this point, mostly, but

00:15:03,360 --> 00:15:06,697
a member of the B2B
side can always add a B2C account

00:15:06,697 --> 00:15:10,117
if they want to have a private social
experience or private online experience.

00:15:10,200 --> 00:15:14,204
But really, like one of the use cases
might be a university alumni group.

00:15:14,288 --> 00:15:17,207
Okay, They've got, you know, stuff
where they want

00:15:17,207 --> 00:15:20,669
to get their community
involved in donations, events.

00:15:20,753 --> 00:15:23,756
They want them to come back
and be part of their communities.

00:15:23,839 --> 00:15:27,259
So that's one of the bigger
things that people are trying to get done,

00:15:27,259 --> 00:15:30,387
like one big area of interest for us
is there, because it's a huge market.

00:15:30,387 --> 00:15:30,888
Oh, okay.

00:15:30,888 --> 00:15:34,725
Well, we're at a university,
so we have connections.

00:15:34,808 --> 00:15:36,101
Walk me through what that would look like.

00:15:36,101 --> 00:15:39,313
Just... and then
I want to really get into what you see.

00:15:39,355 --> 00:15:41,899
Walk me through what that looks like,
the visual experience.

00:15:41,899 --> 00:15:42,691
So online.

00:15:42,691 --> 00:15:45,569
So let's just say you were
let's take the university alumni group.

00:15:45,569 --> 00:15:47,237
Sure. You're working with them.

00:15:47,237 --> 00:15:50,449
How would that look if you were just
to work with them on a very high level?

00:15:50,532 --> 00:15:52,868
How would that work with you
working with any university? Universities in the Nati?

00:15:52,868 --> 00:15:57,581
Well,
what we do is we don't actually have

00:15:57,665 --> 00:16:00,376
a regular, profile-
like feature of connection.

00:16:00,376 --> 00:16:02,044
Our platform is all video-based.

00:16:02,044 --> 00:16:04,088
Okay, So it's a video centric app.

00:16:04,088 --> 00:16:04,463
Okay?

00:16:04,463 --> 00:16:05,756
So you go in and it's like,

00:16:05,756 --> 00:16:09,635
the best analogy would be
if Clubhouse and Zoom had a baby.

00:16:09,885 --> 00:16:12,972
Okay,
That's what the video engagement is like.

00:16:13,138 --> 00:16:16,517
Okay. They could be one-offs or recordings,
and people can actually search

00:16:16,517 --> 00:16:17,434
based on their interests,

00:16:17,434 --> 00:16:21,647
actually find a community.
That was good: if Clubhouse and Zoom had a baby.

00:16:21,647 --> 00:16:23,774
Okay.
So that's sort of our video engagement.

00:16:23,774 --> 00:16:27,361
And whether you are starting
a small business at home

00:16:27,361 --> 00:16:30,364
and you want to create awareness
around that or you're a big organization

00:16:30,447 --> 00:16:33,367
who wants to... like, for example,
we know organizations

00:16:33,367 --> 00:16:38,998
that have... one organization
that has 176 on Facebook, right?

00:16:39,081 --> 00:16:41,959
And all of this is for their community
engagement, right?

00:16:41,959 --> 00:16:43,377
And not one department

00:16:43,377 --> 00:16:46,046
knows what the other departments are doing.
Very, very disconnected.

00:16:46,046 --> 00:16:46,547
Got it.

00:16:46,547 --> 00:16:49,675
So, like, they create a lot of these online
virtual events,

00:16:49,675 --> 00:16:52,636
to get that community
actually talking.

00:16:52,636 --> 00:16:54,346
And then we've got something
called an action feature

00:16:54,346 --> 00:16:56,807
where you can actually get them
to take action now.

00:16:56,807 --> 00:16:57,766
Oh, that's awesome.

00:16:57,766 --> 00:16:58,684
So I got it.

00:16:58,684 --> 00:17:02,938
So I'm envisioning
kind of how the Clubhouse spaces,

00:17:02,938 --> 00:17:06,900
which turned into Twitter Spaces now,
in which there's a

00:17:06,984 --> 00:17:09,570
huge opportunity
now given where that's at,

00:17:09,653 --> 00:17:12,698
but engaging
people, but using video to do so.

00:17:12,865 --> 00:17:14,116
Yeah, and it's in video.

00:17:14,116 --> 00:17:16,744
It's video rooms and video communities.

00:17:16,744 --> 00:17:20,039
And so
you're not building your community

00:17:20,039 --> 00:17:22,207
because someone liked your software,
someone followed you.

00:17:22,207 --> 00:17:23,125
You're building your community

00:17:23,125 --> 00:17:25,169
because you set out to make time
to talk to your community.

00:17:25,169 --> 00:17:27,004
Got it. You're Engaging
with your community.

00:17:27,004 --> 00:17:28,338
They get to ask you questions.

00:17:28,338 --> 00:17:30,591
They get to be part of the conversation.

00:17:30,591 --> 00:17:33,635
They want to feel engaged
and they want to know that they matter,

00:17:33,635 --> 00:17:36,680
that their opinion matters,
and that they want to feel seen and heard.

00:17:36,805 --> 00:17:40,476
Yes. And really, how do we build
relationships in the world?

00:17:40,517 --> 00:17:42,561
By connecting. By connecting.

00:17:42,561 --> 00:17:44,646
So we've got to dial it back
and we got to go back

00:17:44,646 --> 00:17:47,274
to our roots of connection
and what connection means and looks like.

00:17:47,274 --> 00:17:50,277
So we're doing things
very differently on Spark Seeker

00:17:50,277 --> 00:17:53,530
and also like we know
data supports, science supports,

00:17:53,530 --> 00:17:55,532
that if you want someone to take action,
they've got to do it.

00:17:55,532 --> 00:17:59,995
Now, if you tell them let's take action
a day from now, a week from now,

00:17:59,995 --> 00:18:02,039
they won't do it. The drop off
rate is very high.

00:18:02,039 --> 00:18:05,542
So if you are looking to mobilize
a community, whether that be towards

00:18:05,542 --> 00:18:09,546
a good cause or whether that be towards
a company excursion,

00:18:09,630 --> 00:18:11,048
how do you want them to take action?

00:18:11,048 --> 00:18:12,299
How do you want them to get involved?

00:18:12,299 --> 00:18:14,510
And it's let's do it now.

00:18:14,510 --> 00:18:15,219
Okay.

00:18:15,219 --> 00:18:20,599
So thinking of that and just walking
through that a little bit,

00:18:20,682 --> 00:18:21,141
you know, a lot

00:18:21,141 --> 00:18:24,937
of social media platforms
use complex algorithms.

00:18:24,937 --> 00:18:26,105
A lot of them are using AI.

00:18:26,105 --> 00:18:31,026
I don't want to get into that
because obviously you're going to use AI.

00:18:31,110 --> 00:18:32,861
What is your biggest concern

00:18:32,861 --> 00:18:36,782
with the current use
of AI for social interactions?

00:18:36,865 --> 00:18:40,577
And if you do have concerns
and I assume you do,

00:18:40,661 --> 00:18:44,748
how is Spark Seeker or your organization
going to go about changing that?

00:18:44,832 --> 00:18:47,918
Well, there's a lot of concerns,
everything from,

00:18:47,960 --> 00:18:50,963
you know, disinformation,

00:18:51,130 --> 00:18:55,008
the deepfakes with generative AI, to

00:18:55,092 --> 00:18:57,469
existential threats and elitism.

00:18:57,469 --> 00:19:02,975
And, you know, there's
a whole gamut of things to discuss there.

00:19:03,016 --> 00:19:05,811
I think on a lot of these social
platforms, like we don't call ourselves

00:19:05,811 --> 00:19:07,437
social media,
we're actually social community.

00:19:07,437 --> 00:19:09,231
You get to be social,
but we're all about community.

00:19:09,231 --> 00:19:12,234
We're not about anything to do
with typical social media.

00:19:12,484 --> 00:19:13,861
So we really try to do it.

00:19:13,861 --> 00:19:15,654
So it's social community
versus social media.

00:19:15,654 --> 00:19:18,282
I think that's such a key point.

00:19:18,282 --> 00:19:19,908
I have more later.

00:19:19,908 --> 00:19:21,076
Thank you.

00:19:21,076 --> 00:19:25,330
But we
haven't deployed any AI as of yet.

00:19:25,330 --> 00:19:27,875
We're building a framework for it.
So that's what I would say.

00:19:27,875 --> 00:19:30,878
The number one thing
that I would want everybody to do is

00:19:30,878 --> 00:19:36,425
to really look at things
from an ethical intelligence place first.

00:19:36,508 --> 00:19:40,762
you know, you need ethical
intelligence models first, right?

00:19:40,971 --> 00:19:44,141
You need to understand really
what are you building?

00:19:44,141 --> 00:19:48,312
And I think, you know, in your
panel tomorrow, there's a wonderful

00:19:48,395 --> 00:19:50,022
gentleman that's going to be on the panel

00:19:50,022 --> 00:19:54,109
that speaks about explainable AI;
that's another great topic to get into.

00:19:54,109 --> 00:19:55,736
But
I think a framework needs to be put in place.

00:19:55,736 --> 00:19:58,322
You need to understand what you're going
to be doing with this AI.

00:19:58,322 --> 00:20:01,074
AI, I think, can be used
for some amazing things,

00:20:01,074 --> 00:20:03,952
specifically like disease prevention. Yes.

00:20:03,952 --> 00:20:07,789
You know, early detection,
early evaluation,

00:20:07,789 --> 00:20:11,293
early findings can actually prevent
things from ever actually manifesting,

00:20:11,376 --> 00:20:16,423
or like predicting things
like drought and famine.

00:20:16,506 --> 00:20:19,968
So there's a lot of great uses for
I think, that we can see in the future.

00:20:19,968 --> 00:20:22,554
And I don't think that it's going to be
AI by itself.

00:20:22,554 --> 00:20:25,557
It's going to be when AI converges
with an existing industry

00:20:25,557 --> 00:20:27,851
and finds a solution to a problem
within that industry.

00:20:27,851 --> 00:20:28,310
Got it.

00:20:28,310 --> 00:20:31,021
And I think that's true,
because technology is just a tool.

00:20:31,021 --> 00:20:32,272
It doesn't solve anything on its own.

00:20:32,272 --> 00:20:34,900
There's no... you've got to figure out
what problem you're solving yourself.

00:20:34,900 --> 00:20:36,985
Exactly, exactly.

00:20:36,985 --> 00:20:39,863
So I think there's
some great opportunities to explore.

00:20:39,863 --> 00:20:43,575
I think everything is going
to still require that human interaction.

00:20:43,659 --> 00:20:48,080
I think it needs to. I think the second
we remove the human piece out of it,

00:20:48,163 --> 00:20:51,166
I think that's where we start
running into a lot of risks.

00:20:51,291 --> 00:20:53,752
How concerned are you that
that's happening?

00:20:53,752 --> 00:20:54,670
Oh, it is happening.

00:20:54,670 --> 00:20:57,673
Yeah, and that's why
these conversations are so necessary.

00:20:57,798 --> 00:21:00,175
I do believe
we're at a very pivotal point in time.

00:21:00,175 --> 00:21:03,762
What's the worst-case scenario
if we don't take the human-centric

00:21:03,762 --> 00:21:06,765
approach? Militarization, and,

00:21:06,807 --> 00:21:11,311
you know, really jobs, job security,

00:21:11,395 --> 00:21:12,312
I think

00:21:12,312 --> 00:21:15,565
disparity between countries
like right now with AI,

00:21:15,649 --> 00:21:18,235
with the technology,
the tools that we have,

00:21:18,235 --> 00:21:20,320
you know, there are third world countries
with a lack of training.

00:21:20,320 --> 00:21:21,863
There's going to be an elite division

00:21:21,863 --> 00:21:25,158
that comes out
and there's going to be a huge divide.

00:21:25,242 --> 00:21:27,202
Yeah, I mean,

00:21:27,202 --> 00:21:29,871
the way I've described it is

00:21:29,871 --> 00:21:32,332
we can upgrade inequality, right?

00:21:32,332 --> 00:21:37,212
So those who have access to AI

00:21:37,296 --> 00:21:41,758
will have totally different experiences
and opportunities than those that don't.

00:21:41,842 --> 00:21:46,972
What else is... well,
how are we defining what

00:21:47,055 --> 00:21:50,225
algorithms can do overall,
which are algorithms

00:21:50,225 --> 00:21:53,228
that combine the two together
because that's basically what it is.

00:21:53,353 --> 00:21:55,814
Are we going to have algorithms own things?

00:21:55,814 --> 00:22:00,402
Because all of this,
all this is possible, right?

00:22:00,485 --> 00:22:02,529
Because most

00:22:02,612 --> 00:22:04,197
humans don't own land, right?

00:22:04,197 --> 00:22:06,950
It's,
you know, like the entities own land.

00:22:06,950 --> 00:22:07,868
And you could argue

00:22:07,868 --> 00:22:11,580
an algorithm is another entity
that can be controlled by a few people.

00:22:11,663 --> 00:22:14,458
How important is that
for us to think about?

00:22:14,458 --> 00:22:16,376
Like, so how do we get people
to think about that, though?

00:22:16,376 --> 00:22:18,086
Because it's

00:22:18,086 --> 00:22:20,297
we talk about this a lot on our podcast.

00:22:20,297 --> 00:22:24,676
It's very hard
to get people to prevent fires,

00:22:24,760 --> 00:22:27,512
even though
it'll save you a lot more money.

00:22:27,512 --> 00:22:30,057
It will save lives.

00:22:30,057 --> 00:22:32,768
But people often only see the fire.

00:22:32,768 --> 00:22:33,393
Right.

00:22:33,393 --> 00:22:36,229
And so, like,
people can easily put out fires. And

00:22:36,229 --> 00:22:39,691
right now people are seeing AI
and they're saying, you know, Kalista,

00:22:39,691 --> 00:22:41,902
I hear you,
but let's just figure out a way to

00:22:41,902 --> 00:22:45,238
move as fast as we can,
make as much money as we can.

00:22:45,322 --> 00:22:46,698
How do we come at that?

00:22:46,698 --> 00:22:51,286
Especially we know America is great
for a lot of things when it comes to that.

00:22:51,286 --> 00:22:55,415
Generally, we make as much money first
and figure out

00:22:55,499 --> 00:22:57,459
what happens on the back end later.

00:22:57,459 --> 00:23:00,045
How do we
change the narrative there?

00:23:00,045 --> 00:23:02,589
I think that's

00:23:02,672 --> 00:23:03,090
putting the

00:23:03,090 --> 00:23:07,094
cart before the horse right now. That
that is my genuine opinion on that.

00:23:07,094 --> 00:23:10,514
I think there is going to always be people
that say,

00:23:10,514 --> 00:23:13,642
we'll worry about this, we'll worry
about dealing with the problem later.

00:23:13,850 --> 00:23:15,310
Let's just make that money.

00:23:15,310 --> 00:23:19,022
But I think that it's going to be,
in this particular case, a runaway train

00:23:19,106 --> 00:23:20,774
and there's going to be a lot more undoing

00:23:20,774 --> 00:23:22,692
that may not even be possible
at that point.

00:23:22,692 --> 00:23:23,276
Yeah, you know,

00:23:23,276 --> 00:23:26,947
I think right now we are at a point
where we need to stop and we need to

00:23:27,030 --> 00:23:32,494
we have to understand as human beings
that we live in a supply and demand world.

00:23:32,577 --> 00:23:36,081
And if we all want to jump onto a fad,
onto the next greatest thing,

00:23:36,081 --> 00:23:38,583
and we just want to use
all the new AI tools out there,

00:23:38,583 --> 00:23:41,420
it is going to be a runaway train,
but we need to

00:23:41,420 --> 00:23:44,673
and then we always sit back and say,
but the government did this, but

00:23:44,756 --> 00:23:48,135
this company did this
and that person did that. Well, no.

00:23:48,218 --> 00:23:52,889
Ultimately, the responsibility is with
each of us as individuals.

00:23:52,973 --> 00:23:57,894
What do we accept? Now, do we take the time
to read the terms of service?

00:23:57,978 --> 00:23:58,645
No, no.

00:23:58,645 --> 00:24:03,775
We will complain that our data
is being pillaged, but we agreed to it. Now,

00:24:03,859 --> 00:24:04,317
you know,

00:24:04,317 --> 00:24:08,822
how much responsibility
do we take as individuals to show up?

00:24:08,905 --> 00:24:11,825
And by not showing up, you're showing up.

00:24:11,825 --> 00:24:13,618
So if something doesn't work,
don't participate.

00:24:13,618 --> 00:24:15,120
Oh, man, you're speaking my language.

00:24:15,120 --> 00:24:16,955
I tell this to people in politics.

00:24:16,955 --> 00:24:18,999
I tell people,
you get what you don't vote for, too.

00:24:18,999 --> 00:24:21,460
Yeah, people complain every minute.

00:24:21,460 --> 00:24:22,878
And I used to get that.

00:24:22,878 --> 00:24:24,796
In my former life, I ran for office.

00:24:24,796 --> 00:24:26,173
And it's the people that complain
the most.

00:24:26,173 --> 00:24:29,426
You go back and look, and they haven't
voted in ten years.

00:24:29,509 --> 00:24:32,471
They've got a lot of complaints
about what's happening.

00:24:32,471 --> 00:24:33,513
It's the same thing, right?

00:24:33,513 --> 00:24:35,390
We have to be active citizens.

00:24:35,390 --> 00:24:39,561
And I guess, with what's happening
and what they're building, nobody's out

00:24:39,561 --> 00:24:42,230
there spending billions,
millions of dollars

00:24:42,230 --> 00:24:46,026
building models that no one wants to use.
They're in it to make money.

00:24:46,026 --> 00:24:47,319
Yeah, but that's the trick, though, right?

00:24:47,319 --> 00:24:51,364
So I want to challenge that side
of what you're saying. I don't disagree,

00:24:51,406 --> 00:24:56,620
but when all the social media apps

00:24:56,703 --> 00:24:59,956
started, right?
I think people believed that

00:24:59,956 --> 00:25:01,666
they were getting on
to connect with their family.

00:25:01,666 --> 00:25:02,751
I think that was a genuine,

00:25:02,751 --> 00:25:04,753
genuine interest
that you could connect with people

00:25:04,753 --> 00:25:08,089
that you didn't know, that you didn't see.
And it did start that way, right?

00:25:08,298 --> 00:25:12,344
Yeah. But to your point, because I love
that you said

00:25:12,511 --> 00:25:17,224
social community versus social media,
because my definition of media

00:25:17,307 --> 00:25:21,478
is that, especially with news generally,
I'd say, you know, media is

00:25:21,561 --> 00:25:24,940
to the brain
what sugar is to the body.

00:25:25,023 --> 00:25:28,527
You get the instant hit,
but then it really kind of takes you down

00:25:28,610 --> 00:25:30,529
because the goal is
just to get an emotional reaction out of you.

00:25:30,529 --> 00:25:32,030
If you think about it, most of the time,

00:25:32,030 --> 00:25:34,741
the amount of information
we get from news doesn't help us.

00:25:34,741 --> 00:25:38,745
Yeah, it just really makes us either
depressed or angry or emotional

00:25:38,745 --> 00:25:40,664
in some way.
That's really what it does, right?

00:25:40,664 --> 00:25:43,250
That's why we keep hearing about

00:25:43,250 --> 00:25:46,795
when there's a crime
that happens in a certain area.

00:25:46,795 --> 00:25:49,548
There could not have been a crime
for ten years. Right.

00:25:49,548 --> 00:25:52,342
But then they'll talk about a crime
that happened

00:25:52,342 --> 00:25:55,011
over and over and over again, because
people are interested; it'll draw them in.

00:25:55,011 --> 00:25:56,846
But it doesn't necessarily really solve
the problem.

00:25:56,846 --> 00:25:58,974
I'm making the point
that with social media,

00:25:58,974 --> 00:26:01,977
the goal is to click,
but it's not to connect.

00:26:02,143 --> 00:26:07,023
I was telling you a little bit
about this piece of it, because

00:26:07,107 --> 00:26:08,024
I don't agree that,

00:26:08,024 --> 00:26:12,696
you know, I think people were misinformed
when they first went on social

00:26:12,779 --> 00:26:13,738
because, did

00:26:13,738 --> 00:26:17,492
any of these social platforms
ever charge you money for it?

00:26:17,576 --> 00:26:21,162
No. So how do we think

00:26:21,246 --> 00:26:23,873
how did we ever believe that

00:26:23,873 --> 00:26:27,961
a company that gave us something for free
was worth billions of dollars?

00:26:28,003 --> 00:26:28,878
Well, I agree with you there.

00:26:28,878 --> 00:26:30,880
And I tell people,

00:26:30,880 --> 00:26:32,966
if it's free, you're not the consumer,
you're the product.

00:26:32,966 --> 00:26:34,301
You are the product.

00:26:34,301 --> 00:26:35,927
So I think.

00:26:35,927 --> 00:26:38,555
But do you think people knew
that genuinely, though?

00:26:38,555 --> 00:26:41,433
I think that people...
I know that that's true

00:26:41,433 --> 00:26:44,936
because at the end of the day,
they were getting something for free. Yes.

00:26:45,020 --> 00:26:47,063
And you know what?
I think we will face that

00:26:47,063 --> 00:26:49,482
every single time
you've given somebody something for free

00:26:49,482 --> 00:26:52,569
and then you turn
around and charge them for it.

00:26:52,652 --> 00:26:55,071
Why? I got it for free, right?

00:26:55,071 --> 00:26:57,782
And then they don't connect the issue.

00:26:57,782 --> 00:26:59,868
That's such a great point.

00:26:59,868 --> 00:27:02,162
I mean, I have a lot of points on this.

00:27:02,162 --> 00:27:04,164
I mean, I agree with a lot of it.

00:27:04,164 --> 00:27:06,833
This is my nuance on it:
the challenge is, people definitely...
00:27:06,833 --> 00:27:08,918
the challenges people definitely.

00:27:08,918 --> 00:27:09,919
I think you're right. Right.

00:27:09,919 --> 00:27:13,673
They got a free product
and they were fine with it.

00:27:13,673 --> 00:27:17,636
And I think it got
people used to in a bad way

00:27:17,677 --> 00:27:20,680
getting things for free and expecting
things are free

00:27:20,847 --> 00:27:24,517
content is free,
creative work is free, music is free.

00:27:24,559 --> 00:27:26,269
It's like none of this matters.

00:27:26,269 --> 00:27:27,687
And then the only people that end up

00:27:27,687 --> 00:27:31,983
getting some money are the platforms
that have advertising on there.

00:27:31,983 --> 00:27:33,693
And that's it, right?

00:27:33,693 --> 00:27:37,280
So it did create other problems
in terms of how we thought about things

00:27:37,280 --> 00:27:41,159
because we should think about it:
there is value to what we're getting,

00:27:41,242 --> 00:27:44,245
but we're literally
not paying for it.

00:27:44,329 --> 00:27:48,208
But I would say the other side of it
is I don't know if it was informed

00:27:48,208 --> 00:27:51,211
consent that they agreed to, in that

00:27:51,461 --> 00:27:56,341
I don't think people predicted
and I actually don't think even Facebook

00:27:56,341 --> 00:28:00,762
and others predicted how the algorithms
would affect people in terms

00:28:00,762 --> 00:28:05,725
of how our politics are going, in terms
of how people are dividing in a way.

00:28:05,809 --> 00:28:10,689
And, well, I think many of those platforms...
I didn't mean to interrupt you,

00:28:10,772 --> 00:28:15,068
but I think, you know,
when you start tracking

00:28:15,151 --> 00:28:17,862
through third-party services the minute-
by-minute behavior

00:28:17,862 --> 00:28:20,865
of every single person on your platforms,
whether you're a company

00:28:20,907 --> 00:28:23,660
or an organization using it for social. Sure.

00:28:23,660 --> 00:28:27,288
Or you're an individual, it doesn't matter
because companies are at just as much risk

00:28:27,372 --> 00:28:28,206
or worse.

00:28:28,206 --> 00:28:32,669
And so when you start tracking
everybody and then in order to achieve

00:28:32,669 --> 00:28:37,048
a certain result, you're going to deploy
behavior manipulation, tactics

00:28:37,132 --> 00:28:41,594
to create deeper engagement, longer
engagement to result in a certain outcome,

00:28:41,678 --> 00:28:44,472
whether that be in disinformation
or whether that be a purchase

00:28:44,472 --> 00:28:48,268
or whether that be, you know, ads through
ad sales and things like that,

00:28:48,351 --> 00:28:51,229
whatever it might be,
there's a surveillance economy

00:28:51,229 --> 00:28:54,274
and then there's behavior manipulation
to get certain things to happen.

00:28:54,274 --> 00:28:56,025
So it's not just the ad revenue.

00:28:56,025 --> 00:28:58,486
This is like the smallest of problems.
Sure.

00:28:58,486 --> 00:29:01,364
I mean, I think
as we go into this conversation of AI,

00:29:01,364 --> 00:29:04,659
all this data
that's been collected on who we are,

00:29:04,659 --> 00:29:07,829
once we switch into a digital currency
and we switch into

00:29:07,912 --> 00:29:11,916
AI and we switch into all of these things
that are going to be connected,

00:29:11,958 --> 00:29:16,129
how is the data that
that all these organizations have on us?

00:29:16,129 --> 00:29:18,590
How is that going to be leveraged
against us?

00:29:18,590 --> 00:29:20,800
What do we qualify for? Wow.

00:29:20,800 --> 00:29:23,720
What kind of jobs are going to be linked
to our data? Wow.

00:29:23,720 --> 00:29:25,346
What kind of credit scoring
are we going to have?

00:29:25,346 --> 00:29:27,557
For medical resources,

00:29:27,557 --> 00:29:29,017
do we have access to them?

00:29:29,017 --> 00:29:30,769
What type of criminal justice system
will we have?

00:29:30,769 --> 00:29:33,396
What kind of travel permissions
will we have? Oh, wow.

00:29:33,396 --> 00:29:40,069
So, with the
evolution, the explosion, of AI,

00:29:40,153 --> 00:29:42,655
and with all this data
that's being collected about us,

00:29:42,655 --> 00:29:47,494
especially companies and individuals
really need to start thinking about

00:29:47,577 --> 00:29:50,330
what kind of platforms am I going to use?

00:29:50,330 --> 00:29:51,372
Is it worth it for me

00:29:51,372 --> 00:29:55,210
to get something for free
when my data is being pillaged, basically?

00:29:55,210 --> 00:29:59,964
And how will that data be
leveraged against me with AI

00:30:00,048 --> 00:30:01,633
at its current trajectory? That's great.

00:30:01,633 --> 00:30:04,177
So that's where I want everybody
to start thinking, and why

00:30:04,177 --> 00:30:06,805
I think Spark Seeker is necessary.

00:30:06,805 --> 00:30:10,183
There you go. On the podcast, you do that.

00:30:10,266 --> 00:30:10,934
That was great.

00:30:10,934 --> 00:30:13,853
No, it's... getting people
to understand

00:30:13,853 --> 00:30:18,566
that issue is everything, right? And that's
why we started this, Disruption

00:30:18,566 --> 00:30:21,778
Now. That's why we created this conference,
Midwest Con.

00:30:21,861 --> 00:30:26,783
People have to understand,
and it's up to us to advocate about the problem.

00:30:26,783 --> 00:30:30,245
Like, I think people did sign away
a lot of that so people signed away

00:30:30,245 --> 00:30:30,912
a lot of their rights.

00:30:30,912 --> 00:30:33,289
They didn't realize it,

00:30:33,289 --> 00:30:35,124
but now they can go back and reclaim them.

00:30:35,124 --> 00:30:35,375
Right.

00:30:35,375 --> 00:30:38,628
And we have to figure out
what are we going to do with all this,

00:30:38,670 --> 00:30:41,506
with all this information
that's been collected like we have.

00:30:41,506 --> 00:30:44,509
I forget what all the stats are in terms
of the amount of data collected.

00:30:44,551 --> 00:30:45,593
It changes every year.

00:30:45,593 --> 00:30:48,596
But I think we've already collected more

00:30:48,596 --> 00:30:51,808
in the last few years than at any
other time in our history.

00:30:51,891 --> 00:30:54,727
We have a lot of this information,

00:30:54,769 --> 00:30:56,771
but a lot of it is being collected
in a way

00:30:56,771 --> 00:31:02,443
that has reinforced the same biases
that we've had forever.

00:31:02,443 --> 00:31:02,777
Right?

00:31:02,777 --> 00:31:06,656
And so, like, if
if we're not intentional about how we're

00:31:06,739 --> 00:31:10,118
using artificial intelligence, it's
going to really

00:31:10,201 --> 00:31:11,661
we don't understand the consequences.

00:31:11,661 --> 00:31:16,082
And we ought to at least come from
a mindset of let's make sure

00:31:16,332 --> 00:31:19,878
we understand what's happening,
not like turn a blind eye,

00:31:19,961 --> 00:31:23,715
make as much money as we can,
and figure it out after the fact.

00:31:23,798 --> 00:31:26,676
I think those are really, like,
ethically speaking,

00:31:26,676 --> 00:31:29,679
like I think you really need to sit back
and look at

00:31:29,888 --> 00:31:31,931
why are we building these models
that we're building?

00:31:31,931 --> 00:31:34,434
What's going into building these models,
how are we training them?

00:31:34,434 --> 00:31:36,561
how do we get people to hear us
Like, we are...

00:31:36,561 --> 00:31:38,229
you and I are,

00:31:38,229 --> 00:31:41,232
we're obviously preaching
to the converted, right?

00:31:41,441 --> 00:31:45,486
How do we get people who just

00:31:45,570 --> 00:31:47,739
want to go and watch whatever the algorithm

00:31:47,739 --> 00:31:51,534
on Netflix takes them down,
or whatever, or social media?

00:31:51,534 --> 00:31:56,122
How do we get the attention
of those degraded communities,

00:31:56,122 --> 00:31:58,708
or how do we get the attention
of corporations and others

00:31:58,708 --> 00:32:01,711
to make them understand
that this is in their interests

00:32:01,878 --> 00:32:05,214
right now versus them
just saying, don't worry about it?

00:32:05,214 --> 00:32:06,549
Because that's what concerns me.

00:32:06,549 --> 00:32:09,177
Like, I agree with you
and I think you know that.

00:32:09,177 --> 00:32:12,805
But what concerns me is that

00:32:12,889 --> 00:32:15,433
how likely are we to get

00:32:15,433 --> 00:32:20,563
the attention of enough people
to actually get them to be intentional?

00:32:20,647 --> 00:32:23,191
I think that whenever new technology

00:32:23,191 --> 00:32:26,402
or new things
come out, it's all nice and new and shiny.

00:32:26,402 --> 00:32:27,695
It's like a new car.

00:32:27,695 --> 00:32:29,614
And, you know,
we can sit there and talk about,

00:32:29,614 --> 00:32:31,783
Oh, it's got this feature
and it's got these bells

00:32:31,783 --> 00:32:35,787
and those whistles that we can glamorize
something very quickly.

00:32:35,787 --> 00:32:38,164
And I think that's really
what happened with AI. I agree.

00:32:38,164 --> 00:32:39,624
AI pulled up to the house.

00:32:39,624 --> 00:32:41,292
Yeah, there's your Bentley

00:32:41,292 --> 00:32:44,003
and you have blockchain... really,

00:32:44,003 --> 00:32:45,296
we're like the outside children now.

00:32:45,296 --> 00:32:47,507
So no one likes teamwork.

00:32:47,507 --> 00:32:49,884
And I think, you know,

00:32:49,968 --> 00:32:51,886
you can play the short game,

00:32:51,886 --> 00:32:54,764
or you can play the long game.
Some of it comes down to ethics.

00:32:54,764 --> 00:32:58,893
Yeah, your ethics as a human being
and your ethics as an organization.

00:32:58,977 --> 00:33:01,270
Yes, we all want to make money,
and who doesn't?

00:33:01,270 --> 00:33:03,356
In an ideal world,
we all want to make lots of money.

00:33:03,356 --> 00:33:05,942
Yeah,
but there is more than one way to do that.

00:33:05,942 --> 00:33:06,192
Yeah.

00:33:06,192 --> 00:33:09,654
You can make money and still have ethics
you can think about.

00:33:09,654 --> 00:33:09,988
Okay.

00:33:09,988 --> 00:33:15,535
Like, I get why companies want LLMs,
and I get why AI is so appealing.

00:33:15,535 --> 00:33:17,996
I see the virtues of AI in many ways.

00:33:17,996 --> 00:33:20,957
There are many aspects of
AI that I am very excited about.

00:33:20,957 --> 00:33:22,458
Sure,

00:33:22,458 --> 00:33:25,294
But we can take them and say,
Why are we building what we're building?

00:33:25,294 --> 00:33:26,212
What's vital?

00:33:26,212 --> 00:33:27,213
Like what is the outcome?

00:33:27,213 --> 00:33:29,716
And start
really looking at it from an ethics standpoint.

00:33:29,716 --> 00:33:31,676
I think ethical intelligence
is very important.

00:33:31,676 --> 00:33:33,344
It would be great
if every company could deploy

00:33:33,344 --> 00:33:36,264
an ethical intelligence committee
or ethical intelligence framework.

00:33:36,264 --> 00:33:37,640
What do you mean by that?

00:33:37,640 --> 00:33:40,977
Really looking at the core principles
of what ethics is and saying,

00:33:40,977 --> 00:33:45,189
hey, listen, this is our business,
this is the problem our business solves.

00:33:45,273 --> 00:33:47,942
And as we are coming up,
what is the next feature?

00:33:47,942 --> 00:33:49,444
What is the next advancement?

00:33:49,444 --> 00:33:52,905
What is the next thing we want to focus on
and how is what we're building

00:33:52,905 --> 00:33:55,908
going to support
that outcome, that solution?

00:33:55,950 --> 00:33:59,162
And what
is it that we need to collect

00:33:59,245 --> 00:34:03,041
in order to be able to do that? How is it
that we need to train these models?

00:34:03,124 --> 00:34:04,542
Where are those biases?

00:34:04,542 --> 00:34:09,881
How can those biases, those conscious,
unconscious biases impact people?

00:34:09,964 --> 00:34:11,340
Like what are the long term?

00:34:11,340 --> 00:34:12,800
I think that ethical

00:34:12,800 --> 00:34:15,219
intelligence needs to be

00:34:15,303 --> 00:34:16,179
something that is

00:34:16,179 --> 00:34:18,765
a preface to AI. Yes.

00:34:18,765 --> 00:34:22,143
You know, before artificial intelligence,
we need ethical intelligence.

00:34:22,143 --> 00:34:22,810
That's great.

00:34:22,810 --> 00:34:23,728
And I think

00:34:23,728 --> 00:34:25,438
that's one of the things
I really speak about,

00:34:25,438 --> 00:34:28,232
because I think we're entering the
AI renaissance. Yes.

00:34:28,232 --> 00:34:32,570
You know, we've had many different periods;
this is the era of the AI renaissance.

00:34:32,570 --> 00:34:34,614
So we are all artists
and we're all creators.

00:34:34,614 --> 00:34:36,949
And what are we really creating
in this period?

00:34:36,949 --> 00:34:40,244
And, you know, I think these are
the conversations we need to have.

00:34:40,244 --> 00:34:44,916
So really, is so much of this, like, ethical
intelligence going to be every company

00:34:44,916 --> 00:34:48,127
going out there and putting together
an ethical intelligence practice?

00:34:48,169 --> 00:34:51,005
You said we're all creators, and you're right.

00:34:51,005 --> 00:34:53,007
It's something that I said a few years ago.

00:34:53,007 --> 00:34:55,551
So I feel like I was predicting the future
a little bit.

00:34:55,551 --> 00:35:01,182
So all of us are creators and
all of us are media companies, too, right?

00:35:01,182 --> 00:35:02,809
And so

00:35:02,892 --> 00:35:05,520
we have the ability to

00:35:05,520 --> 00:35:10,024
change the narrative that can change
the conversation, that can change

00:35:10,108 --> 00:35:14,570
the perspective, that can then solve
problems that people didn't think of.

00:35:14,695 --> 00:35:17,281
And I think that's
what's so wonderful about

00:35:17,281 --> 00:35:19,492
the work
that you're doing with Spark Seeker.

00:35:19,492 --> 00:35:21,702
And when we think
about the opportunity,

00:35:21,702 --> 00:35:23,663
it's changing those perspectives, right?

00:35:23,663 --> 00:35:29,252
So I'm going to give you
a few rapid-fire questions at the end.

00:35:29,335 --> 00:35:30,711
Tell me: you've got Spark Seeker.

00:35:30,711 --> 00:35:32,672
What does success look like?

00:35:32,672 --> 00:35:35,216
I don't define success anymore.

00:35:35,216 --> 00:35:36,467
You don't define success anymore.

00:35:36,467 --> 00:35:38,386
Okay, so I learned something.

00:35:38,386 --> 00:35:39,887
Okay. I'll tell you this.

00:35:39,887 --> 00:35:40,930
I think there's a duality.

00:35:40,930 --> 00:35:42,181
One thing I've learned in life,

00:35:42,181 --> 00:35:44,767
and this is my true sense of wisdom,
for whatever it's worth.

00:35:44,767 --> 00:35:45,935
Okay?

00:35:45,935 --> 00:35:47,687
There's a duality to everything in life.

00:35:47,687 --> 00:35:48,855
Okay?

00:35:48,855 --> 00:35:51,440
Nothing is black or white or like,
all good or bad,

00:35:51,440 --> 00:35:52,817
there's a duality to everything.

00:35:52,817 --> 00:35:53,192
Okay?

00:35:53,192 --> 00:36:01,617
And I think there was a time that I
had to keep a roof over my head, and I'm

00:36:01,701 --> 00:36:05,246
thankfully in a place where I'm okay, like
I get to create Spark Seeker.

00:36:05,246 --> 00:36:08,332
And so for me, if I chase

00:36:08,332 --> 00:36:11,711
success, the duality of that is failure
that's waiting for me around the corner.

00:36:11,711 --> 00:36:13,045
So I'm in the space of creation.

00:36:13,045 --> 00:36:17,008
Okay. You're in a space of creation.
I love that. I love that answer.

00:36:17,091 --> 00:36:22,680
But I'm going to challenge

00:36:22,763 --> 00:36:24,849
you while you're on Disruption Now.

00:36:24,849 --> 00:36:26,058
This is what we do.

00:36:26,058 --> 00:36:29,270
What does legacy look like
for Kalista?

00:36:29,353 --> 00:36:32,773
What does it look like for Spark Seeker?
Oof. Legacy.

00:36:33,024 --> 00:36:33,649
Now there's a word

00:36:33,649 --> 00:36:37,695
really closely attached to success,
but still enough of a difference.

00:36:37,737 --> 00:36:38,988
I can go down that.

00:36:38,988 --> 00:36:41,991
Oh yeah, I like it

00:36:41,991 --> 00:36:42,575
where this goes.

00:36:42,575 --> 00:36:45,411
What does legacy
look like? Legacy is how you define it.

00:36:45,411 --> 00:36:49,457
I was like, no,
I don't want to do that for you. You go.

00:36:49,540 --> 00:36:51,417
We probably think very much alike,
so it's okay.

00:36:51,417 --> 00:36:58,716
You're probably right. Legacy to me
looks like

00:36:58,758 --> 00:37:00,927
someone, through our work,

00:37:00,927 --> 00:37:06,515
feels empowered to build a family,
to think differently

00:37:06,599 --> 00:37:10,436
and to stand up for humanity's autonomy.

00:37:10,519 --> 00:37:14,190
And

00:37:14,273 --> 00:37:16,025
I think we're all just here
to walk each other home.

00:37:16,025 --> 00:37:19,028
So, you know, I'm
going to play a part in that.

00:37:19,111 --> 00:37:24,116
And so I hope the work we do
and the way I've lived inspires that.

00:37:24,200 --> 00:37:26,077
All right.

00:37:26,160 --> 00:37:29,497
What is an uncomfortable truth

00:37:29,580 --> 00:37:32,792
that you have that
most people would disagree with you on?

00:37:32,875 --> 00:37:37,546
An uncomfortable truth. Or let's say this:
what's a truth you have

00:37:37,630 --> 00:37:41,175
that most people would disagree
with you on? And about most people...

00:37:41,175 --> 00:37:44,637
But I know that there's a huge...
Yeah, yeah, it doesn't matter;

00:37:44,637 --> 00:37:48,057
it could just be, like, most people.
It's something that I think,

00:37:48,266 --> 00:37:51,936
yeah, in order for us to ever change
a woman's journey in this world,

00:37:51,978 --> 00:37:57,233
we need to start at home with our boys.
And our boys are some of the most

00:37:57,316 --> 00:38:01,237
lost people that really need more support
and more guidance, more love.

00:38:01,279 --> 00:38:02,405
Yeah, I would agree with you.

00:38:02,405 --> 00:38:05,366
And I think there will be
people that would disagree with you too,

00:38:05,366 --> 00:38:07,451
because there are a lot of

00:38:07,535 --> 00:38:08,661
people who fight back on that.

00:38:08,661 --> 00:38:09,954
But it's very difficult.

00:38:09,954 --> 00:38:14,208
I can tell you that, as a man
and as a boy, it feels like there's no place

00:38:14,208 --> 00:38:16,085
where you can
be vulnerable. Right?

00:38:16,085 --> 00:38:19,672
That's a very hard thing for a man
and a boy.

00:38:19,672 --> 00:38:21,924
And I haven't come close to mastering it
either.

00:38:21,924 --> 00:38:24,260
But I know that's difficult because,
I mean, I raised

00:38:24,260 --> 00:38:28,222
boys and I can say that, you know,
I raised them to be kind to women.

00:38:28,222 --> 00:38:30,599
I raised them to be fair.

00:38:30,599 --> 00:38:33,811
I taught them what showing up
for equality looks like, and,

00:38:34,061 --> 00:38:35,604
you know, I did all those things.

00:38:35,604 --> 00:38:37,440
And as a mom, as a woman,
it mattered to me.

00:38:37,440 --> 00:38:39,984
And I didn't want the world
to change that in them.

00:38:39,984 --> 00:38:43,321
But I think with the world
changing as rapidly as it's changing,

00:38:43,321 --> 00:38:47,116
even though maybe it's going in the right
direction, it's a matter of perspective,

00:38:47,116 --> 00:38:49,285
it's who you speak to.

00:38:49,368 --> 00:38:50,995
But I do know that there's

00:38:50,995 --> 00:38:53,998
a lot more support right now

00:38:54,040 --> 00:38:56,959
in this moment in time for women

00:38:56,959 --> 00:38:58,753
to navigate these waters than there is

00:38:58,753 --> 00:39:01,756
for young boys
who are feeling extremely lost

00:39:01,797 --> 00:39:04,800
and they do not know how to speak

00:39:04,800 --> 00:39:07,887
and they certainly don't know
what manhood looks like.

00:39:07,887 --> 00:39:10,514
That's so true in this world.

00:39:10,514 --> 00:39:13,517
And if we want our journey to change,

00:39:13,517 --> 00:39:16,771
and I'm ready for our journey to change,

00:39:16,854 --> 00:39:18,856
we need to look, I think, to our boys.

00:39:18,856 --> 00:39:22,276
Oh, my gosh,
you've walked down a line.

00:39:22,318 --> 00:39:24,862
There are so many things
I'd like to address on that,

00:39:24,862 --> 00:39:28,866
because I do think that's
one of the threads that gets ignored.

00:39:28,908 --> 00:39:32,453
And you see, there's a reason
why a lot of young

00:39:32,453 --> 00:39:36,123
boys are attracted to people like
Andrew Tate, to those folks.

00:39:36,207 --> 00:39:36,415
Right.

00:39:36,415 --> 00:39:39,418
They speak to them about mental
health, right?

00:39:39,418 --> 00:39:41,629
They see suicide on the rise, and

00:39:41,629 --> 00:39:44,548
they're speaking to something
that these boys are feeling.

00:39:44,548 --> 00:39:47,551
And rather than just saying
all these men are jerks

00:39:47,551 --> 00:39:50,554
you have to ask another question:
why are they feeling that way?

00:39:50,596 --> 00:39:52,973
And how can we figure out
how to address that?

00:39:52,973 --> 00:39:55,393
I think we just agree.

00:39:55,393 --> 00:39:56,727
All right.

00:39:56,811 --> 00:39:57,978
Final two questions.

00:39:57,978 --> 00:40:00,356
One, we'll just do it quickly.

00:40:00,356 --> 00:40:02,191
What's your theme in life?

00:40:02,191 --> 00:40:03,692
What's your theme at the end of the day,

00:40:03,692 --> 00:40:06,695
if you had to say the theme for you,
what would that be?

00:40:06,862 --> 00:40:08,697
What would that saying be, and why?

00:40:08,697 --> 00:40:14,578
A theme... a saying of mine is:
define yourself, for yourself, by yourself.

00:40:14,662 --> 00:40:19,792
The theme, I would say, is: the series of
your choices becomes the sum of your life.

00:40:19,792 --> 00:40:23,254
So choose well. That's good.

00:40:23,337 --> 00:40:24,797
All right. Final question.

00:40:24,797 --> 00:40:26,424
You've got a committee of three:

00:40:26,424 --> 00:40:30,177
your advisers for life, business,
spirituality, whatever you want.

00:40:30,261 --> 00:40:33,597
Tell me who these three people are and why

00:40:33,681 --> 00:40:35,141
who your three people are.

00:40:35,141 --> 00:40:39,895
Yeah,
your three advisors. Three advisors...

00:40:39,979 --> 00:40:40,729
My mom.

00:40:40,729 --> 00:40:48,237
Okay.

00:40:48,320 --> 00:40:50,531
And I want to say my sister. A sister.

00:40:50,531 --> 00:40:54,743
Okay. Well,
it could be a worldly sister and a brother.

00:40:54,827 --> 00:40:56,036
All right.

00:40:56,036 --> 00:40:57,496
But you're not going to name anybody.

00:40:57,496 --> 00:41:00,499
It can be in whatever capacity: could be
personal, could be professional.

00:41:00,541 --> 00:41:02,918
But I do believe

00:41:02,918 --> 00:41:05,087
the people who are close in my life,
they may not be born with me,

00:41:05,087 --> 00:41:07,339
but they're my brothers
and sisters, and we live that way.

00:41:07,339 --> 00:41:09,467
And some of them are the top CEOs.

00:41:09,467 --> 00:41:16,807
Some of them are surgeons, some of them are
in politics, and some of them are

00:41:16,849 --> 00:41:19,935
moms, and they're
all powerhouses in their own right.

00:41:19,935 --> 00:41:21,645
And I think it depends on what it is.

00:41:21,645 --> 00:41:23,647
But if you have one of each of those,

00:41:23,647 --> 00:41:27,151
you've got an army by your side
and there's not much you can't take on.

00:41:27,234 --> 00:41:29,695
Kalista,
such a pleasure having you on the show.

00:41:29,695 --> 00:41:32,573
So again,
it was great having you on.

00:41:32,573 --> 00:41:33,949
Thank you for listening.

00:41:33,949 --> 00:41:36,076
We're at Midwest Con 2023.

00:41:36,076 --> 00:41:38,204
I'm Rob Richardson, CEO of Disrupt Art.

00:41:38,204 --> 00:41:41,665
We are taping here
live at the Digital Futures Building.

00:41:41,749 --> 00:41:45,753
Be sure to check out more of our episodes
of the Disruption Now podcast.

00:41:45,753 --> 00:41:49,757
We have plenty of great conversations
just like this one with Kalista Zacharias.

00:41:49,757 --> 00:41:51,842
And this will be
great content.

00:41:51,842 --> 00:41:53,302
So save it.

00:41:53,302 --> 00:41:55,095
Also, you're going to learn
more about her company.

00:41:55,095 --> 00:41:57,056
You can look
more in the comments

00:41:57,056 --> 00:42:00,184
and the descriptions
and learn more about her.

00:42:00,267 --> 00:42:00,893
Also, you can learn

00:42:00,893 --> 00:42:03,229
more about Disrupt Art
and what we're doing to

00:42:03,229 --> 00:42:06,774
really change the future of events, change
the future of engagement,

00:42:06,857 --> 00:42:10,152
and make sure that we're empowering
creators all across the world.

00:42:10,236 --> 00:42:14,615
As always, we thank you for everything
you do, and keep disrupting.

00:42:14,657 --> 00:42:15,699
It's been awesome. Thank you.



Disruption Now Episode 161: Igniting Human Connection


Kalista Zackhariyas isn’t just envisioning a social media future centered on genuine human interactions – she’s crafting it. As the dynamic CEO of SparkSeeker, she’s pioneering the way.


Learn more at https://sparkseeker.com/


CONNECT WITH THE HOST

ROB RICHARDSON

Entrepreneur & Keynote Speaker

Rob Richardson is the host of the Disruption Now Podcast and the owner of DN Media Agency, a full-service digital marketing and research company. He has appeared on MSNBC and America This Week, and is a weekly contributor to Roland Martin Unfiltered.
