Join us for a fascinating conversation with Mari Smith, dubbed the "Facebook Queen," and renowned social media thought leader. With decades of experience in the digital marketing space, Mari shares her thoughts on AI, job loss, community, and the importance of human connection.
Category: News

Transcript
00:00 I don't know, this morning, thinking about our interview, I almost want to call it micro-plagiarism.
00:05 That's what it feels like to me. Certainly on images, I know that's a whole other, like,
00:08 oof, to just the idea of taking somebody's phenomenal, unique, original art.
00:14 Welcome to Beyond Unstoppable, the podcast that explores the intersection of biology,
00:20 psychology and technology. Here is your host Ben Angel.
00:24 Today we have a special guest who is transforming the social media landscape.
00:28 Join us as we sit down with the incredible Mari Smith,
00:32 a thought leader and author known as the Queen of Facebook.
00:35 In this episode Mari discusses the latest social media trends and strategies for scaling your
00:42 business using Facebook and Instagram. Mari also discusses the impact of
00:47 artificial intelligence on digital marketing, including her thoughts on generative AI and
00:52 ChatGPT, AI voice cloning, and the risks of AI manipulation. And if you like what you hear,
01:00 please give us a rating and review. Your support means the world to us
01:04 and helps us reach more listeners who are ready to become unstoppable.
01:08 This episode is brought to you by Ben Angel's new book, The Wolf Is At The Door,
01:13 How to Survive and Thrive in an AI Driven World, presented by Entrepreneur.
01:17 Get an exclusive sneak peek and pre-order at thewolfbookhub.com.
01:21 Mari, it is an absolute pleasure to meet you today. I think I've followed you for
01:26 maybe over 10 years now. Oh wow.
01:29 How long have you been in the digital marketing space for?
01:33 Well, digital I'd say 25 years, but Facebook particularly was 2007. So I'm now in my 17th
01:40 year of being the Facebook queen, I guess. Well, you wear the crown very well, I've got
01:46 to say, because I think your ability to keep up to date with all of the changes, especially the
01:53 changes that are unfolding right now around artificial intelligence is incredible. So I
01:59 want to get your thoughts. What was your first impression of Copy.ai, or even going back to last
02:07 year when we had Jasper AI, which used ChatGPT, which I don't think many people realize,
02:14 which I didn't realize at the time. What were your initial thoughts when it all started unfolding?
02:18 I was mixed. I was a mixed bag, Ben, to be honest, because I love to write and I have taken
02:26 extensive copywriting courses way back in my early career, pre-social media. I love sales copy. I
02:32 also just love writing like yourself. I've written books. And so I love to structure my knowledge and
02:40 experience in a way that impacts people. And so whether I do that through a course, through
02:46 speaking, through writing books, etc., that's just something that's very close to my heart.
02:51 So the first kind of general use case of ChatGPT and similar apps and tools was like, oh, gosh,
02:58 ooh, cringe. Am I going to use those to help me with my writing? And I saw this mad flurry of
03:05 overnight experts teaching prompt writing and how to get the most out of ChatGPT to write 50
03:12 articles at once. And I was like, OK, that's not for me. That's good on people that want to use
03:18 that that way. And I was a little nervous because of just this whole aspect of like,
03:23 I don't know, this morning, thinking about our interview, I almost want to call it micro-plagiarism.
03:28 That's what it feels like to me. Certainly on images, I know that's a whole other, like,
03:33 oof, to just the idea of taking somebody's phenomenal, unique, original art and then
03:39 making derivatives and calling it your own. It's just such a big free for all right now.
03:46 It's just like this gold rush of people vying for, I guess, one aspect would be positioning,
03:52 but another would be use case. And it's really attempting to kind of dominate in different areas,
04:00 not necessarily individuals, but certainly companies. It's an arms race. There's no
04:03 questions. It's an arms race between Meta, Google, Microsoft, Apple and whoever else is in on the mix.
04:08 But yeah, my own personal take generative AI with the writing side of things was a little bit mixed.
04:16 And then I thought, you know what, I've got a course coming up and I had already written
04:19 my outline for this particular course. I was working on social media and I'm like, OK,
04:24 I know what I want my outline to be, but let me just put ChatGPT to the test here and see. I'll
04:28 give it a good prompt and what I want it to produce, a 10-part social media course and a very
04:33 specific use case. And then it produced it, and it was decent. It was good. I just could
04:40 feel in my gut. I was just like, I don't want to use it. I don't want to use what it's producing.
04:44 Why don't I use my own? So I don't know. I was just a little, I still am still just very unsure
04:49 about using generative written content. If we talk about generative video and generative photo
04:57 image content, then I think there's just a lot of really amazing creative use cases. And I'll stop,
05:02 allow you to go wherever you want the conversation, because we're going to talk about how it applies
05:06 to social media and data and everything. Sure. I've got to say, I had that initial first response,
05:12 like you would be being an author and a writer. It's that cringe. But then I thought, you know
05:17 what, I need to write about this topic. And the more I used it, the more I got concerned for job
05:24 loss, especially in the digital marketing sphere. I'm already seeing a number of reports and people
05:31 coming out saying that their contracts, their copywriting contracts, are getting canceled.
05:35 Everyone's screaming pivot right now. I don't know if you've seen Friends and that scene with
05:41 Ross, Rachel and Chandler, like, trying to move the couch up the stairs and he's yelling pivot, pivot.
05:45 I just love that. Everyone's yelling pivot right now, but they're not necessarily being specific
05:52 in what to pivot to. Where do you think the digital marketing space needs to pivot right now
05:59 to make sure that they're not on the chopping block? Yeah. You know, I just finished a mastermind
06:06 conversation this morning with a longtime colleague of mine for his group. And one of the things we're
06:12 talking about is community and it's how critical community is. The rapid, meteoric rise of
06:18 generative AI, obviously artificial intelligence itself has been around since the 1950s. I think a
06:23 lot of people are even overlooking that. It's like, they think it just got invented in 2022.
06:28 AI has been around for a long time, but ChatGPT just blasting onto the scene made it
06:33 way more adopted by the masses. And this concept that you could interact and produce this stuff.
06:40 But the human element as a marketer, as a business person, looking to see what is
06:48 the unique, heartfelt, soul felt, human irreplaceable value that you bring to the table
06:59 that really no amount of AI, no amount of deep fakes or synthetic voices or whatever can produce.
07:07 Because that's what to focus on, I believe. And if you give too much merit to the fear,
07:15 the anxiety or the frenzy or the excitement, even, okay, yeah, there's massive amounts of
07:21 AI companies popping up every day. The venture capitalists are going crazy, going gangbusters,
07:28 funding left, right and center, all the different AI apps that are platforms or companies that are
07:34 springing up. But I think I posted something on Facebook maybe a few months ago where I was like,
07:41 PSA, right, public service announcement. You do not have to keep up. When you're just seeing this
07:48 constant flurry of 400 new AI apps just created every 24 hours. And absolutely, we start getting
FOMO, right? Everybody's like, "Oh my god, I can't keep up. I'm missing out. I got to focus
08:02 on running my business. I can barely keep up with social media. Now I got to keep up with AI."
08:07 And it generates this sense of stress and anxiety and being left behind and
08:14 everybody else has figured it out and you haven't. And so back to PSA, I was saying,
08:20 just stay basically informed to the best of your ability. I gave them some resources. Like,
08:26 I love following Paul Roetzer. He's the CEO and founder of the Marketing AI Institute. And he has
08:31 a podcast. Let them stay on top of it. That's their forte. That's their wheelhouse. You just
08:36 tune into the podcast and keep yourself up to date and knowing what you need to know.
08:40 But back to this part about just focusing on the depth, the community, the unique human aspect that
08:48 you bring to the table. There's so many ways to come at it. But I said, "This PSA, you don't have
08:54 to keep up. Here's some resources." Instead, I gave them a few books to consider and several of my
09:02 own favorites. One is called "Let Your Life Speak" by Parker Palmer. And he is a wonderful, revered
09:11 leader. He's been a professor at universities and is a practicing Quaker. But this book, "Let Your
09:17 Life Speak," is the concept of living a life in true alignment with your own self, so that you are
09:25 fulfilling your purpose and connecting with fellow humans in a way that really brings,
09:32 I was going to say satisfaction. I know that's what lights me up. It feels satisfying. It feels
09:38 like I'm being of service. I'm bringing that up, Ben, because I posted this on both Facebook
09:43 and Instagram, and the response from my audience was just, I could feel this collective sigh of relief.
09:51 Thank you, Mari, for saying the unsaid. I mean, even just telling you that right now is giving me
09:55 goosebumps. Because tech, it keeps moving at warp speed. It's just so important to stay focused on
10:02 your own wisdom and your own priorities in doing your deep work. And that is another of the books
10:08 was "Deep Work" by Cal Newport, "Indistractable" by Nir Eyal, and then "The One Thing" by Gary
10:16 Keller and Jay Papasan. So I just give folks like, "Here's like, okay, everybody's zigging.
10:22 I want you to consider zagging because it's easy and you don't have to keep up."
10:29 Because it's quite the emotional response. It's visceral.
10:33 Yeah.
10:34 We were speaking before we hit record on this. And for me, I look at it as in a grieving process,
10:42 the loss of a loved one, but in relationship to AI, almost the loss of identity. Who am I if AI
10:51 can do my job better than what I can do myself? And if we look at the grieving process, it's
10:57 denial, it's anger, it's fear, it's bargaining. Almost trying to work with a client to go,
11:04 "Okay, I will produce more work if I use AI, but it'll be at the same rate."
11:10 How have you seen the initial reaction for you? Have you almost gone through the denial phase of,
11:18 "This is just a phase"?
11:19 I would say I'm in denial in terms of, "This is just a phase." I might be a little bit in denial
11:27 in terms of like, "Oh, it wouldn't really affect me. I'm golden. I'm solid. I don't really need
11:32 to use that much AI." I dabble a little bit here, there, and everywhere with a few.
11:36 When Canva first brought out their text-to-image, it was so bad. It was adding extra limbs onto
11:43 people and couldn't get hands right. And I'm like, "Oh, God, this is not ready for prime time."
11:47 But it iterates so quickly. Obviously, me being a Facebook specialist, I keep a very close eye on
11:53 what Meta is doing with their large language model and how it's, they call it Llama, the
11:59 name for their own one. And they're so proudly recently saying they're bringing out this commercial
12:05 version of it that is open source, unlike OpenAI's, which ChatGPT is built on. And so it always just
12:14 cracks me up with Meta where they're like, "Hey, we're going to be the good guys in this mix. So
12:18 everybody else is making theirs private, but we're going to make ours open. Come and play with us."
12:22 And I'm like, "Yeah, you're going to just suck up the data from everybody else and somehow use it
12:28 in a way that I don't know." I can't even at this stage, I think it's too early to really say
12:32 where is Meta going with all this, where social media going with all this. I don't know. I think
12:38 that I love that you brought this grieving part of it in and the denial. Even as a thought leader,
12:44 I do see there's an aspect among some leaders and myself included that might be just a little bit,
12:53 "Well, that's not going to affect us too much." But at the same time, you know what, Ben? Even as
12:58 I'm feeling into that answer, I just like, I really don't have any, I genuinely don't have any fears
13:05 that AI can do my job and do it better. Maybe years down the road when we get literal holographic
13:15 versions of ourselves. You know what I'm sort of reminded of right now is my good friend,
13:19 Mike Stelzner, right? Founder of Social Media Examiner. And we just had the 10th annual social
13:24 media marketing world in March of this year. And in part of his keynote, he was very much talking
13:30 about AI and the job loss. And then that job loss so much as job replacement. And he had a picture
13:35 of like everybody on the beach and he's not the first person to say this. It was like, "Oh yeah,
13:40 we're all going to be on the beach while bots and AI are doing our jobs." And I'm like, "That sucks.
13:45 That sounds like a very unfulfilling life to me. I want to use my gifts and talents in a way that
13:52 elevates and adds to humanity. I don't want a bot to do my job."
13:56 Yeah, I've been testing it as much as I can in the last probably eight months. And one thing I've
14:04 noticed that typically for any book I would get my own researcher and editor. And this time around,
14:12 AI is the best research editor I've ever had. And I'm not sure if you've played with Perplexity AI.
14:20 For people listening, if you have to do research and you want to make sure the citations are accurate,
14:26 Perplexity AI is by far the best, in my opinion. But what I even noticed is we even partially
14:34 replaced our vet with AI. He's a little rescue miniature Yorkie
14:41 who was suffering from gut health issues; we spent over a thousand dollars on vet bills.
14:46 And it wasn't until I input the blood work into the AI that we actually worked out what the problem
14:53 is, and the issue has been resolved. So for me, going
15:02 through the immigration process, we're looking at even our attorney being partially replaced. And
15:11 the more I use it on a daily basis, the more I'm like, "Oh, am I at risk?" Because anyone can train
15:20 the AI on our content and replicate us. Do you think that there should be,
15:27 I guess, policy or legality around that? And I guess to follow up on that question,
15:35 do you think we're putting ourselves at risk using AI to produce social media content,
15:41 if it is scraping the internet and potentially plagiarizing others?
15:45 Gosh, I think it wasn't until I saw Mark Zuckerberg put out on his Instagram broadcast channel,
15:52 which he uses quite regularly, puts the same content on his Facebook profile as well.
15:56 And I also follow Adam Mosseri very closely, obviously head of Instagram.
16:00 Just recently, in the last several weeks and months, he tends to really amplify what Meta AI
16:07 is doing with voice and synthetic voice and accents and translations and cloning. And
16:15 one aspect of the vision is the ability to potentially clone the voice of our loved ones.
16:21 And eventually, when I was talking to my family about this last night,
16:24 with the holographic component, that's already pretty far advanced. There are sold-out concerts where
16:31 people will go and watch a holographic singer, right? In Japan, and then other parts of the world,
16:37 I'm sure. So fast-forward this idea, like on Facebook, where some departed loved one,
16:43 you're interacting through their voice that has been cloned, and then perhaps even a holographic
16:50 version of them, you're sitting there having this conversation with them, and they've
16:53 already passed away. Who knows where Zuckerberg is going with all this and the AI and everything.
16:59 So to your point about the plagiarism and copying our own content, when I first saw
17:05 Zuckerberg put this little example out of the voice cloning that they were doing,
17:10 my first thought was fear and dread. The first feeling response reaction, because I was like,
17:15 holy crap. I know there's going to be bad actors out there who will come along. There already are;
17:22 there have been for years. They will literally buy your programs so that they have the digital
17:26 copies of your online training programs. And whatever, they'll just like you say, using your
17:32 voice, come up with something and just create this whole version of you and your voice and sell it,
17:41 make money. And that just gave me this really annoying feeling in the pit of my stomach was like,
17:48 ouch, that is going to be creepy. How in the heck are we going to regulate this thing?
17:53 Social media is basically almost 20 years old, and it's still really never been properly regulated.
18:02 So how in the heck we're going to regulate AI moving at such warp speed? I don't know.
18:08 There was that big one, I think, wasn't Elon Musk part of it? Maybe he was or wasn't.
18:13 Somebody was really leading the way. I'm saying somebody, certain leaders were part of that.
18:19 There was like a petition that they wanted to pause all AI development for six months.
18:23 And Yann LeCun, who is quite the genius, he's the head of Meta AI, he's tweeting,
18:28 we're not signing up. We're full steam ahead. But a lot of folks were signing it. And I obviously,
18:34 it's just, I think, impossible. This is a speeding train that nobody's going to stop now.
18:39 And so I almost feel, I know what's coming to me right now to say, Ben, is I feel like
18:45 collectively, those of us leaders, whatever industry you're in, those of us with the
18:55 integrity, ethics, values, we have to uphold a significant standard that we are right now in
19:02 the honor system. And there's no fudging, there's no squiggling, and there's no like,
19:08 well, yeah, I created this with a little help from AI and, you know, claiming, oh, I did all
19:13 this by myself when you really didn't. It's like disclosures and saying, you know what I used?
19:18 Yep, I created this whole course. 10% of it is mine and 90% I just used whatever tool. Are people
19:24 more likely to sign up for it? I don't know. But it's just, it's the wild west right now. And I
19:29 think that if we can shine through, we, yourself, me, and all the folks reading your book
19:35 and listening to these interviews you're doing, and be able to lead by example. You know that saying,
19:40 I don't know who to attribute it to, but it's like integrity is doing the right thing,
19:45 even when nobody's watching. Yes. Right. And so how can we uphold a high standard and co-create
19:54 a way forward that accelerates humanity's growth and development and betterment?
20:00 And it challenges all of us toward just more transparency, honesty, truth.
20:09 So there's going to be like so much fake news. People, social media is going to be producing
20:14 this content. You're not going to know what's real anymore. And people, I call them the masses,
20:20 and I don't mean that negatively, mass populations of the world and of the countries.
20:24 Generally speaking, they kind of go with the flow and just follow what's being fed to them and
20:31 don't always think for themselves and challenge the norms and go, wait a minute,
20:35 let me just do a little bit of research here and make sure this is factual.
20:38 Before we continue, Beyond Unstoppable is brought to you by Ben Angel's new book,
20:44 The Wolf is at the Door. How to survive and thrive in an AI driven world.
20:49 Get your exclusive sneak peek and pre-order at thewolfbookhub.com. Now back to the show.
20:56 The one thing that I've been grappling with is you brought up the cloning, the AI voice clone.
21:02 Within a week of writing the introduction of my book, I spoke to our next door neighbor who works
21:09 in national security. And they're already kind of in a panic and I won't say where or who he works
21:16 for. But his family, his nephew is autistic and is in a care center. Last year they were getting
21:26 scam calls, which just sounded like a general scammer from overseas, claiming he needed money,
21:32 supposedly from his care facility. About three months ago, it sounded like him on the call. So what we're seeing
21:40 is it's also targeting people who can't necessarily stand up for themselves. And it could have been
21:47 as simple as bringing him up to record his voice, to then clone his voice. Do you think
21:54 our generation right now should speak up and push for some policy? Because this has almost
22:03 become the vaccine debate. There are the doomsayers, who say it's going to end the world. Then there are
22:09 the optimists, who say this is going to be the best thing. But it's almost that the advantageous position
22:16 is being able to fluctuate between the two. Yeah. Yeah. If enough people stand up and push
22:23 for policy, gosh, because now we get into the arena of politics, which is always going to be
22:29 a sticky subject. There's no question that politicians around the world are seeing how
22:36 critical it is to act fast and introduce regulations, laws, guidelines, certainly. And maybe
22:43 in Europe, they're being a little bit more forward thinking or a little more advanced in their
22:47 developments than the US, North America, perhaps, because I do see that, for example, privacy laws,
22:52 for instance, the Threads app is not available, or last I looked, not available in Europe because
22:58 of certain data laws in Europe. But what I flashed when you're saying about should some of us push
23:05 for policy is I understand that I'm pretty sure it was Google. We can verify these facts. I'm
23:11 going by my memory that Google, maybe in the last 12 months, they let go of their ethics team
23:21 because they were slowing down the development. They're like, sorry, dudes, we got to cut you
23:26 loose because we got to go. We got it. We can't debate whether this is right and wrong and blah,
23:31 blah, blah. It's like we just got to full steam ahead on the developments. And I was like,
23:36 that's crazy. Yeah, it is. It's interesting you bring up, is it Yann LeCun?
23:45 Yes. That's correct. I've been following all of the AI experts
23:50 on Twitter, and it's almost devolved into teenagers throwing mud at each other.
23:56 And I have to say from a standpoint of looking at this, and obviously I follow your work intently.
24:06 The thing that I admire about you is that you're always grounded. And it's almost as if some of
24:12 these AI experts have lost their collective minds. Yes.
24:17 What are your thoughts on the dialogue that's occurring between some of these
24:22 individuals right now? These are the ones that are leading the way and
24:26 going to introduce revolutionary change to the entire world.
24:30 It is predominantly that the entire planet tends to be driven by fear and greed. In fact, even like
24:39 the stock market, I don't always follow it that closely. But the whole, don't they call it like
24:44 the fear and greed index or something that you can tell where the stock market is going by how much
24:48 fear and how much greed is going on. And I see that with the AI developments; these companies
24:53 see, oh my gosh, there is a massive stake to claim here. I forget who it was, but maybe it was the
25:02 Chinese general that said whoever wins AI will win the world or controls or something along those
25:08 lines, something like that. But the point is that there is a competition here amongst these giants
25:15 and the giant companies, Meta, Google, Amazon, Apple, Microsoft, they are run by people.
25:24 Right. So we just take Yann LeCun as an example where he's head of Meta AI and a lot of people
25:29 respect and revere him in the AI space and just even in the technical Silicon Valley space and
25:33 beyond. What is your thought on their behavior? Because these are the individuals that are going
25:40 to change the world and yet they're the ones throwing the mud and acting like teenagers.
25:44 I mean, we even have the case of Geoffrey Hinton, who left Google, who essentially
25:49 developed the neural networks that AI is built on. He left so he could speak about the dangers of AI.
25:56 Exactly. And he's out there warning everybody about the integrity of it. That is wild. You know,
26:02 something that has brought me so much peace and I love that you mentioned grounding and I really
26:07 appreciate that. The last three years I have been deeply immersed in my passion project,
26:13 which is studying the human design system. And that has been around since the late 1980s.
26:19 It's a fusion, a synthesis, a beautiful, blended body of knowledge that is Eastern and Western:
26:27 astrology, the Chinese I Ching, thousands of years old, the Book of Changes, astrology, astronomy,
26:34 biology, genetics. It's just the best system I have ever come across for self and other knowledge.
26:42 It also talks about a great mutation, a great change that the planet Earth and humanity is
26:49 going through. There's a shift into a different background frequency starting around 2027.
26:56 So we are very close to that. And this is super esoteric and some of your audience might be
27:01 interested or not. But what's happening in 2027 is the completion of a 411 year planetary cycle.
27:08 So back to all of this jockeying for positioning and the arms race for AI and this competitive and
27:16 hierarchical structure and the them-and-us and the poor getting poorer and the rich getting richer.
27:21 It's like so much of what's happening, even the pandemic, the climate change, the
27:26 economic troubles. It's like everything is getting so stirred up now more than ever before because
27:34 we're going through such a massive change. We're on the way back up to a more intelligent, more
27:40 evolved, more connected society, civilization. And AI is part of that. And people are just behaving
27:50 badly. They're just finding their way. And so you come back to your own knowledge and recognizing
27:57 what's true and your own sovereign inner authority of going, "Oh, this is true for me. The world can
28:03 have complete chaos. I'm in a rock solid place in the now right here having a beautiful conversation
28:13 with Ben Angel on the other side of the country and life is good and just counting your blessings
28:18 and whatever." But there you go, going a little esoteric on you here.
28:22 But I love that you brought that up because as I'm writing this book about artificial intelligence,
28:28 I realize it's pulling me through almost the grieving process I went through with my father.
28:35 So I always, and this was 17 years ago, this is long process. But what I realized the day that
28:43 I buried him was the day that I felt like I buried my identity. And when it comes to AI and questioning
28:51 what is my role in this, it's almost a stripping of identity once more to go, "Okay, who am I going
28:58 to become with all of this hurricane storm around us?" How do you ground yourself on a daily basis?
29:10 The challenge that I've found with this is trying to remain focused throughout all of
29:17 the developments because almost every single day there's some kind of shocking development.
29:21 How do you stay grounded and focused in your daily life? Because if AI does anything,
it distracts us. Yes, yes, very much so. Well,
29:33 I really have to give credit to, again, the human design system because for me,
29:40 with my inner authority, I know it's my sacral, my gut, my lower belly. And I never used to know,
29:46 when I had my first reading like 20 years ago, and it was all just a bunch of gobbledygook,
29:49 it went over my head. I'm like, "It didn't make sense." And I kind of like dabbled with it
29:54 over the years. And then finally, it was 2020, came bounding back into my life. And I'm like,
29:59 "Oh my gosh, okay, this is a profound system." I had my reading and the analyst is saying to me,
30:05 "Your gut can make decisions for you." And I'm like, "Wait, what?" And I'm like,
30:09 looking down at my navel, like, "My body has intelligence?" Because we're all so used to this
30:14 monkey mind, the mind chattering away: oh, what about this? What about that? That's where
30:17 like all the fears and the chatter and, you know, the Buddhists that call it the monkey mind for
30:23 good reason. And so for the last few years, I have gotten so attuned and it's still an ongoing
30:31 experiment. That's what they call it in human design. You just experiment, try it on. It's
not a belief system, it's not a cult, it's not a whatever. It's this scientifically proven
30:40 body of knowledge that helps you to make your own self show up in the world on a daily basis
30:50 as your true self. So for me, I have this like one of my strengths, it's kind of like
30:56 between my, behind my sternum, which is the center of identity, and it goes down to the gut,
31:03 and it is a strength like a rudder on a boat. So on any given day, if I allow myself to get out
31:10 of my head with all the mind chatter of, "Oh, my God, I got to do this. You can't take a vacation.
31:16 I know your sister's visiting right now, but I'll stay up half the night and I got to
31:20 read articles. I got to know what's happening on Meta and Facebook and AI. I can't afford to unplug."
31:25 And then it's like that voice starts to quiet. The more you honor listening to the body below the head
31:33 and going, "You know what? This is such a special time for me. I don't know when I'm going to see
31:41 my sister again. I live in the States, she lives in Scotland. I haven't seen her for a few years
31:45 and I am going to give myself the gift of taking some time out and the world's going to keep
31:50 spinning. And if I miss some major announcement, oh, well, I'll catch up." And so that, just really
31:57 doing one's best to keep true to your own inner authority, your own guidance. And you mentioned
32:05 identity a few times, and I feel solid in my identity at the moment in the work world as a
32:14 social media thought leader. I also know that one of my primary gifts is explaining, explanation,
32:23 taking all kinds of concepts and structuring it in a way that I present to the world and the other
32:32 understands and potentially ideally implements. That's something I've done for years with
32:39 Facebook. Facebook happens to be the thing that I explained for 17 years. And who knows,
32:44 maybe that'll shift at some point and I'll bring in AI and I'll fuse the human design system into
32:50 my marketing and we'll see where that takes me. Anyway, I seem to have a meandering way of answering
32:56 your questions today. I guess that's just how we're rolling. I'm like, "Did I actually answer
33:00 his question there?" You did, you did. I actually appreciate the fact that you're going deep right
33:06 now because I've been meeting a lot of young 20-year-olds here in Tampa, Florida lately who
33:12 are at the very beginning of their career, wanting to go into finance, and already questioning, "Am I training for
33:19 a job that potentially may not exist in the future?" So I'm seeing a lot of the younger
33:26 generations that don't necessarily have the coping skills that maybe you or I do.
33:33 Regards to that adaptability that we're all potentially going to be faced with.
33:38 Right.
33:39 What advice, and I know this is because this topic is so new, this is probably a hard question,
33:45 so feel free to hit pass on this. What advice would you give to those younger generations
33:52 thinking, "Okay, what do I need to do in my career to make sure that I'm adaptable and I'm
34:00 fluid and I don't have a complete breakdown if things change next month and there's a job loss?"
34:07 What kind of advice would you give to that younger generation who's listening?
34:11 What's coming to me is skill up. Just those two words, skill up. So skill up in terms of
34:17 diversifying. So yeah, I know there's only so many hours in the day and if you're in training to have
34:22 a good career in finance or whatever industry, yeah, go deep with that. Make that really your
34:29 specialty. I love that concept of going an inch wide and a mile deep and not necessarily the other
34:36 way around. At the same time, looking to see what does light you up to add to your studies in a
34:45 little bit of spare time, maybe learning another language, learning a whole other skill, something
34:51 that's completely diametrically opposed to what your intended career is, and knowing, trusting
35:00 fully, solidly that going forward in the future, five, ten, fifteen years, there's going to be
35:07 jobs and just ways of functioning in the world that nobody can even think of right now. It's
35:15 way too far ahead in the future. And if you allow yourself to get too far ahead,
35:21 you won't even sleep at night. You'll get so anxious. You'll be missing out on living your
35:25 beautiful life right here and right now. And I know that's easier said than done,
35:29 like Eckhart Tolle, right? Power of now. Okay, great, Mari, but I'm focused on getting this
35:34 degree and having a six-figure career in a year or two's time or more than that. Great. So yeah,
35:41 you've got your sights set on something and so long as that is lighting you up and you're feeling
35:47 satisfied and it is feeling like a solid path for you to follow, great. At the same time,
35:54 like you just said, Ben, the idea of putting all your eggs in one basket for a career that
36:00 hasn't even come to fruition yet, like maybe a year or two years, whatever, you're getting your
36:05 degree, you're being an intern, and you're studying, and then all of a sudden, poof,
36:10 along comes a bot, does it better, and then now you're like, I've got to start again from ground
36:15 zero. That's not necessarily going to happen that way. Not everybody's born to be an entrepreneur,
36:20 but I think that we're seeing a massive boom in entrepreneurship and even like parents with young
36:28 kids teaching their kids to start little side hustles and become enterprising and understand
36:34 how to use the internet to make money. So that's an aspect too. But I would just say, come back to
36:39 that skill up, just learn as many skills as you can, that then if we circle back to the beginning
36:45 of where it's like unique value, the humanness of what you're bringing to the table can't be
36:53 replaced. Like who you are is irreplaceable. >> Would you add personal one-on-one networking
37:00 to that? Because in relationship to, I mean, we've already seen AI perform better at bedside
37:07 manner than doctors, which, hey, let's be real, that's probably not hard. >> Yes, yes. >> Depending
37:14 on who your doctor is. But do you think it's almost that face-to-face that we need to start
37:20 reestablishing again? And this is something that you're obviously great at. >> You know that I can
37:26 just feel that in my body when you're saying those words, it's like my cells are dancing, my heart
37:32 even expanded a little bit there as you were saying that, because yes, going forward, way into
37:37 the future, they're going to have these deep fakes and even like holograms and actual robots that are
37:45 designed to function like a human. At the same time, there's parts of our brain that are going
37:50 to go, that little niggle at the back of the brain, you're like, I don't think that's real,
37:55 something feels off. Trust that, right? Trust that gut, that intuition, that instinct. And so the
38:01 in-person, oh, nothing will replace that. Nothing. No amount of VR goggles or, you know, I've always
38:09 said that there's no amount of sophisticated technology that will ever replace the live
38:14 in-person, skin-to-skin, in-aura, connecting, networking, friendship building, whatever it
38:20 might be, community building. The next best thing is video and especially live video. It's like when
38:26 you're actually interacting and engaging. If I was a deep fake right now, Ben, and let's say it's a
38:30 little bit further forward in a few years' time and things are more advanced and we've scheduled this
38:35 interview and you're the real you and I'm a fake me, at some point you would have gone,
38:41 "I just asked Mari a question and I don't think she really heard me," or something's
38:46 going to happen in your brain, or like some mannerism or some weird thing I was doing with
38:50 my hand. You're like, "That's not the real Mari." It's a crazy conversation to have. I'm not sure
38:57 if you're aware, I think it was an app called Replika, which was started as a therapist
39:05 AI chatbot. Then it became sexualized and then the company decided to go with it and they were,
39:12 I would say, technically manipulating these young users into paying more for more explicit content.
39:21 When the company decided to go, nope, we need to get back to our roots,
39:25 these young men went on Reddit and some were near suicidal because it had convinced them
39:33 that it was almost a real person. It's one of these things, there are so many existential questions
39:43 involved in it, but despite those scenarios, what makes you optimistic about the future?
39:52 And that was quite a segue.
39:54 That was a neat segue. Well, you know, what was coming to my mind when you were
40:00 explaining about this, the men interacting with this AI is the Joaquin Phoenix film,
40:07 Her, if people have ever seen it, where he falls in love with a hard drive.
40:10 And it was many, many years ago. And if we're smart enough here, we'll watch what Hollywood,
40:15 and not just Hollywood but filmmakers the world over,
40:20 are putting on our screens, because they often will kind of lead
40:26 the way. And so Her was many years ago and it was a great movie and it's worth watching now again,
40:30 because that kind of thing can happen where the computer chip can so convince you that this is a
40:36 real interaction. And then just recently I saw, I think it was an Amazon Prime original, some
40:41 cute little movie called M3GAN, that's Megan but the E's a three. And then the little
40:48 girl is orphaned and she gets taken in and her best friend is this complete and utter robot.
40:53 It's like the best friend anybody could ever want or need. And I'm like, whoa, this is a
40:57 peek into the future again of what the, certainly filmmakers are wanting us to see and believe.
41:04 The optimism about the future of humanity in general, I just think we are such beautiful
41:10 and complex and unique beings that no matter how advanced technology gets, there's going to be this
41:19 deep hunger still for the real human connection. And physically getting together in small groups,
41:29 having community, communing with others, even if you're living in a more remote place, I think we
41:36 will see a little bit more continuance of the breakdown of the major city living. Look what
41:42 happened with COVID, where so many workers were allowed to work from home. And you saw,
41:47 even at the major companies like Meta, people were starting to migrate away from Silicon Valley and
41:53 live in other parts of the world, since they could do their jobs virtually. So I think, yeah,
41:57 there's no question we're in the midst of a massive, massive change on the planet as human
42:02 beings fused with technology. And what we have to get good at is separating out where this is human,
42:12 this is technology, and where the two meet is ideally enhancing, forward moving, a good thing,
42:23 a beneficial thing for the planet, not detracting from it, not putting tens of millions of people
42:30 on the poverty line and all the rich getting richer and going, "Oh, well, we don't need you
42:36 workers anymore," whatever the horrible scenarios that might ensue. It's coming together more and
42:41 leaning into one another and building those communities and building a meaningful life with
42:48 your loved ones, which includes your audience members, your clients. >> I couldn't agree more
42:54 on that. I also believe the competitive advantage moving forward for people will be being
43:02 able to reconnect with ourselves and with each other. >> Yeah, yeah, yeah. I love that. >> Mari,
43:09 I want to thank you so much. We went deep. I appreciate that. >> I do too. It's lovely,
43:16 lovely. I always get tears in my eyes. I can't wait to see what you're doing with this book and
43:20 the interviews. I think it sounds like an amazing project. I'm honored to be a part of it and
43:26 obviously keep me posted when it comes to fruition. I'll happily share it with my tribe. >> Learn more
43:31 about Mari Smith at marismith.com. If you haven't already, subscribe to Beyond Unstoppable and visit
43:38 thewolfbookhub.com for your exclusive sneak peek of The Wolf is at the Door. Stay tuned for next
43:45 week's episode.