Presenter: Wasim Khaled, Co-founder and CEO, Blackbird.AI
00:00Hello, everyone.
00:01Great to be here today.
00:01My name is Wasim Khaled.
00:03I am CEO and co-founder of Blackbird AI.
00:06Blackbird is a technology company
00:08that helps some of the world's largest organizations
00:11understand and respond to disinformation
00:14and misinformation.
00:17This is something that we're all dealing
00:18with on a daily basis, and this is a very important week
00:21to be here discussing this topic.
00:23Now, I am here today to talk about perception, lies,
00:28and AI.
00:30But before we do that, we're going
00:31to do a little experiment, since I've got such a great crowd
00:35here today.
00:36I'm only going to tell you two things.
00:39I'm about to play an audio phrase for you,
00:41and it's going to repeat.
00:43You're going to have to take my word for it
00:46that it never changes.
00:48I'm also going to flash a couple of phrases on the screen.
00:53So ask yourself, what do you hear?
00:58[The audio phrase repeats, unchanged, for about a minute while the phrases flash on screen.]
01:54So the thing is, no one likes to be manipulated.
01:58And so when we talk about disinformation and misinformation, it's really not about what
02:03is true and what is false.
02:04It is about manipulation of the narrative.
02:09And what we are experiencing in today's information ecosystem is nothing less than a battle of
02:14the narratives.
02:15It's what I have thought of, ever since we founded the company in 2017, as a cyber attack
02:21on human perception.
02:24Now, every organization has a narrative.
02:30And every narrative has a counter-narrative,
02:33sometimes dozens of counter-narratives.
02:36What everyone has to understand here,
02:38it is the conflict between the narrative
02:41and the counter-narrative that creates an opening for threat
02:46actors to polarize communities and society against one
02:50another, but also to drive some sort of
02:53ideological or financial gain.
02:57Now, people get sideswiped by what
03:01we call narrative attacks on an almost weekly basis now.
03:06So what is a narrative attack?
03:09So the way we define this is any kind of assertion
03:12in the information ecosystem that
03:14can drive harm against a person, place, or thing.
03:19Now, a thing could be something as innocuous as an HR policy
03:23or an ingredient in a product.
03:25But what you have to understand about a narrative attack
03:27is that a single post or comment can turn into something
03:30that creates immense damage, both financially
03:33and reputationally.
03:34Threat actors are just waiting for this opportunity.
03:38They use every tool available in the toolkit
03:41to make sure that the narratives they want seen
03:45are seen by as many people as possible.
03:48Now, they use bot networks.
03:50They use techniques to make sure that the online groups that maybe
03:55you want the least attention from are the ones you get the most
03:57attention from.
03:58They use generative AI to fabricate entirely warped realities.
04:02And they can do it fast.
04:06And these are not black swan events any longer.
04:09Since we started the company, you'd
04:10see one of these maybe once a year.
04:12Now we see them almost every week.
04:14I'm sure you recognize some of these.
04:16Billions and billions of dollars of damage
04:19have been caused by narrative attacks.
04:22There's just a name for it now.
04:26We also have generative AI.
04:28So when it comes to generative AI,
04:31we see fabricated images being used by threat actors
04:36to create entirely curated narratives that people cannot
04:41tell the difference between false and real.
04:44What I think is most important about this
04:46is it's low cost.
04:47It's scalable.
04:49And when you're on LinkedIn seeing all the cool things
04:52that generative AI can do for your company,
04:53know that these threat actors are also there watching
04:57and waiting and scaling their information operations
05:00campaigns.
05:01So it is no surprise that the World Economic Forum
05:06has declared misinformation and disinformation
05:08one of the largest global threats, the number one
05:10global threat in 2024.
05:13And this is across economic, technological,
05:16and societal categories.
05:18Now, I spent a little time thinking
05:21about what is an accessible case study that I could show
05:26this audience to really help them understand
05:28how a narrative attack works.
05:31We're, as you can imagine, in the midst
05:33of a lot of analysis for a lot of hairy topics and situations.
05:38Geopolitical risk, wars, vaccines, anti-vax.
05:45But there was one particular report
05:48that we put out that pulled me into more newsrooms
05:50and got me in front of more cameras
05:52than almost anything else.
05:54It's probably a hard one to guess.
05:55It's not wars.
05:56It's not vaccines.
05:58It was Taylor Swift.
06:02So stick with me for a moment here.
06:03If you're not following some of the weirdness,
06:08a Guardian poll recently showed that one in five Americans
06:12believe that Taylor Swift is a government asset and a PSYOP.
06:20Well, how did that come to pass?
06:24Well, it started with the usual suspects.
06:26It was a conspiracy theory in the dark web,
06:28moved to public telegram channels,
06:30landed on social media.
06:31It then moved into the mainstream media.
06:33And everyone started picking it up.
06:35I mean, this was a week before the Super Bowl.
06:37They needed to talk about Travis Kelce and Taylor Swift.
06:41Before anyone knew it, millions of people
06:45had been exposed to these narratives
06:48and to these conspiracies.
06:49And, you know, Marketing 101, you see something seven times.
06:53It sinks in.
06:54That's kind of how these work.
06:56But what was really happening in the information ecosystem?
07:01That's what's critical.
07:02And that's what we really focus on at Blackbird.
07:04So what you're seeing here is the information ecosystem
07:08as it was behaving during the lead up to that narrative.
07:11These jellyfish-like structures that you're seeing here,
07:14it's what a narrative actually looks
07:16like if you have the ability to visualize it.
07:18This is our Constellation platform.
07:21The tentacles are actually connections
07:23between multiple narratives and sub-narratives
07:26and counter-narratives.
07:28It's a pretty big mess unless you know how to read it.
07:30Now, the red that you see here are actually bot-like actors.
07:34Very important quality to understand.
07:36I'll talk a little bit more about why in a moment.
07:40This also enables you to pull out executive summaries
07:43to help you make strategic decisions
07:45across comms and threat intelligence
07:47and a host of other areas.
07:48Everything from insider threats to M&A,
07:51our platform has been utilized to make
07:53stronger, faster decisions.
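To make the idea of a narrative graph concrete, here is a minimal sketch of how narratives, accounts, and bot-likelihood could be wired together with networkx. This is not Blackbird's Constellation platform; every node name, score, and threshold below is invented purely for illustration.

```python
# Illustrative sketch only: a toy "narrative graph" in the spirit of the
# jellyfish visualization described above. All names and scores are made up.
import networkx as nx

G = nx.Graph()

# Narratives and sub-narratives as nodes, each with a hypothetical risk score.
narratives = {
    "psyop-conspiracy": 0.9,
    "election-angle":   0.7,
    "nfl-scripted":     0.6,
}
for name, risk in narratives.items():
    G.add_node(name, kind="narrative", risk=risk)

# Sub-narratives connect to the parent narrative (the "tentacles").
G.add_edge("psyop-conspiracy", "election-angle")
G.add_edge("psyop-conspiracy", "nfl-scripted")

# Accounts that amplified each narrative, with a bot-likelihood score.
accounts = [
    ("acct_001", "psyop-conspiracy", 0.95),
    ("acct_002", "psyop-conspiracy", 0.10),
    ("acct_003", "election-angle",   0.88),
    ("acct_004", "nfl-scripted",     0.05),
]
for acct, narrative, bot_score in accounts:
    G.add_node(acct, kind="account", bot_score=bot_score)
    G.add_edge(acct, narrative)

# The "red" nodes: accounts whose bot-likelihood crosses a chosen threshold.
bot_like = [n for n, d in G.nodes(data=True)
            if d.get("kind") == "account" and d["bot_score"] > 0.8]
print("bot-like actors:", bot_like)   # -> ['acct_001', 'acct_003']
```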
07:55Now, here you can see the narrative risk
07:58was almost 90%.
08:00Bot-like activity, around 25%.
08:04Important to know.
08:06If there are bots propping up narratives,
08:08you might not want to take the bait and react to it.
08:12Today you can make that decision.
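As a toy version of that decision, the sketch below combines a narrative-risk score with the share of bot-like activity and returns a recommendation. The thresholds are assumptions for illustration, not figures from the talk or the platform.

```python
# Toy decision rule: if a narrative is largely propped up by bot-like
# accounts, reacting may just feed it. Thresholds here are assumptions.
def should_respond(narrative_risk: float, bot_share: float,
                   risk_threshold: float = 0.8,
                   bot_threshold: float = 0.5) -> str:
    if narrative_risk < risk_threshold:
        return "monitor"              # low-risk narrative: keep watching
    if bot_share >= bot_threshold:
        return "don't take the bait"  # mostly synthetic amplification
    return "respond"                  # high risk and organically driven

# Roughly the numbers quoted above: ~90% narrative risk, ~25% bot-like activity.
print(should_respond(narrative_risk=0.90, bot_share=0.25))   # -> respond
```

With those example thresholds, the mostly organic amplification in this case would point toward responding rather than ignoring it.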
08:14And finally, cohorts.
08:15Cohorts are like-minded tribes of online actors,
08:20the ones you might want to avoid.
08:21Now, here we saw NFL fans and Swifties
08:25coming to Taylor Swift's defense in the narrative.
08:28However, there were Russian state actors
08:30amplifying these narratives.
08:32Why is that?
08:32Well, as it turns out, almost anything
08:35that gets enough eyeballs, you will have threat actors
08:39amplify those narratives, especially
08:42when you can polarize broader societal divides
08:45to create friction and ultimately erosion of trust.
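One generic way to surface cohorts like these from raw engagement data is off-the-shelf community detection over a co-amplification graph. The sketch below uses networkx's greedy modularity communities; the account names and edges are invented, and this only approximates the general idea rather than the method the platform uses.

```python
# Sketch of surfacing "cohorts" (like-minded tribes of online actors) with
# plain community detection. Account names and edges are invented.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
# An edge means "these two accounts repeatedly amplified the same posts".
G.add_edges_from([
    ("swiftie_1", "swiftie_2"), ("swiftie_2", "swiftie_3"),
    ("nfl_fan_1", "nfl_fan_2"), ("nfl_fan_2", "nfl_fan_3"),
    ("state_actor_1", "state_actor_2"), ("state_actor_2", "state_actor_3"),
    ("swiftie_3", "nfl_fan_1"),   # light overlap between two cohorts
])

for i, cohort in enumerate(greedy_modularity_communities(G), start=1):
    print(f"cohort {i}: {sorted(cohort)}")
```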
08:51Now, not every brand is Taylor Swift.
08:55When a narrative attack like this hits you,
08:57you can't always bounce back.
09:00And we've seen companies lose $20, $25 billion
09:03in a matter of weeks, and they never really bounce back.
09:08Give you another example, Silicon Valley Bank.
09:11A lot of people call this the first social media
09:13driven bank run.
09:14Now, I want to be really clear.
09:16If you had narrative intelligence
09:18to understand what was happening that week,
09:20it wouldn't have stopped the bank from failing.
09:22That's a whole different topic.
09:25However, let's say that you banked with Silicon Valley,
09:28or you were partners, or you were just
09:30looking to understand what was happening.
09:31You would have had a big, big early warning
09:34if you were able to understand what was really happening.
09:37Now, in this case, part of the narrative,
09:39it's just an interesting point.
09:41What you see coded in blue is a cohort
09:44that we label as anti-capitalists.
09:47They were driving narratives that
09:49created a lot of fear, uncertainty, and doubt.
09:52And that cohort will often appear
09:54in almost anything related to an M&A, banking narrative,
09:57et cetera.
09:58And then when you dig in, you also
10:00see all of the cohorts, the bot-like activity,
10:03and everything else that was happening within that ecosystem.
10:07Now, I want to be clear that Constellation is a tool
10:12that threat intel analysts and comms teams
10:15can use to make better decisions.
10:17We're also going to talk about one more tool
10:19that we have in just a few minutes that anyone can use.
10:24I also want to talk a little bit about deep fakes
10:26and generative AI.
10:28Now, I hear from our customers on an almost regular basis
10:31that they want to understand if there's
10:33deep fakes out there manipulating their narratives.
10:35That's an important thing to understand.
10:37But in a world where almost everything
10:39is going to have some amount of analysis or manipulation
10:43by generative AI, using legitimate tools, by the way.
10:46It could be Office 365 or all of the tools
10:48that integrate gen AI.
10:49Just knowing it's gen AI is not enough.
10:51Let's assume that all the content
10:53is going to have a percentage of gen AI analysis around it.
10:57It's important to know more so how that deep fake impacts
11:01the public and how far it's spread.
11:03Because if a deep fake falls in the forest
11:05and no one is there to hear it, it
11:07may not be worth responding to.
11:11Tools make this very easy.
11:13Things like this.
11:14I'm going to show you some magic.
11:18It's the real thing.
11:22I mean, it's all real.
11:28It takes 10, 20 seconds of video, a social media
11:31post with audio, and you can generate something
11:33like this in a matter of minutes.
11:35Anyone can do it, but you have to also see
11:37how it moves through networks.
11:41Now, you may be wondering, is my organization
11:45susceptible to these types of attacks?
11:47Can we see these things coming?
11:50I mean, I'm just going to tell you
11:51after talking to hundreds of organizations
11:53for the past six, seven years, more than likely you're
11:56not going to see it coming.
11:57And our customers are always asking us, are we next?
12:02There are three questions you can ask.
12:04Maybe if you're a CEO or a board member or someone
12:07who's in charge of communications,
12:11you can ask these three questions of your teams.
12:11The first, if we are under attack,
12:14how will I understand the contagion-like effect
12:18of narratives?
12:19How fast they're spreading, particularly when they're
12:21mutating and evolving.
12:25Then, how am I going to understand
12:27which of those narratives are bot-driven, synthetically
12:29amplified?
12:30You all are probably familiar with share of voice.
12:33Well, you have to be worried about synthetic share
12:35of voice.
12:36And finally, who and what are the hyper-agenda-driven groups
12:42that are propping up this narrative?
12:44And are they collaborating, which
12:47can cause even more harm to my organization?
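As a concrete reading of the second question, "synthetic share of voice" can be thought of as the fraction of posts about a narrative that come from bot-like accounts. A minimal sketch, with made-up post data:

```python
# "Synthetic share of voice": the share of posts about a narrative that
# come from bot-like accounts. The post records below are made up.
posts = [
    {"author": "acct_a", "bot_like": False},
    {"author": "acct_b", "bot_like": True},
    {"author": "acct_c", "bot_like": True},
    {"author": "acct_d", "bot_like": False},
]

synthetic = sum(1 for p in posts if p["bot_like"]) / len(posts)
print(f"synthetic share of voice: {synthetic:.0%}")   # -> 50%
```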
12:49If you can't understand these kinds of risk signals today,
12:52fast, at the moment of crisis, you are fighting blind.
12:58So high-fidelity, high-resolution technology
13:01must be deployed.
13:06I know I've thrown a lot out today.
13:09And it can sometimes seem insurmountable or daunting
13:12when you hear this all at once, particularly
13:13if you're not really looking at this space.
13:16In fact, it's been said that we, as a species,
13:20have primitive brains, medieval institutions,
13:24and godlike technology.
13:28There has never been a time where that is more true.
13:31We've seen it this week almost more
13:34than we have in the past six, seven years.
13:39In kinetic warfare, in cyber warfare,
13:42and now in information warfare, defense
13:45has always trailed offense.
13:48It might even feel like whack-a-mole sometimes.
13:50But if you have nothing in place to defend your organizations,
13:54and frankly, yourself, against manipulation,
13:58you will not be safe.
14:03Thank you very much.