How AI May Meddle With The Election Year

The rise of generative AI tools like ChatGPT has increased the potential for a wide range of attackers to target elections around the world in 2024, according to a new report by cybersecurity giant CrowdStrike.

Experts on AI explain how it may meddle with the 2024 election.
Transcript
00:00 2024 is going to be the first AI election that we've had in the United States and we should expect
00:08 that there will be attempts to use AI to attack our election infrastructure, to further poison
00:16 the information environment.
00:26 The 2024 election will be riskier than before because of artificial intelligence.
00:34 It's unprecedented. One thing that folks are very focused on is deepfakes, and deepfakes could be
00:41 used to make it look like candidates or other officials are saying something that they never
00:47 said. We've already seen that happen in other countries' elections. In Slovakia, we saw
00:53 a deepfake audio used against one of the candidates the weekend before the election.
00:59 And now we have a lot of powerful generative AI tools. You can use these models to generate
01:05 ads, images, and videos, and you can also ask the models to produce more engaging
01:14 content. The internet and social media are filled with AI-generated content. And
01:21 because that content is generated by AI models, if the models have certain biases,
01:28 those biases will manifest in the content. In previous elections, we did see social bots
01:37 used to spread misinformation. There are concerns that AI may make it easier for
01:44 bad actors to fool election workers by imitating people who work in elections in a phone call or
01:51 in an email and getting election workers to provide information that they shouldn't be providing.
01:58 Voter suppression is another area where I think people are worried that AI may be helpful to bad actors. We have
02:04 a history in the U.S. of robocalls being used to confuse voters about when to vote, social media
02:13 being used to provide false information so that maybe voters don't show up or are scared to vote.
02:18 I am worried that AI may become yet another stumbling block, where people don't really understand what AI is
02:23 or what security measures are in place to combat its dangers, and then start to believe
02:30 that AI has taken over our elections, or rely on AI to give them information. I used all of the
02:37 main commercial chatbots and asked them questions about the elections in different states to test
02:42 them, and all of them provided some wrong information. A bit of a flip side to
02:47 this that I think is really important is something called the liar's dividend. If you think
02:52 back to the 2012 election, which was Romney versus Obama, or the 2016 election, which was Trump versus
03:01 Hillary Clinton, in both of those cases there was audio that leaked of one of the major candidates.
03:09 There are 47 percent of the people who will vote for the president no matter what.
03:12 I moved on her, actually. You know, she was down in Palm Beach. If something like that comes out in
03:17 the future that is legitimate, it's going to be easier for candidates to deny that a video or
03:23 audio of them is actually real. It's critical to stop, double check yourself and do research to
03:34 find out whether or not what you're seeing on social media is accurate. A lot of election
03:38 officials and other politicians on social media have been verified. Watermarking, and requiring
03:46 some kind of authentication of video, audio, and images that appear on social media, is going to
03:54 be really critical in the long term. This is the reality: we have elections happening around the world over the
03:59 next year. All of the security measures that we've talked about that we need to put in place to
04:05 defend our elections, we have to double down on them and invest further over the course
04:09 of the next few months to make sure that the system is as resilient as possible.
