PM warns AI safety should be world priority
Rishi Sunak says mitigating the risk of extinction from artificial intelligence should be a global priority alongside pandemics and nuclear war. Addressing The Royal Society in London, the prime minister also says he wants to be "honest" with the public about the risks of AI and emerging technology, but that it does not present "a risk that people need to be losing sleep over right now". Report by Covellm. Like us on Facebook at http://www.facebook.com/itn and follow us on Twitter at http://twitter.com/itn

Category: News
Transcript
00:00 I genuinely believe that technologies like AI will bring a transformation as far-reaching
00:06 as the Industrial Revolution, the coming of electricity, or the birth of the Internet.
00:11 It also brings new dangers and new fears.
00:15 So the responsible thing for me to do, the right speech for me to make,
00:20 is to address those fears head-on.
00:22 Get this wrong, and AI could make it easier to build chemical or biological weapons.
00:28 Terrorist groups could use AI to spread fear and destruction on an even greater scale.
00:34 Criminals could exploit AI for cyber attacks, disinformation, fraud, or even child sexual abuse.
00:41 And in the most unlikely but extreme cases, there is even the risk that humanity could lose control of AI completely,
00:49 through the kind of AI sometimes referred to as superintelligence.
00:53 Indeed, to quote the statement made earlier this year by hundreds of the world's leading AI experts,
00:59 mitigating the risk of extinction from AI should be a global priority,
01:04 alongside other societal scale risks such as pandemics and nuclear war.
01:10 Now I want to be completely clear.
01:12 This is not a risk that people need to be losing sleep over right now.
01:17 And I don't want to be alarmist.
01:19 And there is real debate about this.
01:21 Some experts think it will never happen at all.
01:25 But however uncertain and unlikely these risks are,
01:29 if they did manifest themselves, the consequences would be incredibly serious.