Meta CEO Mark Zuckerberg on Tuesday announced changes to content moderation on Facebook and Instagram that have long been sought by conservatives. FRANCE 24's Sharon Gaffney speaks to Henry Peck, Senior Campaigner on Digital Threats at Global Witness. He says that Meta, like X, is in a "race to the bottom" and is washing its hands of the responsibility to preserve a level of trust and safety and to provide robust content moderation.
Visit our website:
http://www.france24.com
Like us on Facebook:
https://www.facebook.com/FRANCE24.English
Follow us on Twitter:
https://twitter.com/France24_en
Transcript
00:00 This is Apropos. The European Commission is rejecting an assertion by Meta chief Mark Zuckerberg that EU data laws censor social media, saying the legislation only requires large platforms to remove illegal content. The company announced this week that it's scrapping its fact-checking service in the US in favour of user-generated community notes similar to those found on Elon Musk's X. Zuckerberg added that he would work with Donald Trump to push back on censorship around the world. Morgan Air has more.
00:37 The EU absolutely refutes claims of censorship. That's according to a spokesperson for the European Commission. Mark Zuckerberg, the CEO of social media giant Meta, accused the bloc of passing laws which institutionalise a suppression of free speech. He made the comments during a video message announcing the company would get rid of independent fact-checkers in the United States. Meta said the decision was to make freedom of expression a priority. The EU says it does that.
01:05 We don't ask any platform to remove any lawful content. We just need to make the difference between illegal content and then content that is potentially harmful. This is an in-between category, harmful for minors, harmful for our democracies. There we ask just platforms to take appropriate risk mitigation measures.
01:24 Meta has sent a risk assessment report to the bloc on how the changes to its fact-checking system would work if they were to be introduced in Europe. Such research is required under EU law.
01:35 Working with such independent fact-checkers can be considered as an effective way to mitigate systemic risks stemming from very large online platform services. For example, risks relating to disinformation, risks relating to electoral processes or civic discourse.
01:53 The tech giant plans to replace its fact-checkers with a community notes system. Users will moderate content themselves, but some experts warn this will remove safeguards.
02:03 Meta, by doing this, are retreating from fact; they're retreating from truth. And by switching to a community notes model, they're effectively trying to capture a tidal wave in a bucket, and it's not going to work.
02:17 The changes highlight the different regulatory approaches to social media companies around the world and the role these companies play when it comes to moderating and policing content.
02:28 To discuss the implications of the Meta move, we're joined now by Henry Peck, a campaigner on digital threats at the NGO Global Witness. Thanks so much for being on the programme with us this evening. Firstly, your NGO says this decision by Meta will make it more dangerous for women and for minorities to speak out online. What are your specific concerns?
02:51 We're concerned that by essentially outsourcing its content moderation responsibilities to users, Meta in the US is allowing for a far greater degree of harmful content to be directed against women, against people of colour, against scientists, climate activists, immigrants, people who are already disproportionately targeted with harassment and abuse online. And in this move, it's essentially washing its hands of a responsibility that all social media platforms have, which is to preserve a level of trust and safety and robust content moderation to make sure that there is a space in which people can participate inclusively and can have constructive, warm and positive discussions. Instead, we're worried that what this move is signalling is a real backsliding, a race to the bottom following X, in which the platform experience will degrade and decline. And this has serious implications for our spaces for civic discourse, as well as our democracies themselves.

04:06 And what kind of implications are you foreseeing then? We had a Nobel Peace Prize winner earlier also warning that this decision means that extremely dangerous times lie ahead for journalism, for democracy and for social media users right across the board. What implications do you see, Henry?
04:22 Well, Meta is no stranger to online harms becoming real offline dangers. We've seen Meta itself take some responsibility for the impact its platforms had on the genocide in Myanmar, also on ethnic violence in Ethiopia, and plenty of cases of teenage self-harm and suicide that have related to its platforms. So we're worried that this could presage a new era of genocidal violence elsewhere, of serious individual harm and injury, and of the targeting of users in a really vile way that leads to more deaths and violence.
05:14 And given those concerns, which have already existed for several years now when it comes to Facebook and Instagram, how effective have fact-checkers actually been at monitoring what is being put onto these sites?
05:29 Sure. Well, at Global Witness we've been studying the ability of these platforms to properly moderate content for four years now and have found many shortcomings and issues with their current systems. But abandoning systems that have taken huge amounts of labour and resources to build, and instead outsourcing them to users, is not a signal of an improved, refined process. Meta has the resources, based on its huge revenues, to properly invest in content moderation: training and funding content moderators, who by their very nature are independent, and providing them with adequate psychological support as well, whereas with users there's the possibility of the weaponisation of bias over the long term. Some of the material that comes up is really extreme and requires safe handling, and users may not be prepared for those encounters. And then, of course, there's also the possibility for falsehoods and lies to spread, and as Facebook and Meta's algorithms are designed to boost sensational content, this could become an even greater risk going forward.
06:51 Yeah, because Henry, already back in August your NGO had been criticising Meta for its decision to close its transparency tool, CrowdTangle, saying that it had been a very useful way to help understand how misinformation and disinformation are being circulated. So do you think this latest decision by Meta is part of a wider pattern of restricting access?
07:16 Yes, the closure of CrowdTangle was a huge disappointment and a backstep for transparency, for access to information, for journalistic investigations around what was actually taking place on Meta's platforms, something that is clearly in the public interest. So we see this as yet another backstep in terms of Meta's commitment to public safety and to a positive user experience on its platforms. We also fear that signalling this move just as the incoming Trump administration prepares to take office has implications for how Meta may respond to pressure from governments elsewhere, in parts of the world where there are not community notes teams or users accustomed to that kind of moderation, but instead more authoritarian governments that may put more pressure on Meta to cave to their will.
08:16 Yeah, because we saw Mark Zuckerberg, as part of his announcement, saying that he planned to work with Donald Trump to push back against what he described as foreign governments going after American companies to censor more. What do you think he was trying to get at there?
08:31 I think it was a blatant attempt to cosy up to the Trump administration, having previously been threatened by Trump with being imprisoned for life, and having previously banned Trump from his platforms four years ago. I think he does swing whichever way the political winds are blowing, and that it was a cold business decision; there wasn't so much principle in it as political convenience.
09:03 And does it suggest to you that other businesses are going to do likewise and, in effect, shift their priorities to adapt to the incoming administration?
09:14 Yes, it doesn't presage well for other companies reacting to the incoming administration and cosying up to it themselves. We've already obviously seen X and Elon Musk go all in with Trump's campaign and found it's been effective for them. And we're seeing this as a risk for other big tech companies as well.
09:37 And Henry, some free speech groups have actually welcomed this decision by Meta, saying that users actually want social media platforms that don't suppress political content. So what would your response be to that?
09:51 Absolutely. I think there's a lot to be said for allowing robust, healthy political debate on social media. And as we've found, the processes for moderating harmful content are imperfect. But there are ways they can improve them without throwing out the system they've built up over years, using independent fact-checkers who are professionals, trained trust and safety staff, and content moderators. Instead, they're essentially opening the sluice gates to a much worse user experience. The experience on X has declined substantially since its takeover, and it has become a cesspool of hate. So I think there's a concern that Meta's platforms will come to look more like that, and the user experience will actually be diminished.
10:41 Okay, Henry, we'll leave it there for now. Thank you so much for joining us on the programme this evening. That is Henry Peck, a campaigner on digital threats at the NGO Global Witness. Well, that is it from us for now.