Online harassment, bullying … How do social networks protect us from it all?
Brut spoke to Antigone Davis, Vice President and Global Head of Safety at Meta (Facebook, Instagram, WhatsApp), about what goes on behind the scenes at her job ...
Transcript
00:00 Take a teen: you're at school, and people are making fun of your outfit all day long.
00:05 You come home, someone hits you up on one of our apps, and they say, "Nice outfit."
00:10 We can't know that that is bullying, but you do, and it hurts.
00:15 So we have a filter called Hidden Words, where you can put in words that are particular to what someone might be doing that could be harmful, or that you might feel sensitive about, and those comments will be filtered out of your experience.
00:29 So what we want to do is create the right rules, but also give you the tools to personalize that experience and make it a fun, positive, and safe place for you.
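Meta doesn't publish how Hidden Words is implemented, but the behavior described above, a user-supplied word list that hides matching comments, can be sketched in a few lines. Everything below (function names, word list) is a hypothetical illustration, not Meta's code.

```python
import re

def build_hidden_words_filter(hidden_words):
    """Return a predicate that flags comments containing any hidden word.

    Matching is case-insensitive and whole-word, so "nice outfit" hides
    "Nice outfit!!", while the word "nice" alone would not hide "Venice".
    """
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(w) for w in hidden_words) + r")\b",
        re.IGNORECASE,
    )
    return lambda comment: bool(pattern.search(comment))

# A user stung by the "nice outfit" taunt might configure:
is_hidden = build_hidden_words_filter(["nice outfit", "ugly"])

comments = ["Nice outfit.", "Great photo!", "ugly shoes lol"]
print([c for c in comments if not is_hidden(c)])  # ['Great photo!']
```

The personalization Davis describes lives in that word list: the platform can't know "nice outfit" is a taunt, but the user can encode that knowledge themselves.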
00:38 How does it work behind the scenes?
00:40 Beyond AI, are there people, humans who sit behind screens and sift through these reports, or has it all become automated?
00:49 No, it's a combination of artificial intelligence and human review that moderates the platform and creates a safe space.
01:00 Some content is obviously going to violate our policies, and some content may not be as obvious.
01:08 Artificial intelligence helps us find the content that clearly violates our policies and remove it, but there are instances where it's not clear, where the only way to know for sure is for a human to review it and take a little more time.
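The interview doesn't detail the pipeline, but the split Davis describes, automation for clear-cut violations and human review for ambiguous content, is commonly built as confidence-threshold triage. A minimal sketch with invented thresholds and a stand-in classifier:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    action: str   # "remove", "human_review", or "keep"
    score: float  # model's estimated probability of a policy violation

# Hypothetical cutoffs; real systems tune these per policy area.
AUTO_REMOVE_ABOVE = 0.95
HUMAN_REVIEW_ABOVE = 0.60

def triage(content: str, classify) -> Decision:
    """Auto-remove clearly violating content; queue unclear cases for people."""
    score = classify(content)                   # `classify` stands in for a trained model
    if score >= AUTO_REMOVE_ABOVE:
        return Decision("remove", score)        # obvious violation: AI acts at scale
    if score >= HUMAN_REVIEW_ABOVE:
        return Decision("human_review", score)  # not clear: a person takes more time
    return Decision("keep", score)

# Toy classifiers for demonstration only.
print(triage("obvious policy violation", lambda c: 0.99).action)  # remove
print(triage("nice outfit", lambda c: 0.70).action)               # human_review
```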
01:22 Correct me if I'm wrong, but I think we can sometimes have the impression that Meta and other platforms act in response to things that are happening rather than proactively.
01:34 Would you agree or disagree, and how is Meta trying to proactively tackle any issues that users might have on its platforms?
01:44 What I would say is that we are constantly looking at our platform to see how we can create a safe and positive experience.
01:52 We are also taking feedback and responding to people's experiences, and those experiences are evolving as the apps themselves evolve, so as a result, we evolve.
02:04 Sometimes that can take time, which is why I think people sometimes feel we are being reactive.
02:09 When we build a tool, we put a lot into building it.
02:13 We talk to experts, we talk to users, and we may test the tool to see what makes the most sense or works most easily.
02:21 How does the testing work?
02:23 I'll give you an example.
02:24 When we were building our parental supervision tools, we sat down with parents and with teens, and we wanted to understand from parents what would be useful to them.
02:35 We heard things like: I want to be able to manage my teen's time; I want to have a sense of who they're engaging with and who's following them.
02:45 That was the guidance we heard from them.
02:46 We also talked to experts about how to build these in a way that teens will want to use them and won't push them away.
02:54 How do we create the right balance?
02:56 Taking all of that information in, we then started developing our tools, and what we often do is launch a tool to a smaller group of the population, test it, see how people respond, and maybe make some changes to make sure the tool ends up in a place where people find it valuable.
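How a tool gets "launched to a smaller group" isn't specified here, but a standard approach is a deterministic percentage rollout: hash a stable user ID so each user consistently lands in or out of the test group. A sketch under that assumption (the feature name is made up):

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    """Deterministically place `percent`% of users in a feature's test group.

    Hashing (feature, user_id) yields a stable bucket in [0, 100), so a
    user always gets the same answer, and buckets differ per feature.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000 / 100.0  # 0.00 .. 99.99
    return bucket < percent

# Hypothetical 5% test of a "parental_supervision" launch:
testers = sum(in_rollout(str(uid), "parental_supervision", 5.0)
              for uid in range(100_000))
print(f"{testers / 1000:.2f}% of sampled users are in the test group")
```

Because the bucketing is deterministic, the same cohort can be observed over time and compared with the rest of the population before the tool is changed or rolled out more widely.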
03:16 How can people find out about all of these tools?
03:20 Where, concretely, can they go on the platforms to find out what they can use to limit their time or block out certain kinds of content?
03:29 If you can sum it up in two sentences, where can they go to find the tools they need to stay safe?
03:36 First and foremost, go to your settings.
03:39 You're going to find a lot there.
03:41 The second place I would point to is the Family Center.
03:45 Particularly for parents, you will find tips there, and you'll find explanations of all our tools.
03:51 So go to your settings, go to our Family Center, and you can always go to our Safety Center.
03:59 Those three places should really give you a full picture of what's out there.
04:03 How, then, do you explain that despite all of these tools, there are still so many people who feel like they are being harassed, or can be harassed, on Meta's platforms?
04:15 The internet is a very big place.
04:19 Just like in the offline world, you are going to have people who try to do bad things on the platform.
04:26 Just as offline, there's no chance we're going to get to a place where there's zero opportunity for somebody to do something that would bother or harm another person.
04:41 That said, we are using artificial intelligence to find things at scale.
04:46 We're using human review to find things that are specific.
04:50 We're giving people controls so they can personalize their experience where we may not have the information.
04:56 We're fundamentally committed to creating that safe and positive experience, to learning from our users where we can do better, and to evolving.
05:04 Many studies have shown that spending too much time online can have a negative impact on our mental health, our self-esteem, and so on.
05:13 So would Meta be in favor of users spending less time on social networks?
05:20 What is in it for Meta to encourage users to spend less time on its platforms?
05:25 It's one of the bigger misconceptions, I think, about what we want for our users.
05:32 We want people to have a positive experience while they're on the app, but also, when they walk away, to feel like they had a positive experience, that they haven't spent too much time on the app, and that they were able to manage their time.
05:46 We have different tools for that.
05:48 Take a Break, for example: if a teen has been on the app for a while, we'll say, hey, would you like to take a break?
05:56 We have another tool called Quiet Mode, to really encourage teens to turn things off at night, for example.
06:07 It allows them to send an auto-reply letting people know that they're taking a break and are not on their device, but it also lets them see, when they come back, who may have been in touch, so they don't have that fear of missing out on what was happening online.
06:24 We built that knowing it would reduce time spent, and in many cases it has reduced the amount of time people spend on the platform.
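Davis doesn't describe Quiet Mode's internals; the sketch below only mirrors the behavior she lists: suppress delivery during quiet hours, auto-reply to senders, and queue messages so nothing is lost. The class, schedule, and message wording are all invented:

```python
from datetime import datetime, time

class QuietMode:
    """Hypothetical Quiet Mode-style handler for incoming messages."""

    AUTO_REPLY = "I'm in quiet mode and off my device. I'll see this later!"

    def __init__(self, start: time = time(21, 0), end: time = time(7, 0)):
        self.start, self.end = start, end  # quiet hours, e.g. 9 pm to 7 am
        self.missed = []                   # reviewable when the user returns

    def is_quiet(self, now: datetime) -> bool:
        t = now.time()
        # The window wraps past midnight, hence the `or`.
        return t >= self.start or t < self.end

    def on_message(self, sender: str, text: str, now: datetime):
        if self.is_quiet(now):
            self.missed.append((sender, text))  # no fear of missing out
            return self.AUTO_REPLY              # sent back to the sender
        return None                             # deliver and notify as usual

qm = QuietMode()
print(qm.on_message("ami", "you up?", datetime(2024, 5, 1, 23, 30)))
print(qm.missed)  # [('ami', 'you up?')]
```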
06:32 The idea is not to spend as much time here as is humanly possible; it's actually to have a positive time, a positive relationship with our app.
06:41 Enjoy the time that you're on, build community, discover the things that interest you.
06:46 But walk away, go explore, take that into the offline world too.
06:50 We really want to give people the opportunity to manage and feel good about their time online.