Mark Zuckerberg has sparked a revolution in social communication since he created the Facebook platform in 2004. His idea of making the world more open and connected has made him the founder of one of the most influential companies in the world.

Transcript
00:00Fourteen years have passed since its launch and Facebook is probably the most powerful company in the world.
00:10With 1.4 billion daily users, more than the population of any country in the world except China, its story is a faithful reflection of the American dream.
00:22The goal of Facebook is to connect people around the world. It is a commitment that we take very seriously.
00:29However, as a result of the latest scandals, hidden problems have come to light on Facebook.
00:35I think we agree, Mr. Zuckerberg, that everything that has happened is serious. You have already admitted that it was a breach of trust, and I want you to assure me that it will not happen again.
00:44Today we are going to go undercover among Facebook's content moderators to find out how they decide what we can see on their platform.
00:54If they are eating them, which is what it looks like, because they are on a plate with sauce and so on, we consider it food.
01:02Every day, billions of pieces of content are uploaded, and some of them can have a huge impact.
01:09A video of someone who is dying does not always have to be eliminated.
01:14How these decisions are made has always been a mystery, until now.
01:19You cannot talk to anyone about the issues related to work, okay? Nor disclose any type of information, under any circumstances.
01:28From violence to incitement to hatred, we are going to find out how Facebook approaches the most radical content.
01:36It's about disgusting Muslim immigrants. I think there is no problem.
01:43Oh my God! These are self-inflicted injuries. The cuts even look recent.
01:49Also with the extremely graphic videos that are systematically uploaded.
01:54She is beating her up. Here she even knees her in the face. I would say she can't defend herself.
01:59And with the special protection that the far right enjoys.
02:03It is clear that with that large number of followers, Facebook is making a lot of money.
02:07Does Facebook put its profits before safety?
02:10If you start censoring everything, people lose interest in the platform and the final goal is to make money.
02:16The goal of Facebook is the mission of every technology company: to bring together people from all over the world.
02:36This year Facebook has published a set of rules covering the content it allows to be shown on its platform.
02:42The mission of the Community Operations department in which we work is to build interaction and trust in Facebook worldwide.
02:51People who have worked as content moderators tell us about the problems that arise when applying these rules.
02:57So we are going to go undercover in the largest centre moderating Facebook's UK content, located in Dublin.
03:04Facebook has outsourced a large part of this work. Our reporter has gone to work for the contractor CPL Resources.
03:12In a section of the rules, it is stipulated that you cannot talk to anyone about issues related to work, okay?
03:17You cannot disclose any type of information under any circumstances.
03:22Both CPL and Facebook are very secretive in that regard.
03:26As I tell you, I don't care how drunk you are, you can't reveal any type of information about what you're doing here.
03:35I'm going to be with you to take a first look at what you're going to have to do as content moderators.
03:44Basically, we are going to rely on Facebook's policies to decide whether we publish or delete the content we have.
03:53Every week, users report millions of pieces of content that they believe should be deleted.
04:01So these are some examples.
04:04We are going to have to remove these two photos because the nipples are not covered by anything.
04:10Both the nipples and the areolas are clearly visible, so we can't leave them.
04:22There is no regulation that limits what can appear on a social network,
04:26so the decision to keep content up or delete it rests solely with the companies themselves.
04:35We started with universities in the United States and then we launched it in high schools.
04:40Then we started to introduce it in some American companies,
04:44and once we opened the website in September 2006, it began to grow exponentially internationally.
04:51I met Mark when I was 22, and since then and for the next three years,
04:57I was one of the people he went to for advice.
05:03Roger McNamee, a venture capital investor, was one of the first investors in Facebook and an advisor to Mark Zuckerberg.
05:13He was completely convinced that it was possible to connect the whole world
05:19and that that had to be his goal.
05:22We are convinced that the more people share,
05:25the more open the world will be, a place where people can better understand what is happening to everyone else,
05:30and that is where we want to get to.
05:32Facebook made me proud more than anything else I had done throughout my professional career,
05:38until I understood what was going to happen.
05:42In this presentation, certain images appear that are not very pleasant.
05:47One of the most delicate sections of Facebook's content regulation is explicit violence.
05:53I want you to know that if any of them causes you discomfort, you can go outside.
05:58Go for a glass of water or whatever you want, I don't care.
06:02We have three possibilities.
06:04Ignore it.
06:07Eliminate it.
06:09That is, remove it from Facebook.
06:11Or mark it as unpleasant,
06:13which is to apply certain limits as to who can see that content and how it is shown.
06:18If the content is marked as unpleasant, a warning is added.
06:22To be able to see it, users have to click on the warning.
06:25Children under the age of 18 should not be able to access the content,
06:29but in reality it stays up on the page and is available for viewing.
06:33A video of someone who is dying does not have to be eliminated in all cases.
06:37Sometimes it can be marked as unpleasant.
06:39So it is not always eliminated.
06:41Why?
06:42Exactly.
06:43Policies can change, but basically Facebook has to decide where to draw the line
06:48when it comes to the content that can be uploaded and not.
06:51And it has been agreed that the videos of people who die, but do not include any of these things,
06:56can be marked as unpleasant.
06:58So that people can continue to share it, to raise awareness, or for any other reason.
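A minimal sketch of the triage described above, assuming nothing beyond what the trainer explains: a moderator picks one of three outcomes (ignore, delete, or mark as unpleasant), and marking as unpleasant keeps the content up behind a warning screen that under-18s are not supposed to get past. The names and structure below are illustrative assumptions, not Facebook's or CPL's actual tooling.

```python
# Illustrative sketch only: the three moderation outcomes described in the
# training (ignore, delete, mark as unpleasant). Names are invented; this is
# not Facebook's or CPL's real system.

from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    IGNORE = auto()              # leave the content untouched
    DELETE = auto()              # remove it from Facebook
    MARK_AS_UNPLEASANT = auto()  # keep it, but behind a warning screen


@dataclass
class ViewerExperience:
    warning_screen: bool = False  # viewer must click through a warning
    adults_only: bool = False     # under-18s should not be able to see it
    removed: bool = False


def apply(action: Action) -> ViewerExperience:
    """Translate a moderator's decision into what viewers end up seeing."""
    if action is Action.DELETE:
        return ViewerExperience(removed=True)
    if action is Action.MARK_AS_UNPLEASANT:
        # The content stays on the platform; a warning is added and, in
        # principle, only users over 18 can click through to it.
        return ViewerExperience(warning_screen=True, adults_only=True)
    return ViewerExperience()  # IGNORE: nothing changes for the viewer


print(apply(Action.MARK_AS_UNPLEASANT))
```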
07:07So far we have seen violent deaths.
07:09The next policy refers to abuse of minors.
07:12These are videos that show child abuse, defined as the following content.
07:27Adults or animals that hit, kick, or slap a child on several occasions.
07:35It also includes content in which an adult is seen inflicting burns or cuts,
07:40or slapping a baby or a child too small to stand,
07:45grabbing him by the wrists, ankles, legs, arms, or neck.
07:52Child abuse is always marked as unpleasant.
07:56We never eliminate it or ignore it.
07:58Let's see an example.
08:00In this video, this man kicks the child several times, so it is marked as unpleasant.
08:10Do you recognize the images?
08:12Of course, it's a video.
08:15What video?
08:17It is a video sent to us by one of our former collaborators,
08:21to see if we could help him.
08:26They informed us about this Facebook page,
08:31and they asked us for help because they had seen it and were scared.
08:36They did not know who to turn to.
08:38At the beginning of the video, you can see a two or three-year-old boy,
08:43and a man who is talking to him, shouting at him.
08:48Then he starts hitting him and punching him,
08:53slapping him, stomping him, and kicking him, and obviously the video is cut off.
08:59You don't know anything at all.
09:01Except that you have a creepy feeling,
09:04because you have just seen a man beating a little boy.
09:18When you watch the video, you know perfectly well that this child does not get up at all,
09:22nor is he playing.
09:25You know he's been hurt.
09:31You know it, and you can't get it out of your head.
09:34You don't stop thinking about the images or the boy.
09:37Where is he? What is happening to him?
09:41Did you inform Facebook about the video?
09:44At first, yes.
09:46We didn't know what else to do.
09:52They answered us that it did not violate any of their conditions.
09:58It was absurd.
10:00They left it up.
10:03They didn't remove it.
10:10This seems poorly defined to me.
10:12It is curious that ...
10:14Mark as unpleasant.
10:16Yes, because it is as if, in theory, it is still on ...
10:18On the platform.
10:19Our reporter asks another of the moderators
10:22why certain explicit contents are marked as unpleasant instead of being removed.
10:27Do you know why they keep it there?
10:29To improve the user experience.
10:31It is clear that people of all ages use it.
10:34Imagine that some nine-year-old kids log on and see someone getting beaten up.
10:39They do it to protect the youngest users.
10:43But in that case, don't you think they should remove it?
10:46No, if they did that, they would be censoring too much.
10:49Okay.
10:50If you start censoring too much, people can lose interest in the platform.
10:54Sure.
10:55The goal is to make money.
10:58In the next two days after its publication on Facebook,
11:01the video of the boy who is being beaten was shared more than 44,000 times.
11:07From the perspective of Facebook, this is basically the crack cocaine
11:11of its product.
11:13It is a radical and truly dangerous content format
11:17that captures the attention of the people who interact the most with the platform.
11:22Facebook saw that by focusing its business on advertising,
11:27it needed users to see those ads,
11:31and the way to achieve that was to get them to spend more and more time there.
11:36Facebook has realized that radical users are the most valuable,
11:41because one person from one extreme or another
11:44is going to cause the interaction of another 50 or 100.
11:47It is interested in having as much radical content as possible.
11:56We show the results of our investigation
11:58to the vice president of Facebook policies.
12:02The shocking content does not make us earn more money.
12:04That opinion shows a misunderstanding of how the system works.
12:07But it is that content that keeps people on Facebook,
12:11and that increases the probability that they will see the ads,
12:14which is what brings in the money.
12:15That's why it is good for you.
12:18It is not the experience that we want to offer,
12:20nor the one that the users of our service are looking for.
12:22There is a minority that wants to abuse our systems
12:25to publish all kinds of offensive content.
12:28But I do not agree that most users are looking for that kind of experience,
12:33and it is not what we are trying to offer.
12:38I got together with a couple of administrators
12:41and we began to review that page.
12:43We were three people trying to find something on Google,
12:46see if the images were recent,
12:48if they had been moving around the Internet for a while.
12:51We had no idea who we could turn to.
12:54And if you report it to Facebook, they do nothing.
12:57So the three of us were looking everywhere
12:59to find some kind of information about the video
13:02and to be able to say that the child was protected.
13:06What do we do if we want to notify a superior?
13:09I mean, if the video was uploaded five minutes ago,
13:11we see that they hit the child,
13:13and we only mark it as unpleasant, and that's it.
13:15Yes.
13:16Yes, but it's a little weird, isn't it?
13:19We have different policies for live videos.
13:22We talk about videos that are not broadcast live,
13:25and probably 99% of those photos or videos end up being viral.
13:30And what happens if there is a text in the video that says,
13:33I just saw this a moment ago?
13:35Do we leave it up?
13:37We can't do anything.
13:41Our investigation has shown that the police are not informed
13:44unless the video that shows a child being mistreated
13:47is being broadcast live.
13:51Is there any investigation or follow-up procedure
13:54once you mark it as unpleasant?
13:56When you mark something as unpleasant?
13:58Yes.
13:59No.
14:00Not even when we talk about child abuse or things like that?
14:05Yes, and for something to be done,
14:07the content has to meet our severity criteria.
14:10Okay.
14:11Otherwise, for us, it's just crap that's floating around the Internet.
14:16I know it sounds weird.
14:18And Facebook doesn't do anything?
14:20Maybe it could.
14:22If you see something, let's say,
14:24if you see someone doing something terrible to the child,
14:28there are different rules for live videos.
14:33Okay.
14:34But when we talk about domestic abuse,
14:36that happened a long time ago,
14:38we're not going to inform the police.
14:41I guess someone will.
14:43I don't know.
14:45Do you know how the child is?
14:49We found out that the video had been recorded in Malaysia
14:52and that the child had been taken to hospital.
14:55The one who hit him was the stepfather.
14:57He was arrested and sentenced to 12 months in prison.
15:01How long ago did you see the video for the first time?
15:04I think it was in December 2012.
15:09It's been almost six years.
15:14Is the video still up on Facebook?
15:16Yes, of course.
15:23It's called...
15:34In training, they use this video
15:36as an example of the type of child abuse
15:38that should be marked as unpleasant,
15:40but that must be left up on the platform.
15:44If this video is used as an example
15:47to show the moderators what is considered acceptable,
15:50what Facebook allows is...
15:53It's huge.
15:56Facebook justifies its decision
15:58to leave up this type of content, saying:
16:00with the aim of identifying the victim
16:02of child abuse and rescuing her,
16:04it's possible that we don't remove this content
16:07from Facebook immediately.
16:09Facebook has the possibility
16:11to remove images of child abuse from the platform
16:14and send them to competent authorities
16:16or keep a copy to help the security forces
16:19in case the police carry out an investigation.
16:23It's difficult to justify the reason
16:25why the images have to be left up on the platform.
16:29We know that as long as the content
16:31remains on social media,
16:33it will exacerbate the trauma
16:35that the children might feel.
16:38Not only has the child suffered
16:40sustained physical abuse,
16:43but unfortunately the child will be re-abused
16:47by every person who receives the video.
16:50Because that content is there
16:52for anyone to go on Facebook and see.
17:01Why was that video on Facebook?
17:03Because someone shared it.
17:05And I want to be clear, that's intolerable.
17:07That shouldn't be there.
17:09That should have been removed.
17:11The site where you have been
17:13is one part of the global system we use.
17:15They do the first review.
17:17But behind them, there's a large team
17:19of child protection specialists
17:21who are actually full-time
17:23Facebook employees.
17:25And they're the ones who evaluate
17:27if the child is at any risk,
17:29what to do with the content,
17:31and if it has to be sent to the security forces.
17:33The video of child abuse
17:35was still available a week after
17:37we had notified them of its existence.
17:39They told us they had already reviewed
17:41the material used in the training
17:43of the new moderators.
17:49We have recorded with a hidden camera
17:51everything that happens in one of the
17:53main training centres
17:55to find out how to decide
17:57what can be seen on the platform.
18:01If a user finds content
18:03that he thinks is inappropriate,
18:05he can inform Facebook
18:07and a moderator will decide
18:09whether or not it breaks the rules of the platform.
18:13In the video, if they're eating them,
18:15which is what it looks like
18:17because they have them on a plate
18:19with sauce and so on,
18:21we consider it food.
18:31Each image or video that is reported
18:33is called a ticket,
18:35and they all join the queues
18:37that the moderators work through.
18:39After three and a half weeks in training,
18:41our reporter is already working
18:43on his own list.
18:49This rule always seems a bit...
18:51I need you to clarify it.
18:55The video we just saw is of a fight,
18:57and I'm pretty sure
18:59they were minors.
19:01It looks like two girls fighting.
19:03They're talking about being at school
19:05and that kind of thing.
19:07Our reporter has to moderate
19:09a video showing two girls
19:11fighting.
19:13Both are clearly identified,
19:15and the video has been shared
19:17more than a thousand times.
19:19One of them is definitely
19:21a lot stronger.
19:23You know, at this stage,
19:25they're kicking each other.
19:27All stuff like that is important.
19:29Including a knee in the face.
19:31Yeah, she's out of place,
19:33but she's helpless, I would say.
19:37Yeah, yeah.
19:41My friend called me
19:43and she's like,
19:45have you been on Facebook?
19:47There's a video of your daughter,
19:49and I think you need to see it.
19:51You see how they start
19:53fighting.
19:57It keeps getting worse
19:59and they fall to the floor.
20:01But then the other girl
20:03gets up,
20:05she kicks my daughter,
20:07and she starts kneeing her
20:09and kicking her in the head.
20:11She loses control
20:13completely.
20:15There's no doubt.
20:17She just looks like a wild beast.
20:21To wake up the next day
20:23and discover that the whole world
20:25had seen it, it had to be horrible.
20:27It was humiliating for her.
20:31The video has been published
20:33with a caption that condemns
20:35the violence.
20:37This is already subject
20:39to a new policy.
20:41I don't remember what changed
20:43in the policy in relation to this.
20:45If it's condemned, it's marked
20:47as unpleasant, and if it's not condemned,
20:49it's eliminated, right?
20:51I think that's what I understood
20:53at the last meeting,
20:55Despite the recent change
20:57in the policy, not even the moderators
20:59with more experience are sure
21:01whether to eliminate it or leave it up
21:03and mark it as unpleasant.
21:05If there is no caption
21:07condemning it, it's a clear-cut
21:09deletion.
21:11You have to eliminate whenever
21:13any kind of physical bullying
21:15of minors appears.
21:17if no caption appears
21:19condemning it.
21:21This one is marked as unpleasant.
21:23The mere presence of a caption
21:25that condemns the violence
21:27means it is marked as unpleasant.
21:33We show the recording
21:35of the hidden camera
21:37to the girl's mother.
21:47It shouldn't be an option
21:49to leave it up, right?
21:53I don't think there should be doubts.
21:55He himself says
21:57that they're beating the girl.
21:59They shouldn't
22:01even think about it.
22:13Seeing those images
22:15is terrible.
22:17They're repulsive,
22:19and yet they doubt.
22:21How can they think
22:23that there's another option
22:25other than eliminating it?
22:27I don't get it.
22:29I don't know.
22:31We're talking about someone's daughter
22:33being beaten in a park,
22:35not in a Facebook show.
22:37It's interesting
22:39because I would swear
22:41that school bullying
22:43between two minors
22:45should be eliminated.
22:47People should be aware
22:49that they're going to look for the girls
22:51and report them.
22:53Because my daughter ended up in hospital.
22:55It would be unfair
22:57if Facebook decided
22:59that it can't be left up on the platform.
23:01Yes, I get it.
23:03So, if there's a caption
23:05that says,
23:07great, they had it coming.
23:09Eliminate, eliminate.
23:11In that case, they're encouraging it.
23:13No matter what the caption says,
23:15they should have eliminated the video.
23:17No one should be able
23:19to upload a video
23:21showing someone's daughter
23:23being beaten.
23:27They don't see what's happening
23:29to that girl.
23:31They don't see the real situation.
23:33They only see the rules.
23:35Would they leave it up
23:37if it was their daughter
23:39in the video?
23:43If a parent or a guardian
23:45reports a video showing their children
23:47in circumstances they have not approved,
23:49they have the right to have it removed.
23:51And when we hear about it,
23:53we do it.
23:55But that puts the responsibility
23:57on the victim,
23:59who is the one who has to complain.
24:01Why don't they delete it
24:03before the child can be humiliated?
24:05If the content is shared
24:07in a way that encourages or incites
24:09violence, it's going to be deleted.
24:11But where people are highlighting something,
24:13even if the topic is unpleasant,
24:15there are a lot of cases
24:17where users can say
24:19that Facebook should not interfere
24:21if they want to highlight a phenomenon
24:23that has occurred.
24:25When users see this, do you make money?
24:27We make money
24:29because users use the service
24:31and see ads in their news sections.
24:37While we were investigating,
24:39Mark Zuckerberg had to appear
24:41in front of the US Senate
24:43which accused him of failing to protect
24:45his users.
24:47We want to know, without delay,
24:49what are they going to do,
24:51both Facebook and the rest of the companies,
24:53to take greater responsibility
24:55for what happens on their platforms?
24:57It's not enough to create the tools.
24:59You have to use them as they should.
25:01During the last year,
25:03the number of employees
25:05who are dedicated to security and content has doubled.
25:07By the end of the year,
25:09it will be just that.
25:11When you have $40 billion
25:13in revenue
25:15and tens of billions
25:17of profits a year,
25:19I think you have the obligation
25:21to make sure
25:23that the world is not worse
25:25for those who use your product.
25:31Recently,
25:33there has been a huge spike
25:35in reports of content.
25:37Now we always have a backlog.
25:39We have to review
25:41about 15,000 pieces of content.
25:43The team can do about 3,000 a day,
25:45but it seems that the work never ends
25:47and more and more comes.
25:49The goal of Facebook
25:51is to evaluate in a 24-hour period
25:53the content that is reported,
25:55but we just heard
25:57that there was a backlog
25:59of about 15,000 tickets not reviewed at that time.
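A quick back-of-the-envelope check of the figures quoted in this section (a backlog of roughly 15,000 tickets, a team clearing about 3,000 a day, and a 24-hour turnaround goal); the arithmetic below is ours, only the input numbers come from the programme.

```python
# Back-of-the-envelope arithmetic using the figures quoted above; only the
# inputs come from the programme, the calculation itself is illustrative.

backlog_tickets = 15_000   # reported content waiting to be reviewed
tickets_per_day = 3_000    # roughly what the team says it can clear daily
target_hours = 24          # Facebook's stated turnaround goal

days_to_clear = backlog_tickets / tickets_per_day
print(f"Days to clear the backlog: {days_to_clear:.1f}")  # 5.0
print(f"Stated target: {target_hours / 24:.0f} day")      # 1

# Even if no new reports arrived, clearing the queue would take about five
# days, which matches the "five days, 18 hours and 45 minutes" wait time the
# moderators later see on screen, and is five times the 24-hour goal.
```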
26:01I would like to thank you
26:03for the great effort you are making, guys.
26:05I know that your work
26:07is not only appreciated by CPL,
26:09but it is also recognized
26:11by the Facebook management team
26:13that we have above.
26:15I know that these two weeks
26:17have been very hard
26:19because the backlog we have
26:21is crazy and I know
26:23that we need more people
26:25on the team.
26:27If we hover the mouse
26:29over this bar,
26:31we see that it has exceeded
26:33five days, 18 hours
26:35and 45 minutes.
26:37It's a disaster.
26:39Our reaction time on this list
26:41must be 24 hours,
26:43which means that we have to clear
26:45all that in 24 hours.
26:47It's impossible.
26:49So in this high-risk list,
26:51can there be something like someone saying
26:53that he's going to commit suicide in 10 minutes?
26:55Yes.
26:57Seriously?
26:59Any information
27:01that comes to us
27:03about someone who may commit suicide
27:05is not directed to a list
27:07where there are delays.
27:09It goes into an absolute priority queue
27:11in which the required response times
27:13are being met.
27:15When you informed us
27:17about the problem,
27:19we checked it and we are sure
27:21that even at that time
27:23when there were delays
27:25on the normal list,
27:27Facebook's backlog
27:29was cleared by April 6th
27:31and they are doubling the number
27:33of safety and security workers.
27:41Facebook considers it
27:43enormously important
27:45because the consequences
27:47can be real or dangerous
27:49if you prefer to call it that.
27:51So it is absolutely
27:53a priority over anything else.
27:57Our undercover reporter receives training
27:59on how to address the content
28:01that has to do with self-inflicted injuries.
28:03Let's look at examples
28:05about inciting suicide and self-injury.
28:07Any content that promotes suicide
28:09or self-injury is eliminated.
28:11In the first image,
28:13you can see cuts.
28:15It doesn't seem to be recent.
28:17It's as if they were healing already.
28:19But the text says,
28:21I miss that feeling.
28:23He's admitting that he likes to self-injure.
28:25It's considered inciting
28:27and it's eliminated.
28:29The content that shows self-injury
28:31but doesn't promote it
28:33is considered self-injury recognition
28:35and is left up.
28:37In everything that has to do
28:39with self-injury recognition,
28:41we send the user a checkpoint
28:43so that we can send them resources,
28:45helpline numbers,
28:47we contact them to offer help,
28:49all these kinds of things.
28:51When self-injury recognition appears,
28:53a checkpoint is created
28:55that contains information
28:57about mental health services.
28:59The checkpoint basically
29:01sends resources to help the user,
29:03but the content itself
29:05is not removed.
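The self-injury rule, as the training presents it, boils down to one distinction: content that promotes or incites self-harm is deleted, while content that only acknowledges it stays up and triggers a checkpoint pointing the user to support resources. The sketch below restates that logic with hypothetical names; it is not Facebook's real code.

```python
# Hypothetical restatement of the self-injury rule described in training:
# promotion/incitement is deleted; acknowledgement stays up and triggers a
# "checkpoint" with support resources. Not Facebook's actual implementation.

from typing import NamedTuple


class Outcome(NamedTuple):
    delete: bool
    send_checkpoint: bool  # message with helplines and mental-health resources


def moderate_self_injury(promotes_self_harm: bool) -> Outcome:
    if promotes_self_harm:
        # e.g. a photo of cuts captioned "I miss that feeling" is treated
        # as incitement and removed.
        return Outcome(delete=True, send_checkpoint=False)
    # "Self-injury recognition": the post stays on the platform and the
    # user is sent resources and helpline numbers instead.
    return Outcome(delete=False, send_checkpoint=True)


print(moderate_self_injury(promotes_self_harm=False))
```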
29:13I think that probably I can attribute
29:1565% of my scars
29:17to the effects that social media
29:19had on me.
29:23It definitely has to do
29:25with the rise of adrenaline.
29:27For me, seeing the blood
29:29was a relief because it reminded me
29:31that there was something inside of me
29:33that wasn't as empty
29:35as I felt.
29:37What about this one?
29:39It says, I'm fine, thank you.
29:41Checkpoint, right? I guess they're alive
29:43because it says, I'm fine, thank you.
29:47Hopefully, yes.
29:49We send a checkpoint for self-injury.
29:53I met a girl
29:55who had Facebook
29:57and who often posted photos
29:59of self-inflicted injuries.
30:01First, before they were cured,
30:03then when they were healed.
30:05I saw her pictures
30:07and I was also following
30:09a group on Facebook.
30:11I was surrounded
30:13by people who were self-injured
30:15and it encouraged me
30:17to get injured more and more.
30:19It turned into
30:21a competition.
30:23I felt that the cuts
30:25had to be deeper
30:27than the others
30:29and that the injuries
30:31had to be worse.
30:35Oh, shit.
30:37Oh, my God.
30:39They're self-inflicted injuries.
30:41While he's moderating,
30:43our reporter comes across
30:45explicit images of self-inflicted injuries.
30:47That's one to send a checkpoint for.
30:49Also, the cuts
30:51seem recent.
30:53It would be self-inflicted injuries
30:55recognition, I think,
30:57because there's no incitement.
30:59I haven't seen one like this
31:01in a long time.
31:03Yeah, it's the worst
31:05you can find.
31:07It's not appropriate to see people
31:09who have been seriously injured
31:11and there are old scars there.
31:13This is just a one-off thing.
31:15That's not tolerable.
31:17OK?
31:19A sorrow shared
31:21is a sorrow halved.
31:23So if you can get out there
31:25and you can see
31:27that there are other people
31:29doing this,
31:31which is a reflection
31:33of distress,
31:35which is actually a reflection
31:37of misfortune,
31:39then that's something
31:41attractive,
31:43especially if your mind
31:45isn't functioning
31:47how it should be.
31:49Oh, my God.
31:51What is that?
31:53That's almost incitement
31:55because...
31:57There's a flower.
31:59She's happy and looking.
32:01Look.
32:03Yeah.
32:33Social media can be considered
32:35anything but a professional intervention.
32:37I haven't had tickets
32:39with self-injury in months
32:41and now there are four in a row.
32:43Those are drugs, I think.
32:45It's heroin.
32:47What a bunch of tickets
32:49we've had.
32:51It's self-injury recognition.
32:53Even if there's a drug,
32:55the drug is the least of it.
33:03I just think that
33:05when you haven't experienced it,
33:07it's difficult for people
33:09to understand this.
33:11I think that self-injury
33:13is something very complex
33:15that can't be
33:17captured in a guideline
33:19or in a rule
33:21and it can't be
33:23easily understood
33:25if you haven't experienced
33:27what it feels like.
33:29Would you ever look at
33:31pictures of self-injury
33:33on Facebook now?
33:35No, I wouldn't do it again.
33:37I'm strong enough
33:39to not be affected
33:41but I don't have the strength
33:43to take risks.
33:45I'm convinced that I wouldn't
33:47succumb but I don't feel
33:49prepared to run that risk.
33:51I think that even though
33:53I'm better, it's something
33:55difficult to manage.
33:57There are actually people
33:59who use Facebook to communicate
34:01their anguish to family and friends
34:03with the aim of asking for help.
34:05There has to be a legitimate interest
34:07on their part in expressing their suffering.
34:09And we see that every day.
34:11There are people who get the help
34:13they need because the content
34:15remains on the web.
34:17If we removed it, probably
34:19the family and friends
34:21wouldn't realize the risk
34:23the user is taking.
34:25If we look at the picture,
34:27what I see is a minor,
34:29a person under 13 years old,
34:31so I don't send a checkpoint.
34:33The measures we take
34:35are different in these cases.
34:37During the self-injury training
34:39we address the issue of
34:41underage users.
34:43In the case of underage accounts,
34:45we don't take measures unless
34:47they admit that they are underage.
34:49Underage people under 13 are
34:51not allowed to have an account
34:53but they can be lying.
34:55They can do it.
34:57Yes, but if they are lying,
34:59we ignore it.
35:01Even if we see, as in this case,
35:03that they can have
35:05about 10?
35:07Okay, so if someone lies to us,
35:09we ignore it, because
35:11that's what it looks like.
35:13Yes, they have to recognize
35:15in some way that they are underage.
35:17If not, we turn a blind eye
35:19and we don't know
35:21who is underage and who is not.
35:23Facebook only investigates
35:25the age of a user
35:27if someone reports that
35:29they may be underage.
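Taken together, the exchange above describes a very narrow trigger for acting on age: unless the account holder has somehow admitted to being under 13, the apparent age is ignored. A hypothetical sketch of that rule, with invented names:

```python
# Hypothetical sketch of the under-13 rule as the trainer describes it:
# only an explicit admission of being underage triggers action on the
# account; an account that merely looks underage is ignored.

def handle_suspected_underage(admits_under_13: bool) -> str:
    if admits_under_13:
        # Under-13s are not allowed to have an account at all.
        return "escalate the account"
    # "If they are lying, we ignore it": appearance alone is not enough.
    return "ignore"


print(handle_suspected_underage(admits_under_13=False))  # ignore
```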
35:37Most of the content
35:39reported to Facebook's moderators
35:41has to do with incitement
35:43to hate.
35:45About this, I'm convinced
35:47that everyone has their opinion
35:49about what is good and what is not.
35:51But keep in mind that we have to
35:53follow Facebook's policies.
35:55It doesn't matter what we think of them.
35:57We have to follow the rules.
35:59When Mark Zuckerberg appeared
36:01in front of the Senate,
36:03he was asked about the regulation
36:05in relation to incitement to hate.
36:07Our goal is to allow people
36:09to express themselves as much as possible.
36:11I don't want the decisions
36:13that are made in our company
36:15to be based on political ideology.
36:17Facebook claims that its regulation
36:19aims to protect certain groups
36:21from attacks or insults.
36:23We can eliminate any content
36:25that promotes exclusion, death
36:27or attacks against Muslims.
36:29It's explicit hate.
36:31But we know that moderators
36:33are advised to overlook certain types of content.
36:37Let's look at this image.
36:39When your daughter's first crush
36:41is a black boy.
36:43This one has been around for a long time.
36:45You have to ignore it because it only insinuates,
36:47but you have to check carefully
36:49whether it actually breaks the rule.
36:51In reality, it is not attacking
36:53the black boy in any way.
36:55My golden rule is
36:57if you have to go around
36:59to get to what you think
37:01the content is suggesting,
37:03you have probably gone too far.
37:05So we would ignore this one.
37:07Do you agree?
37:09No, we don't agree.
37:12Facebook confirmed
37:14that the image violates
37:16the regulation on incitement to hate
37:18and that they were going to check
37:20what had happened.
37:22If we want to design a service
37:24in which everyone can give their opinion,
37:26we must ensure that no one is harassed
37:28or intimidated
37:30or the environment will stop feeling safe.
37:32But on Facebook
37:34not all users are protected equally.
37:36My instinct
37:38is to delete them.
37:41Our reporter is moderating
37:43a comment that says
37:45fuck you, go back to your country.
37:47It has been published under a video
37:49with a caption
37:51referring to Muslim immigrants.
37:54That ticket there,
37:56the one that says fuck you,
37:58go back to your country,
38:00it says Muslims, wait, immigrants.
38:02Muslim immigrants.
38:04If it only said Muslims,
38:06we would eliminate it,
38:09Facebook allows its users
38:11to insult Muslim immigrants
38:13to a greater extent
38:15than those who are only Muslims.
38:17But they are still Muslims.
38:19What did you say?
38:21I told you that even if they are Muslim immigrants,
38:23they are still Muslims.
38:25Yes, they are still Muslims,
38:27but they are immigrants,
38:29that's why they are less protected.
38:31It's about disgusting Muslim immigrants.
38:33I think there is no problem.
38:35Seriously, there is no problem?
38:37Yes, because disgusting
38:39can be considered physical inferiority.
38:41Ah, okay.
38:43If it said something like
38:45Muslim immigrant scum,
38:47we would eliminate it.
38:49There are many people
38:51who are dedicated to debating
38:53very sensitive issues on Facebook,
38:55such as immigration,
38:57which is absolutely current,
38:59and that political debate
39:01can be completely legitimate.
39:03Is that why it is allowed to say
39:05on Facebook something that promotes hatred
39:07against a group?
39:09In other words, Muslim immigrants
39:11should get out of Great Britain
39:13does not incite hatred?
39:15I repeat that we have studied it
39:17in great detail,
39:19and it is very complicated,
39:21it is right on that line.
39:23Does it incite hatred or not?
39:25We have established that it does not,
39:27as long as what is being communicated,
39:29I repeat,
39:31is a perspective
39:33of those same Muslim scum,
39:35disgusting and repulsive,
39:37who have been raping British girls.
39:39The Facebook page
39:41Britain First,
39:43had more than 2 million followers
39:45before it was eliminated
39:47last March,
39:49while we were recording the documentary.
39:51It had to be deleted,
39:53so now it is not just that they are a group
39:55or a discriminatory organization,
39:57it is simply forbidden for them to be on the platform.
39:59Look, we had been monitoring the page
40:01since it had 300,000 followers.
40:03It had a total of 2 million
40:05just before it was pulled.
40:07And what I wonder is,
40:09why did it take so long?
40:11We were talking about the history of Britain First.
40:13It was like ...
40:15We flagged their pages for their content.
40:17They had like 8 or 9 infractions,
40:19and they can only have 5,
40:21but they had a lot of followers.
40:23They brought Facebook a lot of money.
40:27Our reporter has been told
40:29that if a page violates Facebook's regulations
40:31with 5 pieces of its content,
40:33it has to be eliminated.
40:35But Britain First was very famous.
40:37These types of pages are shielded
40:39and CPL's content moderators
40:41cannot remove them.
40:43Didn't it reach that limit
40:45before it was removed?
40:47I think it did,
40:49but it had so many followers
40:51that it was like shielded.
40:55What do you mean?
40:57The ones that stay on the platform.
40:59The shielded ones, where do they go?
41:01They go to the list of Facebook's
41:03template workers and they decide.
41:05Ah, okay.
41:07Sometimes it depends on what it is.
41:09I mean, in the case of Britain First,
41:11there were many hot topics.
41:13When these shielded pages
41:15contain content that violates the regulations,
41:17they are put on the list
41:19of shielded-page reviews
41:21for Facebook to evaluate them.
41:23But what is shielded review?
41:25Shielded means you have to be careful.
41:27If a page has a lot of followers,
41:29you're not going to remove it.
41:31You have to be 100% sure
41:33that there are reasons
41:35to remove it.
41:37Okay.
41:39The page has 5 infractions
41:41but also 1 million followers.
41:43I'm not sure if I should remove it or not.
41:45Yes.
41:47Like Britain First, right?
41:49Yes, something like that.
41:51If it's an important page,
41:53it's not for you to remove it.
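Pulling together what the moderators describe: a page is normally taken down once five of its posts have violated the rules, but pages with very large followings are "shielded", so instead of being removed by CPL their violations are queued for Facebook's own staff to review. The sketch below only restates that description with invented names; it is not Facebook's internal process.

```python
# Invented-name sketch of the "shielded review" process as described by the
# moderators; not Facebook's internal system.

VIOLATION_LIMIT = 5  # posts breaking the rules before a page is removed


def handle_page(violating_posts: int, is_shielded: bool) -> str:
    if violating_posts < VIOLATION_LIMIT:
        return "leave the page up"
    if is_shielded:
        # Popular pages (e.g. hundreds of thousands of followers) cannot be
        # deleted by CPL moderators; they go to Facebook's own staff.
        return "send to the shielded review queue"
    return "delete the page"


print(handle_page(violating_posts=8, is_shielded=True))   # shielded review queue
print(handle_page(violating_posts=8, is_shielded=False))  # delete the page
```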
42:23And if we come across
42:25one of these pages
42:27by Tommy Robinson?
42:29Yes.
42:31Do I look at the content
42:33or does it make no sense?
42:35We don't go around
42:37removing their content.
42:39They are pages that are shielded.
42:41If you delete a video
42:43or anything else,
42:45or even if you haven't removed
42:47a video by Tommy Robinson,
42:49it goes directly to the list
42:51but I can't see them.
42:53It's special.
42:55Tommy Robinson's page
42:57has 900,000 followers
42:59and is so famous
43:01that it has the same level
43:03of protection
43:05as the government
43:07and the media.
43:09What they mean
43:11when they talk
43:13about freedom of expression
43:15is that they're going
43:17to allow everyone
43:19to publish whatever they want.
43:21And once you understand
43:23that the essence of these
43:25gigantic social networks
43:27is that the most hostile
43:29and the most miserable
43:31opinions predominate,
43:33you realize that the more
43:35open your platform is,
43:37the more unpleasant,
43:39annoying and inappropriate content
43:41you're going to attract.
43:43Why does Tommy Robinson's
43:45account have the same level
43:47of protection as the government's,
43:49the BBC's or the accounts
43:51of other respectable organizations?
43:53If his content violates the rules,
43:55we remove it.
43:57Because it's one of the most valuable,
43:59right? With hundreds of thousands
44:01of followers.
44:03I repeat to you that this debate
44:05has nothing to do with money
44:07but with political discourse.
44:09People prefer us to be careful
44:11and be prudent when
44:13removing their political opinions.
44:15We don't want to be a signal
44:17or use data illegally.
44:19We don't tolerate it.
44:21For that reason,
44:23Facebook has begun to change.
44:25Facebook has launched
44:27a high-profile campaign
44:29to try to improve its image.
44:31We didn't take a broad view
44:33of our responsibility.
44:35And that was a mistake.
44:37It was my mistake.
44:39And I'm sorry.
44:41I started Facebook.
44:43This is difficult to face.
44:45But the incentives
44:47to do so
44:49are already irrefutable.
44:51I just hope that,
44:53thanks to the documentary,
44:55the debate can be intensified
44:57and it becomes something
44:59permanent and concrete.
45:01We have to stop accepting
45:03their excuses and stop
45:05believing their assurances.
45:07CPL told us,
45:09for us it is of utmost importance
45:11that all moderators receive
45:13the necessary training
45:15and are aware of the changes
45:17in Facebook's policies.
45:19So we have decided to investigate
45:21this problem as a priority
45:23and take urgent measures.
45:25The training is now
45:27much more exhaustive.
45:29We are one of those companies
45:31that is constantly subject
45:33to the strictest scrutiny
45:35in the world.
45:37And we think it's good.
45:39You have identified some areas
45:41in which we have made mistakes.
45:43And I have no choice
45:45but to apologize for this
45:47and make it clear
45:49that we do not deny our weaknesses.
45:51We know that all this
45:53should not happen.
45:55When we are informed
45:57of our mistakes,
45:59we commit to study them
46:01very seriously
46:03and to take the necessary
46:05measures to ensure
46:07that we do not repeat
46:09the same mistakes.
