The Cleaners looks at who cleans up the internet's toxic content : NPR



Content moderators determine what we see, and what we don't, on social media.

Courtesy of gebruder beetz filmproduktion



Thousands of content moderators work around the clock to keep Facebook, YouTube, Google and other online platforms free of toxic content. That content can include trolling, sexually explicit photos or videos, violent threats and more.

These efforts – both human and algorithmic – have come under intense scrutiny in recent years. In April, Mark Zuckerberg testified before Congress about how Facebook is working to reduce the spread of propaganda, hate speech and other harmful content on the platform.

"By the end of this year, we will have more than 20,000 people involved in security and content review," said Zuckerberg.

The Cleaners, a documentary film by Hans Block and Moritz Riesewieck, tries to get to the bottom of exactly what that work involves. The film follows five content moderators and exposes what their jobs entail.

"I've seen hundreds of beheadings, sometimes lucky, it's just a very sharp blade that's used to them," says one moderator of the film's presenter.

Block and Riesewieck explored more of the harsh realities that come with content moderation in an interview with All Things Considered.

Interview highlights

On a typical day for a Facebook content moderator

He sees all these things we do not want to see online on social media. It could be terror, it could be beheading videos like the ones you mentioned earlier. It could be pornography, it could be sexual abuse, it could be necrophilia, on the one hand.

And, on the other hand, it could be content that is useful for political debate or for raising awareness of war crimes and so on. So every day he has to screen thousands of pictures, and he needs to be fast in order to reach his quota for the day. … Sometimes it is many thousands of pictures a day. And then he has to decide whether to delete a picture or let it stay up.

On Facebook's decision to remove the Pulitzer Prize-winning "Napalm Girl" photo

This content moderator decided he would rather delete the photo because it depicts a young, naked child, and the rule against nudity strictly forbids that.

So it is always necessary to distinguish between so many different cases. … There are so many gray areas left, where content moderators have told us they sometimes have to decide based on gut feeling.

On the weight of telling malicious content apart from news photography or art

It's a daunting task – it's so difficult to distinguish between all these different kinds of rules. … These young Filipino workers get three to five days of training, which is not enough for such a job.

On the impact of being exposed to toxic content every day

Many of these young people are severely traumatized by the work.

The symptoms are very different. Some people told us they are afraid to go to public places because they screen footage of terrorist attacks every day. Or they're afraid to have a close relationship with their boyfriend or girlfriend because they see videos of sexual abuse every day. So that is the kind of effect this work has …

Manila [the capital of the Philippines] was the place where the analog toxic waste of the Western world was shipped on container ships for years. Today, the digital garbage is brought there. Now thousands of young content moderators in air-conditioned office towers click through an endlessly toxic sea of images and an immense amount of mental refuse.

Emily Kopp and Art Silverman edited and produced this story for broadcast. Cameron Jenkins produced this story for digital.

