Culture - Books


I chose to work on an article about internet moderation because it is a very interesting subject that is omnipresent in the online world. I planned to read JOHN NAUGHTON's article on a device first, because the internet lives on the screen and not on paper. In the end there is only a mobile version of the article, where some content interferes with the reading.

In our daily browsing we rarely see bad content. But it can be there, or rather, it was there: after something is posted (on the most popular platforms), a group of moderators checks whether that content obeys the content, ethical and privacy norms. Yet moderation is becoming too difficult to remain a purely human task. On YouTube, 300 hours of video are uploaded every minute, so in a single day roughly 50 years of video are uploaded! And these numbers grow every day. Can we keep moderating the way we have until now? Internet content is like a snowball, bigger every second, and we have to introduce new ways of filtering, deleting and supervising it. Platforms are using robots to clean up all that content; they work with algorithms, colour detection and other recognition techniques. We need them to keep a "clean" internet, but at what price? With the introduction of artificial intelligence to this task, the line between moderation and censorship becomes very thin. Who can assure us that what we want to post will be seen by others?

Taking Out
The Trash

Content moderators are the poorly paid, vital cogs that keep social media palatable(ish). This study will make you value them.

There's no ideology, fetish, obsession, perversion, eccentricity or fad that doesn't find expression somewhere online. And while much of what we see reflected back to us is uplifting, banal, intriguing, harmless or fascinating, some of it is truly awful, for the simple reason that human nature is not only infinitely diverse but also sometimes unspeakably cruel.

In the early days of the internet and, later, the web, this didn't matter so much. But once cyberspace was captured by a few giant platforms, particularly Google, YouTube, Twitter and Facebook, then it became problematic. The business models of these platforms depended on encouraging people to upload content to them. "Broadcast yourself", remember, was once the motto of YouTube. And people did - as they slit the throats of hostages in the deserts of Arabia, raped three-year-old girls, shot an old man in the street, firebombed the villages of ethnic minorities or hanged themselves on camera ...

All of which posed a problem for the social media brands, which liked to present themselves as facilitators of creativity, connectivity and good clean fun. So they started employing people to filter and manage it. Content moderation is now a global industry employing at least 100,000 people. Most moderators are contract workers and they work in stressful conditions, with little job security or health insurance, and suffer psychological damage from having to confront - and make decisions about - the horrific stuff that people post on social media platforms.

Sarah T. Roberts is an academic at the University of California, Los Angeles, who has been exploring this netherworld of commercial content moderation for eight years. Her book Behind the Screen is the first extensive ethnographic study of those who inhabit it. Nobody who reads it will ever again view social media platforms in a tolerant light. For what it reveals is that the tech industry, like all the great industries of the past, prospers on the back of human exploitation.

Roberts's research confirms what scholars such as Tarleton Gillespie had suggested - that, far from being a way for social media platforms to demonstrate their sense of social responsibility, content moderation is actually the critical part of their operations, for without it their brands would be irreparably damaged by what unsuspecting users would find in their feeds. As one former moderator replied when Roberts asked him what the platform would be like without moderation: "It would be 100% porn. One hundred per cent. It would be a train wreck. The awful part is, it has to be done. It can't not be moderated. But there's no good way to do it."

A key question, which Roberts doesn't really explore, is whether the moderation task is ultimately a futile one, given the scale at which the platforms operate. Something like 400 hours of video are uploaded to YouTube every minute, for example. The technocratic response to this challenge is a cheery assurance that, one day, AI will take care of most of it. Roberts and her interviewees don't buy that and neither do I.

One thing that could make a difference would be for the platforms to "throttle" uploads to a more manageable rate. Unsurprisingly, this is an option that is never discussed in the industry. As the computer scientist Hany Farid put it: "The content is just too valuable a commodity to the platforms; it is the bait that lures users in and keeps them coming back for updates to scroll, new pictures or videos to view, new posts to read and new advertisements to be served".

Corporations such as Google and Facebook that live by surveillance capitalism may therefore be heading for some kind of existential crisis. On the one hand, they have to keep the supply of uploads coming, for they provide the feedstock for their advertising machines; on the other, if through their contractors they cannot manage the moderation task, then their brands will become polluted and users will start to desert their platforms.

But that's their problem. For us, one of the most striking messages of this remarkable book is that governments and regulators should be investigating the appalling working conditions under which much content moderation is done. And if you're the kind of person who balks at buying a T-shirt that has been made in an Asian sweatshop, you might think of giving up Facebook or switching to DuckDuckGo for searching. After all, the issues are much the same.