
Friday, August 24, 2012

How Child Porn And The Other Awfulest Things Ever Get Scrubbed From The Internet

Machines are a long way from being able to automatically remove the most awful images mankind has to offer — child porn, beheadings, forced bestiality — from our favorite sites and social networks. An invisible workforce has emerged to look at it so we don't have to. (Warning: You may find this piece upsetting.)


“Do you know what child porn is?” she asked me. A string of god-awful words came out of her mouth. “Infant decapitation” were just two of them. Cindy* spent years as a member of the CyberTips team at the National Center for Missing and Exploited Children, processing thousands of images, videos, emails, social network profiles and more that were flagged as possibly criminal content.


There’s a lot of stuff on the internet, and every day more gets added to it. Not all of it is kosher. Child porn. Narco executions. Beheadings. So an invisible workforce has emerged to help scrub the festering filth, one that is often poorly paid, in entry-level positions with little support or job security. As an interview earlier this week with a former Google worker showed, the psychological costs can be high.


“We were the 911 for the internet. We handled every single report of internet child porn,” Cindy said. “Man, I wished I worked at Google compared to what we were dealing with. Every week we saw about 25,000 reports, and every single report had at least 200 to 500 images and videos to review.”


She worked the afternoon-to-evening shift. Going to work meant turning on a computer and sorting through a long queue of reports that came from a number of tech companies, as well as from concerned individuals. On a normal day, she said, she could process 100, maybe 200 reports, although it felt “never ending.” She often saw tech companies overzealously reporting, erring toward an overabundance of caution: pictures of Marge and Bart Simpson having sex, for instance, were classified as potential child porn. But she also saw the real stuff, every day.


“To have a naked child image — that’s not necessarily a crime, if you don't identify them and know their age,” she explained.


Looking at disturbing material for a living can make some workers feel “nothing.” Others say it will “rot you from the inside out,” says Helen Steele, the director of an organization that helps tech companies understand the impact of such material on workers. Experts say that repeated exposure to horrific images has a cumulative effect, and in several interviews with current and former workers in this field, people reported desensitization and isolation as the most common side effects. And many complain that they feel like hired eyeballs, the digital equivalent of day labor.


“When you're not close to the development process you are expendable as a paper airplane and they let you know it,” said one current employee who analyzes this type of content for a community moderation platform in Silicon Valley.


There’s no trade group — or even a common job title — for this kind of work. There's no one advocating for these workers, and more significantly, there's no way of tracking exactly how many there are. But between behemoth tech companies like Google and Facebook, and outsourced data firms like the United States-based Telecommunications On Demand and Caleris, two companies mentioned in a 2010 New York Times article as processing millions of images, it seems safe to say these workers number in the thousands.


This is not to say that every individual in this line of work has had a bad experience: the ability to handle such a mentally demanding job differs from person to person. And tech companies say that they do offer special benefits to employees who view disturbing content for a living. Facebook has a “safety team” that is tasked with reviewing the most sensitive material, and according to a spokesman, they offer “in-house training, and also, counseling for our employees.” A Google spokeswoman told me that the one-year contracts (which can frustrate those looking to stay on for longer) were designed to ensure that no one held the most brutal jobs indefinitely. Also, Google brings in independent counselors to talk to teams about secondary trauma — the kind of trauma that comes from seeing abused people and not being able to help.



