Concern for Those Who Screen the Web for Barbarity
Ricky Bess spends eight hours a day in front of a computer near Orlando, Fla., viewing some of the worst depravities harbored on the Internet. He has seen photographs of graphic gang killings, animal abuse and twisted forms of pornography. One recent sighting was a photo of two teenage boys gleefully pointing guns at another boy, who is crying.
An Internet content reviewer, Mr. Bess sifts through photographs that people upload to a big social networking site and keeps the illicit material — and there is plenty of it — from being posted. His is an obscure job that is repeated thousands of times over, from office parks in suburban Florida to outsourcing hubs like the Philippines.
The surge in Internet screening services has brought a growing awareness that the jobs can have mental health consequences for the reviewers, some of whom are drawn to the low-paying work by the simple prospect of making money while looking at pornography.
. . .
Internet companies are reluctant to discuss the particulars of content moderation, since they would rather not draw attention to the unpleasantness that their sites can attract. But people in the outsourcing industry say tech giants like Microsoft, Yahoo and MySpace, a division of the News Corporation, all outsource some amount of content review.
YouTube, a division of Google, is an exception. If a user flags a video as inappropriate, software scans the clip for warning signs that it breaks the site’s rules or the law. Flagged videos are then sent for manual review by YouTube-employed content moderators who, because of the nature of the work, are given only yearlong contracts and access to counseling services, according to Victoria Grand, a YouTube spokeswoman.
I can’t imagine the mental/spiritual damage of doing a job like this.