The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed
Area 1.) I had no idea that content moderators actually did these things "manually," as it were. I'm not sure I ever considered that content moderators had to be living people who really watched horrific things in order to block or report them. Considering the capabilities of modern software and all of its associated peripherals, it seems a bit archaic that there would need to be rooms full of poor souls watching the worst of the worst when I can dictate a text or an email to the AI on my phone and have it sent. But I suppose my last statement points out the problem perfectly: the software only delivers; it doesn't know why. It wouldn't know the difference between a rape video and an old snippet of the Tonight Show. Someone must make that decision and actually review and tag the content, because an algorithm will fail. Whatever advances are made in systems or programming, software has no appreciation for the human experience and cannot know things like cruelty, rage, horror, or even empathy. There is no algorithm we could program today that could bring any reasonable amount of discriminating sensibility, anything resembling human decency, to screening software.
Area 2.) The people doing this job are potentially facing a lifetime of debilitating after-effects from viewing this material. Not only are there vast inequities in pay scales internationally, there are also few resources available to help them deal with this content in any kind of healthy manner. I used to watch whatever videos were available when the internet was young; I saw quite a few that still haunt me to this day if I allow myself to think about them at all. There were beheadings, suicides, murders, firing squads, people drawn and quartered, people savagely beating the disabled, the elderly, or children... it's astonishing to think of everything that was so regularly shared on many of the earlier incarnations of social media, and even in emails.
Personally, whenever I watched any of these things, it left me feeling hollow and restless. Sleep was lost; malaise began to rear its ugly head. Seeing these things even rarely is bad enough; the hopelessness that comes from witnessing the worst sides of humanity is damaging and takes some recovery. I'm not sure how it would feel to be paid to watch this content non-stop, but I'd imagine that the cynicism and anguish, the feeling of displacement, could result in some very real PTSD. In my opinion, it would be fair for these companies to be on the hook for therapy and mental health treatment for the rest of their moderators' lives, regardless of whether they remain with the company.