From Facebook (June 20, 2019):
In a similar vein, I've been trying to read the Verge article about the working conditions for Facebook moderators, and I have to take it in small bits. The situation is so horrific, and tragic, and as hard to grasp as photos of a neighborhood you used to know after a bomb attack. The sheer *damage* being done to these people, whether they realize it or not, is unbearable for an empathetic person to witness.
The issue of online content moderation is one of the most difficult the internet has ever had to grapple with. On one side, you have the sewer of unspeakable human filth being given space and air like it never has before... and on the other, a thin line of ordinary people, given no special training, who are expected to wade through that corrosive, soul-crushing material unscathed. It's like the infantry in WWI, sent over the top to be mowed down by the machine guns. The toll in sanity and actual human lives is mounting, and very few people in charge seem to notice or care.
Most online communities can't afford more than a token moderation staff, so they rely on automated scripts, which are notoriously easy to manipulate; look at the FB Real Names policy, or the Tumblr "skin tones test". While machines can't be emotionally scarred by what they witness, they are nowhere near able to make real judgment calls. And when a company actually does try to expand its moderation staff, the work is treated as unskilled tech labor, on the theory that any monkey can press a "takedown" button.
We need protections for moderation staff. These need to be skilled positions with real training and support. The job needs to come with perks like flexible breaks and personal time, and stress-management resources (break rooms, games, counselors, massages, whatever), like other high-stress jobs have. The training needs to be developed by psychiatric professionals (like the ones who evaluate astronauts for long-duration missions, such as a trip to Mars) to give staff the tools they need to cope with the material they confront every day. It is possible to encounter traumatic and horrible scenes regularly and still cope -- emergency rescue and ER personnel do it, as do mental health professionals who handle severe cases. But it takes a certain kind of person, and specific training, to walk away able to shake it off most of the time.
I am under no illusions: the companies that need moderation will not provide these things. Congress won't pay any attention. The best leverage to get this kind of change is probably (gasp) a union. But whatever happens going forward, the current model is unsustainable, and the psychic toll we're putting on these people is unconscionable.