Facebook faces new questions over its treatment of content moderators in Europe


Facebook is once again facing questions about its treatment of content moderators after one of them told an Irish parliamentary committee that the company does not do enough to protect the workers who review violent and disturbing content on its platform.

Isabella Plunkett, who moderates Facebook content as an employee of the outsourcing firm Covalen, told the committee that moderators work as contract staff and are not given adequate access to mental health resources. Covalen, for example, allows an hour and a half of "wellness" time per week, but the "wellness coaches" the firm provides are not mental health professionals and are often not equipped to help moderators process the traumatic content they deal with. These coaches sometimes suggest activities like karaoke or painting, Plunkett told the committee.

“The content is horrible, it would affect anyone,” she told a press conference after the hearing. “No one can be okay watching graphic violence seven to eight hours a day.” She said moderators should get the same benefits and protections as actual Facebook employees, including sick leave and the ability to work from home. Plunkett also raised Facebook’s reliance on non-disclosure agreements, which she said contributes to an “environment of fear” that leaves moderators afraid to speak out or seek outside help.

In a statement, a Facebook spokesperson said the company is “committed to working with our partners to provide support” to people reviewing content. “Everyone who reviews content for Facebook goes through an in-depth training program on our community standards and has access to psychological support to ensure their well-being,” the spokesperson said. “In Ireland, this includes on-site support from trained practitioners, an on-call service and access to private healthcare from day one of employment. We are also employing technical solutions to limit their exposure to potentially graphic material as much as possible. And we are committed to getting this right.”

This is far from the first time these issues have been raised. The working conditions of the content moderators who sift through the worst material on the platform have long been a problem for Facebook, which relies on non-employee moderators around the world. Last year, the company agreed to a $52 million settlement with US-based moderators who said their work had resulted in PTSD and other mental health problems.

As part of the settlement, Facebook agreed to a number of changes to the way it handles the content moderators are asked to review. It introduced new tools that let them view videos in black and white, watch videos with the audio muted, and skip to the relevant parts of long videos, all to reduce the total time spent exposed to violent and disturbing content. The company has also invested significantly in AI moderation, in the hope that more of this work can one day be automated.

However, Facebook may soon have to answer the question of whether these measures go far enough to protect content moderators. The committee has asked representatives of Facebook and its contractors to appear at a future hearing to face questions about their treatment of workers.
