Facebook whistleblower Frances Haugen’s leaks suggest her warnings about extremism are especially dire in some regions. Documents Haugen provided to The New York Times, The Wall Street Journal and other outlets suggest Facebook was aware it fostered severe misinformation and violence in India. The social network apparently didn’t have nearly enough resources to deal with the spread of harmful material in the heavily populated country, and didn’t respond with adequate action when tensions flared.

One case study from early 2021 indicated that much of the harmful content from groups like Rashtriya Swayamsevak Sangh and Bajrang Dal wasn’t flagged on Facebook or WhatsApp due to a lack of the technical know-how needed to spot content written in Bengali and Hindi. At the same time, Facebook reportedly declined to mark the RSS for removal due to “political sensitivities,” and Bajrang Dal (which is linked to Prime Minister Modi’s party) hadn’t been touched despite an internal Facebook call to take down its material. The company also had a white list of politicians who were exempt from fact-checking.

According to the leaked information, Facebook was still struggling with hate speech as recently as five months ago. And like an earlier test in the United States, a study showed just how quickly Facebook’s recommendation engine could suggest toxic content: a dummy account following Facebook’s recommendations for three weeks was subjected to a “near constant barrage” of divisive nationalism, misinformation and violence.

As with previous scoops, Facebook said the leaks don’t tell the whole story. Spokesman Andy Stone argued the data was incomplete and didn’t account for the third-party fact checkers in wide use outside the United States. He added that Facebook had invested significantly in hate speech detection technology in languages like Bengali and Hindi, and that the company was continuing to improve that technology.

The social media firm followed up by posting a lengthy defense of its practices. It argued that it has an “industry-leading process” for reviewing and prioritizing countries with a high risk of violence every six months. It noted that teams consider long-term issues and history alongside current events and dependence on its apps. The company added that it was engaging with local communities, improving technology and continuously “refining” policies.

The response didn’t directly address some of the concerns, however. India is Facebook’s largest individual market, with 340 million people using its services, yet 87 percent of Facebook’s misinformation budget is focused on the United States. Even with third-party fact checkers at work, that suggests India isn’t getting a proportionate amount of attention. Facebook also didn’t follow up on concerns that it was tiptoeing around certain people and groups beyond a previous statement that it enforces its policies regardless of position or affiliation. In other words, it’s not clear Facebook’s problems with misinformation and violence will improve in the near future.



