Back in February, Facebook announced a small experiment. It would reduce the amount of political content shown to a subset of users in several countries, including the United States, and then ask them about the experience. "Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person's appetite for it at the top of their News Feed," Aastha Gupta, a product management director, explained in a blog post.

On Tuesday morning, the company provided an update. The survey results are in, and they suggest that users appreciate seeing less political content in their feeds. Facebook now plans to repeat the test in more countries, and is teasing "further expansion in the coming months". It is understandable that a company perpetually in hot water over its perceived impact on politics would want to depoliticize its feeds. The move was first announced just a month after supporters of Donald Trump attacked the Capitol, an attack that some people, including elected officials, sought to blame on Facebook. The change could have a big impact on political parties and media organizations that have grown accustomed to relying on Facebook for distribution.

The most significant part of Facebook’s announcement, however, has nothing to do with politics.

At the core of any AI-powered social feed (think Facebook, Instagram, Twitter, TikTok, YouTube) is the idea that you don't have to tell it what you want to see. By observing what you like, share, comment on, or simply linger over, the algorithm learns what kind of material catches your interest and keeps you on the platform. Then it shows you more things like that.
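In rough terms, such a feed ranker scores each candidate post by the engagement it is predicted to generate and sorts the feed accordingly. The sketch below illustrates the idea only; the signal names, weights, and `Post` structure are assumptions made for the example, not any platform's actual model.

```python
from dataclasses import dataclass

@dataclass
class Post:
    # Hypothetical predicted-engagement signals, each in [0, 1].
    post_id: str
    p_like: float     # probability the viewer likes the post
    p_comment: float  # probability the viewer comments on it
    p_share: float    # probability the viewer shares it
    p_dwell: float    # probability the viewer lingers over it

# Assumed weights: deeper interactions count for more.
WEIGHTS = {"p_like": 1.0, "p_comment": 4.0, "p_share": 8.0, "p_dwell": 0.5}

def engagement_score(post: Post) -> float:
    """Collapse the predicted signals into a single ranking score."""
    return sum(getattr(post, name) * weight for name, weight in WEIGHTS.items())

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Show the posts predicted to be most engaging first."""
    return sorted(candidates, key=engagement_score, reverse=True)
```

Note that nothing in this loop asks whether a post is true or socially valuable; it optimizes only for behavior it can measure, which is the tension the rest of this piece describes.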

In a sense, this design gives social media companies and their apologists a convenient defense against criticism: if certain material is going big on a platform, that's because users like it. If you have a problem with that, then perhaps your problem is with the users.

And yet, at the same time, the fact that the big social platforms optimize for engagement sits at the center of most criticism of them. An algorithm focused too narrowly on engagement can push users toward content that is highly compelling but has little social value. It can feed them posts that grow ever more extreme, because the extremes are ever more engaging. And it can encourage the viral spread of false or harmful material, because the system selects first for what will trigger engagement rather than for what ought to be seen. The list of ills attributed to engagement-first design helps explain why neither Mark Zuckerberg, Jack Dorsey, nor Sundar Pichai would admit during a March congressional hearing that the platforms they control are built that way. Zuckerberg insisted that "meaningful social interactions" is Facebook's true goal. "Engagement," he said, "is only a sign that if we deliver that value, it will be natural that people use our services more."

In other contexts, Zuckerberg has acknowledged that things may not be so simple. In a 2018 post explaining why Facebook suppresses "borderline" posts (those that push right up to the edge of what the platform's rules allow without breaking them), he wrote that such content gets more engagement as it approaches the line, even when people tell Facebook afterward that they don't like it. But that observation was treated as a question of how to enforce Facebook's policies around prohibited content, rather than a reason to reconsider the design of its ranking algorithm more broadly.
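The remedy described in that post amounts to a demotion that grows as content nears the policy line, inverting the natural engagement curve. A toy version of the idea might look like the following; the 0-to-1 "proximity" score and the shape of the penalty curve are assumptions for illustration, not Facebook's actual implementation.

```python
def demoted_score(engagement_score: float, policy_proximity: float) -> float:
    """Apply a hypothetical borderline-content demotion.

    policy_proximity is assumed to be 0.0 for clearly benign posts and
    to approach 1.0 as a post nears the platform's policy line; content
    at or past the line is removed rather than ranked. Engagement tends
    to rise with proximity, so the penalty grows with it to push
    distribution down instead.
    """
    if policy_proximity >= 1.0:
        return 0.0  # violating content is removed outright
    # Assumed penalty curve: negligible for benign posts,
    # steep as the post approaches the line.
    penalty = policy_proximity ** 2
    return engagement_score * (1.0 - penalty)
```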


