The reliability of Facebook and other social media platforms as channels for accurate information has been debated for some time, as has the credibility of the rapidly growing number of news sources on those platforms. These platforms depend on algorithms to select and display content for each user, based on that user’s individual behaviour. A new study is again bolstering the long-standing argument by critics that social media algorithms fuel the spread of misinformation over more trustworthy sources, distorting public debate on important issues.
According to a report by The Washington Post, researchers at New York University and the Université Grenoble Alpes in France studied user behaviour on Facebook around the 2020 US presidential election and found that between August 2020 and January 2021, news publishers known for releasing misinformation drew six times more “likes, shares, and interactions” on the platform than trustworthy news sources such as CNN or the World Health Organisation (WHO).
The researchers also found that misinformation-trafficking pages on both the far left and the far right engaged Facebook users far more than factual pages did. The findings validate concerns about “fake news” that first gained prominence after the 2016 US presidential election, which followed a divisive and acrimonious campaign. Social media has often been blamed for amplifying calls to violence, including the January 6 attempt by Trump supporters to violently storm the Capitol, the seat of the legislative branch of the US government.
Testifying before Congress two months later, Facebook CEO Mark Zuckerberg appeared to suggest he bore no accountability for the misinformation campaigns that ran on his platform. Twitter CEO Jack Dorsey and Google-parent Alphabet CEO Sundar Pichai also appeared at the hearing, where lawmakers slammed all three platforms’ approaches to false content.
Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University, who reviewed the study’s findings, said it adds to the growing body of evidence that “misinformation has found a comfortable home” on Facebook despite a number of mitigation efforts.
Facebook responded that the study measured how many people engage with content, not how many people actually view it. “When you look at the content that gets the most reach across Facebook, it is not at all like what this study suggests,” a Facebook spokesperson told The Washington Post. Facebook does not make the number of people who view content on its platform (impressions) publicly available to researchers.