Social media platforms like Facebook “have played a major role in exacerbating political polarization that can lead to such extremist violence,” according to a new report from researchers at New York University’s Stern Center for Business and Human Rights.
That may not seem like a surprising conclusion, but Facebook has long tried to downplay its role in fueling divisiveness. The company says that existing research shows that “social media is not a primary driver of harmful polarization.” But in their report, NYU’s researchers write that “research focused more narrowly on the years since 2016 suggests that widespread use of the major platforms has exacerbated partisan hatred.”
To make their case, the authors highlight numerous studies examining the links between polarization and social media. They also interviewed dozens of researchers, as well as at least one Facebook executive: Yann LeCun, the company’s top AI scientist.
While the report is careful to point out that social media is not the “original cause” of polarization, the authors say that Facebook and others have “intensified” it. They also note that Facebook’s own attempts to reduce divisiveness, such as de-emphasizing political content in News Feed, show the company is well aware of its role. “The introspection on polarization probably would be more productive if the company’s top executives were not publicly casting doubt on whether there is any connection between social media and political divisiveness,” the report says.
“Research shows that social media is not a primary driver of harmful polarization, but we want to help find solutions to address it,” a Facebook spokesperson said in a statement. “That is why we continually and proactively detect and remove content (like hate speech) that violates our Community Standards and we work to stop the spread of misinformation. We reduce the reach of content from Pages and Groups that repeatedly violate our policies, and connect people with trusted, credible sources for information about issues such as elections, the COVID-19 pandemic and climate change.”
The report also raises the issue that these problems are difficult to address “because the companies refuse to disclose how their platforms work.” Among the researchers’ recommendations is that Congress force “Facebook and Google/YouTube” to “share data on how algorithms rank, recommend, and remove content.” Both the platforms releasing the data and the independent researchers studying it should be legally protected as part of that work, they write.
Additionally, Congress should “empower the Federal Trade Commission to draft and enforce an industry code of conduct,” and “provide research funding” for alternative business models for social media platforms. The researchers also raise several changes that Facebook and other platforms could implement directly, including adjusting their internal algorithms to further de-emphasize polarizing content, and making those changes more transparent to the public. The platforms should also “double the number of human content moderators” and make them all full employees, in order to make moderation decisions more consistent.