Facebook tests alerting users to extremist posts
A Facebook test of pop-up boxes asking people whether they think friends are becoming extremists raised concerns Friday among US conservatives who felt their voices might be stifled.
Facebook spokesman Andy Stone said in a Twitter exchange that the alerts sprang from an initiative at the social network to combat violent extremism and dangerous organizations.
“Redirect Initiative” features are intended to route people using hate- or violence-related search terms toward resources, education or outreach groups aimed at more harmonious outcomes, according to Facebook.
For example, Facebook said that searches related to white supremacy in the United States get directed to Life After Hate, a group that provides crisis intervention.
Images of the alerts shared on Twitter showed messages asking whether users were worried someone they knew was becoming an extremist or if they had been exposed to extremist content.
People could opt to click on a link to “get support” or simply close the pop-up box.
Virginia state politician Nicholas Freitas, a Republican, was among those who shared an image of the Facebook alert on Twitter.
“I have a real concern that some leftist technocrats are creating an Orwellian environment where people are being arbitrarily silenced or banned for saying something the ‘thought police’ doesn’t like,” Freitas said in the post.
Facebook and other online platforms have been under pressure to stop the spread of misinformation and posts leading to real-world violence.
The social media giant recently beefed up automated tools to assist group moderators striving to keep exchanges civil in a time of clashing viewpoints.
Automated systems at Facebook check for posts in groups and news feeds that violate the platform’s rules about what content is acceptable.
Facebook in June banned former US president Donald Trump for two years, saying he deserved the maximum punishment for violating platform rules over a deadly attack by his supporters on the US Capitol.
Trump was suspended from Facebook and Instagram after posting a video during the attack by his fired-up supporters challenging his election loss, telling them: “We love you, you’re very special.”
The punishment was effective from January 7, when Trump was booted off the social media giant, and came after Facebook’s independent oversight board said the indefinite ban imposed initially should be reviewed.
“Given the gravity of the circumstances that led to Mr. Trump’s suspension, we believe his actions constituted a severe violation of our rules which merit the highest penalty available under the new enforcement protocols,” Facebook vice president of global affairs Nick Clegg said in a post.
Facebook also said it would no longer give politicians blanket immunity for deceptive or abusive content on the grounds that their comments are newsworthy.
© 2021 AFP