YouTube users will get a 24-hour timeout if their toxic comments are removed
YouTube is rolling out updates today targeting toxic comments that violate its community guidelines, TechCrunch has reported. Previously, the platform relied on tools like pop-ups to nudge users toward “respectful” interactions, but it’s now taking a (slightly) more assertive approach with warnings and timeouts.
If YouTube detects and removes an abusive comment, it will notify the user that they’ve violated community guidelines. If the same person continues to post toxic comments, they’ll receive a “timeout” and be unable to leave further comments for 24 hours. Users who believe their comments shouldn’t have been pulled can share that feedback, though YouTube didn’t say whether doing so would lift the timeout.
Prior to today’s rollout, YouTube trialed the feature and found it to be effective. “Our testing has shown that these warnings/timeouts reduce the likelihood of users leaving violative comments again,” it wrote in the blog post.
YouTube famously has one of the more toxic comment sections on social media, and it’s also overrun with bots offering fake giveaways, crypto schemes and more. To address that problem, YouTube said it’s “improving our automated detection systems and machine learning models to identify and remove spam.” It noted that it removed over 1.1 billion spammy comments in the first half of 2022, and said its machine learning models are continuously improving as spammers change tactics.