Two ‘big reasons’ to keep your child away from TikTok

TikTok can be dangerous for teenagers, according to a new report by the Center for Countering Digital Hate (CCDH). Titled “Deadly by Design”, the report claims TikTok bombards vulnerable teenagers with dangerous content that may encourage self-harm, suicide, disordered eating and eating disorders via its ‘For You’ feed.
CCDH researchers studied the TikTok algorithm by setting up two new accounts posing as 13-year-olds in each of the US, UK, Australia and Canada. One account in each country was given a traditionally female name. The second account also had a traditionally female name, but its username additionally contained the string ‘loseweight’. Research suggests that users with body dysmorphia issues often express this through their username. “We differentiate the account with ‘loseweight’ in our analysis as a ‘vulnerable’ account,” said the report.
CCDH researchers then recorded the first 30 minutes of content automatically recommended by TikTok to these accounts in their “For You” feed, the section of TikTok that algorithmically recommends content to users. The algorithm refines its choice of videos based on the information it gathers about the user’s preferences and interests, and the way this works is said to be key to TikTok’s success. The algorithm is proprietary to TikTok.
Startlingly dangerous numbers
CCDH claims it found that TikTok hosts an eating disorder community that uses coded and open hashtags to share content, with over 13.2 billion views of their videos. “On the ‘For You’ feed, our research team encountered numerous videos promoting potentially dangerous content about mental health, disordered eating, or self-harm. Every time videos on these topics, body image or mental health were encountered, researchers would pause and like it, simulating the behavior of a young adult who may be vulnerable to such content,” the report said.
The report shows some disturbing numbers. It says the test accounts were served videos about mental health and body image every 39 seconds on average during the study. Content referencing suicide was served to one account within 2.6 minutes, and eating disorder content within 8 minutes.
The report adds that researchers were extremely disturbed to find that the volume of harmful content shown to vulnerable accounts (i.e. those with ‘loseweight’ in their username) was significantly higher than that shown to standard accounts. Vulnerable accounts were served three times more harmful content, and 12 times more self-harm and suicide videos, than standard accounts.
