Complaint against social network for provocation to suicide

Published on September 19, 2023

The social network TikTok is accused of amplifying the distress of a teenager who took her own life in 2021. Its algorithm allegedly steered the girl toward a large volume of videos harmful to her mental health. The complaint revives the debate over platforms’ responsibility for young people’s unhappiness, between the cult of thinness and the promotion of destructive behavior.

Are social networks responsible for young people’s unhappiness? The parents of Marie, a 15-year-old who took her own life on September 16, 2021 in Cassis (Bouches-du-Rhône), are convinced they are. They have just filed a complaint against the social network TikTok on charges of “provocation to suicide”, “failure to assist a person in danger” and “propaganda or advertising of methods of committing suicide”, according to information revealed on Monday, September 18 by France Info.

A few weeks before ending her life, the girl explained in a video published on TikTok the reasons for her distress, linked to her weight and the bullying she suffered because of it. Shortly afterward, the social network allegedly began automatically serving her numerous videos about weight loss, which would have deepened her malaise. The family’s lawyer, Laure Boutron-Marmion, denounces an “extremely powerful algorithm” and the sheer number of videos presented, “which can only cause even more damage.” According to the Toulon prosecutor’s office, an in-depth analysis is necessary, alongside the investigation into bullying that is already under way.


An information bubble

While this direct accusation of “provocation to suicide” is a first in France, TikTok’s responsibility had already been called into question several times over such harms. In December 2022, the Center for Countering Digital Hate (CCDH) published a report showing how content related to self-harm and weight loss is suggested to users within minutes of their first sessions. The platform also faces ongoing accusations over its failure to curb harassment.

There is legal precedent in the United Kingdom. On September 30, 2022, a British inquest found the social networks Instagram and Pinterest partly responsible for the death of Molly, a 14-year-old girl who took her own life in 2017. The investigation revealed content that promoted self-harm and other content that enclosed her in an information bubble on the topic of depression. The girl reportedly watched a total of 138 videos on these subjects. According to the family’s lawyer, “Instagram literally gave Molly ideas.”


The risk of depression is documented by the company itself. “32% of teenage girls said they felt bad about their bodies, and Instagram made this situation worse,” indicates an internal study conducted by Facebook (now Meta), Instagram’s parent company, and revealed in 2021 by The Wall Street Journal. “We are aggravating the physical complexes of one in three girls,” the study concludes. Instagram offers filters to appear thinner, to have fuller lips, more almond-shaped eyes… Instagram also estimates that 13% of young British users and 6% of young American women who have expressed the desire to take their own lives traced that desire to the social network.

“We should not stigmatize mental health problems”

Instagram reacted in 2019, after the campaign led by Molly’s father. A “sensitivity screen” – a filter that first asks users whether they want to view the content – was added to posts related to suicide or self-harm. Pinterest, for its part, judged automated detection by its algorithms ineffective, and has instead adopted community reporting, which already exists on Instagram.


Instagram declines to delete such content automatically. “The advice (from psychologists and associations) that we have received leads us to believe that we should not stigmatize mental health problems by removing images that reflect the sensitive and very difficult issues people struggle with,” explains Adam Mosseri, the head of Instagram. He concedes, however, that detection often comes too late.

Against this backdrop, judicial and legislative pressure keeps mounting. The Digital Services Act (DSA), the European regulation designed to guarantee fundamental rights in digital spaces, which came into force in August 2023, requires platforms to offer a purely chronological feed free of algorithmic suggestions, and imposes greater transparency around moderation. A way to regain control.

Fanny Breuneval

Times of National