Monday, 12 December 2022

Leaked documents reveal Meta knew Instagram was pushing girls towards content that harmed their mental health

If there’s anything that Elon Musk’s Twitter saga and the Twitter Files have shown us, it’s that content moderation by social media platforms is anything but straightforward. Platforms like Instagram and Facebook need to strike a balance between making a user’s feed as engaging as possible and keeping users, especially impressionable ones, away from harmful content. This is where most social media platforms fail miserably.

Meta knew Instagram was pushing girls to harmful content that affected mental health, reveals leaked document

A previously unpublished document leaked from Meta shows that the people heading the company when it was still called Facebook knew that Instagram was pushing young teenage girls towards dangerous and harmful content, and did nothing to stop it.

The document reveals how an Instagram employee investigated Instagram’s algorithm and recommendations by posing as a 13-year-old girl looking for diet tips. Instead of showing the user content from medical and qualified fitness experts, the algorithm chose to show more viral topics that drew greater engagement and were merely adjacent to dieting. These “adjacent” viral topics turned out to be content around anorexia. The user was led to graphic content and recommendations to follow accounts titled “skinny binge” and “apple core anorexic.”

Instagram was aware that almost 33 per cent of its teenage users felt worse about their bodies because of the app’s recommended content and the algorithm it used to curate their feeds. It also knew that teens who used the app reported higher rates of anxiety and depression.

This is not the first time that Instagram’s algorithms and the content they push on users have been a point of contention for mental health experts and advocates. Earlier this year, Instagram was officially listed as a contributing cause of death by a coroner in the UK in a case involving a 14-year-old girl named Molly Russell, who died by suicide in 2017.

In Molly Russell’s case, one of the key questions the inquest focused on was whether the thousands of posts Molly viewed on platforms like Instagram and Pinterest promoting self-harm contributed to her taking her own life. In his ruling, coroner Andrew Walker concluded that Russell’s death couldn’t be ruled a suicide. Instead, he described her cause of death as “an act of self-harm whilst suffering from depression and the negative effects of online content.” Walker, at one point, described the content that Russell liked or saved in the days before her death as so disturbing that he found it “almost impossible to watch.”

“The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,” which “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help,” Walker said.

Cases like these have opened up a debate about the content moderation policies social media platforms have, and how they play out in real life. Attorney Matt Bergman started the Social Media Victims Law Center after reading the Facebook Papers, which were disclosed by whistleblower Frances Haugen last year. He is now working with more than 1,200 families who are pursuing lawsuits against social media companies.

“Time after time, when they have an opportunity to choose between safety of our kids and profits, they always choose profits,” said Bergman in an interview with a news agency in the US. He argues the design of social media platforms is ultimately hurting kids. 

“They have intentionally designed a product that is addictive,” Bergman said. “They understand that if children stay online, they make more money. It doesn’t matter how harmful the material is.” Bergman argues the apps were explicitly designed to evade parental authority and is calling for better age and identity verification protocols.

Meta’s global head of safety Antigone Davis has said “we want teens to be safe online” and that Instagram doesn’t “allow content promoting self-harm or eating disorders.” Davis also said Meta has improved Instagram’s “age verification technology.”

Several activists and advocacy groups believe that content moderation across platforms needs an overhaul. While the broader consensus is that social media platforms should set up independent moderation councils and regulate content themselves, others argue that a larger, global body should set content moderation policies.

Taking content moderation away from the platforms and handing it to an independent council that oversees every platform’s moderation policies opens up a whole new can of worms. For example, it would make it far easier for regimes to suppress political dissidents and news that is unfavourable to them. This is exactly what the Twitter Files set out to show. The fact remains, however, that content moderation as we know it is broken and needs to be fixed, stat.



from Firstpost Tech Latest News https://ift.tt/PhDyUnG

