
Psychology experts are calling on social media giants to increase transparency around algorithms to protect users’ mental health

Written by adrina

In a new article published in the journal Body Image, a team of psychology researchers outlines a mountain of evidence linking social media use to body image issues. The researchers describe how algorithms could amplify this connection and urge social media companies to act.

Appearance-based social media platforms like TikTok appear to be particularly detrimental to users’ body image. On these platforms, teens are constantly exposed to filtered and edited content that portrays unrealistic body standards. According to recent findings, this distorted environment increases the risk of body dissatisfaction and harmful conditions such as body dysmorphia and eating disorders.

“I’m interested in risk and protective factors in body image, and some of my recent research has focused on the role of social media,” said lead author Jennifer A. Harriger, a professor of psychology at Pepperdine University. “I became interested in the use of algorithms by social media companies and the revelations by whistleblowers that showed companies were aware of the harm their platforms were causing to young users. This article was written as a call to arms for social media companies, researchers, influencers, parents, educators, and physicians. We need to protect our youth better.”

In their report, Harriger and her team explain that these effects can be amplified by social media algorithms that personalize the content shown to each user. These algorithms can funnel users toward content that is more extreme, less moderated, and designed to keep them on the platform.

Importantly, as recent whistleblower statements show, social media companies are not unaware of the damage caused by these algorithms. Former Facebook employee Frances Haugen leaked documents showing the social media giant was aware of internal research linking its products to mental health and body image problems in teens. A TikTok whistleblower later leaked evidence of an algorithm that carefully manipulates the content users see, prioritizing emotionally triggering content to maintain engagement.

“Social media platforms can be valuable opportunities to connect with others, and users have the power to customize their own experiences (by choosing what content to follow or interact with); but social media platforms also have disadvantages. One of those downsides is the company’s use of algorithms designed to keep the user engaged for long periods of time,” Harriger told PsyPost.

“Social media companies are aware of the harm caused by their platforms and use of algorithms, but have made no effort to protect users. Until these companies are more transparent about the use of their algorithms and give users the ability to opt-out of content they don’t want to see, users are at risk. One way to minimize risk is to only follow accounts that have a positive impact on mental and physical health and block content that has triggering or negative content.”

In their article, Harriger and colleagues outline recommendations to combat these algorithms and protect the mental health of social media users. First, they emphasize that the primary responsibility lies with the social media companies themselves. Echoing suggestions from the Academy for Eating Disorders (AED), the authors state that social media companies should increase transparency about their algorithms, take steps to remove accounts that share eating disorder content, and make their research data more accessible to the public.

The researchers add that social media platforms should let users know why the content in their feeds was selected for them. They should also limit microtargeting, a marketing strategy that targets specific users based on their personal information. Additionally, these companies bear social responsibility for the well-being of their users and should take steps to raise awareness of weight stigma. This could be accomplished by consulting body image and eating disorder experts on ways to promote positive body image among users, for example by promoting body-positive content on the platform.

Influencers can also play a role in shaping the body image and well-being of their followers. Harriger and her colleagues suggest that influencers, too, should consult body image experts for guidance on body-positive messaging. Positive actions could include educating their audiences about social media algorithms and encouraging them to counter the negative effects of those algorithms by following and engaging with body-positive content.

Researchers, educators, and clinicians can explore ways to prevent the negative effects of social media on body image. “It’s difficult to study the effect of algorithms empirically because each user’s experience is personally tailored to their interests (e.g., what they’ve clicked or viewed in the past),” Harriger said. “However, research can examine the use of media literacy programs that address the role of algorithms and equip young users with tools to protect their well-being on social media.”

Such research can help inform social media literacy programs that educate youth about social media advertising, encourage them to think critically when engaging with social media, and teach them strategies for increasing the positive content in their feeds.

Parents can teach their children positive social media habits by modeling healthy behavior with their own electronic devices and by setting rules and boundaries around their children’s social media use. They can also discuss topics such as image manipulation on social media and algorithms with their kids.

Overall, the researchers conclude that social media companies have the ultimate responsibility to protect the well-being of their users. “We affirm that changes must occur at the system level so that individual users can effectively do their part to maintain their own body image and well-being,” the researchers report. “Social media companies need to be transparent about how content is being delivered if algorithms continue to be used, and they need to give users clear ways to easily opt-out of content they don’t want to see.”

The article, “The Dangers of the Rabbit Hole: Reflections on Social Media as a Portal into a Distorted World of Edited Bodies and Eating Disorder Risk and the Role of Algorithms,” was authored by Jennifer A. Harriger, Joshua A. Evans, J. Kevin Thompson, and Tracy L. Tylka.

