How image-editing apps are hurting our mental health – and why urgent action is needed.
YouTuber Haley is trying to show girls how their Instagram heroes tweak their bodies digitally, presenting unrealistic – in fact, unreal – images of themselves to their followers.
“First, we’re going to fix my hairline. I hate my hairline. It’s really, like, weird.”
Yet as she demonstrates the image-editing app FaceTune, she becomes visibly excited, drawn into what she can do to perfect herself. In seconds her hairline is ‘fixed’.
She moves on. “Of course,” she adds, “we’re going to just make my waist teeny-tiny.”
She’s busting the myths of how the Insta-famous get their looks, but at the same time she can’t hide her delight as a flick of her finger gives results that would cost thousands to achieve surgically – and with considerably less discomfort.
These apps are, quite clearly, fun. Fun like a house of mirrors, if you push the extreme waist-shrinking slider to its maximum. Or, if you have a little more restraint, fun like having your own Vogue airbrushing team: sharpening your cheekbones just slightly, ironing out those wrinkles you always get when you smile for the camera.
As with many of life’s pleasures, though, we ought to ask whether these apps are good for us. The answer is unsurprising. Creating perfect versions of our bodies – the bodies we grow up in, live in, and know, with all their imperfections – and launching these buffed, slimmed avatars into the digital parallel universe is not, it turns out, good for our mental health.
We were so concerned about the shame and distress people can feel about their bodies that we made body image the focus of Mental Health Awareness Week in 2020. As part of this, we carried out research showing that nearly half of teenagers have worried about their body image in ways they directly attribute to social media. It affects girls more, but boys are not immune.
In addition to these worries and the shame young people can feel about themselves, the research shows that having a negative body image is associated with severe mental health problems such as body dysmorphic disorder and eating disorders.
We should also ask why people feel they need to alter their images in this way, and what can be done about it.
Governments have many policies to deal with online harm, but they say nothing about the more subtle harm of promoting unrealistic images of so-called ‘perfect’ bodies. Unlike, say, sending a death threat online, this is a harm that functions by accretion.
Let’s not pretend a single instance of airbrushed cellulite will hurt anyone very much at all. But when all you see are conventionally ‘perfect’ legs, tummies and faces, served to you algorithmically because that’s what keeps you scrolling, that’s a different matter. You keep scrolling even as your mood sinks, and you start to experience your own body as more and more inadequate, a sign of failure, or an object of disgust. That’s when young people start to face real risks.
To be clear, teenagers are not to blame for this. Image-editing apps are made by profit-making companies that know exactly what they’re doing. They function within a social media ecosystem designed to keep people scrolling, regardless of the emotional consequences.
Much of the most dangerous content is not really made by individuals at all, but by ‘influencers’ working with beauty companies through paid partnerships, which are sometimes declared transparently but often are not.
Good Mental Health is calling for three changes to Governments’ online harm policies, to protect young people from this insidious, daily bombardment.
We want apps designed to adjust users’ bodies and faces to be available only to adults.
We want a requirement for individuals to have control over the types of content presented to them algorithmically on social media, with the safest setting being the default. They can then avoid being presented with such an overwhelming volume of images of conventionally ‘perfect’ bodies.
And we want users to be given control over the type of advertising they receive, so they can avoid seeing excessive amounts of advertising featuring this sort of imagery.
These are proportionate recommendations. As it happens, they would do more than just protect young people from this type of material: we could all do with some more control over the content served to us by secret and impenetrable algorithms.
We and others have been raising the alarm loudly and clearly about these dangers for some time now. Governments do have the power to fix this part of the online world that’s harming us so greatly: they must seize the opportunity to do so.