It’s becoming increasingly clear that the big social media platforms could be doing a lot more to protect their teenage users. For years, they have been dragging their feet, unwilling to take even basic safety steps. Case in point: Instagram is only now deciding to blur out nudity in images sent by young users. Isn’t this something that should have been done years ago?
What in the world is a “nudity protection” feature?
Of course, the big social media platforms are absolutely shameless, and they are actually taking a victory lap on this one. Instagram, for example, is congratulating itself for coming up with a so-called “nudity protection” feature that should help to cut down on cases of sexual exploitation involving minors.
As Instagram points out, the creation of this new feature is just another step in its long-standing work to protect young people. And Instagram has certainly done its homework on this one. It has lined up support from a prominent cyberbullying expert at Harvard, as well as from key organizations such as the National Center for Missing & Exploited Children.
The goal, it appears, is to keep any more embarrassing cases of “sextortion” from going public and making Instagram look bad. In a classic sextortion case, a young user is tricked, manipulated, or coerced into sending nude images via Instagram. As soon as the recipient has those images, they can be used for extortion or blackmail: all the criminal has to do is threaten to show them to family or friends. Tragically, some sextortion cases have even driven teens to suicide.
In all fairness to Instagram, the new “nudity protection” feature sounds promising. It will be turned on by default for all users under the age of 18, and it will send direct messages reminding users about the potential risks of sharing nude images (or other sensitive material) via social media.
Is more legislation on the way?
The good news, if you happen to be a parent of a young teen or tween, is that new legislation seems to be on the way. Lawmakers are tired of waiting for social media platforms to self-regulate, so they are going to force them to make changes.
The legislation that everyone is talking about right now is the Kids Online Safety Act (KOSA), which promises to make social media companies much more accountable for the type of content appearing on their platforms. It would impose a “duty of care” on them, making them responsible if they inadvertently recommend unsuitable content to teens. The legislation has been around since 2022, but it looks like this might be the year it finally gets passed.
Hopefully, the threat of lawsuits popping up all over the nation will force the hand of companies such as Meta (parent company of both Facebook and Instagram), X, and TikTok. Alas, we live in an era when “revenge porn” and “sextortion” are commonplace. That’s bad enough when it involves adults, but it’s disastrous when it involves young teens. So it’s absolutely vital that the social media companies get this right. When the new “nudity protection” feature rolls out nationally later this year, the parents of this nation will be watching.