Instagram Will Now Warn Parents If Teen Searches for Suicide or Self-Harm Content
No parent wants to think about the reality of youth suicide, the second-leading cause of death for teens and young adults ages 10 to 34. But they certainly would want to know if their child was one of the 20.4% of high school students who have seriously considered it, right?
Now, Instagram is adding a new alert for concerned parents enrolled in the platform’s parental supervision tools: It will send a warning if their teen, using a teen account, repeatedly tries to search for terms related to suicide or self-harm within a short period of time.
It’s the first time that parents will be proactively notified of such searches, though Instagram already blocks attempts to search for this type of content, directing people to helplines instead.
The new parent alert, which will roll out in the U.S., U.K., Australia, and Canada after an initial notification next week, offers advice on how to support your teen, including guidance from mental health professionals on how to validate their emotions, empathize, and listen.
“We understand how sensitive these issues are, and how distressing it could be for a parent to receive an alert like this,” Instagram notes on its website today. “The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support. These alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content, and to give them the resources they need to support their teen.”
How the Alerts Will Work
Attempted searches for terms like “suicide” or “self-harm” will prompt an alert to be sent to parents via email, text, or WhatsApp, depending on the contact information available, as well as through an in-app notification. Tapping on the notification will open a full-screen message explaining that their teen has repeatedly tried to search Instagram for terms associated with suicide or self-harm within a short period of time.
“Our goal is to empower parents to step in if their teen’s searches suggest they may need support,” the website explains. “We also want to avoid sending these notifications unnecessarily, which, if done too much, could make the notifications less useful overall.”
Instagram analyzed search behavior and consulted with experts from its Suicide and Self-Harm Advisory Group, landing on a threshold that requires a few searches within a short period of time, while still erring on the side of caution.
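Instagram hasn’t disclosed its exact threshold or time window, but the behavior it describes resembles a standard sliding-window rate check. The sketch below is purely illustrative, not Meta’s implementation: the threshold, window length, and term list are all hypothetical placeholders.

```python
import time
from collections import defaultdict, deque

# Hypothetical parameters -- Instagram has not published its actual values.
THRESHOLD = 3            # flagged searches needed to trigger an alert (assumed)
WINDOW_SECONDS = 3600    # "short period of time" (assumed: one hour)
FLAGGED_TERMS = {"suicide", "self-harm"}  # illustrative subset only

class SearchAlertMonitor:
    """Sliding-window counter: signal an alert once a teen's flagged
    searches within the window reach the threshold."""

    def __init__(self):
        # teen_id -> timestamps of recent flagged search attempts
        self._events = defaultdict(deque)

    def record_search(self, teen_id: str, query: str) -> bool:
        """Return True if this search attempt should trigger a parent alert."""
        if query.lower() not in FLAGGED_TERMS:
            return False
        now = time.time()
        window = self._events[teen_id]
        window.append(now)
        # Drop searches that have aged out of the window.
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        return len(window) >= THRESHOLD
```

A production system would of course match far more than exact strings (misspellings, coded terms) and would deduplicate alerts so parents aren’t notified over and over; those details are omitted here.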
The alerts build on Instagram’s existing policies against content that promotes or glorifies suicide or self-harm. While the platform does allow people to share content about their own struggles with these issues, such content is hidden from teens, even if it’s shared by someone they follow.
Instagram’s news comes as a landmark social media trial is underway in Los Angeles. It’s the first in a group of cases brought against Instagram, YouTube, TikTok and Snap by more than 1,600 plaintiffs, including over 350 families and over 250 school districts, accusing the tech companies of knowingly designing addictive products harmful to young users’ mental health.
Next up, Meta is building similar parental alerts to help monitor teens’ conversations with AI. Those are expected later this year.