Meta, Snap and TikTok have formed a new program called Thrive to stop the spread of graphic content that depicts or encourages self-harm and suicide. Thrive allows participating companies to exchange “signals” that alert one another to violating content on their platforms.
Thrive was developed in partnership with the Mental Health Coalition, a nonprofit organization dedicated to removing the stigma around discussions about mental health. Meta says it provides the technical infrastructure behind Thrive that allows “signals to be shared safely.” It uses the same cross-platform signal-sharing technology as the Lantern program, which is designed to combat child abuse online. Participating companies can share hashes that match the offending media to alert each other.
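Hash-based signal sharing of this kind can be sketched roughly as follows. This is a minimal illustration, not Thrive's actual implementation: the function names, the plain in-memory set standing in for the shared infrastructure, and the choice of SHA-256 are all assumptions for the sake of the example.

```python
import hashlib

# A platform's local registry of hashes flagged by partner platforms.
# In a program like Thrive, these signals would travel over shared,
# secured infrastructure; a plain set stands in for it here.
shared_hashes: set[str] = set()

def media_hash(data: bytes) -> str:
    """Compute a stable fingerprint for a piece of media (SHA-256, illustrative)."""
    return hashlib.sha256(data).hexdigest()

def share_signal(data: bytes) -> str:
    """Flag violating media by sharing its hash, so partners can match it."""
    h = media_hash(data)
    shared_hashes.add(h)
    return h

def matches_known_signal(data: bytes) -> bool:
    """Check uploaded media against hashes shared by other platforms."""
    return media_hash(data) in shared_hashes
```

Note that an exact cryptographic hash like this only matches byte-identical copies; industry hash-matching systems typically also use perceptual hashing so that resized or lightly edited duplicates are caught as well.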
Meta says it has already made such content harder to find on its platforms, while still giving people space to talk about their mental health, suicide and self-harm, as long as they don't promote it or share graphic descriptions.
According to Meta's published data, the company takes action on millions of pieces of content discussing suicide and self-harm each quarter. Last quarter, an estimated 25,000 of those posts were restored, most of them after a user appealed.