Since the beginning of the pandemic, the 26-year-old student from Berkshire, England, has posted multiple videos about suicidal ideation and her stay in a psychiatric ward. Some of these clips are lighthearted; others are far more serious. Yet Williams does not utter the word "suicide" to her front-facing camera, or type it in her captions, for fear the TikTok algorithm will censor or remove her content.

The hashtag #unalivemeplease has 9.2 million views on TikTok; #unaliving has 6.6 million, and #unaliveawareness has an additional 2.2 million. Though #suicideprevention is a frequently used tag on the app, the hashtags #suicide and #suicideawareness do not exist: if you search for them, TikTok pulls up the number for a local crisis helpline. It's a well-intentioned policy, initiated in September 2021, a year after a graphic video of a suicide spread across the app. But users have also come to fear elusive content moderation filters that seemingly suppress or remove videos discussing death, suicide, or self-harm.

The study, led by a communication scientist at the University of Munich, presented participants with news reports about suicide that were identical except for the word used to describe suicide itself. Some of the reports included the neutral German term "Suizid" (suicide), while others used the more problematic terms "Freitod" (free death) and "Selbstmord" (self-murder). The study found that people were more likely to subsequently use the word they had read, and that people's attitudes toward the suicides they read about differed depending on the word used in the piece. Such research is crucial because, Padmanathan notes, the words we use can determine whether or not people seek help for their issues. Without controlled studies, it's impossible to know the effect "unalive" has on people accessing resources.