Why can’t TikTok take down that disturbing suicide video?



A clip showing a man shooting himself is still circulating on the app, having initially been shared three days ago

Text: Brit Dawson

TikTok is struggling to remove a disturbing video of a man’s suicide, three days after it was uploaded to the app.

The clip reportedly shows 33-year-old army veteran Ronnie McNutt, who, according to The New York Post, shot himself in the head on a Facebook livestream on August 31. It has been reported that Mississippi-based McNutt had recently lost his job at a Toyota plant in Blue Springs, and had broken up with his girlfriend. The video has since gone viral on multiple social media platforms, including TikTok.

TikTok’s ‘For You’ page is an endless, algorithmically powered stream of videos that surfaces clips from people users don’t necessarily follow. Many have reported coming across the clip at random, and have described how traumatising being confronted by the footage was. Given TikTok’s young audience – 69 per cent of users are between the ages of 13 and 24 – this has raised even greater concern for mental wellbeing.

“On Sunday night, clips of a suicide that had originally been livestreamed on Facebook circulated on other platforms, including TikTok,” a TikTok spokesperson tells Dazed. “Our systems, together with our moderation teams, have been detecting and blocking these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.”


“We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”

Despite this promise of action, the video remains on the app. Several users have posted engagement they’ve had with TikTok moderators, who have reportedly told them the video “doesn’t violate our Community Guidelines”. Dazed has asked TikTok for further clarification on why this is happening.

Now, users on the app have started taking matters into their own hands, sharing videos warning others about the presence of the suicide clip on TikTok. “Please try not to go on TikTok today or tomorrow,” one video said. “There is a very graphic and gorey (sic) suicide video going around right now!”

Another user posted a clip addressing rumours that the footage is fake. “In reality, it is not fake,” @alluringskull said. The user then shows a still from the video, in which McNutt is sitting at a desk with a phone to his ear. “If you see this, pause the video, scroll away,” @alluringskull added. “Please stop treating this like a meme, please stop treating this like a joke, this is a real person who passed and his family is grieving.”

“I honestly thought it was fake until I looked into it. It horrified me to see that TikTok allowed this stuff to stay on their app for so long” – James, TikTok user

One user called James was sent the video by his friend, and opened it without knowing what it was. “I honestly thought it was fake until I looked into it,” he tells Dazed in a direct message. “It horrified me to see that TikTok allowed this stuff to stay on their app for so long.”

This isn’t the first time TikTok has been criticised for its poor moderation efforts. In February, the video-sharing platform took three hours to tell police about a suicide that was livestreamed on the app. The video of the victim’s body remained live for over an hour and a half before it was taken down. TikTok reportedly took steps to prevent the post from going viral before notifying the authorities.

In July, the app’s moderation guidelines were questioned once again, after its algorithm promoted a collection of anti-semitic memes, soundtracked by the lyrics, “We’re going on a trip to a place called Auschwitz, it’s shower time”. Nearly 100 users featured the song in their videos, which remained on the app for three days – and for eight hours after the BBC alerted the platform to their presence.

@tiktok_us is disgusting. There was a video of a guy committing suicide by shooting himself in the head with a shotgun but then tells me it “doesn’t violate community guidelines” yet y’all want to block my vid because my vape is in it. ok pic.twitter.com/sABRSRdXJc

— ry 🌈 (@ryleemaarie) September 7, 2020

TikTok’s latest Transparency Report – published in July – says that the app removed over 49 million videos globally in the second half of last year, with 98.2 per cent of those taken down before they were reported, and 89.4 per cent removed before they received any views. However, TikTok is also known for censoring users and content that violate no guidelines, including a teenager who criticised China (where the company is based), creators deemed “ugly”, poor, or disabled, and Black creators.

The app admits that it won’t catch every instance of inappropriate content, and asserts that it will continue to invest in technology and experts to make TikTok as safe as possible. Yesterday (September 8), the app joined the European Union’s Code of Conduct on Countering Illegal Hate Speech, pledging to crack down on hateful and illegal content.

“Our ultimate goal is to eliminate hate on TikTok,” a spokesperson said in a statement. “We recognise that this may seem an insurmountable challenge as the world is increasingly polarised, but we believe that this shouldn’t stop us from trying.”

Update (September 24): TikTok has said the suicide video was uploaded to the app as part of a “coordinated effort” by bad actors. In a statement, the video-sharing platform said: “Through our investigations, we learned that groups operating on the dark web made plans to raid social media platforms including TikTok, in order to spread the video across the internet. What we saw was a group of users who were repeatedly attempting to upload the video to our platform.”

TikTok has also sent a letter to other social media platforms, including Facebook, YouTube, and Twitter, proposing that they notify each other when violent and graphic content is circulating on their sites.

If you’re struggling with mental health issues, you can contact the suicide prevention specialists Samaritans in the UK, or the National Suicide Prevention Lifeline in the US.


