The parents of four British teenagers have sued TikTok over the deaths of their children, which they claim were the result of the viral “blackout challenge”.
The lawsuit claims Isaac Kenevan, 13, Archie Battersbee, 12, Julian “Jools” Sweeney, 14, and Maia Walsh, 13, died in 2022 while attempting the “blackout challenge”, which became popular on social media in 2021.
The US-based Social Media Victims Law Center filed the wrongful death lawsuit against the social media platform TikTok and its parent company, ByteDance, on behalf of the children’s parents on Thursday.
Matthew Bergman, the founding attorney of the Social Media Victims Law Center, said: “It’s no coincidence that three of the four children who died from self-suffocation after being exposed to the dangerous and deadly TikTok blackout challenge lived in the same city and that they all fit a similar demographic.
“TikTok’s algorithm purposely targeted these children with dangerous content to increase their engagement time on the platform and drive revenue. It was a clear and deliberate business decision by TikTok that cost these four children their lives.”
…
The lawsuit accuses TikTok of being “a dangerous and addictive product that markets itself as fun and safe for children, while lulling parents into a false sense of security”. It says TikTok “pushes dangerous prank and challenge videos to children based on their age and location in order to increase engagement time on the platform to generate higher revenues”.
The lawsuit further claims that TikTok has told lawmakers around the world that the blackout challenge had never been on its platform and “works to discount credible reports of children being exposed to and dying because of blackout and similar challenge videos on the platform”. It notes that other dangerous challenges that have been found on TikTok include those involving medications, hot water and fire.
…
Jools’s mother has campaigned for parents to be given the legal right to access their children’s social media accounts to help them understand how their children died, after she was left with no explanation for her son’s death in 2022.
Changes to the Online Safety Act, which come into force in the UK this year, explicitly require social media companies to protect children from encountering dangerous stunts and challenges on their platforms, as well as to proactively prevent children from seeing the highest-risk forms of content.
If you’re genuinely asking this and trying to argue it, then you’ve entirely lost the plot.
‘Broadcasting’ doesn’t enter into this. Kids will do this regardless, and have done so for much longer than the UK has had access to any type of video media.
The only difference is kids can now show their stupidity to the world… which they have been able to do, to some extent, for the last 40 years. We don’t call a personal website broadcasting, because by no written definition could it be considered that. Neither is YouTube broadcast media, nor is anything else on the internet (strictly speaking).
Neither is TikTok, nor will whatever comes after it be, and so on.
Now we could change the definition of broadcast to mean any internet site or app and apply broadcast standards to it… but that just means the UK loses access to the Internet. No one would comply, because what power does a tiny island in perpetual economic decline have? If the UK didn’t have any domestic advertisers, sites would already ignore it.
Broadcasting, publishing with instant and far reach, etc., with the power of algorithms to amplify. Whatever you want to label it. I only used “broadcasting” for the one-to-many communication, “many” being more than a handful of friends. Even a website posted in the 90s was likely to reach only a few friends; discoverability would have been a huge issue for any one kid’s GeoCities site.
It is different from word of mouth. It is another party amplifying the behavior you are saying already existed.
That’s my point, and you just stated it. The difference is they can show the world and perpetuate the “challenge” by modeling it for thousands of others instead of just their local friends/kids in the neighborhood.
You are discounting the power of the platform algorithms to connect these kids and to amplify the appearance of this behavior as normal.
Even if we agree the volume of exposure to this specific “challenge” didn’t change, a company shouldn’t be profiting off it. They shouldn’t be participating. They shouldn’t be applying their technology to optimize and enhance the delivery of the challenge to all corners of the world.
Now if you wanna discuss whether this is an undue burden on TikTok to monitor for these issues, that’s different.
I just strongly disagree that “whatever, this isn’t new, this has been happening since kids were invented” is a valid argument about whether a platform should be policing the content. And where does it stop? Can adults challenge children? Is that fine too, because that surely happened before the internet as well?
The following is maybe an extreme comparison to make, but kids have been showing themselves naked to each other since the dawn of time; that does not mean TikTok shouldn’t monitor and address instances of kids posting themselves naked when it happens on their broadcasting/publishing/(whatever word you prefer) platform.