TikTok Challenges Gone Wrong: When Algorithms Cause Injury

TikTok trends can look silly and harmless—a dance, a prank, a quick laugh. But some challenges don’t just waste time; they send kids to the emergency room or worse. When that happens, it’s not just about one bad decision. The way social media algorithms push dangerous content in front of children can play a major role in the harm.
For Maine families, understanding how these challenges work—and how the law is starting to treat algorithm-driven harm—can make a big difference after a serious injury.
How TikTok challenges hook kids in the first place
TikTok and similar platforms don’t show videos in random order. They use recommendation algorithms that study what a user watches, likes, and replays, then serve up similar content to keep them engaged as long as possible. Lawsuits across the country now claim that these algorithms are intentionally addictive and can push kids toward extreme, risky material.
For a child:
- One silly stunt video can turn into an endless stream of “challenge” clips.
- The algorithm rewards creators who post shocking content that gets lots of views and comments.
- Kids feel social pressure to try the trend themselves so they can post their own video and gain followers.
That cycle doesn’t feel accidental to many parents—or to the attorneys who are filing lawsuits against major social media companies.
Real tragedies tied to TikTok challenges
News stories and court filings describe heartbreaking cases where children died or suffered serious injuries while trying viral challenges:
- The “blackout challenge,” which encourages kids to choke or suffocate themselves until they pass out, has been linked to the deaths of children around the world, some as young as 8.
- A federal appeals court in 2024 allowed a Pennsylvania mother’s lawsuit to move forward after her 10-year-old daughter died attempting the blackout challenge, ruling that Section 230 immunity may not shield TikTok’s algorithmic recommendations.
- Medical and pediatric groups have also warned about other dangerous trends, including the “Benadryl challenge,” which encourages kids to take high doses of allergy medicine to hallucinate, leading to poisonings and emergency-room visits.
More recently, news outlets have reported deaths of children in the U.S. and abroad who apparently tried blackout-style asphyxiation challenges they saw or heard about through TikTok or classmates using the app.
These stories all share one theme: kids didn’t go searching for a way to hurt themselves. The challenge came to them—often repeatedly—through recommended videos.
When does a social media platform bear legal responsibility?
Traditionally, families struggled to sue platforms because of Section 230 of the Communications Decency Act, which often shields websites from liability for user-generated content. But newer cases focus on something different: how the algorithm pushes that content.
Recent lawsuits argue that:
- Platforms design their products to be addictive and to amplify extreme content that keeps kids glued to their screens.
- The companies know, from their own internal research, that this harms youth mental health and increases self-harm risks.
- Recommendation engines that specifically target children with dangerous challenge videos are more like a defective product feature than a neutral “publisher.”
Courts are still sorting out these theories, but parents now have more legal tools than they did just a few years ago. Some cases focus on wrongful death; others seek damages for addiction-like use, anxiety, depression, eating disorders, and self-harm linked to social media use.
What this means for Maine parents after a challenge-related injury
If a Maine child gets hurt while trying a TikTok or other social media challenge, the family may have more than one legal route:
- A potential claim against the platform
  - Especially if the algorithm repeatedly pushed dangerous challenge content, or if the child was under the platform’s stated age limit.
  - These cases may join larger multi-district litigation against social media companies, or proceed individually in state or federal court.
- Claims against others who encouraged or failed to protect the child
  - Schools, youth programs, or adults who directly pressured kids to attempt a stunt.
  - In some situations, other children or parents who staged and recorded the challenge might face civil claims.
- Traditional negligence or premises claims
  - If the challenge took place at a poorly supervised party, camp, or event where adults ignored obvious risks or allowed unsafe conditions.
A Maine personal injury lawyer who follows national social media litigation can help families decide which path makes sense, and how to preserve their ability to recover compensation.
Steps to take if your child is hurt by a social media challenge
You don’t have to build a lawsuit on day one, but you can take practical steps that protect both your child and any future claim:
- Get immediate medical care. Tell providers what happened and that a social media challenge may be involved. Ask them to document all injuries and any emotional or psychological effects.
- Secure digital evidence.
  - Save the child’s device if possible.
  - Take screenshots of the challenge videos, comments, and direct messages.
  - Note dates, usernames, and any notifications that suggested the challenge.
- Talk to your child calmly. Ask where they first saw the challenge—TikTok, Snapchat, classmates’ phones—and whether the app kept showing similar videos.
- Notify schools or youth organizations. If the incident happened on school property or during an activity, report it in writing and request copies of any incident reports.
- Consult a lawyer before deleting accounts or talking to the company. Platforms sometimes request statements or offer “support” that can later complicate a claim. An attorney can guide you on what to share and how to preserve critical data.
How a Maine injury firm can help
A firm like Peter Thompson & Associates can:
- Work with digital forensics experts to reconstruct what your child saw and how often the algorithm pushed similar content.
- Coordinate with pediatric, psychological, and toxicology experts to understand long-term harm.
- Identify whether your case fits into existing national litigation against social media companies.
- Pursue compensation for medical bills, counseling costs, lost future opportunities, and the immense pain and suffering that comes with a serious or fatal injury to a child.
The bottom line
TikTok challenges don’t happen in a vacuum. Social media algorithms help create, spread, and normalize dangerous stunts that kids are not equipped to evaluate. When those trends lead to serious injuries or deaths, Maine families deserve answers—and may have legal options to hold powerful companies accountable.

