A Maryland school district is suing Meta, Google, Snap and ByteDance, the owner of TikTok, for allegedly contributing to a “mental health crisis” among students. A lawsuit filed Thursday by the Howard County Public School System claims that social networks operated by these companies are “addictive and dangerous” products that have “rewired” the way children “think, feel and behave.”
The lawsuit cites a laundry list of issues with Instagram, Facebook, YouTube, Snapchat and TikTok that it alleges harm children. This includes (allegedly) addictive “dopamine-triggering rewards” on each app, such as TikTok’s For You page, which leverages data about user activity to provide an endless stream of suggested content. It also points to Facebook and Instagram’s recommendation algorithms and “features that are designed to create harmful loops of repetitive and excessive product use.”
Additionally, the school district accuses each platform of “encouraging unhealthy, negative social comparisons that lead to body image issues and related mental and physical disorders” in children. Other parts of the lawsuit allege that the apps facilitate child sexual abuse, citing “faulty” parental controls as well as security gaps in each app.
“Over the past decade, Defendants have pursued a strategy of growth at all costs, ignoring the impact of their products on the mental and physical health of children,” the lawsuit states. “In a race to corner the ‘valuable but untapped’ market of tween and teen users, each defendant designed product features to promote repetitive, uncontrolled use by children.”
The Howard County Public School System is far from the only school district that has recently decided to take legal action against social media companies. Two other school districts in Maryland, along with school systems in Washington State, Florida, California, Pennsylvania, New Jersey, Alabama, Tennessee and elsewhere, have filed similar lawsuits over the negative effects of social media on children’s mental health.
“We’ve invested in technology that finds and removes content related to suicide, self-injury or eating disorders,” Antigone Davis, Meta’s head of safety, said in an emailed statement. “These are complex issues, but we will continue to work with parents, experts and regulators like state attorneys general to develop new tools, features and policies that meet the needs of teens and their families.”
Google denies the allegations outlined in the lawsuit, company spokesman Jose Castañeda said in a statement: “In collaboration with child development experts, we’ve created age-appropriate experiences on YouTube for kids and families, and provide strong parental controls.” Meanwhile, Snap spokesman Pete Boogaard says the company “screens all content before it reaches a large audience, which helps protect against the promotion and discovery of potentially harmful content.” ByteDance did not immediately respond to a request for comment.
Critics have drawn attention to the potential impact of social media on children and teens, especially after Facebook whistleblower Frances Haugen came forward with a trove of internal documents showing that Meta knew about Instagram’s potential harms for some young users. Last week, US Surgeon General Dr. Vivek Murthy issued a public advisory warning that social media poses “a profound risk of harm to the mental health and well-being of children and adolescents.”
Some states have responded to the safety issues posed by social media by enacting laws that restrict children from signing up for social media sites. Utah will ban children under 18 from using social media without parental consent starting next year, and Arkansas has passed similar legislation preventing underage children from signing up for social networks. At the same time, a flurry of national online safety bills, some of which could mandate online age verification systems, have made their way to Congress, despite warnings from civil liberties and privacy advocates.