By Hadassa Ferreira, Arts & Culture Editor
Are social media platforms safe for children? That is a question social media giants, such as Facebook, TikTok, and others, must now answer in a series of trials that started in the first week of February.
Facebook just turned 20 years old on Feb. 4. Over the past two decades, the platform has enabled people to maintain relationships online by sharing personal content and communicating directly with others.
Although Meta has reason to celebrate, the milestone arrives at a moment of significant concern. According to The New York Times, Meta, TikTok, YouTube, and Snapchat have been sued over accusations that their products are addictive.
In the same week, a series of trials began in which Meta and the other platforms faced landmark legal tests of claims that social media harms users' mental health and causes addiction.
Dr. Jacqueline Wisler, assistant professor of marketing at Saint Leo University, stated, “Research summarized by the American Psychological Association and the U.S. Surgeon General indicates that passive use, such as extended scrolling and social comparison without interaction, is associated with higher reports of anxiety, depressive symptoms, and lower self-esteem, particularly among adolescents and young adults.”

Those filing the lawsuits claim they have been personally harmed by social media platforms, arguing that the platforms are designed to drive excessive use.
People have compared platforms such as YouTube, TikTok, Snapchat, and Facebook to cigarettes, arguing that they are addictive and harmful. The companies now face a series of landmark trials in which they must answer those accusations and demonstrate that their platforms are safe.
“I think we depend too much on other people’s opinions. We value attention, and we value what other people say too much. Also, we get influenced by celebrities, and social media is a tool to get all of that,” said Andreas Cubillos, a graduate student pursuing an MBA in project management.
This is not the first time social media platforms have faced legal challenges. In 2023, a Californian woman, now 20 years old and identified by the initials K.G.M., filed a lawsuit arguing that she began using social media as a child, became addicted to the platforms, and went on to struggle with anxiety, body-image issues, and depression.
The number of people filing lawsuits against social media platforms is larger this year, raising concern that the cases could pose serious legal threats to Meta, YouTube, TikTok, and Snapchat, forcing them to take greater responsibility for users' well-being.
According to LitPRO, a litigation website, by the end of 2025, there were 2,172 multidistrict litigation cases, with hundreds more in the state courts. Simmons Hanly Conroy said that 2,325 lawsuits have been filed so far in 2026.
In the past, social media platforms avoided major legal liability by relying on a federal law, Section 230 of the Communications Decency Act, which shields websites from responsibility for content that users publish.
The current climate is one of caution. The discussion of the side effects of social media usage has never been as intense, and people are starting to reconsider how they use these platforms.
This means expectations for how future generations use social media are rising. Beyond that, new standards for how children can access social media have been proposed.
“You need to be careful in how you use it. Social media is good for marketing and to make money, such as to promote a business, but people need to be careful, especially younger people and kids,” added Cubillos.
Wisler concluded by saying, “From a marketing and platform governance perspective, ethical usage guidelines include greater transparency about how algorithms recommend content, stronger age-appropriate design protections for minors, user controls that allow individuals to manage notifications and exposure time, clearer labeling of sponsored or AI-generated content, and collaboration with public health researchers to evaluate long-term platform effects. Most experts describe responsibility as shared among platforms, regulators, marketers, educators, parents, and users.”
