Social media scandals and the trend of tightening censorship
Minh Khanh
Monday, May 4, 2026 - 21:42
(L&D) - The rapid development of social media platforms is posing multiple consequences for users, especially children, ranging from harmful content to risks of privacy violations. In response to this situation, many countries have begun to tighten regulation over these platforms, establishing clearer requirements regarding responsibility for content control and user protection.
Many popular social media platforms have been embroiled in scandals involving unlawful data collection, the dissemination of harmful content, and threats to user safety, particularly for minors. In the United States, Meta Platforms – the company that owns Facebook and Instagram – was sued by more than 30 states in 2023 over allegations that its platforms harm young users. According to the complaints, Meta used content recommendation mechanisms and continuous engagement features to retain users, including repeatedly displaying content that can affect psychological well-being, such as material relating to body image, violence, or other negative themes. Representatives of the states argue that prolonged exposure to such content may contribute to increased anxiety, depression, and other mental health issues among adolescents.
Mark Zuckerberg – Chief Executive Officer of Meta Platforms – attended a hearing at the Los Angeles Superior Court in late February 2026. The hearing is considered a significant milestone, focusing on whether social media platforms are designed in ways that are addictive and harmful to young users. Photo: The Guardian.
Beyond state-level lawsuits, Meta Platforms has also recently faced civil litigation in the United States. In March 2026, a jury delivered a verdict in what is regarded as the first trial over “social media addiction.” The family of a teenage girl (K.G.M.) stated that she had used Facebook and Instagram continuously from a very young age and had repeatedly been exposed to negative content, leading to serious psychological problems, including depression and self-harm. These harms formed the basis of the family’s lawsuit against Meta and related platforms. The plaintiffs argue that the way content is curated and displayed on the platforms can intensify usage, thereby harming young users. The case is viewed as a notable milestone, as it marks the first time arguments about the “addictive nature” of social media have been tested in litigation in the United States. YouTube was also a defendant in the same case, and the jury accepted part of the plaintiffs’ claims against it.
Alongside Meta Platforms, several other platforms have faced similar issues, albeit in different respects. Snap Inc. – the operator of Snapchat, a messaging and image-sharing application popular among young people in the United States – has been sued by the Texas Attorney General over allegations that it misled parents about the platform’s safety while failing to implement adequate protections for underage users. According to the complaint, Snapchat allows users to send messages and images that automatically disappear after being viewed or after a short period. This feature, the plaintiffs argue, makes content difficult to retain and control, increasing the risk that children may be exposed to inappropriate content or exploited by malicious actors for grooming or harassment without leaving clear traces.
Meanwhile, TikTok has repeatedly been investigated by regulatory authorities in the United States and Europe over the collection and processing of user data, including that of minors, as well as the risk of distributing inappropriate content through its recommendation mechanisms. Discord, a widely used group messaging and video-calling application, has likewise been criticized for hosting private groups that may be exploited to disseminate harmful content, facilitate cyberbullying, or target children, posing challenges for content control and monitoring.
The Attorney General of the state of Florida, James Uthmeier, announced an investigation into multiple social media platforms, including TikTok and Discord, regarding child safety issues, amid growing concerns over abusive conduct and harmful content in the online environment.
In response to concerns over harmful content and risks to young users, many countries have begun to shift from a recommendation-based approach to imposing specific legal obligations on social media platforms. In Europe, the European Commission has implemented a series of measures to strengthen content control and user protection, most notably the Digital Services Act (DSA). Under this framework, large online platforms are required to assess and mitigate content-related risks, including risks of harm to minors, while assuming greater responsibility for controlling illegal or harmful content. In practice, several social media platforms have been investigated by regulatory authorities in Europe and required to explain their compliance with these obligations. Notably, TikTok was fined 345 million euros under EU data protection rules in relation to the processing of minors’ personal data.
The social media platform X (formerly known as Twitter) has also been alleged to have violated the European Digital Services Act (DSA).
In addition, Europe is promoting technical solutions to support enforcement, notably the deployment of age verification applications that allow users to prove their age when accessing online services without disclosing excessive personal data. This approach reflects a trend toward integrating technology and law in regulating the digital environment.
In the United States, although there is not yet a unified federal law similar to the Digital Services Act (DSA), regulatory agencies and legislative bodies have increased pressure on platforms through investigations, hearings, and legislative proposals related to child protection and user data. Several bills have been introduced to restrict data collection from underage users, as well as to require platforms to design their products in a safer manner. In parallel, civil litigation and legal actions from states are also becoming important tools to compel technology companies to change their modes of operation.
Meanwhile, China has adopted a stricter control-oriented approach, as the Cyberspace Administration of China has issued regulations to control online content, including content generated by artificial intelligence, while also restricting services and designs that may be addictive for young users.
These measures indicate an increasingly evident trend of regulatory intervention, aimed at establishing specific limits on the operation of platforms in the digital environment. A common feature of these approaches is the shift from corporate self-regulation to specific legal obligations that can be subject to oversight and enforcement in cases of violation.