EU Initiates Investigation Into Meta Over Child Safety Concerns
The European Commission said in its statement that it is concerned the systems of both Facebook and Instagram, including their algorithms, "may stimulate behavioural addictions in children."
European Union regulators have launched a formal investigation into Meta. The tech giant is under scrutiny over concerns that Facebook and Instagram are creating addictive behaviour among children. The European Commission, the EU’s executive arm, said that Meta may have breached the Digital Services Act (DSA) in areas linked to the protection of minors.
Regulators are also concerned about the age-assurance and verification methods put in place by the social media giant. The investigation will examine the potentially addictive impact of social media platforms, known as the "rabbit hole" effect. The Commission said it will carry out an in-depth investigation as a matter of priority and continue to gather evidence. The opening of formal proceedings enables the Commission to take further enforcement steps, such as adopting interim measures and non-compliance decisions.
We've opened formal proceedings against Meta regarding the protection of minors on Facebook and Instagram.
— European Commission (@EU_Commission) May 16, 2024
The systems of both platforms, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called 'rabbit-hole effects'.
"Today we open formal proceedings against Meta. We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram," Commissioner Thierry Breton said in the statement. Meanwhile, a Meta spokesperson told CNN that the company wants young people to have safe, age-appropriate experiences online and has spent a decade developing more than 50 tools and policies designed to protect them. Earlier this year, the Commission launched a separate investigation into Meta over concerns that its platforms were failing to counter disinformation ahead of the European Parliament elections.