Meta sued again: millions of users harmed by illegal AI data use and manipulative chatbot design — children at greatest risk
The Dutch non-profit organisation Stichting Onderzoek Marktinformatie (SOMI) has filed a second collective representative action for damages against Meta in Germany. The lawsuit centres on the unlawful use of personal data collected from Instagram and Facebook users, as well as from non-users, to train Meta's AI systems.
SOMI is an independent non-profit organisation dedicated to protecting society from unlawful Big Tech conduct. Registration on this website costs €7.50; in return, we will register your claim with the Federal Office of Justice. You also support our campaigns throughout Europe.

Compensation
How does this work?
Would you like to participate in the claim? Please read the information carefully, then register on this website and pay the €7.50 registration fee.
Alternatively, you can register directly with the Federal Office of Justice, free of charge.
Our team of legal and IT experts will take action for you and thousands of others.
You will receive your compensation once we reach a positive decision in court.
What is the claim about?

Personal data from children and third parties
Meta claims that it only uses public content from adult users to train its
AI. However, Meta's own Privacy Center states that it also processes personal data of children and
teenagers, as well as data from people who do not even have a Facebook or Instagram account. This can
happen when someone appears in a photo, is mentioned in a post or caption, or is discussed in public
comments, even if that person never consented and never signed up. Meta further confirms that it does not
distinguish between sensitive and non-sensitive data. As a result, highly personal information, such as
contact details or details about sensitive topics, may also be used for AI training. In urgent proceedings
brought by the Consumer Advice Center of North Rhine-Westphalia, the Higher Regional Court of Cologne
(Oberlandesgericht Köln) confirmed that Meta’s announced AI training may include children’s data,
sensitive data, and data from unregistered third parties.

Lack of transparency
Between 14 April and 27 May 2025, and even after AI training began, Meta did
not adequately inform consumers about how, why, and to what extent their personal data was being used for
AI training. Meta failed to explain which types of data were processed, which specific AI models or
applications were trained, where the processing took place, which companies or countries were involved, or
what happened to the data of people in Germany. Consumers were also not clearly informed about their data
protection rights, such as the right to object, access, correct, or delete their data. As a result, most
people were unaware of the data use: a Gallup Institute study published in June 2025 found that nearly
three quarters of respondents had never heard of Meta’s AI training plans, and only a small minority of
active Facebook and Instagram users recalled the notification or agreed to their data being used.

Difficult to object
Meta only allows registered users to object to the use of their data for AI
training, and even then, the process is confusing and limited to individual accounts or profiles.
Users with suspended or blocked accounts are completely excluded, and there is no way to object to the use
of data published on other users’ profiles or on institutional accounts. The Higher Regional Court of
Cologne confirmed that objections cannot protect data contained in accounts of schools, companies, fan
pages, or other profiles. Even for users who can submit an objection, only a fraction of their personal
data can be targeted, while data already incorporated into AI models cannot be removed. Once personal data
has been used to train an AI model, it is technically impossible to delete it. This means that Meta’s AI
models can continue to contain and reproduce personal data indefinitely, making the rights to object and
to erasure largely ineffective.
Why join this claim?
1. Meta misused personal data – Data from German consumers, including minors and third parties, was used for AI training without proper consent or transparency.
2. Regain control over your data – Stop Meta from using sensitive information in AI models and assert your rights over personal data.
3. Claim your compensation – Consumers are entitled to financial compensation for the unlawful use of their personal data.
4. Join the movement – By participating in this claim, you help hold BigTech accountable and defend data privacy for everyone.

EU Recognition
SOMI has been designated as a qualified entity for bringing cross-border representative actions on behalf of consumers in the European Union. This means that SOMI can represent victims across EU member states in our collective actions.
You can see the list of all entities on:
About SOMI
Foundation for Market Information Research (SOMI) is a non-profit organisation established to identify and influence issues of social importance. SOMI is a recognised claim foundation in the field of privacy and data autonomy and is committed, among other things, to protecting the fundamental rights of consumers and minors who use online services. With the app that SOMI has developed, we want to return ownership and control of personal data to all people:
All your data. All yours.
SOMI investigates abuses, informs the public and helps injured parties. SOMI does this by conducting collective proceedings and claiming compensation. For more information about SOMI, visit www.somi.nl.
