The statement addresses serious concerns about AI systems that generate realistic images and videos depicting identifiable individuals without their knowledge or consent. Whilst AI has the potential to bring numerous benefits for individuals and society, recent developments – particularly AI image and video generation integrated into widely accessible social media platforms – have enabled the creation of non-consensual intimate imagery, defamatory depictions, and other harmful content featuring real individuals. The co-signatories are especially concerned about potential harms to children and other vulnerable groups, such as cyber-bullying and exploitation.
Expectations for organisations
The co-signatories remind organisations developing and using AI content generation systems that these systems must be developed and used in compliance with applicable legal frameworks, including data protection and privacy rules.
Although specific legal requirements vary by jurisdiction, fundamental principles should guide all organisations developing and using AI content generation systems. These principles include:
implementing robust safeguards,
ensuring meaningful transparency,
providing effective and accessible mechanisms to protect individuals, and
addressing specific risks to children.
Joining forces to address a global risk
The harms arising from the non-consensual generation of intimate, defamatory, or otherwise harmful content depicting real individuals are significant and warrant urgent regulatory attention. The co-signatories are committed to addressing this global risk and will join forces to do so. To that end, they aim to share information on their respective approaches to addressing these concerns.
Finally, the co-signatories call on organisations to engage proactively with regulators, implement robust safeguards from the outset, and ensure that technological advancements do not come at the expense of privacy, dignity, safety, and other fundamental rights – particularly for the most vulnerable members of our global society.