Maria Miller MP has responded to the Department for Science, Innovation and Technology’s consultation on AI, focusing on how we can make sure rapidly changing technologies are safe for all women and girls.
Maria said, ‘As has been the case since the early days of the internet, and even long before, malicious actors will try to use the technological tools at their disposal to harm women and girls. We’ve seen the rise in intimate image abuse using “real” images but, now that AI is becoming ever more advanced, it is possible – and easy – to create sexual or nude images of any woman and use them against her.
We now have a rare opportunity to put structures in place to protect women and girls, foreseeing as we can the threat of AI-related intimate image abuse. This didn’t happen at the dawn of the internet – but we can, and must, do it in the early days of AI.’
There must be robust structures in place that are flexible enough to deal with AI as it develops. Already we see huge changes in very short timescales.
Deepfakes – manufactured images which convincingly depict real life – are on the rise, and analysis shows that up to 96% of deepfakes on the internet are pornographic. Nudification software, which virtually strips women of their clothing, works only on images of females. These technologies are violating and pose a direct threat to the safety of women.
Maria is clear that we must ensure the following actions are taken:
- Ensuring AI-manipulated or AI-generated images are clearly marked as such.
- Further changing the law on intimate image abuse so that the making, taking, and sharing of intimate images is fully covered, and that deepfake images are included in the relevant definitions.
- Establishing a reliable system for removing harmful deepfaked images, ensuring images are correctly categorised for timely takedown.
- Using existing AI technology to identify victims of intimate image abuse.
- Ensuring fines levied under the Online Safety Bill are used to directly support victim groups.
This consultation response builds on the extensive work Maria has done to tackle intimate image abuse.
In November 2022, she successfully secured amendments to the Online Safety Bill, shortly to be tabled, to ban the non-consensual sharing of intimate images. Maria’s amendments will cover the sharing of deepfakes, but it is important that we also look to ban the making and taking of such images.