Introducing the Preventing Deepfakes of Intimate Images Act
Representative Joe Morelle has taken action to combat the spread of deepfake pornography by introducing the Preventing Deepfakes of Intimate Images Act (H.R. 3106). This bipartisan legislation addresses the growing problem of pornographic deepfakes generated with artificial intelligence, which overwhelmingly target women and girls.
The Alarming Trend of Non-Consensual Deepfakes
The Act responds to an alarming trend: an estimated 96 percent of all deepfakes are pornographic, and they primarily target women. Although the images are fake, their consequences are real. Morelle and the bill's supporters stress the urgent need for federal action to protect victims and give them legal recourse against this form of exploitation.
Criminalizing Non-Consensual Deepfakes
A key provision of the bill criminalizes the creation and distribution of non-consensual intimate deepfakes intended to harass, harm, or alarm the victim. Proposed penalties include fines and imprisonment, with harsher punishment for disclosures that could affect government functions or facilitate violence.
Granting Rights to Victims
The legislation also grants victims the right to file civil lawsuits, anonymously if they choose, against creators and distributors of non-consensual deepfakes. This civil remedy complements the criminal penalties, allowing victims to seek monetary and punitive damages from perpetrators.
Ethical Use of AI and Technological Advancements
Representative Morelle's move highlights the ethical questions surrounding AI and the need for legal frameworks to keep pace with technological advances. The bill emphasizes preventing harm, particularly to vulnerable groups such as women and minors, and reflects growing awareness that deepfake technology can be abused unless stringent laws deter it.
Hot Take: Combating Deepfake Pornography with Legislation
Representative Joe Morelle has introduced the Preventing Deepfakes of Intimate Images Act, bipartisan legislation that criminalizes the creation and distribution of non-consensual deepfake pornography, which primarily affects women and girls. The bill is an important step in confronting the spread of AI-generated deepfake pornography. By focusing on the non-consensual nature of these images, it aims to protect victims and provide legal recourse against exploitation, including the right to sue anonymously in civil court. The legislation reflects growing concern about potential AI abuses and the need for stringent laws to prevent harm.