A bipartisan group of House lawmakers is introducing a bill aimed at restricting Section 230 protections for tech companies that fail to remove intimate AI deepfakes from their platforms. Representatives Jake Auchincloss (D-MA) and Ashley Hinson (R-IA) announced the Intimate Privacy Protection Act to address issues like cyberstalking, intimate privacy violations, and digital forgeries.
The bill amends Section 230 of the Communications Act of 1934, which currently shields online platforms from legal liability for user-generated content. Under the new legislation, platforms would forfeit that immunity if they fail to take adequate steps to mitigate the identified harms. The bill establishes a "duty of care" for tech platforms, requiring them to implement a reasonable process for addressing incidents of cyberstalking, intimate privacy violations, and digital forgeries, particularly AI deepfakes. The bill defines these forgeries as digitally altered content that closely resembles authentic recordings of real individuals.
To comply with the duty of care, platforms must develop measures to prevent privacy violations, establish a straightforward reporting mechanism, and ensure the removal of harmful content within 24 hours.
Both Auchincloss and Hinson emphasized that tech companies should not exploit Section 230 as a shield against accountability for failing to protect users. Auchincloss stated, “Congress must prevent these corporations from evading responsibility over the sickening spread of malicious deepfakes.” Hinson added, “Big Tech companies shouldn’t be able to hide behind Section 230 if they aren’t protecting users from deepfakes and other intimate privacy violations.”
Efforts to combat intimate, sexually explicit AI deepfakes have gained momentum among lawmakers nationwide. While much of AI policy remains unsettled, the Senate recently passed the DEFIANCE Act, which would allow victims of nonconsensual AI-generated intimate images to seek civil remedies. Several states have enacted their own laws against intimate AI deepfakes, especially those depicting minors. Some companies, including Microsoft, have also called on Congress to regulate AI-generated deepfakes to prevent fraud and abuse.
There has long been bipartisan interest in narrowing Section 230 protections for platforms deemed to be misusing a legal shield originally designed with smaller companies in mind, but lawmakers have often disagreed on the specifics. A notable exception is FOSTA-SESTA, which stripped Section 230 immunity from platforms that knowingly facilitate sex trafficking. The duty-of-care provision in the Intimate Privacy Protection Act mirrors language in the Kids Online Safety Act, which is expected to pass the Senate with strong support, signaling a growing trend toward establishing new online protections.