The Take It Down Act marks one of the most decisive federal responses yet to the weaponization of AI. Passed 409–2 in the House, it makes knowingly publishing nonconsensual intimate imagery, including AI-generated deepfakes, a federal crime, targeting the exploding trade in deepfake pornography that has humiliated students, teachers, journalists, and even children. Platforms will now be legally required to remove flagged content within 48 hours of a victim's request or face enforcement by the Federal Trade Commission. Victims, long told there was “nothing anyone could do,” gain a fast, legally enforceable way to force the images that haunt them off the platforms that host them.
What makes this moment extraordinary is who is standing behind it. Backed by President Trump and embraced across party lines, the law treats privacy and bodily autonomy as something more than talking points. In an age where a face can be stolen with a click, Congress has finally drawn a boundary: your image is not public property, and your dignity is not up for algorithmic theft.