Explicit AI image creation increasingly a legal issue amid crackdown on deepfakes
Published on March 30, 2026.
The recent probation sentences handed to two Pennsylvania teenagers who created fake nude photos of their classmates underscore a growing legal issue: the use of artificial intelligence (AI) to weaponize someone's image. The TAKE IT DOWN Act, passed in 2025, criminalizes the nonconsensual publication of intimate images, including "digital forgeries" (deepfakes), and requires certain websites and online or mobile applications to implement a "notice-and-removal" process: covered platforms must give consumers a way to report a nonconsensual intimate visual depiction and must remove it within 48 hours of receiving notice. The Federal Trade Commission has enforcement authority over platforms that fail to comply. The federal law was co-sponsored by Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., and passed unanimously in the Senate and by a vote of 409-2 in the House.