Kids are making deepfakes of each other, and laws aren’t keeping up
A report highlights a troubling trend: children are using AI tools to create deepfake images and videos of their peers, often sexual in nature. While all 50 U.S. states and Washington, D.C. have laws against the nonconsensual sharing of real intimate images, many existing statutes don't cover AI-generated deepfakes, particularly when minors are the perpetrators or the victims. Legal systems struggle because deepfakes are fabricated rather than authentic photographs, and current laws weren't written with this technology in mind. As a result, schools and legislators are scrambling to adapt policies to address this emerging form of peer-to-peer image-based sexual abuse.