“Court rules AI-generated pornographic images not punishable if victim is not identifiable”, 21 August 2025
A South Korean court has ruled that the distribution of AI-generated nude images cannot be punished under current deepfake pornography laws unless the depicted individual is a real, identifiable person. The decision has drawn criticism for exposing a legal loophole amid a rise in AI-generated sexually explicit content.
On 20 August, it was confirmed that Judge Lee Jung-hoon of the Uijeongbu District Court’s Goyang Branch recently acquitted a man in his 30s, surnamed Kim, who had been charged with distributing deepfake pornography under South Korea’s Sexual Violence Punishment Act.
Kim was accused of sharing AI-generated nude images of women in a Telegram chatroom in November last year. Prosecutors argued the images were “materials capable of inciting sexual desire or shame.” However, the defence claimed the individuals in the images were likely fictional characters generated by AI, not real people. The court accepted this argument, citing the lack of evidence proving the existence or identity of the women in the images.
Judge Lee stated that “there is no material confirming the origin or synthesis method of the image, making it difficult to conclude that a real person was depicted.” As the prosecution did not appeal, the acquittal is now final.
The ruling effectively limits the application of South Korea’s so-called “Deepfake Prevention Law” to cases involving real persons capable of expressing intent. Legal experts have criticised this narrow interpretation, arguing that AI pornography is increasingly realistic and harmful regardless of whether a victim is identifiable.
Courts are likely to continue issuing conservative rulings where victimhood is uncertain, in line with the principle of strict interpretation in criminal trials.
Other jurisdictions have already begun expanding the scope of legal protection. The U.S. state of Virginia redefined deepfake victims as “persons who could be perceived as real,” including AI-generated images. California prohibits the intentional distribution of images that could be mistaken for real exposure. The UK also regulates the distribution of synthetic sexual content made for gratification.
Experts in South Korea are calling for legal reform that focuses not solely on the existence of a specific victim, but on the broader social harm. They argue that even when victims are not clearly identifiable, the spread of AI-generated pornography can still cause significant negative consequences and should be punishable under the law.