Amid a wave of crimes involving the production and distribution of sexually explicit “deepfake” (artificial intelligence-based image synthesis) videos on the mobile messenger Telegram, foreign media have taken note, reporting that “Korea is facing a deepfake emergency.”
The BBC reported on the 28th (local time) that “a number of chat groups, including ones involving minors, have recently been found in Korea producing and sharing sexually explicit deepfake images,” adding that President Yoon Suk Yeol ordered authorities “to thoroughly investigate, solve, and eradicate digital sex crimes.”
The BBC said, “Korea has a dark history of digital sex crimes,” referring to the Telegram Nth Room incident in 2019, and pointed out that online deepfake sex crimes in Korea are rising rapidly. The BBC noted, “There were 297 cases of deepfake sex crimes reported to the police in the first half of this year, far exceeding last year’s total (180 cases) and the 160 cases in 2021.” In particular, it pointed out that “more than two-thirds of those accused of these crimes over the past three years have been teenagers.”
The BBC cited the gender gap as a backdrop to the rapid increase in hidden camera and deepfake sex crimes. It pointed out that “only 5.8% of executives at Korean listed companies are women, and Korean women are paid on average one-third less than men, the widest gender wage gap among the world’s rich countries.” It added, “Digital sex crimes are exploding amid a widespread culture of sexual harassment in the rapidly growing technology industry.”
The British Guardian likewise said on the same day, “After long efforts to eradicate hidden cameras, Korea is now fighting deepfake imagery.” The Guardian reported that “297 deepfake crimes were reported in the first seven months of this year,” and stressed, “The problem is estimated to be more serious than the official figures suggest.” The Guardian also mentioned Cho Joo-bin, the main culprit of the Nth Room incident, saying, “The investigation into deepfake sex crimes is expected to deal a further blow in Korea to the reputation of Telegram, which was used to run an online sexual exploitation ring.”
The Wall Street Journal (WSJ) also ran an article titled “Anyone Can Be a Victim: The Crisis Caused by Artificial Intelligence (AI) Fake Pornography Hits Korea.” The article pointed out, “Hundreds of thousands of anonymous Telegram users shared fabricated images and videos of Korean women without their consent,” adding, “Since the chat rooms distributing the fake pornography operate in Korean, it suggests that the participants are Korean.”
According to a 2023 report by Security Hero, a global security firm, Korean singers and actresses accounted for about half of the individuals appearing in deepfake pornography posted online. The firm analyzed 100,000 videos distributed across 100 websites. The WSJ reported, “The fact that about half of the world’s deepfake pornography features Koreans reflects the scale of the problem Korea is facing.”
JULIE KIM
US ASIA JOURNAL