(SBS Night) Online phishing crimes impersonating celebrities, covered on 'I Want to Know That,' cause damages of about 1 trillion won… How can deepfake phishing crimes be avoided?



Deepfake phishing crime, how can we stop it?

The episode of SBS's 'I Want to Know That' (known in Korean by the abbreviation 'Geu-al') that aired on the 20th highlighted crimes committed using deepfake videos and deep-voice technology.

Comedian Hwang Hyun-hee, who is living a second life as a personal investment expert, lamented, "I'm sorry to greet you over such an unpleasant incident. How can something like this happen?"

He then recounted, "I went to the market with my wife, and an elderly man told me he had been enjoying my advertisements, had made a good investment, and wanted to make money too," revealing that scammers were committing investment fraud by stealing his name and photo and impersonating him.

When the show searched his name on social media, numerous impersonation accounts came up. Clicking the links led to chat rooms offering investment information, where a person impersonating Hwang Hyun-hee pushed victims toward stock investments.

Investment expert John Lee also expressed his distress, saying he had suffered similar damage. He revealed that victims had even sued him, underscoring the seriousness of the harm they endured.

Shockingly, the damage from recent online phishing crimes impersonating celebrities such as Yoo Jae-seok and Song Eun-i was found to amount to approximately 1 trillion won, with hundreds of victims reported.

In addition, an account impersonating Park Soon-hyuk, the author known as "Battery Man," was revealed to have used deepfake videos of celebrities such as Song Hye-kyo and Jo In-sung to encourage investment.

Crimes using deepfake videos hit close to home as well. One informant said she recently received a call claiming her daughter had been kidnapped by a loan shark. The voice and phone number clearly belonged to her daughter, but the supposed loan shark spoke with a Korean-Chinese accent, so she suspected voice phishing and hung up — narrowly avoiding serious harm.

Additionally, an informant who works as a fitness trainer said an impersonation account that stole his photo and name even used deepfake technology to run a romance scam. The victims were not only domestic but also overseas: a Vietnamese woman, unaware that she had been deceived by a deepfake romance scam, even came to see him in person to demand repayment of the money she had lent.

Through experiments, the broadcast confirmed that deep-voice technology can modulate a voice in real time, and that deepfake technology is improving day by day, to the point that deepfake video can now be used even in real-time video calls.

The largest voice phishing organization recently arrested also drew attention when it was revealed to have been preparing deepfake- and deep-voice-based voice phishing right up until the arrest.

An expert warned, "Right now, deepfakes and deep voices still require a person to operate them, but in the future, deepfakes capable of deceiving people even without a human behind them will become possible."

The police added that while recent crimes using deepfake technology leave plenty of clues, it is difficult to identify or arrest the ringleaders because the operations are not based in Korea. "Nowadays, most of these crimes are run out of Cambodia, carried out like a factory operation," an officer said.

The broadcast also pointed out that SNS platforms impose almost no sanctions on impersonation accounts and take no responsibility for the crimes that result, raising the call: "Strengthening self-regulation is being discussed, but a clear legal framework is needed to prevent such crimes."

(SBS Entertainment News Editor Kim Hyo-jeong)
