
(Society) “Mom, save me”… “Deepfake” video showed daughter being kidnapped during trip to Jeju Island.

In a recent voice phishing crime, a Chinese criminal organization used artificial intelligence (AI) deepfake technology to create a fake video of a child being held captive and demanded money from the parents.

Police warned that similar incidents could occur in Korea in the near future.

Late last month, the Jeju West Police Station received a request from Chinese police to locate a Chinese woman in her 20s who was traveling in Korea.

The Chinese police had received a report from the woman A’s parents that their daughter appeared to have been kidnapped.

At about 10 am that day, the Korean police dispatched some 10 detectives, and at around 9 pm they found A, who was out driving, safe.

According to investigators, the organization that threatened A’s parents with the supposed kidnapping was a Chinese voice phishing group. Using A’s photo and voice, they produced a fake kidnapping video with AI deepfake technology and sent it to A’s parents.

In the video, A was crying and screaming “Mom” and “Dad,” begging for her life, her body and hands tied with rope.

A police spokesman said: “This is an elaborate video that uses artificial intelligence technology to deceive parents into believing their child is in an emergency.”

“I immediately asked an expert to analyze whether the video was fake, and was told there was a high probability it was a deepfake,” he said. “Although this incident did not take place in Korea, it shows that deepfake technology can readily be exploited for violent crimes.”

Reporter Park Sung-young from the digital news team

(Copyright (c) YTN. Unauthorized reproduction, redistribution, and use as AI training data are prohibited)