Stealing Quan Hongchan's voice to sell eggs online? That celebrity livestream shopping session you watched may have used a stolen AI-cloned voice...
In the overwhelming sea of short videos online, do you often come across clips with "AI voiceovers"? Many short videos now use AI voiceovers directly; you have surely heard classic ones such as the "Wukong Siro" voice.
Today, voice-cloning technology has been built into all kinds of short-video and AI platforms as a feature anyone can use with one click: record a short clip of your own voice, and the AI can "copy" your voice and tone directly.
However, this also means that your "voice" is very likely to be stolen and exploited, leading to unimaginable consequences.
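To see how low the barrier has become, here is a minimal sketch of voice cloning with the open-source Coqui TTS library and its XTTS v2 model. The file names are placeholders, and commercial platforms run their own proprietary pipelines rather than this exact code; this is only meant to show how little input the technique needs.

```python
# A minimal voice-cloning sketch using the open-source Coqui TTS library.
# Assumptions: the model name and file paths below are illustrative;
# this is not the pipeline any specific platform actually uses.
from TTS.api import TTS

# Load a multilingual voice-cloning model (downloaded on first run).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few seconds of reference audio is all the model needs to imitate a voice.
tts.tts_to_file(
    text="Hello, this sentence is spoken in a cloned voice.",
    speaker_wav="my_voice_sample.wav",   # short recording of the target voice
    language="en",
    file_path="cloned_output.wav",
)
```

Note that the entire "cloning" step is the single reference clip passed as speaker_wav, which is exactly why one stolen recording can be enough.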
Have you ever come across a short video of Olympic champion "Quan Hongchan" selling eggs? In it, the so-called Olympic champion, wearing a mask, tells the camera: "I'm your Chanbao. I need your help today. Chanmei just wants to help her mother and make the family's life better. At the same time, I'd like to share the local eggs from my hometown with my fans..." Netizens who heard "Quan Hongchan's" voice bought the products to show their support; purchases reached as many as 47,000 orders.
An online account used "Quan Hongchan's" voice to sell eggs; her family said the voices were all AI-generated. | CCTV News
In addition to Quan Hongchan, Olympic champions Sun Yingsha, Wang Chuqin, and others also appeared to be helping short-video merchants promote eggs. Recent reports, however, show that the voices in these videos are all fakes, cloned by AI from stolen recordings; none of the champions participated in any endorsement or gave any authorization. Judging from the egg sales, some 47,000 buyers unknowingly fell for this AI scam.
Have you ever seen AI-cloned celebrities promoting products in short videos? | CCTV News
Last year, Lei Jun also found himself imitated and parodied by AI. A flood of "parody Lei Jun AI voiceover" videos appeared online, using his image and voice to spread false statements, harsh comments, and even extreme content. Lei Jun himself had to step forward and ask people to stop using the Lei Jun voice pack, saying, "I hope everyone will stop."
In the United States in 2024, Joe Biden's voice was cloned and used in robocalls to voters' homes, telling recipients not to vote in that Tuesday's presidential primary.
Beyond fake celebrity endorsements and speeches, it is also dangerous for an ordinary person's voice to be stolen: scammers may use an AI clone of your voice to trick your relatives or friends into wiring money or disclosing sensitive information.
A survey of 7,000 people by the cybersecurity company McAfee found that 70% of respondents could not tell a cloned voice from a real one, and 7.7% said they had lost money to AI voice-cloning scams. Of those victims, 36% lost between $500 and $3,000, and 7% lost between $5,000 and $15,000.
The survey also revealed details of the scams: 45% were voice messages pretending to be from friends or family. The requests purportedly came from partners or spouses (40%), mothers (24%), or children (20%), and the messages described car accidents (48%), robberies (47%), lost phones or wallets (43%), or mishaps while traveling abroad (41%).
Common scenarios used by scammers, such as car accidents and robberies, occur in every country. | Reference 1
This type of scam is a form of "spear phishing": targeted fraud aimed at a specific victim. Many online scams now work this way; they appear tailor-made for the target and therefore look highly credible. AI has made such scams both more common and harder to detect.
Therefore, it is crucial to contact the relative or friend in question directly and verify the facts in time. A bank information-security officer also advises agreeing on a "security phrase" with your family, a private signal known only to you, which can be used to verify identities over the phone (a simple sketch of this idea follows the image below).
Elderly people unfamiliar with new technologies are more likely to be scammed, so it is important to remind the elderly members of your family and agree on a way to verify identities with them. | Screenshot from CBC
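To make the "security phrase" advice concrete, here is a toy sketch that treats the phrase as a shared secret. The function and variable names are hypothetical, and in real life the check happens in conversation rather than in code; the point is simply that the phrase must be agreed offline and never revealed to an unverified caller.

```python
# Toy sketch: a family "security phrase" treated as a shared secret.
# All names here (FAMILY_PHRASE_HASH, check_security_phrase) are hypothetical.
import hashlib
import hmac

# Agree on the phrase in person and store only a hash of it.
FAMILY_PHRASE_HASH = hashlib.sha256(b"blue teapot on the balcony").hexdigest()

def check_security_phrase(spoken_phrase: str) -> bool:
    """Return True only if the caller knows the agreed phrase."""
    candidate = hashlib.sha256(spoken_phrase.strip().lower().encode()).hexdigest()
    # compare_digest performs a constant-time comparison of secrets.
    return hmac.compare_digest(candidate, FAMILY_PHRASE_HASH)

print(check_security_phrase("Blue teapot on the balcony"))  # True
print(check_security_phrase("please wire me money now"))    # False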
What's even more alarming is that stealing a "voice" is far easier than you might think. Beyond the many techniques and tools circulating online, McAfee researchers found that a 3-second audio clip is enough to generate a cloned voice that matches the original by up to 85%; with a little more training data, the match can reach 95%.
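What such a "match percentage" might mean can be illustrated with speaker embeddings. Below is a rough sketch using the open-source resemblyzer package, where the cosine similarity between two voice embeddings serves as the similarity score; the file names are placeholders, and McAfee's own metric may well differ.

```python
# Rough sketch: quantify how closely a cloned voice matches the original
# via cosine similarity of speaker embeddings (resemblyzer package).
# File names are placeholders; this is not McAfee's actual metric.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()

# Turn each recording into a fixed-size, L2-normalized speaker embedding.
real_embed = encoder.embed_utterance(preprocess_wav("real_voice.wav"))
clone_embed = encoder.embed_utterance(preprocess_wav("cloned_voice.wav"))

# For normalized vectors the dot product equals cosine similarity;
# values near 1.0 mean the model can barely tell the voices apart.
similarity = float(np.dot(real_embed, clone_embed))
print(f"speaker similarity: {similarity:.2%}")
```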
Some AI audio-generation software can even clone the various breathing sounds in human speech, such as heavy sighs and gasps, and can even change the ambient sound around the speaker...
So, from which channels can scammers steal your voice?
(1) Recordings of your voice from videos you have posted on social media, or from phone calls.
(2) User agreements: when using some software, you may have "authorized" the use of your voice in a user agreement you never took the time to read.
Sharing one's voice online has become very common. With weak supervision and protection on platforms and in software, our voices can easily be stolen. | Reference 1
(3) "Money-making" scams: some online voice auditions for "part-time audiobook dubbing" jobs may actually be a means of stealing personal information and voice samples.
This kind of "part-time dubbing" job carries risks of order-brushing scams, fraud, and personal-information theft, so stay vigilant. | Image from the Internet
Finally, China has been introducing regulatory measures to rein in the chaos of AI voice cloning and help victims defend their rights.
In 2020, the Civil Code placed our voices under legal protection for the first time: imitating someone else's voice without authorization, like using their portrait without permission, may constitute infringement. In 2024, the Beijing Internet Court ordered two companies to pay a voice actor 250,000 yuan in compensation for cloning the actor's voice without permission, China's first judgment protecting voice rights against AI cloning.
The "Measures for Marking Artificially - Generated and Synthesized Content" implemented on September 1, 2025, clearly states that service providers should add explicit markings to generated and synthesized content such as text, audio, pictures, videos, and virtual scenes.
References
[1] https://www.mcafee.com/ai/news/ai-voice-scam/
This article is from the WeChat public account "Bring Science Home" (ID: steamforkids), author: Skin. It is published by 36Kr with authorization.