M'sia police warn against speaking even for 3 seconds on 'silent' scam calls, audio can be used for AI cloning
The AI-replicated voice may be used in calls to the victim's friends and family.
Gawain Pek
December 12, 2025, 11:10 AM
Police officers in Malaysia have warned their citizens not to speak during scam calls that remain silent.
They explained that scammers are purportedly recording audio during such calls and subsequently using AI to replicate the victim's voice to scam their friends and family.
AI replication
According to New Straits Times (NST), two Malaysian police officers explained in an online video that 'silent' scam calls serve an ulterior purpose. When victims answer such calls and speak, such as to say "hello?", the scammers may be recording the victim's voice.
As little as three to five seconds of audio is all that is needed.
The captured audio may then be used for replication using AI.
"They may then contact a family member, such as a parent, pretending that the victim is in urgent need of money," one of the Malaysian police officers said, according to NST.
When family members hear their relative's voice, they may believe it is genuine, fall for the ruse, and transfer the money.
Let the caller speak first
The officers advised the public to avoid speaking first when answering calls from unknown numbers. They noted that genuine callers will usually be the first to introduce themselves and state their purpose when reaching out.
"Let the caller speak first. If they remain silent or do not identify themselves, end the call immediately," one of the officers said.
Top image via Izzul Islam / Facebook, Canva