By Favour Unukaso
Cybercriminals are leveraging Artificial Intelligence (AI) through voice cloning to defraud unsuspecting consumers.
A new report by iiDENTIFii, a remote biometric digital authentication and automated onboarding technology platform, said: “Imagine receiving a call, email or SMS from the authorities urgently requesting payment.
“The details of the request are clear, professional and include personal information unique to you, so there is no reason to doubt it. This scam is fairly common and the majority of consumers are on the lookout for it.
“Now imagine receiving a call from a loved one and hearing their unmistakable voice on the other end of the line saying that they need money or your account information right away. This may sound like a fraud lifted straight out of science fiction, but, with the exponential development of AI tools, it is a growing reality.”
According to the Southern African Fraud Prevention Service (SAFPS), impersonation attacks increased by 264 per cent in the first five months of the year compared to 2021.
Founder and Chief Executive Officer of iiDENTIFii, Gur Geva, said: “The technology required to impersonate an individual has become cheaper, easier to use and more accessible. This means that it is simpler than ever before for a criminal to assume one aspect of a person’s identity.”
The report observed that last week in the United States, the Federal Trade Commission issued an alert urging consumers to be vigilant for calls in which scammers sound exactly like their loved ones. It stressed that all a criminal needs is a short audio clip of a family member’s voice, often scraped from social media, and a voice-cloning program to stage an attack.
It noted that the potential of this technology is vast. Microsoft, for example, recently piloted an AI tool that, with a short sample of a person’s voice, can generate audio in a wide range of languages. While this has not been released for public use, it does illustrate how voice can be manipulated as a medium.
Exposing fault lines in voice biometrics, Geva said: “Historically voice has been seen as an intimate and infallible part of a person’s identity. For that reason, many businesses and financial institutions used it as a part of their identity verification toolbox.”
According to the report, audio recognition technology has been an attractive security solution for financial services companies across the globe, with voice-based banking enabling customers to deliver account instructions via voice command. It stressed that voice biometrics offers real-time authentication, replacing the need for security questions or even PINs.
Barclays, for example, integrated Siri to facilitate mobile banking payments without the need to open or log into the banking app.
“As voice-cloning becomes a viable threat, financial institutions need to be aware of the possibility of widespread fraud in voice-based interfaces. For example, a scammer could clone a consumer’s voice and transact on their behalf,” said Geva.
Geva added that the rise of voice-cloning illustrates the importance of sophisticated and multi-layered biometric authentication processes.