24.10.2024 - Criminals have been scamming the elderly with distressing phone calls for some time. But now they are also using AI to carry out these attacks: they use voice cloning software that takes one or more voice samples of a person known to the victim and creates a fake voice that sounds similar. This allows scammers to pose as trustworthy people and deceive their unsuspecting victims. These scams are becoming increasingly difficult to detect.
In earlier versions of this type of scam, the victim would receive a fake call from the police, prosecutor's office or ambulance service. To tailor the scam to the target and increase the chances of success, cybercriminals are now using AI-generated voices of their targets' family members and claiming that there's been an emergency. They get the voice samples from social media or other public sources, and their victims' contact information from the phone book.
A well-made audio deepfake can be difficult to detect, especially on the phone. Look out for the following:
- The person's voice sounds metallic and monotonous
- The content of the conversation seems strange
- The person uses words they don't normally use
In the video below, Brian Ceccato, technical analyst at the National Cyber Security Centre, shows how numbers can be spoofed, voices and even Swiss dialects can be imitated, and live videos can be faked.
In collaboration with Ralph Landolt, seniorweb.ch contributor, the NCSC has produced a video on the subject specifically for older people.
Tips:
- Do not trust every caller.
- If you are not sure if the caller really is who they say they are, hang up and call them back on a number you know.
- Hang up immediately if a call seems suspicious.
- Do not allow yourself to be intimidated or put under pressure.
- Never share passwords or PINs over the phone.
- Never give unauthorised people access to your computer, even if they seem trustworthy.
Last modification 24.10.2024