Your grandfather receives a call late at night from a person pretending to be you. The caller says that you are in jail or have been kidnapped and that they need money urgently to get you out of trouble. Perhaps they then bring on a fake police officer or kidnapper to heighten the tension. The money, of course, should be wired right away to an unfamiliar account at an unfamiliar bank.
It’s a classic and common scam, and like many scams it relies on a scary, urgent scenario to override the victim’s common sense and make them more likely to send money. Now, scammers are reportedly experimenting with a way to further heighten that panic by playing a simulated recording of “your” voice. Fortunately, there’s an easy and old-school trick you can use to preempt the scammers: creating a shared verbal password with your family.
The ability to create audio deepfakes of people’s voices using machine learning and just minutes of recorded speech has become relatively cheap and widely available. Myriad websites will let you make voice clones: some offer a variety of celebrity voices that can be made to say anything you want, while others let you upload a recording of anyone’s voice and clone it. Scammers have figured out that they can use this technology to clone the voices of regular people. Suddenly your relative isn’t hearing someone who sounds like a complete stranger; they are hearing your own voice. This makes the scam much more convincing.
Voice generation scams aren’t widespread yet, but they do seem to be happening. There have been news stories and even congressional testimony from people who have been the targets of voice impersonation scams, and voice cloning is also being used in political disinformation campaigns. It’s impossible for us to know what kind of technology these scammers used, or whether the calls were just really good impersonations, but the scams will likely grow more prevalent as the technology gets cheaper and more ubiquitous. For now, much of the coverage seems driven by the novelty of these scams and their use of machine learning and deepfakes, technologies that are raising concerns across many sectors of society.
The family password is a decades-old, low-tech solution to this modern, high-tech problem.
The first step is to agree with your family on a password you can all remember and use. The most important thing is that it should be easy to recall in a panic, hard to forget, and not public information. You could use the name of a well-known person or object in your family, an inside joke, a family meme, or any word you can all remember easily. Despite the name, this doesn’t need to be limited to your family: it could be a chosen family, workplace, anarchist witch coven, etc. Any group of people you associate with can benefit from having a password.
Then, when someone calls, emails, or texts you (or someone who trusts you) with an urgent request for money (or iTunes gift cards), you simply ask them for the password. If they can’t tell it to you, they might be a fake. You could of course verify further with other questions, like “What is my cat’s name?” or “When was the last time we saw each other?” These sorts of questions work even if you haven’t previously set up a passphrase with your family or friend group. But keep in mind that people tend to forget basic things when they have experienced trauma or are in a panic. It might be helpful, especially for people with less robust memories, to write down the password in case you forget it. After all, it’s not likely that the scammer will break into your house to find the family password.
These techniques can be useful against other scams that haven’t been invented yet but may come around as deepfakes become more prevalent, such as machine-generated video or photo avatars offered as “proof.” They could even come in handy should you ever find yourself in a hackneyed sci-fi situation where there are two identical copies of your friend and you aren’t sure which one is the evil clone and which one is the original.
The added benefit of this technique is that it gives you a moment to step back, breathe, and engage in some critical thinking. Many scams of this nature rely on panic and keeping you in your lower brain; by asking for the passphrase, you also buy yourself a minute to think. Is your kid really in Mexico right now? Can you call them back at their own phone number to be sure it’s them?
So, go make a family password and a friend password to keep your family and friends from getting scammed by AI impostors (or evil clones).