How ‘voice cloning’ can weaponize the sense of urgency
A public service announcement from the Better Business Bureau
February 14, 2020 — Everyone knows to be on the lookout for phony emails – especially at work. Scammers can easily make messages that appear to come from anywhere – from your boss’s account to the office printer. But what about voicemail? Scammers are now using voice-mimicking software to create convincing fake voicemail messages.
How the Scam Works
You get a voicemail from your boss instructing you to wire thousands of dollars to a vendor for a rush project. The request is out of the blue, but it’s the boss’s orders, so you make the transfer.
A few hours later, you see your boss and confirm that you sent the payment. But there’s one big problem: your manager has no idea what you are talking about! It turns out that the message was a fake. Scammers used new technology to mimic your boss’s voice and create the recording. This “voice cloning” technology has advanced to the point where anyone with the right software can clone a voice from a very small audio sample.
Businesses may be the first places to see this con, but it likely won’t stop there. The technology could also be used for emergency scams, which prey on people’s willingness to send money to a friend or relative in need. Also, with the US now in the midst of the 2020 election season, scammers could use the technology to mimic candidates’ voices and drum up “donations.”
How to Avoid a Business Compromise Scam:
Secure Accounts: Set up multifactor authentication for email logins, and require it for any changes to email settings. Be sure to verify any changes to information about customers, employees, or vendors.
Train Staff: Create a security-minded culture at your office by training employees on internet safety. Make it a policy to confirm all account-change and payment requests through a second channel before making a transfer; don’t rely on email or voicemail alone.
Excerpts from BBB Sources on Business Email Compromise and ‘Audio Deepfakes’
Is That Email Really From Your Boss?
An in-depth investigative study by Better Business Bureau (BBB) finds that business email compromise scams are skyrocketing in frequency and have cost businesses and other organizations more than $3 billion since 2016.
Business email compromise fraud is an email phishing scam that typically targets people who pay bills in businesses, government agencies, and nonprofit organizations. It affects both big and small organizations, and it has resulted in more losses than any other type of fraud in the U.S., according to the Federal Bureau of Investigation (FBI).
The investigative study – “Is That Email Really From ‘The Boss?’ The Explosion of Business Email Compromise (BEC) Scams” – looks at the prevalence of BEC scams and the criminal systems that perpetrate them. It digs into the scope of the problem, who is behind it, the multi-pronged fight to stop it and the steps consumers can take to avoid it.
BEC fraud takes many forms, but in essence, the scammer poses as a reliable source who sends an email from a spoofed or hacked account to an accountant or chief financial officer (CFO), asking them to wire money, buy gift cards or send personal information, often for a plausible reason. If money is sent, it goes into an account controlled by the con artist.
The FBI recognizes at least six types of activity as BEC or email account compromise (EAC) fraud, which differ based on who appears to be the email sender: a chief executive officer (CEO) asking the CFO to wire money to someone; a vendor or supplier requesting a change in invoice payment; executives requesting copies of employee tax information; senior employees seeking to have their pay deposited into a new bank account; an employer or clergyman asking the recipient to buy gift cards on their behalf; or even a realtor or title company redirecting proceeds from a real estate sale into a new account. These targeted email phishing scams are sometimes called “spear phishing.”
FTC: Audio Deepfakes
Rapid progress in voice cloning technology is making it harder to tell real voices from synthetic ones. But while audio deepfakes — which can trick people into giving up sensitive information — are a growing problem, there are some good and legitimate uses for the technology as well, a group of experts told an FTC workshop this week.
“People have been mimicking voices for years, but just in the last few years, the technology has advanced to the point where we can clone voices at scale using a very small audio sample,” said Laura DeMartino, associate director in the FTC’s division of litigation technology and analysis.
At its first public workshop on audio cloning technology, the FTC enlisted experts from academia, government, medicine, and entertainment to highlight the implications of the tech and the potential harms.
FTC spokesperson Juliana Gruenwald Henderson said after the workshop that impostor schemes are the number one type of complaint the agency receives. “We began organizing this workshop after learning that machine learning techniques are rapidly improving the quality of voice clones,” she said in an email.
Deepfakes, both audio and visual, let criminals communicate anonymously, making it much easier to pull off scams, says Mona Sedky of the Department of Justice Computer Crime and Intellectual Property Section. Sedky, who said she was the “voice of doom” on the panel, says communication-focused crime has historically been less appealing to criminals because it’s hard and time-consuming to pull off. “It’s difficult to convincingly pose as someone else,” she says. “But with deepfake audio and anonymizing tools, you can communicate anonymously with people anywhere in the world.”
For more information
In January, the Federal Trade Commission held a workshop on voice cloning. See notes and video from the event on FTC.gov, or check out this report of the highlights. Also, read BBB’s report on Business Email Compromise scams for more tips on avoiding scams at work.
If you’ve been the victim of a scam, please report it at BBB.org/ScamTracker. Your report can help expose scammers’ tactics and prevent others from having a similar experience.