Police in the United Arab Emirates claim fraudsters stole USD 35 million by cloning the voice of a company director and then calling the company’s banker to falsely authorize a series of payments to foreign accounts. The novel deception occurred in early 2020 but was only recently made public through US court documents discovered by a journalist working for Forbes. The court filing is a request for the US authorities to gather relevant evidence, and it provides the following description of the crime.
According to Emirati authorities, on January 15, 2020, the Victim Company’s branch manager received a phone call that claimed to be from the company headquarters. The caller sounded like the Director of the company, so the branch manager believed the call was legitimate. The branch manager also received several emails that he believed were from the Director that were related to the phone call. The caller told the branch manager by phone and email that the Victim Company was about to acquire another company, and that a lawyer named Martin Zelner (Zelner) had been authorized to coordinate procedures for the acquisition. The branch manager then received several emails from Zelner regarding the acquisition, including a letter of authorization from the Director to Zelner. Because of these communications, when Zelner asked the branch manager to transfer USD 35 million to several accounts as part of the acquisition, the branch manager followed his instructions. The Emirati investigation revealed that the defendants had used “deep voice” technology to simulate the voice of the Director. In January 2020, funds were transferred from the Victim Company to several bank accounts in other countries in a complex scheme involving at least 17 known and unknown defendants. Emirati authorities traced the movement of the money through numerous accounts and identified two transactions to the United States. On January 22, 2020, two transfers of USD 199,987.75 and USD 215,985.75 were sent from two of the defendants to Centennial Bank account numbers, xxxxx7682 and xxxxx7885, respectively, located in the United States.
This scam represents an upgrade to the well-known CEO Fraud, which relies on emails that appear to come from the CEO or another corporate leader. Scam messages like these usually demand immediate action, perhaps whilst intimating there is a tight deadline to close an important business deal. This discourages calm reflection about whether the request seems plausible, and leads underlings to dispense with any checks of the authenticity of the instruction for fear they will be held responsible for delays. Voice calls will only increase the sense of urgency conveyed by fraudsters. There have recently been rapid improvements in the use of artificial intelligence to create ‘deepfake’ clones of real voices. The spread of this technology adds another layer of risk to the use of telephony voice services to authenticate customers and transactions.
Businesses and governments are unlikely to adapt with sufficient haste, but deepfake cloning is another reason to adopt more secure channels of communication where authentication does not rely on checking the CLI (calling line identification), asking for personal data, or matching voice biometrics.