
Not much information has been revealed in this case, apart from the fact that U.A.E. authorities believe the elaborate scheme involved at least 17 people and that the money was sent to accounts all over the world to make it difficult to track down and recover. Neither the Dubai Public Prosecution Office nor the American lawyer mentioned by the fraudsters, Martin Zelner, had responded to Forbes' request for comment at the time of this writing.

It's only the second known case of a bank heist involving deepvoice technology, with the first occurring in 2019, but cybercrime experts warn that this is only the beginning. Both deepfake video and deepvoice technologies have evolved at an astonishing pace, and criminals are bound to take advantage of them.

"Manipulating audio, which is easier to orchestrate than making deep fake videos, is only going to increase in volume, and without the education and awareness of this new type of attack vector, along with better authentication methods, more businesses are likely to fall victim to very convincing conversations," cybersecurity expert Jake Moore said. "We are currently on the cusp of malicious actors shifting expertise and resources into using the latest technology to manipulate people who are innocently unaware of the realms of deep fake technology and even their existence."

The thought of a sample of your voice, be it from a YouTube or Facebook video or from a simple phone call, being used to clone your voice for malicious purposes is sort of terrifying…