Protecting Against Deepfake Spoof Attempts and Recorded Playback Attacks

Angelo Gajo | August 20, 2020 | 2 minutes



Cybercriminals may attempt to defraud organisations that have implemented voice biometric security systems. These fraudsters may utilise voice deepfake technologies or recorded playback attacks to break into an organisation's security system. Fortunately, with Auraya's ArmorVox voice biometric engine, organisations can protect their security systems and, most importantly, their customers from such attempts.

