Protecting Against Deepfake Spoof Attempts and Recorded Playback Attacks

Angelo Gajo | August 20, 2020 | 2 minutes

Summary

Cybercriminals may attempt to defraud systems secured with voice biometrics. These fraudsters may use voice deepfake technologies or recorded playback attacks to break into an organisation’s security system. Fortunately, with Auraya’s ArmorVox voice biometric engine, organisations can protect their security systems and, most importantly, their customers from such attempts.

Click the download button to download your copy of the white paper.
