RESOURCES

White Papers

Protecting Against Deepfake Spoof Attempts and Recorded Playback Attacks

Angelo Gajo | August 20, 2020 | 2 minutes


Summary

Cybercriminals may attempt to defraud security systems that use voice biometrics. These fraudsters may utilise voice deepfake technologies or recorded playback attacks to break into an organisation’s security system. Fortunately, with Auraya’s ArmorVox voice biometric engine, organisations can protect their security systems and, most importantly, their customers from such attempts.

Fill in the form below to download your copy of the file.

