This alert may not be shared outside your organization. Do not repost it, place it on other websites or list servers, or send it to others via email, including other associations or parties. Members and law enforcement use only. Contact us for any permissions. Doing otherwise will result in the loss of membership.
12/20/2024
When voice deepfakes come calling
CIO
Intro: Time was, a call center agent could be relatively secure in knowing who was at the other end of the line. And if they weren’t, multi-factor authentication (MFA), answers to security questions, and verbal passwords would solve the issue.
Those days are behind us, as deepfake audio and video are no longer just for spoofing celebrities. Voice deepfakes, in which a real person's voice is cloned from recorded snippets of their speech, are one of the biggest risks facing modern businesses and their call centers.
Deepfake fraud attacks surged 3,000% last year, and unlike email phishing, audio and video deepfakes don't come with red flags like spelling errors or strange links. A recent survey found that 86% of call centers are concerned about the risk of deepfakes, and 66% lack confidence that their organization could identify them.
Alerts
The FRPA alert system distinguishes us from other groups by gathering and providing information to law enforcement, retailers, and financial institutions.
Resources
An electronic library to help all of our partners fight financial fraud.