AI voice scams target US officials at federal, state level to steal data

The FBI is warning about sophisticated smishing and vishing campaigns that target current and former government officials, using AI-generated voices and social engineering to steal sensitive information. The campaigns mark a concerning evolution in government-targeted scams: cybercriminals impersonate senior officials to establish trust, then direct victims to malicious links that compromise personal accounts.

The big picture: Since April, cybercriminals have been targeting U.S. federal and state employees with texts and AI-generated voice messages that impersonate senior officials to establish rapport and ultimately gain access to sensitive information.

  • Once scammers compromise one account, they use the stolen information to target additional government officials or their contacts in a chain-like attack pattern.
  • The compromised information can be leveraged to impersonate legitimate contacts and extract further information or funds from unsuspecting victims.

Key warning signs: The FBI advises vigilance for several telltale indicators of these sophisticated impersonation attempts.

  • Requests to switch to different messaging platforms should be treated with immediate suspicion.
  • AI-generated media often contains noticeable imperfections, including distorted hands or other extremities, unrealistic or irregular facial features, inaccurate shadows, and unnatural movements.
  • Voice calls may exhibit lag time, voice matching issues, or unnatural speech patterns that differ subtly from the person being impersonated.

Recommended protections: The FBI outlines several defensive measures to avoid falling victim to these scams.

  • Never share sensitive information with new online or phone contacts without independent verification of their identity.
  • Avoid clicking links in unsolicited messages and refuse requests for money transfers via any method.
  • Implement two-factor authentication for all accounts, never share 2FA codes, and establish secret verification phrases with family members.

The simplest solution: The FBI notes that the most effective protection is simply to ignore calls and texts from unknown numbers, and to report any suspected scam attempts to your local FBI field office or the Internet Crime Complaint Center (IC3).

