The FBI is warning that threat actors are impersonating senior US officials in phishing attacks designed to compromise users’ accounts.
Notably, the attackers are using AI-generated audio to convincingly spoof the voices of real people.
“The malicious actors have sent text messages and AI-generated voice messages — techniques known as smishing and vishing, respectively — that claim to come from a senior US official in an effort to establish rapport before gaining access to personal accounts,” the FBI says.
“One way the actors gain such access is by sending targeted individuals a malicious link under the guise of transitioning to a separate messaging platform. Access to personal or official accounts operated by US officials could be used to target other government officials, or their associates and contacts, by using trusted contact information they obtain.
“Contact information acquired through social engineering schemes could also be used to impersonate contacts to elicit information or funds.”
If you’re unsure whether a message is legitimate, the FBI recommends contacting the impersonated agency or individual through a separate channel, rather than responding to an unsolicited message. Additionally, the Bureau offers the following advice to help users identify AI-assisted social engineering attacks:
- “Carefully examine the email address; messaging contact information, including phone numbers; URLs; and spelling used in any correspondence or communications. Scammers often use slight differences to deceive you and gain your trust. For instance, actors can incorporate publicly available photographs in text messages, use minor alterations in names and contact information, or use AI-generated voices to masquerade as a known contact
- Look for subtle imperfections in images and videos, such as distorted hands or feet, unrealistic facial features, indistinct or irregular faces, unrealistic accessories such as glasses or jewelry, inaccurate shadows, watermarks, voice call lag time, voice matching, and unnatural movements
- Listen closely to the tone and word choice to distinguish between a legitimate phone call or voice message from a known contact and AI-generated voice cloning, as they can sound nearly identical
- AI-generated content has advanced to the point that it is often difficult to identify. When in doubt about the authenticity of someone wishing to communicate with you, contact your relevant security officials or the FBI for help”
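The "minor alterations in names and contact information" the FBI describes can be surprisingly hard to spot by eye. A simple way to catch them programmatically is to compare a sender's domain against a list of known-good domains and flag near misses. The sketch below is illustrative only and not part of the FBI's guidance; the domain names and the distance threshold are hypothetical assumptions.

```python
# Illustrative sketch (not from the FBI advisory): flag contact details that
# differ only slightly from a known-good value -- the kind of "minor
# alteration" lookalike scammers rely on. Domains and the threshold of 2
# are hypothetical choices for demonstration.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein edit distance between two strings (dynamic programming)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def looks_like_spoof(sender_domain: str, trusted_domains: list[str]) -> bool:
    """Flag domains that are close to, but not exactly, a trusted domain."""
    for trusted in trusted_domains:
        d = edit_distance(sender_domain.lower(), trusted.lower())
        if 0 < d <= 2:  # near miss: likely a lookalike, not the real thing
            return True
    return False

trusted = ["fbi.gov", "state.gov"]
print(looks_like_spoof("fbl.gov", trusted))  # True  ("l" swapped for "i")
print(looks_like_spoof("fbi.gov", trusted))  # False (exact match)
```

A real deployment would also need to handle Unicode homoglyphs (e.g. Cyrillic characters that render identically to Latin ones), which plain edit distance does not catch.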
KnowBe4 empowers your workforce to make smarter security decisions every day. Over 70,000 organizations worldwide trust the KnowBe4 platform to strengthen their security culture and reduce human risk.
The FBI has the story.