
AI-powered fraud is exploding, FBI says

Fraudsters are using generative AI tools to flirt their way to a payday, according to the FBI. | Source: Getty Images

The same artificial intelligence tools that can write goofy limericks or serve as a pseudo-therapist are helping fraudsters pump up their profits. 

For years, scammers have used flirty or urgent-sounding messages to dupe victims into sending them money, but now their tactics are supercharged by generative AI tools that can make missives more convincing or even impersonate a loved one’s speaking voice, according to the FBI.

Investment scams — often described as “pig butchering,” because they involve engaging extensively and “fattening up” victims before making off with their money — are surging this year, and the FBI pegs GenAI tools as part of the problem. 

The FBI’s Internet Crime Complaint Center received 38,000 reports about investment scams from January to October, amounting to $4.7 billion in losses. That’s up from 30,000 reports and $3.6 billion in losses in the same period last year, increases of roughly 27% and 31%, respectively.

“I strongly believe that [GenAI] plays a role in the relatively dramatic increase in the last year,” said Supervisory Special Agent Scott Hellman in the FBI’s San Francisco division. “GenAI tools became very quickly available to just about everyone, and they’ve proliferated — we certainly see cybercriminals using them. And what are they very, very good for? Creating language and voice in the tone that the creator wants.” 


A bad actor can use these tools to craft messages that lure victims, making them believe they are being contacted by a potential love interest, a government agency, a savvy investor, or a family member. 

While cybercriminals can use mainstream AI tools like OpenAI’s ChatGPT or Anthropic’s Claude, services have sprung up specifically to help scammers, such as WormGPT and the blatantly named FraudGPT. These tools are on the FBI’s radar, according to Hellman.

“You can’t simply go after the people behind the keyboard; we have to go after the services that service the whole ecosystem,” he said. “But we’ve got to collect evidence, and it takes a tremendous amount of time and effort.”

The scams often target elderly people and those with limited English proficiency. For example, a San Francisco-based Japanese American man lost almost $400,000 after someone goaded him into investing in a fake cryptocurrency, while a Santa Clara man gave $260,000 to a con artist with whom he struck up a purportedly romantic relationship via text message.

These scams often start on social media or dating apps, or with seemingly innocuous “wrong number” messages. Once victims start engaging, the scammer leads them on and eventually asks for money or recommends an investment.

“Whether the tone is romantic, trust, fear, enticement — all of those types of things, GenAI can excel in,” Hellman said. 

Victims are often prompted to relinquish their savings via cryptocurrency. In the territory of the FBI’s San Francisco field office, which includes the Bay Area and several other counties, crypto-related scams last year cost 1,226 victims more than $260 million.

Fear-based approaches are among criminals’ most popular tactics: making victims think they’re being investigated by the IRS, for example, or using a voice-cloning service to mimic a loved one asking for help, Hellman said.

That’s why he advises people to slow down and think logically when they get a stressful message, instead of letting their emotions snap them into making a quick decision. 

“If you open an email or receive a phone call or text message, and your immediate reaction is to feel fear or anxiety, that could be a flag for you that something else is going on, that maybe someone is trying to victimize you,” Hellman said. “Be able to recognize when your body feels a certain way and be in touch with that.”