If there was one thing you could usually count on with online scams, it was that misspellings and bad grammar were often a dead giveaway.
Enter AI, and now those tip-offs have disappeared, experts say.
“The quality of fraudulent content has gotten so high that it has become very hard for people to differentiate a real person from a bot,” said Kevin Gosschalk, the founder and chief executive of Arkose Labs, a security consultancy that tracks online scams and technology.
The rapid development of artificial intelligence has allowed online fraudsters not only to fine-tune their scams, but to more effectively unleash them broadly, hoping to find someone who will take the bait.
Gosschalk said that in the old days, scammers had to involve themselves more closely in the early stages of spam, phishing or romance scam campaigns, which was time-consuming and opened them up to being caught. Now, however, they can just let the computer do much of that work.
Experts say AI is being put to use in a multitude of ways in online scams, whether by building more convincing phony profiles on dating sites or by mimicking the voice of a grandchild to trick relatives into thinking their loved one is in trouble and needs money.
While the approach has become more sophisticated, the ultimate goal is the same: to get you to do something, like hand over your money or personal information. And even with the advances made by AI, some signs remain.
If the proposal urges you to act quickly, sounds too good to be true or asks you to provide information or money in some very specific way, such as via gift cards, cryptocurrency or wire transfer, stay away.
With the holidays approaching, here are a few ways criminals have developed these more sophisticated schemes — and some things you can look out for to avoid becoming a target.
Believe it or not, the most common and widespread scams still revolve around spam emails. Simple as they may be, these allow criminals to blast scam campaigns as widely as possible with a minimal amount of effort. If the scammers can just get a few hundred people to respond to the several million emails they send, it is worth it.
In the past, these were easy to spot, because they often read like they had been written by someone whose first language wasn’t English or who had limited command of basic grammar rules. Poor syntax and misspellings abounded. But AI solves all that.
“AI programs are really good at crafting authentic-sounding and well-written messages,” Gosschalk said. “It makes it harder to spot obvious frauds.”
The programs can also easily fine-tune messaging campaigns depending on the time of year, target locations and age groups, reducing the amount of hands-on labor involved and allowing a spam campaign to extend even further.
“Scammers work best at scale, so this allows them to scale up exponentially,” Gosschalk said.
Ultimately, for these scams to work, the target needs to be convinced to do something. That means that any unsolicited request for money or personal information should be treated with extreme caution.
Some of the most devious online scams involve fraudsters targeting those seeking love on dating sites. For these scams to work, however, the scam artists must create convincing fake profiles complete with authentic-sounding answers and a profile photo that isn’t clearly lifted off the internet or taken from a stock-image service.
Programs using AI are able to create far more sophisticated phony profiles and even fabricate realistic-looking profile images out of whole cloth. The bot is also able to communicate with a victim in the early stages of a scam, with the humans behind the fraud only stepping in later in the process.
These scams can take months to unfold, but with or without AI, they will always end with a request for money. If you have found yourself communicating for a long time with someone you’ve never met in person and who asks you to send money, beware.
This age-old scam has become something straight out of a sci-fi movie: An AI program finds a recording of a person online and is able to use it to clone their voice and create a message claiming the person is in distress and needs money fast.
This kind of scheme — often called a grandparent scam, because that’s who it typically targets — has been around for years. It has traditionally involved someone calling the grandparent and claiming to be a lawyer or a law-enforcement official, saying the person’s loved one was in trouble and needed money to pay their way out of it.
Now with AI, the scammer can have a computer impersonate the loved one directly, allowing them to make an even more convincing appeal to the relative for money.
To defend against this, consumer-affairs agencies recommend being patient to see if the scammer trips up by using an incorrect name or unusual term of endearment. You should also call the person back on a phone number you know belongs to them, and check in with other relatives, before handing over any cash.
As the holiday season rolls around, people often look for side gigs to make extra cash. This provides an opportunity for scammers, and they’ll often post job ads that look legit but are really just an attempt to steal your money or personal information.
“They might offer you the job and quickly ask for your personal information like your driver’s license, Social Security, or bank account number to fill out their ‘employment paperwork.’ But if you share it, they might steal your identity,” the Federal Trade Commission warned in a recent public notice.
With this scam, as with the others, AI makes frauds harder to spot because the job posts will now lack those telltale spelling errors and grammatical mistakes, making them appear legitimate and professional.
The FTC says the best way to avoid these scams is to never hand over personal information without researching a company first. It recommends that people Google the company to see if there are reviews on sites like Glassdoor, and contact the business through a phone number or email posted on its website rather than the one in the job ad.
And finally, remember that no legitimate employer will ask you to pay an application or training fee up front.