Crypto scams are nothing new. What is new is how convincing they’ve become.
Data published by blockchain analytics firm Chainalysis shows that impersonation scams surged in 2025, rising roughly 1,400% from the year before. The jump coincides closely with the wider availability of artificial intelligence tools, which scammers have been quick to adopt.
For a long time, crypto fraud followed a familiar pattern. Poorly written emails. Obvious fake websites. Messages that didn’t quite sound right. That’s no longer the case. Many scams today look and sound legitimate. Some even use cloned voices or realistic profile photos.
In practice, these scams often start small. A message on X. A DM on Telegram. Sometimes it claims to be customer support. Sometimes it looks like it’s coming from someone the victim already knows. From there, the conversation moves to private chat, where urgency takes over.
Chainalysis says a significant share of crypto losses this year came from these impersonation schemes. Victims are often persuaded to send funds directly to wallets controlled by the scammers. Once that happens, the money is usually gone.
One case outlined in the report involved attackers pretending to work for a major cryptocurrency exchange. Victims were told their accounts had been compromised and that their assets needed to be moved immediately for “security reasons.” By the time the scheme was uncovered, nearly $16 million had been lost.
Security researchers say accessibility is a big part of the problem. Tools like voice cloning software, deepfake generators and automated chat systems are no longer difficult to obtain. What once required advanced skills can now be done with minimal effort.
That shift has made scams easier to run and harder to spot. The trend has unsettled parts of the crypto industry, especially as more newcomers enter the space. Education still matters, but many experts say it’s no longer enough on its own. Exchanges and wallet providers are being pushed to improve detection systems, flag suspicious behavior earlier, and rethink how users are warned before risky transactions.
Chainalysis also repeated advice that's become familiar for a reason. Legitimate companies do not ask for private keys or recovery phrases. Messages that arrive out of the blue, even if they appear trustworthy, should always be verified independently.

As artificial intelligence continues to improve, few expect this type of fraud to slow down. If anything, the scams are likely to become harder to recognize. The challenge for crypto will be figuring out how to protect users without giving up the open access that defines the industry.