Rising Threat of AI-Driven Crypto Scams Predicted for 2025
In 2025, crypto scams are poised to escalate, fuelled by AI technologies and scam-as-a-service operations. Chainalysis reports that generative AI makes fraud cheaper, scalable, and harder to detect. Revenue from scams surged to $9.9 billion in 2024 and could surpass $12 billion in 2025. Scammers increasingly utilise AI tools to create realistic identities and content, raising concerns for financial institutions and the economy at large.
In 2025, crypto scams are projected to reach unprecedented levels, driven largely by advances in AI and the rise of scam-as-a-service operations that make it easier for fraudsters to run deceptive schemes. Chainalysis, a blockchain analytics firm, highlights these developments in its Crypto Scam Revenue 2024 report, noting that generative AI has made scams cheaper, more scalable, and increasingly difficult to detect.
Elad Fouks, Chainalysis’ head of fraud products, emphasises that AI is amplifying the threat to financial institutions by producing highly realistic fraud that preys on human vulnerabilities. AI tools let scammers generate synthetic content and fake identities, bypassing identity verification systems and impersonating genuine users.
Crypto scam revenue surged to approximately $9.9 billion in 2024, with “pig butchering” scams the most prevalent. Chainalysis projects that scam revenue could exceed $12 billion in 2025 as more fraudulent addresses are identified and attributed to scams. A contributing factor is an extensive support network; Huione Guarantee, a peer-to-peer marketplace, for example, offers both legitimate and fraudulent services that benefit scam networks.
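To illustrate why the figure is expected to climb, here is a minimal sketch assuming a simplified model in which the reported total behaves as a lower bound that is revised upward as additional scam addresses are attributed; the newly attributed inflow amounts are hypothetical, not Chainalysis data.

```python
# Illustrative sketch only: the reported scam-revenue figure behaves like a lower
# bound that is revised upward as more fraudulent addresses are attributed.
# The newly attributed inflow amounts below are hypothetical, not Chainalysis data.
initial_estimate_bn = 9.9                      # reported 2024 scam revenue, in $ billions

newly_attributed_inflows_bn = [0.6, 0.4, 0.3]  # hypothetical later attributions, in $ billions

revised_estimate_bn = initial_estimate_bn + sum(newly_attributed_inflows_bn)
print(f"Initial lower-bound estimate: ${initial_estimate_bn:.1f}B")
print(f"Revised estimate after new attributions: ${revised_estimate_bn:.1f}B")
```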
From 2021 to 2024, revenue from scam-related services tied to Huione grew exponentially. Notably, vendors of AI services recorded a staggering 1,900% increase in revenue, roughly a twentyfold rise, showcasing a boom in the availability of AI-powered tools for scammers.
The rise of AI-driven scams is marked by their striking realism. Counterfeit websites, investment platforms, and social media profiles can now be produced quickly and cheaply, making it harder for victims to distinguish genuine entities from fraudulent ones. Fouks points out that generative AI permits the seamless creation of realistic fraudulent content, enhancing the effectiveness of a wide range of scams.
Traditional barriers against fraud are also eroding: in its recent fraud detection analysis, Chainalysis found that 85% of scams involved fully verified accounts, a vulnerability exacerbated by the use of AI-generated identities.
The ramifications of AI in fraud extend well beyond the cryptocurrency landscape. Deloitte forecasts that AI-driven scams and deepfakes could cost the US economy $40 billion by 2027, and governments have taken notice: the FBI issued warnings in December 2023 regarding AI’s role in targeting cryptocurrency investors, underscoring the escalating threat.
Chainalysis has documented a continuous rise in scam activity, averaging a 24% increase annually since 2020, a trend it expects to persist into 2025, bolstered by advancing AI technology. A separate Chainalysis report estimates that illicit crypto transactions may have totalled as much as $51 billion in 2024, even though the share of illegal activity relative to overall crypto usage remained at a three-year low.
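As a back-of-the-envelope check, compounding that average 24% annual growth onto the roughly $9.9 billion recorded for 2024 lands close to the projection of more than $12 billion for 2025. The sketch below is illustrative arithmetic only, not Chainalysis’ forecasting methodology.

```python
# Illustrative arithmetic only: compounding the ~24% average annual growth rate
# onto the ~$9.9B recorded for 2024 roughly reproduces the ">$12B" 2025 projection.
revenue_2024_bn = 9.9       # reported 2024 scam revenue, in $ billions
avg_annual_growth = 0.24    # average annual increase in scam activity since 2020

projected_2025_bn = revenue_2024_bn * (1 + avg_annual_growth)
print(f"Projected 2025 scam revenue: ~${projected_2025_bn:.1f}B")  # ~ $12.3B
```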