Artificial Intelligence Can Be Used For Crypto Fraud.

While the world is discussing how artificial intelligence can make life easier, there are also reasons to sound the alarm: any tool can be misused, and a powerful, multifunctional one all the more so. Here are three ways AI can already be used for fraud. If you use cryptocurrency for Melbet betting, you need to know how artificial intelligence could take your tokens, because you may have to deal with one of these methods.

Method 1: Make it easier to hack DeFi projects

Eric Jardine, Cybercrimes Research Lead at Chainalysis, outlined the evolving landscape of illicit blockchain activity in 2023 and offered insights into the tactics criminals might adopt next. Jardine noted that ongoing discussions within the industry are exploring how emerging technologies, particularly large language models (LLMs), could reshape blockchain crime.

According to Jardine, these technologies could affect different types of criminal activities. For instance, AI models could be utilized to enhance DeFi’s security through code audits. Conversely, the same models could be exploited by illicit actors to identify vulnerabilities in smart contracts.

DeFi has emerged as a prime target for cryptocurrency theft, with hackers exploiting code flaws, acquiring private keys, and manipulating prices to pilfer assets. While formal audits have become customary within the blockchain industry, their effectiveness in preventing hacks remains to be determined. The integration of AI introduces further uncertainty, as it could either bolster the security of smart contracts or empower attackers, with the balance of its advantages shifting over time.

Despite these challenges, there has been a noticeable decline in the absolute volume of stolen DeFi funds, plummeting from $3.1 billion in 2022 to $1.1 billion in 2023. Additionally, incidents decreased from 273 to 172 over the same period.

Method 2: Automate the “romance” scam

In addition to its potential impact on DeFi, artificial intelligence presents concerning implications for romance scams, colloquially referred to as “pig butchering” scams. These schemes typically commence with seemingly innocuous interactions, gradually evolving into fabricated relationships that scammers exploit for financial gain.

Eric Jardine highlighted the alarming prospect of large language models, with their effectively unlimited patience and creativity, being wielded by illicit actors to perpetrate these scams with devastating effectiveness.

Despite an overall decrease in total scam volume from $6.5 billion to $4.6 billion, romance scammers operating in the crypto sphere nearly doubled their revenue last year compared to 2022. The average payment in a romance scam is $4,593, though individual victims likely lost substantially more, since these scams often involve multiple payments.

Despite these concerning trends, illicit activity as a proportion of all crypto transactions dwindled to a mere 0.34% of blockchain activity in 2023.

Method 3: Voice cloning

New generative AI tools capable of replicating anyone’s voice or appearance are now readily accessible to the public, leading to an alarming surge in scams that exploit this capability.

Various types of scams leveraging AI voice cloning software have been reported. For instance, an Ontario resident fell victim to a scam in which he received a call purportedly from his fishing buddy claiming to have been arrested for texting while driving and causing an accident. Believing it was his friend in need, the man transferred $8,000, only to realize later that it was a fraudulent scheme.

Even celebrities like Scarlett Johansson are not immune to such exploitation. Legal action is being pursued as an app developer allegedly utilized an AI-generated clone of Johansson’s voice to simulate her endorsement of a product.

Consider a scenario where you receive a call that appears to be from your sister. The caller’s voice is uncannily identical to hers, and she excitedly shares news about purchasing a new house but requests financial assistance for the down payment, promising repayment within a month. While your instinct may be to help, verifying the caller’s identity is crucial before taking any action.

In the same way, AI can be used for crypto scams. For example, a generated voice can communicate with a person who holds cryptocurrency and convince him to “invest,” “borrow” (on behalf of a friend), “buy/sell at a profit,” or “protect” his coins from hackers.