Saturday, December 27

Personal Finance: The year in AI investment scams


American consumers surrendered more than $12 billion to fraudsters in 2024, according to the Federal Trade Commission. Nearly half of that, just under $6 billion, was lost to investment scams, and 2025 is on pace to eclipse that number.

Over half of that financial fraud now involves the use of artificial intelligence, making these crimes easier to fall for and harder to detect before the damage is done. And while financial institutions and regulators are themselves employing AI to identify and mitigate investment fraud, the only truly effective prevention is to avoid becoming a victim. Here is a look at some of the most successful AI-enabled financial scams of 2025.

Phantom AI trading bots. Investors have long chased the holy grail of a foolproof trading system that could generate consistent profits (in the finest tradition of Ponce de León). A host of new automated trading programs have appeared, claiming to harness the power of artificial intelligence to beat the market. Called “bots” (short for robots), these are essentially software programs capable of processing huge amounts of data to detect patterns, and many make outrageous claims, including guarantees of profits or astronomical records of success.

While the track record of legitimate trading bots has been mixed at best, bad actors create programs that lure investors into depositing increasingly large sums with a broker-dealer of the scammers’ choosing. Once the pot has grown large enough, the bot may disappear or cease functioning, taking the investor’s funds with it, a scheme called a “rug pull.” Another variation is the pump-and-dump, in which the bot promotes a little-known cryptocurrency to artificially inflate its value until the bot sells its own holdings, crashing the price. Others trick the target into granting access to a digital wallet and then drain the account.

Celebrity deepfakes. A deepfake is a manufactured image or video that uses a type of AI called deep learning to replicate a real person and manipulate what they appear to be saying. Fraudsters love fake celebrity endorsements because the familiar visage creates a sense of trust in the victim. Many people form what are called parasocial relationships with public figures, a one-sided connection in which the fan feels they know the celebrity and therefore trusts their recommendation.

An AI-generated bogus celebrity might offer a free product, asking only that the customer pay shipping by logging onto a counterfeit website. Poof, a $300 charge. Or worse. Other schemes involve promoting risky or fraudulent crypto investments or penny stock offerings.

According to cybersecurity firm McAfee, 72% of Americans have seen a deepfake pitch, 31% have clicked on the link, and 10% have lost money to the scam. The firm’s list of the 10 most frequently faked celebrities includes Taylor Swift, Tom Hanks, Scarlett Johansson, Sydney Sweeney and LeBron James, pitching everything from free cookware and cosmetics to magic pink salt, or even soliciting donations for victims of the Los Angeles fires. And the rise of deepfake scams doesn’t stop there. McAfee also lists the 10 most impersonated social media “influencers,” including such household names as Pokimane, MrBeast, Karina and Brooke Monk, obviously targeting the younger set (as if the real TikTok influencers aren’t bad enough). Caveat emptor.

Crypto recovery room scams. What could be more fun for a lowlife criminal than to swindle an unsuspecting mark? How about swindling the same victim again?

Fraudsters pose as law enforcement agencies, law firms or crypto recovery specialists offering to assist in reclaiming cryptocurrency lost in a previous scam, hence the term “recovery room.” These chiselers often create flashy websites featuring AI-manufactured testimonials from nonexistent clients and charge a hefty upfront fee, typically payable in cryptocurrency as well. Any red flags here?

Tech support and fake QR code scams. It used to be that phishing emails were easily recognizable by their poor grammar and spelling errors. No more. Using AI, criminals produce professional-looking pitches that entice gullible targets into signing up for tech support services they do not need, then fleece the victims. They often pose as well-known cybersecurity firms and convince victims to grant remote access so they can infect their devices with malware.

Another twist is the fake QR code. Perpetrators advertise or distribute handbills or emails with a QR code that, once scanned, can install malware or harvest your personal information for nefarious purposes.

How to protect yourself. Bad guys are always a step or two ahead of regulators and law enforcement when it comes to deployment of artificial intelligence. Consumers need to be aware of the increasingly realistic solicitations to avoid falling victim.

Warning signs of AI investment fraud include guarantees of impressive returns. Representations that an investment is a sure thing or without risk are almost always fraudulent. If it sounds too good to be true …

Demand for upfront payment is another alarm bell, especially if payment is requested in cryptocurrency. Also be highly skeptical of any unsolicited offer to make money or recover lost assets, especially one that employs high-pressure tactics. Always independently verify the identity of anyone with whom you are considering a financial transaction. And never scan a QR code from an unknown source.

Investment fraud often involves solicitation to invest in a hot stock or digital asset. Remember that it is illegal for an unregistered individual to solicit securities investments. Before opening an account, do a background check on the individual at BrokerCheck.Finra.org to review the disciplinary history and work experience of legitimate representatives.

If you are the victim of a scam, report it as soon as possible to law enforcement. Contact the Federal Trade Commission at ReportFraud.FTC.gov or the FBI Internet Crime Complaint Center at IC3.gov. Both have online resources to direct you to your next steps. And although it is unlikely that you will be reunited with your money, you might help someone else avoid the same fate.

Christopher A. Hopkins, CFA, is a co-founder of Apogee Wealth Partners in Chattanooga.
