A growing number of Americans are turning to artificial intelligence tools for financial guidance, even for things as serious as retirement planning. Caitlyn Yingling, a Dallas-based recruiter who said she rarely checked her retirement account, finally logged into her 401(k) and was stunned.
The 32-year-old discovered her retirement savings were invested in a target-date fund set for someone who retired in 2015 — as if she had already left the workforce a decade ago.
Embarrassed but motivated, she turned to ChatGPT for help. The chatbot explained how target-date funds work, suggested shifting her target retirement year to 2060 and recommended a more growth-oriented allocation. Later, a human financial adviser confirmed the oversight and helped her fix the lingering settings.
Yingling’s experience, first reported by The New York Times (1), captures the pros and cons of using artificial intelligence for retirement planning.
It also raises a larger question: As more people turn to AI for financial guidance, how reliable is that advice when it comes to something as complex and high-stakes as retirement planning?
More Americans are turning to AI tools like ChatGPT to ask questions about budgeting, investing and retirement planning. Nearly 80% of respondents who sought financial advice from an AI tool say it improved their financial situation, according to an August survey by Intuit Credit Karma (2) of more than 1,000 adults. A separate survey by Empower found 47% of Americans (3) now feel comfortable using AI in their financial lives.
Chatbots are free, available 24/7 and respond instantly without any appointment required.
However, a recent study cited by Kiplinger (4) found that ChatGPT gets financial questions wrong about 35% of the time. Researchers at Investing in the Web asked the chatbot 100 personal finance questions (5) and found that more than a third of the answers were partially incorrect or flat-out wrong.
That’s a sobering statistic when the topic is retirement because small errors can compound over decades.
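To see why small errors matter so much over a long horizon, consider a quick back-of-the-envelope calculation. The balance and return figures below are purely illustrative assumptions, not data from Yingling's account or from any cited survey:

```python
# Illustrative only: a hypothetical saver parked in a too-conservative fund
# earning 5% a year versus a growth allocation earning 7% a year.
def future_value(principal, annual_rate, years):
    """Compound a lump sum once per year at a fixed rate."""
    return principal * (1 + annual_rate) ** years

balance = 50_000  # assumed starting 401(k) balance

conservative = future_value(balance, 0.05, 30)
growth = future_value(balance, 0.07, 30)

print(f"5% for 30 years: ${conservative:,.0f}")
print(f"7% for 30 years: ${growth:,.0f}")
print(f"Gap:             ${growth - conservative:,.0f}")
```

At these assumed rates, a two-percentage-point difference left uncorrected for 30 years opens a six-figure gap — which is why an allocation mistake like a mis-set target date is costly to leave in place.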
And 52% of Americans (6) who acted on AI-generated financial advice later said they made a mistake, according to Credit Karma.
Large language models like ChatGPT predict patterns in text. They don’t pull live market data, don’t inherently understand tax nuance and don’t act as fiduciaries required to put your interests first.
Andrew Lo, a finance professor at MIT Sloan, has described today’s AI chatbots as the “digital equivalent of sociopaths” (7). The label is harsh, but his point is that they are smooth, persuasive and devoid of empathy — able to present good and bad advice with the same confident tone.
That lack of ethical grounding matters in retirement planning. Should you convert a traditional IRA to a Roth? When should you claim Social Security? Which account should you withdraw from first to minimize taxes?
These decisions require context about income levels, longevity assumptions, tax brackets, risk tolerance and often emotional judgment. Chatbots don’t inherently understand your personal financial situation, goals or risk tolerance.
They also struggle with math and volatility modeling. When one journalist asked (8) ChatGPT how much $10,000 invested in Nvidia would be worth in 20 years, it produced a range of $38,000 to $164,000 — without clearly modeling taxes, volatility drag or inflation.
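Volatility drag — the gap between an asset's average yearly return and what it actually compounds to — is exactly the kind of nuance a naive projection misses. The sketch below uses made-up figures (a 20% average return with 40% annual volatility) and the standard approximation that the long-run compound rate is roughly the average return minus half the variance; it is not a model of Nvidia or of ChatGPT's answer:

```python
# Hypothetical figures: a volatile asset averaging a 20% arithmetic return
# with 40% annual volatility. The long-run compound (geometric) rate is
# approximately mu - sigma^2 / 2.
mu, sigma, years = 0.20, 0.40, 20
principal = 10_000

naive = principal * (1 + mu) ** years     # projection that ignores volatility
geo_rate = mu - sigma**2 / 2              # ~= compound rate after drag
dragged = principal * (1 + geo_rate) ** years

print(f"Naive projection:     ${naive:,.0f}")
print(f"With volatility drag: ${dragged:,.0f}")
```

Under these assumptions, the naive projection comes out roughly four times larger than the drag-adjusted one — a reminder that a single confident-sounding number from a chatbot can hide enormous uncertainty.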
While AI can’t be fully trusted to plan your retirement investments, it can be helpful when it comes to understanding financial concepts.
AI excels at explaining financial jargon in plain language: the difference between a 401(k) and an IRA, how required minimum distributions work or what net investment income tax means. It can also help spot simple mistakes, compare basic account options or walk users through how financial tools work.
But following investment advice provided by AI isn’t recommended unless you’ve reviewed it with a financial professional.
Here are some guidelines to keep in mind if you want to use AI for your finances:

- Use it for educational purposes, not for final decisions.
- Avoid sharing sensitive data, including your Social Security number or full account statements.
- Run major decisions past a professional.
For major financial decisions, human advisers remain the safer option, whether that’s a fee-only fiduciary adviser, a robo-adviser for simple portfolios or a planner offered through your employer.
Fee-only fiduciary advisers affiliated with the National Association of Personal Financial Advisors (NAPFA) are legally required to put clients’ interests first.
AI is rapidly reshaping retirement planning. It’s expanding access, lowering barriers and making financial education more conversational than ever. But when a tool gets the answer wrong 35% of the time, blind trust isn’t a strategy. And when it comes to retirement, accuracy matters most.
We rely only on vetted sources and credible third-party reporting. For details, see our editorial ethics and guidelines.
The New York Times (1); Credit Karma (2, 6); Empower (3); Kiplinger (4, 5); Wall Street Journal (7); Yahoo! Finance (8); Alliance Bernstein (9)
This article provides information only and should not be construed as advice. It is provided without warranty of any kind.