You already know not to share sensitive information with scammers over the phone, in emails or on sketchy websites that set off alarm bells.
But now that AI tools are everywhere, and more people are using them every day, it’s becoming surprisingly common to see users share highly personal details with chatbots. At the same time, lawsuits from individuals, companies and even government organizations have raised concerns about how user data is handled and shared.
I get the appeal. Chatbots are incredibly convenient and often helpful when you’re trying to solve a problem. But that doesn’t mean everything belongs in the prompt box. Sharing certain types of information can increase the risk of it being exposed, misused or stored in ways you didn’t expect.
Since I started using AI daily, I’ve followed one simple rule: assume anything you type could be seen by someone else. That mindset has made it much easier to spot the seven things you should never share with a chatbot.
Never let a chatbot access these 7 sensitive details about you or others

Whether you’re new to chatbots or consider yourself experienced, these seven reminders are worth keeping in mind when it comes to oversharing:
- Passwords: Never paste any of your passwords into a chatbot. Whether you’re trying to come up with a stronger password variation or troubleshoot a login issue, neither situation requires typing your actual credentials into an AI prompt.
- Financial information: Common sense says to never expose your debit/credit card details, bank account information, investment data or anything else that pertains to your finances to a chatbot. Asking an AI tool for general budgeting advice and then checking it with a legitimate financial adviser is fine, but handing your personal money data to a chatbot to get those answers is not.
- Social Security number: This is a goldmine for scammers and should never be included in any AI prompt, especially as concerns grow about AI companies sharing user data without consent.
- Confidential documents: Uploading documents that contain your address, account numbers and other personal identifiers just to get a quick analysis isn’t worth the exposure.
- Work-related information: You may get the urge to let AI summarize a lengthy email, PowerPoint presentation or business deck a colleague sent you. But it’s not worth violating company policy and risking exposure of your company’s confidential business by sharing it with a chatbot.
- Medical documents: Asking AI tools like Ada and Wysa about your physical and mental symptoms can be worthwhile, as long as you also bring those symptoms to a real health professional. But uploading documents that contain your medical history, health insurance details or lab results is a no-no.
- Other people’s information: There’s no need to make someone else susceptible to a data leak by sharing their personal details in a chatbot. Breaking a loved one’s trust just because their data was convenient to attach to a prompt isn’t worth the risk.
Bottom line
There are so many ways to get the tips, advice and creative ideas you need from AI tools. But protecting yourself is paramount when interacting with them. That means keeping your most sensitive details to yourself and resisting the urge to paste or attach them to ChatGPT, Gemini, Perplexity or others during a chat.
AI users should always remember that chatbots aren’t private diaries. They’re digital assistants that can help in many situations, but they’re no safe haven for your information.
