
Sam Altman has often championed the transformative power of ChatGPT and OpenAI's large language models. But in a recent podcast appearance with comedian Theo Von, he took a different angle, one that should give users serious pause about how and where they use AI chatbots. Altman highlighted a major privacy concern: OpenAI retains all your inputs, and under current law, anything you say to ChatGPT could be legally requested and handed over.
Altman didn't mince words about how deeply people confide in ChatGPT. "People talk about the most personal shit in their lives," he said, noting that many treat it like a trusted confidant. But unlike a real-life therapist or lawyer, professions bound by confidentiality and legal privilege, chatbots enjoy no such protections. If a court or legal authority demands your chat logs, OpenAI has little recourse but to comply. That's a sobering reality for anyone who has used ChatGPT as a sounding board for relationship advice, moral dilemmas, or sensitive personal disclosures.
This vulnerability is exactly why some tech-savvy users are turning to local LLMs. These are AI chatbots that run entirely offline, directly on your own computer, without sending any data to the cloud. Apps such as GPT4All are rapidly improving, and thanks to recent advances in laptop hardware, like integrated GPUs and NPUs, it's increasingly feasible to run capable models locally. And the biggest advantage? Privacy. When you control the model, the data stays with you. You decide whether chats are saved or erased.
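The privacy difference is structural, and a minimal sketch makes it concrete. In the sketch below, `local_generate` is a hypothetical stand-in for whatever local runtime you actually use (GPT4All's Python bindings, for example); the point is that with a local model the transcript is just an in-memory object you control, and erasing it leaves no server-side copy behind.

```python
# Sketch of a privacy-first local chat session. Nothing here touches
# disk or the network; the transcript exists only in memory.

def local_generate(history):
    # Hypothetical placeholder: a real local model (e.g. via GPT4All's
    # Python bindings) would produce the assistant's reply here.
    return f"(local reply to: {history[-1]['content']})"

class ChatSession:
    """Holds the transcript only in memory. Unless you explicitly
    write it out yourself, it vanishes when the session does."""

    def __init__(self):
        self.history = []

    def ask(self, prompt):
        self.history.append({"role": "user", "content": prompt})
        reply = local_generate(self.history)
        self.history.append({"role": "assistant", "content": reply})
        return reply

    def erase(self):
        # Discard the transcript entirely; there is no third-party
        # copy for anyone to subpoena.
        self.history.clear()

session = ChatSession()
session.ask("Something personal I would never send to a cloud API")
print(len(session.history))  # 2 (one user turn, one assistant turn)
session.erase()
print(len(session.history))  # 0
```

With a cloud chatbot, the equivalent of `erase()` is a request to the provider, whose retention behavior you can't verify; locally, deletion is just your own code dropping your own data.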
Of course, this doesn’t mean your data is immune to legal action. If law enforcement or a court has a warrant, your PC itself can be searched. But the bar for that is much higher than a simple court order to a third party like OpenAI. More importantly, with local models, you have the option to never store anything in the first place.
Altman's candid admission underscores a broader point: while cloud AI models are convenient and incredibly useful, they exist in a gray area when it comes to legal privacy. Until regulations catch up, users should think carefully about how they interact with these tools, especially when the conversation shifts from casual prompts to deeply personal issues. A local LLM might not be as flashy as a constantly updated cloud model, but in some cases, it may be the wiser, safer choice.

