Have you asked ChatGPT for legal advice in the past year? If yes, OpenAI CEO Sam Altman has some bad news for you.
Sam Altman recently revealed in a podcast with Theo Von that conversations with ChatGPT can be used against you in court. So, while you're protected by legal confidentiality when speaking to a lawyer, no such privilege exists when you're talking to ChatGPT.
AI has integrated seamlessly into our lives. So much so that many now treat it like a digital doctor, therapist, relationship advisor, and even a lawyer. Whenever they need advice, they simply open ChatGPT, type a query, and receive an answer within seconds.
And this isn’t remotely far-fetched. In fact, recent research shows that people with no legal training are more likely to follow legal advice from ChatGPT than from real lawyers.
A major reason for this preference is that LLMs like ChatGPT often use formal, technical language that sounds professional. Lawyers, in contrast, tend to use simpler language, but with more words and elaboration.
So, should you rely on ChatGPT for legal advice? There is a clear answer to this one: No.
First off, ChatGPT can and does make mistakes. Hence, relying on it for legal advice isn’t wise, especially if you’re already in trouble. This was illustrated when two New York lawyers were fined after submitting a legal brief filled with fake case citations generated by ChatGPT.
Also, if you consider this alongside Sam Altman's recent comments, you'll see that seeking legal advice from ChatGPT could actually weaken your case if it ever ends up in court. Every question or statement can potentially be used as evidence, leaving you in a worse position than you started.
The CEO admitted that the AI industry is still figuring out how to protect user privacy when it comes to sensitive conversations.
“People use it — young people, especially, use it — as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT,” he said.
OpenAI is currently fighting a court order in its lawsuit with The New York Times, which requires ChatGPT to store chats from hundreds of millions of users globally. Previously, deleted chats were retained for up to 30 days before being permanently erased. But under the court order, OpenAI must preserve all deleted chats indefinitely, even those you thought were gone.
This means that in the event of a lawsuit, ChatGPT could be legally required to produce those conversations.
Altman added, “I think that’s very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago.”
People are turning to ChatGPT for all kinds of answers. And while it’s perfectly fine to ask for a lemon tart recipe, using it for legal advice is a major red flag.
Until AI companies figure out how to better protect user privacy, it’s safer to steer clear of ChatGPT for legal queries and research.