Judge Rules Anything You Tell AI Can Be Used Against You, Even If You Ask for Legal Advice

A federal court case in New York has established a precedent that could reshape how Americans interact with artificial intelligence tools. According to the ruling, conversations with AI chatbots are not protected by attorney-client privilege, even when users seek legal guidance.

The ruling emerged from United States v. Bradley Heppner, where prosecutors sought access to approximately 31 documents the defendant created using Claude, an AI tool developed by Anthropic. Before his arrest on securities fraud charges, Heppner had entered queries about the government’s investigation into the chatbot and then shared its responses with his defense attorneys.

Defense counsel argued these documents should be protected as privileged communications. The government disagreed, filing a motion that systematically dismantled each pillar of the privilege claim.

Prosecutors argued the AI documents failed every requirement for attorney-client privilege. First, they were not communications between a client and attorney since the AI tool is not a licensed lawyer. Second, they were not made for obtaining legal advice because Anthropic’s terms explicitly state that Claude does not provide legal advice and users should consult qualified attorneys. Third, they were not confidential since Heppner voluntarily shared his queries with a third-party commercial platform whose privacy policy permits disclosure to governmental authorities.

The government emphasized that merely forwarding unprivileged documents to an attorney later does not retroactively create privilege protection. “Sending preexisting documents to counsel does not confer attorney-client privilege,” prosecutors wrote, citing established legal precedent.

Beyond the courtroom, lawyers report that clients increasingly turn to chatbots like ChatGPT for legal guidance, often receiving misleading information that creates unrealistic expectations.

“It’s like the WebMD effect on steroids,” said Dave Jochnowitz, a partner at law firm Outten & Golden, referring to how medical websites can give people a misguided understanding of their conditions.

Personal injury lawyer S. Randall Hood noted that AI-generated advice makes clients skeptical of actual legal counsel. “ChatGPT is telling them, ‘You got a killer case,’” Jochnowitz explained, but the models lack understanding of the full context, applicable laws, or case history.

Meanwhile, law firms are cautiously adopting AI for document review and drafting, particularly after several attorneys faced sanctions for submitting AI-generated briefs containing false information.

“There’s been a lot of experimentation,” said Aubrey Bishai, chief innovation officer at Vinson & Elkins, adding that tedious tasks like copying provisions into spreadsheets are becoming obsolete.

The industry is particularly anxious about AI’s impact on the billable hour: if technology speeds up tasks, there are fewer hours to bill. The American Bar Association’s recent guidance on AI clarified that lawyers cannot bill for more time than they actually spend on a task.

Legal software companies have already felt the potential for disruption: their stock prices dropped sharply after Anthropic released new legal products. Still, experts caution against overreaction.

Elliott Rush, a law professor and economist at ETH Zurich, noted that while Anthropic’s tools excel at document analysis, they cannot replace research databases like Westlaw or LexisNexis because they lack connections to comprehensive case law and statutes.

The Heppner case establishes that AI platforms operate fundamentally differently from human attorneys, who owe duties of loyalty, confidentiality, and professional responsibility to courts and bar associations. The policy considerations underlying attorney-client privilege, in other words, cannot simply transfer to machines.

As J.H. “Rip” Verkerke, a law professor at the University of Virginia, observed about AI adoption in legal practice: “There are going to be some big claims about what can be done. A lot of them will turn out to be vaporware. That’s what a lot of firms have discovered.”