A trend has crept into estate planning consultations across the country. Clients walk in with printouts, screenshots, and surprisingly polished questions about dynasty trusts, Medicaid look-back periods, and asset protection structures. These clients aren't bringing documents from a previous attorney; they're bringing the not-so-sound legal advice of ChatGPT or another chatbot.
Estate planning attorneys are watching it happen and growing uneasy. It doesn’t take long for an experienced lawyer to look beyond the veneer of a polished document and find a recommendation that ignores state law, misreads a tax rule, or proposes a structure that will cost the family more in administration fees than it saves in taxes.
AI does some things well. But it is not an attorney, and the consequences of treating it as licensed counsel can be severe.
The Privilege Problem
The first issue has nothing to do with what AI knows or fails to know. It has to do with where the words go after a person types them.
Estate planning lawyer Marielle Hazen of the Hazen Law Group in Pennsylvania has spent decades guiding families through estate planning and elder law decisions. When clients mention they have already worked through some of their planning with a chatbot, she pauses the conversation to make sure they understand what just happened.
According to Hazen, "Attorney-client privilege protects the conversations you have with me. It does not protect the conversations you have with an AI chatbot. In a disputed estate, that distinction can be the difference between a clean administration and years of litigation built on your own words."
The point lands hardest in the cases that turn into fights. A will contested among siblings. A trust dispute over a parent's intentions. An IRS examination of a complex gift. Each of these can trigger discovery requests for any communications relevant to the testator's reasoning. Conversations with a lawyer are protected by privilege. Conversations with a chatbot sit on a corporation's servers and enjoy no such legal protection.
Estate planning attorney Jake Slowik of Slowik Estate Planning in Atlanta, GA puts the legal exposure in plainer terms. "Anything you type into an AI chatbot can be subpoenaed. In a trust dispute, a probate contest, or an IRS audit, those chat logs don't belong to you and aren't protected by attorney-client privilege."
The Accuracy Problem with AI
Elder law sits at one of the densest crossroads in American law: state probate codes, federal tax provisions, Medicaid eligibility rules that shift from state to state, the federal five-year look-back period for asset transfers, and trust doctrines that diverge sharply across jurisdictions. A chatbot drawing on the public internet can return a response that reads as if written by an expert while citing a rule from the wrong state, applying a tax exemption at last year's threshold, or describing a Medicaid planning technique that a particular state has already closed off.
Chatbots get things wrong without accountability. They get things wrong in ways that look right.
The damage from these errors tends to surface long after the conversation has ended. A parent enters a nursing home, applies for Medicaid, and the family learns the trust they set up two years earlier disqualifies them from benefits. A widow tries to fund a bypass trust and discovers the language pulled from a chatbot fails the state's statutory requirements. A son named as successor trustee finds the trust never owned the house it was supposed to hold.
"I tell clients that AI is a fine place to start a question, but a terrible place to finish one,” said Hazen. “The cost of bad estate or elder law advice often doesn't show up until someone has already passed away or entered a nursing home, when it's too late to fix."
Private Information Can Become Training Data
Estate planning conversations contain some of the most sensitive disclosures a person ever makes. Net worth. Business interests. Real estate holdings. And beyond the finances, the messy realities of life: a beneficiary with a substance abuse issue. The estranged daughter. The second marriage. The child from an affair.
In exchanges with your lawyer, these financial positions and family secrets are the variables needed to craft a strategic, compassionate estate plan that fits your circumstances. To a tech company, they are training data.
Slowik emphasizes this with his clients, saying, "Every asset you disclose to a chatbot is a data point stored on someone else's server. Your net worth, business interests, and family dynamics are now all in a tech company's training pipeline."
The disclosures happen casually. A user types a question. The chatbot answers. The exchange feels private because it looks like a conversation. The reality is closer to filling out a detailed financial questionnaire and mailing it to a company whose internal data practices the user has no real visibility into. Whether the data trains future models, gets reviewed by human raters, or sits in retention systems for years depends on a privacy policy most users never read.
The Fiduciary Gap
A licensed attorney operates under a fiduciary duty. The phrase carries real weight: a legal obligation enforced by state bar associations, malpractice exposure, and ethical rules that predate the existence of computers. A chatbot operates under terms of service.
The fiduciary gap shows up most visibly in what a chatbot will recommend. A prompt about minimizing estate tax often returns a stack of sophisticated structures: dynasty trusts, domestic asset protection trusts, family limited partnerships, irrevocable life insurance trusts. Each one is a real tool with real uses. Each one carries real costs that the chatbot tends to skip past.
Slowik warns that "AI will suggest a dynasty trust, a DAPT, and a family limited partnership without ever telling you that the setup costs, annual administration, and trustee fees might run tens of thousands of dollars a year. A good estate planning attorney talks you out of things as often as they talk you into them. A chatbot never will."
The Human Element
The deepest limitation of AI in this space has less to do with law than with people. An estate plan exists on paper. The plan executes through humans. A surviving spouse who has to remember how to fund the trust. An adult child who agrees to serve as trustee. An estranged sibling who decides whether to challenge a distribution. A family business partner who has the option of buying out the heirs.
The skipped conversation is where the practice of estate planning lives. Will the daughter agree to be trustee for her brother's spendthrift trust without resenting him for the next thirty years? Will the family business survive a buy-sell agreement that requires three siblings to agree on a valuation? Will the surviving spouse remember to retitle the house? A chatbot can describe each of these structures in sound prose. It cannot weigh whether the family in front of it can carry them.
Where AI Fits in the Estate Planning Process
Many lawyers do leave a narrow lane open. Hazen suggests AI as a starting point: a way to learn vocabulary before a meeting, get oriented to general concepts, or develop questions to bring into a consultation. Used that way, AI can add value without requiring you to divulge private information about your situation.
The decisions themselves belong with counsel. Drafting documents. Naming fiduciaries. Structuring trusts. Making tax elections. Each of these choices echoes for decades, and many only become visible after the person who made them has lost the ability to revisit them.