What if your estate trustee had to manage not just your estate, but your digital clone?
Inspired by my colleague Margarita’s blog on AI chatbots in a criminal trial, I started thinking about how these emerging technologies are reshaping the world of estates.
Popularly known as ghostbots, these AI recreations of the deceased raise complex questions that cut across the fields of technology, law, and estate administration. What started as niche personal experiments is now becoming a set of commercial products. Whether individuals wish to live on through digital avatars or to protect their legacy from such replication, these developments demand legal and ethical attention.
The Technology Itself
The phenomenon of ghostbots began gaining attention as early as 2016, when stories such as “The Man Who Turned His Dead Father Into a Chatbot” (BBC) and the virtual reunion between a grieving mother and her deceased daughter in “Meeting You” (MBC) surfaced. Since then, several platforms have emerged to commercialize the idea of digital immortality, including:
- Replika.ai – AI companions trained to simulate personality and conversation.
- HereAfter AI – A service that lets individuals record stories in their own voice for future generations to hear.
- StoryFile – AI-powered interviews that allow people to “speak” posthumously.
- RAVATAR and CAAVault – Projects creating synthetic performers and AI replicas, particularly in the entertainment industry.
The Legal Side
Legal responses to ghostbots remain a reactive patchwork. Most current legislation focuses on living subjects – particularly protection against deepfakes and the unauthorized use of name, image, and likeness (NIL):
In the United States, California and 32 other states have enacted specific laws addressing deepfakes, focusing particularly on issues related to elections and consent. Meanwhile, Australia has recently introduced criminal sanctions through the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024, targeting the misuse of deepfake technology. The European Union has also taken steps to regulate synthetic media, including deepfakes, by requiring disclosure under the EU AI Act. In Canada, the legal landscape surrounding deepfakes is still developing, with ongoing discussions about how best to address these emerging challenges.
Where the law is notably underdeveloped is in governing the posthumous use of these digital identities. A recent piece in JURIST highlights the uncertainty in adapting existing legal frameworks to address post-mortem privacy and the commodification of personality traits in digital form. Academics have emphasized the need for comprehensive frameworks that distinguish between privacy, property, and personality rights after death. Crucially, not all jurisdictions recognize post-mortem personality or NIL rights.
The Estate Angle
As more individuals begin to train or interact with AI tools to preserve their voice, stories, and likeness, these digital remains present a new category of non-traditional estate assets.
One proposal to address this is the “Do/Don’t Bot Me” clause. This clause can be included in wills or digital estate plans to direct whether – and how – a person’s voice, likeness, or personality may be used by AI after their death.
From an administration perspective, these AI personas introduce novel questions: Who maintains the bot? Is funding allocated? Is someone named to oversee it? What happens if the platform becomes obsolete?
Key Takeaways
This is not science fiction. Many people already curate digital content in ways that could later serve as training data for a ghostbot – intentionally or otherwise. Estate professionals may want to raise this topic with clients during planning, particularly those who:
- Engage heavily with AI and digital platforms;
- Have strong feelings about digital privacy or posthumous legacy;
- Are public figures or artists with a commercially valuable persona.
Some early-stage considerations for estate practitioners:
- For those who want a ghostbot: Provide detailed instructions, designate a digital executor, and set up funding mechanisms.
- For those who don’t want one: Include a “Do Not Bot Me” clause in the will or digital will, and ensure that no AI platform is granted access to stored data or likeness.
- For everyone else: Discuss the possibilities. Just as we plan for tangible property, digital identity now deserves a place in modern estate conversations.
We are entering an era where the line between memory and simulation is blurring (cue the ominous music). The implications – for grief, legacy, and the law – are just beginning to unfold.
Boris