As artificial intelligence capabilities expand, a new digital dilemma is emerging: AI-powered tools can now create realistic digital replicas of deceased individuals using their photos, videos, messages, and social media posts. These “AI ghosts”—chatbots, voice simulators, and even video representations trained on a person’s digital footprint—are becoming increasingly sophisticated and accessible to anyone with basic technical skills.
The technology raises profound questions about consent, privacy, and the rights of the deceased. While some families find comfort in these digital memorials, others view them as disturbing violations of their loved ones’ memory. This tension has sparked a complex legal and ethical debate: Can you legally prevent yourself from becoming an AI ghost after death?
The answer, according to legal experts and estate planning professionals, is complicated. Current laws offer limited protection, and writing “no AI resurrections” into a will may not provide the safeguard many people assume it would.
Digital resurrection technology has evolved rapidly from experimental projects to commercially available services. Companies like HereAfter AI, SeanceAI, and StoryFile now offer platforms where users can upload photos, videos, and text messages from deceased loved ones to create interactive AI representations.
The process is surprisingly straightforward. Large language models, including those behind ChatGPT, can be fine-tuned on or prompted with a person’s digital communications—text messages, emails, social media posts—to generate responses that mimic their communication style. More advanced services can clone voices from audio recordings or create video avatars that appear to speak in the deceased person’s voice.
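To make the mechanics concrete, here is a minimal, hypothetical sketch of the simplest version of this technique: few-shot prompting a large language model with samples of a person’s writing so that its replies imitate their style. The model name, sample messages, and prompt wording are illustrative assumptions, not the approach of any particular service.

```python
# Hypothetical sketch: style imitation via few-shot prompting.
# Assumes the OpenAI Python SDK (pip install openai) and an API key in the
# OPENAI_API_KEY environment variable; the model choice is illustrative.
from openai import OpenAI

client = OpenAI()

# A handful of messages from the person whose style is being imitated.
# In practice these might come from exported texts or emails.
style_samples = [
    "ha! told you the garden would survive the frost. stubborn, like me.",
    "Don't worry about calling late. I never sleep before the news anyway.",
]

system_prompt = (
    "You are imitating the writing style of a specific person. "
    "Match their tone, vocabulary, and quirks, based on these samples:\n"
    + "\n".join(f"- {s}" for s in style_samples)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any chat-capable model would do
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "How was your day?"},
    ],
)
print(response.choices[0].message.content)
```

Commercial services presumably layer more on top of this basic pattern—retrieval over a larger message archive, fine-tuning, voice and video synthesis—but the core idea is the same: the richer the digital footprint, the more convincing the imitation.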
This technology gained widespread attention recently when a realistic video simulation was used to deliver a murder victim’s impact statement in court. The incident sparked significant social media backlash, with many describing the AI representation as deeply unsettling rather than meaningful.
Muhammad Aurangzeb Ahmad, a computer science professor at the University of Washington Bothell, was among the earliest researchers to create such a system. More than a decade ago, after his father’s death, Ahmad built a digital replica to ensure his future children could interact with their grandfather. Unlike today’s commercially available tools, Ahmad’s system required extensive technical expertise—he had to train his own AI model and kept the system isolated on a single laptop accessible only to family members.
Ahmad’s experience illustrates both the appeal and the complications of AI resurrection technology. While his family found value in the digital representation, Ahmad later realized he never asked his father’s permission for the project. The bot also reflected Ahmad’s specific relationship with his father, potentially feeling “off” to siblings who had different dynamics with their parent.
The legal framework surrounding AI ghosts remains largely uncharted territory. Katie Sheehan, a managing director and wealth strategist for Crestwood Advisors who specializes in estate planning, reports that the topic rarely arises in professional practice.
“I have not seen any documents drafted to date taking this into consideration, and I review estate plans for clients every day,” Sheehan explains. The legal profession simply hasn’t caught up with the technology’s rapid advancement.
While it’s theoretically possible to include AI resurrection restrictions in estate planning documents, the effectiveness of such provisions is questionable. Sheehan notes that people could draft language preventing executors from sharing texts, voice recordings, images, or writings with AI tools. However, these restrictions would only bind the person’s official estate representatives—they wouldn’t prevent other family members, friends, or even strangers from creating unauthorized digital replicas using publicly available information.
The challenge is compounded by the patchwork of relevant laws. The Revised Uniform Fiduciary Access to Digital Assets Act, adopted by most states, governs access to deceased individuals’ online accounts like social media and email. However, this law doesn’t directly address AI ghosts, though it may cover some of the source material used to create them.
For celebrities and public figures, existing right of publicity laws provide some protection against commercial use of their likeness after death. However, these protections typically don’t extend to non-commercial, personal uses of someone’s digital likeness—exactly the scenario most families encounter with AI ghosts.
Even if someone successfully includes AI resurrection restrictions in their will, enforcement presents significant challenges. Estate planning documents are only binding on the deceased person’s official representatives and beneficiaries. They cannot control the actions of others who might have access to the person’s digital footprint.
Consider the complexity: a person’s social media posts, photos tagged by friends, voice messages sent to multiple recipients, and other digital traces exist across numerous platforms and in other people’s possession. Creating comprehensive restrictions that address all potential sources of data would be extremely difficult, if not impossible.
Victoria Haneman, a law professor who has written extensively on digital resurrection law, points to a telling example. After publishing her research, a lawyer contacted her about a client whose deceased grandmother’s image was used without permission to create a dancing meme. Despite the family’s distress, they had no legal recourse because the use wasn’t commercial.
“If it’s not being used for a commercial purpose, she really has no control over this use,” Haneman explains. “And she’s deeply troubled by this.”
Rather than relying solely on estate planning, some legal experts advocate for a different approach: establishing a “right to deletion” that would focus on removing the underlying data used to create AI ghosts rather than regulating the digital replicas themselves.
This concept would grant living family members or designated representatives the authority to request deletion of a deceased person’s data from various platforms and services. The approach recognizes that controlling the output (AI ghosts) is more complex than controlling the input (personal data).
Haneman argues this solution would be more equitable than estate planning alone. More than 60 percent of Americans die without a will, and those who lack formal estate plans are disproportionately people without significant wealth, as well as women and racial minorities. A right to deletion framework could provide protection for these groups without requiring expensive legal services.
However, even a right to deletion would face practical limitations. Data exists across countless platforms, in other people’s devices, and in archived formats that may be difficult to identify and remove. The global nature of the internet means data could be stored in jurisdictions with different privacy laws.
The legal complexity reflects deeper cultural uncertainties about death, technology, and memory. Some families find AI ghosts comforting, viewing them as a natural extension of traditional memorialization practices like keeping photographs or saved voicemails. Others see them as disturbing violations that prevent healthy grieving processes.
Mental health experts have raised concerns about the psychological impact of AI ghosts. While these replicas might provide short-term comfort, they could interfere with the natural grieving process by making it harder for people to accept loss and move forward. Some worry that mourners might become emotionally dependent on AI representations rather than processing their grief.
Ahmad’s experience with his children illustrates these concerns. Initially, his young kids became confused about whether their grandfather was alive or dead, particularly during the early pandemic when many family interactions moved online. Ahmad had to restrict access to the AI system until his children were older and better able to understand the distinction between the bot and their actual grandfather.
The growing “digital afterlife industry” raises additional concerns about potential exploitation. Critics worry that companies might eventually target grieving individuals with aggressive marketing, offering free trials during vulnerable moments and then requiring subscriptions to maintain access to AI representations of loved ones.
More troubling scenarios involve advertising models where conversations with AI ghosts could feature ads delivered in the deceased person’s voice, or data harvesting where personal information from the dead—which receives fewer legal protections than data from the living—could be used for commercial purposes.
These concerns aren’t merely theoretical. The business model incentives of AI companies often involve collecting and monetizing user data, and the data of deceased individuals represents a largely unregulated resource.
For individuals concerned about becoming AI ghosts, several practical steps can provide some protection, though none are foolproof:
Estate planning provisions: Work with an attorney to include specific language in wills and powers of attorney that restricts executors from sharing personal data with AI tools or authorizing digital replicas.
Digital legacy management: Use existing tools to designate trusted contacts for social media accounts and specify what should happen to digital accounts after death.
Family communication: Clearly communicate preferences about AI resurrection to family members, as their cooperation may be more important than legal restrictions.
Privacy settings review: Regularly audit social media privacy settings and consider limiting public access to personal content.
Data minimization: Be mindful of the digital footprint being created and consider periodically deleting old posts, messages, and other personal data (see the sketch after this list).
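To make the data minimization step concrete, here is a minimal, hypothetical sketch that scans a locally downloaded social media archive and flags posts older than a cutoff for manual review and deletion. The file name, JSON layout, and field names are assumptions chosen for illustration; real platform exports vary.

```python
# Hypothetical sketch: flag old posts in a downloaded data export.
# Assumes a JSON file of posts shaped like:
#   [{"id": "123", "created_at": "2016-04-02T18:30:00Z", "text": "..."}]
# The layout and field names are illustrative; real exports differ by platform.
import json
from datetime import datetime, timedelta, timezone

ARCHIVE_PATH = "posts.json"       # your exported archive
CUTOFF = timedelta(days=5 * 365)  # flag anything older than roughly five years

with open(ARCHIVE_PATH, encoding="utf-8") as f:
    posts = json.load(f)

now = datetime.now(timezone.utc)
stale = [
    p
    for p in posts
    if now - datetime.fromisoformat(p["created_at"].replace("Z", "+00:00")) > CUTOFF
]

print(f"{len(stale)} of {len(posts)} posts are older than the cutoff:")
for p in stale:
    print(f'  {p["created_at"]}  {p["text"][:60]}')
```

Deletion itself still has to happen through each platform’s own tools, but the underlying point holds: shrinking the archive shrinks the raw material available for an unauthorized replica.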
Ahmad’s children represent the first generation growing up with AI ghosts as a normal part of life. His daughter once suggested building a robot version of her grandfather that her father could hug—an idea that might seem natural to someone raised with AI companions but could feel unsettling to older generations.
This generational divide may ultimately determine how society approaches AI resurrection. If future generations view digital replicas as normal memorialization tools, cultural pressure to respect “no AI resurrection” requests may diminish. Conversely, if AI ghosts become widely viewed as disrespectful to the dead, legal protections may become less necessary.
The technology continues advancing rapidly. Ahmad is currently working to improve his father’s digital replica with better voice synthesis that can accurately capture his South Asian accent. Other researchers are developing realistic video representations and augmented reality tools that could make AI ghosts even more lifelike.
The intersection of AI technology, legal frameworks, and cultural attitudes around death creates a complex landscape that will likely evolve significantly over the coming years. Current legal protections are inadequate for people who want to prevent AI resurrection, but developing comprehensive solutions faces both technical and philosophical challenges.
The debate reflects broader questions about digital rights, technological consent, and the balance between innovation and individual autonomy. As AI capabilities continue expanding and digital afterlife services become more sophisticated, society will need to develop clearer frameworks for navigating these sensitive issues.
For now, people concerned about becoming AI ghosts have limited options, and even the most carefully planned estate documents may not provide complete protection. The most effective approach may be combining legal precautions with family communication and cultural advocacy for respecting the wishes of the deceased—recognizing that technology has outpaced both law and social norms in this deeply personal domain.