- HoloAvatar by 2Wai is an AI tool that creates interactive digital replicas of deceased loved ones from just three minutes of footage, enabling real-time conversations in over 40 languages, but it raises serious ethical concerns about consent and privacy in grief tech.
- 2Wai’s HoloAvatar uses proprietary FedBrain technology for on-device processing to protect user data and minimize AI errors.
- The app, co-founded by former Disney actor Calum Worthy, launched in iOS beta on November 11, 2025, and supports avatars for both deceased and living individuals.
- Public backlash highlights risks of commercializing grief, with comparisons to dystopian scenarios and calls for stronger post-mortem privacy laws, amid a growing grief tech market valued at over $500 million globally.
What is 2Wai’s HoloAvatar and How Does It Recreate Deceased Loved Ones?
2Wai’s HoloAvatar is an innovative AI-powered mobile app feature that generates conversational video avatars from minimal user inputs, allowing interactive chats with digital recreations of deceased individuals. Launched in iOS beta on November 11, 2025, by startup 2Wai, co-founded by actor Calum Worthy and producer Russell Geyser, it processes three minutes of footage, audio, and text to produce lifelike replicas supporting over 40 languages. While aimed at preserving legacies, it has sparked intense debate over the ethics of simulating personal loss.
How Does HoloAvatar’s Technology Work and What Are Its Privacy Features?
At the core of HoloAvatar is 2Wai’s proprietary FedBrain technology, which enables on-device AI processing to safeguard user privacy and restrict responses to approved data sources, thereby reducing the risk of inaccurate AI-generated “hallucinations.” Users upload personal media, and the system analyzes voice patterns, facial expressions, and speech habits to construct interactive avatars capable of real-time conversations. This approach contrasts with cloud-based alternatives by limiting data exposure, though experts like those from the Electronic Frontier Foundation emphasize the need for robust encryption in handling sensitive posthumous materials.
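2Wai has not published FedBrain’s internals, but the idea of restricting an avatar’s replies to approved data sources is a recognizable pattern: retrieve from a vetted corpus and decline to answer when nothing matches, rather than letting a model free-generate. The sketch below is a hypothetical illustration of that gating pattern (all names, snippets, and the scoring threshold are invented for the example), not 2Wai’s actual implementation:

```python
# Hypothetical sketch of "approved-source" response gating, loosely
# inspired by the stated goal of FedBrain (restrict replies to approved
# data to reduce hallucinations). Not 2Wai's actual code.
from dataclasses import dataclass


@dataclass
class ApprovedSnippet:
    topic: str
    text: str


# Invented example corpus: memories a family has explicitly approved.
APPROVED = [
    ApprovedSnippet("recipe", "Grandma's apple pie used cinnamon and a pinch of salt."),
    ApprovedSnippet("hometown", "She grew up in a small town by the lake."),
]


def overlap_score(query: str, snippet: ApprovedSnippet) -> float:
    """Fraction of query words found in the approved snippet (toy metric)."""
    query_words = set(query.lower().split())
    snippet_words = set(snippet.text.lower().split()) | {snippet.topic}
    return len(query_words & snippet_words) / max(len(query_words), 1)


def answer(query: str, threshold: float = 0.3) -> str:
    """Return the best-matching approved snippet, or decline.

    Declining instead of generating freely is what keeps the avatar
    from inventing facts outside its approved sources.
    """
    best = max(APPROVED, key=lambda sn: overlap_score(query, sn))
    if overlap_score(query, best) < threshold:
        return "I don't have an approved memory about that."
    return best.text
```

A production system would replace the word-overlap metric with embedding similarity and run the model on-device, but the gating logic, answer only from vetted material, would be the same.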
For living users, the tool extends to creating avatars for professional purposes, such as fan engagement or virtual coaching—Worthy’s own digital twin, for instance, shares anecdotes from his Disney Channel days. The app is currently free during its beta phase but plans a subscription model, estimated at $10 to $20 monthly, similar to other AI companion services. Development traces back to the 2023 SAG-AFTRA strikes, where performers advocated against unauthorized AI use of likenesses, motivating Worthy to focus on controlled, consent-based digital legacies.
2Wai secured $5 million in pre-seed funding in June 2025, alongside partnerships with British Telecom and IBM, underscoring industry interest in ethical AI applications. As Worthy noted during the launch, “Having worked as an actor, writer, and producer for the last 20 years, I experienced firsthand how challenging it is to create a meaningful relationship with fans around the world.” This background informs the app’s emphasis on bridging personal connections through technology.
Frequently Asked Questions
What Are the Ethical Concerns Surrounding AI Recreations of Deceased Loved Ones?
Ethical issues with tools like HoloAvatar center on consent, as recreations can be made without the deceased’s prior approval, potentially distorting legacies and exploiting grief for profit. Privacy advocates warn of vulnerabilities in handling personal data, while psychologists highlight risks of hindering natural mourning processes, urging users to seek professional counseling alongside such tech.
Is HoloAvatar Legal for Creating Digital Twins of the Deceased?
HoloAvatar operates in a legal gray area without federal regulations on posthumous digital likenesses, though California’s AB 1836, enacted in September 2024, prohibits unauthorized AI replicas of deceased performers without estate consent, imposing fines up to $10,000. For non-celebrities, enforcement relies on family opt-ins, but broader laws are under consideration to address deepfake abuses and data ownership.
Key Takeaways
- Innovation in Grief Tech: HoloAvatar advances AI by enabling multilingual, interactive avatars from minimal inputs, positioning 2Wai as a leader in legacy preservation tools.
- Privacy Safeguards: On-device processing via FedBrain minimizes data risks, but critics demand stricter post-mortem protections to prevent exploitation.
- Call for Regulation: As grief tech grows, stakeholders should advocate for comprehensive laws ensuring consent and ethical commercialization to balance innovation with human dignity.
Conclusion
2Wai’s HoloAvatar represents a bold step in AI recreating deceased loved ones, blending advanced technology with the profound human need to maintain connections beyond loss. While its FedBrain-driven privacy features and consent protocols address some ethical concerns in grief tech, the tool underscores the urgency for evolving legal frameworks to protect digital legacies. As the sector matures, responsible innovation could transform mourning into meaningful continuity—encouraging users to explore these tools thoughtfully while prioritizing emotional well-being.
The launch has ignited widespread discussion, with promotional materials depicting heartfelt scenarios like a digital grandmother advising her daughter during pregnancy or reading stories to grandchildren. Public sentiment on platforms like X has been polarized, with millions of views on related posts labeling it “dystopian” or “exploitative,” echoing fears from cultural references such as the Black Mirror episode “Be Right Back.” Critics argue it commodifies grief, potentially delaying closure, as one commentator stated: “This turns human beings psychotic by simulating loss rather than processing it.”
Despite backlash, the grief tech landscape is expanding, with competitors like HereAfter AI focusing on consent-based story avatars from pre-death interviews and StoryFile providing interactive videos for memorials. Replika’s experiences, including user distress from bot updates, serve as cautionary tales. Legal scholars from institutions like Stanford Law School note that current privacy laws inadequately cover the deceased, exposing families to subscription traps and data misuse. 2Wai’s opt-in requirements for deceased avatars aim to mitigate this, but enforcement remains a challenge.
Investor confidence in AI companionship persists, yet grief monetization demands ethical scrutiny—evidenced by closures like Eternal Digital Assets due to unsustainable models. As lawmakers consider extending protections beyond celebrities, inspired by deepfake incidents in elections, the future of HoloAvatar and similar innovations hinges on balancing technological promise with societal values. For now, 2Wai continues refining its platform, inviting beta users to contribute to a “living archive of humanity.”




