AI Could Cause Psychological Harm Through “Griefbots,” Researchers Warn

Researchers from the University of Cambridge are cautioning about the psychological risks of the emerging “digital afterlife industry.” According to a recent study, “griefbots” or “deadbots” powered by generative AI could cause real harm to mourners if left unregulated.

These bots, such as Project December and HereAfter AI, utilize AI algorithms to simulate conversations with deceased individuals based on their past digital interactions. While these innovations might seem reminiscent of science fiction, researchers at Cambridge’s Leverhulme Centre for the Future of Intelligence highlight serious concerns.

They foresee scenarios in which AI could inadvertently distress users: grieving relatives might receive marketing pitches delivered in the voice of a departed loved one, or a child might be persuaded by an AI recreation of a deceased parent to meet someone in real life.

The study emphasizes the necessity of “off” switches on these griefbots, allowing users to disengage from interactions and avoid feeling haunted by them. However, complications may arise depending on the contractual agreements the deceased consented to during their lifetime.

Co-author Dr. Tomasz Hollanek even proposed the concept of a “digital funeral” to retire these AI entities gracefully.

“This area of AI is an ethical minefield,” noted co-author Dr. Katarzyna Nowaczyk-Basińska, stressing the importance of safeguarding the rights of both data donors and users engaging with AI-driven afterlife services.

The researchers outlined ethical principles for developing griefbots, including the provision of visible disclaimers about risks, restrictions preventing children from using them, and ensuring “mutual consent” criteria for both the deceased and users.

Source: Business Insider
