[Use Case] Recreation of Deceased by AI without Consent
Thank you for filling out a DADE use case! Please delete any labels that don't apply.
Roles involved:
deceased - 23-year-old woman; cause of death was autoimmune hepatitis
surviving "family" member (boyfriend) - not a formal family member; considered himself engaged to the deceased, though she had not formally consented to the engagement
Dependencies:
Content from Facebook
Content from personal communications (email and texts between deceased and boyfriend)
GPT-3 chatbot service for recreation of deceased: https://projectdecember.net/terminalSSL/
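For context on the GPT-3 dependency above: Project December's internals are not documented here, but the linked SF Chronicle story describes supplying a short "seed" of biographical text plus example dialogue, which the model then continues in character. Below is a minimal, hypothetical sketch of how such a seed prompt might be assembled from exported messages; the names, fields, and build_seed helper are illustrative assumptions, not Project December's actual code.

```python
# Hypothetical sketch only: assembling a persona "seed" prompt for a
# GPT-3-style completion model from exported personal messages.
# This is NOT Project December's actual implementation.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str   # "deceased" or "survivor"
    text: str


def build_seed(persona_name: str, bio: str, samples: list[Message]) -> str:
    """Build a plain-text seed: a short biography followed by example
    dialogue, which a completion model would be asked to continue
    in the persona's voice."""
    lines = [
        f"{persona_name}: {m.text}" if m.sender == "deceased" else f"You: {m.text}"
        for m in samples
    ]
    return (
        f"The following is a conversation with {persona_name}. {bio}\n\n"
        + "\n".join(lines)
        + f"\n{persona_name}:"
    )


if __name__ == "__main__":
    # Placeholder dialogue; real systems would draw on Facebook posts,
    # emails, and texts like those listed under Dependencies.
    samples = [
        Message("survivor", "Hey, how was your day?"),
        Message("deceased", "Long, but good."),
    ]
    seed = build_seed("Persona", "A short biographical description goes here.", samples)
    print(seed)  # This text would be sent to a completion endpoint as the prompt.
```

The consent problem arises exactly at this step: the biographical seed and example dialogue are drawn from the deceased's own words without her having authorized that use.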
Additional context:
The surviving boyfriend recreated his deceased girlfriend as a chatbot after her death from a long-term illness (autoimmune hepatitis).
The goal of the interaction:
The deceased was recreated without her permission or consent. Ideally, consent would have been given for this purpose in advance, much like an advance healthcare directive.
What actually happened:
The deceased was recreated via chatbot, and this "virtual" version of her remains in the Project December system to this day.
Qualitative assessment of what went wrong, what would have helped, etc.:
Some form of consent for recreation/impersonation would have helped, much like the advance healthcare directive described above.
Story for background: https://www.sfchronicle.com/projects/2021/jessica-simulation-artificial-intelligence/
This issue is spreading as well: I've run across many stories of terminally ill patients creating chatbots for their loved ones (with consent present in those cases, of course).