Lord Wiggle to Not The Onion@lemmy.world • English • 3 days ago
Therapy Chatbot Tells Recovering Addict to Have a Little Meth as a Treat (futurism.com)
cross-posted to: aboringdystopia@lemmy.world
@deathbird@mander.xyz • 7 points • 2 days ago
Sue that therapist for malpractice! Wait… oh.
@jagged_circle@feddit.nl • 3 points • 2 days ago
Pretty sure you can sue the AI company.
@webghost0101@sopuli.xyz • 3 points • 2 days ago
Pretty sure it's in the ToS that it can't be used for therapy.
It used to be even worse. Older versions of ChatGPT would simply refuse to continue the conversation at the mention of suicide.
@jagged_circle@feddit.nl • 2 points • edited 1 day ago
What? It's a virtual therapist. That's the whole point.
I don't think you can sell a sandwich and then write "this sandwich is not for eating" on the back to get out of a food poisoning case.