ChatGPT is not a therapist: how using AI instead of traditional therapy can be harmful

She also notes that ethical concerns arise from the lack of regulation around ChatGPT. It has not been tested or approved as a therapy tool, so its use in that role can lead to harmful results.

“There have been some really tragic cases of people who have had really negative results,” said Dr. Gold. “There was a reported case of a child who died by suicide because of a dialogue said to have taken place on ChatGPT.”

The case she refers to is that of 16-year-old Adam Raine, a young man who turned to ChatGPT to cope with his anxieties and eventually took his own life. His parents claimed that the program coached him on how to plan and carry out his suicide.

“It can be really dangerous,” said Dr. Gold.

With refinement, AI can help, but it will never replace traditional care

Although ChatGPT, as it stands today, is not a viable source of therapy, that does not mean people should throw the baby out with the bathwater. AI solutions have their benefits, even if ChatGPT specifically does not, and with refinement they could become valuable tools for tackling some of the challenges people with mental disorders face, such as access to care.
