Would you use an AI chatbot instead of a therapist?

Would you trust an AI chatbot with vulnerable, personal information about yourself?

For some, this may seem like the far-fetched plot of a sci-fi film, but for others it is a reality much closer to home.

More Australians are turning to generative AI chatbots for emotional guidance and support, treating them as a free, personal therapist.

But experts remain at odds over the benefits, weighing free, accessible advice against privacy concerns and the risks around critical care.

How does it work?

Artificial intelligence expert Ronnie Das said chatbots use statistical models to predict what a person is likely to say, and to mirror that person's emotions back to them.

Artificial intelligence expert Ronnie Das says that AI can "absolutely" replicate human emotions. (Supplied: Ronnie Das)

“I think if you had asked me this question about [AI and therapy] five years ago, I would probably have laughed at you … but here we are today, talking about human emotions,” Mr. Das said.

“AI is absolutely able to replicate human emotions.

“The reason it has become so advanced is that it has been trained on the entirety of human knowledge that has been documented, at least digitally, over several decades.”

More people are turning to AI chatbots when they are emotionally distressed or looking for a moment of reflection.

In these cases, Mr. Das said AI was effective at “listening” and “validating”, giving users a pseudo-sense of being understood.

“User numbers in the mental health space are rising, and rising steeply,” Mr. Das said.

“AI learns about your behavior, who you are, and it steadily builds a kind of profile of what kind of person you are, how your mind works … and based on that it is very smart at personalizing its answers.”

What are the disadvantages?

But according to Australian Psychological Society president Sara Quinn, an AI chatbot cannot “feel or give” like a person can.

“No algorithm, no matter how intelligent or innovative we think it may be, can replicate the sacred space that is created between two people,” Dr. Quinn said.

“Current generative AI models are good at imitating how people communicate and reason, but it is precisely that – it is imitation.”

Dr. Sara Quinn is president of the Australian Psychological Society. (Supplied)

Dr. Quinn said there was a “rhetoric” that AI could simulate real conversations, but it lacked the ability to navigate “complex social cues”.

People can have maladaptive coping mechanisms that a chatbot cannot identify, or be at a point of critical need where medical intervention is required.

Dr. Quinn acknowledged that artificial intelligence would form an integral part of therapeutic treatment as the technology developed, but emphasized the need for privacy regulations.

“[It] must be approached, yes, with enthusiasm. It is something we can easily come to terms with, as long as we know AI’s place and ensure that we integrate the use of AI ethically,” she said.

Could it be beneficial?

When it comes to expanding access for those living in rural and remote areas, Dr. Quinn described AI chatbots as an “incredible asset” – a better alternative than people suffering in silence without support.

Amanda Davies, head of the School of Social Sciences at UWA, agrees.

She said people living in isolated areas struggled to find therapists amid long waiting times and high costs.

“Therapy is a luxury item for some people, even though it is essential,” Ms. Davies said.

Head of the School of Social Sciences at UWA, Amanda Davies. (ABC News: Ruby Littler)

“People have to fund things like that from their household budgets, and that is where ChatGPT can fill a gap, which is an unfortunate truth.”

It often works best for people who are already engaged in therapy, because they have the language and tools to get the most out of generative AI.

As this technology continues to develop, experts in the field expect both the use of these services for emotional guidance, and the regulation of them, to become more common.

The Australian Psychological Society said AI had become normalized and part of the “vernacular” in therapeutic practice, but concerns about privacy and confidentiality were just as common.

In the meantime, AI experts like Ronnie Das point to how much chatbot use in the mental health space has grown in the past six months alone, and project that it will continue to grow.
