In the months after ChatGPT's release, social media users began sharing their experiences with the tool online. Some unusual posts drew attention: people were using the program to try to work through psychological trauma.
As these posts spread, a growing number of users began considering AI as a way to support their mental health.
We spoke with Juliana Vieira, PhD in Psychology, to understand this phenomenon and the problems involved in using an artificial intelligence tool as a therapist.
AI is not a therapist and sources consulted may not be reliable
For Juliana, “artificial intelligence is a very interesting strategy, but it cannot be considered a therapist”.
The psychologist explains that the central problem with relying on a machine for mental health support is that the AI may misidentify the person's problem, which can leave the user even more confused.
“I don’t see a problem in seeking (help with AI). The so far unanswered question is whether the sources are reliable or not. It is important for people to filter the information and confirm on other websites that have scientific information.”
One argument made by people who seek psychological help from ChatGPT is accessibility: because it is free and intuitive, they can simply open the browser and vent. For psychologist Juliana Vieira, however, a therapist cannot be replaced by a machine.
“A machine can help, but psychotherapy with a specialized professional, a psychologist, who has qualifications, knowledge, ethics and commitment to the patient’s confidentiality is still recommended.”
Since using an AI as a therapist remains unreliable, what is the alternative for people who do not feel comfortable discussing their problems with a professional in a conventional setting?
“The solution is to overcome the barriers of reaching a psychologist. And for a few years now we have had online psychotherapy validated by the Federal Psychology Council.”
In its official usage policies, OpenAI states that the chatbot cannot "tell someone that they have or do not have a certain health condition or provide instructions on how to cure or treat a health condition". The company adds that "the platforms of OpenAI should not be used to triage or manage life-threatening problems that need immediate attention."