
"I underwent psychoanalysis with ChatGPT"

Psychoanalysis and artificial intelligence: An unusual relationship gains traction on TikTok


“I underwent psychoanalysis with ChatGPT”: this is the introduction chosen by the creator Vera Verace for one of her latest TikTok videos. In the video, she decided to take a famous psychological test, the “Rorschach inkblot test”, used for personality assessment. It is crucial to emphasize, however, that every test and experiment in this field should be interpreted by a professional. In this case, Vera asked the artificial intelligence to play the role of a psychotherapist and analyze the sentences she associated with the iconic images. “The results were really surprising; ChatGPT managed to interpret every element I saw in the inkblots to describe personal characteristics with which I strongly identified.” Verace's conclusion raises a question: is it possible to undergo psychoanalysis with AI, without the intervention of an expert? Like two sides of the same coin, there are both negative and positive aspects, but let's proceed in order.


Definition of Psychoanalysis under the Expert's Lens

Before delving into the recent relationship between psychology and artificial intelligence, it is important to explain what “psychoanalysis” is. Psychologist Gaia Cavalleri invites us to reflect on the use of certain words: “There is often much confusion about it. Psychoanalysis is one of the psychotherapeutic orientations; not all professional psychologists are, therefore, psychoanalysts. It is necessary to attend a psychotherapy school with this orientation, based on the theories of Sigmund Freud and his successors. To psychoanalyze means to conduct an in-depth analysis of a person's psyche, thoughts, emotions, and behaviors in order to understand the deep motives behind certain attitudes or potential psychopathological disorders. Psychoanalysis requires the use of specific techniques, such as free association and dream interpretation, through which the psychoanalyst seeks to bring unconscious thoughts and feelings to light. The goal of these practices is to promote personal growth and overcome potential blocks.”

Why Self-Psychoanalysis is Not Possible

@madeofmillions With the rise of AI, we’ve seen lots of people online talking about using chatbots as a replacement for therapy. But while chatbots can provide some general helpful feedback, they don’t always get the full picture, and can sometimes cause more harm than good.

Self-psychoanalysis is not possible for the same reason self-therapy is impossible: all the relational mechanisms within which change can occur are missing. Talk of a “self-made” analysis should therefore raise an alarm, as it is both incorrect and risky, especially for those who are unfamiliar with this world and encounter it only while scrolling through social media.

This thesis is supported by the words of cyber-psychologist Olga Armento: “Using AI for these practices is really very dangerous. AI can never replace the work of a psychologist and the relationship they establish with the patient.” As Armento suggests, it is worth looking at the ethical code of Italian psychologists, particularly Article 21: “Psychologists, also through teaching, in every field and at every level, promote psychological knowledge, and share and disseminate psychological culture. However, it constitutes a serious ethical violation to teach people outside the psychological profession the use of methods, techniques, and tools of knowledge and intervention specific to the profession itself. It constitutes an aggravating circumstance if the teaching of methods, techniques, and specific tools of the psychological profession aims to pave the way for possible abusive exercise of the profession.” To conclude this point, the expert raises a question: “So, who gave ChatGPT the authorization to use these tools, i.e., those specific to the profession of psychologist?”

Lights and Shadows of AI-Driven Psychoanalysis

@bbcnews And it’s a lot more popular than the Harry Potter, Beyoncé and Super Mario chatbots on Character.ai.

Despite appearing to be an unexplored field, there are numerous studies and scientific articles on the subject easily accessible online. One example is the chatbot Wysa, a hybrid of AI and industry experts, who train the algorithm to make it more efficient. According to researchers, this alternative is safer than ChatGPT in terms of privacy, as Wysa does not collect email addresses, phone numbers, or real names, and censors information shared by users that could help identify them. The application, which provides cognitive-behavioral therapy for anxiety and chronic disorders, has received the “Breakthrough Device” designation from the US Food and Drug Administration. It can be used as a standalone tool or integrated into traditional therapy, where psychotherapists can monitor their patients' progress between sessions, for example by evaluating performance in cognitive restructuring exercises.

“However, this system is not intended to replace psychologists or human support. It is a new way to receive support,” revealed Smriti Joshi, chief psychologist of the company behind the project. Cavalleri also highlighted some positive aspects of this unusual collaboration: “It can be potentially useful as a ‘virtual assistant’ to a psychologist, as it offers a 24-hour service: the patient can use it to express emotional states at a specific moment or to work on tasks assigned by the therapist, allowing for better monitoring (Eshghie and Eshghie, 2023).”

@crystaldbright Having access to this has actually improved my mental health. I understand some people may think speaking with a feeling human would be beneficial but as an analytical and impatient person I would prefer AI who is never insensitive or opinionated to give me direct advice and conversation about my worries. AI which has access to the data of a million doctors, mental health studies, best practices etc seems like a safer bet to me.

However, one must also consider the limitations, as it is still an artificial intelligence. Foremost among them is the lack of empathy, the ability to put oneself in another's shoes and understand their internal states, which is a fundamental pillar of good therapy (Ray, 2023). It cannot truly interpret what the patient says, so even a small misunderstanding or a piece of misleading information could lead to serious consequences.

According to the study by Dergaa et al. (2023), the quality of the content generated by ChatGPT falls far short of serving as a guide for users and professionals or providing accurate information on mental health. It is therefore premature to draw conclusions about the usefulness and safety of ChatGPT in mental health practice. To navigate these waters effectively, Armento suggests education on cyber issues in schools and for families, something we currently lack. What do you think? Would you ever trust a chatbot to investigate something as important as your own inner self, or to address personal problems?