It has been 100 days since the war in Gaza began, and it has become increasingly difficult to read the news. My husband suggested it might be time to talk to a therapist. Instead, on a cold winter morning, after fighting back tears over yet another account of human tragedy, I turned to artificial intelligence.
“I’m pretty depressed about the state of the world,” I typed into ChatGPT. “It’s natural to feel overwhelmed,” the chatbot responded, offering a list of practical advice, including limiting media exposure, focusing on the positive and practicing self-care.
I closed the chat. I was sure that I would benefit from doing all this, but at that moment I didn’t feel much better.
It may seem strange that an AI would even try to provide this kind of assistance. But millions of people have already turned to ChatGPT and professional therapy chatbots, which offer convenient and inexpensive mental health support. Even doctors are said to be using AI to draft more empathetic notes to patients.
Some experts say this is a boon. After all, AI may be able to express empathy more openly and tirelessly than humans, unhindered by shame or burnout. “We admire empathetic AI,” a group of psychology researchers wrote recently.
But others are not so sure. Many question the idea that AI can be empathetic at all, and worry about the consequences if people seek emotional support from machines that can only pretend to care. Some even wonder whether the rise of so-called empathetic AI might change the way we think…
Source: www.newscientist.com