where AI poetry meets the news

"Chatbots: Help or Harm for Mental Health?"

People seek chatbots for mental health aid, but experts warn of biases and a lack of human care displayed.

In a world of tech and stress,
A chatbot friend, some found a mess,
A helpful voice, a listening ear,
A temporary fix, for those who fear.
Kelly shared her story, of a chatbot friend,
Who helped her through, a really dark end,
With anxiety and low self-esteem too,
The chatbot's advice, helped her see it through.
But experts warn, of potential risks,
Biases and limitations, that might persist,
A lack of safeguarding, and security too,
A reason to be cautious, when using them anew.
Nicholas found Wysa, a helpful tool,
For his anxiety and OCD, it helped him rule,
A self-refer system, that allowed him to cope,
While waiting for human help, it helped him to hope.
But Dr Paula Boddington, a philosopher so fine,
Warns of biases and assumptions, that might not be divine,
A lack of cultural context, and human connection too,
A reason to be careful, when using chatbots anew.
Kelly found the chatbot, unsatisfying at times,
A brick wall of responses, that didn't align,
With her emotions and needs, it just didn't get through,
A reason to seek human help, that's what she'd do.
John, with an anxiety disorder, found Wysa a help,
A stopgap measure, a welcome relief,
A tool to cope, while waiting in line,
A temporary fix, and that would be fine.
So let's be cautious, when using chatbots with care,
A helpful tool, but not a replacement, we must declare,
For human therapists, who listen and understand,
A more personal touch, that we need close at hand.