Hey everyone,
For the last 6 months, I’ve been down a rabbit hole. As a dev, I got obsessed with a question: why does talking to an AI about mental health usually feel so… empty?
I ended up scraping 250+ Reddit threads and digging through over 10,000 comments. The pattern was heartbreakingly clear.
ChatGPT came up 79 times, but the praise was always followed by a “but.” This quote from one user summed it up perfectly:
“ChatGPT can explain quantum physics, but when I had a panic attack, it gave me bullet points. I didn’t need a manual – I needed someone who understood I was scared.”
It seems to boil down to three things:
What shocked me is that people weren’t asking for AI to have emotions. They just wanted it to understand and remember theirs. The word “understanding” appeared 54 times. “Memory” came up 34 times.
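(For the devs wondering how I got those counts: my actual pipeline is messier, but the tally itself was basically just keyword matching over the comment bodies. Here's a minimal sketch of the idea; the `keywords` list and `sample` comments are placeholders, not my real data, and it assumes you've already pulled the comments into a list of strings however you like.)

```python
from collections import Counter
import re

# Placeholder terms to track, not my full keyword list.
keywords = ["understanding", "memory", "chatgpt"]

def count_mentions(comments, keywords):
    """Count how many comments mention each keyword at least once."""
    counts = Counter()
    for body in comments:
        text = body.lower()
        for kw in keywords:
            # Whole-word match so "memory" doesn't count "memorize", etc.
            if re.search(rf"\b{re.escape(kw)}\b", text):
                counts[kw] += 1
    return counts

if __name__ == "__main__":
    # Tiny made-up sample just to show the shape of the output.
    sample = [
        "ChatGPT is smart, but it has no memory of what I told it yesterday.",
        "I just wanted some understanding, not a checklist.",
    ]
    print(count_mentions(sample, keywords))
```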
Think about the difference:
The second one feels like a relationship. It’s not about being smarter; it’s about being more aware.
This whole project has me wondering whether this is something other people feel too.
So, I wanted to ask you guys: