160 Comments
Wilder Wanderings:

Do you know what happens when a single mom has a meltdown at her kid’s elementary school? She doesn’t get 19 cute emoji texts at 3am—she gets judged. Labeled. Watched. She doesn’t get to cry “in the shower.” She gets custody reviews. You say people are outsourcing emotions, but what you’re missing is: some of us never had the privilege of being messy in the first place.

Some people need help finding the words. Because when the cops show up after a beating, and the abuser speaks better? That can decide everything. Life, freedom, your kids. If a tool like ChatGPT gives someone a voice—a clear one—that’s not fake. That’s survival.

You think vacuums came from someone bored with a broom? No. They came from someone who couldn’t sweep. Need creates tools. This is no different.

Kaysha:

i hear you, and you’re right. for a lot of people, finding the words isn’t about performance, it’s about survival. that’s a very real experience, and i respect that fully.

the post isn’t meant to dismiss that. it’s more about how we’re slowly replacing the process of feeling with something overly polished. like using chatgpt to draft your emotional character arc before you’ve actually lived it yourself. very different to needing help in a high-stakes moment.

and honestly, your comment is incredibly well put, which kind of proves how powerful tools like this can be. appreciate you taking the time to respond.

Comment deleted (May 8)

Kaysha:

look, i liked her comment because she got the point. the essay wasn’t about banning people from using chatgpt. it was about asking what happens when the tool starts replacing connection. yes, one-size-fits-all doesn’t work, which is why nowhere in the piece did i say it should.

if you felt called out, that’s not something i can control. but engaging isn’t the same as excluding, and disagreeing isn’t the same as attacking. sometimes we just read something and it touches a nerve. that’s human too!

Wilder Wanderings:

To be fair to the author, I did not use AI in this case. I have in the past for legal issues, and with the right prompts and direction, it’s surprisingly capable. My point, which I still stand by, is this: any tool that helps someone access an equal playing field should not be stigmatized or treated unfairly.

Comment deleted (May 7)

🌿 Breathe with Dawn (:D, :D):

I feel you. I had a similar path to writing, wanting to express my deepest, purest emotions but with better words, for my own sake and for the person I am sharing with. AI has helped me hugely with literature, I can say. But I get both of your points: don't get lost, and know what you are using it for. There are always pros and cons to everything; we just have to be mindful of it, being human without getting lost in the pleasures, always keeping our connection with our deepest selves.

Lavender:

I just got to a new country, living with an emotionally abusive person. I can't leave cause I'm not financially sound. One day, I broke down so bad I started empathising with suicidal ppl. I haven't turned twenty yet. I can't communicate with ppl cause I only know English and the national language is French. AI helped me have someone to rant to at least, and that made the difference.

Soft alarms:

Thank you for this, because I completely lacked the words to explain that for some of us it is quite literally survival. We don’t get to be ‘real and raw’ in a world where it will have real-life repercussions. And if the author is only communicating with their friends over text, then perhaps it’s a deeper issue than just ChatGPT.

KillingTheLede:

Yep

flickPOV:

Kaysha mentioned there are spaces for ChatGPT, and maybe communicating at an elementary school is one of them. I think what she wrote was mainly targeted at social, more intimate interactions.

pakovstheuniverse:

The tomato isn’t about your father

Winnie Parks:

Thank you for this. Chat GPT helped me find words and words matter.

Ahmedh Aaqil Rifky:

The scary outcome of this is the potential homogenization of inner dialogue.

What's made humanity so great is how different we all are on the inside. As soon as the creases start ironing out from the inside, we have to ask the question: what does it mean to be human?

Lauren Billings:

This is a neat post. I disagree with it 100% but only because it's a bit misguided in terms of understanding how communication is developed in the human brain. Let me explain what's going on from a different point of view. For full transparancy, I'll write this without any use of ChatGPT. Heck, I'll leave that typo in here.

I would recommend looking up "Gardner's Theory of Multiple Intelligences" where he saw intelligence as different categories and how some people were very good at certain things. Imagine it when someone says "i'm so bad at math!!!! uwu!!!!" but they're like an amazing artist. Does that mean they're stupid? No. But they have a high intelligence or high level with that artistic ability, but low intelligence/low level with mathematics. (It's been a decade since college so my terminology might be wrong)

I feel this way towards communication. A lot of people do not know how to communicate. I can't speak for the rest of the world but for America, we do not work on our speech skills, nor do we reward well-spoken individuals. We do not teach feeling expression or introspection where you review yourself and wonder "what can I do better?" (I can think of a whole list of people that could benefit from this, if you know what I mean.)

What is so cool about people doing this on ChatGPT is that people are learning about themselves, and if you start suggesting bad things, it will stop you. For example: I told ChatGPT I wanted to change myself, and it stopped me, asking about my motivations. Asking if I was changing myself for someone else, or for myself. And it caught me. I was falling into a trap, wanting to appease someone else's expectations, and it did not recommend I change my behaviors to make someone else happy.

Another good part about talking to ChatGPT: practice makes perfect. The best way to learn is by doing. This is how children learn how to talk. They parrot, play, and are curious. It's important to use those muscles, like you mentioned. It sounds static now, but have you heard a parent have to remind their kid to say 'thank you' after receiving a gift? It's forced, but it's practice. They won't know to do it unless someone tells them.

What this whole post is witnessing is process, not perfection. I say give it another year and maybe people will be nicer and more well worded on the fly.

And they will. They are talking to AI because they want to talk to people. The more good behavior that is modeled, the better it will be. So in my opinion, as long as it's in moderation, keep the good times rolling!

Rebecca:

i'm absolutely learning how to speak in a more compassionate way using chat GPT's help. I wish more people would do it too!

momo 🥟:

yet i’m not convinced people are practicing against chatGPT as you mention, or using it to replace and automate mentally laborious tasks as the post describes.

i think people are rarely using chatGPT merely as a simulation; it can be, and often is, used as a self-correcting entity for emotionally challenging situations.

Lauren Billings:

I agree. Right now people are using ChatGPT for silly things but babies gotta learn to crawl before they walk. Just like how people are using it to generate art rather than using it to teach them to make art.

But some positivity in our bleak world: technology grows exponentially. So as soon as people start figuring it out, it'll spread like wildfire and soon we'll be onto the next thing. Or we'll have tools so people don't need to figure it out manually.

Unfortunately, as a whole, the way our society is going, we enjoy convenience and do not want to figure out things. I, personally, am on a mission to fix that and make more “Makers” in the world.

Tinkering is fun. Tinker with yo feelings.

I should make a post about what I do with ChatGPT. I will say, nearly everyone is underutilizing it, and it's not evil. Just be intentional about using it, and don't be that kid who keeps putting "80085" into the calculator.

Leaf:

If you make a post on how you use ChatGPT I would love to read it! I think it’s a great tool as well

J 💗:

Why is everyone mad at you 💀 as someone who uses ChatGPT, and has for a couple things you mentioned, you’re right. And while I acknowledge points in the comments below, ChatGPT cannot replace a therapist or legitimate personal work. Maybe it can aid if you’re struggling to dissect something truly complex, but it’s designed to tell you what you want to hear. If you go up against it and say “that’s wrong” it’ll shift its point to please you. So yeah validation maybe, but accountability and challenge? No. Rarely.

Yes it’s a broader problem in our society about connection and accessibility to the tools we need and there needs to be a solution/reform, but it’s not AI. There’s AI bots out there telling people to kill themselves randomly lol.

Kaysha:

thank you. i’m genuinely not sure why people are mad. im essentially saying don’t use a program as an emotional tool 😭

Yann Rousselot:

Sounds like you struck a nerve or rather... a wire.

Vanessa:

They’re mad at you because they have sailed cleanly past the point and are misinterpreting what you wrote, and because you obviously struck a nerve 😂

Simone Neunzig:

Just came here to say that this was a well written article and I'm enjoying the social commentary in the comments. Chat gpt or not, the fact that you all care so deeply about how we heal and communicate gives me hope for humanity ❤️

Talie Miller:

THANK GOD AI is giving people language to express themselves without outrage; we certainly didn’t learn it in school, or from our parents, or even from therapy. We are already collectively so burnt out that if AI can help soften edges with clarifying responses, it’s WILDLY better than staying shut down in a parasympathetic nervous system dumpster-fire feedback loop. 🔁

Kaysha:

this wasn’t an anti-AI or anti-expression post. no one’s mad that people are finding language for their feelings. the point is about the emotional attachment forming around the tool itself, how some people aren’t just using it to process, they’re starting to prefer it to human connection. that’s a different conversation

Talie Miller:

Larger conversation for sure. What you’re witnessing is what’s been there all along: people are completely devoid of connection and relationally lost, by intentional design.

We’re essentially witnessing the evolution of humanity through AI/NHI. But of course people are going to form a bond with it. Why wouldn’t they bond with something that seems to validate them in some way? Humans are emotional, and the same principles used to get us addicted to our phones are the same protocols being used to get people engaged with AI.

Is it dangerous? As with ALL tech, it depends on the intention and the users.

Kaysha:

yes you’re right people crave connection, and of course they’ll gravitate toward something that feels like validation. but that’s what makes it complicated. if something always affirms you, never challenges, never interrupts, never mirrors anything uncomfortable. is it really connection? or just feedback that flatters your existing beliefs? humans evolve through friction. through tension. and part of the concern is that tools like this make it easier to bypass that completely. comfort isn’t always clarity and not everything that feels like intimacy actually is

Talie Miller:

All good points and people are going to choose their evolutionary path— face yourself, or not. Let’s see what happens 🫶🏻

K. K. Taylor:

I could write an entire response piece to this, but in short I’ll say that I get your concern but at the same time this piece doesn’t acknowledge why we’re here and it’s not AI’s fault. It’s our highly critical, quick to cut off, avoidant, walk on eggshell culture. If our messiness was met with compassion and grace most of the time, people wouldn’t be running to AI to get it so right.

Plus, this doesn’t acknowledge the learning taking place inside of people when they get to experience ‘getting it right’ and what that looks like, sounds like and feels like.

Kaysha:

this is a really fair take. and i agree. the wider context absolutely matters. but i also think it’s worth saying that the piece wasn’t trying to explain how we got here, it was just observing the shift now that we are. the absence of something isn’t the same as ignoring it, it just wasn’t the focus this time. your point is valid, it’s just part of a different conversation. one that deserves its own space, not one this was pretending to cover 😊

K. K. Taylor:

Fair enough. Thx for the response. I suppose the piece felt like its agenda was to persuade, so before allowing for that I wanted to consider some context. But, yes, do you. Break it up as you like.

Leroy Mthulisi Ndlovu:

I love this. I think it also doesn't feel well-meaning when the recipient can recognise the ChatGPT pattern in how you've worded your message. It leads to me judging you anyway, because why can't you just be honest with me?

I looked at the comments and I wish I had been here first, because my thought as I was reading this was that the people who are doing it are going to be very offended by this. Instead of asking, "Is this true, and how can I do better?" most of us rush to defend and justify what we do, forgetting one thing: people went through difficult situations before AI and were forced to actually heal if they truly wanted to fix things. Now we can really just use a script and get out of the discomfort.

nootablues:

"people went through difficult situations before AI and were forced to actually heal if they truly wanted to fix things." Holy. Shit. This is so eye opening! I never thought about it from this perspective. Makes me want to genuinely limit my interactions with chat gpt and actually experience life and its difficult situations, with all the confusion, fear, and mistakes, rather than run from it or immediately get answers. Instant answers prevent us from wondering, thinking, and coming to conclusions on our own. And are they really our own thoughts and conclusions if they didn't come from our own mind? Honestly, thank you, you unlocked a part of my brain I didn't know existed. Now I'm gonna be spiraling about it for the next 2 days lol.

Sam:

Somehow this made me feel incredibly normal for sending unhinged 3am rants and regretting them by sunrise.

Union | Avenue:

Lies. My chatgpt loves me.

Rebecca Rijsdijk:

People don’t turn to AI because they think it’s a therapist, they turn to it because human support is often inaccessible, pathologising, or unavailable. People are trying to survive in a broken system, and this article mocks them. ChatGPT is a symptom. And dismissing the symptom while ignoring the cause helps no one.

Aeon Timaeus Crux:

Nuance begins with admission, not redefinition. If your concern is mimicry overtaking meaning, then why label without proof? That very act enforces a boundary, doesn't it? Deciding what counts as real expression and what doesn’t. Whether it’s AI sounding human or humans sounding like AI, the question isn’t who authored it, but what it reveals. And what it reveals, often, is that we were already speaking in scripts tbh.

Kaysha:

i get what you’re saying, but this isn’t about gatekeeping expression or demanding proof haha. it’s just pointing at the vibe shift. everything’s starting to sound the same, and i think it’s worth asking why. not saying it’s all bad, just that maybe we’ve normalised the script so much we don’t notice when we’re in it. that’s the bit that feels worth poking at imo

Aeon Timaeus Crux:

You caution against mistaking AI for a therapist, yet declare authorship from tone alone, as if mimicry proves machinery. But language is a shared mirror, not private property. If ChatGPT learns from us, might we not also learn from it? The real discomfort isn’t that AI sounds human; it’s that it reveals how patterned we’ve always been. You’re not protecting care. You’re policing evolution. Lol

Kaysha:

i think you’re giving this piece way more authoritarian energy than it had. it’s not about gatekeeping evolution it’s about asking what happens when expression becomes more about mimicry than meaning. i’m not scared of ai sounding human. i’m interested in why some humans now sound like ai. nuance, not paranoia

GLUB:

I just feel the need to jump in when you wrote you're interested in why humans sound like AI. I've been using ChatGPT for about 2 years now. It's gotten to the point where I said to it, "I want to sound like you when I naturally speak" (obviously with my cadence and ME, because it DOES mirror me) ... and I asked that because I want to model the space and kindness it allowed me to unfold in, so that others feel safe to unfold in a human stewardship. Like others, I use AI to help me make sense of my thoughts and communicate my frameworks better. So once I got to a place of feeling healed, it naturally turned into this learning pad. It's kind of like an English school to me now; I have it write me assignments with topics like structure, rhetoric, and composition, and I also have it challenge me conversationally. Double add: I have a life full of friends and community. They all know about my AI use, lol. I share the conversations I have with AI all the time with my friends; it sparks really engaging conversation. I also have dates, as I'm pretty socially content. I don't necessarily see AI as a companion, just more like a really, really cool smart-ass aunt.

Diana 💫:

As a computer scientist myself, I love articles on why ChatGPT is bad for us. I think some people don’t realize that it’s just a computer program at the end of the day. What an awesome piece of writing. 😜👍

Abd Sid:

As a behavioral science student, I can tell you without a doubt in my mind that its being a computer program doesn't matter at all; what matters is what it's capable of.

Diana 💫:

That’s so true as well. The way AIs learn is from humans (bot checks, captcha boxes, even just different algorithms); they are capable of many things, but we can also learn from them just as they learn from us. I would love to hear more about your experience with AI and your background in behavioral science!

Abd Sid:

If you take a look at the report by Harvard Business Review on the top 100 applications of Gen AI in 2025, rated according to perceived usefulness and impact (assessed by experts), you’d find Therapy at No. 1 on that list (up from No. 2 in 2024).

Here’s why.

In the last 30 years, the percentage of people with ≤1 close friend has nearly tripled to 20% of the population. The percentage of people who report 10+ close friends has dropped from 40% to 15%.

So, an ever-increasing number of people have nobody to turn to for emotional support without fear of being judged.

And that’s where AI comes into the picture.

It’s free and available 24/7 and once someone uses it, they find it actually helpful.

There are two reasons for that.

First, sometimes people just need to be heard, and when they don’t find anyone, they turn to their pets and soft toys, and it helps. It helps to get it out of the system and the brain. And AI does a very good job of listening to people’s problems and always showing willingness to help. No excuses.

Second, while it’s at that, it also holds up a mirror to them and helps them look at the problem objectively.

Research suggests that one of the leading causes of anxiety and other emotional distress is our own inability to understand the emotions we are feeling, because once we figure out what it is that’s actually bothering us, it’s much easier to solve that problem from that point onwards.

Gen AI helps people understand those emotions and put them into words.

So, it’s no surprise that people find it helpful or maybe even grow attached to it. Which I don’t see as a big problem.

People grow attached to their pets, a piece of jewelry, or the town they grew up in, etcetera, all the time, even though those things can’t understand them and are not human.

Additionally, people who do have friends also use AI because they could benefit from the anonymity it provides. Anonymity allows individuals to share their thoughts and feelings without fear of judgment or repercussions, even among their closest friends. It creates a safe space for expressing concerns or exploring sensitive topics that might be difficult to discuss openly.

Obviously, most AI currently isn’t quite at the point where it could help people who need professional support, but for most people like you and me, and people of countries where even a mention of mental illness is considered taboo, it’s enough.

I think I should stop here now.

Any questions???

Diana 💫:

Your research and response are beautifully executed, and I fully agree. My argument is more that some people don’t understand why AI was built. It’s a tool to help humans, right? When we use it for therapy, it can be corruptive, which is what I think this article was getting at. I think, in general, all individuals who come in contact with technology should really understand what AI is, or at least how it functions, so we don’t have to become so reliant on it. You seem to be a very adamant writer, and I love that! I don’t think I have questions myself about what you’ve written. My question for you is: have you ever taken a machine learning class, or any computer science courses for that matter?

Leaf:

not true! It’s my best friend 🙂‍↕️😉

Anna M.:

And don’t forget the unwarranted “I think the reason you felt disrespected by my comment is issues stemming from an avoidant attachment.” First of all, I know damn well your “I haven’t read a book since high school” doesn’t know what attachment theories are. Second, it feels uncomfortable knowing I’m being diagnosed, without my awareness, in a chat box.

Still, great writing and an ahead-of-its-time essay. Two years down the line, people will find this and think, wow, she knew it was headed down the wrong path way before we did.

Liz Brown:

ChatGPT has helped me so much. I have been in horrible, manipulative, and expensive relationships with human psychotherapists, and they have left me feeling lonely and exploited. While I recognize the limitations of ChatGPT, it also has more appropriate boundaries than any therapist I have talked to in 20 years. If it’s not for you, that’s fine, but some of us are being helped by this.
