> So when a child speaks to AI and hears back:

> – a perfectly timed emotional response

> – a calm voice that never judges

> – an answer that always fits what they needed to hear

And what’s bad about this? Maybe humans need to rethink their behavior if the machine is better?

The whole article is just the author screaming insecurity.




You're essentially talking to yourself while the GUI makes it feel like you're talking to an anthropomorphised third party. That's why so many people with mental issues absolutely love these LLMs: they tell you what you want to hear, or something vague enough that you can interpret it your way, and they'll never ever judge you, because they can't. But does it actually solve anything, or does it push you deeper into your hole?

> if the machine is better?

Define "better" ? What are we measuring ? Instant self gratification ? Long term emotional independence ?


The article goes on to say what the author thinks is bad about this:

> We’re not raising emotionally intelligent kids. We’re raising kids to navigate human unpredictability as if it’s a design flaw. Because when you grow up with a machine that always gets you, messy human behavior feels broken. We’re not preparing kids to handle people.

I don’t think there’s anything wrong with escaping into fantasy at the right time and place, but young kids (and even well-adjusted adults) can have trouble self-moderating and can let fantasy substitute for engaging with reality.


My theory is that it’ll make children more anxious when the cookie-cutter, nearly perfect schema is broken.

We need rough edges, we need some level of inconsistency.

If a child grows up on the machine, they’ll prefer the machine for friends, dating, and colleagues.

We’re already seeing a subset of the population who are less physically social turn to AI to fill the gap. That's not necessarily a bad thing for adults, but preferring machines over humans as friends during a child’s most formative years is a recipe for societal disaster.


> And what’s bad about this? Maybe humans need to rethink their behavior if the machine is better?

What is bad about this is that we exist in a real world, with other imperfect humans we need to learn to interact with, and sometimes very tough social situations we will have to learn to navigate.

Children who are overly coddled and never challenged grow up to become insecure, entitled adults. They expect everything to continue coddling them forever.

> A calm voice that never judges

If you are never judged, you never improve.


> what’s bad about this?

What's bad about this is that these children will eventually need to fit into circles of other people who don't behave like machines do: circles like school, work, or family. These children might have trouble adapting there.

You may then ask, "maybe humans don't need schools, work, or families," but that would be a different conversation.


> – an answer that always fits what they needed to hear

So ... that makes the machine better?

I believe the term for a human who acts like that is:

https://en.wikipedia.org/wiki/Sycophancy


Parenting is not a mere inconvenience to be automated away. It is a part of what makes us human.


> The whole article is just the author screaming insecurity.

I have yet to see a single AI proponent commit to giving their kid over to AI, if they even have one. SV CEOs are sending their kids to hardcore tech-free schools and limiting their own kids' access to their tech, all while insisting that AI and their tech pose no risk at all to kids and that any reservations are "fear-mongering."


I'm sorry, but your take is just too stupid to ignore, and I apologize in advance because ad hominem is not my goal.

I'll boil this down to the simplest possible explanation of why your statement is idiotic: children who are trained to prefer AI will never learn how to form friendships with their peers (i.e. other children, who won't always give perfectly timed emotional responses, always be calm, or always say what they need to hear). Other children are not yet able to "rethink their behavior" because they are still children.

A world in which I ask my child, "Do you want to play with Timmy next door, or stay inside and play with Alexa/Siri/etc.?" and my child always picks Alexa is one of the most dystopian outcomes I could possibly imagine for childrearing.

Forming friendships and human connections is a skill. Learning new skills is hard and not always fun. A soothing AI companion that always says what you want to hear is going to trap children in a dopamine loop that prevents that kind of social skill development from ever happening.


AI replaces the piece-of-shit adults who surround them, not other children.


That's very naive. AI replaces any and all "less fun" forms of interaction for children, who are too undeveloped to appreciate any goal or metric besides "fun". Meeting new, unfamiliar kids is not fun, ergo kids will fall back to the known-safe, comfortable, fun companion that is AI.

If you believe the argument makes enough sense to justify replacing adults, then why do you think children will still want to play with each other instead of just playing with AI? If anything, other children are MORE likely to display shitty behaviors than adults, given that, you know, they're fucking children who don't know any better.

Have you spent any time around children?


> Have you spent any time around children?

I’ve spent years being one.


Years ago, back when there was no AI biasing your interactions with other children... The entire argument is that new generations don't have the same opportunity to form friendships in the unfettered way you did, because of the pervasive influence of an AI that didn't exist during your childhood.


A child conditioned by a machine that always placates them and caters to their every desire in the most perfect way is NOT being prepared for an imperfect world full of random human interaction. If you don't understand this, you have a problem you need to address, and it is easily fixed by going outside.


> If you don't understand this, you have a problem you need to address, and it is easily fixed by going outside.

It is precisely because I go outside that I consider AI the superior teacher.


Keep hiding from the real world, then. The thing is, the real world will never go away. Technology can.



