Hacker News | sixQuarks's comments

Again, the question is for how long

Exactly! We’ve been seeing more and more posts like this, saying how AI will never take developer jobs or will never be as good as coders. I think it’s some sort of coping mechanism.

These posts are gonna look really silly in the not too distant future.

I get it, spending countless hours honing your craft and knowing that AI will soon make almost everything you learned useless is very scary.


I'm constantly disappointed by how little I'm able to delegate to AI after the unending promises that I'll be able to delegate nearly 100% of what I do now "in the not too distant future". It's tired impatience and merited skepticism that you mistake for fear and coping. Just because people aren't on the hype train with you doesn't mean they're afraid.

Personally, I am. Many of the unusual skills I have have already been taken over by AI. That's not to say I think I'm in trouble, but it's sad that I can't apply some of the skills I learned just a couple of years ago, like audio editing, because AI does it now. Nor do I want to work as an AI operator, which I find boring and depressing. So I've moved on to something else, but it's still discouraging.

Also, so many people said the same thing about chess when the first chess programs came out. "It will never beat an international master." Then, "it will never beat a grandmaster." And Kasparov said, "it would never beat me or Karpov."

Look where we are today. Can humanity adapt? Yes, probably. But that new world IMO is worse than it is today, rather lacking in dignity I'd say.


I don't acquire skills just to be able to apply them. I use them to solve problems and create things. My learned audio-processing skills exist to get the audio sounding the way I want it to sound. If an AI can do that for me instead, that's amazing and frees up my time to do other things or do a lot more different audio things. None of this is scary to me, nor does it impact my personal dignity. I'm actually constantly wishing that AI could help me do even more. Honestly, I'm not even sure what you mean by AI doing audio editing; can I get some of that? That is some grunt work I don't need more of.

I acquire skills to enjoy applying them, period. I'm less concerned about the final result than about the process of getting there. That's the difference between technical types and artist types, I suppose.

Edit: I also should say, we REALLY should distinguish between tasks that you find enjoyable and tasks you find just drudgery to get where you want to go. For you, audio editing might be a drudgery but for me it's enjoyable. For you, debugging might be fun but I hate it. Etc.

But the point is, if AI takes away everything people find enjoyable, then no one can pick and choose to earn a living on the subset of tasks they find enjoyable, because AI can do everything.

Programmers tend to assume that AI will just take the boring tasks, because high-level software engineering is what they enjoy and is unlikely to be automated, but there's a WHOLE world of people out there who enjoy other tasks that can be automated by AI.


I'm with you, I enjoy the craftsmanship of my trade. I'm not relieved that I may not have to do it in the future, I'm bummed that it feels like something I'm good at, and is/was worth something, is being taken away.

I realize how lucky I am to even have a job that I thoroughly enjoy, do well, and get paid well for. So I'm not going to say "It's not fair!", but ... I'm bummed.


I can't tell whether I'm supposed to be the technical type or the artist type in this analogy. In my music making hobby, I'd like a good AI to help me mix, master, or any number of things under my direction. I'm going to be very particular about every aspect of the beat, but maybe it could suggest some non-boring chord progressions and I'll decide if I like one of them. My goal as an artist is to express myself, and a good AI that can faithfully take directions from me would help.

As a software engineer, I need to solve business problems, and much of this requires code changes, testing, deployments, all that stuff we all know. Again, if a good AI could take on a lot of that work, maybe that means I don't have to sit there in dependency hell and fight arcane missing symbol errors for the rest of my fucking career.


> Again, if a good AI could take on a lot of that work, maybe that means I don't have to sit there in dependency hell and fight arcane missing symbol errors for the rest of my fucking career.

My argument really had nothing to do with you and your hobby. It was that AI is significantly modifying society so that it will be hard for people to make money doing what they like, because AI can do it.

If AI can solve some boring tasks for you, that's fine, but the world doesn't revolve around your job or your hobby. I'm talking about a large mass of people who enjoy doing different things and once made a living doing them, but who are finding it harder to do so because tech companies, leveraging their economies of scale and massive resource pools, have found a way to automate all of it.

You are in a privileged position, no doubt about it. But plenty of people are talented and skilled at a certain sort of creative work whose main thrust can be automated. It's not like your cushy job, where you can automate part of it and become more efficient; these people just won't have a job.

It's amazing how you can be so myopic to only think of yourself and what AI can do for you when you are probably in the top 5% of the world, rather than give one minute to think of what AI is doing to others who don't have the luxuries you have.


Everyone should do the tasks where they provide unique value. You could make the same arguments you just made for recorded music, automobiles, computers in general in fact.

The difference is that AI does it much faster and through far fewer central providers. Speed and magnitude matter too: a crash at 20 km/h is different from a crash at 100 km/h. And those other inventions WERE also harmful. Cars -> global warming.

My point is every invention has pros and cons, and tends to displace people who were very tied to the previous way.

You can still do those tasks, but their market value will drop. Automatable work should always be automated, because we're best off focusing on things that can't be automated yet, and those gain market value. Supply and demand and all that. I do hope we have a collective plan for what to do when everything is automated at some point. Some form of UBI?

What do you mean that AI can do audio editing? I don't think all sound engineers have been replaced.

Yes. I know what you’re referring to, but you can’t ignore the pace of improvement. I think within 2-3 years we will have AI coding that can do anything a senior level coder can do.

Nobody knows what the future holds, including you.

Yes, but we can see current progress and extrapolate into the future. I give it 2-3 years before AI can code as well as a senior-level coder.

Recent benchmarks show that improvements in the latest models are beginning to slow down. What makes you so sure there's another breakthrough coming?

Copium.

People who bet on this bubble have to keep it as big as possible for as long as possible.


That is true, which is why we should be cautious instead of careless.

People are definitely starting to get white savior fatigue these days. Several comments like this in response, whereas in the past you’d seen none.

It’s so stupid. I get the emperor-has-no-clothes vibe from all of this.

You're just noticing?

Most of the stuff on HN isn’t exactly written by super geniuses, especially for blog posts without some kind of analysis.


This post was more about some midwit trying to coin a phrase than a breakdown of future design patterns one could entertain. Clout chasing nonsense.

Which emperor though? Flat design or 3D design?

I do agree with the article that before the iOS 7 flat design rush the barrier to entry for indie app developers was super high because it's damn hard to make the iOS <7 style look good. Flat is easy though. But with AI tools, the old style is suddenly available to lots of people again.


This is such an obvious jump-the-shark moment for OpenAI.

These types of puffery acquisitions, with a former “legend”, announced with such gusto, have never materialized into anything.

You’re not gonna get breakthrough products like this. Breakthrough products just appear unexpectedly, they’re not announced a year or two ahead of time.


You know you are in front of the impending explosion of a bubble when discussions shift from products themselves and towards who will be working with whom.

I don’t think AI is a bubble at all, but OpenAI is.

There is no way civilizations make it past a certain point. It’s so completely obvious, just look at our world. In 2025 we are enabling a genocide while the masses don’t seem to care or even know about it.

You think the people that are having these types of atrocities committed against them would think twice about ending civilization as revenge if given the power? What do you think is going to happen with AI?

If we can’t stop a genocide, why would you think we can stop civilization ending?


But context windows for LLMs include all the “long term memory” things you’re excluding from humans


Long term memory in an LLM is its weights.


Not really, because humans can form long term memories from conversations, but LLM users aren’t finetuning models after every chat so the model remembers.


He's right, but most people don't have the resources, nor indeed the weights themselves, to keep training the models. But the weights are very much long term memory.


> users aren’t finetuning models after every chat

Users can do that if they want, but it’s more effective and more efficient to do that after every billion chats, and I’m sure OpenAI does it.


If you want the entire model to remember everything it talked about with every user, sure. But ideally, I would want the model to remember what I told it a few million tokens ago, but not what you told it (because to me, the model should look like my private copy that only talks to me).


> ideally, I would want the model to remember what I told it a few million tokens ago

Yes, you can keep finetuning your model on every chat you have with it. You can definitely make it remember everything you have ever said. LLMs are excellent at remembering their training data.
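To make the tradeoff in this thread concrete, here is a minimal sketch (plain Python, hypothetical helper names, word count standing in for a real tokenizer) of the context-window alternative to finetuning: without updating the weights, a "private memory" is just the chat history you resend, trimmed to a token budget, so anything older than the window is forgotten.

```python
# Sketch: per-user "memory" via the context window, no finetuning.
# count_tokens is a crude stand-in for a real tokenizer (e.g. BPE).

def count_tokens(text):
    return len(text.split())

def trim_history(history, budget):
    """Keep the most recent messages that fit within the token budget.
    Anything older falls out of the window and is effectively forgotten."""
    kept, used = [], 0
    for msg in reversed(history):          # walk newest-to-oldest
        cost = count_tokens(msg["content"])
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))            # restore chronological order

history = [
    {"role": "user", "content": "my favorite color is teal"},
    {"role": "assistant", "content": "noted"},
    {"role": "user", "content": "please remember that for later"},
]

# With a 6-token budget only the two most recent messages fit, so the
# model never sees the favorite-color message again on the next turn.
window = trim_history(history, budget=6)
```

This is why per-user finetuning and the context window are genuinely different mechanisms: the window forgets by construction once the budget is exceeded, while weights retain training data indefinitely.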


All I see here is coping by developers. You all are so blind to what’s coming.

Let’s not forget, if you relied on hacker news to give you accurate takes on things, you would’ve never bought bitcoin at one cent.


What is coming? AGI that will destroy any notion of property rights or freedom that you ever held? Or something better?


If past human behavior is any indication, way worse.


How terrible is our healthcare system? There should be automatic alerts when a blood test returns dangerous levels like this.


I agree!


Mother Jones used to be an awesome magazine before they went full woke and developed Elon derangement syndrome.


I saw someone calling out Tesla for not producing something called Tesla Roadster since 2017 even though they collected pre-order money for it. Doesn't really sound like a derangement. Any company would get called out for that.

