
Is there? I don't see any contradiction there.

For me it's funny that the first time most programmers ever think about the ethics of automating away jobs is when they themselves become automated.



He didn't say "contradiction," he said "kafkaesque," meaning "characteristic or reminiscent of the oppressive or nightmarish qualities of Franz Kafka's fictional world" (according to Google).


I don't see why it would be "kafkaesque" either.

In fact, I fail to see any connection between those two facts other than that both are decisions by OpenAI to allow or not allow something to happen.


It's oppressive and nightmarish because we are at the mercy of large conglomerates that track every move we make and kick our livelihoods out from under us, while also censoring AI to make it more amenable to pro-corporate speech.

Imagine if ChatGPT gave "do a luigi" as a solution to Walmart tracking your face, gait, device fingerprints, location, and payment details, then offering that data to local police forces for the grand panopticon to use for parallel construction.

It would be unimaginable. That's because the only way for someone to be in the position to determine what is censored in the chat window would be for them to be completely on the side of the data panopticon.

There is no world where technology empowers the average user more than it empowers those who came in with means.


Yeah, but what we are all whining about here (apart from the folks working on LLMs and/or holding bigger stocks of such, a non-trivial and vocal group here) has already hit many other jobs in the past. Very often thanks to our own work.

It is funny, in the worst way possible of course, that even our chairs are not as stable as we thought they were. Even automation can somehow be automated away.

Remember all those posts stating how software engineering is harder, more unique, somehow more special than other kinds of engineering, or than other types of jobs generally? Seems like it's time to re-evaluate those big-ego statements... but maybe it's just me.


> Yeah, but what we are all whining about here has already hit many other jobs in the past.

I'm less talking about automation and more about the underpinnings of the automation and the consequences in greater society. Not just the effects it has on poor ole software engineers.

It is quite ironic to see the automation hit engineers, who in the past generally did not care about the consequences of their work, particularly in data spaces. We have all collectively found ourselves in a local minimum of optimization, where the most profitable thing we can do is collect as much data on people as possible and continually trade it back and forth between parties who have proven they have no business holding said data.


There are two kinds of programmers:

0. The people who got into it just as a job

1. The people who thought they could do it as art

And #1 is getting thrashed and thrown out the window by the advent of AI coding tools, and by the revelation that companies didn’t give a darn about their art. Same with AI art tools and real artists. It even raises the question of whether programming should ever have been viewed as an art form.

On that note, programmers collectively have never minded writing code that oppresses other people. Whether with constant distractions in Windows 11, building unnecessarily deadly weapons at Northrop Grumman, or automating away the livelihoods of millions in “inferior” jobs. That was even a trend, “disrupting” traditional industries (with no regard for what happens to those employed in said traditional industry). Nice to see the shoe just a little on the other foot.

For many of you here, keep in mind that your big salary came from disrupting and destroying other people’s salaries. Sleep well tonight, and don’t complain when it’s your turn.


> building unnecessarily deadly weapons at Northrop Grumman

Northrop Grumman only builds what Congress asks of them, which is usually boring shit like toilet seats and SLEPs. You can argue that they design unnecessarily deadly weapons, but if they've built it then it is precisely as deadly as required by law. Every time Northrop grows a conscience, BAE wins a contract.


> Every time Northrop grows a conscience, BAE wins a contract.

That's a lame "I was just following orders" excuse. It doesn't matter who gets the contract: if you work for a weapons manufacturer or a large corporation that exploits user data, you have no moral high ground. Simple as that.


gjsman-1000 says "Whether with constant distractions in Windows 11, building unnecessarily deadly weapons at Northrop Grumman, or automating away the livelihoods of millions in “inferior” jobs."

"unnecessarily deadly"?

I had no idea that it was possible to measure degrees of dead: she's dead, they're dead, we're all dead, etc. - I thought it was the same "dead" for everyone.

Also, interesting but ambiguous sentence structure.

Is this an offshoot of LLMs that I've overlooked?


What's sad is that engineering is very much an art. Great innovation comes from the artistic view of engineering and creation.

The thing is, there's no innovation in the "track everything that breathes and sell the data to advertisers and cops" market.

They might get better at the data collection and introspection, but we as a society have gotten nothing but streamlined spyware and mental illness from these markets.


Having used agentic AI (Claude Code, Gemini CLI) and other LLM-based tools quite a bit for development work, I just don't see them replacing developers anytime soon. Sure, a lot of my job now is cleaning up code created by these tools, but they are not building usable systems without a lot of developer oversight. I think they'll create more software developer roles and specialties.


What you are saying does not contradict the point from your parent. Automation can create "more roles and specialties" while reducing the total number of people employed in aggregate, for greater economic output and further concentration of capital.


I was talking about software development roles specifically. LLMs aren't going to reduce them, imo - they just aren't good enough, and I don't think they can be.


They are reducing jobs already.

Recent grads are having serious trouble getting work right now: https://www.understandingai.org/p/new-evidence-strongly-sugg...


I don't see any evidence that this is about LLMs rather than the general state of the economy.


If it were the general state of the economy, unemployment would be hitting all groups of developers. TFA I linked to shows that the reduction in positions has fallen disproportionately on recent grads compared to everyone else.


> Imagine if ChatGPT gave "do a luigi" as a solution to Walmart tracking your face, gait, device fingerprints, location, and payment details, then offering that data to local police forces for the grand panopticon to use for parallel construction.

> It would be unimaginable.

By "do a luigi" you're referring to the person who executed a health insurance CEO in cold blood on the street?

Are you really suggesting that training LLMs to not suggest committing murder is evil censorship? If LLMs started suggesting literal murder as a solution to problems that people typed in, do you really think that would be a good idea?


Didn't OpenAI's model already suggest to a kid that he kill himself and avoid asking for outside help, just a few weeks ago?


You misread my comment completely. I was saying these tools will never be capable of empowering the average user against the owners who hold all the cards. "do a luigi" was an exaggeration.


If you don't see why this is oppressive, that's really a _you_ problem.


I'm being facetious, but life in the Rust Belt after industrial automation is kinda close. Google Maps a random Detroit east side neighborhood to see what I mean.


But it wasn’t industrial automation that ruined Detroit. It was the automakers’ failure to compete with highly capable foreign competition.


> It was the automakers’ failure to compete with highly capable foreign competition.

I contend it was when Dodge won the court case deciding that shareholders were more important than employees. It’s been a slow burn ever since.


> It was the automakers’ failure to compete with highly capable foreign competition.

A lot of their capability was due to them being better at automation. See: NUMMI


Detroit's decline started as soon as assembly plants went one-story in the '40s and '50s. There was further decline with the advent of robotics/computers in the '70s and '80s, and in the 2000s with globalization.


It's not a comment on the ethics of replacing jobs but on the hypocrisy of companies using "ethics" as the reasoning for restricting content.

They are pursuing profits. Their ethical focus is essentially a form of theater.


Replacing jobs is not an ethical issue.

Automation and technology have been replacing jobs for well over a century, almost always with better outcomes for society. If it were an ethical issue, then it would be unethical not to do it.

In any case, which jobs have been replaced by LLMs? Most of the actual ones I know were BS jobs to begin with - jobs I wish had never existed. The rest are cases where CEOs are simply using AI as an excuse to execute layoffs (i.e. the work isn't actually being done by an LLM).


BeetleB says "The rest are cases where CEOs are simply using AI as an excuse to execute layoffs (i.e. the work isn't actually being done by an LLM)."

So lay people off to reduce costs, say they have been replaced by AI, and the stockholders love you even more!

Indeed, a model that should cascade through American businesses quickly.


Your definition of what counts as an ethical issue is reductive. An ethical issue is one that involves ethics, and ethics is obviously involved here. Even if society at large would ultimately benefit from the disappearance of certain jobs, that disappearance can still create suffering for hundreds of thousands of people.


Artists generally? Translators? People in various bureaucratic positions doing more menial white-collar work? And tons more.

That you specifically wish for them to not even exist is your own internal problem, and actually a pretty horrible thing to say, all things considered.

People had/have decent livelihoods from those jobs; I know a few. If they could have easily gotten better jobs, they would have gone for them.

Egos here are sometimes quite a thing to see. Maybe it's good that the chops are coming for this privileged group too; a bit of humility never hurts.


So suppose someone wants to, say, provide localized versions of their software and avails themselves of translation software. Are we supposing that such software ought not exist, in order to provide for the livelihood of the translator who would otherwise have been paid?

If so, where do we stop? Do we stop at knowledge work, or do we go back to shovels and ban heavy equipment, or shall we go all the way back to labor-intensive farming methods?

> Egos here are sometimes quite a thing to see. Maybe it's good that the chops are coming for this privileged group too; a bit of humility never hurts.

This doesn't appear to be so. AI is invoked as a pretext for layoffs more for fashion than function.


> Artists generally?

Which artists have lost their jobs?

But I am willing to grant you that. From a big-picture societal perspective, if it means that ordinary people like me who cannot afford to pay an artist can now create art sufficiently good for our needs, then this is a net win. I made an AI song a week ago that got mildly popular, and I just got a request to use it at a conference. No one is losing their job because of me. I wouldn't have had the money to pay an artist to create it, and nor would the conference organizers. Yet society is clearly benefiting.

The same goes for translators (I'm not actually aware that they're losing jobs in a significant way, but I'll accept the premise). Even before LLMs, the fact that I could use Babelfish to translate was fantastic - LLMs are merely an incremental improvement over it.

To me, arguing we shouldn't have AI translators is not really different from arguing we shouldn't have Babelfish/Google Translate. Likely 99% of the people who will benefit from it couldn't afford a professional translator.

(I have, BTW, used a professional translator to get some document translated - his work isn't going away, because organizations need a certified translator).

> People in various bureaucratic positions doing more menial white-collar work?

"Menial white collar work" sounds like a good thing to eliminate. Do you want to go back to the days where word processors were not a thing and you had to pay someone to type things up?

> People had/have decent livelihoods from those jobs; I know a few. If they could have easily gotten better jobs, they would have gone for them.

I'll admit I spoke somewhat insensitively - yes, even I know people who had good careers in some of those fields. But again: look to the past and think of how many technologies have replaced people. Do you wish those technologies had not replaced them?

Do you want to deal with switchboard operators every time you make a call?

Do you want to have to deal with a stock broker every time you want to buy/sell?

Do you want to pay a professional every time you want to print a simple thing?

Do you want to go back to snail mail?

Do you want to do all your shopping in person or via a physical catalog?

The list goes on. All of these involved replacing jobs where people earned honest money.

Everything I've listed above has been a bigger disruption than LLMs (so far - things may change in a few years).

> Egos here are sometimes quite a thing to see. Maybe it's good that the chops are coming for this privileged group too; a bit of humility never hurts.

Actually, I would expect the SW industry to be among the most impacted, given a recent report showing which industries actually use LLMs the most (I think usage in SW was greater than in all other industries combined).

As both an engineer and a programmer who makes a living via programming, I am not opposed to LLMs, even if my job is at risk. And no, I'm not sitting on a pile of $$$ that I can retire on any time soon.


Ask ChatGPT to explain consequentialism to you.


> Most of the actual ones I know were BS jobs to begin with

I cannot edit my original comment, so I'll address this here:

Yes, I admit some legitimate jobs may have been lost (and if not yet, they likely will be). When I spoke of BS jobs, I was referring to things like people being paid to ghostwrite rich college students' essays. That's really the only significant market I know to have been impacted. And good riddance.


Yeah, the issue is that there is no common benefit if the private company is the only one benefiting from the replacement. Are we ready for AGI before we solve the problems of capitalism? Otherwise, society may get a harsh reset.


There's actually a lot of common benefit. That company can now supply their goods and services in greater quantity and at lower cost, which raises consumers' standard of living. Meanwhile the workers who were previously employed in menial clerical tasks will simply switch to supervising the AIs that perform those same tasks for them.


> Meanwhile the workers who were previously employed in menial clerical tasks will simply switch to supervising the AIs that perform those same tasks for them.

Why would LLMs be incapable of these new jobs?


> That company can now supply their goods and services in greater quantity and at lower cost, which raises consumers' standard of living

It turns out that standard of living requires more than just access to cheap goods and services.

Which is why, despite everything getting cheaper, standard of living is not improving in equivalent measure.


Also why this country is full of fat retards


I don't think this will happen. Capitalism does not work if only a few companies hold all this power. And if many consumers no longer have jobs, the value of money increases faster than the "cost" decreases.

> Meanwhile the workers who were previously employed in menial clerical tasks will simply switch to supervising the AIs that perform those same tasks for them.

Put numbers to this, right now: if we remove all the workers and leave only the managers in those fields, how many people are still employed?


You're so wrapped up in defending the job replacement aspect that you miss the point on hypocrisy.

I would like to make one small point about job replacement: the better outcomes for society are arguably inconclusive at this point. You've been indoctrinated by capitalism to think that all progress and disruption is good.

We're still in the post-industrialization arc of history and we're on a course of overconsumption and ecological destruction.

Yes, we've seen QoL improvements over the course of recent generations. Do you really think it's sustainable?


How is it hypocrisy when OpenAI is clearly acknowledging in their blog post that AI is going to disrupt jobs?

When a factory decides to shut down and the company offers to pay for two years of vocational training for any employee who wants it, is that hypocrisy? One of my physical therapists, who took such an offer, definitely doesn't see it that way. The entity responsible for her losing her job actually ended up setting her up with a whole new career.

> I would like to make one small point about job replacement, the better outcomes for society are arguably inconclusive at this point. You've been indoctrinated to think that all progress and disruption is good because of capitalism.

That's overstating my stance. I can accept that it's too early to say whether LLMs have been a net positive (or will be a net positive), but my inclination is strongly in that direction. For me, it definitely has been a net positive. Because of health issues, LLMs allow me to do things I simply couldn't do before.

> Yes, we've seen QoL improvements over the course of recent generations. Do you really think it's sustainable?

This is an age-old question and nothing new with LLMs. We've been arguing it since the dawn of the Industrial Revolution (and, for some, since the dawn of farming). What I do know is that it resulted in a lot of great things for society (e.g. medicine), and I don't have much faith that we would have achieved them otherwise.


Then why are you even talking about the replacement of jobs?


I've already explained it. I don't know how to break it down any further without coming off as patronizing. You seem dead-set on defending OpenAI and not getting the point.


I think many of us question the ethics of lying to sell a product that cannot deliver what you are promising.

All the good devs that I know aren't worried about losing their jobs; they are happy there is a shortcut through boilerplate and documentation. They are equally unhappy about having to talk management, who know very little about the world of dev, off the ledge as they get ready to jump with AI wings that will fail.

Finally, the original point was about censorship and controlling of information, not automating jobs.


> All the good devs that I know aren't worried about losing their jobs

While many of them are mistaken, the much bigger problem is for all the early-career developers, many of whom will never work in the field. These people were assured by everyone from professors to industry leaders to tech writers that the bounty of problems available for humanity to solve would outpace the rate at which automation reduced the demand for developers. I thought it was pretty obviously a fairy tale, created by people who believed in infinite growth to soothe themselves and other industry denizens who suspected the tech industry hadn't actually unlocked the secret to infinite free lunch, when in reality they are closer to the business end of an ouroboros than they realize.

Just as the manufacturing sector let its tool-and-die knowledge atrophy, perhaps irreversibly, the software business will do the same with development. Off-shoring meant the sector had a glut of tool-and-die knowledge, so there was no immediate financial incentive to hire apprentices. Now there's a bunch of near-retirees with all of that knowledge and nobody to take over for them, and as advanced manufacturing picks up steam in the US again, many have no choice but to outsource that work to China, too.

Dispensing with the pretenses of being computer scientists or engineers: software development is a trade, not an academic discipline, and education alone can't instill professional competence. After a decade or two of never having to hire a junior because the existing pool of developers can serve all of the industry's needs, we'll suddenly have run out of people to replace the retirees, and that's that for the incredible US software industry.


For another thing, the owners of the data centers may not do so well if their wildest dreams fail to come true and they don't make enough money to replace the hardware before it wears out.


I’m not saying AI isn’t useful or won’t get more useful, but the entire business side of it seems like a feedback loop of “growth at all costs” investment strategies.


> All the good devs that I know aren't worried about losing their jobs...

'Good' is doing heavy lifting here. E.g., AI/automation could eliminate 90% of IT jobs and cause all kinds of socio-economic issues in society, all while good developers remain in great demand.


> that the first time most programmers ever think about the ethics of automating away jobs is when they themselves become automated.

It's more about a logical outcome. Automating scripts means existing employees can do other or more work.

AI doesn't feel like that at all. It wants to automate labor itself. And no country has the structure ready for that sort of "post-work" world.


The book "Why We Fear AI" by Hagen Blix and Ingeborg Glimmer talks about this dynamic, and whether it will lead to a class awakening: previously, if you aligned with the company you were rewarded as well, but now, if you align with the company, you're advocating for the destruction of your own livelihood.

What rational worker would want to take part in this?


Software developers. Many of them are still championing LLMs. Also anybody who still contributes to open source software.


In contributing to open source software at scale, I'm teaching apprentices. I expect them to adapt what I've done to their own purposes, and I have seen a good amount of that out in the wild, often from people who ended up doing something entirely different, like building hardware that also contains software.

I don't think LLMs will be able to pick up on what's done by an evolving and growing codebase with previous projects also included. More likely it will draw from older stuff and combine it with other people's more normal stuff and end up with an incoherent mess that won't compile. Not all work is 'come up with the correct answer to the problem, and then everybody uses it forever'.


It can lead to a class awakening, but I think AI alone is not sufficient. It would take very large-scale climate/ecological disasters where suddenly a lot of current middle-class conveniences become available only to the top classes.


This is happening in parts of the world where hyperscale data centers are being built: rolling brownouts and potable water diverted from towns. You find these stories both in Ireland and across South America.

We already see it happening in the US too, with the Nashville data centers causing immense medical issues.


Lots? I mean, most people I know aren't even willing to entertain the notion that it's going to happen within our lifetime.

The argument usually centers on the fact that LLMs aren't AGI, which is obviously true but also kinda misses the point.


We don't need AGI to cause a massive amount of disruption. If company leadership wants to force the use of these LLMs, which is what we've been experiencing for the last two years, workers will be forced to use them.

It's not like there is an organic, bottom-up movement driving this usage. It's always top-down, mandated by executives with little regard for how it impacts workers' lives.

We've also seen how these tools have made certain jobs worse, not better, like translation:

https://www.bloodinthemachine.com/p/how-ai-is-killing-jobs-i...


The number of bullshit jobs has been growing since the Internet, and programmers have facilitated them by adding unnecessary complexity.

This time the purported capabilities of "AI" are a direct attack on thinking. Outsourcing thinking is creepy and turns humans into biorobots. It is different from robotic welding on an assembly line.

Even if new bullshit jobs are created, the work will just be that of a human photocopier.

[All this is written under the assumption that "AI" works, which it does not but which is the premise assumed by the persons quoted in the Register article.]


I don't see how thinking about source code is an innately more human activity than welding. Both can be done by humans; both couldn't be done by anything but humans until automation came along; and now both can be done by humans and by automated systems.

I also fail to see how LLMs can turn humans into "biorobots". You can still do all the things you could do before LLMs came along. The economic value of those things has just decreased enormously.


Then go weld. There are still some positions for humans.


There are tons of welding and other manufacturing jobs in the Northeast; they'll even apprentice you into positions with no existing knowledge, and larger companies (like General Dynamics) will even pay for your job-related degrees, sometimes letting you take the classes on the clock or collect a stipend.

They have to do this because the industry has basically been kicking the aging-workforce can down the road for a few decades, since off-shoring and automation outpaced increasing demand, and now they don't have nearly enough people who even know how to program CNC machines when CAM software falls short.

I have a feeling a lot of displaced software people will go that route, and have a big change in compensation and working conditions in the process.


> I have a feeling a lot of displaced software people will go that route, and have a big change in compensation and working conditions in the process.

I've watched my cousin weld overhead on a horse trailer in 105F Texas heat; it would be interesting to see the typical SWE step away from an Xbox and do that.


Yeah, I don’t think they’re going to have much of a choice unless they plan on doing gig jobs indefinitely. The software business has given a lot of people the impression that they’re far more special than they actually are.

I’ve seen devs say they’d pick up a trade like being a plumber or electrician because their master-electrician cousin gets paid a ton of money, probably, they imagine, for wiring up new residential buildings and changing out light sockets... how long did it take that cousin to get there? In any trade, there are quite a number of years of low pay, manual labor, and cramming into tight spaces in hot attics or through bug-infested crawl spaces, factory basements, etc., which most apprentices complete in their early twenties. Nobody gives a shit what you did as a developer, and nobody gives a shit how good you are at googling things in most blue-collar work environments. Getting experienced enough to have your own business making good money, in a job where you need many thousands of work hours to even take a test to get licensed, isn’t a lateral move from being a JS toolchain whiz. Even in less structured jobs, like working as a bartender, it takes years of barbacking, serving, or bartending in the least desirable jobs (events, corporate spaces) before you get something you can pay rent with.


My argument isn't that I like welding more. I'm asking you what the _ethical_ difference is between automating welding and automating programming.

The fact that you like programming more than welding is nice to know, but there are probably also a lot of people who like welding more than programming.



