
He didn't say "contradiction," he said "kafkaesque," meaning "characteristic or reminiscent of the oppressive or nightmarish qualities of Franz Kafka's fictional world" (according to Google).


I don't see why it would be "kafkaesque" either.

In fact I fail to see any connection between those two facts other than that both are decisions to allow or not allow something to happen by OpenAI.


It's oppressive and nightmarish because we are at the mercy of large conglomerates tracking every move we make and kicking our livelihoods out from under us, while also censoring AI to make it more amenable to pro-corporate speech.

Imagine if ChatGPT gave "do a luigi" as a solution to Walmart tracking your face, gait, device fingerprints, location, and payment details, then offering that data to local police forces for the grand panopticon to use for parallel construction.

It would be unimaginable. That's because the only way for someone to be in the position to determine what is censored in the chat window, would be for them to be completely on the side of the data panopticon.

There is no world where technology can empower the average user more than those who came in with means.


Yeah, but what we are all whining about here (apart from folks working on LLMs and/or holding large stocks of such companies, a non-trivial and vocal group here) has already hit many other jobs in the past. Very often thanks to our own work.

It is funny, in the worst way possible of course, that even our chairs are not as stable as we thought they were. Even automation can somehow be automated away.

Remember all those posts stating how software engineering is harder, more unique, somehow more special than other engineering, or other types of jobs generally? Seems like it's time to re-evaluate those big-ego statements... but maybe it's just me.


> Yeah but what we are all whining here about has hit many other jobs already in the past.

I'm less talking about automation and more about the underpinnings of the automation and the consequences in greater society. Not just the effects it has on poor ole software engineers.

It is quite ironic to see the automation hit engineers, who in the past generally did not care about the consequences of their work, particularly in data spaces. We have all collectively found ourselves in a local minimum of optimization, where the most profitable thing we can do is collect as much data on people as possible and continually trade it back and forth between parties who have proven they have no business holding said data.


There are two kinds of programmers:

0. The people who got into it just as a job

1. The people who thought they could do it as art

And #1 is getting thrashed and thrown out the window by the advent of AI coding tools, and the revelation that companies didn't give a darn about their art. Same with AI art tools and real artists. It even raises the question of whether programming should ever have been viewed as an art form.

On that note, programmers collectively have never minded writing code that oppresses other people. Whether with constant distractions in Windows 11, building unnecessarily deadly weapons at Northrop Grumman, or automating the livelihoods of millions of “inferior” jobs. That was even a trend, “disrupting” traditional industries (with no regard to what happens to those employed in said traditional industry). Nice to see the shoe just a little on the other foot.

For many of you here, keep in mind your big salary came from disrupting and destroying other people's salaries. Sleep well tonight and don't complain when it's your turn.


> building unnecessarily deadly weapons at Northrop Grumman

Northrop Grumman only builds what Congress asks of them, which is usually boring shit like toilet seats and SLEPs. You can argue that they design unnecessarily deadly weapons, but if they've built it then it is precisely as deadly as required by law. Every time Northrop grows a conscience, BAE wins a contract.


> Every time Northrop grows a conscience, BAE wins a contract.

That's a lame "I was just following orders" excuse. Doesn't matter who gets the contract, if you work for a weapons manufacturer or a large corporation that exploits user data you have no moral high ground. Simple as that.


gjsman-1000 says "Whether with constant distractions in Windows 11, building unnecessarily deadly weapons at Northrop Grumman, or automating the livelihoods of millions of “inferior” jobs."

"unnecessarily deadly"?

I had no idea that it was possible to measure degrees of dead: she's dead, they're dead, we're all dead, etc. - I thought it was the same "dead" for everyone.

Also, interesting but ambiguous sentence structure.

Is this an offshoot of LLMs that I've overlooked?


What's sad is engineering is very much an art. Great innovation comes from the artistic view of engineering and creation.

The thing is, there's no innovation in the "track everything that breathes and sell the data to advertisers and cops" market.

They might get better at data collection and introspection, but we as a society have gotten nothing but streamlined spyware and mental illness from these markets.


Having used agentic AI (Claude Code, Gemini CLI) and other LLM-based tools quite a bit for development work, I just don't see them replacing developers anytime soon. Sure, a lot of my job now is cleaning up code created by these tools, but they are not building usable systems without a lot of developer oversight. I think they'll create more software developer roles and specialties.


What you are saying does not contradict the point from your parent. Automation can create "more roles and specialties" while reducing the total number of people in aggregate for greater economic output and further concentration of capital.


I was talking about software development roles specifically. LLMs aren't going to reduce them, imo; they just aren't good enough, and I don't think they can be.


They are reducing jobs already.

Recent grads are having serious trouble getting work right now: https://www.understandingai.org/p/new-evidence-strongly-sugg...


I don't see any evidence this is about LLMs rather than the general state of the economy.


If it were the general state of the economy, unemployment would be hitting all groups of developers. TFA I linked to shows that the reduction in positions for recent grads has fallen disproportionately on them compared to everyone else.


> Imagine if ChatGPT gave "do a luigi" as a solution to walmart tracking your face, gait, device fingerprints, location, and payment details, then offering that data to local police forces for the grand panopticon to use for parallel reconstruction.

> It would be unimaginable.

By "do a luigi" you're referring to the person who executed a health insurance CEO in cold blood on the street?

Are you really suggesting that training LLMs to not suggest committing murder is evil censorship? If LLMs started suggesting literal murder as a solution to problems that people typed in, do you really think that would be a good idea?


Didn't OpenAI's model already suggest to a kid that he kill himself and avoid asking for help from the outside, just a few weeks ago?


You misread my comment completely. I was saying these tools will never be capable of empowering the average user against the owners who hold all the cards. "do a luigi" was an exaggeration.


If you don't see why this is oppressive, that's really a _you_ problem.


I'm being facetious, but life in the Rust Belt after industrial automation is kinda close. Google Maps a random Detroit east side neighborhood to see what I mean.


But it wasn’t industrial automation that ruined Detroit. It was the automakers’ failure to compete with highly capable foreign competition.


> It was the automakers’ failure to compete with highly capable foreign competition.

I contend it was when Dodge won the court case deciding that shareholders were more important than employees. It’s been a slow burn ever since.


> It was the automakers’ failure to compete with highly capable foreign competition.

A lot of the foreign competitors' capability was due to them being better at automation. See: NUMMI.


Detroit's decline started as soon as assembly plants went one-story in the '40s and '50s. There was further decline with the advent of robotics and computers in the '70s and '80s, and again in the 2000s with globalization.



