Hacker News | quonn's comments

How do you get the idea that we have any of that technology? Brains are nothing like computers, and we have almost no technology that is like biology. There is no reason to be sure that we can just scale down the existing tech to arrive at something as efficient. Maybe we can. But if not, then developing a system that's more like chemistry could take 50 years, or 100, or more.

A bunch of matrix multiplications is not a species.

> You lack that deterrent you get attacked, as simple as that

No you don't, unless you're a dictatorship (including all the examples you gave).


Ukraine would like to have a word with you.

Ukraine was not among the examples given, and you read both the original post and my response wrong. The post did not say "if you have nuclear weapons you don't get attacked". It said "if you don't have nuclear weapons you get attacked". Of course you get attacked by Russia either way, but that's not the point.

The point is that you don't get attacked by anyone else unless you're a dictatorship.


What's wrong with TypeScript for backends? Specifically backends where the load is moderate or happening in the database anyway?

TypeScript is a wildly unpleasant language. If I'm going to be waiting around for a compiler anyway, why not use an easier nominally typed language? I mean, dotnet is right there... and it's got features out of the wazoo. No need to download external libraries when your bloated backend framework ties your shoes for you.

Or if I'm really into structural typing, Go is right there. I'll tell you, it compiles a lot faster. And a lot less type Olympics, too. Easier to read. Less bug-prone. TS isn't a bad language. But it's certainly not the best.
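The structural-typing point can be made concrete. In both TypeScript and Go, a value satisfies an interface by shape alone; no explicit "implements" declaration is needed. A minimal TypeScript sketch (all names here are illustrative, not from any real codebase):

```typescript
// Structural typing: a value satisfies an interface purely by its shape.
// A nominally typed language (C#, Java) would require an explicit
// "implements Logger" declaration instead.
interface Logger {
  log(msg: string): void;
}

// This object never mentions Logger, yet it satisfies the interface
// because its shape matches.
const consoleLogger = {
  log(msg: string): void {
    console.log(`[app] ${msg}`);
  },
};

function run(logger: Logger): void {
  logger.log("started");
}

run(consoleLogger); // type-checks: structural compatibility, not inheritance
```

Go's interfaces work the same way (satisfaction is implicit), which is why the two languages end up interchangeable on this particular axis.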


Indeed, if you're going to use TypeScript, you might as well use Go.

Or Ruby on Rails, Laravel, Elixir, or even WordPress, if you want something batteries-included (which JS frameworks and Go still most definitely are not).


I learned that JavaScript was made for making the web interactive, and I choose to use it that way. There's nothing wrong with using it for backends too; I just prefer other languages for that domain, as it doesn't feel right to build my backend in JS/TS.

For small and moderate projects I don't see issues, but I always build with scaling in mind, and I'd like to keep the bill for that small.


> It's early days and nobody knows how things will go, but to me it looks that in the next century or so

How is it early days? AI has been talked about since at least the 50s, neural networks have been a thing since the 80s.

If you are worried about how technology will be in a century, why stop right here? Why not take the state of computers in the 60s and stop there?

Chances are, if the current wave does not achieve strong AI, then there will be another AI winter, and what people will research in 30 or 40 or 100 years is not something that our current choices can affect.

Therefore the interesting question is what happens short-term, not what happens long-term.


I said that one hundred years from now humans would likely have gone the way of the horse. It will be finished business, not a thing just starting. We may take it with some chill, depending on how we value our species, our descendants, the long human history, and our legacy. It's a very individual thing. I'm not chill.

There's no comparing the AI we have today with what we had 5 years ago. There's a huge qualitative difference: the AI we had five years ago was reliable but uncreative. The one we have now is quite a bit unreliable but creative at a level comparable with a person. To me, it's just a matter of time before we finish putting the two things together--and we have already started. Another AI winter of the sort we had before seems to me highly unlikely.


I think you severely underestimate what the 8 billion human beings on this planet can and will do. They are not like horses at all. They will not allow themselves to be ruled by, like, 10 billionaires operating an AI and furthermore if all work vanishes then we will find other things to do. Just ask a beach bum or a monk or children in school or athletes or students or an artist or the filthy rich. There _are_ ways to spend your time.

You can't just judge humans in terms of economic value, given that the economy is something those humans made for themselves. It's not like there can be an "economy" without humankind.

The only problem is the current state where perhaps _some_ work disappears, creating serious problems for those holding those jobs.

As for being creative: we had GPT-2 more than 5 years ago, and it did produce stories.

And the current AI is nothing like a human being in terms of the quality of its output. Not even close. It's laughable, and to me it seems like ChatGPT specifically is getting worse and worse, while they put more and more lipstick on the pig by making it appear more submissive and produce more emojis.


> How is it early days?

When you have exponential growth, it's always early days.
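The arithmetic behind that claim, as a toy sketch (the numbers are purely illustrative): with a constant doubling period, the most recent doubling contributes as much as all previous periods combined, so every point on the curve looks like "early days" relative to what follows.

```typescript
// Total growth after n doublings, starting from 1 unit.
function totalAfter(doublings: number): number {
  return Math.pow(2, doublings);
}

const doublings = 10;
const now = totalAfter(doublings);                // 1024
const lastStep = now - totalAfter(doublings - 1); // 512

// The last doubling accounts for half of all growth so far,
// no matter how many doublings have already happened.
console.log(lastStep / now); // 0.5
```

The counter-argument in the thread is precisely that AI progress has not followed this curve, in which case the "always early days" framing does not apply.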

Other than that I'm not clear on what you're saying. What is in your mind the difference between how we should plan for the societal impact of AI in the short vs the long term?


Is it early days of exponential growth? Getting AI to beat humans at chess, and then at Go, took a long time; that looks like step-function growth. LLMs have limitations and can't keep doubling for much longer. I'd argue they never did double: just a step, followed by slow linear growth since.

The crowd claiming exponential growth has been at it for not quite a decade now. I have trouble separating fact from AI-company CEOs shilling to attract that VC money. VCs desperately want to solve the expensive-software-engineer problem, and you don't get that cash by claiming AI will be 3% better YoY.


> When you have exponential growth, it's always early days.

Let's take the development of CPUs, where for 30-40 years the observable performance actually did grow exponentially (unlike the current AI boom, where it does not).

Was it always early days? Was it early days for computers in 2005?


In a way, yes. I think the rise of smartphones and the resulting proliferation of cheap mobile processors was ground-shattering in terms of what we consider to be a computer. If I'm reading the numbers correctly, there are about 2B people with their own PC/laptop, while there are about 5B people with their own smartphone. So 60% of people using computing do so on a type of device that didn't exist in 2005.

And this is before we talk about new modalities that might take over in the future, like VR/AR, wearable AI "pins" (which is what I assume Jony Ive is working on), neural interfaces, and maybe even quantum computing. So I personally don't see any issue with thinking of what happened 20 years ago, and what's happening at the moment, as "early days for computers".

This is of course a semantics argument, but in my mind "early days" is a term for the time before things stabilized, and I'm very unclear on what that means for a field that's continuously undergoing rapid growth. As another extreme case - what would you say were the "early days" for life on planet Earth? I assume that most people wouldn't be thinking of just the first 50 years after the first RNA molecules got enclosed in a membrane.


Given how Japan works in general I bet it's the latter. It's a great country to travel and eat alone, for example.

> that they can internally represent many different complex ideas efficiently and coherently

The transformer-circuits work[0] suggests that this representation is not coherent at all.

[0] https://transformer-circuits.pub


I guess that depends on what you think is coherent. A key finding is that the larger the network, the more coherent the representation becomes. One example: larger networks merge the same concept across different languages into a single concept (as humans do). The addition circuits are also fairly easy to interpret.

> merge the same concept

It's doing compression which does not mean it's coherent.

> The addition circuits are also fairly easy to interpret.

The addition circuits make no sense whatsoever. It's doing great at guessing, that's all.


I am curious, what would you count as coherent? I think it is absolutely insane that we can open and understand what are essentially alien intelligences at all!

In the end, the data has to fit into structures or tables that can be processed by some algorithms. If the system is not rigid to a certain degree, it will become unmaintainable, or full of bugs, or both.


> The person lamented the ongoing economic volatility — caused by tariffs and Trump’s unpredictability — during a presidency that they had been promised would be a boon to business. “We’re all experiencing a liquidity crunch,” they said. “We need public markets to open.”

"Promised", boohoo. These people really don't understand what a society is and are operating at the level of toddlers to put it mildly.

If they ever manage to destroy many jobs by building strong AI, we will hear "We're experiencing a lack of demand".


> most CS has now become blue-collar work, and it is something people in the industry have not yet come to terms with.

No, actually it's literally office work: working in an office, or even from home.

