> code generation today is the worst that it ever will be, and it's only going to improve from here.
I'm also of the mindset that even if this is not true, that is, even if the current state of LLMs is the best it will ever be, AI would still be helpful. It is already great at writing self-contained scripts, and efficiency with large codebases has already improved.
> I would imagine the chance of many of us being on the losing side of this within the next decade is non-trivial.
Yes, this is worrisome. Though it's ironic that almost every serious software engineer, at some point early in their childhood or career when programming was more for fun than work, thought about how cool it would be for a computer program to write a computer program. And now that we have the capability in front of our eyes, we're afraid of it.
But one thing humans are really good at is adaptability. We adapt to circumstances, good or bad. Even if the worst happens and people lose jobs, it will hurt families in the short term, but over time humans will adapt to the situation, adapt to coexist with AI, and find the next endeavour to conquer.
Rejecting AI is not the solution. Using it like any other tool is. A tool that, if used correctly by the right person, can indeed produce faster results.
I mean, some are good at adaptability, while others get completely left in the dust. Look at the rust belt: jobs have left, and everyone there is desperate for a handout. Trump is busy trying to engineer a recession in the US—when recessions happen, companies at the margin go belly-up and the fat is trimmed from the workforce. With the inroads that AI is making into the workforce, it could be the first restructuring where we see massive losses in jobs.