I think there was a kind of "golden period" that goes in between.
In the 90s, the economics around software had already heated up to the point where there was an insatiable appetite for software engineering manpower, but the university system wasn't yet geared to churning out specialists in this field in such large numbers, so a lot of software engineers back then were people coming from other professions who picked it up autodidactically and were just not very good. At the same time, programming languages and tooling weren't yet at a point where they were good at guiding people towards good software engineering practice, and this led to a kind of software quality crisis.
But this situation changed fast. I would say from roundabout 2003 to roundabout 2013 there was a bit of a "golden period" where we had good reason to be optimistic about the future of software quality. The software quality crisis of the 90s was largely overcome through better education, better software engineering methodology, and better programming language ecosystems and toolchains. Back in those days we still had purpose-built tooling for things like desktop UIs. Windows Forms in C# and Aqua-era Mac OS X GUI programming in ObjC were actually quite a good experience for both developers and users. We also had cross-platform ways of doing GUI programming, like Swing on Java.
In the next ten years, i.e. the ten years leading up to now, things took a decided turn for the worse. If I were to speculate about the reasons, I would say it was related to the rise of mobile, and the continued rise in the importance of the web platform over the desktop platform, meaning that application development now had to straddle web, mobile, and desktop as three distinct development targets. This created a need for truly cross-platform application development, while Apple and Microsoft continued to make plays to fortify their monopoly power instead of giving the world what it needed. Swing/JavaFX lost its footing when enterprises decided that web was all they really needed.
So, to answer your initial question: Has software quality really gotten worse? I would say, yes, over the last 10-15 years definitely. If you compare now to the mid-90s, then maybe, maybe not.
> Has software quality really gotten worse? I would say, yes, over the last 10-15 years definitely.
By what metric?
Taking all your above examples, I (and many others) could argue that the move to web brought new techniques that overall improved software for developers and users. That's not to say I'm right, or you are, but to point out that everything you put forward is purely subjective.
What has objectively gotten worse in the past 10 years?
On the user's side: Just pick any set of well-established best practices such as Shneiderman's Eight Golden Rules or Nielsen & Molich's 10 Usability Heuristics, then pick a typical 2024 Electron app that has an equivalent from the 2003-2013 era written with a typical UI technology of the time (such as Windows Forms), and compare the two UIs against those best practices. I'm pretty sure you will find usability blunders in today's software that you simply couldn't commit back then, even if you tried. Essential UI elements being hidden away (with no indication that such hiding is taking place) based on viewport size, leaving the user unable to perform their task, is one thing that immediately comes to mind. Another example I happened to experience just yesterday: UI elements disappearing from underneath my mouse cursor the moment it starts to hover over them.
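To make the viewport example concrete, here is a minimal sketch of the kind of code that produces it; the button, selector, and breakpoint are all made up for illustration:

    // Hypothetical "responsive" layout code: below an arbitrary breakpoint the
    // Save button is simply removed, with no overflow menu and no hint to the
    // user that it still exists somewhere.
    const saveButton = document.querySelector<HTMLButtonElement>("#save-button");
    const narrow = window.matchMedia("(max-width: 900px)");

    function applyLayout(): void {
      if (!saveButton) return;
      // Hiding an essential control instead of reflowing it: in a small window
      // the user now has no way to perform the task at all.
      saveButton.style.display = narrow.matches ? "none" : "";
    }

    applyLayout();
    narrow.addEventListener("change", applyLayout);

Nothing about this is hard to write, which is exactly the problem: the platform happily lets you make an essential control vanish with no fallback.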
Also: Just look at the widget gallery in Windows Forms, which provided intuitive metaphors for even quite subtle patterns of user interaction, and check how many of those widgets you find implemented in modern web-based design languages and web component systems. Usually you don't get much beyond input fields, buttons, and maybe tabbed views if you're lucky. So today's software is relegated to using just those few things, whereas, 10 years ago, you had so many more widgets to pick and choose from to get it just right.
On the developer's side: Was JavaScript ever actually designed to do the things it's being used for today? Is dependency hell, especially in the web ecosystem, worse today than it was 10 years ago?
> Just pick any set of well-established best practices such as Shneiderman's Eight Golden Rules
Excellent, we have something objective to look at. Now, where's the studies, reports, etc. showing this has declined in the past decade? I'm not asking for a double-blind, peer-reviewed study, just something a bit more concrete than "stop the world, I want to get off."
> Was JavaScript ever actually designed to do the things it's being used for today?
Was anything?
> [...] Now, where's the studies, reports, etc. [...] "stop the world, I want to get off."
This argument is getting a bit tedious. It started with you offering an opinion. I offered a counter-opinion, while clearly marking my opinion as such using language like "I think ...", "I would say ...", and "If I were to speculate ...".
I'm clearly not alone with my opinion (see original post), and you're trying to undermine your opponents' credibility by going ad hominem and pointing out that their position lacks the kind of research which you yourself did not provide either.
> > Was JavaScript ever actually designed to do the things it's being used for today?
> Was anything?
Hyperbole. Many things were designed to do the things they now do. Lua was designed as a language for embedding. SQL was designed as a language for querying databases.
...because I happened to come across it in my bookmarks just now, there's an article by Don Norman [1] that made the HN frontpage somewhat recently [2] sharing my pessimistic view about usability today. Admittedly, he has a conflict of interest, making money by telling people how bad their UIs are and how to make them better. But he definitely is very respected, and, in my opinion, deservedly so.
> [...] strong "kids today" vibes. [...] entire industry has forgotten how to do our jobs.
The OP seemed to be pessimistic; your initial point was "It was pretty bad in the mid-90s, and it's no worse today" which is really not very optimistic either, and my point was "Well, it was bad in the mid-90s, then got better, then got worse again". So, FWIW, I think that my point was actually somewhat more nuanced than the pessimistic context. I was also expressing optimism towards certain technologies while expressing pessimism towards others.
> your initial point was "It was pretty bad in the mid-90s, and it's no worse today" which is really not very optimistic either,
Partially, but I suspect things are better than everyone keeps saying. The whole "it used to be better" thing is a meme I see in all walks of life, and I want to see something to prove it beyond a bunch of opinions.
Do I think some things are worse? Yeah, probably, such is the way of life. Do I think the entire industry went to shit? That's something that even the most respected people will need to provide evidence for. It seems a bit strange that I somehow joined the industry in 2010 and spent my entire career getting shitter and shitter.
I've never seen a chat app taking gigabytes of RAM before Electron, for example.
I've extremely rarely seen applications going nuts, eating several CPU cores and draining my battery in 20 minutes before Electron, for example. Now it's a weekly occurrence.
It's improved only for developers who only know web development. And we users pay for it in hardware costs, electricity costs, etc.
> I've never seen a chat app taking gigabytes of RAM before Electron, for example.
Is that a general software problem or a problem specific to Electron? Is that a permanent problem or a problem right now because of the technology and your attitude towards it?
I say this because I do recall seeing complaints about Java being bloated in the 2000s. I briefly used Swing in my university days and it was pretty awful compared to HTML at the time. In 2044, maybe I'm going to be shaking my fist at the new-fangled tech and telling everyone how nice Electron apps were in comparison.
> complaints about Java being bloated in the 2000s.
It's bloated in the 2023s too. Last year I caught Android Studio (which I wasn't even using at the moment, just had it open from a quickie fix a few days ago) going over 4 GB of RAM. I had two projects open, under 20k lines of code total (ok, maybe I should count again).
But why bring Java in? We're talking about native applications vs applications that pull in a copy of Chrome and half of the npm registry. Java isn't native either.
> We're talking about native applications vs applications that pull in a copy of Chrome and half of the npm registry.
You might be but I'm not. I'm talking about the state of software in the 1990s, 2000s, 2010s and today and how a general "it's worse" isn't particularly useful (or probably even true).
Oh and... if you want problems specific to Electron... I'm pretty sure Discord was keeping all the cat pictures and memes that it had ever displayed uncompressed, in RAM, for a long while. Memory usage of several gigabytes if you had the meme channel open. Even if it displayed only the last 3 cats.
It's better these days but it was a problem for years. And tbh I'm not sure whether they fixed it themselves, ever realized or cared about the problem, or whether one of the 2498127 npm packages fixed it for them.
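I obviously don't know what Discord's actual code looked like, but the failure mode is easy to sketch; everything below is hypothetical, just to illustrate the pattern:

    // Hypothetical sketch of the anti-pattern: cache every decoded image the
    // client has ever shown, keyed by URL, with no size bound and no eviction.
    const imageCache = new Map<string, ImageBitmap>();

    async function getImage(url: string): Promise<ImageBitmap> {
      const cached = imageCache.get(url);
      if (cached) return cached;

      // A decoded bitmap costs roughly width * height * 4 bytes, far more than
      // the compressed file that came over the wire.
      const blob = await (await fetch(url)).blob();
      const bitmap = await createImageBitmap(blob);

      imageCache.set(url, bitmap); // never evicted, so memory grows with history
      return bitmap;
    }

A cache with a byte budget and LRU eviction (calling bitmap.close() on the way out) would bound this; nothing about Electron forces you to write it the unbounded way, but nothing stops you either.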
> Oh and... if you want problems specific to Electron...
It seems you don't get my point so let me be explicit:
Pointing out issues with a single framework that powers a subset of software does not mean there is a general decline in software quality across the industry.