The "website obesity crisis" is related to the rise of single page applications and the growing popularity of frameworks like React, Vue, and Angular.
Having worked on many SPAs in my career, I've noticed the same pattern play out on essentially every project I've worked on. I call it the SPA descent into madness.
Initially, a SPA is probably the fastest way to start prototyping a UI. You don't even need a server - just throw some HTML, CSS, and JS onto the page, add some mock data, ReactDOM.render, and you're off to the races. All of the UI logic is handled by your frontend framework, and the backend exposes an API the frontend interacts with. Peachy.
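That prototyping stage can be sketched in a few lines - framework-free here so the shape is visible, since the mock data and render function (not ReactDOM.render itself) are the point. The data and names are hypothetical:

```javascript
// Mock data stands in for the API the backend will eventually expose.
const mockUsers = [{ name: 'Ada' }, { name: 'Grace' }];

// A render function turns state into markup. In a real SPA this would be
// a component handed to ReactDOM.render, but the idea is the same.
function renderUserList(users) {
  return '<ul>' + users.map(u => `<li>${u.name}</li>`).join('') + '</ul>';
}
```

No server, no build step - which is exactly why this stage feels so fast.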
But every non-trivial project hits an inflection point where things start to get tricky. In a classic server rendered website (e.g. Ruby on Rails, Django, etc.) you can add as many features as you want because the size of the server binary doesn't really matter. This isn't the case for SPAs - every feature and every additional dependency bloats the size of the JavaScript bundle.
To combat this, developers do route-based code splitting, but oftentimes this isn't sufficient - critical pages usually have the most features stuffed into them, which means they can't be reduced in size enough. Server side rendering can be effective, but now your data model needs to exist on both the client and server, so hopefully you took this into consideration. If not, it's common to run into situations where your application's model layer is partially duplicated across your client and server.
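Route-based code splitting boils down to loading a route's code only when that route is visited. A minimal sketch of the idea, with hypothetical route names - a real app would use the bundler's dynamic import() so each loader pulls down a separate chunk:

```javascript
// Each route maps to a loader that resolves its page module on demand.
// Promise.resolve stands in for a dynamic import() of a real chunk.
const routes = {
  '/': () => Promise.resolve({ render: () => '<h1>Home</h1>' }),
  '/settings': () => Promise.resolve({ render: () => '<h1>Settings</h1>' }),
};

async function navigate(path) {
  const load = routes[path];
  if (!load) return '<h1>Not found</h1>';
  const page = await load(); // the chunk is only fetched when first visited
  return page.render();
}
```

The catch described above: if '/' is the critical page and most of the app's features live there, splitting the other routes off barely shrinks the chunk that actually matters.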
Are all of these problems solvable? Sure. There are great, performant SPAs with millions of lines of code. But let's be real - most organizations won't solve these problems due to lack of engineering ability, time, or politics. The path of least resistance with SPAs is to shoot yourself in the foot on performance, so that's what tends to happen. It's the reason you see these insane 3MB bundles on text-based webpages. They started with good intentions, but never got the love necessary to make a large SPA work well.
All this to say: Make sure you're picking the right tool for the job. SPAs have a low upfront cost, but can have unexpectedly high long term costs. A sprinkle of JavaScript on top of a server rendered page, with a few tricky components backed by a light framework like Inferno, Mithril, or HyperHTML, is oftentimes all that's needed.
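A sketch of that "sprinkle" approach, framework-free and with a simplified element object so it stands alone (a real page would grab the server-rendered element with document.querySelector and wire up addEventListener):

```javascript
// The server renders the initial count into the HTML; the script only
// enhances that existing markup instead of owning the whole page.
function enhanceCounter(el) {
  let count = Number(el.textContent) || 0; // state seeded from server HTML
  return {
    increment() {
      count += 1;
      el.textContent = String(count);
    },
    value: () => count,
  };
}
```

The page works without JavaScript; the sprinkle only adds interactivity. The same division of labor applies when the tricky component is backed by Mithril or Inferno instead of hand-rolled code.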
You're absolutely right - SPA bloat isn't directly due to the weight of the underlying frameworks.
I think what I was touching on more was the fact that SPAs generally encourage engineering practices which lead to more complexity and page bloat, and that many teams don't think about these maintenance difficulties when initially picking their tech stack (as evidenced by the number of bloated, slow SPAs on the internet).
A well engineered SPA won't suffer from the issues I outlined in the original post, but getting to that point has a non-trivial cost which has to be considered. As usual, it's all about picking the right tool for the job - teams need to make sure they're getting a net benefit from using this sort of application architecture vs. a server oriented one.
And... what problem do they solve? They create an application which you have to download over and over again, instead of providing a downloadable application which starts quickly. Then comes a boatload of problems (how to store data, how to refresh data), followed by reinvented technologies to solve problems that didn't exist in the first place.
Supposedly the benefit is portability and "independence", yet in practice we are required to install Chrome everywhere.
I recently toyed a bit with JavaWebStart and JavaFX and... this is quite nice actually - it makes it easy to distribute portable applications that look good (IMHO) and are cached well (or you can provide a normal binary to download). And quite often they are smaller. The problem is that Java got this label of "slow and bloated", but nowadays Java apps are way snappier, start faster, and are smaller than JavaScript SPAs.
The issue I have with commercial web design is the lack of restraint. Just because you’re able to do something doesn’t mean you should. Rare is the commercial web designer able to just leave the damn content alone. A good website is unnoticeable to the user, allowing them to focus on the content. Instead commercial web design has always managed to find creative ways to make the user fight the website to access the content.
I just discovered that a couple of days ago, clicking an m.wikipedia link from HN on my desktop... Huh? Where am I? This looks so good, it can't be Wikipedia... ohhh, it's the mobile site?! It looks a million times better - hard to believe the huge difference.
This article was a big impetus for me to learn web development: the sheer madness of waiting several seconds to read a 140-byte tweet!
The megabytes of ad and tracking JavaScript loaded for text articles of a few kilobytes - working in advertising exposed me to the magnitude of internet traffic that is dedicated to auctioning advertisements and tracking users thereafter. The weight of mass surveillance is stifling our networks. I really can't wait to move on from this era.
That's why I don't like web technologies in general: HTML, JavaScript, the DOM. They allow things like Angular and all of those bloated JavaScript blobs. They also allow sites to track you and show ads that are served from another server.
If you look back at why web technologies thrived and prevailed, it was essentially a side effect of Microsoft's business practices. HTML and JS are a gateway that lets developers build applications that run on a Linux server and are used from a Windows client.
HTML was never designed for web applications. I don't understand why people can't stop using JS and the DOM given how slow they are on Android. HTML is an ambiguous language, which is also why rendering it is slow.
I'm sure I'm missing a lot of job opportunities by avoiding web dev jobs, but I don't care.
This website obesity is also linked to Wirth's law:
> Wirth's law, also known as Page's law, Gates' law and May's law, is a computing adage which states that software is getting slower more rapidly than hardware becomes faster.
The problem has only gotten worse in the last 3 years, from what I've seen. Has the adtech industry gotten larger in the meantime? Have their surveillance technologies delivered on their promises? Do you still think the adtech industry will implode?
Not the author, but ad tech is making more money than ever, so I can only surmise that they're pushing more traffic and tracking more users. That is, after all, how they make money.
IMO, as a lowly developer surrounded by data scientists, the mass intake of user data (the less anonymous the better!) only served as a canvas onto which to paint data-massaged stories - presentations showing clients how effective their advertising has been, and why they should definitely renew their contract, because numbers don't lie!
Don't get me wrong, there are some really talented statisticians developing new analysis techniques to track what a person does after being exposed to a message -- but that knowledge is probably used more effectively by intelligence agencies than the marketing managers trying to snag their next big client.
My impression is that the more data they collect, the more it becomes clear that advertising didn't make a meaningful difference in purchase habits. Anything to the contrary is funny numbers made up by the people who sell advertising.
I think the industry will implode someday, but it's just a gut feeling. Nothing scientific ;)
Responsive design is great, but many people seem to forget that usually this just comes “for free”, as long as you don’t put very strong constraints on how your content should look.
If you're willing to deal with HTML's native document flow, which is quite good for literature and scholarly material.
I kinda wish there was a robust second non-commercial web with a number of technical restrictions designed to keep things from descending into crap. Not give people enough rope to hang themselves with.
Tor's onion sites are the closest we've got - most have zero JavaScript, zero tracking, and zero ads. They provide an amazing window into the web we lost.
yeah sure of course ... has there been a renaissance of gopher? Last time I checked (about 8 years ago or so) things were effectively last updated in the late 90s.
IIRC the final users were mostly professional librarians. I believe gopher support was removed from firefox about half a decade ago.
Not exactly what you're looking for, but whenever it's supported, I use Safari's read mode. It cleans the page of crap and leaves only text and pictures. Works for me. Only downside though: you gotta load the whole crap first.
I remember this web page. What's hilarious is that practically none of it uses semantic markup and it's all rendered in a giant table, so you can't use browser-native functionality like Safari Reader.
The page also uses so many images the page weight comes in over a meg.
The images are part of the content here. The article attacks pointless obesity -- not large web pages period. Indeed, here is the first paragraph:
> Let me start by saying that beautiful websites come in all sizes and page weights. I love big websites packed with images. I love high-resolution video. I love sprawling Javascript experiments or well-designed web apps.
Also, the article never mentions the word "semantic" or the best practices around markup.
Well then the spirit of the whole thing is just worthless. Why focus just on page weight? Why not also on whether I should see tons of thumbnails I don't care about (which is pointless obesity), or on attention to using the technology correctly in the first place?
Taking a single concept by itself out of all of web development and advocating for it is such a pointless game.
All the images are meme-tier garbage. It's such a distraction. It is hypocrisy.