Before microservices were a thing, I had the chance to work on a couple of telecom systems written in Erlang/OTP. It wasn't until years later that I realized we were already doing most of the things people now use microservices for, with the single exception of being polyglot (although Elixir and Gleam are starting to challenge that).
Small teams owned specific functionality, and they were largely autonomous as long as we agreed upon the API, which was all done via Erlang's very elegant message-passing system. Scalability was automatic, part of the runtime. We had system-wide visibility, and anyone could test anything, even on their own computer. We didn't have to practice defensive programming thanks to OTP, and any systemic failure was easier to detect and fix. Updates could be applied hot, while the system was running: one of the nicest features of the BEAM, and something microservices try to approximate.
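To make that concrete, here's a minimal sketch in plain Erlang (hypothetical module and message names, and a bare process instead of a proper OTP gen_server) of what "the API is just messages" looks like, including the fully-qualified recursive call that lets the BEAM hot-swap a module while the process keeps its state:

```erlang
-module(counter).
-export([start/0, incr/1, loop/1]).

%% Spawn a process holding state 0; the Pid is the "service handle".
start() -> spawn(?MODULE, loop, [0]).

%% The agreed-upon API: send a message, wait for the tagged reply.
incr(Pid) ->
    Pid ! {incr, self()},
    receive
        {ok, N} -> N
    end.

%% The server loop. The fully-qualified counter:loop/1 call is what
%% lets a newly loaded version of this module take over on the next
%% message, while the process (and its state) keeps running: a hot
%% code upgrade, no restart required.
loop(N) ->
    receive
        {incr, From} ->
            From ! {ok, N + 1},
            counter:loop(N + 1)
    end.
```

In a real system you'd reach for gen_server and OTP release handling instead of a hand-rolled loop, but the mechanics are the same.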
All the complexity associated with microservices, or even Kubernetes and service meshes, is ultimately a way to achieve some sort of "polyglot BEAM". But I question whether it's really worth it for all use cases. A lot of the "old" technology has kept evolving nicely, and I'd be perfectly fine using it to achieve the required business outcomes.
Looking back at the time this article was written, I used to believe the same things: that people would rise up, that mesh networks were going to change the world, and that the distributed web was going to change everything.
I ran IPFS nodes, I was on cjdns (the Hyperboria network), I joined all the alt sites trying to disrupt FB and whatnot (Diaspora, Friendica, Mastodon). I paid my ISP a lot more to have no bandwidth caps (a key blocker for dweb technologies).
In the end, nobody came. Nobody else cared. The huge time sink required just to maintain these technologies was eating into either my work or my personal life. I wasn't even capable of convincing family members in 3 countries to use Signal or Wire instead of WhatsApp. So I gave up.
Every once in a while I take a peek into the dweb world, because I just love the technologies, but I see little to no movement. Outside of folks like archive.org, few have serious, production-quality systems based on dweb tech.
When I was a product lead, the most important question was "why": what problems are you trying to solve? And the problems need to be so clear, obvious and powerful that customers would be willing to pay to solve them.
As I see it now, even if the problems described in the article are real, the great majority of people don't care enough to make the effort required to change their habits.
> the problems need to be so clear, obvious and powerful that customers would be willing to pay to solve them.
The problems are clear, but the solutions need to be clear too. So far many of the alternatives tend to focus more on the tech (decentralized protocols, this or that programming language, etc.) than on solving the problem. Even worse, sometimes the nature of the tech makes solving the problem more difficult or impossible (decentralized protocols bring a lot of challenges of their own, for example).
Back in the day @moxie wrote a good text explaining why the UX of a centralized solution will always be as good as or better than the UX of a decentralized one. Most users crave pleasant UX, and readily discard applications and services with annoying UX as long as there is a sleeker alternative.
I'd add that a centralized solution can be run by a big corporation extracting significant profits, and thus investing significant resources into it. Investing in a decentralized solution gives a much vaguer idea of ROI. Look at email, the long-standing champion of federated protocols. Most investment went into Gmail and Outlook, proprietary solutions that happen to interoperate with the rest of the email universe, but whose strong suit is proprietary communication with their own centralized infrastructure. They are wildly popular.
I posit that for normal users a decentralized solution only makes sense when a centralized solution is impossible and/or illegal. See the p2p music-sharing networks of the 2000s, or modern BitTorrent. For BitTorrent, though, centralized catalogs like TPB or rutracker are the norm, unlike the p2p search in Gnutella or DC++ of old. Even though the incentives of those running TPB are better aligned with the interests of its users than, e.g., in the cases of FB or Reddit, TPB is not a non-profit, AFAICT.
So, for the decentralized web of 1995 to return, a lot of people would have to find using the centralized web very painful. Even though ad networks actively work to make the experience of web browsing insufferable, it appears that relatively simple tools like uBlock Origin, or paying a small subscription fee, make the experience okay again.
So, YouTube + $5/mo, or even YouTube played via NewPipe, again trumps the experience of using PeerTube, etc.
BTW, even if the internet becomes a mesh network at the transport/connectivity level, it won't change much in these dynamics. Instant gratification plus not needing to pay money is winning, and will keep winning, the majority of the audience by default.
>As I see it now, even if the problems described in the article are real, the great majority of people don't care enough to make the effort required to change their habits.
Then it's not solving any immediate problems for them. Anyway, people get obsessed with getting the entire planet onto distributed networks. IMO that's not realistic: the mass population is always going to choose simple, corporate shit unless there's a direct need for something disruptive enough that they'll spend literally days working out how to use it, e.g. learning how to find and download torrents.
>Looking back at the time this article was written, I used to believe the same things: that people would rise up, that mesh networks were going to change the world, and that the distributed web was going to change everything.
It won't change everything, but I think it will become important. General-purpose computing will continue down its dying path, and in 10-15 years normies will be solely in their smartphone walled gardens, while programmers, scientists, muckrakers, enthusiasts, etc. populate some kind of very niche darknet (<5% of the population), or an array of totally disparate darknets aligned to various niches, whether they're running over the Internet proper or some kind of alt network.
>It won't change everything, but I think it will become important. General-purpose computing will continue down its dying path, and in 10-15 years
I swear, these kinds of statements make me feel old.
People in tech have been saying this for a very long time, and it's never been true; the PC market will only die if innovation and usability are dead.
What has happened is that we've spread our usage of tech across specialized devices. A good recent example is how smartwatches have replaced the smartphone as a heart-rate sensor and as a notifier/clock tool.
>As I see it now, even if the problems described in the article are real, the great majority of people don't care enough to make the effort required to change their habits.
In order for people to care, give them something to care about.
It is hard to expect that an average person would 'care' about IPFS or Mastodon, which are unpolished, hard-to-use technologies. But people care about their iPhones and Instagram accounts, even if they come at a great price. It is our job as technologists to give people something to care about. The perceived rate of 'caring' measures our own abilities (as creators of technologies and products), not the lack of theirs.
From my experience, just being an observer and not too much of a techie, I'd say it's one of the cons of open-source products.
For example, if there's an open-source product you like to use but the maintainers are not very active, you may get aggravated, fork it, and start another similar product, thinking you can do it better justice. Problem is, with every new "fork" of a product you end up dividing the user base as well.
When users have too many options to choose from, they usually don't compare the differences; they just pick one.
Also, these similar products are not always backwards compatible with each other, which is another reason people may not try to compare them.
Another way of thinking about it: the less effort it takes to copy something, the more "copies" you'll have to pick from. From a user's perspective, it is very confusing and frustrating not knowing which one to pick.
> the great majority of people don't care enough to make the effort required to change their habits.
I've started to wonder if the web will split for this reason.
The mainstream stays on the normal web and techies move to their own. Almost like an end to the Eternal September, but really it just feels like everything old is new again.
The author just recently got contacted (again) by Cuban activists. In the face of internet shutdowns, they, like Iranian, Indian, and Colombian activists before them, are looking for networks that work off the grid. SSB isn't there yet. It leaks too much data at the moment. But the need is there. And if these people find it useful... we'd never know. That's the nature of these networks.
Web 3.0 and blockchain technology[1] are doing a great job of getting people into the dweb. If we remove the "get rich fast" part of it, dApps are doing great work bringing users to distributed platforms.
IPFS is great, but just a nitpick, there's nothing blockchain about IPFS. Blockchain usually implies decentralized but decentralized doesn't imply blockchain. Cryptocurrency blockchains only really make sense when you're managing something scarce and you need to prevent double-spends (like for currency or domain names).
Don't judge the author too harshly. Ingrid presents an overall perspective of the entire field. Yes, some definitions are somewhat skewed, but I'm sure many people see things the same way. In a way, the fact that the author wrote the article and we're all reading these comments shows that we all care about this and want to see a truly distributed web succeed. So it's ok to try to reason about how we get there.
In all honesty, I've had very similar thoughts. As someone who started with BITNET before the Internet, I always believed "the Net" was supposed to level the field, so we would no longer have to be "consumers" but rather willing participants, capable of innovation in our own right. I know, how naive.
Where I disagree is in the solution. It's not political. Whenever you think about the Internet, you MUST think globally. I've had the chance to visit around 35 countries, and I've seen firsthand how governments, for the most part, don't work for people.
It comes down to this: we (geeks and nerds) need to get out of our comfort zone and think in terms of user experience. We CAN create amazing things. But we must think about the people using these solutions. People use Dropbox because it's easy, not because it's good. Same for Gmail and other services. If we are not creating solutions that can be used by anyone, in an easy and straightforward fashion, we're not doing our job. If companies don't see value in what we are creating, even when it could help them tremendously (e.g. IPFS), then WE are doing something wrong.
I started writing blog-like text entries in my .plan file in the mid 80s, so others could use finger in UNIX and read what I was thinking.
I later learnt HTML just so I could keep writing my thoughts. I lost content, I got spam, I had to learn to defend myself when the Internet became aggressive. It wasn't like that early on. We believed it was for the betterment of humankind. I know, how naive.
I went through the "Wordpress phase", then the static phase (Pelican, then Hugo), but I ultimately decided to define what I wanted to share, and I realized few people really cared about long writings; they want the nitty gritty. So I chose the Zettelkasten model, where I focus on specific, narrow, single-topic notes. I use TiddlyWiki (the Drift distribution specifically), and I have two files: one public, one private. My default is public. If I have a thought, learning, recipe, experience, anything that could be useful to anyone else, it goes in the public file. Otherwise, it goes in the private one. As simple as that.
And thanks to CloudFlare's JS compression, the whole thing works great for me. https://ramirosalas.com
I find Althea [1] a lot more interesting than Helium: their crypto is basically used as an incentive model for bandwidth, with no proprietary hardware or software in the mix.
There have been several posts on this topic in recent weeks, and every time I have the same reaction: how can they all just ignore TiddlyWiki [0], which has been out there since 2004 and has evolved nicely over the years to deliver most of the same outcomes? Is it that folks just ignore prior art, or do they just gravitate to the latest, sexier software? (Honest question.)
I've been using TiddlyWiki on and off for many years, but 2-3 years ago I moved heavily into it (the Drift [1] distribution), and I haven't looked back. To me, it has become less about the tool and more about the information, and ensuring I have complete access to it, even 20 years from now. That includes data, metadata and even the software itself, regardless of the platform or OS.
Let's not be hypocritical. What we really want to know is: did you find/preserve intact DNA? If the answer is yes, I'm not even going to speculate what happens next. We just know it.
That would mean that there were plenty of mammoths just underground, easy to find and within easy reach. If that were the case, wouldn't scientists flock to the area and collect them by the dozens?
I think you underestimate the difficulty of traveling across Siberia: almost no roads, no trains, few and scattered inhabitants, thousands of km of steppe in all directions... Huge parts of the country are basically only accessible by helicopter (which is extremely expensive). In fact, they even used a heavy helicopter to bring a whole mammoth back to Moscow a few years ago.
Although the study's methodology seems sound to me as a non-expert, I find its conclusion depressing.
I know it wasn't their intent to offer solutions, but it bothers me to have absolutely no idea how to even begin to solve this problem at scale, in a world where it's becoming easier and cheaper to influence such large numbers of people. Every single idea I've come up with can be defeated, exploited, or ignored, and I haven't seen a project or effort that seems strong enough to go to battle for.
After years of being beaten up by customers with stories like these, I learnt to treat InfoSec and Compliance teams as finite state machines, particularly at banks and other financial institutions. Learn not to question the sacred spreadsheet, or to debate the merits of a request. It's pointless, and all that eye-rolling will only land you at the optometrist.
Instead, treat compliance like part of your API. Ensure your product delivers the expected answers, while continuously improving its security in the parts that are not directly visible.
However, DO get it in writing that the option was offered to them, so that in any future court battle the onus for security failures is on them.