You can use incremental reading, which is built on top of spaced repetition. Lots of people have invented it independently [1], and it works amazingly well! Once you get the hang of it, it changes the way you think about learning stuff.
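If you're curious what the scheduling underneath looks like, here's a minimal sketch of an SM-2-style scheduler, the algorithm family most spaced repetition tools descend from. The Card shape and function names are my own simplification, not any particular tool's API:

  // Card holds the scheduling state for one item.
  type Card struct {
      Ease     float64 // "easiness factor", starts around 2.5
      Interval int     // days until the next review
      Reps     int     // successful reviews in a row
  }

  // Review reschedules a card given a recall grade q in 0..5
  // (5 = perfect recall, below 3 = failed). Simplified SM-2.
  func Review(c Card, q int) Card {
      if q < 3 {
          // Failed: start the card over, keep the ease.
          c.Reps = 0
          c.Interval = 1
          return c
      }
      switch c.Reps {
      case 0:
          c.Interval = 1
      case 1:
          c.Interval = 6
      default:
          c.Interval = int(float64(c.Interval) * c.Ease)
      }
      // Nudge the ease up or down based on how hard recall was.
      c.Ease += 0.1 - float64(5-q)*(0.08+float64(5-q)*0.02)
      if c.Ease < 1.3 {
          c.Ease = 1.3
      }
      c.Reps++
      return c
  }

The point of the growing interval is that each successful recall pushes the next review further out, so mature material costs you almost nothing to maintain.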
Interesting article. I agree that this double-language phenomenon of "biformity" can be a source of complexity, but I actually think the majority of complexity comes from one level higher: the paradigm, since the paradigmatic level is ultimately where assumptions about how to model problem spaces live.
The Go example in the article is actually an instance of this: Kubernetes went ahead and implemented an OOP system in Go. Why? Because they felt Go's assumption about the problem space (that it can be modeled and solved as imperative programs) was not a good fit for their actual problem space.
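To make that concrete: Go has no inheritance, so a project that wants an object hierarchy has to build one out of interfaces and struct embedding. A toy sketch of the pattern, loosely modeled on Kubernetes' runtime.Object/ObjectMeta machinery (the names and methods here are mine, not the actual Kubernetes API):

  // Object is the root "base class" of the hierarchy,
  // expressed as an interface since Go has no inheritance.
  type Object interface {
      Kind() string
      DeepCopy() Object
  }

  // ObjectMeta plays the role of shared base-class state;
  // embedding it is how Go fakes inheriting fields.
  type ObjectMeta struct {
      Name      string
      Namespace string
  }

  type Pod struct {
      ObjectMeta          // "inherits" Name and Namespace
      Containers []string
  }

  func (p *Pod) Kind() string { return "Pod" }

  func (p *Pod) DeepCopy() Object {
      out := *p
      out.Containers = append([]string(nil), p.Containers...)
      return &out
  }

None of this is hard, but it's machinery you write and maintain yourself (Kubernetes even generates its deep-copy methods with tooling) because the language's own model doesn't provide it.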
Haskell's assumption of purity leads to a problem/solution space that's a good fit for problems that are themselves primarily pure and mathematical, but it leads to complexity when solving problems outside that space (having to use monads or effects for IO).
Java's problem/solution space assumes you can model everything as objects and classes, and it runs into complexity when we attempt to use it for problems that are better modeled by other means.
Many languages that are "multiparadigm" or "general purpose" still have an underlying problem/solution model driving the organization of programs. When our particular problem is not a good fit for that model, we have to contort, and we wind up spending more time dealing with language constructs themselves than actually expressing our problem and solution. Couple this with the fact that languages have different performance properties, which may also be constraints you need to satisfy, and things get... complicated. A lazy pure language might be the best modeling system for your problem (e.g. dealing with infinite sequences) but a non-starter due to memory constraints (not enough resources).
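For a taste of that contortion: in a lazy language an infinite sequence is just a value ([0..] in Haskell), while in a strict language like Go you have to hand-build the laziness yourself. A toy encoding of my own, not anything from the standard library:

  // Seq is a hand-rolled lazy infinite sequence: each call
  // yields the next element plus the rest of the sequence.
  type Seq func() (int, Seq)

  // from builds the infinite sequence n, n+1, n+2, ...
  func from(n int) Seq {
      return func() (int, Seq) { return n, from(n + 1) }
  }

  // take forces the first n elements into a slice.
  func take(s Seq, n int) []int {
      out := make([]int, 0, n)
      for i := 0; i < n; i++ {
          var v int
          v, s = s()
          out = append(out, v)
      }
      return out
  }

  // take(from(0), 5) yields [0 1 2 3 4]

Three declarations to express what the lazy language gives you as a literal, and every consumer now has to know about your encoding.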
The real problem is complexity, which has gone exponential.
When I joined the workforce in 2000, my life was, by comparison, stunningly simple. Just a few guys in the same room. Barely any process or documentation. Email was still new, so the concept of an outside world barely existed. Chat did not exist, and wouldn't have made sense anyway. We talked a bit here and there, but 80% was actually doing the work, not talking about it. Management had no idea what we were doing, and metric porn did not yet exist.
A lot has changed. More complicated tech stacks mean deeper specializations, requiring more handovers. A lot is outsourced now, so you may need vendors to move things. You may have off-shored things. Nobody has clarity on what you need to do, so you have to hop around the organization to find out the details. You need to pass legal and the privacy office. You need to report status constantly to an army of bean counters. Testing has become amazingly complicated, and so has system administration.
It requires superhuman effort to move things by an inch. So no, collaboration is not a "force multiplier". Collaboration isn't a product or an outcome. Ideally you'd have an absolute minimum of it. The ideal workflow is that you create a clear and detailed work package, hand it over to the worker, whom you then leave alone to actually do it.
Your company's purpose is to ship software or whatever else it does; it isn't to ship emails, chat, status updates, approvals, and documents.
It is absolutely baffling to me how highly paid office workers' productivity is pissed away like this without intervention. Don't send them to a yoga class to cope; fix the fucking problem. You're setting your money on fire.
There’s a kind of bathtub curve between generalism and specialism.
On the far left you have master generalists with incredible breadth of understanding and the ability to summarize and strategize. These people are highly valued and often land in architecture or leadership roles.
Then in the middle you have the specialists: the Java developers and the Ops people. They're less valuable because there are so many of them, and they're somewhat interchangeable.
But then on the far right you've got the super-specialists. They are masters of their niche: world-class experts with deep understanding who are sought after as the authority in their domain. These folks are super-valuable too.
AI will probably eat the ones in the middle first.
For day-to-day tasks as a technical user I use a Mac, but I wouldn't find it an issue to use Linux for most of my work.
However, my strictly-gaming PC runs Windows. It's by far the easiest, least-effort, and most widely supported choice. I don't want to put on the sysadmin hat when I'm trying to decompress after work.
Also, the universe itself is often seen through the lens of contemporary technology.
Are we living on an island that floats on a giant turtle’s back? Or are the heavens like giant clockworks? Or maybe it’s all a computer simulation?
These cosmological speculations are separated by thousands of years, but each is simply a reflection of what people of the time found most awe-inspiring in their everyday lives.
[1] https://supermemo.guru/wiki/Michael_Nielsen_re-discovers_inc...