It's all about long-term vs short-term. Everyone architects software for the short term; I'd say the industry at large has collectively lost (or never had) the vision and wisdom to do anything else.
Now maybe if you are a tiny-ass start-up, sure, but for a big established company, this is just bad economics.
Why do we talk about "disrupting" the "behemoths"? Why is everything done in tiny-ass largely-parallel teams? Very few companies have had serious thoughts about programming at scale.
I don't dispute that doing things the right way is often a huge up-front initial investment, but you do eventually get over the hump.
> Everyone architects software for the short term, I'd say the industry at large has collectively lost/never had the vision and wisdom to do anything else.
I think everyone architects for the long term; they just do so poorly. The problem is that architecture has become synonymous with "more layers".
OK, so in the beginning there were no layers. People occasionally wrote a layer, but it was common to just say "fuck it" and throw it away. As late as the 90s, you read about C programmers writing hash tables all the time, wtf.
Then, somewhere along the way (I don't know exactly when), we hit an inflection point: some layers didn't work quite right but were hard to do without, so we'd try to shim them.
Really good long-term engineering also means ripping up the under-performing layers and attacking unneeded complexity. This does not mean giving up on abstractions altogether.
I think most people at the org should be doing what Alan Kay called "second-order work": libraries foremost, but also programming tools, etc. Just about any end business goal should be a trivial composition of existing abstractions; if it isn't, that's a problem to be addressed.
Work reuse always reflects the dependency graph of libraries, tools, etc. In this type of organization, I'd expect much bigger (in depth and width) dependency graphs than under the current independent-teams model.
The end result is that organizations should become more "agile" as they grow, because they have more high-quality abstractions to lean on.
But current practices always put the end goal over process, and thus have no chance of cultivating this efficiency.