I've had them. They're fine. But this is overselling the variety angle. The meat-eater equivalent of forage like this would be game animals. In my experience (and extrapolating a bit), the taste difference between game and farm animals is generally greater than the difference among green vegetables.
Not sure I agree. I think there's as much difference between spinach, leeks, fennel, and Brussels sprouts as between beef and deer, and that's without foraging into fancy vegetables...
Sure, but spinach, kale, mustard greens, chard, and arugula are all pretty wildly different. With different textures, flavors, and other things going on.
I think most actually know and don't think it's all that much different from what others have done for decades. I'm not saying they are correct to think it, just that they do think it. They think it's refreshing that the corruption is in the open. It's a societal boy-cried-wolf numbness. People are tired of the finger-pointing and screaming about everything, and now don't listen when the real stuff goes down.
I believe it was always more myth than fact. There have always been rough edges in Apple's product line. If anything, it's more an indication of where the real focus is now. And it's not iOS.
The standard screen was 80 by 25. There were two segment addresses you needed to know: 0xB000 for monochrome displays and 0xB800 for color. For monochrome you could just blast characters/attributes to that address and everything looked great. For color you had to add a little bit of assembly so writes didn't happen while the adapter was actively drawing the screen (or else you would get some flickering). The little hacks were all well known.
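A minimal sketch of the idea, assuming a Borland-style 16-bit DOS compiler (MK_FP and inportb from dos.h); the helper names are mine, not from any particular program:

    /* Each text cell is two bytes in video memory: character, then attribute. */
    #include <dos.h>                 /* MK_FP, inportb (Borland/Turbo C) */

    #define MONO_SEG        0xB000   /* monochrome adapter text buffer */
    #define COLOR_SEG       0xB800   /* CGA/EGA/VGA color text buffer  */
    #define CGA_STATUS_PORT 0x3DA

    static void put_cell(unsigned seg, int row, int col,
                         unsigned char ch, unsigned char attr)
    {
        unsigned char far *cell = MK_FP(seg, (row * 80 + col) * 2);
        cell[0] = ch;      /* character */
        cell[1] = attr;    /* foreground/background color */
    }

    /* On stock CGA cards you waited for a retrace interval before writing,
     * otherwise the write showed up as flicker ("snow"). Bit 0 of the
     * status register is set while the display is in retrace. */
    static void wait_for_retrace(void)
    {
        while (inportb(CGA_STATUS_PORT) & 0x01)   ;  /* let the current one finish  */
        while (!(inportb(CGA_STATUS_PORT) & 0x01)) ; /* catch the start of the next */
    }

Something like put_cell(COLOR_SEG, 0, 0, 'A', 0x1F) would drop a bright-white 'A' on a blue background in the top-left corner.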
Then you could build your own 'windowing' system by just maintaining separate screen buffers and having a little bit of code to combine the buffers when writing to the actual hardware.
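Roughly, the compose step could look like this (a sketch under my own naming; the Window struct and sizes are illustrative):

    #define ROWS 25
    #define COLS 80

    struct Window {
        int top, left, height, width;            /* position on screen    */
        unsigned char cells[ROWS][COLS][2];      /* character + attribute */
        int visible;
    };

    static unsigned char screen[ROWS][COLS][2];  /* off-screen composite  */

    /* Paint the visible windows back-to-front into the composite buffer,
     * then blast `screen` to 0xB000/0xB800 in one pass as in the snippet
     * above. (A real version would clip to the screen edges.) */
    static void compose(struct Window *wins, int count)
    {
        int w, r, c;
        for (w = 0; w < count; w++) {
            if (!wins[w].visible) continue;
            for (r = 0; r < wins[w].height; r++)
                for (c = 0; c < wins[w].width; c++) {
                    screen[wins[w].top + r][wins[w].left + c][0] = wins[w].cells[r][c][0];
                    screen[wins[w].top + r][wins[w].left + c][1] = wins[w].cells[r][c][1];
                }
        }
    }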
In the early days everyone's code was synchronous: it would sit listening for keyboard events and react and repaint in a very ad hoc fashion. Mice made things a bit more complicated, as you needed to maintain a persistent model of the UI to process their events.
So the UI code was simple and easy to work on, but you had to squeeze these programs into tiny memory footprints, so you would spend a lot of time trying to find more memory. One of the bigger projects I worked on had a memory manager that relocated blocks to create contiguous space, but since there was no OS support for things like that, the code was actually updating pointers in the heap and stack, which was a great source of complicated bugs. Woe unto anyone who tried to use a linked list in such an environment.
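For flavor, the relocation idea boiled down to something like this sketch (my names; real versions also walked the stack looking for pointers to fix, which is exactly where the scary bugs lived):

    #include <stddef.h>
    #include <string.h>

    #define HEAP_SIZE  4096
    #define MAX_BLOCKS 32
    #define MAX_REFS   64

    static unsigned char heap[HEAP_SIZE];

    struct Block { size_t off, size; int in_use; };
    static struct Block blocks[MAX_BLOCKS];   /* kept ordered by offset */
    static size_t nblocks;

    static void **refs[MAX_REFS];   /* known locations holding pointers into the heap */
    static size_t nrefs;

    /* Slide every live block toward the start of the heap to make one big
     * contiguous free region, rewriting any registered pointer that pointed
     * into a block that moved. */
    static void compact(void)
    {
        size_t dst = 0, i, r;
        for (i = 0; i < nblocks; i++) {
            if (!blocks[i].in_use) continue;
            if (blocks[i].off != dst) {
                memmove(heap + dst, heap + blocks[i].off, blocks[i].size);
                for (r = 0; r < nrefs; r++) {
                    unsigned char *p = *refs[r];
                    if (p >= heap + blocks[i].off &&
                        p <  heap + blocks[i].off + blocks[i].size)
                        *refs[r] = heap + dst + (p - (heap + blocks[i].off));
                }
                blocks[i].off = dst;
            }
            dst += blocks[i].size;
        }
    }

Any pointer the manager didn't know about (say, a `next` field buried in a node it wasn't tracking) silently went stale after a compaction, which is why linked lists were such a hazard.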
But yeah, it was a fun time.
I believe the pedantry label was sufficient acknowledgment of the fact, while also pointing out that, in the context of the larger conversation, we are really talking about whether his leadership decisions led to success.
I hope this was more of a philosophical musing than career advice. I've not worked at every big company, but I have worked at a few.
I agree that in the context of a big company, "done" is a metric, and career success at that big company depends on moving the metrics those leaders track. But in my experience, modern big companies also look at peer review, and if you're always committing junk, those reviews are not going to be kind. So like everything, it's a balance. Please your boss by closing tickets. Please your peers by writing good code.
Spotify has not viewed itself as a music company for longer than that. It's a platform for audio. And while there are still music-first people at the company, they are not in the power positions they used to be.
The transition didn't start when they laid off Glenn McDonald, but that sort of cemented it. They had already gutted curation before that, and by that time you were far more likely to find people talking about AI in the halls than music. If you've never heard of Glenn, check out his book, "You Have Not Heard Your Favorite Song: How Streaming Changes Music," or his old online projects at https://everynoise.com/.
In those early days the Spotify user experience needed to differentiate itself and put up barriers to being copied. Later it suffered from being purely metric-driven, tracking things like user engagement and treating it as a proxy for happiness with the platform. And later still, they started to care mostly about the cost of delivery.
I think this specific quote from the article deals with this situation.
> U.S. Circuit Judge Patricia Millett wrote for a unanimous three-judge panel on Tuesday that U.S. copyright law "requires all work to be authored in the first instance by a human being."
The different mindsets exist, but I agree these are bad words to differentiate them. Back when I started in software in the 80s, a common expression was that there are two types of programmers: nerds and hippies. The distinction fell along similar lines: nerds needed to taste the metal, while hippies were more interested in getting stuff done.