Not sure why I am getting in the middle of this, but I need to point out that you are not even correct about Linux.
Linux rather famously avoided the GPLv3 and is distributed under a modified GPLv2. In practice, this license allows binary blob modules. We are all very familiar with this.
As a result, the kernel that matches your description above that ships in the highest volume is Linux by a massive margin. Can you run a fully open source Linux kernel on your Android phone? Probably not. You do not have the drivers. You may not pass the security checks.
Do companies like Broadcom “collaborate” on Linux even in the PC or Mac space? Not really.
On the other hand, companies that use FreeBSD do actually contribute a lot of code. Netflix is the most famous example, but even Sony gives back.
The vast majority of vendors that use embedded Linux never contribute a single line of code (at least 80%, maybe 98%). Very few of them even make the kernel code they use available. I worked in video surveillance, where every video recorder and camera in the entire industry is Linux based at this point. Almost none of them distribute source code.
But even the GPL-versus-permissive story is wrong in the real world.
You get great industry players like Valve that contribute a lot of code. And guess what, a lot of that code is licensed permissively. And a lot of other companies continue to contribute to Mesa, Wayland, Xorg, PipeWire, and other parts of the stack that are permissively licensed. The level of contribution has nothing to do with the GPL.
How about other important projects? There are more big companies contributing to LLVM/Clang (permissive) than there are to GCC (GPL).
In fact, the GPL often discourages collaboration. Apple is a great example of a company that will not contribute even to the GPL projects that they rely on. But they do contribute a fair bit of Open Source code permissively. And they are not even one of the “good guys” in Open Source.
A few vendors have been stopped from shipping binary modules with Linux, notably those linking against GPL-only symbols. Enough vendors have contributed enough to make Linux actually usable on the desktop with a wide range of off-the-shelf hardware, and more and more are announcing day-one compatibility or open source contributions. The same is hardly true for the BSDs.
It's obvious why Sony keeps certain drivers closed source while open sourcing other things, and why Nvidia decided to go with an open source kernel driver. It's not hard to understand: some of it is community pressure, and some of it is the modified GPLv2.
Probably not, but possibly yes. Which is more than the cuck license guarantees. See postmarketOS and such, which would be impossible in a BSD world.
>The vast majority of vendors that use Linux embedded never contribute a single line of code
It doesn't matter. The point is just that they can be legally compelled to if needed. That is better than nothing.
>The level of contribution has nothing to do with the GPL.
None of this would be feasible if Linux wasn't a platform where the drivers work. They wouldn't have worked on the Linux userspace in the first place if it didn't have driver support: it wouldn't be a viable competitor to Windows, and the whole PC platform would probably be locked down anyway without a decent competitor. Permissive software is parasitic in the sense that it benefits from interoperating in a copyleft environment but cooperates with attempts to lock down the market.
LLVM was made after GCC and is designed with a different architecture. It is apples and oranges.
Apple is a great example of a company that is flooding the world with locked-down devices. Everything they do is an obstacle to general purpose computing. What do they meaningfully give back as open source? Swift? WebKit? It is part of a strategy to improve their lock-in and ultimately make collaboration impossible.
This makes me worry for the GCC implementation of Rust. People do not seem to use or maintain the GCC versions of languages whose primary open source implementations are elsewhere.
There is the advantage that GCC will be the only way for Rust to be available on some targets where LLVM isn't an option.
Regarding Go, gccgo was a way to have a better compiler backend for those that care about optimizations that the reference Go compiler isn't capable of, due to differences in history, philosophy, whatever.
Apparently that effort isn't seen as worthwhile by the community.
Some languages do define &&b, like Rust, where its effect is similar to the parent post's C example: it creates a temporary stack allocation initialized with &b, and then takes the address of that.
You could argue this is inconsistent or confusing. It is certainly useful though.
Incidentally, C99 lets you do something similar with compound literal syntax; this is a valid expression:
A tour de force indeed. Asahi Linux only works as well as it does because of the massive effort put in by that team.
For all the flak Qualcomm takes, they do significantly more than Apple to get hardware support into the kernel. They are already working to mainline support for the X2 Elite.
The difference is that Apple only makes a few devices and there is a large community around them. It would be far less work to create a stellar Linux experience on a Lenovo X Elite laptop than on a M2 MacBook. But fewer people are lining up to do it on Lenovo. We expect Lenovo, Linaro, and Qualcomm to do it for us.
You could argue that is exactly what Tuxedo is doing. In this case, they could not provide the end-user experience they wanted with this hardware so they moved on.
System76 may be an even better example as they now control their software stack more deeply (COSMIC).
When I say "control the software", what I mean is we need another company that can say "hey, we are moving to architecture X because we think it's better" and, within a year, most developers rewrite their apps for the new arch, because it's worth it for them.
There needs to be a huge, healthy ecosystem/economic incentive.
It's all about the software for end users. I don't care what brand it is, what OS it runs, or how much it costs. I want to have the most polished software and I want to have it on release day.
Right now, it's Apple.
Microsoft tries to do this but is held back by the need for backward compatibility (enterprise adoption), and Google cannot do this because of Android fragmentation. I don't think anyone is even close to trying this with Linux.
Almost everything on regular Fedora works on Asahi Fedora out of the box on Apple Silicon.
You can get a full Ubuntu distribution for RISC-V with tens of thousands of packages working today.
Many Linux users would have little trouble changing architectures. For Linux, the issue is booting and drivers.
What you say is true for proprietary software of course. But there is FEX to run x86 software on ARM and Felix86 to run it on RISC-V. These work like Rosetta. Many Windows games run this way for example.
The majority of Android apps ship as Dalvik bytecode and should not care about the arch. Anything using native code is going to require porting though. That includes many games I imagine.
I don’t think Apple is catering to developers at all. If that were the case, they wouldn’t be charging them $100 a year just to exist. They have the market share, especially on iOS, to force developers onto their platform.
I believe that running Windows games is something that Apple does not care about at all. Their efforts to create the Game Porting Toolkit are 100% about getting more Windows games ported to the macOS App Store.
This comment is pure ideological mythology.