Ubuntu is my first and last Linux distribution; I have used it without interruption for 16 years, from my early PhD days up to today.
The only change during the 16 years is that I switched from Ubuntu (with Gnome) to Xubuntu (with Xfce) [1], but I still call it Ubuntu. When people ask me about the difference between the two, I respond "color". I spend 90% of my time on the terminal, and I prefer the gray-blue style of Xfce and its lightness when I use the window system during the other 10%.
Before I started using Linux, I had a very limited idea of what a computer is and how it works. An anecdote I often relate to my students is that I once copied the icon of an application (Turbo C) from one computer to another, believing that I could then use the application on the second computer. Linux has taught me what a computer really is.
An "achievement" may be worth mentioning: in the past 16 years, essentially all my work as an applied mathematician has been typed under terminal using Vim, including papers, lecture notes, slides, programs, and particularly my 200-page PhD thesis in Chinese. It was not quite trivial to type Chinese in Vim --- think about it: how to get a Chinese inputting system that can work seamlessly with Vim's key bindings? Fcitx would not work (at least it was the case 13 years ago).
As a fellow user of an alternative Ubuntu flavour, one additional difference is the LTS support schedule, which is shorter for the flavours. For example, Xubuntu 24.04 is supported until 2027, while Ubuntu 24.04 is supported until 2029.
Interestingly, as I understand it, when a release such as Xubuntu goes out of support it does not stop getting the updates that are not specific to that flavor, so in a way you still have some indirect support, but it feels a bit like a gamble.
Good point. My strategy is to buy a new laptop (Thinkpad X1 Carbon) every one or two years and install the latest LTS of Xubuntu, so the 3-year support window is long enough for me.
Over the years, I have developed notes and scripts to quickly configure a newly installed Xubuntu system on a new computer, so that everything works the same way as on my old computer. Since I stick with the same brand of laptop (Thinkpad X1 Carbon), I do not feel any difference after the configuration, except that the computer becomes more powerful. I do not want to spend my time adapting to a new system or a new computer.
Buying a new laptop so frequently may sound a bit expensive. It is not, if you spend as much time on your laptop as I do. A more powerful laptop means that I can finish my work (e.g., numerical experiments) in (much) less time. In this sense, my life is prolonged. This is the only case I know of where a common person can effectively trade an affordable amount of money for a longer life, as I often tell my students.
If you’re ever feeling adventurous, I would suggest trying out Debian with XFCE instead of Xubuntu. I recently migrated and even though the installation isn’t as pretty, I find both the installation and the distribution itself to be much more stable and lightweight without sacrificing any important functionality.
I've tried many Linux flavours and some BSDs. In the end I've always come back to Xfce. I haven't tried the more recent Kubuntu releases though, having put that off thinking I'd migrate away from a Canonical-managed OS.
Not really. But I have introduced (enforced ...) deepin [1] to my wife, who is definitely not a "computer person". She essentially uses only WPS, WeChat, and Chrome. She did complain a few times in the beginning, requesting me to "get her Windows back", but I resisted and the complaints somehow stopped after the first month. This may be a sign that the system is not toooo difficult to use, and that I am an extremely lucky man with an extraordinarily forgiving wife --- I do not advise you to try the same unless you are as lucky!
As someone who works across the stack, I've come to really appreciate seeing "LTS", and I think for me that comes directly from growing up as a kid in technology with Ubuntu, understanding that it means people are committed to supporting something for the long term.
Obviously I know there are business cases for this sort of thing generally, but as a kid first learning what LTS meant, I've always appreciated Ubuntu for this.
After 18.04 LTS the distro has been headed in questionable directions.
However, it is the only distro that came very close to unifying the desktop, mobile, server and embedded application spaces. In a way it greatly impacted how people approached designs, as the classic heterogeneous build circus approaches often became a trivial role assignment.
It takes a bit of work to get the current builds "usable", but the FOSS curse now tightly couples release cycles to specific application compatibility versions. Or put another way... everything is perpetually Beta eventually, or becomes a statically linked abomination. This is the very real consequence of the second system effect: https://en.wikipedia.org/wiki/Second-system_effect
At least they haven't jammed an AI search indexing snitch into their interface... yet...
I was about to comment that even the 5-year standard support of LTS releases seems to end before I'm ready, but I looked at the release cycle page and 24.04 is posted with a 10 year (ending Apr 2034) standard support lifetime. Is that a typo or did they put the "pro" (paid) support end date in the wrong column?
Interesting observation: in 20 years the installer ISO size grew almost ten times, from a 643M CD to the current 5.7G image, which won't fit even on a single-layer DVD (not that anyone is still using those).
I was also surprised at how little difference "default selection" and "extended selection" make in disk space usage - "extended" takes up only about 1GB more.
I really wish net installers were still popular. For a while, most distros offered a tiny install image, like a few hundred MB that contained just enough to bootstrap the installer to pull the real OS data from the internet (or other network source).
Sometimes I only have small USB2.0 flash drives. I can't fit a full-fat installer on there, and even if I could, my network connection is much faster anyway.
Hell, installers don't even do on-the-fly updates anymore. You install whatever stale packages are in your install media then go through the process of re-downloading and updating every package anyway.
An offline installer makes obvious sense, but in this modern age, an online installer is superior in every way.
> which won't fit even on a single-layer DVD (not that anyone is still using those).
I do sometimes (even have one of those tall containers full of empty DVDs :-P) but not for installing Linux - I haven't tried it but I think even an old Pentium III machine I have around here could install Linux from USB.
One nice benefit of optical media is that it's hard read-only by default. This makes it easy to ensure the install media does not get corrupted by overwrites or malware.
No, it's not read-only. It's just that the writes are somewhat random, and in the control of God and physics rather than of human design.
(speaking as someone with a big pile of CD-Rs in the attic, most of which have some forms of corruption on them)
I'd love to see a standard like M-Disc in mainstream use. The problem is optical has not kept up with magnetic. M-Disc is about $100 for 100GB. In contrast, I bought a 20TB HDD for ≈$200-300, so about $10/TB, so 100x cheaper. It's as cheap to buy a HDD every year and make a full copy for a century as it is to buy M-Disc.
I don't think that's fundamental, so much as economies of scale. Optical should be cheaper per byte, more stable, and write-once, but the CD was invented in 1982, the DVD in the nineties, and we've only made limited progress since then. HDDs were on a rapid growth curve until SSDs came in. Today, SSDs are on the growth curve, and I expect they will eventually be cheaper than magnetic or optical.
Optical made advancements beyond the DVD; however, they caught on only in a limited way. There is Blu-ray, now at 128GB with 4 layers. However, due to the amount of data we generate and consume, long-term storage is less of a concern at the consumer level, i.e. there is almost always more where that came from. Content has, simply put, been commoditized.
I never said they didn't, and indeed I cited 100GB optical media. I said they made _limited_ progress.
In 1982, a 20MB HDD was considered large, while a CD held 640MB. That's an almost insurmountable ~30x advantage to optical.
By the late nineties, a DVD was 4.7GB, while typical HDDs were maybe 500MB-2GB, giving a more modest advantage to optical.
In 2024, a HDD is maybe 200 times bigger than optical (20TB versus 100GB), while an SSD is maybe 10x bigger (1TB versus 100GB).
Prices are also worth looking at. 100GB media is maybe $10/disk. I remember buying CD-Rs and DVD-Rs in stacks of 20-100, at maybe 10 cents-$2 per disk, depending on type, quantity, and year. The cost-per-byte for optical media has hardly changed in two decades.
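Running the rough numbers quoted in this thread (illustrative figures from the comments above, not current market prices), a quick sanity check:

```python
# Rough sanity check of the figures quoted in this thread
# (illustrative numbers from the comments, not current market prices).
mdisc_per_tb = 100 / 0.1      # $100 per 100 GB  -> $1000/TB
hdd_per_tb = 200 / 20         # ~$200 per 20 TB  -> ~$10/TB
print(f"M-Disc ~${mdisc_per_tb:.0f}/TB vs HDD ~${hdd_per_tb:.0f}/TB "
      f"(~{mdisc_per_tb / hdd_per_tb:.0f}x)")

# cost per GB, then vs now: ~700 MB CD-Rs at the low end of the quoted range,
# versus ~$10 for a 100GB disc today
cdr_per_gb = 0.15 / 0.7       # ~$0.21/GB
big_disc_per_gb = 10 / 100    # ~$0.10/GB
print(f"CD-R ~${cdr_per_gb:.2f}/GB then, 100GB optical ~${big_disc_per_gb:.2f}/GB now")
```

Same order of magnitude per byte then and now, which is the point: optical capacity and pricing have barely moved while magnetic kept scaling.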
The container image for 24.04 is different from that of 22.04 and 20.04. The 24.04 container includes an "ubuntu" user with a UID of 1000, whereas the previous containers shipped with only a "root" user. The "ubuntu" user does not have sudo enabled by default.
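If you want to confirm this yourself, here's a quick, hedged check (assumes Docker is installed and can pull the public ubuntu images):

```python
# Ask each image who UID 1000 is. Expect no entry on 22.04 (root-only image)
# and an "ubuntu" entry on 24.04.
import subprocess

for tag in ("22.04", "24.04"):
    out = subprocess.run(
        ["docker", "run", "--rm", f"ubuntu:{tag}", "getent", "passwd", "1000"],
        capture_output=True, text=True,
    )
    print(f"ubuntu:{tag}: {out.stdout.strip() or 'no user with UID 1000'}")
```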
I'm curious: does bringing the goodies of corporate IT and manageability
> Ubuntu's Active Directory (AD) Group Policy client, available via Ubuntu Pro, now supports enterprise proxy configuration, privilege management, and remote script execution. It also continues to support AD Group Policy Objects.
make a small step towards wider adoption of Linux at work?
From my perspective, having just those two - "Privilege management and the ability to remove local admin accounts" and "Remote script execution" [0] - opens the gates to considering Ubuntu for wider adoption by orgs.
That’s an Xwayland thing. If your Obsidian has no Wayland support, it will be blurry on any distro. It looks like newer versions use a newer version of Electron that supports Wayland.
Will be installing in a week or two. I'm not a fan of snap, and I know Mint doesn't use it (by default, anyway), but I'm used to Xubuntu and will stick with it for now.
Apparently, the upgrade path isn't quite ready yet. If you're installing now, do a fresh install.
They use snap more and more with each release. It's why I switched over to Pop OS. I like flatpak since it's not baked into the distro. I don't mind if sometimes a theme or something doesn't match the system one. I'm not one of those types who is "shocked" and "jarred" by GUI differences.
I get the idea behind snap, flatpak, and appimage. But what I don't like is:
1.) Config file locations end up all over the place depending on which one you use. As one example, I like taking my .thunderbird data and just dropping it from one system onto the next; snap makes that harder (see the sketch after this list). Likewise for firefox - the snap version is behind as well.
2.) It solves a problem already long since solved in linux systems - package management. You still need APT or whatever the distro is built with. Two systems solving the same problem often seems like one cook too many. Hell, even Homebrew on macOS is a pain.
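As a small illustration of point 1, here's a minimal sketch that looks for the Thunderbird profile in the classic location and the snap location. The snap path is my recollection of where the snap confines its data; double-check on your own system before copying anything:

```python
# Find the Thunderbird profile directory, assuming the classic deb keeps it
# in ~/.thunderbird and the snap keeps it under ~/snap/thunderbird/common/
# (the snap path is an assumption; verify it on your machine).
from pathlib import Path

CANDIDATES = [
    Path.home() / ".thunderbird",                                      # deb / tarball
    Path.home() / "snap" / "thunderbird" / "common" / ".thunderbird",  # snap
]

for candidate in CANDIDATES:
    if candidate.is_dir():
        print(f"profile data lives in: {candidate}")
        break
else:
    print("no Thunderbird profile directory found")
```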
Will I finally be able to use a bluetooth headset's high quality audio output and its mic at the same time without significant changes in the terminal?
It's getting embarrassing that every other mobile and desktop OS has no problem doing this but not Ubuntu.
The "solution" other vendors take is to maintain the high quality bluetooth audio output to your headphones while using your laptop's microphone as input. They basically don't use your headset's mic in order to keep your headset in A2DP mode for high quality audio. You can also easily configure this in PulseAudio Volume Control.
You can't work around the fundamental limitations of Bluetooth protocols like A2DP without cheating and using another microphone somewhere else. You can't use the headset mic when receiving high quality A2DP audio, ever, no matter how hard you try. It is a limitation of the Bluetooth standard.
When you switch the device into headset mode the quality turns to potato because it uses bidirectional audio codecs from the stone age in HFP/HSP mode with mSBC or (god help us all) CVSD.
I think this is the result of how bluetooth itself works for bi-directional voice headsets, no? I do get bad quality on every OS when I have to do this, unless it has to do with the audio codec.
AFAIK Windows gets around this by having audio devices for the A2DP profile (i.e. audio output only) and the HFP profile (i.e. audio output and input) and having both be active at the same time. The A2DP device is set as the "default device", which means if you play Spotify or whatever, the audio goes through that and you hear high quality audio. The HFP device is set as the "default communications device", which is what apps like Teams or Zoom are supposed to use, and has shitty audio quality.
> It’s because of the limited bandwidth in Bluetooth that it has to lower the audio quality
It's not a bandwidth limitation. According to Google, A2DP supports up to 728 kbit/s, but SBC (the default codec) only goes up to around 300 kbit/s. Clearly there's enough bandwidth for mid-quality input and output streams.
It is much more complicated than that. That 700kbit/s figure is theoretical performance under ideal conditions: one device, one host, no interference.
In the real world, you get a lot less bandwidth. In my office building I see 300-400 kbit/s of real throughput. The radio also has only one transceiver; if you connect multiple devices, they have to take turns transmitting, which cuts your bandwidth roughly in proportion to the number of nodes.
Also remember that Bluetooth and WiFi share the same spectrum. A high power or very busy WiFi network nearby will also drop Bluetooth bandwidth. You also get a precipitous drop in bandwidth if there are other Bluetooth devices nearby, as they all have to share the same spectrum.
There's a lot of effort put into mitigating these problems, but either way the real world performance of Bluetooth is much, much lower than theoretical figures. Streaming useful audio over a link like this is not a trivial problem.
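To make the arithmetic concrete, a back-of-the-envelope sketch using the rough figures from this thread (treat them as illustrative, not measured specs):

```python
# Theoretical A2DP ceiling vs. the ~300-400 kbit/s seen in a busy office,
# shared round-robin across connected devices.
A2DP_MAX_KBPS = 728        # theoretical A2DP ceiling quoted above
REAL_WORLD_KBPS = 350      # mid-point of the observed 300-400 kbit/s
SBC_STREAM_KBPS = 328      # typical high-quality SBC bitrate

for devices in (1, 2, 3):
    share = REAL_WORLD_KBPS / devices   # devices take turns on one radio
    verdict = "enough" if share >= SBC_STREAM_KBPS else "NOT enough"
    print(f"{devices} device(s): ~{share:.0f} kbit/s each, {verdict} for one SBC stream")
```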
At work, I'm currently trying to cram ten separate audio streams into a Bluetooth link for reasons. We're switching to WiFi.
Only in one direction. The microphone is only used by the HFP (hands-free profile), which does not support higher quality codecs. You can't have HFP and audio sink active at the same time, so you are stuck with telephony grade codecs any time the mic is active.
Linux in general has been the most reliable of any OS I've used in the last 5 years or so. Well, except maybe Android.
Windows is the worst of the worst, but every Linux distro I've tried works great and actually supports every feature of Bluetooth. Unlike Windows, which implements only the barest minimum, and even then about half of those features just don't work.
I’m looking to do something like autoinstall for my office, but is there an easier way to have my employees (remote users) make their own thumb drive and get the config they need from me? I’m looking for something like MDM/Intune enrollment that’s easy and configures things a certain way. I’m not in a position to use and manage Puppet.
The installer (subiquity) supports "autoinstall" config files embedded in the install USB/ISO if you want. You can also just point the USB/ISO's autoinstall at an HTTPS address so an update doesn't require spinning new install media (this does require the install to start with some form of network access). If you only have wildly technical users, you can even have them type the autoinstall web address into the GRUB entry before the standard install media autoboots (this falls apart the second someone isn't ultra technical). In any of these cases the autoinstall file can handle both the installer configuration and post-install custom scripts to do whatever you need, and then it's done.
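For reference, a minimal sketch of the kind of autoinstall user-data file I mean, generated from Python so you can template it per machine. Field names follow the autoinstall docs as I remember them, and the hostname, username, and URL are placeholders; verify against the current subiquity reference before relying on it:

```python
# Write a minimal autoinstall user-data file (cloud-init format).
import yaml  # pip install pyyaml

user_data = {
    "autoinstall": {
        "version": 1,
        "identity": {
            "hostname": "office-laptop",             # placeholder values
            "username": "deploy",
            "password": "<crypted-password-hash>",   # mkpasswd output, not plaintext
        },
        # post-install hook: pull your real config from a server you control
        "late-commands": [
            "curl -fsSL https://config.example.com/bootstrap.sh | chroot /target bash",
        ],
    }
}

with open("user-data", "w") as f:
    f.write("#cloud-config\n")
    f.write(yaml.safe_dump(user_data, sort_keys=False))
```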
For long term management/enrollment you'd need to look at something beyond just putting stuff on the installer USB/ISO. Ubuntu has Landscape for this but there are some alternatives. Just depends on what you need to manage long term and what you can realistically do (e.g. you note you aren't in a spot to manage something like Puppet).
One advantage of standardized workstation OS images is that bugs/updates/compatibility only require a single support-ticket solution.
Deploy a gpg-signed public script that periodically downloads and installs updates from a public server, i.e. anyone who has to update knows the package is from you, and the machine role is pre-defined by you with a config file in "/etc/example/myhost.conf". If secrecy is required, then publish host-specific encrypted payloads named for their primary interface MAC.
This is how to handle clowns pulling drives in colocation data-centers.
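A rough sketch of that verify-then-apply loop. The URL and filenames are placeholders; the only real dependencies are gpg and a signing key you bake into the machine image:

```python
# Download an update script plus its detached signature, verify with gpg,
# and refuse to run anything whose signature doesn't check out.
import subprocess
import tempfile
import urllib.request

UPDATE_URL = "https://updates.example.com/site-update.sh"   # hypothetical
SIG_URL = UPDATE_URL + ".asc"

with tempfile.TemporaryDirectory() as tmp:
    script = f"{tmp}/site-update.sh"
    sig = script + ".asc"
    urllib.request.urlretrieve(UPDATE_URL, script)
    urllib.request.urlretrieve(SIG_URL, sig)

    subprocess.run(["gpg", "--verify", sig, script], check=True)  # raises if invalid
    subprocess.run(["bash", script], check=True)
```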
First impressions of the 24.04 desktop compared to 22.04: it feels more polished. A lot of apps on 22.04 had large window decorations around them, taking up way too much vertical space by default.
The foundation is way better than any other OS, so if they just keep improving the UI/UX then it would be a dream to use.
Yes, it's still a problem. People will tell you that there's "bad hardware", and maybe there is, but here's the thing: Windows will probably work with it, even if it's "bad".
With Lenovo it could be either. Thinkpads have historically worked pretty well with Linux. The more consumer oriented stuff can be hit or miss. Generally the more newfangled features (dual screens, detachability, extra eink displays, etc.), the worse the support. Thinkpads also tend to last longer, so there’s more time for driver support to catch up before they’re broken or obsolete.
"It could be either." This isn't a fact. It's not written down anywhere that anybody can look up. It is essentially lore, rumor, hearsay. That is not a firm foundation for building confidence in a platform.
I don't know. Depends on whether it has devices where manufacturers have written drivers only for windows or also for Linux. Sometimes vendors also hide information on how firmware works to prevent other people from writing drivers for Linux.
Of course end users think Linux doesn't support that hardware, when reality is that the device manufacturers don't support Linux.
It doesn’t matter, the end result is that you are substantially more limited in hardware choice and generally have more trouble with Linux on newer hardware. So if you’re not invested in Linux ideologically, it’s difficult to recommend, for laptops in particular.
That's right. Reading the above thread, it seems to me the distinction between "good" and "bad" hardware is made by whether it works with Linux, but that's circular reasoning. Moreover, the intention behind my comment about my new Lenovo laptop was that it can't simply be a matter of unusual low-volume hardware running into support issues.
Poor support for new hardware, yes. Every OS is a series of tradeoffs (Linux lags in new hardware and commercial software, Darwin supports less hardware than any other major OS, NT views the user as the product); you have to decide if the advantages or disadvantages matter more.
Obviously, it's a matter of trade-offs. However, these trade-offs aren't written in stone. If Linux lags in hardware, new or old, that is in part itself a trade-off made by its developers, and if those developers want more people to use Linux, perhaps a change in their priorities would make that more likely. Devote less time to polishing new graphical installers so that Bluetooth is rock solid on more hardware, and maybe more people will use Linux. Or, maybe they won't. That's for the market to decide. But, clearly there are market participants (some of them in this thread) who wish hardware support had a higher priority than it does. Make of that what you will.
> Devote less time to polishing new graphical installers so that Bluetooth is rock solid on more hardware, and maybe more people will use Linux.
I can't imagine that there is any meaningful overlap between people capable of polishing install wizards (UX-centric userspace applications) and dealing with BT (kernel code and plumbing daemons), so it's not really a trade.
> But, clearly there are market participants (some of them in this thread) who wish hardware support had a higher priority than it does. Make of that what you will.
Are those market participants willing to pay for that work, in cash or code? TANSTAAFL.
> Are those market participants willing to pay for that work, in cash or code? TANSTAAFL.
Is Ubuntu demanding they pay for that work in cash or code? Obviously not, since Ubuntu generally offers it for free. You think they do that out of the goodness of their hearts? I don't. I think they benefit from people using their software even for free, otherwise they wouldn't do it. Whatever that benefit is, they'll get less of it if people reject their software because Bluetooth sucks (for example). Suppose that gives them incentive to do something about it. Then what's the problem? Sounds like an efficient market interaction to me.
> I can't imagine that there is any meaningful overlap between people capable of polishing install wizards (UX-centric userspace applications) and dealing with BT (kernel code and plumbing daemons), so it's not really a trade.
Ubuntu pays developers. The more they pay one kind of developer the less they're able to pay other kinds of developers. So yeah. That really is a trade-off for Ubuntu.
Sure, unless you've already purchased the hardware, or had it purchased for you outside of your control.
All I'm saying is, if the Linux developers made a better product, probably more people would use it. That is entirely independent of the fact that you can say the same thing about the hardware manufacturers.
Hardware choice being limited is a problem Linux has. You are defensive about it not being Linux’ fault, but the point is that users generally don’t care whose fault it is.
I’m a Linux user, by the way; but only on my servers.
Definitely it is a problem linux has, and users do not care and sometimes shouldn't care whose fault it is. But if someone asks, then it's okay to give this perspective in my opinion.
There are other reasons not to like linux. Like needless fragmentation due to dynamic library versions, like unstable desktop environments for years, like lack of commercially supported desktop clients for basic office stuff like email and calendar (I have used KDE, Evolution and Thunderbird and all of them feel just slightly underpolished). So, :shrug:
It's ok, but what I'm reacting to is any implication that it's simply a matter of the hardware being "bad", with a figurative shrug of the shoulders and the further implication that the fault, if there is one, lies squarely with the hardware manufacturers for making "bad" hardware and with the users for choosing it. Let's set aside value judgments and the assignment of blame and agree that if Linux had better hardware support, probably more people would use it, shall we?
Lenovo is a good brand and the one issue I've had with several models is Bluetooth audio. You and I had different experiences with the same brand of hardware. To a potential new user considering Linux, do you think our exchange would increase or decrease their confidence about adopting Linux?
If you're using new hardware it's a crap shoot, unless you get a machine ready for linux or you do a bunch of research on hardware with known linux support. It's always been this way, although the situation has gotten better over the years.
Ah good times. I started out with one of the early ones, maybe 4.04 or something like that. Ubuntu was my first Linux, and then Mint. Unfortunately, for work purposes, I had to switch to MacOS (not a bad OS), but I still miss Linux from time to time, especially reading about that story on the frontpage about people losing access to their Apple IDs. I think I'll give it another shot soon...
I probably shouldn't say that but I have been daily driving Ubuntu for about 10 years. There are certainly 'rough' edges but anyone technical should be comfortable getting it done.
Increasingly I think Windows is a self-driving automatic car, whereas Ubuntu is a manual car that you have to drive. Sure, you'll have to bang around under the hood occasionally, but if your job is working on cars, it shouldn't be a problem.
The common refrain of Bluetooth not working is completely gone in my opinion. I haven't had to mess with BT since I put a BT dongle in and installed Blueman. My current issue is that if I boot into the most recent kernel, my Wifi doesn't work. No problem, just boot into the last kernel. I am going to guess the wifi thing is probably my fault because I _abuse_ my installation with all kinds of weird things. Again, it's a manual car that lets you bang around under the hood.
I have seen others work on Windows and it is astonishing to me how often their development environments break. I have to get on calls with them and try to step them through fixing their environment because a) Windows sucks, b) it's an 'automatic car' that doesn't want you banging around under the hood and actively tries to stop you, and c) it doesn't use an sh-style shell, so you have to specifically cater to it.
Desktop users who last used, and liked, Ubuntu during the pre-Unity/pre-GNOME-3/pre-Wayland era (before 2011 or so) may not like it so much these days.
It's a very different default experience now.
While I have fond memories of the earlier releases of Ubuntu, the more modern releases have been pretty much unusable for me any time I've tried them, unless I redo the default desktop environment with something sensible. Snap certainly hasn't helped the overall usability, either. Having to make so many changes pretty much defeats the purpose of using a distro in the first place.
I've also helped several non-technical and semi-technical macOS and/or Windows users who've wanted to try Linux. The ones who wanted to try Ubuntu first ended up much happier when I eventually introduced them to KDE running on X.
4.10 was the initial release, afair :) With 6.06 (postponed by two months to "get the release right", due to the extended support cycle) as the first LTS release. I fondly remember receiving 5.xx release CDs in the mail and distributing them on our uni campus and in schools back in the day!
[1] Xubuntu 24.04 released! https://xubuntu.org/news/xubuntu-24-04-released