Linux Problems





Linux is suffering from Fragmentation





Desktop Linux is 400+ distros flying (as much as penguins fly) in loose formation.

We should try to shift the culture toward some consolidation instead of everyone creating new distros and apps. Who needs 400 distros and 40 different tweak-OS-settings apps ? How about 20 and 3 ?



Prices we pay because of fragmentation:

Paraphrased from Dmitry Vyukov's "Syzbot and the Tale of Thousand Kernel Bugs" (video) (9/2018):

Every new distro represents a forking/multiplication/replication of the existing bugs in the original kernel code, user-space code, and apps. Many of the distros "handle" the huge steady flood of bug-fixes and security fixes from upstream by ignoring it: freezing on a specific release of the upstream distro or kernel. This keeps bugs (including security holes) in place for years, to bite people again and again.



From discussion 2/2020 on /r/windows:

> What made you switch back to (or come to) Windows
> as your primary system after using Mac/Linux?

I used to use both but realized if you want to consume multimedia content comfortably you have to have Windows.

Linux is great for servers and stuff like that, but as a daily OS it sucks. I tried a bunch of different distros but there always seemed to be an issue with drivers, apps or compatibility. Ubuntu, Mint, Debian variants, Fedora. They work OK but there always seems to be some form of tinkering requirement on a regular basis. Ain't nobody got time for that!

...

Linux is good if you want to waste 3 days getting your graphics card to work with 3D function. Seriously, f*ck Linux. Waste of time system on a PC.

...

Before switching to Windows 10 I tried giving Linux a final chance because I was going to wipe my system anyways. Ran into a driver issue with a RAID card. Downloaded the driver from Intel ... and it was in the wrong package format. F*ck Linux. Constant problems like that.

Edit: fix your goddamn stupid driver support instead of creating a new distro every week!



From someone on reddit 5/2020:

Linus Torvalds On Future Of Desktop Linux (video; see minutes 3-5) (6/2019)

The creator of the Linux kernel blames fragmentation for the relatively low adoption of Linux on the desktop. Torvalds thinks that Chromebooks and/or Android is going to define Linux in this aspect.



From Chris Fisher on Linux Unplugged podcast episode 358 at 1:02:15:

There are so many areas where it feels like you're running a desktop environment on top of a command-line environment which is running on top of a kernel. You can feel that stack, sometimes.



From someone on reddit 5/2019:

My opinion is that the very things that we Linux users love about our platform are the same things that have prevented it from becoming a real contender as a viable desktop alternative to Windows or macOS. Linux is all about choice. Unfortunately that choice has splintered Linux into 300 active distributions. There are now at least 18 desktop environments. There are over 40 music players alone to choose from. There are even more than 20 init systems. The choices go on and on, and forks, which can be done by any person or group, add even more confusion. Can you imagine trying to manage a help desk for mainstream Linux users who are lay people who purchased a computer running Linux pre-installed? It would be a nightmare! Sure there are some hardware vendors who ship Linux systems, but those are aimed at developers and Linux geeks like us, not at mom and dad.



[Even what may seem to be a single project may not be:]
From someone on reddit 6/2020:

In GNOME, just like in most other OSS projects, there is no "leader" that decides while the rest listens. GNOME itself for example is just a collection of projects, where each maintainer for their own project decides what to do with it. They agree on some things like a release schedule, and try to follow the GNOME HIG, but that's basically it.



Summarized from Frank Karlitschek talk 2019 Linux App Summit (video):

Desktops (GNOME, KDE) are fine, and were okay 10 or 20 years ago. Base apps (file managers, mail clients, browsers) are good. What is missing is the third-party app ecosystem.

Missing from / problems in Linux: central developer portal, stable APIs, consistent desktop APIs, consistent desktop functionality (e.g. no systray on some DEs), cross-platform toolkits and libraries (e.g. GTK on Windows and Mac), packaging (app release cycle gets tied to distro release cycle; snap/flatpak/appimage are promising but there should be only one; Electron is a symptom of devs avoiding Linux APIs).

Can we somehow merge/coordinate KDE/GNOME/Gtk/Qt a bit ?

Open app stores separate from distros (e.g. Flathub, but even better if federated).

Agree on one packaging format.

Integrate packaging into IDEs. You should be able to push a button in the IDE to package an app and publish it into N stores.

We (the world) need a free and open desktop. We need it for privacy, freedom.



Tobias Bernard's "There is no 'Linux' Platform (Part 1)" (and read the comments)
Tobias Bernard's "There is no 'Linux' Platform (Part 2)" (and read the comments)
Emmanuele Bassi's "Dev v Ops"
Ask Noah podcast, episode 152 "Too Much Choice" (audio; mainly minutes 8-20)
Dmitry Vyukov's "Syzbot and the Tale of Thousand Kernel Bugs" (video) (mainly minutes 7 through 11)
V.R.'s "systemd, 10 years later: a historical and technical retrospective"
We have met the enemy, and he is us.



Another facet: Linux has a "hoarding" problem:

We keep adding new stuff without ever getting rid of old stuff. So the junk keeps building up, the overall system keeps getting more complex, the available resources keep getting more and more diluted, and there is more and more duplication of effort.




Areas that should be "consolidated" a bit:

Installing / updating / package management is far too fragmented (see for example DistroWatch's "Package Management Cheatsheet"); we need a couple of standards, and push the various apps and services to use those. Relevant (but 2014): Pid Eins' "Revisiting How We Put Together Linux Systems"
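For example, installing one package takes a different command on each major family. These are illustrative commands (the package name "foo" is hypothetical, and exact flags vary by version):

    sudo apt install foo        # Debian, Ubuntu, Mint
    sudo dnf install foo        # Fedora, RHEL
    sudo pacman -S foo          # Arch, Manjaro
    sudo zypper install foo     # openSUSE

Every tutorial, support script, and third-party vendor has to special-case all of these (and then snap/flatpak/AppImage on top).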



How to do it:

"Some consolidation" is not something one developer can do. We need to change (by persuasion) the culture, the attitudes, of the major devs and managers in the community.

...

Many people don't like to hear this. "HE'S SAYING NO ONE SHOULD DO ANYTHING NEW. HE WANTS TO STOP ME FROM DOING WHAT I WANT. HE WANTS LINUX TO BE LIKE MICROSOFT OR APPLE. HE'S EVIL, BURN HIM !"

Actually, what I'm saying is that the adults, the devs who do the big work and run the major distros, should think about ways to consolidate things a bit. For example, do Ubuntu, Lubuntu, Xubuntu, Kubuntu, Ubuntu Budgie, Ubuntu Kylin, Ubuntu MATE, Ubuntu Studio, Cubuntu, Fluxbuntu, Ubuntu Mini Remix, UbuntuLite, Mint and many more (see Ubuntu wiki's "Derivatives") all need to be separate distros, or can they be one with various install and config options ? There would be a benefit to the community, in terms of mindshare and bug-fixing etc, if they could be one. Maybe there are technical reasons they can't be; I'm no expert. And I'm sure there are organizational/political/legal/strategy conflicts that would prevent some of this. But I'm putting forth the idea. Having 400 distros (see GNU/Linux Distributions Timeline) imposes costs and holds back Linux.

If all the *buntu's and Mint*'s and Elementary OS became one distro "Ubuntu+", then when you fix a bug in that one distro, it's fixed in all the combinations. One distro name ("Ubuntu+"). One installer. One set of release images. One repo. One set of tests. One bug-reporting and bug-tracking system. One set of documentation.

I'm told Debian does something like this ? Near the end of the installer, it gives you a list of available DEs and says "pick one". One ISO and installer for all N configurations.
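(A sketch of what that looks like, assuming a standard Debian install; the same choice is also exposed after installation through the tasksel tool:

    sudo tasksel    # presents a checklist of DEs: GNOME, Xfce, KDE Plasma, MATE, etc.

One codebase, one repo, N desktop configurations.)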

I'm told Manjaro GNOME has something a bit like this ? In Manjaro "Hello", there is "GNOME layout manager" ?

[From someone on reddit 6/2020:
"OpenSUSE lets you try different DEs just by logging out. There's only one distro OpenSUSE and it comes with KDE, Gnome, Xfce, Enlightenment, Mate, LXDE, LXQT, and more."]

Some of the biggest problems are political. I'm sure one reason that distro Y forked off from distro X was that the Y devs/managers didn't agree with decisions made by the X devs/managers. They argued, split, and a fork happened. Merging back in, or even submitting changes back to upstream, would be very difficult.

...

We need variety and choice, but a reasonable level of it. We never should prevent random person X from creating a new distro. But we need more focus among the majority, the core, of the community.



Suppose other areas of the Linux/GNU ecosystem were more like the kernel and GNU ?

The Linux kernel, GNU, and util-linux generally work pretty well and don't have a lot of duplicate effort and forks etc.

Why is that ? Because each has a single owner and standard. This does not eliminate all "choice"; the kernel has pluggable drivers and modules. And it does not kill innovation; the kernel gets new features, new CLI commands get added.

So, suppose other areas of the Linux/GNU ecosystem were handled the same way ? Suppose there was an agreement that systemd was the only init system, and there was a clear central owner of systemd ? It has modular plug-ins, you can innovate on top of systemd, you can add units. The major projects all agree to (over time) rip out any old init structures and only use systemd.
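(To illustrate "you can add units": extending systemd is a matter of dropping in a small declarative file, not forking anything. A minimal sketch; the service name and command here are made up:

    # /etc/systemd/system/example-task.service
    [Unit]
    Description=Example one-shot task

    [Service]
    Type=oneshot
    ExecStart=/usr/bin/logger "example-task ran"

    [Install]
    WantedBy=multi-user.target

Then "systemctl enable --now example-task". Innovation happens on top of the standard instead of beside it.)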

Now, someone could refuse to accept this, and use their own non-systemd init system. But over time they would find fewer and fewer apps and devs and base distros supporting that. The costs of being different would get higher. Just as if they forked the Linux kernel and changed it, and based their distro on that forked kernel. Nothing stops them from choosing to be different, but they'll be fighting against the tide.

Similar with package formats. Suppose Red Hat and Canonical and Debian etc were to get together and say "look, let's try to reduce our differences. let's add the best features of rpm/dnf packaging to dpkg/apt, and then we'll all use the enhanced dpkg/apt, and eliminate any support for the old formats and managers".

Each of these changes would take many years. It would not be an overnight change. But with a clear new standard, slowly people/apps would adopt the new standard.



Corporate funding idea (you won't like it):

Suppose Red Hat was to take a tiny chunk of its billions and say to the Fedora, CentOS, Qubes OS teams: "we will fund you to help port your best features and apps back into base Red Hat, and try to reduce the deltas between our distros. we will allocate some of our devs to help you."

Suppose Canonical was to take a chunk of its millions and say similar to the Mint, Zorin OS, Elementary OS, Whonix, Pop OS teams: "we will fund you to help port your best features and apps back into base Ubuntu, and try to reduce the deltas between our distros, let you become more like 'flavors' of Ubuntu. we will allocate some of our devs to help you."

Suppose Red Hat and Canonical and Debian etc were to get together and say "look, let's try to reduce our differences. let's add the best features of rpm/dnf packaging to dpkg/apt, and then we'll all use the enhanced dpkg/apt". And the effort was staffed by employees of the corps, or funded by the corps.

Google and Microsoft and Apple have tons of money, and some stake in the success of Linux. Any way to tap their funding to implement some consolidation and increased commonality ?













Secure because Linux



Don't expect perfect security just because you're running Linux. You're still relying on a lot of applications and other software to be well-behaved.



Complexity and bugs:

From /u/ninimben on reddit:
"in 2017 the kernel had 454 CVE's which is more than one a day. in 2018 they had 170 which is in the ballpark of one every two days or so" [But these probably are greatly undercounting the actual serious bugs, maybe by a factor of 10x or more; most bugs are not assigned a CVE number. Not all serious bugs are externally exploitable, of course. But they are bugs.]

From Dmitry Vyukov's "Syzbot and the Tale of Thousand Kernel Bugs" (video) (9/2018):
Conclusions from fuzz-testing on the Linux kernel:
"Every 'looks good and stable' release we produce contains > 20,000 bugs.
No, it is not getting better over time.
No, this is not normal."
and
"The kernel does not have a bug-tracking system per se. There are mailing lists with traffic in them."
[Apparently Kernel.org Bugzilla is not well-used.]
Lack of bug-tracking

Follow-on talk: Dmitry Vyukov's "LPC2019 - Reflections on kernel quality, development process and testing" (video) (11/2019)

From Artem S. Tashkinov's "Major Linux Problems on the Desktop, 2020 edition":
Critical bug reports filed against the Linux kernel often get zero attention and may linger for years before being noticed and resolved. Posts to LKML oftentimes get lost if the respective developer is not attentive or is busy with his own life.

Counts of "todo" and other words in the kernel source code

As of 1/2020, the kernel had about 28 million lines of code, although most of that is drivers and code "around" the central kernel (article).

A visualization of complexity, not sure it's a problem: Kharacternyk / pacwall



There seem to be a number of serious design flaws or gaps:
Linux X-ray
Farhan's "Linux maintains bugs: The real reason ifconfig on Linux is deprecated"
madaidan's "Fixing the Desktop Linux Security Model"
Madaidan's Insecurities' "Linux (in)security"
Bjorn Pagen's "State of Linux Desktop Security"
Bradley Spengler's "10 Years of Linux Security" (slides) (video)
Matthew Garrett's "Linux kernel lockdown, integrity, and confidentiality"

Vivek Haldar's "How Unix Won"

For servers:
Sudhakar Dharmaraju's "GoodBye Linux: the next OS"



Open-source software:

Using tons of software created by many individual people, frequently updated, is an insecure situation. For example, apparently the node.js/npm registry has more than 1.2M packages in it as of 4/2020. There's no way all of those are checked and safe. Same for the code in Python's pip/PyPI ecosystem, Ruby's gem system, and others.
Modulecounts
Jarrod Overson's "Exploiting Developer Infrastructure Is Ridiculously Easy (The open-source ecosystem is broken)"

Bugs in open-source software, including that used by Linux or common apps/services on Linux, can go undiscovered for years. For example, The Heartbleed Bug

From Daniel Micay (lead dev of GrapheneOS, I think) on reddit 4/2019 (here):
It's just a fallacy that open-source is more secure and privacy-respecting. It's quite often not the case. There's also the mistaken belief that closed-source software is a black box that cannot be inspected / audited, when the massively complex hardware underneath is the real black box. A lot of the underlying microcode / firmware is also a lot harder to inspect.

From /u/longm0de on reddit 2/2020:

Many eyes prevent security backdoors and other security exploits, right? Or at least get them fixed faster? Statistically there is no real and significant data that supports open-source or closed-source software being more secure than the other. You can't easily gauge this statistic either, since many proprietary software suites incorporate open-source components as well. Closed-source software can also have "many eyes". Thousands to millions of individuals/entities can be looking at the source code of Microsoft Windows through the Shared Source Initiative. Our government certainly takes advantage of that program.
[Also see Apple Open Source]

From Artem S. Tashkinov's "Major Linux Problems on the Desktop, 2020 edition":
Year 2014 was the most damning in regard to Linux security: critical remotely-exploitable vulnerabilities were found in many basic Open Source projects, like bash (shellshock), OpenSSL (heartbleed), kernel and others. So much for "everyone can read the code thus it's invulnerable". In the beginning of 2015 a new critical remotely exploitable vulnerability was found, called GHOST.

Year 2015 welcomed us with 134 vulnerabilities in one package alone: WebKitGTK+ WSA-2015-0002. I'm not implying that Linux is worse than Windows/MacOS proprietary/closed software - I'm just saying that the mantra that open source is more secure by definition because everyone can read the code is apparently totally wrong.

Year 2016 pleased us with several local root Linux kernel vulnerabilities as well as countless other critical vulnerabilities. In 2016 Linux turned out to be significantly more insecure than often-ridiculed and laughed-at Microsoft Windows.

The Linux kernel consistently remains one of the most vulnerable pieces of software in the entire world. In 2017 it had 453 vulnerabilities vs. 268 in the entire Windows 10 OS. No wonder Google intends to replace Linux with its own kernel.
[But: many bugs are not assigned a CVE number, and it's not clear if different OS teams have similar reporting policies.]



From blakkheim's "Linux Security Hardening and Other Tweaks":

A common misconception about the Linux kernel is that it's secure, or that one can go a long time without worrying about kernel security updates. Neither of these are even remotely true. New versions of Linux are released almost every week, often containing security fixes buried among the many other changes. These releases typically don't make explicit mention of the changes having security implications. As a result, many "stable" or "LTS" distributions don't know which commits should be backported to their old kernels, or even that something needs backporting at all. If the problem has a public CVE assigned to it, maybe your distro will pick it up. Maybe not. Even if a CVE exists, at least in the case of Ubuntu and Debian especially, users are often left with kernels full of known holes for months at a time. Arch doesn't play the backporting game, instead opting to provide the newest stable releases shortly after they come out.



From Daniel Micay (lead dev of GrapheneOS, I think) on reddit 4/2019 (here):

The Linux kernel is a security disaster, but so are the kernels in macOS / iOS and Windows, although they are moving towards changing. For example, iOS moved a lot of the network stack to userspace, among other things.

The userspace Linux desktop software stack is far worse relative to the others. Security and privacy are such low priorities. It's really a complete joke and it's hard to even choose where to start in terms of explaining how bad it is. There's almost a complete disregard for sandboxing / privilege separation / permission models, exploit mitigations, memory-safe languages (lots of cultural obsession with using memory-unsafe C everywhere), etc. and there isn't even much effort put into finding and fixing the bugs. Look at something like Debian where software versions are totally frozen and only a tiny subset of security fixes receiving CVEs are backported, the deployment of even the legacy exploit mitigations from 2 decades ago is terrible and work on systems-integration-level security features like verified boot, full system MAC policies, etc. is near non-existent. That's what passes as secure though when it's the opposite. When people tell you that Debian is secure, it's like someone trying to claim that Windows XP with partial security updates (via their extended support) would be secure. It's just not based in any kind of reality with any actual reasoning / thought behind it.

The traditional desktop OS approach to disk encryption is also awful since it's totally opposed to keeping data at rest. I recommend looking at the approach on iOS which Android has mostly adopted at this point. In addition to all the hardware support, the OS needs to go out of the way to support fine-grained encryption where lots of data can be kept at rest when locked. Android also provides per-profile encryption keys, but has catching-up to do in terms of making it easier to keep data at rest when locked. ... iOS makes it easier by letting you just mark files as being in one of 2 encryption classes that can become at rest when locked. It even has a way to use asymmetric encryption to append to files when locked, without being able to read them.

Really, people just like saying that their preferred software stack is secure, or that open-source software is secure, when in reality it's not the case. Desktop Linux is falling further and further behind in nearly all of these areas. The work to try catching-up such as Flatpak is extremely flawed and is a failure from day 1 by not actually aiming to achieve meaningful goals with a proper threat model. There's little attempt to learn from other platforms doing much better and to adopt their privacy and security features to catch up. It's a decade behind at this point, and falling further behind.

Also, all these things about desktop Linux completely apply to anything else using the software stack. It doesn't matter if it's FreeBSD or whatever. FreeBSD also has a less secure kernel, malloc, etc. but at least it doesn't have nonsense like systemd greatly expanding attack surface written with tons of poorly written C code.

...

There are literally hundreds of serious, game-over vulnerabilities being fixed every month in the Linux kernel. There are so many vulnerabilities that vulnerability tracking and patching doesn't scale to it at all. It has no internal security boundaries. It's equivalent to running the entirety of userspace in a single process running as full unconstrained root, written entirely in C and assembly code rather than preferring memory-safe / type-safe languages. Watch this talk as a starting point: Dmitry Vyukov's "Syzbot and the Tale of Thousand Kernel Bugs" (video)

...

> you've said Flatpak is flawed. is Snap any better as an app sandbox?

No, not really. They're both fundamentally flawed and poorly implemented. They're a lot worse than even the very early Android sandbox from a decade ago before all of the work on hardening it and improving the permission model. They're approaching it completely wrong and treating it as if they need to figure out how to do things properly themselves, by not learning from existing app sandboxes.

... It's a fundamentally broken approach to implementing a sandbox. It doesn't draw an actual security boundary and fully trusts the applications. The design choices are being made based on the path of least resistance rather than actually trying to build a proper security model. There's a big difference between opportunistic attack surface reduction like this and an application sandbox, which these are not implementing. They cannot even be used to properly sandbox an application no matter how the application chooses to configure the security policies, even if the app is fully trustworthy and trying to do it. The implementation is not that complete. It could certainly be done properly but it would require a huge amount of work across the OS as a whole treating it as a unified project, along with a massive overhaul of the application ecosystem. I can't see it happening. It requires throwing out the traditional distribution model and moving to a well-defined base OS with everything outside of that being contained in well-defined application sandboxes with a permission model supporting requesting more access dynamically, or having the user select data as needed without granting overly broad forms of persistent access.



From /u/longm0de on reddit 2/2020:

[In the context of "why do people go back to Windows"]

I feel security is a massive burden put upon Linux developers. Linux was not made to be "the most secure" system in the world or even secure at all. Linux was made with portability in mind and with portability there can be conflicts with security mechanisms.

Take an actual look at the Linux kernel: for a long time it lacked security features that Windows NT had since its release. It's important to know that the NT lineage of Windows is not based off of, or even similar to, MS-DOS or OS/2-like Windows 95 etc. The NT lineage of Windows is initially based off of VAX/VMS (now known as OpenVMS) and still largely is based on that architecture as developed by Dave Cutler and his team. Windows NT from the get-go had users, roles, and groups as well as proper access control. NT contains discretionary access control lists as well as system access control lists which can be used for auditing, in comparison to Linux which relied on rudimentary RWX permissions with an owner-group-world philosophy. SELinux finally brought discretionary access control lists to Linux as well as mandatory access control. SELinux is a great thing and should be treated as such - it implements a form of MLS. Windows later on added a form of MLS known as mandatory integrity control. Nearly all objects in NT are securable with DACLs and auditing - processes, threads, sockets, pipes, mutexes, etc. - so NT has an underlying unifying security principle.

In later versions of Windows (such as Vista), UAC was implemented with the Administrator account so that even administrators didn't execute things as administrators, but had to explicitly grant permissions. It's stated that UAC is insecure because in normal implementations it is just a "yes" or "no". This is largely untrue: UAC in its current default configuration runs in Secure Desktop Mode, which prevents software input emulation as well as keylogging. In Linux, if I want to run a program elevated, I have to use the terminal, and on X11 I can just intercept the key events and then log the user's password without any high privileges. Where is the security in that? Windows has exploit mitigation policies which are VERY similar to HardenedBSD and grsecurity/PaX. Many Linux distributions don't even want to use grsecurity/PaX, and the kernel developers don't even want to support it because it may "break" some devices.

Again, Linux was made for portability, not security. It’s not exactly "insecure", but it’s not exactly secure either. Also, I don’t run any anti-malware on Windows (for resource purposes I even disabled Windows Defender by setting -X on core files it requires), and my computer hasn’t received any malware, and years back the only time my PC did receive malware was due to being socially engineered. There is nothing about Linux that magically prevents malware - nothing about its architecture as compared to Windows accomplishes this. When somebody can make an actual case about its architecture - I will change my mind. No, don’t point out access control that Windows already has. Windows on the other hand has driver signature enforcement, kernel patch protection, AppContainers, etc. You can even configure Windows so that the only applications to run with administrative privileges have to be digitally signed. There is a lot you can do in terms of security on Windows systems.
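[For context: stock Linux permissions are the owner-group-world rwx bits he describes, but POSIX ACLs (a separate mechanism from SELinux) do add per-user/per-group entries on most filesystems. A small sketch; the file and user names are made up:

    chmod 640 report.txt               # classic owner/group/world bits
    setfacl -m u:alice:rw report.txt   # grant one specific extra user read/write
    getfacl report.txt                 # list the full ACL

Still not as pervasive as NT's DACLs-on-every-object model, which is his point.]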



Not going to happen: RIIR: Rewrite Linux using Rust programming language:

An important point made by some people: we really should stop using the C programming language (created in 1972-73). It is not memory-safe or type-safe, and it lacks built-in concepts such as exceptions (it always just does something and keeps going), a managed heap, and real strings. Unfortunately, the Linux kernel and much of the user-space code is written in it. This leads to tens of thousands of bugs in Linux today, including security vulnerabilities. Maybe C is appropriate for very low-level system programming, as an alternative to assembly language. But not for apps and services and modules.

What is better ? Probably Rust.

Dominus Carnufex's "Rewrite the Linux kernel in Rust?"
LWN thread
Quora thread
Serdar Yegulalp's "4 projects ripe for a Rust rewrite"
tsgates / rust.ko
Joel Spolsky's "Things You Should Never Do, Part I"

This would not help/solve issues such as all of the kernel code operating in one memory address space at one processor privilege level (lack of compartmentalization). A bug in device driver X still could mangle something in iptables code Y, for example.

But it should help get rid of entire classes of errors such as buffer overflows and use-after-free.
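A tiny illustration of what the compiler would catch (my sketch, not from any real kernel code):

    fn main() {
        let v = vec![1, 2, 3];
        let first = &v[0]; // borrow an element of the vector

        // drop(v); // would not compile: cannot move `v` while `first` borrows it,
        //          // so a use-after-free below is impossible in safe Rust

        println!("first = {}", first);

        // Out-of-bounds access is a defined, checked failure rather than
        // silent memory corruption as in C:
        // let oob = v[10]; // would panic at runtime, not read stray memory
    }

In C, both mistakes compile cleanly and surface later as crashes or exploitable holes.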

People bringing up this idea have provoked a "so you go do it" reaction. "RIIR: You're telling existing devs to go do a ton of work." A fair point. Except that the work would be pointless if at the end the existing devs reject the new code. And indications are that devs all the way up to Linus Torvalds would reject it.

A rewrite would solve some classes of low-level problems, not fix bigger problems, be an ENORMOUS amount of work, and be resisted by the existing devs. Not going to happen.



Now Linux desktop users are using the same browsers etc as the Windows people are, so vulnerabilities seen on Windows are more likely to exist on Linux too. Same with PDF docs and Office macros. And with cross-platform apps such as those running on Electron or Docker. And libraries (such as the SSL library) used on many/all platforms. An exploit may work the same way regardless of the underlying OS type.



Your own actions can seriously compromise the security of your system:
From Easy Linux tips project's "Avoid 10 fatal mistakes in Linux Mint":
Software from third-party repositories (like PPAs) and from external .deb installers is untested and unverified. Therefore it may damage the stability, the reliability, and even the security of your system. It might even contain malware ...

Furthermore, you make yourself dependent on the owner of the external repository, often only one person, who isn't being checked at all. By adding a PPA to your sources list, you give the owner of that PPA in principle full power over your system!
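[The mechanics back this up. Adding a PPA (the PPA name here is made up) is two commands:

    sudo add-apt-repository ppa:someuser/some-tool
    sudo apt update

From then on, that one person's repository is trusted like any official one, its packages install as root, and its updates can replace core system packages. "Full power over your system" is not an exaggeration.]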













Some cautionary experiences with Linux



/u/sng_shivang's "Why I came back to Windows from Linux?"
Christopher Shaw's "My Adventure Migrating Back To Windows"

Gamers may have a tougher time with Linux than with Windows. Vendors target the biggest market first and best.

Same for video-editors and such.

Open-source software may be great, or may have one guy working on it occasionally and be really hit-or-miss.



From someone on reddit 4/2018:
> I'm sorry it's a bit of a rant and I might sound like
> a noob to you all, I'm really disappointed and not in a
> good mood at the moment. I've been using Linux only for at
> least 6 months and I've been in love with it when I decided
> to make the switch for good ... and I'm beginning to think
> it sucks. Tonight I had to do a simple slide-show for a client
> and I used mostly Shotcut but I tried Openshot and Kdenlive
> and the three of them was horribly buggy and a nightmare ...
> I really didn't enjoyed my experience and it pissed me off.
> I do not understand, most of those softwares have been in
> development for years and they look like in beta phases or as
> if only one person worked on it, but there's a big community
> and I keep seeing donations for open source I don't think money
> is really an issue. As for bugs I didn't even try to break them,
> they struggled with tasks such as fade in and fade out, transitions,
> adding texts, very basic stuff, I had to restart Shotcut like
> 4 times because it couldn't add the pictures on the timeline,
> and it's a well-known bug that is from around 2016. I'm on
> Ubuntu Mate and everyone says Ubuntu is a stable distro for gaming
> and doing work so I installed it. The only softwares that are
> stable to me is Blender, Krita, Gimp, Inkscape and Godot. As for
> Gimp it really is very powerful but there are some tools that are
> missing that Photoshop has, and if I go for the latest version
> it's very slow and not usable. I do a lot of multimedia and I
> don't think I will survive ... There's Natron, Fusion 9 that I
> didn't used yet but they are compositing softwares, I don't think
> I can do a lot with them as for video editing. It's already hard
> to not be able to play recent video games, if it also removes tools
> for working and being creative there's just no point to stay or to
> suggest it to anyone.

I think you have a wrong picture here. Most open-source projects indeed have only one (or very few) developers working on them, and get very few (if any) donations.

Ubuntu is mostly stable in the sense of "let's not change it after the release". That's great to avoid introducing new bugs, but not so great to remove old bugs.



From /u/BlueGoliath on reddit 9/2017:

As someone who previously used Windows and now uses Linux for 90% of my time now: If you are going to switch to Linux, be ready to deal with bugs, piss-poor UI design, hardware incompatibilities, and other issues.

Despite what you hear on tech sites about how great the Linux community is, it really isn't. If you complain about Linux you are most likely going to be met with one of the following: ... Yeah, it sucks, but that's the current mentality of the Linux community. Be ready for it.
[Re: "piss-poor UI design": probably not a problem if you spend most of your time in a browser, desktop, and a couple of major applications.]



From /u/OnlyScar on reddit 3/2018:

Around 6 months ago, I made the move to Linux. I am not a gamer, so it was easy for me. To make the experience more authentic, I installed linux on my main machine and didn't dual-boot Windows. It was only linux for me. It has been an interesting journey, but sorry, I can't take it anymore. Please note that I am strictly speaking as a non-developer, non-geek but a "power user". My reasons might not apply for developers and very technical users. Below are the reasons I am going back:

1) Windows vs Package Manager Repo System: Repeatedly I was told that the software repository and package manager system of linux is much superior to the Windows system of downloading .exes from developer sites. This is such a lie that it's not even funny. The reason: age of software. Win32 .exe softwares get updates independently from the base OS. You can use Windows 7 and guess what, your favorite softwares will all run at the LATEST version. I repeat, you can use Windows 7 and your Blender and Krita will be at the latest version. What's the version of Blender and Krita on Ubuntu 16.04 or 14.04? Is Ubuntu 14.04 even usable for normal desktop use anymore, considering its software repo age? And no, am not using any rolling distro or Fedora because their stability doesn't hold a candle in front of Ubuntu, Mint, Debian stable, Win 10 or macOS. Also I shouldn't have to upgrade my OS just to get the next version of a software. This is absolutely unacceptable and ridiculous. The fact that my softwares stay fully cutting-edge, up to date on Windows while the base OS stays the same is extremely important.

2) Security and BSODs etc: Contrary to FUD, Windows 10 is actually very secure unless you want to download softwares from crackedfreesoftwares.ru. You DO NOT need a separate antivirus; Windows Defender is now enough. It runs like a dream on most hardware. And Windows does NOT force upgrades in the middle of work. BSODs have long been a thing of the distant past. Basically am saying that repeatedly using the boogeyman of security, BSODs etc isn't working.

3) Atrocious Desktop Environments: My main reason for ditching linux. Linux DEs are such a sad joke compared to the Windows (or Mac) DE that it is not even funny. Let's start, shall we:

i: GNOME: The DE suffers from MEMORY LEAK for god's sake. Performance is pathetic, much much worse than Windows 10 or the Mac DE. This is also the main default desktop of the linux world, which actually says a lot about linux. It's absolutely unthinkable for us to even use a DE which suffers from an extreme memory leak, and the developers don't even show any intention of fixing it. It is just unthinkable on Windows. Gnome is also unusable out of the box, and you have to use random 3rd-party hack-job extensions just to get a basic fully functional DE. You need to download a software to get a simple minimise button. Simply unbelievable. And you guys, like a bunch of callous users, continue to support it and use it while happily doing ALT+F2 -> r. Lame.

ii: KDE - So, so many small random but crucial bugs that it is really impossible to list them all. They try to emulate Windows, and do a pretty poor job. For example, just use the "hover option" on the KDE task bar. See the quality of the preview. Do KDE devs even know how important that single function is? Small random bugs like this simply make it inferior to the Windows DE.

iii: XFCE - Thanks, but no thanks. It's 2018, not 1998. No hover option btw. Too basic and limited.

iv: Cinnamon - Too strongly tied to Linux Mint, a distro indulging in many questionable practises. Bad aesthetics. What's up with that huge square-like menu? And why does the menu size increase when I add favorites?? It's already too big anyway. It just looks like a cheap rip-off of Windows XP.

v: Mate - Still too basic compared to Windows.

vi: Tiling windows managers - Unusable and irrelevant for non-developers, non-geeks.

Anyway, for me default DE matters. Even if the perfect DE exists somewhere in the wild, if a distribution chooses a subpar DE, it says a lot about them and their focus on user-friendliness. And since most of the linux world has enthusiastically opted for Gnome 3, a pathetic subpar incomplete DE, it says a lot about you guys.

4) Sickening Hypocrisy of the Community: Let's start, shall we - i: Saw multiple caustic rants about how MS Windows 10 provides a poor inconsistent UI because of 2 settings menus (legacy and metro). And you guys say this while primarily using a jewel like Gnome 3. /s ii: Linux is all about control. Just don't expect a fcking minimise button by default on popular DEs like Gnome and Pantheon. OK got it. iii: The arrogance and know-it-all attitude of gnome devs and elementary OS devs will put the arrogance of MS and Apple to shame. But i guess that's okay cause they are your own. iv: Continuously compare Windows from 2002 to Linux from 2017 and try to prove your point about how the linux desktop is superior. Continuously attack MS for telemetry and control while happily using Google services and FB. Giving Apple a pass cause they are unix. The list goes on and on ...

5) Last but not the least, atrocious softwares - Yeah guys, accept it, LibreOffice and GIMP suck balls compared to MS Office and Photoshop. Krita gives MS softwares a run for their money, but LibreOffice and GIMP are simply cringy embarrassments. You will get fired if you dare to make a presentation with LibreOffice Impress in a corporate environment. It is so bad. VLC Media Player is outright bad compared to Pot Player on Windows. Nothing on linux compares to MusicBee on Windows. I won't even embarrass you guys by talking about JRiver Media Center. Most linux desktop softwares simply lack the features, polish and finesse of their Windows counterparts.

And no, it is not MS or Adobe's fault that those softwares are not available on Linux. You guys continuously rant about evil proprietary software. Upstream major distros like Debian and Fedora don't even include proprietary softwares in their main repos. Then why should proprietary software companies release their softwares on linux? What sort of a weird entitled demand is that? Why should proprietary software companies accept second-class treatment on linux and hear some caustic remarks from Gnome devs and Debian greybeards? It was up to you guys to provide a real 1:1 alternative to MS Office, Photoshop and various other proprietary softwares, and you guys failed.

And yes, hardware support and display quality are much better on Windows. The fault again lies with Linux. If you treat proprietary drivers and firmware as second-class citizens, don't expect hardware developers to go out of their way to support Linux. That's an unfair demand.

Bye. After experiencing Linux, my respect for Microsoft and Windows 10 has increased 1000 times.

IMPORTANT EDIT - REASON FOR WRITING THIS POST - These problems have bugged me since the beginning. But I came to linux at a tumultuous time, when Ubuntu had abandoned Unity (so Ubuntu Unity 16.04 is a dead horse), and Ubuntu 17.04 and 17.10 were only interim releases. So I cut the linux desktop and Canonical some slack and waited for the next LTS. Today I tried Ubuntu 18.04 Beta and guess what? Lo and behold, the glorious memory leak is still present. And my head just exploded in rage. :/ So much effort, so much time spent tweaking, so much distro hopping, so much anticipation to permanently shift was all for naught. That's why I made this salty post.
From /u/UncleSneakyFingers on reddit 3/2018:

I have the same experience as you. This is my first comment on this sub, but a lot of users here are living in their own universe. I see so many posts on the various Linux subs describing issues that are simply unthinkable. Windows just works, Linux just breaks. I still try learning Linux though just to increase my skill set. But going from win10 to Linux is like going from a Mercedes to one of those old cars you have to hand-crank to start up. It's just ridiculous.

So many users here are willing to spend an entire weekend fixing an issue with their Linux setup, but give up on Windows the first time they f*ck up something basic and get an error message. This sub has really turned me off from Linux in general. When they talk about Windows, it's like one of those infomercials showing someone trying to crack an egg and having it explode all over the place. Just ridiculous exaggerations with no bearing on reality.
From /u/tonedeath on reddit 3/2018:

... The most important point that he made (in my opinion) is that if you install a distro like Ubuntu 16.04.x LTS (a distro that is supposedly designed for non-techies, non-geeks, non-developers, you know regular computer users), a lot of the software in the repos is not the latest versions of things. If you want to run the latest versions, you probably end up Google-ing and finding out how to add PPAs. This is not hard but, it takes more effort and learning than downloading installers on Windows or Mac and then getting update notifications. Why should a user of any current version of a desktop distro not at least be offered to be updated to the latest version of apps? It's a valid criticism and it should be listened to and addressed. ...
From /u/knvngy on reddit 3/2018:

The Gnome thing is embarrassing. It looks amateurish; I don't understand what's going on there at Gnome HQ.

But truth be told: Linux has never been really polished, optimized and focused for the desktop. The focus of Linux has been: servers, IT networks and now embedded/mobile, where the money is. In the desktop department Linux is OKish - it can be used just fine - but I would agree that macOS and even Windows are better in that department.



From /u/ThePenultimateOne on reddit 3/2018:

Bluetooth audio is a pretty messy scene on Linux. For a long time I couldn't get any headset to work consistently on Kubuntu. You would have to go through this painful connect-disable-disconnect-connect-enable loop every single time.

Now I have things working on Fedora ... except for my laptop, which now consistently gets very out of sync. It didn't do this a month ago. It didn't do this on a previous version of Fedora. The whole thing sucks.



From /u/AlejandroAlameda on reddit 3/2018:

Once every few years, I try to give Desktop Linux another chance just for the kick of it. Here's my recent experience with Linux Mint 18.3. Enjoy :) Installing Mint on real hardware then went quite smoothly, but: ...
From /u/MaxPayneNoir on reddit 3/2018:

And this is exactly why Linux desktop share is still ~2-3% (and not because it doesn't come preinstalled on laptops, as Torvalds instead assessed: ChromeOS is an already popular Linux only because it "just works").

Not that Linux doesn't work, it works perfectly (significantly less troublesome than Windows and macOS, efficient, lightweight, secure, performing, versatile, free, portable, privacy-keeping and well documented), but you need to learn how to use it. And relying on GUI stuff only is not the right way of using it. Linux is CLI. You may use Graphical apps all the day long, and that's perfectly fine, but system administration, configuration, maintenance, and troubleshooting requires you to type commands in a terminal or on a virtual console. And most people don't like the idea (or are too afraid) of getting their hands dirty on terminals.

Here lies the explanation for the fact that all the people I know who attempted Linux (even ~10 engineering, physics, IT, and computer science students forced to install it by their university), all but a single guy, dropped it after a while.

However if you bear it for the first 6 months you'll get accustomed to it, start appreciating it better and see reality for what it is, and probably never look back.



From /u/theth1rdchild on reddit 4/2018:

Hey everyone,

I've been using Windows since I was 4 in 1993. We had a Windows 3.1 box. I've worked in IT for a decade and I still do, but I have next to zero Linux experience.

How ... how does anyone do this? I tried to install Ubuntu Server 16.04 with RAID 1 and every single step from partitioning on required googling and a restart of the entire process. I tried for eight hours just to get a bootable system on RAID 1 and things just kept going wrong. Half the information I was looking up contradicted itself, documentation is incomplete, and advice is anecdotal and missing important information. Screw it, I thought. I'll install desktop and get used to it before doing crazy stuff. RAID 1 was kind of a nice but not necessary thing. Surely a regular desktop install will allow me to learn and I can try again in a few months.

But holy sh*t, every single thing I want to do that would be as simple as "Google thing I want, Grab newest version from their website, Install or launch the exe" in Windows is a tedious stress-inducing headache in Linux.

As example: Google for a program to show sensor output like temperatures. Open hardware monitor looks cool. Oh it has dependencies. I don't know what mono is. Will it take up a lot of space or break anything else? Sh*t, I don't know. Oh, this forum post has another person trying to learn Linux and he wanted to use this program. Everyone is being rude to him. Oh, Linux can't interface with open hardware monitor very well. Why the f*ck was it the first answer on Google? There's no hardware sensor app like hwinfo for Linux? Okay, I'll search the Ubuntu apps for a temp sensor at least. There's only one. The only notes say that it needs something assigned in terminal to work. Why the f*ck doesn't the installer do that? Oh well, now I typed what it said to in terminal and it didn't take. I don't understand why. Oh, the official page on the app is misspelled for this command and I copied it directly. Okay, FINALLY I have a temperature sensor. And it doesn't display anything beyond the current core temp. Great.

As opposed to: "Google temp sensor. Find speedfan or hwinfo. Install. It runs."

Is the problem me? Is my windows brain just too stuck in a rut to understand why all this tedious BS is necessary?

I think at the least I need a decent explanation of why these are so different so I can maybe understand and work within my limitations better. Any guides I've followed are very straightforward "do ___ then do ___" so I haven't really learned anything about why Linux is the way it is, which seems necessary to functioning in it.

Thanks to anyone who read all that and can help.



From /u/zincpl on reddit 4/2018:

I just had to set up Linux on my new machine for work. It took 4 different versions before it would actually install, then it started booting to a blank screen when I installed the software I needed. It took me 2 days of non-stop frustration, but now I can finally do something productive.

Basically IMO Linux shouldn't be compared with Windows or Mac, it's made by engineers for engineers, it's not designed to be user-friendly, rather it's designed to give power to the user and assumes the user knows what they're doing.

It really sucks that there isn't really anything between over-priced and underpowered macs with *nix power and free-but-held-together-with-duct-tape linux.



From someone on reddit 5/2018:
So I stopped using [pirated] Windows a year ago since it was problematic. Buying is not an option. So I switched to Linux since it was free, open source, and I am a science student so I thought it would be pretty useful. A year has passed and I am still a noob (I was very busy with my exams already; learning Linux would have been a burden). I have a Dell Inspiron laptop with Intel HD Graphics 5500, 4 GB RAM and a 1TB hard disk. I have been switching distros and these are the experiences so far:
  1. Ubuntu 16.04 - Was good but it was a little slow. Plus it wouldn't detect my headphone half of the time.

  2. Elementary OS - Was extremely slow. Took 30 minutes just to boot to login screen.

  3. Return to Ubuntu 16.04

  4. Switching to Ubuntu 16.04 Budgie Remix - Was good. Better than the default Unity both in looks and performance.

  5. Ubuntu 16.04 Xubuntu - Thought this would be lightweight, so installed it. The performance was OK and the look was really bad.

  6. Ubuntu 17.10 - tried to install. My laptop crashed. Couldn't even get past booting screen.

  7. Switch to Ubuntu 16.04 - Performance became slower day by day.

  8. Ubuntu 16.04 Lubuntu - thought that my laptop is low spec, so why not switch to the lightest distro? Well, surprise, Lubuntu encountered issues. The screen flickered often, especially when coming out of suspend.

  9. Finally, now I am in Linux Mint 18.3 Sylvia - The performance is OKish, lags sometimes, hangs out of nowhere.

I will not talk about gaming experience, but in short it is awful.

So, those of you who are new to Linux, this is my message: be cautious before installing Linux and understand Linux very carefully. Linux, as an interface for personal use, is terrible.
Responses:
Some advice: Slow down on switching distros, and find out where your performance bottleneck is by looking at your system usage. It could be the drivers you're using, or applications that aren't properly optimized to run on your OS. Dell offers some Linux driver support; look into that and see if you can replace some of the generic ones with Dell's suggestions.

...

Sounds like some poor configuration or hardware interaction (5400 rpm disk?)

...

The slowdowns and hangs are probably something to do with the disk. At a guess is it made by Seagate? They just love to stall for ages.

The other obvious hang is after doing a large disk write, then flushing it to disk. There are a few tunables for this; I wish the distros would fix these by default, which is to limit the dirty cache relative to the performance of the disk.
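[The tunables being referred to are presumably the kernel's dirty-writeback sysctls; a hedged sketch of capping them, with values that are illustrative only:

    # /etc/sysctl.d/99-dirty.conf
    vm.dirty_background_bytes = 67108864   # start background writeback at 64 MB dirty
    vm.dirty_bytes = 268435456             # block writers once 256 MB is dirty

Lower caps shorten the flush-to-disk stall described above, at some cost in burst throughput.]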

...

Your problems are originating from "Intel HD Graphics 5500"



From people on reddit 6/2018:

Re: Windows vs Linux:

Over the course of the past ten years, I have tried Ubuntu on three separate occasions, on three separate laptops. Each time, I ended up going back to Windows because I couldn't get Wi-Fi to work.

...

Linux is great if you're a dev. I've found that it hits hiccups any time you are trying to do something a bit more consumer oriented, and have to interact with the world of Windows and OSX systems, as well as proprietary software.

Linux was also so customizable, and you could set up some pretty impressive desktop environments, however if something went sideways it would be quite a bit of work to get it sorted. ...

...

Windows just works.

Linux is buggy and unstable, regardless of what people say (I'd rather use OSX over Linux every day).

I've tried Linux multiple times and multiple distros, and it never takes more than a day to find a major bug in the system or a problem with software.

...

It depends heavily on the hardware, just like Windows. It's also heavily distro and version specific. I haven't been able to get Fedora to boot from USB without failing in 10 years, but Ubuntu runs every time.

Laptops are another issue ... if you want a Linux laptop, you're best off buying one from System76, Pine64, or Dell/HP with Ubuntu pre-installed. Wireless support has always been iffy if you try to install it on a laptop that was designed for Windows.

...

As someone who uses Ubuntu quite often - The Non-LTS releases are effectively betas. The newer, bleeding-edge ones are there for those who want them, but you're a lot more likely to find bugs outside of the LTS release.

...

[Currently with LTS] the Ubuntu Store doesn't even work - a major bug that was reported on their channels.

...

I used to use Linux as my main OS. What happened is that I found I needed to go into Windows more and more because of the lack of support for programs and hardware I needed for work, which became a much more present issue in my life as I got older and spent less time casual computing. There were a lot of alternative software options for Linux, but I found most of them to be unpolished and buggy. If you're okay with and enjoy the whole troubleshooting aspect, then Linux might be right up your alley. I got to a point where I just wanted everything to work, though, and to spend less time trying to make it work myself.



From someone on reddit 6/2018:

Linux can often break more frequently than Windows - no one likes to hear this, and I'm sure people will say the problem is the user, etc.

Especially for rolling-release distros, or point releases when you do e.g. dist-upgrade, and other times with just regular updates, things can break.

With Linux it then becomes a cycle of "hope you can find the answer on google, try it in terminal, see if it's fixed, try something else" unless you are an expert. This is because of package dependencies in Linux: if you break one, others break too. Often you need to compile from source, etc.

Windows has its own version of DLL hell, but each program gets its own dependencies managed via WinSxS, so you can't get global breakage due to one package. People will tell you that Windows Updates can cause problems, but that's really rare - they can be slow though.

You get all the benefits of open source, choice, no ads etc, but let's dispel a myth - Linux isn't any more performant or stable than Windows 10. Windows is rock-solid stable, supports every piece of hardware ever made, and is very fast. It also has better battery life (I've tried both powertop and tlp).



From someone on reddit 11/2018:

I love Ubuntu, but have no more time to resolve the endless bugs it creates.

I adore Linux (Lubuntu is my current distro of choice) and have been using it for more than ten years. It has taught me a ton about how computers work and even created some professional opportunities for me writing about tech.

But as an increasingly busy small-business owner, I no longer have an hour a day to spare sifting through the endless amount of bugs that the OS throws up and am reluctantly about to switch to Windows. I love customization, but at this point in my life I also need something that just works and doesn't impair my productivity.

This week alone: ... I'm certain that there are a few more. And that if I knew more about Linux, or had more time to devote to resolving these issues, I could fix some of the above. But I don't feel like I should have to.

Why do things have to be like this? It occurred to me yesterday that I would be more than happy to pay an annual subscription for a service that both guaranteed a level of customization that neither Windows nor MacOS offers, and also had some inherent stability, so that bugs like this aren't par for the course. I'm not a poor student any more. But I still love Linux and the philosophy that underpins it.

Or perhaps asking for both stability and what we love about Ubuntu is chasing after the impossible.



From /u/deadbunny on reddit 11/2018:

... the Mint devs do many things badly.

Rather than type out a long reply here is a Debian dev explaining it:

"Linux Mint is generally very bad when it comes to security and quality.

First of all, they don't issue any Security Advisories, so their users cannot - unlike users of most other mainstream distributions - quickly lookup whether they are affected by a certain CVE.

Secondly, they are mixing their own binary packages with binary packages from Debian and Ubuntu without rebuilding the latter. This creates something that we in Debian call a "FrankenDebian" which results in system updates becoming unpredictable. With the result, that the Mint developers simply decided to blacklist certain packages from upgrades by default thus putting their users at risk because important security updates may not be installed.

Thirdly, while they import packages from Ubuntu or Debian, they hi-jack package and binary names by re-using existing names. For example, they called their fork of gdm2 "mdm", which supposedly means "Mint Display Manager". However, the problem is that there already is a package "mdm" in Debian, which provides "Utilities for single-host parallel shell scripting". Thus, on Mint, the original "mdm" package cannot be installed.

Another example of such a hi-jack is their new "X apps", which are supposed to deliver common apps for all desktops available on Linux Mint. Their first app of this collection is an editor which they forked off the Mate editor "pluma". And they called it "xedit", ignoring the fact that there already is an "xedit", making the old "xedit" unusable by hi-jacking its namespace.

Add to that that they do not care about copyright and license issues, and just ship their ISOs with pre-installed Oracle Java and Adobe Flash packages, and several multimedia codec packages which infringe patents and therefore may not be distributed freely at all in countries like the US.

The Mint developers do not deliver professional work. Their distribution is more a crude hack of existing Debian-based distributions. They make fundamental mistakes and put their users at risk, both in the sense of data security as well as licensing issues.

I would therefore highly discourage anyone using Linux Mint until Mint developers have changed their fundamental philosophy and resolved these issues."

Source

Read the comments for more fun examples of how bad the Mint dev team are.

If you want to run a Debian-based system, run Debian or Ubuntu.

Edit: No they have not resolved any of these issues in the last few years since this was posted.

...

The main issue is that Mint doesn't care about security. To quote glaubitz again:

"On Debian, I open up Google and type "Debian CVE-2015-7547" and I am immediately presented with a website which shows me which versions of Debian are affected by the recent glibc vulnerability and which are not. You cannot do that on Linux Mint which therefore disqualifies itself for any professional use."

Due to the FrankenDebian issue mentioned in my previous post (the fact that Mint uses Debian-compiled packages; they don't compile their own), they are reliant on Debian for any and all security fixes. If their FrankenDebian isn't compatible with the security patches made by Debian (due to dependency issues), then you have to wait for Clem et al. to actually patch it themselves. Given their history of rejecting patches and their general security stance, I don't have any faith in them to actually do things properly.

Mint also blacklists packages from updates, which means they won't get patched when there is a security update for them. While there is an option buried within Mint to allow these to update, this is not something a noob would be doing. This means your system could be vulnerable even when you think it's fully patched. That is unacceptable.

Mint's selling point is its ease of use; unfortunately, that ease of use comes from the devs having a willful disregard of licensing issues. They ship their ISO files with pre-installed Adobe Flash and Oracle Java packages, as well as multimedia codecs (which people want), which violate copyrights and patents. Unless the maintainers of a distribution want to violate copyright laws intentionally and make themselves attractive targets for lawyers, there is nothing they can do to alleviate that. Debian and others don't ship those packages not because they want to make life hard for their users, but because, legally speaking, they cannot.

(This is the reason Debian forked Firefox and Thunderbird and distributed them as Iceweasel/Icedove.)

In this respect, Ubuntu actually has licensing agreements which allow them to distribute third-party software through their official third-party repos without violating the license terms of the software.

Dedoimedo's "Linux Mint 19.1 Tessa - Adrift"



From /u/gordonmessmer on reddit 12/2018:

There's a class of reasons that I dislike Ubuntu specifically. Ubuntu has at least three completely different installers, all of which use different sets of preseed commands. Documentation for Canonical's own installers is pretty bad. As a result, automating Ubuntu installs for a large environment can be difficult. I think Canonical is a bad community member, with a history of competing with the community rather than contributing. They repeatedly offer applications which aren't as well supported as the alternatives developed by the broader community, and then after a few years they shut them down. (Examples: Mir, Unity, bzr, probably snaps.) If I build something new on top of a solution from Canonical, I'm probably going to have to rebuild it from scratch in a few years' time. Partially as a result, if you look at contributions to almost any major software project for GNU/Linux, Canonical is either a very small contributor or absent completely. They're more a consumer of Free Software than a contributor.



Lots of people say that closing the lid of a laptop to make it sleep, and opening it to revive it, doesn't work well on Linux. Seems to be a common problem: Linux laptop sleep

Apparently there is a long-standing problem with Linux reacting VERY badly to "RAM is nearly full": reddit post

Apparently there is a long-standing problem with Ubuntu and the ~/.Xauthority file that results in people being unable to log in.



From /u/TheChosenLAN on reddit 2/2020:

I was a full-time Linux user for over a year (I even bought a Dell Precision 5530 with Ubuntu preinstalled, to support the movement and have hardware fully supported by Linux). But after all that time I had to go back to Windows. I just got really tired of tinkering with my system. I just want something that is standardized and works fully out of the box.

On the effing laptop, those were some gripes I had: Don't get me wrong, I still have a soft spot for Linux and think it is promising. And I fully understand and support you if you are running it on your own systems. I love it as a software-development platform. But I'm just tired of tinkering with my system and just want it to work, while not looking like trash, having recent up-to-date software available, and being stable. Windows has its own slew of issues, but none of them are as nagging as the Linux ones. At least for me.

Disclaimer: two months ago I had to swap my mainboard, because apparently the Intel GPU was defective (maybe an effect of the heat incident, who knows) and Windows (I was dual-booting at that stage) kept crashing because of it. So it may very well be that some of those issues appeared because I had a bad mainboard. But the thing is, I only discovered it because Windows clearly stated the crashing module in the BSOD, so I very quickly found the culprit. I have no idea if I would have found the source if I had stayed purely with Linux.





Summarized from Neocities' "What Linux struggles with", and then more from the author:

Some flaws in Linux [I omitted some items which are outdated IMO]:

[In email 10/2019:

Linux security is pretty much an illusion. An application can do what it wants in the folders it has permissions in, which usually means your whole home folder. Many distros run sshd by default on startup, which allows any shmuck to try to crack your password. And some distros have really weak default passwords for root, which presents a real danger. I actually had it happen recently, since I guess I didn't even realize the root user was enabled on Slackel. Why the f*ck would you have sshd on by default, though? It provides nothing but an entry point for hackers.

My article is old, and since writing it I've found way more stuff to add: many, many more items. Windows is still worse, though, so whatever. Not that it justifies this stuff; it just shows how much of a swamp we're in.

]
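
If you want to check your own system for the sshd issue, here's a minimal sketch, assuming a Debian/Ubuntu-family distro (the service is named "ssh" there; on many other distros it's "sshd"):

    # See whether anything is listening on the standard ssh port:
    sudo ss -tlnp | grep ':22'
    # If you don't need remote logins, stop the service and keep it from starting at boot:
    sudo systemctl disable --now ssh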





Snap Complaints



Good things / intended features:

From /u/lutusp on reddit 4/2020:

Flatpaks, Snaps and Appimages solve the "dependency hell" issue by packaging all required dependencies with the application itself in a separate environment. This solves an increasingly serious problem (the inability to install and run some applications) by creating another one: an application's download size, storage size, and startup time all go up.

By contrast, an application installed from the normal repositories must find all its dependencies (right version and properties) in the installed libraries, which unfortunately is a declining prospect in modern times.
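
A quick way to see both sides of that trade-off, using VLC purely as an example (it ships both as a deb and as a snap):

    # The deb must find all of these dependencies already installed on the system:
    apt-cache depends vlc
    # The snap bundles its dependencies inside its own image, so it is much larger
    # (snap info lists per-channel download sizes):
    snap info vlc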

From someone on reddit:

> What is the potential of snaps? What does it do better than apt?

Snaps are a great way to isolate the program you are executing from the rest of the system. So the main ideas behind Snaps are security and ease of install (they're distro-agnostic), as .deb-based programs (and many other formats like them) are able to access the entire disk (with read-only permission), which can create a lot of security breaches in the system overall. With Snaps you are able to control what the software can read/write, what kind of hardware it can access (e.g. webcam or microphone), and a lot of other options.
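
Snap's "interfaces" are the mechanism behind that control. A sketch, using a hypothetical snap named "foo":

    # List which interfaces (camera, audio-record, network ...) foo is plugged into:
    snap connections foo
    # Revoke camera access, then grant it back:
    sudo snap disconnect foo:camera
    sudo snap connect foo:camera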

From someone on reddit:
"snaps are compressed, and are not uncompressed for installation -- certain snaps actually are smaller than their installed deb-packaged counterparts"

From /u/timrichardson on reddit 1/2020:
Once, people said that GUI applications were way too full of bloat. And before that, people despised compilers; hand-crafted assembly language is smaller and faster. The history of coding is to trade off memory and disk space for more efficient use of humans; it's the history of the algorithms we use and the tools we use; it's the reason for layer upon layer of abstraction that lets humans steer modern computers. Like the arrow of time, this is a one-way trend, but unlike time, it doesn't just happen; it happens because it saves the valuable time of the creators: the coders, the packagers. Snaps and flatpaks are another example of this. The less time wasted repackaging apps for a million different distributions, the more apps we all get. When you've got 2% market share in a stagnant technology (desktop computing), you should grasp at all the help you can get, if you want to see it survive and maybe even thrive.

And by the way, the binary debs you are used to are not targeted or optimised for your hardware; they target a lowest common denominator. The difference can be significant; look how fast Clear Linux is. Maybe you should swap to Gentoo. My point is that you already accept bloat and performance hits in the name of convenience; you are used to it, so you don't notice. But traditional packaging is an old technology; is it so surprising that there are new ideas?




From /u/10cmToGlory on reddit 2/2019:

The snap experience is bad, and is increasingly required for Ubuntu

As the title says. The overall user experience with snaps is very, very poor. I have several apps that won't start when installed as snaps, others that run weirdly, and none that run well or fast. I have yet to see a snap with a start-up time that I would call "responsive". Furthermore, the isolation is detrimental to the user experience.

A few examples: This is just the short list, using mostly anecdotes. I won't waste my time compiling a more extensive list, as I feel like the folks at Canonical should have done some basic testing long ago and realized that this isn't a product ready for prime time.

As for Ubuntu in general, I'm at a crossroads. I won't waste any more time on snaps; I just can't afford to, and this machine isn't a toy or a hobby. It seems that removing snaps altogether from an Ubuntu system is becoming more and more difficult by the day, which is very distressing. I fear that I may have to abandon Ubuntu for a distro that makes decisions more in line with what a professional software developer who makes their living with these machines requires.

From /u/HonestIncompetence on reddit:
IMHO that's one of several good reasons to use Linux Mint rather than Ubuntu. No snaps at all, flatpaks supported but none installed out of the box.

From /u/MindlessLeadership on reddit 10/2019:

... issues with Snap as a Fedora user. Canonical doesn't seem very interested in addressing any of these, which raises the question of whether Snap is meant to help the "Linux desktop world" or just to push Canonical/Ubuntu.

From /u/schallflo on reddit 10/2019:

Snap:



From /u/ynotChanceNCounter on reddit 1/2020:
It's a bloated sandbox, tied to a proprietary app store, and they've gone out of their way to make it as difficult as possible to disable automatic updates, so now trust in all developers is mandatory. Canonical is dismissive toward arguments against the update thing; they took the store proprietary, and the excuse they offered was "nobody was contributing, so we closed the source." Excuse me?

And all the while, they're trying to push vendors to use this thing, which means I am stuck with it. And I'm stuck with the distro because they've got the market share, and that means this is the distro with official vendor support for d*mn near everything.

From people on reddit 3/2020:

Snap is pretty much hard-wired not only to Ubuntu, but also to Canonical. Snap can only use one repository at a time, and if it is not Canonical's, users will miss most of the packages. ... Also, some snap packages simply assume that the DE is GNOME 3.

...

... currently Snap (on the server side, I think) is not yet open source.



Snap automatic update issues

I think you also get updates on the developer's schedule. So suppose some horrible security hole is found in library X. Each snap (and flatpak and appimage) on your system may have its own copy of library X. You can't update one copy (package) of library X and know that the issue has been handled. [I'm told that flatpak allows sharing of libraries if the developer sets that up explicitly, maybe in a case such as N flatpak apps from the same vendor.] [But see Drew DeVault's "Dynamic linking" (not about snaps).]
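
A sketch of how to see that duplication on a snap-using system (libssl is chosen purely for illustration):

    # Each snap is mounted under /snap/<name>/<revision>/, so one library
    # may be present in many separate copies:
    find /snap -name 'libssl.so.*' 2>/dev/null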



How is RAM consumption affected ? If 10 snaps all mapped the very same library file, the kernel's page cache would let them share the same RAM. But the page cache works per-file, not per-content: each snap carries its own copy of the library inside its own squashfs image, so even byte-identical copies get cached separately, and SLIGHTLY different point-release versions certainly do.



Many people complain that Snaps are slow to launch. Explanation paraphrased from /u/zebediah49: "It has to create a mount-point and mount up a filesystem, load up everything relevant from it -- and since it's a new filesystem, we've effectively nuked our cache -- and then start the application. In contrast to normal, where you just open the application and use any shared objects that were already cached or loaded."
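
You can see those per-snap mounts, and what they cost at boot, with standard tools. A sketch:

    # Every installed snap revision is a separate squashfs mount:
    findmnt -t squashfs
    # The per-snap mount units also show up in boot-time accounting:
    systemd-analyze blame | grep -i snap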



From people on reddit 4/2020 - 6/2020:




4/2020 I installed Ubuntu 20.04 GNOME, and decided to let it use snaps as it wished:
Ended up with the software store and 4 more snap apps in my user configuration (~/snap), and a dozen more for all users (/snap). They seem to work okay, with one big exception: when a snap app needs to launch or touch another app (Liferea launching any downloader, or VSCode opening a link in Firefox). This either fails (the Liferea case) or works oddly (VSCode opens a new FF process instead of opening a new tab in the existing FF process). But: KeePassXC is a snap app, and has no problem opening a link in the existing Firefox process.

Some people complain that Ubuntu's store app prioritizes snaps ahead of debs (re-ordering search results to do so), and even has some debs (Chromium) that start as a deb but then install a snap.

Heard: the Chromium snap is broken for two-factor authentication keys (U2F). Reported since mid-2018, some fixes in the pipeline, but still broken. Relevant: Ask Ubuntu's "How to remove snap completely without losing the Chromium browser?"

I'm told: Pop!_OS has adopted a no-snaps policy, Elementary OS has adopted a flatpaks-instead-of-snaps policy, Mint has a no-snaps-by-default policy.

The dev who packaged Liferea as a snap said fixing it is complicated; this came just as I was giving up on the snap version and changing to the deb version. The deb works.

VSCode as a snap had a couple of issues: it wouldn't open a new tab in an existing FF process, and it seemed to be reading the snap version of node incorrectly (it said "v15.0.0-nightly20200523a416692e93" is less than the minimum needed version 8). I gave up, uninstalled the snap version, and changed to the deb version. Worked.

The node-based FF extension I was developing couldn't contact Tor Browser. Removed the node.js snap, and did "sudo apt install nodejs" and "sudo apt install npm". But that didn't fix the problem.



One underhanded thing that Ubuntu 20 does: the deb package for the Chromium browser actually installs Chromium as a snap. IMO that's deceptive. If it's available only as a snap, don't provide a deb package at all.



Changes Canonical could make to eliminate most of the objections:




How to prevent a snap from ever being updated:
Instead of running "snap install foo", do "snap download foo", then run "snap install" on the downloaded file with the "--dangerous" flag. That sideloads the snap onto your system, so it won't get updates from the store. (Doesn't work for the "core" snap.)
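
A sketch of the procedure (the snap name "foo" and revision 42 are hypothetical; "snap download" chooses the real file name for you). Newer versions of snapd also let you defer, though not permanently disable, refreshes via a timer setting:

    # Fetch the .snap file (plus its .assert signature file) from the store:
    snap download foo
    # Sideload it; --dangerous skips signature checking, hence the name:
    sudo snap install ./foo_42.snap --dangerous
    # Alternative: confine automatic refreshes to a window, e.g. Friday night:
    sudo snap set system refresh.timer=fri,23:00-01:00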



dr knz's "Ubuntu without Snap"
Wikipedia's "Snappy_(package_manager) - Criticism"

Related info in "Snap" section of my "VMs, Containers, Controls" page





Reporting Bugs



I'm struggling with bug-reporting in Linux.



Different parts have different procedures:



A given part may have a huge "stack" and you may have to figure out exactly where to report:

Example: Pix app in Linux Mint 19 Cinnamon:

Not really in linear order, there are forks in here.

Example: GNOME desktop in Linux Ubuntu desktop 20.04:

Not really in linear order, there are forks in here.




From someone on reddit:
freedesktop.org is a project which aims to reduce the fragmentation of the Linux desktop. They work on interoperability and "host" software such as systemd and Wayland. It used to be called the X Desktop Group (XDG), but now they are killing off X11 (the "death of Xorg" will be beneficial for the Linux desktop as a whole), so they "rebranded" themselves. GNOME and KDE work with them.

You don't send bug reports about anything to them. You can discuss "standards" stuff, e.g. new Wayland protocols, on their mailing lists.
...
GTK and Clutter are GUI libraries developed by the GNOME team. Qt is a GUI library developed by the Qt Company (KDE uses it). These libraries are used by various GUIs. Usually, the programmers using them are the ones who file bug reports about them.





My experience with Linux



I'm pretty happy with Linux, and Linux Mint. But there are issues.

My issues with Linux in general:

My issues with Linux Mint 19 Cinnamon in particular:

My experience 4/2019 after using Linux Mint 19 and 19.1 Cinnamon for about 8 months:

My opinion: installing / updating / package managers is a mess:

I'm not happy about the variety of package managers and installers you have to use. I would like to deal with only Mint's Software Manager and Update Manager apps, but I also have to deal with Flatpak, Docker, GitHub, apt, pip (Python), bundler (Ruby), tar, npm (Node), yarn, and more things I don't know the names of. Some of these are at a different level than others, I don't know.

Some apps (such as Atom) have different builds (of the same release, I think) that work differently.

Updating is done in many different ways (a sketch of the combined update routine follows this list):
  • Through Update Manager.

  • Most apps that use plug-ins (e.g. Firefox, VSCode, Burp Suite, OWASP ZAP) update them inside the app, using some custom mechanism.

  • XnviewMP and Master PDF Editor check for updates internally and then you have to download and install them separately (not through Update Manager).

  • GNOME shell checks for extension updates and then you have to download and install them from the extensions site through the GNOME shell browser extension.

  • "Oh My Zsh" and npm check and update themselves at the CLI.

  • Foxit Reader and Thunderbird seem to check and apply updates in a custom way.

  • Snap checks for Snap Store package updates four times each day and applies them automatically ? And the Ubuntu updater doesn't tell you what snaps are being updated or any details about the updates.

  • The anti-virus packages all install cron jobs to update signatures, some (Sophos) also update the AV app that way.

  • Some apps (Atom, KeePassXC, OWASP ZAP, more ?) notify you of the existence of an update, but then you have to go to the home web site and download it, or do apt-get, to get it.

  • Some apps (Windscribe, more ?) notify you of the existence of an update and then stop working, until you update them through Update Manager or elsewhere.
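
To make the mess concrete, a sketch of what "update everything" can look like on such a system (every command is standard, but the exact list varies per machine):

    # Debs:
    sudo apt update && sudo apt full-upgrade
    # Flatpaks and snaps:
    flatpak update
    sudo snap refresh
    # Language-level package managers mostly just report; upgrading is per-package:
    pip list --outdated
    npm outdated -g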


I had hoped Linux would have a more rational install/update situation than Windows does, but it doesn't.

Causes of this:
  • Cross-platform apps find it easier to roll their own internal update mechanism rather than use the different mechanisms on Linux, Windows, OSX, wherever else they run.

  • Cross-distro apps that need to update their database (e.g. security apps) find it easier to roll their own internal update mechanism rather than build and submit update packages for each different distro family and repo.

  • Simple one-dev apps find it too burdensome to build and submit packages for each different package manager type and distro family and repo.

  • Apps with internal add-ons and add-on stores roll their own internal update mechanisms for add-ons.

  • Older apps and services built before there were updaters/stores just used cron, and continue to use it.



Flatpak - a security nightmare

I find the Nemo file explorer to be slow (19.1 is faster). Maybe my laptop has too little RAM (3 GB). I should try a different explorer, and I'm tempted to try a lighter distro such as Xubuntu next time I have to do a new install (I'm thinking of buying a new laptop).

On the other hand, I reported a series of Nemo crashes (on 19) and within days a dev had fixed it and put out the new version. Not going to see that on Windows.

Scrollbars are too thin, and I had to try a series of hacks to get them wider.
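
One commonly-suggested hack is a user-level GTK 3 stylesheet; a sketch (the 15px values are just examples, and this affects GTK 3 apps only):

    mkdir -p ~/.config/gtk-3.0
    cat >> ~/.config/gtk-3.0/gtk.css <<'EOF'
    scrollbar slider { min-width: 15px; min-height: 15px; }
    EOF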

Often it's unclear where to report a bug. Is it a Mint thing, or an Ubuntu thing, or a Debian thing ? A Cinnamon thing, a GNOME thing, a freedesktop.org thing ?

Often it's unclear where to tweak something. Is it a theme thing, or a Cinnamon thing, or a GNOME thing, or a Mint thing ? Some apps use GTK 2.0, others GTK 3.0, and the config files are separate and named differently.

My MP3 players don't work well with Linux Mint 19; they worked fine on Windows. Connect via USB cable and delete a file: Linux says it's gone, the MP3 player says it's still there. Might be related to Linux not supporting formatting in FAT16 ? But I think it happened even before I resorted to reformatting my MP3 players to get rid of "ghost" files.
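
One thing worth ruling out: the deletion may have happened only in Linux's write cache, then been lost when the player was unplugged. Flushing and unmounting cleanly before disconnecting avoids that. A sketch (the device name is an example; check yours with lsblk):

    # Force pending writes out to the device:
    sync
    # Unmount cleanly before unplugging:
    udisksctl unmount -b /dev/sdb1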

The upgrade from Mint 19 to 19.1 was done through Update Manager, but the update didn't appear in the normal window; instead, somehow you were supposed to notice that a new item had appeared in the Edit menu of Update Manager ! But the update went smoothly.

My issues with Ubuntu 20.04 desktop GNOME in particular:



My 1/2019 response to "will Linux ever reach 10% share of the installed desktop OS market ?":

To me, a big barrier to people moving to desktop Linux is the bewildering number of variations. Hundreds of distros, a dozen ways of packaging applications (package managers, then Docker, Flatpak, Snap, Appimage, etc).

I would love to see some consolidation inside each of the major distros. For example, some way that all the Ubuntu flavors (including Mint) could become one Ubuntu, and then at install time you pick the DE and theme and list of installed apps. Same among the other major variants (Red Hat, Arch, Slackware, Gentoo ?). That way someone moving from Windows or Mac really would be given 6 or 8 major choices, not 50 or 200.

And app developers and hardware developers and bug-fixers would have more focus, and less duplication of effort. Linux would get better and better.

...

Also, installation (partitioning and dual-booting) is a big barrier. Even with installers that try to make it easy, it's confusing. Certain options make things happen automatically; others require that the user specify the partitioning. I installed Mint: it wasn't clear how to get a swap file instead of a swap partition, if I chose encrypted /home then I had to do the partitioning manually, etc. And the user has to know whether they have BIOS or UEFI.
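
For the record, a swap file can be set up after installation, instead of fighting the installer's partitioner for a swap partition; a sketch of the usual recipe (the 4 GB size is just an example, and fallocate assumes a filesystem such as ext4):

    # Create and protect the file:
    sudo fallocate -l 4G /swapfile
    sudo chmod 600 /swapfile
    # Format it as swap and enable it:
    sudo mkswap /swapfile
    sudo swapon /swapfile
    # To survive reboots, add this line to /etc/fstab:
    #   /swapfile none swap sw 0 0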





Miscellaneous



Artem S. Tashkinov's "Major Linux Problems on the Desktop, 2020 edition"

John Paul's "What is the Difference Between the macOS and Linux Kernels"
Sohail's "Linux Kernel Vs. Mac Kernel"

StatCounter's "Desktop Operating System Market Share Worldwide"




