Much of the software you use is riddled with security vulnerabilities. Anyone who reads Matthew Garrett knows that most proprietary software is a lost cause. Some Linux advocates claim that free software is more secure than proprietary software, but it’s an open secret that tons of popular desktop Linux applications have many known, unfixed vulnerabilities. I rarely see anybody discuss this, as if it’s taboo, but it’s been obvious to me for a long time.
Usually vulnerabilities go unreported simply because nobody cares to look. Here’s an easy game: pick any application that makes HTTP connections — anything stuck on an old version of WebKit is a good place to start — and look for the following basic vulnerabilities:
- Failure to use TLS when required (GNOME Music, GNOME Weather; note these are the only apps I mention here that do not use WebKit). This means the application has no security.
- Failure to perform TLS certificate verification (Shotwell and Pantheon Photos). This means the application has no security against active attackers. (A minimal sketch of this bug class follows the list.)
- Failure to perform TLS certificate verification on subresources (Midori and Xombrero, Liferea). As sites usually send JavaScript in subresources, this means active attackers can get total control of the page by changing the script, without being detected (update: provided JavaScript is enabled). (Regrettably, Epiphany prior to 3.14.0 was also affected by this issue.)
- Failure to perform TLS certificate verification before sending HTTP headers (private Midori bug, Banshee). This leaks secure cookies, usually allowing attackers full access to your user account on a website. It also leaks the page you’re visiting, which HTTPS is supposed to keep private. (Update: Regrettably, Epiphany prior to 3.14.0 was affected by this issue. Also, the WebKit 2 API in WebKitGTK+ prior to 2.6.6, CVE-2015-2330.)
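To make these verification failures concrete, here is a minimal sketch in Python of the difference between skipping and performing certificate verification. The applications above are C and Vala programs built on libsoup or WebKit, so this is only an illustration of the bug class, not their actual code:

```python
# Illustration only: the vulnerable pattern vs. the correct pattern.
import socket
import ssl

def fetch_insecurely(host):
    # Vulnerable: TLS is used, but the peer certificate is never verified,
    # so an active attacker can present any certificate and read or modify
    # the traffic undetected.
    ctx = ssl._create_unverified_context()
    with ctx.wrap_socket(socket.create_connection((host, 443)),
                         server_hostname=host) as s:
        s.sendall(b"GET / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
        return s.recv(4096)

def fetch_securely(host):
    # Correct: the certificate chain and hostname are verified during the
    # handshake, i.e. before any headers or cookies are sent.
    ctx = ssl.create_default_context()
    with ctx.wrap_socket(socket.create_connection((host, 443)),
                         server_hostname=host) as s:
        s.sendall(b"GET / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
        return s.recv(4096)
```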
Except where noted, the latest releases of all of the applications listed above are still vulnerable at the time of this writing, even though almost all of these bugs were reported long ago. With the exception of Shotwell, nobody has fixed any of these issues. Perhaps nobody working on the project cares to fix it, or perhaps nobody working on the project has the time or expertise to fix it, or perhaps nobody is working on the project anymore at all. This is all common in free software.
In the case of Shotwell, the issue has been fixed in git, but it might never be released because nobody works on Shotwell anymore. I informed distributors of the Shotwell vulnerability three months ago via the GNOME distributor list, our official mechanism for communicating with distributions, and advised them to update to a git snapshot. Most distributions ignored it. This is completely typical; to my knowledge, the stable releases of all Linux distributions except Fedora are still vulnerable.
If you want to play the above game, it should be very easy for you to add to my list by checking only popular desktop software. A good place to start would be to check if Liferea or Xombrero (supposedly a security-focused browser) perform TLS certificate verification before sending HTTP headers, or if Banshee performs verification on subresources, on the principle that vulnerable applications probably have other related vulnerabilities. (I did not bother to check.)
On a related note, many applications use insecure dependencies. Tons of popular GTK+ applications are stuck on an old, deprecated version of WebKitGTK+, for example. Many popular KDE applications use QtWebKit, which is old and deprecated. These deprecated versions of WebKit suffer from well over 100 remote code execution vulnerabilities fixed upstream that will probably never be backported. (100 is a lowball estimate; I would be unsurprised if the real number for QtWebKit was much, much higher.)
I do not claim that proprietary software is generally more secure than free software, because that is absolutely not true. Proprietary software vendors, including big name corporations that you might think would know better, are still churning out consumer products based on QtWebKit, for example. (This is unethical, but most proprietary software vendors do not care about security.) Not that it matters too much, as proprietary software vendors rarely provide comprehensive security updates anyway. (If your Android phone still gets updates, guess what: they’re superficial.) A few prominent proprietary software vendors really do care about security and do good work to keep their users safe, but they are rare exceptions, not the rule.
It’s a shame we’re not able to do better with free software.
Leaving aside the recent break-in at the Mint website, almost all major distros are downloaded via plain HTTP, including the popular desktop distributions Mint and Ubuntu. Almost nothing is checked. The role of package “maintainer” usually reduces to compiling whatever upstream provides and putting it in a package. Everything gets packaged, even programs that are clearly broken, missing features, obsolete, or not updated for years. There’s no quality control of any kind.
In fact I’m scared to use anything but essential apps on my laptop. I’ve tried to put most of them in ssh containers.
The same applies to the Ubuntu phone shop. Everything goes, even “apps” that are merely an icon leading to a website. There’s one with a broken URL. It hasn’t been removed even though it’s been reported. And so on.
Debian may download stuff over HTTP, but that’s because APT has been checking the package signatures for some years now. I’m pretty sure Ubuntu does the same.
GPG used properly provides better security than HTTPS, but it’s very difficult to use, so choosing it over HTTPS is, in general, foolish. But in the special case of distribution package management, it makes a lot of sense because it can be automated; users never have to think about GPG keys in that context.
The problem is for *initial* downloads, the ISO files you install the OS with. Fedora downloads are all served over HTTPS, because it would be irresponsible to do otherwise. I was very surprised to learn other distros are not using HTTPS, but it’s true: the download images for both Debian and Ubuntu are served via HTTP.
In theory, Debian users are supposed to check the GPG signatures of the hashes to verify security, using the keys provided on https://www.debian.org/CD/verify (the only secure page involved). (I see no such instructions on the Ubuntu download page.) But in practice, who does that? I have no clue how to use GPG. I am definitely not going to follow complicated instructions to check the signature.
This is not really on the mark – you don’t need to know how to use GPG. In Debian (and subsequently Ubuntu and Mint) there is a system keyring that is always present, against which packages are automatically checked. You will only ever know this is in use if something is wrong, and you’ll only ever need to know about cryptography in even a superficial way if you want to install more certificates.
That said, it would be good to see some work done on extending file managers (Nautilus etc) so that checking the authenticity of downloaded files against hashes and GPG keys is easier.
The system keyring can be used to authenticate updates to the system, but doesn’t help with installing new systems since that keyring is part of the ISO you’ve downloaded.
If I unpack the install ISO, add a new trusted key, and repack it, then systems installed from that modified ISO will trust updates signed by my new key. To make sure your system isn’t compromised like this, you need to authenticate the ISO by checking that it matches a signed hash, or was downloaded from a trusted site.
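Concretely, authenticating the ISO amounts to two steps: check that the published checksum file is signed by a key you trust, then check that the ISO matches that checksum. A rough sketch in Python, with hypothetical file names (in practice you would run gpg and sha256sum directly):

```python
# Rough sketch of ISO verification; file names are examples, and the key
# used by gpg must have been obtained from a trusted source beforehand.
import hashlib
import subprocess

# 1. Verify the signature on the checksum file.
subprocess.run(["gpg", "--verify", "SHA256SUMS.sign", "SHA256SUMS"], check=True)

# 2. Verify that the ISO matches the signed checksum.
iso_name = "debian-netinst.iso"  # hypothetical file name
sha256 = hashlib.sha256()
with open(iso_name, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha256.update(chunk)

expected = None
with open("SHA256SUMS") as f:
    for line in f:
        checksum, name = line.split()
        if name == iso_name:
            expected = checksum

if expected != sha256.hexdigest():
    raise SystemExit("ISO does not match the signed checksum!")
print("OK:", iso_name)
```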
> there is a system keyring that is always present, against which packages are automatically checked
Yeah, but he was asking about the validity of that system keyring in a setting where your download is unverified. If you download a crooked ISO which you didn’t check with GPG (something people very rarely do) and the embedded keyring has been tampered with, then everything depending on the integrity of your keyring is compromised.
It is completely on the mark. Especially now that the Mint case has effectively demonstrated that compromising ISOs is entirely feasible, and that thousands of installs will ensue without anyone ever checking the ISOs’ signatures against GPG keys.
Michael is not talking about packages downloaded by the package management tools. He is talking about the initial ISO download using something like wget or via point-and-click in the web browser.
Or did I misunderstand something? I apologize if I did.
I think the use case Michael is pointing at runs roughly as:
I’m on Mac OS X (or Windows, or ancient Red Hat 7.2, or something that the Free Software world can’t fix – either due to age, or due to not being Free). I would like to try out this “Debian” thing on a new machine I’m building, instead of pirating Microsoft Windows. I download a Debian USB image to my Mac OS X machine; how do you ensure that that initial download that bootstraps me into the Debian world is authentic?
HTTPS solves that problem – the Mac validates TLS certificates, I get a valid USB image which I can write from my Mac to a USB stick, then I’m in the Debian world where GPG checking Just Works from my point of view. Changes to Nautilus et al do not solve the problem – I don’t have Nautilus until I’ve bootstrapped into Debian in the first place.
No, choosing GPG over HTTPS is the correct, and perhaps the only correct, thing to do.
GPG and HTTPS address two fundamentally distinct issues. GPG is an authentication mechanism that provides assurances that the contents of a file are the same as what the signer signed. HTTPS is a transport security mechanism that is only concerned with getting data from A to B. While HTTPS does provide some form of authentication, its authentication assurances are not remotely as strong as those provided by GPG, mainly due to the nature of how HTTPS is deployed in the real world and the assumptions its correctness depends on.
As an example, GPG-based systems will continue to function correctly in a Kazakhstan-style state-wide MITM operation. HTTPS-based systems will not. Another example is the mere fact that HTTPS-based systems require continuous access to private keys, while GPG does not.
I’m all for providing downloads exclusively over HTTPS, and am very critical of those who don’t. However, I will – and currently do – continue to require GPG signature verification for anything remotely important.
Having been a Debian user for the past 16 years, I had no idea they sign their disk images. In fact one has to go to the bottom of the page and choose “Debian on CDs”, which then leads to the verification page. If you go straight to the netinst link there’s no mention of any GPG keys being available.
Secondly, as others said, you might be downloading to a computer that doesn’t have GPG at all, e.g. you’re buying a laptop with a game console preinstalled and you want to replace it with an operating system.
Besides, the two most popular distros do not seem to bother with signing their ISOs, or even with providing MD5 sums. Or at least I can’t find them.
In any case TLS is the first line of defense.
Getting back to the topic, almost a year ago I reported that setting Ubuntu phone security to “swipe” also clears the PAM password, allowing sudo su - with no authentication whatsoever. Moreover, there’s a system group, nopasswdlogin, that skips the password prompt on GUI operations while keeping the phone protected, but somehow it cannot be configured from the settings panel. This hasn’t been fixed yet.
One commonly heard phrase: “I didn’t do anything, I just installed an app from the Google store. That’s it. How come it had a virus?” I’m afraid the recent purge of the droid store only wiped away the most straightforward malware, leaving the sophisticated kind intact and still lurking.
The list goes on…
No, I do not trust that application. This is why I maintain DSSP [1].
DSSP is an SELinux security policy with a strong focus on the integrity of workstations. It targets a selection of GNOME applications including, but not limited to, GNOME Music and GNOME Weather.
DSSP aims to be flexible and accessible, but DSSP is not user friendly.
Some people say that using a strict SELinux policy on workstations is not “practical”. In the same breath they praise initiatives such as XDG-app, which has yet to prove itself on the security front (or any front, for that matter). DSSP provides integrity on workstations today.
The X server is still an issue (but it would also be an issue with XDG-app), so you are encouraged to use Xwayland (which DSSP also targets).
Yes, I don’t trust applications, and I know we can’t fix all the issues everywhere. So I just make sure that any damage a flawed process causes is contained. It is not perfect, but it is something, today.
[1] https://github.com/DefenSec/dssp
Writing software takes time. The substantial part of a new feature can maybe be written quickly, but once it’s done, there are still unit tests to write (if TDD is not followed), bugs to fix, documentation to write, accessibility to improve, in some cases better i18n support to add, the build system and portability to take care of, long-term maintenance, AND security (hopefully not in that order).
I’m sometimes surprised myself by the number of commits I write that are part of that surrounding “noise”, i.e. not on the core algorithms.
How distributions work needs to change. Rather than offering their own specially sanctioned packages, they should focus on the verification of desktop containers. Maybe offer a service validating that independently produced containers are safe. They could still produce containers in house, but there shouldn’t be as much focus on that if they want OSS platforms to scale sustainably. The KDE and GNOME teams should by rights be defining the base desktop images. There are plenty of techniques for verifying software in the server sphere (MIG, the Mozilla InvestiGator, comes to mind), and the desktop sphere seems to be headed in a similar direction (XDG-App, Firejail, etc.), although, as someone has already said, the security side of things hasn’t fully been realized with XDG-App, so it’s hard to tell whether it will sway people over time.
So it seems inevitable to me that either the established distributions will get it, or a newcomer will come in and stir up the waters.
Even though their approach has since drifted, I think Canonical might have been in a good position if it had done a few things differently from the start. Launchpad might have been a great service that they could have offered to developers to manage and audit their software deployments.
From a development point of view it’s much more difficult. I have seen a lot of GNOME developers talk quite positively about languages like Rust, but that still leaves you with all of the C code that has been written over the years.
There was another language called Cogent, developed for writing safe C alongside seL4, which could be worth a look: http://people.mpi-inf.mpg.de/~crizkall/Publications/fs_verif.pdf. Ideally, educating developers on how to write efficient code whose safety is checked at compile time would help reduce the number of vulnerabilities you need to isolate and identify in the first place.
I think groups like KDE and GNOME are better positioned to do this than distributors themselves.
My ideal future would be one which favored tooling and services which operated on open standards and focused much less on ‘doing things differently ™’ in order to differentiate from the crowd.
“the issue has been fixed in git, but it might never be released because nobody works on Shotwell anymore” Not to put blame on you directly but couldn’t you have made a release tarball when you patched Shotwell? Most packagers (speaking with my Gentoo hat on) use scripts to track upstream releases against packaged releases. Git snapshots are rarely used.
As for all the TLS issues you mentioned, the entire ecosystem just sucked. Until Heartbleed, no one cared. Since then, things have changed somewhat. In the Python world, getting the stdlib module “urllib2” to do any sort of certificate validation required close to 100 lines of code. Now we have the ubiquitous “requests” module that does everything right *by default*. Though “requests” had been around for a long time, a lot of people assumed urllib2 was safe until Heartbleed showed up.
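As a small, contrived illustration of the difference (badssl.com’s expired-certificate host is just a convenient test target):

```python
# Modern 'requests' verifies certificates by default: connecting to a host
# with a bad certificate raises an error instead of silently succeeding.
import requests

try:
    requests.get("https://expired.badssl.com/")
except requests.exceptions.SSLError as exc:
    print("Rejected, as it should be:", exc)

# Old urllib2 (Python 2 before 2.7.9) would have fetched the same URL
# without checking the certificate at all; making it validate meant wiring
# up ssl contexts and hostname checks by hand.
```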
We, as developers, need good lower level libs. We need good languages (as much as I love C, it’s a PITA to do anything in it). We need this so we can focus on whatever applications we’re building.