Your Linux PC Isn't As Secure As You Think It Is

HardOCP News

Even if your Linux box isn't as secure as you think, it is still way better than the alternatives when it comes to security.

It’s been an apocalyptic year for Linux security, with a sophisticated Trojan and security holes over 20 years old. The Shellshock bug left Linux desktops and servers wide open for anyone to own. Security updates fixed these problems—but you may not even be getting those patches.
 
Bad software is bad software. Doesn't matter if it runs on Linux, OSX, or Windows.
 
Bad software is bad software. Doesn't matter if it runs on Linux, OSX, or Windows.

It's not just bad software, it's bad configuration.

All the major OSes out there are more than capable of being MUCH more secure than their default config.

Realistically, the safest way to deploy a box would be to ship it nearly totally crippled and locked down, to the point where, short of physical access, it isn't going to do anything useful and every dangerous thing has to be turned on. Preferably you would have a smart, informed person turning those dangerous things on.

Anything that shipped like that to the general consumer market would fail instantly, so nobody does it. That includes almost every flavor of Unix as well. Where the line is drawn varies with the OS, but as the article points out, for everything that line is pretty much past the point of "user freely interacts with the internet," which for all intents and purposes means that you, the user, are vulnerable.
 
Are you connected to the internet? Then you're going to be exposed to malicious things. Sorry to blow your world, but operating any computer or mobile device takes at least some intelligence, logical reasoning and mountains of common sense.
 
Anyone who seriously uses Linux in a production environment, personal or otherwise, would know all of this.
Not getting the updates?

Just add the repositories you need, apply the patches yourself, or hell, compile them from source (rough examples below).
It's not a bad article; it was just written for the clueless.
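For what it's worth, none of that is hard. A rough sketch on a Debian/Ubuntu-style box (the PPA and tarball names here are just placeholders):

    # grab whatever security fixes the distro has already published
    sudo apt-get update && sudo apt-get upgrade

    # or add an extra repository that carries a newer build
    sudo add-apt-repository ppa:example/newer-packages
    sudo apt-get update

    # or, worst case, build the upstream source yourself
    tar xf some-package-1.2.tar.gz && cd some-package-1.2
    ./configure && make && sudo make install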
 
The Bash thing was patched within less than 12 hours of being announced, which is a lot faster than waiting until Patch Tuesday for Microsoft to release yet another set of broken updates.
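If you want to check your own box, the widely circulated one-liner for the original Shellshock bug (CVE-2014-6271) is roughly:

    env x='() { :;}; echo vulnerable' bash -c "echo this is a test"

A patched bash just prints "this is a test"; an unpatched one prints "vulnerable" first.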
 

Yeah, no computer system connected to a network is impervious to exploits.

The difference is that the open source community tends to address discovered holes very fast, and distributions tend to push out patches very quickly.

This can't be said for all systems. Microsoft has gotten better at this in the last 10 years. Apple is still pretty horrible, leaving known 0-days open without patching for long periods of time.

If you are running an even semi-recent Linux distribution, patches tend to be pushed out to you relatively quickly through the included package manager. If you don't run updates, well, then that's your problem.

I run a few Linux and BSD based servers in my basement. Whenever a new high-profile vulnerability rears its ugly head, I immediately check the progress of patches for it. I leave automatic security updates on for most of them (something like the sketch below), in case the automated system gets to them before I do.

There is a slight risk in this approach, as updates can occasionally break stuff. You might not want to do it on a production system, but it works for me.
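On the Debian/Ubuntu boxes that just means the unattended-upgrades package; a minimal sketch:

    sudo apt-get install unattended-upgrades
    sudo dpkg-reconfigure -plow unattended-upgrades

    # which enables roughly this in /etc/apt/apt.conf.d/20auto-upgrades:
    # APT::Periodic::Update-Package-Lists "1";
    # APT::Periodic::Unattended-Upgrade "1";

CentOS and friends have yum-cron for the same job.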
 
Commenting on his statement about the Bash bug leaving Linux/Unix completely open for 20 years:


If this is what you call open, then EVERYTHING is open ALL THE TIME.

Security bugs are inherent in all software at all times. They only become a problem once found and exploited. That's why patching is so very important, and it is a bad idea to use an OS or software from companies and projects that are poor at delivering timely patches

(or to use software or operating systems that are no longer receiving security updates, like XP).
 

The problem with your statement is that we, the public, tend to ignorantly assume a security flaw is first known about when some university professor publishes on it, but for all we know hackers have been exploiting it for a decade prior. You and I do not realistically know when a flaw was first discovered, because hackers and virus coders are not going to publicly disclose an ace up their sleeve. This is why discovery of these flaws matters: it is not the flaw itself, it is the idea that such a flaw could exist that long undetected by the people who are supposed to be keeping these flaws in check, and it is a canary in the coal mine for the likelihood of other such flaws. The long-standing Linux exploits are just another piece of evidence that when hardly anyone uses an OS, there are fewer eyes on its possible shortcomings.
 
Meanwhile, distro lines like Red Hat / CentOS are almost a year or more behind on current releases...

It's a tradeoff.

Not being current on all software in the package repository isn't necessarily an issue, as security fixes are usually backported, either by the software developers or by the distribution teams themselves.

You will find that distributions such as CentOS/Red Hat, and BSD derivatives such as FreeNAS/pfSense, which are targeted at enterprise users, will be further behind in their releases than consumer-oriented distributions.

The simple reason is stability. Enterprise users value stability above having the latest features and versions of everything. They don't want their mission-critical systems to go down because of a software update, so they let the bugs get worked out over time before deploying known, very stable versions.

Consumer Linux distributions (such as Ubuntu and its derivatives) tend to have newer releases but are still a bit behind, again for stability purposes. Here, though, features matter more, so they are not as far behind.

Then you have the likes of Gentoo for the truly bleeding edge. I used to run it and pride myself on always having the newest bleeding edge, but it was just too much. Updates broke shit all the time, and the constant maintenance just made it too much.

(I should clarify here that this isn't the case for Gentoo as a whole; Portage - its package manager - has configuration options for just HOW bleeding edge you want to be, and in my naivete I went all in.)


As mentioned, it's not a big deal. Security fixes are almost always backported into those older versions while the distribution is still supported, and if you REALLY need newer versions, most distributions allow you to add repositories with the latest versions of software packages to override what is included by default.

You can also download the source code and compile it yourself (a lot easier than it sounds) and be really bleeding edge, but I generally recommend against it, as then you aren't getting the automatic updates the package manager allows for.
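If you ever want to check whether a particular fix really was backported to the version your distribution ships, the package changelog is the quickest place to look. Roughly, assuming reasonably recent apt or rpm tooling:

    # Debian/Ubuntu: security fixes mentioned in the packaged changelog
    apt-get changelog bash | grep -i cve

    # RHEL/CentOS equivalent
    rpm -q --changelog bash | grep -i cve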
 
Not that hard to do updates. Nowadays everything is virtualized. I literally back up the VM and then update. If there is some dependency issue, it's just a quick restore while I work out the problem. There's no excuse not to update.
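On a KVM/libvirt host that's basically just the following (the guest name and snapshot label here are made up):

    # snapshot first, then update inside the guest
    virsh snapshot-create-as web01 pre-update
    ssh web01 'sudo apt-get update && sudo apt-get dist-upgrade'

    # if the update breaks something, roll back
    virsh snapshot-revert web01 pre-update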
 
Zarathustra[H];1041300810 said:
It's a tradeoff.

Not being current on all software in the package repository isn't necessarily an issue, as security fixes are usually backported, either by the software developers or by the distribution teams themselves.

You will find that distributions such as CentOS/Red Hat, and BSD derivatives such as FreeNAS/pfSense, which are targeted at enterprise users, will be further behind in their releases than consumer-oriented distributions.

The simple reason is stability. Enterprise users value stability above having the latest features and versions of everything. They don't want their mission-critical systems to go down because of a software update, so they let the bugs get worked out over time before deploying known, very stable versions.

Consumer Linux distributions (such as Ubuntu and its derivatives) tend to have newer releases but are still a bit behind, again for stability purposes. Here, though, features matter more, so they are not as far behind.

Then you have the likes of Gentoo for the truly bleeding edge. I used to run it and pride myself on always having the newest bleeding edge, but it was just too much. Updates broke shit all the time, and the constant maintenance just made it too much.

(I should clarify here that this isn't the case for Gentoo as a whole; Portage - its package manager - has configuration options for just HOW bleeding edge you want to be, and in my naivete I went all in.)


As mentioned, it's not a big deal. Security fixes are almost always backported into those older versions while the distribution is still supported, and if you REALLY need newer versions, most distributions allow you to add repositories with the latest versions of software packages to override what is included by default.

You can also download the source code and compile it yourself (a lot easier than it sounds) and be really bleeding edge, but I generally recommend against it, as then you aren't getting the automatic updates the package manager allows for.

This was part of the reason why a lot of IT people I know stuck with Windows XP until the bitter end - you were getting a known quantity with years of stable operation (a relative term, I admit) that people had come to rely on. I know a few Linux admins that are actually worse in that regard (running really old software).

The thing with "corporate" Linux is that they pretty much have to get all of the bugs worked out. The Linux gurus I know tend to know enough about several distros to be capable of moving to them at any given time.

I wish their communities were a little more inviting sometimes - I always feel like an outsider, even with friends that eat/breathe/sleep/live Linux.
 
I agree with you, Zarathustra[H]. Luckily, if we do need the latest, we can add in another repo and off we go. I certainly prefer stability over new features for work; it means fewer off-hours phone calls!
 
Good thing that M$ troll isn't here, one less problem for Linux I guess!!
 
How do they know how secure I think my Linux PC is?

Mind reading.

There definitely is a group of people that thinks Linux is impervious to all hacking attempts and malware. I would say this is their wake-up call but their heads are buried so far up their own butts that nothing will get them out.

I think this is actually the first year I can recall where there were more critical bugs affecting my apache web servers than my IIS webservers. If you'd told me that in 1999 I wouldn't have believed you.
 
Well, my Xubuntu machine tells me about updates almost every day. They're usually a little under 100MB, and they install quickly even on an AMD C-60.
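If you're curious what it's about to pull down, a dry run lists the pending packages without installing anything:

    # simulate the upgrade; nothing is actually installed
    apt-get -s upgrade | grep '^Inst'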
 
Anyone who seriously uses Linux in a production environment, personal or otherwise, would know all of this.
Not getting the updates?

Just add the repositories you need, apply the patches yourself, or hell, compile them from source.
It's not a bad article; it was just written for the clueless.

-This-

Who are these Linux users who claim Linux is bulletproof?
Granted, *nix OSes have historically not been the hackers' playground the way Windows has. But invulnerable? No way.
 

Nothing is invulnerable: not Linux, not any version of Windows, not OSX.

I would think most Linux users are savvy enough to know this. (OSX users, on the other hand, tend to be VERY naive in this regard.)

That being said, I'd take Linux over just about anything else, from a security perspective.
 
I think this is actually the first year I can recall where there were more critical bugs affecting my apache web servers than my IIS webservers. If you'd told me that in 1999 I wouldn't have believed you.

Yeah, from a straight-up quantity perspective the open source community has had a rather bad year, but in their defense, the holes were all patched very quickly after being reported.

I remember reading about the bash bug at work, and by the time I had gotten home, all of my servers had already automatically applied the patch.

I also don't necessarily think raw quantity is the best way to compare different platforms.

More eyeballs on open source code is simply going to result in more issues being found, and then fixed. Though I can't prove it, I would strongly suspect that there are comparatively more undiscovered security flaws in Microsoft's stuff, simply because the open source model means the code can be easily reviewed by anyone with the knowledge and desire to do so.
 
Zarathustra[H];1041302818 said:
More eyeballs on open source code is simply going to result in more issues being found, and then fixed.

This claim has always been interesting. The assumption is that because Linux code is open source, there are all these people constantly reviewing it. Maybe that's true, and there are more people reviewing server/desktop Linux code than server/desktop Windows code. However, the way you really ensure code quality is through testing. On the desktop side at least, it's hard to imagine that Linux undergoes more testing than Windows, given the market share disparity.
 

Yeah this. Because Microsoft does a lot of testing on everything, they would never release anything with bugs in it.
 
This claim has always been interesting. The assumption is that because Linux code is open source, there are all these people constantly reviewing it. Maybe that's true, and there are more people reviewing server/desktop Linux code than server/desktop Windows code. However, the way you really ensure code quality is through testing. On the desktop side at least, it's hard to imagine that Linux undergoes more testing than Windows, given the market share disparity.

Yeah, that is true. I'd imagine Windows has a better, more controlled Agile software development life cycle, including best practices such as well-thought-out requirements, architecture documents, and concurrent test code, as well as probably having coding and commenting standards to help deal with the code "obfuscation" problem you often get when you have many chefs rooting in the same pot.

That being said, if this were the case, or at least if this resulted in all the benefits the proponents of controlled software development processes claim, we would expect to see Microsoft products better attuned to user needs, more stable and more secure than they are...

I mean, granted, Microsoft has come A LONG way since the bad old days, but their products still aren't the shining example of stability and security something like this ought to generate.
 
This claim has always been interesting. The assumption is that because Linux code is open source, there are all these people constantly reviewing it. Maybe that's true, and there are more people reviewing server/desktop Linux code than server/desktop Windows code. However, the way you really ensure code quality is through testing. On the desktop side at least, it's hard to imagine that Linux undergoes more testing than Windows, given the market share disparity.

This is what is interesting about open source vs. closed source software development.
Who is at the helm of open source? In the case of Linux, ultimately it is Linus Torvalds himself.
In some cases organizations offer up crap, buggy code and Linus will raise hell.
Consider the recent case with systemd.

http://www.phoronix.com/scan.php?page=news_item&px=MTY1MzA
 
Yeah this. Because Microsoft does a lot of testing on everything, they would never release anything with bugs in it.

Well, keep in mind, software testing is very complicated.

Even with the best programmers in the world, rigid software development processes, and automated code review, you wind up with unexpected interactions between different parts of the code, etc. Software quality and testing try to identify areas of concern and test those, but it is completely impossible to get full coverage; testing 100% of use conditions on 100% of hardware just can't be done. You'd be testing the release for decades and never ship it.

Software is not like mechanical hardware, where you can easily identify your critical parameters, assign them acceptance criteria, and then test against them. As the FDA likes to say, "software is different."
 
Who is at the helm of open source? In the case of Linux, ultimately it is Linus Torvalds himself.

But what direct accountability does Torvalds really have? When Microsoft screws up, there's a big company that people can point fingers at. Who points fingers at Torvalds when there are issues with Linux? Indeed, since he doesn't have direct control over most of this, that wouldn't really make a lot of sense. Sure, he can blast people over stuff, but that's not the same thing as accountability.
 
It's impossible for non-trivial software to have zero defects. That's just a law of computer science.

It's good to know we can set the bar so low. That way it's just the fabricated laws of computer science that are to blame instead of human laziness, incompetence, or anything else. I like pushing responsibility to something abstract. I should totally try using that next time someone points out a typo in one of my books. Like, "It's impossible for non-trivial novels to have no typos. It's a law of writer science." :p
 

Even if it were possible, how do you prove that a non-trivial system contains zero defects? And what would the proof even be? Sure, human deficiencies are part of the problem, but there are also other issues like ambiguity, incomplete requirements, misunderstanding of the problem space, etc., which aren't necessarily "not doing your job" types of issues. Even something as innocent as miscommunication among people is a huge source of software defects.
 
Comparisons between Linux, Windows and OSX vulnerabilities are nonsensical. It is apples and oranges.

First off, yes, ALL software has vulnerabilities.

It's apples and oranges because the deployment demographics are vastly different. Linux is primarily used on servers. A server is more difficult to hack, but the rewards of hacking it are much greater (a full DB dump). You need a lot of resources to pull off a successful hack on a server system, which is why the people dabbling in this realm are big-time, state-level organizations (e.g. the NSA).


Windows (and OSX) is on consumer, Joe-Schmoe desktops and laptops. It has a lot of potential attack vectors (browser, email, USB sticks, etc.), but the rewards are smaller. There is only so much monetary gain to be had from hacking grandma's computer.

Also, Android is Linux, and it has its share of problems.

For some alternative security mechanism, read this: http://en.wikipedia.org/wiki/Capability-based_security
 
Zarathustra[H];1041302818 said:
Yeah, from a straight-up quantity perspective the open source community has had a rather bad year, but in their defense, the holes were all patched very quickly after being reported.
There is a downside though. Once a patch is released for an open source project, the bad guys can very quickly and easily diff the pre- and post-patch source code and craft an attack.

This happened with Drupal recently. Within six or seven hours of Drupal's critical security patch being released, the bad guys were already "owning" Drupal-based sites via an automated attack. It was so bad that Drupal (two weeks later) released an alert saying in no uncertain terms that if you had not patched your Drupal install within seven hours of the patch being released, you should consider your site compromised and revert to a backup taken before the patch was released and the automated exploitation campaign began (by then, two weeks in the past).

Again, I'm not saying open source is a bad thing (far from it). But as with most things, there is a downside even to this "benefit" of having more eyes on the code.

Thankfully, I wasn't sleeping/traveling/vacationing or otherwise occupied when the Drupal update was released. But a seven-hour window to patch production-level sites is very, very small.
 