Linux malware evades detection for upwards of 3 years, not sure how

Things with actual links (vs just forums/IRC/mailing lists):
https://ostechnix.com/how-to-disable-ads-in-terminal-welcome-message-in-ubuntu-server/
https://news.ycombinator.com/item?id=23052108

Every Ubuntu release is more bloated than the last - but none of that bloat offers any technical improvements!
Everything just gets fatter, more resource hungry, slower, and less responsive. So yes, plenty of bullshit in Ubuntu.

No, there is not plenty of BS, regardless, and no, it is not slower, less responsive, nor more resource hungry. In fact, it runs faster than Windows on the same hardware, on all 3 of my computers. You just keep being subjective and I will maintain my objective observations of my machines and Linux. :)
 
Well, sample size of one, but I dual boot Ubuntu and Win10 (same machine) and it definitely feels faster on Ubuntu. Not a monster difference, but it's as good or better.

Granted, I have some issues, mostly with high refresh displays that I don't think I will ever get working completely, but otherwise it's pretty good.
 
Profit, from a corporation's perspective. For Linux, the maintainers are hand-picked, have decades of experience, and come from an enthusiast background.
The maintainers at megacorp are also hand-picked. And since it's a paid role, they are motivated.
Also, since you and many others refuse to use the megacorp OS, wouldn't profit drive them to exclude backdoors?

Meanwhile everyone should just trust Linux, cuz you know, hackers and trust and stuff. You're just assuming that because megacorp needs to make money, they are going to go around abusing their users like a mob protection racket. I don't believe that.

I use Linux, and like everything the foundation stands for, but there is absolutely nothing that makes any open source software more secure.

Sure I could examine the source code myself, but that's not an ability most people have. Much less figuring out what millions of lines of shitty code written by amateurs does.

I agree the UoM professor that conducted that experiment should be severely punished, but that's just an example of how easy it is to introduce malware. Would they have been caught if they hadn't published the results?
 
aren't the people that do NOT get paid more motivated to perform well?
if you do something because you WANT to, you do it correctly.
if you do something because you get paid, you do it, barely.

also i have to answer 'yes' to 'would they have been caught'. there are a surprising number of geeks that do sit around and browse linux code. it seems like a weird hobby to me, but quite regularly there are instances of 'G33kL0rd found a potential exploit in prerelease code, patch inserted' in the project logs. Heck, a while back Debian core had an attempted malware injection; it was found literally minutes after upload, and it was injected in a way that they TRIED to make it look as legit as possible.
 
Um, what? When I log in to my Ubuntu 20.04.2 installations, I have no access to root by default. I have to use sudo or an equivalent in order to gain any sort of root access, such as when I am installing with apt-get.
sudo represents a security risk. It's been in the "news" a couple of times in the past year. But inherently it allows you to do convenient things that are more risky, just like defaulting to the superuser is risky. All it does is give you a moment of pause while you type 4 letters and a space vs. having su by default. Windows can even prompt you for elevation now, too. So they are more on par now. This might be fine for most people here; I'm talking about Linux in the hands of Grandma or Joe Average, or even devices. There's an increasing population of poorly secured Linux 'boxes'.
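For reference, all the default Ubuntu setup does is put the first user in the sudo group; what that user may then run is just sudoers policy, and it can be scoped down. A rough sketch (the grandma line is purely illustrative, see 'man sudoers'):

  # /etc/sudoers -- always edit with 'visudo', never directly
  %sudo    ALL=(ALL:ALL) ALL             # Ubuntu default: members of the sudo group can run anything
  grandma  ALL=(root) /usr/bin/apt-get   # illustrative: limit one user to package management only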
 
I am not satisfied by the info in the OP. Has someone more reputable commented on this malware?

As someone else wrote:
Anyone can write an executable to do nefarious things, but what is the entry path onto the system? What compromise is used to install it? Who uses those packages and has them misconfigured to allow this executable to be installed? Without this key information, this admittedly excellent analysis of the payload is useless. What is the CVE number?

My read of the netlab site seems to indicate that this malware cannot escalate privileges. It is limited to whatever the user that installed it can do. The real question is how it gets installed. If it's individual cases of users downloading binary files from sketchy places and running them, then I don't see how any OS can possibly prevent this sort of user-land thing without completely locking down the system, i.e. a game console. If the user has any ability to install and run their own software, and that software can access the internet, then this type of thing is inevitable in my view.

However, by default people are not downloading and running executables on linux that are obtained from random websites. This is rare compared to windows. Most programs are installed through official channels, and these channels use signature checks to ensure that the user downloaded a vetted program, i.e. the exact program compiled and packaged by the maintainer. In this way everything IS similar to a game console - much more than windows, where people are sharing .exe files through dropbox. Unless the maintainer is malicious or has himself been compromised without his knowledge, I don't see how this malware can spread. But I have not seen any evidence that this malware is being spread through official channels.
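To make the signature-check part concrete: on Debian/Ubuntu-style systems the repository index is signed, and apt refuses packages whose hashes don't match it. Roughly what happens under the hood on every 'apt update' (the exact file name under /var/lib/apt/lists depends on your mirror and release, so treat the paths as examples):

  # verify the clearsigned repository index against the distro's archive keyring
  gpgv --keyring /usr/share/keyrings/ubuntu-archive-keyring.gpg \
       /var/lib/apt/lists/archive.ubuntu.com_ubuntu_dists_focal_InRelease

  # optional extra check of already-installed files against the package checksums
  sudo apt install debsums && debsums -s firefox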

EDIT: I left out possible user-land exploits, like if this is installing itself through a JavaScript exploit in Firefox or email attachments. Maybe a weather widget? But we have no info on this, or even on whether it's widespread. Link
 
sudo represents a security risk. It's been in the "news" a couple of times in the past year. But inherently it allows you to do convenient things that are more risky, just like defaulting to the superuser is risky. All it does is give you a moment of pause while you type 4 letters and a space vs. having su by default. Windows can even prompt you for elevation now, too. So they are more on par now. This might be fine for most people here; I'm talking about Linux in the hands of Grandma or Joe Average, or even devices. There's an increasing population of poorly secured Linux 'boxes'.

The only way that an Ubuntu box is poorly secured is if you set a bad password, because otherwise you are good to go. With Windows, you can set it up with no password by default, but as you know, Linux does not allow empty passwords on any user accounts. I am not sure where you would consider a Linux box poorly secured unless, once again, it has a poor password.
 
You seem to be under the impression that windows users give AF about the kernel.
Absolutely not. I know Windows users don't give AF in general; they do, after all, represent the population at large. Nor do they have to, nor do most casual Linux users.

Much less figuring out what millions of lines of shitty code written by amateurs does.

These 'shitty amateurs' work for Google, Facebook, Nvidia, AMD, Valve, IBM, etc., contributing to open source. Shitty code is not exclusive to where you work or don't work.

I agree the UoM professor that conducted that experiment should be severely punished, but that's just an example of how easy it is to introduce malware.

It wasn't actually introduced. It was however a display of what happens when you try, which is that you get banned from contributing.
 
I am not satisfied by the info in the OP. Has someone more reputable commented on this malware?

As someone else wrote:


My read of the netlab site seems to indicate that this malware cannot escalate privileges. It is limited to whatever the user that installed it can do. The real question is how it gets installed. If it's individual cases of users downloading binary files from sketchy places and running them, then I don't see how any OS can possibly prevent this sort of user-land thing without completely locking down the system, i.e. a game console. If the user has any ability to install and run their own software, and that software can access the internet, then this type of thing is inevitable in my view.

However, by default people are not downloading and running executables on linux that are obtained from random websites. This is rare compared to windows. Most programs are installed through official channels, and these channels use signature checks to ensure that the user downloaded a vetted program, i.e. the exact program compiled and packaged by the maintainer. In this way everything IS similar to a game console - much more than windows, where people are sharing .exe files through dropbox. Unless the maintainer is malicious or has himself been compromised without his knowledge, I don't see how this malware can spread. But I have not seen any evidence that this malware is being spread through official channels.
Thanks for actually reading (y)

So after this entire discussion it appears that the attack vector for this malware is not as broad as first implied.
 
sudo represents a security risk. It's been in the "news" a couple of times in the past year. But inherently it allows you to do convenient things that are more risky, just like defaulting to the superuser is risky. All it does is give you a moment of pause while you type 4 letters and a space vs. having su by default. Windows can even prompt you for elevation now, too. So they are more on par now. This might be fine for most people here; I'm talking about Linux in the hands of Grandma or Joe Average, or even devices. There's an increasing population of poorly secured Linux 'boxes'.
The main security feature of linux is where you get your software. On Windows, for the most part, it's done by downloading .exe or .msi files and running them. There is the Microsoft Store, but it isn't exactly as friendly and easy to use as launchpad.net. I don't even need to add repositories when I can just "sudo apt install firefox", whereas the Microsoft Store doesn't give me the option to do that. No Chrome either. The best thing I can do is install chocolatey, which does give me this functionality. I can't say that installing apps from an unknown repository is 100% safe, but it's a lot better than downloading an installer from a website, which you can also do on Linux. Microsoft is so concerned about making a profit with their Store, which nobody uses, that they won't just let people upload free software.
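To put that in concrete terms (package names are from memory, so treat them as examples):

  # Linux: one line, pulled from a signed repository
  sudo apt install firefox

  # Windows, only after bolting on chocolatey
  choco install firefox
  choco install googlechrome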

If Grandma or Joe Average is using Linux, then sudo isn't a feature they're going to be aware of. If they ever get software, it's through the Software Manager. Even if they knew how to use sudo, they'd still be downloading apps through the included and safe repository. If they knew how to add repositories and sudo install, then they aren't your Joe Average.

The only way that an Ubuntu box is poorly secured is if you set a bad password, because otherwise you are good to go. With Windows, you can set it up with no password by default, but as you know, Linux does not allow empty passwords on any user accounts. I am not sure where you would consider a Linux box poorly secured unless, once again, it has a poor password.
When have systems ever been breached because of a weak password? I'm not talking about using "password" as the password, or 12345. Most people are going to use passwords like joemaddon89 or something stupid like their favorite thing plus the year they were born. Even when the password is that stupid, most systems will lock you out after 3 tries, meaning you can't brute-force the password they use. If someone knows your password, it's because you used the same password on another website that didn't patch a vulnerability and stored user information in an unencrypted text file. No amount of unique characters and long passwords is going to stop that. Just don't use the same password on multiple websites as well as your linux computer.
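For what it's worth, the lockout-after-N-tries behaviour is just PAM policy. A rough sketch of how it's typically configured with pam_faillock (assuming your distro ships it and has it enabled in /etc/pam.d; defaults vary, so this is illustrative only):

  # /etc/security/faillock.conf
  deny = 3            # lock the account after 3 failed attempts
  unlock_time = 600   # automatically unlock after 10 minutes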
 
No, there is not plenty of BS, regardless, and no, it is not slower, less responsive, nor more resource hungry. In fact, it runs faster than Windows on the same hardware, on all 3 of my computers. You just keep being subjective and I will maintain my objective observations of my machines and Linux. :)
This may have already been addressed, but he didn't compare Linux to windows. He said each new Ubuntu release is more bloated and slower than the previous and he isn't wrong.
 
This may have already been addressed, but he didn't compare Linux to windows. He said each new Ubuntu release is more bloated and slower than the previous and he isn't wrong.

I find that his points are more subjective than objective. In fact, I find Ubuntu running faster than earlier versions from my own experience. However, of course, Ubuntu 20.04.2 and above is going to run slow on 10 year old hardware.
 
I find that his points are more subjective than objective. In fact, I find Ubuntu running faster than earlier versions from my own experience. However, of course, Ubuntu 20.04.2 and above is going to run slow on 10 year old hardware.
Then he is right. If you keep it on the same hardware, it will get progressively slower over time. But why? Is it because of the added bullshit that's unneeded? Of course, and that isn't subjective.
 
Then he is right. If you keep it on the same hardware, it will get progressively slower over time. But why? Is it because of the added bullshit that's unneeded? Of course, and that isn't subjective.

No, I do not think he is right but simply subjective. No new version of any Linux based OS is going to run fast on 10 year old hardware, nor any other OS, either.
 
No, I do not think he is right but simply subjective. No new version of any Linux based OS is going to run fast on 10 year old hardware, nor any other OS, either.
You need to stop trying to move the goalposts. The statement was that Ubuntu is getting progressively slower and has more bloat. This is in reference to Ubuntu and comparing it to itself.

Ubuntu is now up over 10 GB for a base install. I just installed Mint last week and it was over 10 GB. Stock Ubuntu may very well be different, but I've used many flavours of it over the years and it is taking more and more space and including more and more applications.

This isn't debatable. The applications being provided are offering new features. The OS is being offered with more features. Both of these will affect disk space, ram usage and CPU cycles. Again, not debatable.
 
No, I do not think he is right but simply subjective. No new version of any Linux based OS is going to run fast on 10 year old hardware, nor any other OS, either.
Windows 7 ran faster than vista on my laptop, but that was an exception. Generally, new OSes run slower on the same hardware. Some of that is bloat, and some of that is old technology being phased out and the new tech replacing it not being available on the old hardware (larger RAM/Cache, faster GPUs with new architectures, new cpu instructions, etc).

You can't say none of it is from bloat, but I'll grant that a lot is from tech advances.
 
Windows 7 ran faster than vista on my laptop, but that was an exception. Generally, new OSes run slower on the same hardware. Some of that is bloat, and some of that is old technology being phased out and the new tech replacing it not being available on the old hardware (larger RAM/Cache, faster GPUs with new architectures, new cpu instructions, etc).

You can't say none of it is from bloat, but I'll grant that a lot is from tech advances.

For me, at least until I upgraded to an SSD, Windows Vista was faster than Windows 7. It had to do with the boot caching of programs off of my hard drive: Vista did it immediately, while Windows 7 did it a few minutes after booting up the computer. I hated that I could not make 7 act like Vista in that regard.

As for the term bloat, that is highly subjective, at least in my opinion. However, I have not noticed anything on my computers and they are objectively better with Ubuntu on them than without. (Even for gaming, a lot of the time.)

Edit: The only exception to the subjective nature of the term bloat is the WinSXS folder, that thing is straight up objective bloat, at least in my experience.
 
For me, at least until I upgraded to an SSD, Windows Vista was faster than Windows 7. It had to do with the boot caching of programs off of my hard drive: Vista did it immediately, while Windows 7 did it a few minutes after booting up the computer. I hated that I could not make 7 act like Vista in that regard.

As for the term bloat, that is highly subjective, at least in my opinion. However, I have not noticed anything on my computers and they are objectively better with Ubuntu on them than without. (Even for gaming, a lot of the time.)

Bloat is not subjective, at least not to those of us actually writing and contributing to (FL)OSS. You have no idea what you're talking about, you're enamored with your OS and that's great. That still doesn't change the fact that Ubuntu is getting more bloated with every release - instead of stripping down unnecessary features the base install itself keeps growing, the number of default "services" increases constantly, and the idle RAM consumption is through the roof compared to other distros.
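If you want to check that claim on your own install instead of arguing about it, two quick data points (output obviously varies by release and hardware):

  systemctl list-units --type=service --state=running   # how many services a default boot starts
  free -h                                               # idle RAM usage right after login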

This disturbing trend is not limited to Ubuntu btw. It's kind of sad - we get faster and faster hardware but software is less responsive and slower than 15 years ago. Some interesting reading about general bloating/slowing of software even with faster hardware:
https://tonsky.me/blog/disenchantment/
https://pxlnv.com/blog/bullshit-web/

Edit: The only exception to the subjective nature of the term bloat is the WinSXS folder, that thing is straight up objective bloat, at least in my experience.

Thanks for the laugh Mr Objective.
 
Bloat is not subjective, at least not to those of us actually writing and contributing to (FL)OSS. You have no idea what you're talking about, you're enamored with your OS and that's great. That still doesn't change the fact that Ubuntu is getting more bloated with every release - instead of stripping down unnecessary features the base install itself keeps growing, the number of default "services" increases constantly, and the idle RAM consumption is through the roof compared to other distros.

This disturbing trend is not limited to Ubuntu btw. It's kind of sad - we get faster and faster hardware but software is less responsive and slower than 15 years ago. Some interesting reading about general bloating/slowing of software even with faster hardware:
https://tonsky.me/blog/disenchantment/
https://pxlnv.com/blog/bullshit-web/



Thanks for the laugh Mr Objective.

Yes, bloat is subjective but hey, whatever dude, it is what it is. Oh, and the software I run is faster and more responsive than it was 15 years ago but hey, to each their own.

Edit: Or more specifically, you throw around the word bloat at everything, so it really has lost its meaning.

Edit 2: Oh, and it is not being enamored, it is simply rejecting BS out of hand.

Edit 3: And it is objective bloat, that WinSXS folder, but hey, in my experience, something growing significantly larger than even the OS is truly bloat. But hey, it is what it is.
 
Bloat is real, but it's not unique to Ubuntu or any OS really. It's just a matter of software getting more complex, adding more features or dependencies, etc.

Yes, a text editor was faster 25 years ago, but you also couldn't play AAA 3D games in your browser streamed over the cloud with a wireless Xbox controller (like you can on Stadia) just to give an example.
 
Having new people come to linux and explaining GNU + Linux, kernels, init systems, display managers, window managers, then 1,000 differently named distros, etc., is a major put-off for linux adoption. So it's easier to just say Linux is an operating system :)
Microsoft Disk Operating System (MS-DOS). Both are command-line based operating systems.
 
Bloat is real, but it's not unique to Ubuntu or any OS really. It's just a matter of software getting more complex, adding more features or dependencies, etc.

Yes, a text editor was faster 25 years ago, but you also couldn't play AAA 3D games in your browser streamed over the cloud with a wireless Xbox controller (like you can on Stadia) just to give an example.
Higher-level programming languages are also becoming more complex, taking further advantage of newer hardware functions and instructions, but that also adds significantly more code and layers to execute an operation - depending on the task being executed.
Bloat is definitely real, like seeing banner ads in programs, or programs loading a zillion dependencies which may not even be necessary, etc., agreed.

As for Linux, specifically the kernel, it has definitely become heavier in the last decade, and this malware might just be the tip of the iceberg.

[Image: Strips-IDE-visual-studio-650-finalenglish1.jpg]
 
Paging everyone who whines about forced Windows updates.
I do all my updates under KDE. The difference is there's no perceptible increase in boot times, most of the time there's not even any need to reboot, I'm the one in charge and they've never hosed my whole install.
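On Debian/Ubuntu-based setups you don't even have to guess about the reboot part; there are quick ways to check (needrestart is an optional package, so treat this as a sketch):

  cat /var/run/reboot-required    # this file only exists when a reboot is actually required
  sudo needrestart                # lists services/libraries that merely need restarting instead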

I checked for the files related to this apparent malware, there's nothing there.
 
This disturbing trend is not limited to Ubuntu btw. It's kind of sad - we get faster and faster hardware but software is less responsive and slower than 15 years ago. Some interesting reading about general bloating/slowing of software even with faster hardware:
https://tonsky.me/blog/disenchantment/
https://pxlnv.com/blog/bullshit-web/
This drives me nuts. I had a fantastic experience with Linux on my single-core laptop in 2005. Why does software have to break that with time? Now I can hardly even run CLI-only on the same thing.
 
This drives me nuts. I had a fantastic experience with Linux on my single-core laptop in 2005. Why does software have to break that with time? Now I can hardly even run CLI-only on the same thing.

So, everything should run well on single-core laptops in 2021?
 
Expecting a command-line program to run reasonably well should be... well, reasonable.

Then run the version of the OS that came out at that time, since only a CLI interface is needed. (If the program is newer, then get better hardware to run it.)
 
Then run the version of the OS that came out at that time, since only a CLI interface is needed. (If the program is newer, then get better hardware to run it.)
I think a much more reasonable suggestion would be to run an OS which was configured (or better, designed) to run on that kind of hardware. There are modern operating systems which are, believe it or not. The only issue would be whether the program runs, which might depend simply on compatibility with system libs, or how many dependencies it pulls in.
 
Bloat is not subjective, at least not to those of us actually writing and contributing to (FL)OSS. You have no idea what you're talking about, you're enamored with your OS and that's great. That still doesn't change the fact that Ubuntu is getting more bloated with every release - instead of stripping down unnecessary features the base install itself keeps growing, the number of default "services" increases constantly, and the idle RAM consumption is through the roof compared to other distros.

This disturbing trend is not limited to Ubuntu btw. It's kind of sad - we get faster and faster hardware but software is less responsive and slower than 15 years ago. Some interesting reading about general bloating/slowing of software even with faster hardware:
https://tonsky.me/blog/disenchantment/
https://pxlnv.com/blog/bullshit-web/

There was an interesting article one time comparing OS latency - as in the screen to screen, field to field timing of older OSes and responsiveness of the inputs. (As well as mouse and tool responsiveness for GUI)
It was surprising the extent to which modern OSes are slower in absolute terms. The character based inputs and original Mac OS led the responsiveness, unsurprisingly.
 
I find that his points are more subjective than objective. In fact, I find Ubuntu running faster than earlier versions from my own experience. However, of course, Ubuntu 20.04.2 and above is going to run slow on 10 year old hardware.
I've been dealing with this myself and found a number of solutions to this problem. I'm using Linux Mint 20, so it's basically Ubuntu but with a more demanding UI. The first thing to always do is install bleeding-edge drivers. I used to use the Oibaf PPA, but it tends to break too often for my tastes. It also doesn't support Ubuntu 18.04, which makes it harder to uninstall and upgrade to 20.04. So now I use kisak-mesa, which is maintained by a developer who works for Valve, so you know it's probably good. This probably doesn't matter for Nvidia users since they don't use Mesa drivers. The second thing I do is install Xanmod, which is a high-performance kernel for Ubuntu. This will give a noticeable increase in performance. If I'm installing Xanmod on a multicore CPU like my Ryzen 1700, then I install linux-xanmod-cacule; otherwise the default linux-xanmod should be fine. The third thing I do is disable CPU mitigations. If you use an Intel CPU, this will give you a good deal of performance back, especially if you have an older Intel CPU. I haven't seen malware or a virus use Spectre and Meltdown, and even if there were, they would still need me to download them first and execute them. The performance loss is too great for me to have to worry about what could happen. I will probably do this on my AMD machines as well since their mitigations are starting to hurt performance too.
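Roughly, the commands involved look like this (a sketch from memory; the Xanmod repo setup changes over time, so follow the current instructions on xanmod.org, and keep in mind mitigations=off is a deliberate security trade-off):

  # newer Mesa from the kisak PPA
  sudo add-apt-repository ppa:kisak/kisak-mesa
  sudo apt update && sudo apt upgrade

  # Xanmod kernel (after adding the Xanmod repo and key per xanmod.org)
  sudo apt install linux-xanmod          # or linux-xanmod-cacule on many-core CPUs

  # disable CPU mitigations: add mitigations=off to GRUB_CMDLINE_LINUX_DEFAULT
  sudoedit /etc/default/grub
  sudo update-grub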
 
For what it's worth - Battlefield 4 runs way better, and actually faster, on my linux box than it ever did on Windows. If I can help it, windows will never be my main OS again.
How did you make this work? I'm terrified of being arbitrarily banned....
 
How did you make this work? I'm terrified of being arbitrarily banned....
I know exactly what you're referring to.

It's a security feature in Wine to disallow direct access to ICMP. I believe many anti-cheat systems (e.g. PunkBuster) use the direct ping response for things. Some people spoof their ping response to get into servers that disallow high ping, so this is likely where the banning comes from. The link below will show you how to enable this access and fix the banning issues. This worked 100% correctly for me.

https://wiki.winehq.org/FAQ#Failed_to_use_ICMP_.28network_ping.29.2C_this_requires_special_permissions
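If I remember the FAQ right, it boils down to granting the Wine preloader the raw-socket capability, something like this (binary paths depend on how and where Wine was installed):

  sudo setcap cap_net_raw+epi /usr/bin/wine-preloader
  sudo setcap cap_net_raw+epi /usr/bin/wine64-preloader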
 
I've been dealing with this myself and found a number of solutions to this problem. I'm using Linux Mint 20, so it's basically Ubuntu but with a more demanding UI. The first thing to always do is install bleeding-edge drivers. I used to use the Oibaf PPA, but it tends to break too often for my tastes. It also doesn't support Ubuntu 18.04, which makes it harder to uninstall and upgrade to 20.04. So now I use kisak-mesa, which is maintained by a developer who works for Valve, so you know it's probably good. This probably doesn't matter for Nvidia users since they don't use Mesa drivers. The second thing I do is install Xanmod, which is a high-performance kernel for Ubuntu. This will give a noticeable increase in performance. If I'm installing Xanmod on a multicore CPU like my Ryzen 1700, then I install linux-xanmod-cacule; otherwise the default linux-xanmod should be fine. The third thing I do is disable CPU mitigations. If you use an Intel CPU, this will give you a good deal of performance back, especially if you have an older Intel CPU. I haven't seen malware or a virus use Spectre and Meltdown, and even if there were, they would still need me to download them first and execute them. The performance loss is too great for me to have to worry about what could happen. I will probably do this on my AMD machines as well since their mitigations are starting to hurt performance too.

I have a 3600 and two 3700Xs. Is there a specific reason why you use the cacule one? Also, I just use Mesa 21; is the kisak version better or needed?
 
The things that did should still.
You're telling me word processors and spreadsheet applications suddenly need 16 cores and gigabytes of RAM?

There is always the option of using the older version, which continues to work and is free. (LibreOffice and OpenOffice.)
 
There was an interesting article one time comparing OS latency - as in the screen to screen, field to field timing of older OSes and responsiveness of the inputs. (As well as mouse and tool responsiveness for GUI)
It was surprising the extent to which modern OSes are slower in absolute terms. The character based inputs and original Mac OS led the responsiveness, unsurprisingly.

They must have used machines and devices to determine this, then, because there is no way the human eye is going to notice the difference. In fact, I recall that back in the day, text-based OSes were quite slow on older hardware, but then again, that is what upgrading is all about. :)
 