Hardware acceleration, or lack thereof

No! You aren't getting it. Software rendering is what X11 uses, and it works exactly the same under Wayland - unless it's hacked in via the ported version of Chrome.

You summarized your own ineptness best in your own statement.

And as can be seen in my screenshots, CPU usage is at best 5% lower using hardware rendering as opposed to software rendering - meaning heat output should be effectively identical under software rendering vs hardware rendering. Now, you were quoting considerably higher differences than that under Wayland regarding CPU usage for software vs hardware rendering, were you not?

Because if you weren't, then your opening post isn't very clear. Furthermore, if you weren't, then your claims of increased heat output make no sense, as CPU usage is literally identical under both scenarios. Wayland is not the default for Gnome 3; as with KDE, Wayland is a tech preview - Wayland simply isn't finished and is a work in progress. In comparison, X11 is a polished masterpiece, while Wayland has all sorts of quirky issues.
 
I can't possibly keep wasting my time trying to explain this to you.
1. Software rendering runs like shit on my laptop on X11 and Wayland.
2. Hardware acceleration runs great on my laptop.
3. Hardware acceleration runs great whether X11 or Wayland.
4. No official browser support for hardware acceleration is disappointing.
5. Because your tests on your 4k capable machine resulted in something else doesn't mean that my laptop will have the same results.
6. Due to the above points, hardware acceleration is required in my situation for me to stream videos
 

Let's simplify this. What were the differences in CPU usage observed between hardware and software rendering on your device under Wayland? That's all I want to know, as your posts aren't clear, and software rendering should not run like shit on your laptop, not by any stretch. There should be no real discernible difference in CPU usage between software and hardware rendering.

Screenshots providing evidence would be nice.

I have no idea why you're getting so defensive. Have I quashed a point you were trying to make? My 4K capable machine is circa 2010.
 
You're being accusatory. And your guilty conscience thinks I'm getting defensive. You're dodging the truth until your last breath. I don't have time to spend 2 days explaining something when I've already recorded the information previously.
You have a unique ability to project.
In fact, after running tests just now, X11 performs worse than Wayland with software rendering. As I said, both run like shit. Your point has been getting quashed this whole time. Again, you're projecting. I'll leave these screenshots here...

KDE/Plasma. ~140%+ (not exact):
KDE-Plasma screen.png



X11. ~250% (not exact):
X11 screen.png


Both using the exact same stream with the exact same settings. Both using the exact same browser, version, etc. Again, what works on your machine doesn't necessarily mean it's the right configuration for others. Might be time to own up and say you were wrong. You keep dodging and changing your argument and your topic to try and fit to you being right. Step up and own it. You're wrong. Your blanket statement obviously doesn't cover every case. Stop coming in here and stating that X11 is always better.
 
For comparison, I did the same thing using hardware acceleration. Again, the only difference here is I am able to leverage hardware acceleration as mpv (the player being used) supports this ability. Again, X11 performed worse using the exact same command.

KDE/Plasma. ~13-15% (not exact)

KDE screen mpv.png


X11. ~20%+ (not exact).

X11 screen mpv.png
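Since mpv is doing the heavy lifting in these tests, here's a minimal sketch of the kind of invocation being compared. The URL is a placeholder and the VA-API choice is an assumption for Intel iGPUs; adjust `--hwdec` to whatever your driver exposes:

```shell
# Hardware decode via VA-API (typical for Intel iGPUs); mpv resolves
# YouTube URLs through youtube-dl/yt-dlp internally.
mpv --hwdec=vaapi 'https://www.youtube.com/watch?v=EXAMPLE'

# Force pure software decode for comparison:
mpv --hwdec=no 'https://www.youtube.com/watch?v=EXAMPLE'
```

Running both against the same stream while watching a CPU graph reproduces the comparison above.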
 

Hang on!

Stop attacking me! I'm sick of the attacks! I'm here trying to work through the problem and you're purely focused on attacking me over some stupid Wayland vs X11 rubbish, or some belief of yours that hardware rendering is absolutely necessary because you're experiencing issues. Stop with the bullshit.

What you are seeing there is absolutely not right, not at all. I think your issue is related to your choice of distro. While I'm sure you won't do it, it would be interesting to see the results if you booted into KDE Neon 5.16 and ran the same software rendered video under X11. Live media should be fine.
 
I'm running Fedora. I've currently got KDE/Plasma, Gnome/Wayland, and Gnome/X11 installed. Downloading KDE Neon right now. Will update thread shortly. Did you see my screenshots with hardware acceleration?
 

Yes I saw all of your screenshots.

I just ran the same Top Gun 1080p video on my 2012 Mac Mini (Intel i5 3210M with Intel iGPU) in a VM of Ubuntu Gnome 16.04 - totally outdated, with only one core plus HT in use - and the difference between software and hardware decoding was about 5%, sitting at between 25-35% usage with no frame drops and no issues.

The fact you're running KDE isn't enough, you're still running Fedora. Spin up a live distro of KDE Neon 5.16 and see what happens playing back the same content.
 
Also, you don't need to add logical and virtual cores together when discussing CPU usage; it confuses matters. The software is jumping cores, it doesn't make use of multi-threaded utilization. ;)
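To make the percentages in this thread comparable, here's an illustrative bit of arithmetic (my example, not from the thread): htop's default figure is summed across logical cores, so dividing by the core count gives whole-machine utilization.

```shell
# htop sums usage across logical cores, so a busy process on a 2C/4T
# laptop can read as "250%". Dividing by the logical core count gives
# utilization of the machine as a whole (0-100%):
awk 'BEGIN { printf "%.1f\n", 250 / 4 }'   # 62.5 - the ~250% X11 figure
awk 'BEGIN { printf "%.1f\n", 140 / 4 }'   # 35.0 - the ~140% Wayland figure

# The live logical core count to divide by:
nproc
```

Normalized this way, figures from machines with different core counts can be compared without confusion.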
 
I just tested the same 1080p Top Gun video on my old Intel Core 2 T5500 laptop running KDE Plasma (Ubuntu 16.04 - So old and outdated) and that gets similar performance to what you're seeing. I get ~60-90% CPU utilization with software rendering, no frame drops, and ~25-30% CPU utilization via hardware rendering.

The issue here is that your software rendering performance is abysmal; you're effectively getting the performance of a 2008 processor with 4GB of RAM. [EDIT] I don't consider a Core 2 T5500 to be capable hardware.
 
This is exactly my point. Software rendering is shit. Hardware acceleration should be standard.

Anyway, here's KDE Neon

Screenshot-20190811-022549.png
 

Well, it only appears to be shit on your system, and that's a system I consider to be incapable hardware.

Now, stop using htop and give me a utilization graph over time. Your load averages under KDE Neon are far better and more in line with what I'd expect to see - compared to KDE Fedora you've essentially gained a core considering hyperthreading. However, they don't correlate to the CPU usage highlighted at that particular point in time according to htop.
 
Also, can you play back the Top Gun 2 trailer and show me the stats? Your video is playing back at 960p/30; perhaps there's extra scaling resulting in a rendering overhead. What is your max download speed? Is your max download speed lower under Linux than it is under Windows? Because you're getting ~6Mbps there, which is fairly low for this day and age - are you using a Realtek network adapter that's resulting in higher than normal CPU usage?

vp09 is not that demanding at 1080p. There's something causing your poor software rendering performance, or I'm not seeing the whole picture here.
 
Didn't mean to get your hopes up. I only took that screenshot before the load had a chance to increase. It increased similar to what happened in Fedora. No noticeable difference.

If it were a network card/driver issue, I would expect the same limitation whether I'm using the browser or mpv. I can stream 4 or 5 streams in mpv simultaneously and it doesn't break a sweat.

Edit: I think the 960p thing might be a result of the ratio that Marques Brownlee recorded/uploaded the video at. When I play the Top Gun 2 video, I get 1080p in the stats for the YT video.
 

It's obvious what you did, you put little effort into that screenshot and took a screengrab 'immediately' after starting the video playing.

I tell you what. Let's agree that laptops suck, and that for that reason hardware rendering can have advantages where core count and cooling capacity are limited and you want to minimize noise - it's for this reason I avoid laptops like the plague and use desktops where possible. It's difficult to compare a 12C/24T workstation to a 2C/4T laptop with a tiny heatsink and fan. But understand that, as far as playback is concerned, software rendering will do fine in all situations on capable hardware (and your laptop is capable hardware), and therefore it stands to reason that devs will deprioritize such optimizations, given the time needed to get them working reliably under Linux, in favour of more pressing issues. Linux is not Windows, and considering APIs and the open nature of Linux I wouldn't be at all surprised if such features were far harder to implement under Linux than they are under Windows.

Here are the results of me testing the exact video you're using. On the left of the utilization graph, in the red bubble, is hardware rendering; on the right, highlighted by the black bubble, is software rendering - the differences are minimal, ~5%. As you can see, highlighted by the blue bubble under the core temperature graph, temps actually appear to have 'dropped' using software rendering.

There are advantages to hardware rendering, but it's definitely on a case by case basis. At least you can use the media player of your choosing to open such video streams and render them in hardware.

 
Hence the whole thread... now you get it. As adamant as you were that it didn't matter, hopefully you now see the gripe: hardware acceleration is just as important on Linux as it is on Windows. And, per my initial statements, when I'm talking to anyone and trying to make a solid recommendation, it's difficult to go "all-in" on Linux when there are so many caveats around hardware acceleration. It's nonsense that it isn't officially supported in any browser available on Linux, regardless of compositor.

I'm not even going to address your little dig at me in the beginning of your last reply.
 

You can run X11 with no downsides and gain hardware acceleration under Chrome; most run X11 and have that option - you choose not to. As for my dig, it's obvious what you did at 99 frames - I struggle to even get a screenshot at 99 frames! No point denying it. At least you haven't reported me yet.

The problem with this thread is that you were struggling to provide much needed information - namely CPU utilization under both X11 and Wayland using software rendering - and were purely focused on hardware rendering and the unsurprising issues encountered forcing it via Chrome under Wayland. Based on what you were describing, I was of the opinion that software rendering via your browser was a jittery, stuttery mess with dropped frames and playback issues at 1080p.
 
You were also far too focused on attacking me as opposed to discussing like a mature adult. The attacks weren't warranted or needed. I was simply interested in working through the problem. ;)
 

LOL. It doesn't matter if I took it right away or two minutes later; the result is the same. I was trying to help you understand why the load hadn't spiked yet. I've provided more than enough evidence to support all my previous statements. There's no question there.


You still gotta take digs at me and insult me. Seems you're the one that can't be a mature adult. And you still don't understand the comment I made many posts ago: "I am not experiencing an issue!" You just frustrated the situation by being obtuse and not wanting to accept anything other than your own opinion, even when the evidence is presented.
 

Well, it does. You took that screenshot so fast I'd be surprised if the movement of the mouse pointer wasn't affecting the results. You're failing to understand that your blanket comment "I am not experiencing an issue" provided absolutely no context in relation to your opening post, which made it sound like 1080p playback via your browser using software rendering was totally borked - thanks to your odd need to report CPU usage by adding individual/virtual cores together rather than reporting a single core, when the application was obviously core jumping.

Perhaps if you had opened with a screenshot the situation wouldn't have been 'frustrated'. It wasn't until I saw a screenshot that I worked out exactly what you were doing.

[EDIT]: Having said that, I'm really not interested in this back and forth bullshit. You know you did wrong, as when I pulled you up on it you completely stopped with the attacks.
 
Discovered a couple of additional things last night. MDS and SMT should both be disabled to take advantage of the recent patches for vulnerabilities. Turned those off and I lost 2 threads. Browser still runs like shit when streaming video. And, streaming in mpv still runs great.

While testing, I ran across another article stating that the Intel BIOS has a feature called SpeedStep that should be turned off for performance gains. I turned it off and ran some preliminary tests. Browser still creates exceptionally high load. What I noticed in mpv is that my load decreased by about half. There are others that noted there's potential for additional battery drain. I'm going to do some additional testing to see if there are any other differences with this feature turned off.
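For reference, disabling SMT alongside the MDS mitigation is done via the kernel command line rather than a per-app toggle. A sketch of the relevant grub fragment, per the kernel's admin-guide documentation (adjust paths and flags for your distro):

```shell
# /etc/default/grub - enable the full MDS mitigation and disable SMT.
# "mds=full,nosmt" is the documented kernel parameter; on newer kernels
# "mitigations=auto,nosmt" covers MDS and related vulnerabilities in one switch.
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash mds=full,nosmt"

# Regenerate the grub config afterwards:
#   Debian/Ubuntu: sudo update-grub
#   Fedora:        sudo grub2-mkconfig -o /boot/grub2/grub.cfg
```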
 
I tell you what. Let's agree that laptops suck

Ahem, I happen to love my MacBook Pros. Reliable, quiet, and they just do what I want to do with them. Windows laptops? Horrible unless you pay MacBook prices for them. And even that is no guarantee of anything.
 

I would be paying specific attention to your clock speeds while software rendering, as I believe the cooling solution in your laptop is inadequate and your clocks are throttling. Perhaps if you could bring fan speeds up a touch at lower temps you could gain better control over heat soak, and therefore maintain higher clocks and lower overall CPU usage with a minimal increase in fan noise.

I refer again to my Mac Mini running Ubuntu Gnome 16.04 as a VM using 1C/1T, where the difference between software and hardware rendering was about 5% with no increase in fan noise on a 3rd generation i5.

Ahem, I happen to love my MacBook Pros. Reliable, quiet, and they just do what I want to do with them. Windows laptops? Horrible unless you pay MacBook prices for them. And even that is no guarantee of anything.

Agreed. I've got a MacBook here; it's old (2012) but still perfectly capable. I've got an even older Compaq running a Core 2 T5500 that I used for comparisons in this thread, which interestingly performed software rendering within the browser fine - high CPU usage, but the fans didn't ramp up noticeably at all. That's long since retired now, but it served me well.

The nicest laptops I've used are the Dell XPS series, the XPS15 is a beautiful device, but we're back into Mac territory pricing again...
 
I thought the reason we are pushing away from X11 is the security threat it presents. Basically, every application you run has full I/O access to every other running program (keystrokes, mouse clicks and positions, etc.).

That era of Intel integrated chips was plagued with hardware acceleration problems, and they went through a lot of driver versions trying to fix it, even on Windows. OP, what Intel driver version are you running? I don't even see a Linux driver from Intel for that chip.



I have always felt that MacBook Pros were basically rolling the dice. The one I had was fantastic, but I know a LOT of people with horror stories about them. If you have a real problem, you are at the mercy of the Apple Store A+ kid who could just tell you "buy another one, we are not honoring the warranty" for no rhyme or reason. I think they are fantastic as long as you never really stress them. I have personally witnessed one of them release a small bit of the magic smoke running AutoCAD in a class. I could probably fold on my P51, but it cost as much as a Mac Pro, if not more... :/

The nicest laptops I've used are the Dell XPS series, the XPS15 is a beautiful device, but we're back into Mac territory pricing again...

I agree with this.

Ultimately, I think the moral of the story is not that laptops suck so much as that cheap laptops suck.
 
I thought the reason we are pushing away from X11 is the security threat it presents. Basically every application you are running has full io access of every other program running (keystrokes, mouse clicks and positions, etc.)

I'm of the same opinion; unfortunately, at this point in time Wayland causes more issues than it resolves for me to use it 100% daily. I'd be only too happy to use it on a machine that I don't count on, however. Furthermore, by today's standards X11 is no more insecure than most other applications running on your PC, as nothing is really sandboxed like it is under Android or iOS, with the exception of your web browser. The potential security implications of X11 therefore really only apply to a future where it's assumed everything will be sandboxed - and that's still a distant future, as I'm fairly certain that even Snaps aren't technically sandboxed. ssh -X is the realistic threat, however that's disabled out of the box on every distro I know of.
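The keystroke point is easy to demonstrate: any unprivileged X11 client can subscribe to input events. A sketch using the stock xinput tool (the device id placeholder varies per machine):

```shell
# List input devices and note the keyboard's id:
xinput list

# Dump every key press/release for that device, regardless of which
# window has focus - no root, no special permission needed:
xinput test <keyboard-id>
```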


As someone that repairs laptops, all devices have their share of quirky issues - from BGA solder joints cracking, to flex-cable SATA issues, to HDD issues, cooling issues, and more. I wouldn't buy the latest MacBook, as I don't agree with the implications of the T2 chip and soldered storage, but their devices from ~2012 are still very capable and run well under Linux.


Ultimately I think the moral of the story is not that laptops suck so much as cheap laptops suck.

Valid point, agreed. ;)
 
I'm lucky to have an indie Mac repair shop nearby; they fix things on the cheap. Not that I ever needed one for the MBP - just for replacing a battery and a broken display on an iPhone.

I'm considering taking my older MBP in for a touch pad adjustment. I accidentally dropped the laptop onto a cement floor from a 1 meter high table (bass was shaking it enough to bounce it off) and the pad became stiff.
 

If your MacBook track pad is stiff be sure to check the battery. On certain models the battery swells with age and presses on the underside of the track pad. The batteries are so bad that I've actually watched them swell while sitting idle on the bench.
 
The pad became stiff after it was dropped on the concrete - hard enough to leave 1-2mm deep dents in the aluminum frame... It's most likely just out of alignment, but I'm too busy to fix it myself.

The battery is fine; I opened the frame recently to upgrade the PCIe SSD to a 1TB model (OWC Aura Pro).
 
I'm fairly certain that even Snaps aren't technically sandboxed.

In a properly configured distro, Snaps are 100% confined to their folder (provided the Snap is properly built as well). Ubuntu, Solus, any Ubuntu fork, etc. have fully functional confinement. Arch and Arch-based distros can enable AppArmor in the kernel, but I don't think they are 100% confined by default - you have to deal with all of that yourself. That's why I've never enabled it on my systems. Too lazy. ;)

You can even have real-time notifications if a Snap does something (external or internal) that is against its policy. Canonical has actually done a very good job with Snaps.
 

Are they actually each in their own sandboxed area of memory though? I don't think they are?
 

If they are in strict mode (the mode used by most snaps) they are.

https://snapcraft.io/docs/snap-confinement

If the Snap is installed in classic mode then it isn't confined. So, as long as the Snap is built properly and used on a properly configured system, it is 100% confined from everything.
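A quick way to check this on a given system, using snapd's own CLI (output shape may vary by version):

```shell
# Does this host support strict confinement at all?
snap debug confinement        # prints "strict" or "partial"

# Per-snap: classic-mode (unconfined) snaps are flagged in the Notes column:
snap list
```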
 