Steam On Linux Usage Climbs Higher Thanks To The Steam Deck

DukenukemX

Supreme [H]ardness
Joined
Jan 30, 2005
Messages
6,565
Welp, in the video it's failing; I don't know what else to tell you. I've experienced it too. It's one of the most common issues with the drivers being proprietary; there are boards upon boards talking about it. It happens. Typically, upgrading the distro (major point releases) is when the chance of failure is at its highest. The other issue is Wayland development, which is why Nvidia is finally open sourcing its drivers. It's not doing it for zero benefit.
Nvidia is open sourcing because everyone has been asking for it. AMD does it, Intel does it, so it makes sense for Nvidia to do it. The issue is that Nvidia just threw the code online, and that code only works for relatively new GPUs. It's not like AMD, which has AMDGPU, AMDVLK, and RadeonSI. It's not like Intel's ANV driver or their new HasVK driver for older GPUs. You're just building the same driver that Nvidia gives you on their website. For this to work, Nvidia would have to contribute to Nouveau or start their own driver like AMD and Intel do. The code does seem to help the Nouveau guys, but it isn't exactly a big help.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,893
Nvidia is open sourcing because everyone has been asking for it.
It's doing it because the Linux community is tired of Nvidia's shit. Specifically, I'm talking about Wayland and nVidia's EGLStreams approach, which was a wrapper that created a black box when trying to get the display compositor to work correctly with the graphics driver. I've been following this nonsense for years; here's a quick recap:

NVIDIA Proposes Mesa Patches To Support Alternative GBM Back-Ends

"For years NVIDIA was against using GBM and instead proposed using EGLStreams. Some compositors like GNOME's Mutter implemented EGLStreams support but that has only been a mild success with Wayland compositors not liking that rather NVIDIA-specific solution while the open-source GPU drivers all support GBM."

Imagine Intel, AMD, Imagination, Samsung, and ARM all creating at least a decent open source driver, but there's this one a**hole who doesn't want to play nice? That would be nVidia.

I literally switched from nVidia-only to AMD (for my main workstation) specifically because the bundled AMD driver worked FAR better in Linux than even the proprietary nVidia driver did. If you wanted to use nVidia (until recently), you had to pull X11 back in with all of its compatibility layers, which causes havoc when you want to do shit X11 was always bad at, like dual-monitor support. Don't get me wrong, the proprietary nVidia driver is great if you're in a game. But start changing resolutions and moving monitor positions on the desktop and the driver starts to squeal like a piggy - so much so that you have to ditch the GUI, go into the X11 config files, and manually input what you want. Meanwhile, back at the ranch with everyone else, the crap just works with the GUI, which is exactly how it should be.
 
Last edited:

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,956
It's doing it because the Linux community is tired of Nvidia's shit. Specifically I'm talking about Wayland and nVidia's EGL Stream approach which was a wrapper that created this black box when trying to get the display compositor to work correctly with the graphics driver. I've been following this nonsense for years here's quick recap:

NVIDIA Proposes Mesa Patches To Support Alternative GBM Back-Ends

"For years NVIDIA was against using GBM and instead proposed using EGLStreams. Some compositors like GNOME's Mutter implemented EGLStreams support but that has only been a mild success with Wayland compositors not liking that rather NVIDIA-specific solution while the open-source GPU drivers all support GBM."

Imagine Intel, AMD, Imagination, Samsung, and ARM all creating at least a decent open source driver, but there's this one a**hole who doesn't want to play nice? That would be nVidia.

I literally switched from nVidia only to AMD (for my main workstation) specifically because the bundled AMD driver worked FAR better in Linux than even the propriety nVidia driver did. If you want to use that (until recently), you had to pull x11 back in with all of its compatibility layers which causes havoc when you want to do shit x11 was always bad at like dual monitor support. Don't get me wrong the proprietary nVidia driver is great if you're in the game. But you start changing resolutions and moving monitor positions for the desktop and the driver starts to squeal like a piggy. So much so that you have to ditch the GUI go into the x11 config files and manually input what you want. Meanwhile back at the ranch with everyone else the crap just works with the GUI which is exactly how it should be.
No, they aren't. They are doing it because Enterprise and Datacenter customers need the ability to change things faster than Nvidia's current driver model allows. The ability to modify and tweak drivers and code to optimize and customize for specific use cases is a huge deal right now, and open source is the only way to achieve that.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,893
No, they aren't, they are doing it because Enterprise and Datacenter need the ability to change things faster than Nvidia's current driver model allows. The ability to modify and tweak drivers and code to optimize and customize for specific use cases is a huge deal right now and Opensource is the only way to achieve that.
This would be the same thing. No one wants to figure out a black box in order to release an update.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,956
This would be the same thing. No one wants to figure out a black box in order to release an update.
It's more a case of: if we modify this driver function here, we can sacrifice performance on functions we don't use to accelerate functions we do use and drop power requirements by 50W per card for this very specific task - but it will finish the job a month earlier, cost us $3M less in electricity, and let us pick up an extra contract this year. But the overall sentiment is the same.
I was at a virtual conference recently about how specialized hardware is the future of datacenters; ARM and Intel are getting on board with the demands they are getting from the providers too. They want CPUs with custom cores for encode/decode, PCIe IO optimized for certain data types, etc. The cost savings from specializing for what you are doing specifically are too great, and if Intel, Nvidia, or AMD doesn't do it, somebody else will. Altera is looking mighty good for a crapload of tasks, and it is causing Intel and AMD to sweat a little.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,893
It's more a case of if we modify this driver function here we can sacrifice performance on functions we don't use to accelerate functions we do use and drop power requirements by 50w per card for this very specific task. But it will finish the job a month earlier and cost us $3m less in electrical and allow us to pick up an extra contract this year. But the overall sentiment is the same.
I was at a virtual conference recently about how specialized hardware is the future of Datacenters, ARM and Intel are getting on board with the demands they are getting from the providers too. They want CPUs with custom cores for encode/decode, PCIe IO optimized for certain data types, etc... The cost savings in specializing for what you are doing specifically are too great and if Intel, Nvidia, or AMD doesn't do it somebody else will, Altera is looking mighty good for a crapload of tasks and it is causing Intel and AMD to sweat a little.
Going open source doesn't necessarily mean you sacrifice performance. Hell, in AMD's case it's been all upside, considering the Steam Deck.

Going open source matters a lot, especially when you're talking about Enterprise workloads. No one wants to get stuck on a platform because it's closed source. The amount of money spent in those situations is incalculable.

The pre-2010s era was nothing but companies trying to get off of closed source or forced upgrade pathways.
 

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
Welp in the video its failing don't know what else to tell you. I've experienced it too. It's one of the most common issues with the drivers being proprietary. There's boards upon boards talking about it. It happens. Typically when upgrading the distro (major point releases) is when the chance of failure is at its highest. The other issue is Wayland development, which is why Nvidia is finally open sourcing it's drivers. It's not doing it for zero benefit.
Once again.

The only time you have issues regarding the Nvidia proprietary drivers is if you:

  1. Run bleeding edge kernels.
  2. Install the drivers using the shell script as opposed to installing the drivers via your package manager
  3. Run switchable graphics solutions, which aren't trouble free even under Windows.
I have never had a problem with Nvidia drivers running a kernel release that's recent, but not bleeding edge. You will not have a problem with Nvidia proprietary drivers running Ubuntu LTS point releases.
 

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
Hell in AMDs case it's been all upside considering the Steam Deck.
AMD's open source drivers aren't all roses: driver releases lag notably (as in months) behind the release of the latest hardware. As opposed to Nvidia, where drivers are released alongside the latest hardware, hitting most repos a day or two later.

You can't complain about proprietary drivers, while running proprietary games - That's called hypocrisy.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,893
Once again.

The only time you have issues regarding the Nvidia proprietary drivers is if you:

  1. Run bleeding edge kernels.
  2. Install the drivers using the shell script as opposed to installing the drivers via your package manager
  3. Run switchable graphics solutions, which aren't trouble free even under Windows.
I have never had a problem with Nvidia drivers running a kernel release that's recent, but not bleeding edge. You will not have a problem with Nvidia proprietary drivers running Ubuntu LTS point releases.
All I can say is that video says otherwise. I've experienced it too. It's rare but it does happen. You don't have to have bleeding edge kernels to run into it.

One of the biggest problems with Linux users has been this idea that it's perfect when it's not.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,893
AMD open source drivers aren't all roses, driver releases lag notably (as in months) after the release of the latest hardware. As opposed to Nvidia, where drivers are released along with the latest hardware, with the drivers hitting most repo's a day or two later.

You can't complain about proprietary drivers, while running proprietary games - That's called hypocrisy.
Never said they were. But I think it's hypocrisy to identify the benefits of controlling the whole Linux experience (graphics drivers being a part of that) when it comes to gaming / the Steam Deck, but pretend this doesn't apply to nVidia when it obviously does. Who argues for things being proprietary?
 

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
All I can say is that video says otherwise. I've experienced it too. It's rare but it does happen. You don't have to have bleeding edge kernels to run into it.

One of the biggest problems with Linux users has been this idea that it's perfect when it's not.
Rubbish. The current LTS kernel is 5.15, bleeding edge is 6.1-rc4, and the 5.15 branch was released on the 31st of October 2021 - There's no way a point release is going to break your Nvidia drivers running an LTS distro.

Who argues for things being proprietary?
Well who simply expects everything to be FOSS because Linux? As a Linux user I understand the reality that such a perspective is ludicrous.

Another negative to FOSS only is the fact that software constantly gets abandoned, and not forked or picked up by new developers. On a number of occasions I've been forced to either stop using certain software packages altogether or simply use outdated software packages for this very reason. Sure, it's not limited to FOSS only, but it seems more prominent regarding FOSS and it's annoying. Two examples off the top of my head are the Clementine music player and Green With Envy. I can use Strawberry in place of Clementine, but it's not as fully featured as Clementine.

NOTE: I'm by no means anti FOSS, but expecting Linux to be FOSS only is simply unrealistic. Nvidia drivers work as well as their Windows counterparts, the open source AMD drivers are not faultless.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,893
Rubbish. The current LTS kernel is 5.15, bleeding edge is 6.1-rc4, and the 5.15 branch was released on the 31st of October 2021 - There's no way a point release is going to break your Nvidia drivers running an LTS distro.


Well who simply expects everything to be FOSS because Linux? As a Linux user I understand the reality that such a perspective is ludicrous.

Another negative to FOSS only is the fact that software constantly gets abandoned, and not forked or picked up by new developers. On a number of occasions I've been forced to either stop using certain software packages altogether or simply use outdated software packages for this very reason. Sure, it's not limited to FOSS only, but it seems more prominent regarding FOSS and it's annoying. Two examples off the top of my head are the Clementine music player and Green With Envy. I can use Strawberry in place of Clementine, but it's not as fully featured as Clementine.

NOTE: I'm by no means anti FOSS, but expecting Linux to be FOSS only is simply unrealistic, and there's nothing wrong with Nvidia drivers.
Never said the drivers themselves were bad. Also never said everything needed to be FOSS either from a general usage POV.

But you can't speak out of one side of your mouth and talk about controlling the entire stack and leave proprietary drivers out of the discussion. If it makes you feel better I like Nvidia too. Feel better?
 
Last edited:

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
But you can't speak out of one side of your mouth and talk about controlling the entire stack and leave proprietary drivers out of the discussion. If it makes you feel better I like Nvidia too. Feel better?
The point you're missing is the fact that there are advantages to Nvidia controlling the entire stack, like 0 day driver support when new hardware is released, and no need to run bleeding edge possibly unstable kernels due to the fact that the driver isn't bundled with the kernel.

The way FOSS evangelists rant, you'd think they want the same situation MacOS users face: No Nvidia support, period.
 

DukenukemX

Supreme [H]ardness
Joined
Jan 30, 2005
Messages
6,565
AMD open source drivers aren't all roses, driver releases lag notably (as in months) after the release of the latest hardware. As opposed to Nvidia, where drivers are released along with the latest hardware, with the drivers hitting most repo's a day or two later.
Most news leaks of future AMD products come from the open source Linux drivers. The only thing that lags is features like ray tracing and OpenCL.
You can't complain about proprietary drivers, while running proprietary games - That's called hypocrisy.
You can't, but I can.
I was at a virtual conference recently about how specialized hardware is the future of Datacenters, ARM and Intel are getting on board with the demands they are getting from the providers too. They want CPUs with custom cores for encode/decode, PCIe IO optimized for certain data types, etc... The cost savings in specializing for what you are doing specifically are too great and if Intel, Nvidia, or AMD doesn't do it somebody else will, Altera is looking mighty good for a crapload of tasks and it is causing Intel and AMD to sweat a little.
ARM's not the first to do this; IBM tried it with PowerPC with no success. Though as consumers we can't make our own specialized hardware, so really this benefits the corporations who make the devices we buy. Also, I find it funny that ARM doesn't make an FPU standard.

 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,956
Most news leaks of future AMD products are coming from open source Linux drivers. The only thing that lags is features like Ray-Tracing and OpenCL.

You can't, but I can.

ARM's not the first to do this, and IBM's PowerPC has been with no success. Though as a consumer we can't make our own specialized hardware, so really this benefits corporations who make devices we buy. Also I find it funny that ARM doesn't make FPU standard.


Binary arithmetic is only capable of adding. Subtracting is just adding a negative number using the one's or two's complement method; multiplication is just repeated addition; division is just repeated subtraction, which is itself addition with negative numbers; and logarithms are handled with lookup tables or series approximations (such as a Taylor series), working backward to addition from there.

PowerPC was the first to do it, but the architecture was flawed, and computing at a datacenter level was nowhere near what it is today, nor was the software doing it. We're starting to see customized consumer stuff like the Apple M series of desktop chips, and it's only going to branch out from there. AMD and Intel are working on their versions of just such SoC systems, which is where their investment into the UCIe standard comes into play: so they can pick and choose which components they need to customize their silicon for specific jobs and get far more specific in their integrations. The likes of you and I are a LONG way off from being able to get our own customized silicon (at a reasonable rate), but OEMs and big data are already champing at the bit for it. For better or worse, Apple has stirred up the shit with the M1, and we're all gonna get splashed.
 

ManofGod

[H]F Junkie
Joined
Oct 4, 2007
Messages
12,717
The point you're missing is the fact that there are advantages to Nvidia controlling the entire stack, like 0 day driver support when new hardware is released, and no need to run bleeding edge possibly unstable kernels due to the fact that the driver isn't bundled with the kernel.

The way FOSS evangelists rant, you'd think they want the same situation MacOS users face: No Nvidia support, period.

For me, I do not care if it is Nvidia, AMD or Intel on the graphics front. As of now, essentially everything I have works in Windows. In Linux, outside of Steam, most games I have do not operate correctly in any version of Linux; I would say it would be 50/50 whether a game works or not. And unlike others, I am not willing to throw games away just because.
 

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
Most news leaks of future AMD products are coming from open source Linux drivers. The only thing that lags is features like Ray-Tracing and OpenCL.
Despite a handful of exceptions, the open source drivers are renowned for lagging behind hardware releases, sometimes by about six months. This isn't a point that can be argued; it's simple fact.

For me, I do not care if it is Nvidia, AMD or Intel on the graphics front. As of now, essentially everything I have works in Windows and in Linux, outside of Steam, most games I have do not operate correctly in any version of Linux. I would say that outside of steam, in would 50 50 whether a game works or not. And unlike others, I am not willing to throw games away just because.
I run a number of titles under Lutris and really don't experience any issues, with the exception of Origin, which has issues under Windows too and is something I can work around. Essentially, the main issue is anti-cheat and DRM - it's a problem that was supposed to be resolved re: EAC, but good ol' Tim Sweeney pulled the old Microsoft stunt regarding OOXML and made two versions of EAC just to be difficult (probably under direct instruction from Microsoft, hence the similarities).

Having said that, I'm tiring of gaming.
 

Red Falcon

[H]F Junkie
Joined
May 7, 2007
Messages
11,807
Most news leaks of future AMD products are coming from open source Linux drivers. The only thing that lags is features like Ray-Tracing and OpenCL.

You can't, but I can.

ARM's not the first to do this, and IBM's PowerPC has been with no success. Though as a consumer we can't make our own specialized hardware, so really this benefits corporations who make devices we buy.
Agreed with all of this.

Also I find it funny that ARM doesn't make FPU standard.
Not so much with this.
The video is technically correct that ARM CPUs didn't include an FPU as standard until 2004, but the same was true for x86 from the 1970s through the early 1990s, until the 80486DX (but not the SX) and the Pentium (all models) made it a standard inclusion with the CPU.

Even the then-old (from 2003) ARM11 CPU on the original Raspberry Pi from 2012 had an FPU included by default.
So the argument made in the video is kind of silly: the person complains about Cortex-M CPUs from well over a decade ago not all having an FPU, which many of them did not need, since they were meant for 32-bit embedded systems that just ran integer workloads and had no use for the additional cost and/or power draw of an FPU that would have sat idle in such systems.

That is hardly accurate for modern ARM 64-bit CPUs, from low-end SBC and mobile to bleeding edge server, which all have FPUs in them, and far more CPU extensions than Intel and AMD should be comfortable with.
Also, this is why software libraries exist: so that integer instructions can perform floating-point arithmetic, albeit very inefficiently, on CPUs without an FPU, regardless of the CPU ISA.

So the person in the video figured out what coders had to do 30-40 years ago while complaining about an "issue" that hasn't been an issue in nearly 20 years... progress? o_O
Other than that it was a great video, thanks for sharing!
 

serpretetsky

2[H]4U
Joined
Dec 24, 2008
Messages
2,057

"and that most of the smaller cortex m series processors still dont support signed or unsigned division. How is this possible?"
Hehe... welcome to the world of low-cost microcontrollers. If you're trying to synthesize a soft CPU/MCU for a logic-constrained FPGA, sometimes you don't even have a multiply instruction. In low-cost/low-power chips it's usually the first thing to go: it's extra logic, which means space wasted, and it usually takes at least 10 clock cycles, often a lot more.

If you need to divide by a constant, the tricks he mentions are usually very fast anyway. I wouldn't be surprised if C compilers don't bother using divide instructions when dividing integers by a constant.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,956
"and that most of the smaller cortex m series processors still dont support signed or unsigned division. How is this possible?"
Hehe... welcome to the world of low cost microcontrollers. If you're trying to synthesize a soft CPU/MCU for a logic constrained FPGA sometimes you dont even have a multiply instruction. In low cost/ low power chips this is usually the first thing to go. It's extra logic, which means space wasted, and it usually takes atleast 10 clock cycles, usually a lot more.

If you need to divide by a constant the tricks he mentions are usually very fast anyways. I wouldn't be surprised if C compilers dont bother using divide instructions when dividing by a constant for integers.
ARM developers do it with a 64-bit multiply and a shift right.
You can Google "shift right division" for examples and explanations of how and why it works if you want the specifics, but the TL;DR is:
it's an old way of doing binary division from before the C math library was a thing, and it's how it was done in assembly programming.
 

serpretetsky

2[H]4U
Joined
Dec 24, 2008
Messages
2,057
I wouldn't be surprised if C compilers dont bother using divide instructions when dividing by a constant for integers.
Heh, just answered my own question. Didn't realize there are online C-to-assembly compilers:



https://godbolt.org/

1670536196117.png


Modern x64 compilers: I don't see any divide instructions when I'm dividing by 7.

So even on your blazing fast modern AMD or Intel x86 cores, the compiler still avoids the divide instruction when possible.

Well, now I'm curious: what if we do floating-point operations instead?


1670536612099.png


Ah... now I spot a divide instruction.
 

jbltecnicspro

[H]F Junkie
Joined
Aug 18, 2006
Messages
8,256
I really hope Valve has plans to pack up the Deck's gaming experience and plop it on the general Linux ecosystem. Especially with the way Windows is headed. It would make it viable for a whole swathe of the current market.
I'm late to the party but my sentiments exactly. Windows has gone so far down the "big brother" drain that I would really love to be able to jump ship to Linux and still play my catalog.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,956
Heh, just answered my own question. Didn't realize there are online c to assembly compilers:



https://godbolt.org/

View attachment 532802

Modern x64 compilers. Don't see any divide instructions when im dividing by 7.

So even on your blazing fast modern AMD or intel x86 cores the compiler still avoids the divide instruction when possible.

Well now im curious, what if we do floating point operations instead?


View attachment 532804

Ah... now i spot a divide instruction.
And that div probably breaks down to something like this.
mov edx, 0      ; clear EDX (upper half of the EDX:EAX dividend)
mov eax, 0x8003 ; lower half of the dividend
mov ecx, 0x100  ; divisor
div ecx         ; quotient: EAX = 0x80, remainder: EDX = 0x3
 

DukenukemX

Supreme [H]ardness
Joined
Jan 30, 2005
Messages
6,565
I'm late to the party but my sentiments exactly. Windows has gone so far down the "big brother" drain that I would really love to be able to jump ship to Linux and still play my catalog.
This is what people like me, who actually made the switch to Linux, are trying to accomplish. We know Linux isn't there yet, and we'd like it to be. That isn't going to happen by using Windows and complaining about how Linux sucks. Valve made a big-brain move by making their own console and adopting Wine to play Windows games on Linux, especially when you consider nobody else made a similar form factor to play PC games at such a low price. The next step would be to make Wine more like Proton, able to play games outside of Steam. There are Wine-Proton builds that do this, like the GloriousEggroll builds and my personal favorite, Kron4ek's Wine-Proton builds. It's gonna take a while and a lot of community support.
 

jbltecnicspro

[H]F Junkie
Joined
Aug 18, 2006
Messages
8,256
This is why people like me who actually made the switch to Linux are trying to accomplish. We know Linux isn't there yet, and we'd like it to be. That isn't going to happen using Windows and complaining how Linux sucks. Valve made a big brain move by making their own console and adopting Wine to play Windows games on Linux. Especially when you consider nobody else made a similar form factor to play PC games at such a low price. The next step would be to make Wine like Proton and be able to play games outside of Steam. There are Wine-Proton builds that do this, like Glorious EggRoll builds and my personal favorite Kron4ek Wine Proton builds. It's gonna take a while and a lot of community support.
We salute you. If I had more time for games I would be there with you. That said I’m really wanting a steam deck now.
 

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
Did some updates under Lutris yesterday. Updated the runner for the Epic games store and I'm glad to say that Detroit Become Human (the only game I ever bought under the Epic Store) runs even better now than it did before. Also updated the runner for Origin and now experience no crashing whatsoever and BF4 also performs even better now (and it wasn't bad before). While I was at it I changed the user agent string under my browser and downloaded the EA App, I then installed the runner under Lutris for the EA app and pointed the runner to the downloaded file during installation - In under 5 mins I had the EA app running perfectly. I also enabled MangoHUD for my FPS games under Lutris, a procedure that's now as simple as toggling a radio switch.

I also upgraded my video card drivers to the latest Nvidia 525.60.11, which enabled Vulkan 2.0 support and allowed me to enable the -vulkan flag for the Source 2 games that support it. Tested using MangoHUD that Vulkan was the correct renderer, Vulkan running fine under the Source 2 games that support it now and I get a nice performance boost running DXVK.

Very impressed!

UHgPDYH.jpg
 
Last edited:

UnknownSouljer

Supreme [H]ardness
Joined
Sep 24, 2001
Messages
7,686
This is why people like me who actually made the switch to Linux are trying to accomplish. We know Linux isn't there yet, and we'd like it to be. That isn't going to happen using Windows and complaining how Linux sucks. Valve made a big brain move by making their own console and adopting Wine to play Windows games on Linux. Especially when you consider nobody else made a similar form factor to play PC games at such a low price.
We disagree on a lot of things DNX, that much is apparent, but on this we mostly agree. However, I'll say that in order for this to happen, Linux will likely have to get a distro with a steady hand guiding it (perhaps Valve, perhaps someone else) that actually puts in all of the coding time/money, and at the end of the day will likely charge money for it. Which, btw, I'm personally fine with, even though that upsets the "principles" of most Linux users. I would happily pay for a Linux version that took me zero effort to install on any hardware, didn't take any configuration time whatsoever, worked perfectly out of the box, updated itself without breaking anything/everything, and required zero console commands to use. And that's basically the level it will need to be at to see wide-scale adoption.

If such a thing never comes about, then the best we'll ever get is Valve basically turning Linux into the PC equivalent of macOS, which is to say having very integrated hardware/software while the rest of the Linux ecosystem more or less stays exactly the same. The only way this gets anywhere is with people that have a unified direction putting in dev time. And the only way that happens is if people get paid. There is no other way; 20+ years of Linux being fragmented and forked is the obvious evidence of that.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,956
We disagree on a lot of things DNX, that much is apparent, but on this we mostly agree. However I'll say that in order for this to happen Linux will likely have to get a distro that has a steady hand guiding it (perhaps Valve, perhaps someone else) that actually puts in all of the coding time/money and guides it and then at the end of the day will likely charge money for it. Which btw, I'm personally fine with even though that upsets the "principles" of most Linux users. I would happily pay for a Linux version that took me zero effort to install on any hardware, didn't take any configuration time what-so-ever, worked perfectly out of the box, updates itself without breaking anything/everything, and requires zero console commands to use. And that's basically the level it will need to be to have wide-scale adoption.

If such a thing never comes about, then the best we'll ever get is Valve basically turning Linux into the PC equivalent of macOS, which is to say having very integrated hardware/software and the rest of the Linux ecosystem more or less staying exactly the same. The only way this gets anywhere is with people that have a unified direction putting in dev time. And the only that happens is if people get paid. There is no other way; 20+ years of Linux being fragmented and forked is the obvious evidence of that.
I agree, but I am old and cynical. Should a single Linux distro get close to the point where it becomes the ultimate gaming experience, I am sure the infighting will cause it to get forked all to hell, the resource split will cause the initial project to shut down, and the whole thing will become a shadow of what it once was and nothing close to what it could have been.
 

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
We disagree on a lot of things DNX, that much is apparent, but on this we mostly agree. However, I'll say that for this to happen Linux will likely need a distro with a steady hand guiding it (perhaps Valve, perhaps someone else) that actually puts in all of the coding time and money, and at the end of the day will likely charge for it. Which, btw, I'm personally fine with, even though that upsets the "principles" of most Linux users. I would happily pay for a Linux version that took zero effort to install on any hardware, required no configuration time whatsoever, worked perfectly out of the box, updated itself without breaking anything, and required zero console commands to use. That's basically the level it will need to reach for wide-scale adoption.
Which is essentially what the Steam Deck is all about; the advantage is that progress made on the Steam Deck filters down to desktop users. Having said that, an open platform will never be as locked down to one way of doing things as a closed one, by virtue of the fact that it's open and free (as in freedom, not as in beer). Personally, I wouldn't want things any other way, considering what MS are doing to Windows as an advertising platform.
 

UnknownSouljer

Supreme [H]ardness
Joined
Sep 24, 2001
Messages
7,686
I agree, but I am old and cynical. Should a single Linux distro get close to the point where it becomes the ultimate gaming experience, I am sure the infighting will cause it to get forked all to hell, the resource split will cause the initial project to shut down, and the whole thing will become a shadow of what it once was and nothing close to what it could have been.
macOS is just BSD (that works in the way I described Linux needs to in order to be successful). We have one example; there could easily be another. If people want to fork it they can. But it's broadly the software/driver support that makes this work.

All of those forks would be unsuccessful without that, for all the same reasons Linux isn't successful now, and there is no motivation to put in the hours to get that level of support without money.
Which is essentially what the Steam Deck is all about; the advantage is that progress made on the Steam Deck filters down to desktop users. Having said that, an open platform will never be as locked down to one way of doing things as a closed one, by virtue of the fact that it's open and free (as in freedom, not as in beer). Personally, I wouldn't want things any other way, considering what MS are doing to Windows as an advertising platform.
SteamOS is open for now in the same way Android was at its inception. Android is now what it is precisely because it is a closed system.

Unchecked capitalism is bad. No disagreement. But money as a motivating factor to make a product good is the reason we have the devices we're using to type on this forum in the first place. If not the software, then certainly ALL of the hardware. It's fine for everyone to disagree with me, but the proof is in the pudding. Linux is and has always been an incomprehensible mess. And all of the versions of Linux that are good (let's say AWS servers) had huge amounts of money and dev time to ensure all issues were ironed out and to maximize uptime. If you're not paying people to make the OS good, you're paying specialists to keep it going and eventually paying the same specialists to make it good (custom kernels, drivers, everything).

There is no way around this. Even Steam Machines failed as a result of the fact that, without dev time to make broad hardware support easy and possible, they just made $3,000 PCs unusable for the layman. Hence why even major PC builders like Falcon Northwest dropped out and why Steam Machines don't exist today (even in a Valve-only version). So even within Valve's own ranks there is a direct case study showing that "open source Linux" doesn't work for 99.99% of the gaming community.

Again, in 20+ years it hadn't become viable until Valve created a specific hardware/software combination and put in money for dev time while having a very specific directive. Losing sight of the obvious money factor after this success would be entirely missing the point.
 
Last edited:

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
Unchecked capitalism is bad. No disagreement. But money as a motivating factor to make a product good is the reason we have the devices we're using to type on this forum in the first place. If not the software, then certainly ALL of the hardware. It's fine for everyone to disagree with me, but the proof is in the pudding. Linux is and has always been an incomprehensible mess. And all of the versions of Linux that are good (let's say AWS servers) had huge amounts of money and dev time to ensure all issues were ironed out and to maximize uptime. If you're not paying people to make the OS good, you're paying specialists to keep it going and eventually paying the same specialists to make it good (custom kernels, drivers, everything).
The GPL makes it difficult to charge for an OS based on Linux; paying for support is possible, but selling licences to use the OS isn't as simple as you might assume. Even Android is FOSS; it's Gapps that makes Google money. I have no problem paying for anything, provided the fee is fair and reasonable - which is unlikely given our over-capitalized society and ever-increasing inflation and price gouging, not to mention the ever-growing subscription model. I'm not an unlimited resource and can only afford so many subscription-based products.

Having said that, I see nothing regarding my KDE Neon install that isn't good. You have to view things from the perspective of learned habit; if you had used Linux all your life, Windows would be unfamiliar and there would be a learning curve involved - but people still have the ability to learn new things, as evidenced by how quickly people adopted Android, iOS, their Xbox, their PlayStation, or even their smart TV. This concept of being stuck in one familiar way of doing things holds very little merit in our modern society.
 

UnknownSouljer

Supreme [H]ardness
Joined
Sep 24, 2001
Messages
7,686
The GPL makes it difficult to charge for an OS based on Linux; paying for support is possible, but selling licences to use the OS isn't as simple as you might assume. Even Android is FOSS; it's Gapps that makes Google money.
I'll admit to having less of an understanding here, other than I assume there is a tie to the licensing required to use the Linux kernel that means it has to stay open source. But all that tells me is that Valve will push a combined hardware/software model to make Linux viable. Technically SteamOS is free to download and use, but if you're not on Steam Deck hardware, it is NOT a friendly user experience for anyone unfamiliar with Linux.
I have no problem paying for anything, provided the fee is fair and reasonable - which is unlikely given our over-capitalized society and ever-increasing inflation and price gouging, not to mention the ever-growing subscription model. I'm not an unlimited resource and can only afford so many subscription-based products.
No disagreement here. I don't think it's unreasonable at all to spend $200-$300 on an OS, especially if said OS is doing what I want, which we agree on: not spying on me, not serving me ads, and otherwise staying out of my way. For general users, however, it also needs to be dead simple to use.
Having said that, I see nothing regarding my KDE Neon install that isn't good. You have to view things from the perspective of learned habit; if you had used Linux all your life, Windows would be unfamiliar and there would be a learning curve involved - but people still have the ability to learn new things, as evidenced by how quickly people adopted Android, iOS, their Xbox, their PlayStation, or even their smart TV. This concept of being stuck in one familiar way of doing things holds very little merit in our modern society.
I think people are more or less willing to learn another OS if there is value there AND there aren't the barriers I mentioned before. As soon as a SINGLE command line is necessary to do anything, the layman and 99.99% of users are out. Ordinary people don't know how to install OS updates, even critical ones. Ordinary people know nothing about drivers or driver updates. Without systems in place that automate these functions without breaking things when they update, Linux will continue to be niche. Love it or hate it, this is why iOS can get 70% of its install base onto the most updated version of the OS (including critical patches): these functions are dead simple and automate themselves seamlessly while NOT breaking anything.

Here in the HardForums, we're the .01%. In some cases the .001% or even the .0001%. All of this stuff is relatively trivial to us, but Amy in accounting will NEVER learn an OS to the degree the current state of Linux demands. She just wants to open Excel, or her books or whatever, and do her job. These are the users who will hit massive friction points, and ultimately Linux needs to accommodate that user if it wants a meaningful install base. That's more or less the conversation I was trying to have and the points I was trying to make. And frankly, the average gamer or PC user in general isn't that different from Amy in accounting.
 
Last edited:

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
I'll admit to having less of an understanding here, other than I assume there is a tie to the licensing required to use the Linux kernel that means it has to stay open source. But all that tells me is that Valve will push a combined hardware/software model to make Linux viable. Technically SteamOS is free to download and use, but if you're not on Steam Deck hardware, it is NOT a friendly user experience for anyone unfamiliar with Linux.

No disagreement here. I don't think it's unreasonable at all to spend $200-$300 on an OS, especially if said OS is doing what I want, which we agree on: not spying on me, not serving me ads, and otherwise staying out of my way. For general users, however, it also needs to be dead simple to use.

I think people are more or less willing to learn another OS if there is value there AND there aren't the barriers I mentioned before. As soon as a SINGLE command line is necessary to do anything, the layman and 99.99% of users are out. Ordinary people don't know how to install OS updates, even critical ones. Ordinary people know nothing about drivers or driver updates. Without systems in place that automate these functions without breaking things when they update, Linux will continue to be niche. Love it or hate it, this is why iOS can get 70% of its install base onto the most updated version of the OS (including critical patches): these functions are dead simple and automate themselves seamlessly while NOT breaking anything.

Here in the HardForums, we're the .01%. In some cases the .001% or even the .0001%. All of this stuff is relatively trivial to us, but Amy in accounting will NEVER learn an OS to the degree the current state of Linux demands. She just wants to open Excel, or her books or whatever, and do her job. These are the users who will hit massive friction points, and ultimately Linux needs to accommodate that user if it wants a meaningful install base. That's more or less the conversation I was trying to have and the points I was trying to make. And frankly, the average gamer or PC user in general isn't that different from Amy in accounting.
As stated, if all you had used all your life was Linux, Windows would involve a learning curve. Unfamiliar does not automatically equal difficult - unless you're Linus from Linus Tech Tips and fail to read the whole paragraph of warning text telling you exactly what is about to happen if you proceed to install Steam from the PopOS repos, which asked you to type a full sentence in reply before the procedure would continue. What he did was no different from the cartoon where the Coyote on the ACME rocket smashes through every warning bollard before launching off a cliff. People can claim what they like about Linux, but the terminal is very verbose and tells you exactly what is about to happen, assuming you can read, have some form of attention span, and aren't creating clickbait.

I've used it as an example before, but unless you install some hackish, and usually paid, driver under macOS, you need to use the terminal, with commands almost identical to Linux's, in order to write to an NTFS file system. Sure, you can use exFAT, but most USB drives come formatted as NTFS, so that's good enough for most Windows users until they try to use the same drive on a Mac. I haven't had a Linux update break anything since package managers became good, so I haven't experienced an update hosing an OS install in about seven years, but I fix plenty of Windows systems with failed updates.
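For anyone curious what that terminal work actually looks like, the commonly cited approach is to enable macOS's experimental (and officially unsupported) NTFS write mode per volume via /etc/fstab. This is a sketch only - "UNTITLED" is a placeholder volume label, and the read-write mode carries a real risk of data corruption:

```shell
# Edit /etc/fstab using vifs, macOS's safe wrapper for fstab changes
sudo vifs

# Add a line like this, substituting your drive's actual volume label:
#   LABEL=UNTITLED none ntfs rw,auto,nobrowse

# Unplug and reconnect the drive. With "nobrowse" it no longer shows up
# in Finder's sidebar, so open it by path instead:
open /Volumes/UNTITLED
```

Third-party drivers such as Paragon NTFS wrap all of this up in a GUI, which is the "hackish and usually paid driver" route mentioned above.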

We aren't beyond the ability to learn new things, as evidenced by the way people work out new GUIs every day, right down to the touch screen at the local McDonald's. Most people can use Linux without even touching the terminal; as an example, Steam is available as a .deb and, under most Debian-based distros, can be installed via a GUI installer directly from Valve, no different to Windows. In fact, this is how I personally install Steam.

My Wife uses my Linux PC just fine.
 

UnknownSouljer

Supreme [H]ardness
Joined
Sep 24, 2001
Messages
7,686
As stated, if all you had used all your life was Linux, Windows would involve a learning curve. Unfamiliar does not automatically equal difficult - unless you're Linus from Linus Tech Tips and fail to read the whole paragraph of warning text telling you exactly what is about to happen if you proceed to install Steam from the PopOS repos, which asked you to type a full sentence in reply before the procedure would continue. What he did was no different from the cartoon where the Coyote on the ACME rocket smashes through every warning bollard before launching off a cliff. People can claim what they like about Linux, but the terminal is very verbose and tells you exactly what is about to happen, assuming you can read, have some form of attention span, and aren't creating clickbait.

I've used it as an example before, but unless you install some hackish, and usually paid, driver under macOS, you need to use the terminal, with commands almost identical to Linux's, in order to write to an NTFS file system. Sure, you can use exFAT, but most drives come formatted as NTFS, so that's good enough for most Windows users until they try to use the same drive on a Mac.

We aren't beyond the ability to learn new things, as evidenced by the way people work out new GUIs every day, right down to the touch screen at the local McDonald's. Most people can use Linux without even touching the terminal; as an example, Steam is available as a .deb and, under most Debian-based distros, can be installed via a GUI installer directly from Valve, no different to Windows. In fact, this is how I personally install Steam.

My Wife uses my Linux PC just fine.
If this is where we are, agree to disagree. Before Windows, the adoption rate of PCs was incredibly low. Even in the present there is a user base intentionally forgoing full PC devices and moving towards even easier interfaces that do everything they want, such as iPads, tablets, or their phones. A decently large portion of the computing base does all of its computing tasks from a phone. In countries other than the USA, those percentages are non-trivial amounts of people. In the third world, the numbers could well be flipped, with most people on mobile devices and only a small percentage using a "full computing device".

I personally think you're grossly overestimating the average person and their relationship to technology, precisely because of how far skewed you are in the other direction. Most people will NEVER change the oil in their cars. Most people will NEVER do the repairs on their own homes. They will NEVER learn the command line or use a terminal. You might think I'm being ridiculous for being this emphatic; I'm just telling you the average Joe treats computing like an appliance and wants it to work the same way a car does. They care nothing about the technology under the hood, how to fix it, how it works, any of it. It could be a magical box for all they care (and they effectively treat it like one). They just want it to do what they want and stay out of their way.

EDIT: As a humorous aside, it might be worth reading IT-stories-from-hell threads to see just how "dumb" the average person in an office actually is. I don't think this is actually stupidity; it's ignorance, and it ties into the point I'm making: normal people will never learn these things precisely because they don't care about them. In other words, computing is part of their job, not their hobby.
 
Last edited:

Mazzspeed

2[H]4U
Joined
Dec 27, 2017
Messages
3,206
If this is where we are, agree to disagree. Before Windows, the adoption rate of PCs was incredibly low. Even in the present there is a user base intentionally forgoing full PC devices and moving towards even easier interfaces that do everything they want, such as iPads, tablets, or their phones. A decently large portion of the computing base does all of its computing tasks from a phone. In countries other than the USA, those percentages are non-trivial amounts of people. In the third world, the numbers could well be flipped, with most people on mobile devices and only a small percentage using a "full computing device".

I personally think you're grossly overestimating the average person and their relationship to technology, precisely because of how far skewed you are in the other direction. Most people will NEVER change the oil in their cars. Most people will NEVER do the repairs on their own homes. They will NEVER learn the command line or use a terminal. You might think I'm being ridiculous for being this emphatic; I'm just telling you the average Joe treats computing like an appliance and wants it to work the same way a car does. They care nothing about the technology under the hood, how to fix it, how it works, any of it. It could be a magical box for all they care (and they effectively treat it like one). They just want it to do what they want and stay out of their way.
I think you're being a little harsh on modern society in believing it is useless and can't learn new things or do anything for itself. Personally, I believe certain generations aren't useless and can actually learn new things and apply learned skills to their everyday lives.

As stated earlier, every point you're making is essentially what the Steam Deck is all about, and people seem to use it just fine - with the advances made on the Steam Deck filtering down to desktop Linux operating systems.

An operating system that doesn't stay out of your way is the one constantly nagging you with messages about OneDrive and Microsoft accounts; my KDE Neon install just chugs away nicely while I forget I'm running Linux.
 