Universal Windows/Linux Benchmarks?

I'm trying not to sound facetious, but Windows will almost undoubtedly have better graphics performance, if for no other reason than that the driver support is significantly better.

Granted, I'm thinking in terms of games and real-time graphics. GPU computational work, I'd suspect, might be comparable between the two.
 

I wouldn't say that you're wrong, just not as right as you may think.

The benchmarks at Phoronix (there's a whole bunch these days) indicate that the performance differences are negligible across OpenGL game benchmarks for AMD and NVIDIA. Sometimes Windows has the edge, sometimes Ubuntu, etc.

This isn't particularly surprising since the drivers are actually pretty much identical across platforms.

The Intel IGP picture is not as optimistic, however. They were doing great with their Linux drivers for a while there, and then I don't know what happened, but they've really fallen behind (even vs. OS X).

There are other things that have to be taken into consideration such as the effects of composited desktop environments, etc.

But all in all, if you're using the AMD or NVIDIA drivers, the performance should be comparable. At least for NVIDIA, I know that's true.
 
Here are some OpenGL benchmarks for Windows 7 vs. Ubuntu 12.04. You can see that other than some freak cases, the performance is pretty much the same... give or take a few points either way. Except for the Intel drivers... which have fallen way behind.

NVIDIA GeForce GTX 680: Windows 7 vs. Ubuntu 12.04 Linux
AMD Radeon Catalyst Performance: Windows 7 vs. Ubuntu 12.04 LTS
Ubuntu 12.04 vs. Windows 7: Intel Sandy/Ivy Bridge Loses On Linux

And for fun, some Intel benchmarks for OS X and Ubuntu:

Apple Mac OS X 10.7.4 Lion vs. Ubuntu Linux
 

Those are some interesting results. I didn't consider that much of the driver would be the same between OSes. Do you have any real-world performance measures, though? All the ones shown are benchmarking programs.
 

I guess not, since there are no games for Linux to even benchmark. lol.
 
What would be interesting is comparing DirectX performance on Windows to OpenGL performance on both, as I think DirectX is the better-optimized path on Windows.
 
This isn't particularly surprising since the drivers are actually pretty much identical across platforms.

From my experience that is entirely untrue. Drivers for the same piece of hardware are often radically different in implementation because of platform differences.
 
We do have this somewhat interesting anecdote now:

http://blogs.valvesoftware.com/linux/faster-zombies/

Those bare numbers kill the statistician in me: no methodology, number of runs, variance from baseline, etc. A higher average is unimportant if the variance is such that the lower-average platform stays close to its average, while the higher-average platform has some dips into unacceptably slow territory (think Batman: AC under DX11).
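
To put some numbers behind that, here's a minimal sketch (the frame times below are made-up figures for illustration, not anything from the Valve post) of how a higher average FPS can coexist with exactly those dips:

```python
# Illustrative sketch: why a higher average FPS can still feel worse.
# The frame-time traces are invented to show the effect, not measured data.
import statistics

def summarize(name, frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    worst = max(frame_times_ms)
    stutters = sum(1 for t in frame_times_ms if t > 33.3)  # slower than 30 FPS
    print(f"{name}: mean {statistics.mean(fps):.1f} FPS, "
          f"stdev {statistics.stdev(frame_times_ms):.1f} ms, "
          f"worst frame {worst:.0f} ms, frames under 30 FPS: {stutters}")

# Platform A: lower average, but consistent frame times.
platform_a = [18.0] * 95 + [20.0] * 5
# Platform B: higher average, but occasional long hitches.
platform_b = [14.0] * 95 + [80.0] * 5

summarize("Platform A", platform_a)
summarize("Platform B", platform_b)
```

Platform B wins on average FPS here, yet it's the one that stutters below 30 FPS, which is exactly the distinction a bare average can't show.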
 

Agreed. And there are no screenshots; does the rendering look identical between the different versions and OSes?
 
Those bare numbers kill the statistician in me: no methodology, number of runs, variance from baseline, etc. A higher average is unimportant if the variance is such that the lower-average platform stays close to its average, while the higher-average platform has some dips into unacceptably slow territory (think Batman: AC under DX11).

Well I think the important take-away from that entry was that OpenGL vs. Direct3D performance is going to depend on the game engine and how well the developers have done performance tuning. I.e., Valve intends to improve Direct3D in their Source engine now that they've uncovered some bottlenecks: "We have been doing some fairly close analysis and it comes down to a few additional microseconds overhead per batch in Direct3D which does not affect OpenGL on Windows. Now that we know the hardware is capable of more performance, we will go back and figure out how to mitigate this effect under Direct3D."
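
For a rough sense of scale on that "few additional microseconds overhead per batch", here's a back-of-the-envelope sketch; the per-batch cost and batch count are assumptions for illustration, not Valve's numbers:

```python
# Back-of-the-envelope sketch of why a few microseconds per batch matters.
# Both figures below are illustrative assumptions, not numbers from Valve.
overhead_per_batch_us = 5          # assumed extra Direct3D cost per batch
batches_per_frame = 2000           # assumed draw-call batches in a busy scene

extra_ms_per_frame = overhead_per_batch_us * batches_per_frame / 1000.0
frame_budget_ms = 1000.0 / 60.0    # ~16.7 ms budget at 60 FPS

print(f"Extra CPU time per frame: {extra_ms_per_frame:.1f} ms "
      f"({100 * extra_ms_per_frame / frame_budget_ms:.0f}% of a 60 FPS budget)")
```

With those assumed numbers, the per-batch overhead alone eats roughly 10 ms of a ~16.7 ms frame budget, which is why something that sounds tiny per batch is worth mitigating.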
 