Workstation GPUs vs. Desktop GPUs

Nihilanth99

In trying to help someone choose parts for a new system, I realized that I have no clue what physically differentiates a workstation-class GPU (e.g. Nvidia Quadro and ATI FireGL) from a desktop-class GPU, considering they share many components. Basically, what makes these cards better-suited for CAD and 3D modeling? Just how advantageous is it for a professional to invest in a solid workstation-class GPU?
 
Thanks for the replies so far. Is there anywhere I can go to find out exactly what applications benefit from these cards?
 
Well basically, the gaming GPU goes for raw speed, while the workstation GPU is tailored more for image quality than speed.

What is your friend going to be doing with this computer?
 
Well basically, the gaming GPU goes for raw speed, while the workstation GPU is tailored more for image quality than speed.

Not really. The image quality between a Quadro and a GeForce card will be identical. The biggest difference between the two models is the drivers. Quadro drivers are very solid: they are designed around a very limited pool of programs, whereas a gaming card's drivers have to deal with new games being released every week, plus patches, and try to make them ALL work in one driver set. Gaming cards are meant for 3D rendering; that's about all they do. Quadro cards are more of an extension of the computing system as a whole and will help offload work from the CPU in certain programs.
 
Not really. The image quality between a Quadro and a GeForce card will be identical. The biggest difference between the two models is the drivers. Quadro drivers are very solid: they are designed around a very limited pool of programs, whereas a gaming card's drivers have to deal with new games being released every week, plus patches, and try to make them ALL work in one driver set. Gaming cards are meant for 3D rendering; that's about all they do. Quadro cards are more of an extension of the computing system as a whole and will help offload work from the CPU in certain programs.

You forget GPGPU programming.. CUDA, etc.

The difference is in the driver for Nvidia cards. I would suspect the same thing for ATI cards as well. Nvidia cards are extremely easy to soft-mod into Quadro cards. Not sure about ATI cards though.
 
As you can clearly see, the ATI driver programmers have done an amazing job. The two models' hardware is 99% identical, and yet the FirePro adapter completely outclasses the cheaper Radeon gaming card. The most extreme case in point is Maya, where the FirePro V8700 is six times faster than the Radeon HD 4870.

We also decided to investigate if there were visible differences in picture quality between the two models. On a basic Windows desktop we discovered no discrepancies, but as soon as you load a professional graphics application such as Maya or 3ds Max and import a complex 3D model, things change completely. When using the Radeon, you simply have to accept that wire frames will peek out of shaded surfaces all over the place, and that significant clipping occurs as numerous objects are viewed or animated. These phenomena simply don't occur when using the FirePro. Bottom line: those who seek to be frugal with expensive workstation applications should not fall prey to false economies.

I think this part of the article sums up what needs to be said in this thread.
 
I think this part of the article sums up what needs to be said in this thread.

Arrggghhhh. I wish they would just not disable the professional features on the "gaming" card. Sure, the per-card profit would be lower, but I bet they would sell more than enough to make up for it.

From what I could find, the 4800 series of cards is a lot harder to soft-mod than earlier cards.. and it also depends on the driver version as the newer drivers seem to have more "security" to prevent soft-modding.

Hrmmmm.. there has to be a way around it.
 
Try manually installing using the .inf?

Nope. Doesn't work. It seems the drivers have checks for specific card IDs in them, not just in the .inf either.

A modded BIOS may do the trick. I'll have to back up the BIOS on my card, mod it, and then see if that works.
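As a toy illustration of the kind of check being described (the device IDs and function below are completely made up for this sketch, not Nvidia's actual driver code):

```python
# Hypothetical sketch only: a driver-side whitelist of PCI device IDs.
# A real driver reads the ID the card reports over PCI; these IDs are
# invented for illustration.
WORKSTATION_DEVICE_IDS = {0x1234, 0x5678}  # made-up "Quadro" IDs

def driver_accepts(device_id: int) -> bool:
    """Return True only if the card reports a whitelisted device ID."""
    return device_id in WORKSTATION_DEVICE_IDS

# An .inf edit alone doesn't change what the card itself reports, so a
# check like this still fails; a BIOS mod changes the reported ID.
print(driver_accepts(0x1234))  # whitelisted "workstation" ID
print(driver_accepts(0x9999))  # made-up "gaming" ID
```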
 
Workstation cards may also be more thermally robust than their gaming counterparts, as they're expected to run constantly.
 
But the difference for the most part is still just in the drivers. At least for ATI and Nvidia it is that way.

I bet both of them would sell even more cards if they just made it so you could switch between driver modes.

Sure, still sell super high-end workstation cards with EXTRA features, not just workstation cards with existing features enabled that are blocked on the "gaming" cards.
 
What I took from the article is that OpenGL programs are much more time-consuming and expensive to program for. This is why more and more programs are moving to Direct3D.
 
Workstation cards focus on OpenGL and "Gamer/Desktop" cards focus on Direct3D.

This should help explain some things:

http://www.mcadforums.com/forums/files/autodesk_inventor_opengl_to_directx_evolution.pdf

What I took from the article is that OpenGL programs are much more time-consuming and expensive to program for. This is why more and more programs are moving to Direct3D.

Not at all true. Workstation cards focus on workstation applications. The API is irrelevant; it just so happens that most workstation programs use OpenGL. OpenGL is no harder to program for than DirectX, it's just that MS has more money to push DX, therefore DX tends to get the fancy new features first. In fact, OpenGL tends to be *easier* to learn (for the basic stuff, anyway). One of my friends who spent several months digging into DirectX looked at OpenGL code and said, "OpenGL is too easy, where is the challenge?"
 
Not at all true. Workstation cards focus on workstation applications. The API is irrelevant; it just so happens that most workstation programs use OpenGL. OpenGL is no harder to program for than DirectX, it's just that MS has more money to push DX, therefore DX tends to get the fancy new features first. In fact, OpenGL tends to be *easier* to learn (for the basic stuff, anyway). One of my friends who spent several months digging into DirectX looked at OpenGL code and said, "OpenGL is too easy, where is the challenge?"

Alright, well then why are programs moving to DirectX instead of OpenGL? Why would the guy in the article (did you read it?) say what he said? Do you have firsthand experience of OpenGL workstation cards vs. "gamer" DirectX cards?

Quote from the article:
When we use OpenGL, we have found over the past many years (and still today) that we need to invest
in a large, significant amount of QA that simply verifies that the OpenGL graphics driver supports the
OpenGL API on the level that we use (which is actually rather dated, to be consistent with OpenGL GDI
Generic, from circa 1997). In spite of the fact that we do not use any new fancy OpenGL extensions and
use OpenGL almost on the level of 1997 graphics HW technology, we routinely encounter OpenGL
graphics drivers that do not work correctly and as a result, we have our extensive OpenGL graphics HW
certification process which involves a serious amount of testing effort on our part that does not actually
test Inventor, it merely tests the OpenGL graphics driver. In fact, we currently have 44 (and counting)
OpenGL "Workarounds" that we can invoke in our OpenGL graphics layer to "workaround" various
OpenGL graphics driver problems.

In addition, when we use OpenGL, we _never_ use the graphics HW for the rendering of any offscreen
images, so any shaded views in Drawings, images of Parts and Assemblies, thumbnails associated with
files, all printing of shaded 3D views, etc. are _all_ done using the OpenGL SW GDI Generic renderer.
This means that we have a limit on the prospect of actually using newer graphics HW capabilites,
because we cannot do anything on graphics HW that is not supported in the (circa 1997) GDI Generic
renderer; our onscreen and offscreen generated images would not match.
 
But the difference for the most part is still just in the drivers. At least for ATI and Nvidia it is that way.

I bet both of them would sell even more cards if they just made it so you could switch between driver modes.

Sure, still sell super high-end workstation cards with EXTRA features, not just workstation cards with existing features enabled that are blocked on the "gaming" cards.

Ya... I dunno... I have had soft-modded 3870's and now have a v7700.

On one hand the speed is about the same, give or take. The real difference is in stability: my soft-modded cards would crash and burn frequently (drivers just crap out, and even some BSODs). My V7700 is rock solid. I would not run out and pay $1000 for it, but if you can find one in the $400 neighborhood then I would go for it IF your primary focus is on 3D modeling. (I got mine for less than $300 in the 35% cashback days.)
 
The main reason companies are shifting to DirectX is because Microsoft is backing it and developing it all the time. OpenGL is barely "maintained": even though 3.0 came out, it's not really in any better condition than it was at 2.1. After all the companies shift to DirectX, Microsoft can slowly start charging for using the SDK, etc., just like what they are now doing with drivers for Windows (can't use unsigned drivers for Vista x64, etc.).
 
Alright, well then why are programs moving to DirectX instead of OpenGL? Why would the guy in the article (did you read it?) say what he said? Do you have firsthand experience of OpenGL workstation cards vs. "gamer" DirectX cards?

THEY ARE THE SAME CARD. So *PLEASE* stop saying "OpenGL workstation cards" and "gamer DirectX cards". And yes, I have experience with both workstation and gamer cards.

Here, let me explain what happens. The basic overview looks something like this:

Application <-> DirectX/OpenGL <-> Driver <-> GPU.

The card doesn't care if it's DirectX or OpenGL; that doesn't matter. The drivers aren't "optimized" for one or the other, they are optimized for specific programs (or games, depending). Gamer cards don't focus on DirectX, just like workstation cards don't focus on OpenGL. ATI and Nvidia do devote more resources to optimizing their DirectX paths, but that is simply because most games are DirectX (due to reasons already stated; it isn't a technical reason).
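A toy sketch of that layering (the class and method names here are invented purely to illustrate the point that both API front-ends end up at the same driver and GPU):

```python
# Toy model of the stack: Application <-> API <-> Driver <-> GPU.
# The "GPU" and "Driver" layers are identical no matter which API
# front-end the application calls through.

class GPU:
    def draw_triangles(self, count):
        return f"rasterized {count} triangles"

class Driver:
    def __init__(self, gpu):
        self.gpu = gpu
    def submit(self, count):
        # Per-application optimizations would live here, after the API.
        return self.gpu.draw_triangles(count)

class OpenGLFrontEnd:
    def __init__(self, driver):
        self.driver = driver
    def glDrawArrays(self, count):
        return self.driver.submit(count)

class Direct3DFrontEnd:
    def __init__(self, driver):
        self.driver = driver
    def DrawPrimitive(self, count):
        return self.driver.submit(count)

# Same driver, same GPU, two different APIs on top.
driver = Driver(GPU())
gl = OpenGLFrontEnd(driver)
d3d = Direct3DFrontEnd(driver)
```

Both `gl.glDrawArrays(100)` and `d3d.DrawPrimitive(100)` hit the exact same driver and hardware path, which is the whole point.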

Ever hear of a little company called id Software? Their games are OpenGL (including id Tech 5 with its 20 GB of textures per level). Unreal Engine? It has an OpenGL version as well. Hell, how about the PS3? All PS3 games run on an Nvidia 7800 GPU using OpenGL (modified, but still OpenGL).

And no, I didn't read the article you linked because I don't feel like installing a PDF reader. I also don't care what Autodesk has to say.

OpenGL is barely "maintained": even though 3.0 came out, it's not really in any better condition than it was at 2.1.

OpenGL 3.0 was a letdown, yes, but it is very far from just "being maintained". After all, DX 10 was a letdown too, but nobody is claiming that DX is barely "maintained" either. OpenGL can do all the same stuff DX can.
 
I agree that they are pretty much the same hardware. I wasn't arguing that. That's why I put the quotation marks around "gamer". Guess I didn't make that clear.

What I am talking about is the actual application of the cards. I use MicroStation V8i, and it runs on DirectX. I have dual 30" HP monitors, and doing 3D CAD can really put a hurting on my current video card (HD 4850). I am currently trying to find the best video card for running MicroStation V8i so that I can manipulate my work at a decent FPS. From what I have seen, though, the workstation cards are not going to do anything for me other than raise the price, because the workstation cards just use optimized drivers, mainly for OpenGL, from my understanding.

I have not found a single person who has used a top-of-the-line workstation card (driver) vs. a top-of-the-line "gamer" card (driver) in AutoCAD or any other DirectX-based CAD program and given results. It seems the review sites don't touch this arena; they usually test cards in OpenGL programs and stop at that.

I would plop the money down for a workstation card if I knew for sure it would benefit me, but it seems that both versions will perform about the same in DirectX.
 
Seems like the most recent review of the top-of-the-line ATI workstation GPU shows it is way faster than Nvidia's and cheaper to boot (AnandTech review).

Similar specs to a 4870, but it blows it away because it is optimized for things like CAD, etc. :)
 
Seems like the most recent review of the top-of-the-line ATI workstation GPU shows it is way faster than Nvidia's and cheaper to boot (AnandTech review).

Similar specs to a 4870, but it blows it away because it is optimized for things like CAD, etc. :)

Yes, it blows it away in OpenGL apps, but no DirectX apps were tested. That's what I want to see tested.
 
Yes, it blows it away in OpenGL apps, but no DirectX apps were tested. That's what I want to see tested.

You are getting way too hung up on what API the programs use. Let me say this again: the API IS IRRELEVANT. Workstation cards have drivers optimized for workstation programs (CAD, 3D modeling, etc.). The API is 100% irrelevant, just like the API is 100% irrelevant when dealing with games. *IF* ATI has optimizations for the workstation program you use, *THEN* it will be much faster with a FireGL card. That's all there is to it. So stop worrying about whether it is DX or OGL, as it *DOES NOT MATTER*.
 
I have not found a single person who has used a top-of-the-line workstation card (driver) vs. a top-of-the-line "gamer" card (driver) in AutoCAD or any other DirectX-based CAD program and given results. It seems the review sites don't touch this arena; they usually test cards in OpenGL programs and stop at that.

I use AutoCAD 2009 and 3ds Max daily. I do quite a bit of 3D architectural modeling. I currently use the ATI V7700 (not uber high-end, but it does bench right up there with the best of them). I have used 4870's, GTX 280's, 4850's, 3870's... just about every gaming card that I could get my hands on. None of them come close to the V7700 for stability at a decent frame rate on my dual 24's.

With your dual 30's you are going to need a huge frame buffer. I have been trying to get my hands on an FX 4800 or V8700 to see if I can keep more textures loaded. You might even want to go for that 2 GB V8650.

Bottom line: for speed the gamer cards are OK, but they get buggy and produce a lot of artifacts. The workstation cards are stable and produce a nice, clean image.
 
I use AutoCAD 2009 and 3ds Max daily. I do quite a bit of 3D architectural modeling. I currently use the ATI V7700 (not uber high-end, but it does bench right up there with the best of them). I have used 4870's, GTX 280's, 4850's, 3870's... just about every gaming card that I could get my hands on. None of them come close to the V7700 for stability at a decent frame rate on my dual 24's.

With your dual 30's you are going to need a huge frame buffer. I have been trying to get my hands on an FX 4800 or V8700 to see if I can keep more textures loaded. You might even want to go for that 2 GB V8650.

Bottom line: for speed the gamer cards are OK, but they get buggy and produce a lot of artifacts. The workstation cards are stable and produce a nice, clean image.

Thanks for your input- I really appreciate it. I'm not super concerned about clean images right now as I deal with Mechanical systems, so it's just a bunch of duct and pipe. I just want it to be as smooth as possible when I am manipulating the stuff in 3D.
 
If a program is DirectX-optimized, how would it not matter? (Honest question) :)

Because what happens is Nvidia and ATI optimize for specific programs. They both implement the APIs fully and quite well to begin with, but then they go through and do certain tricks to make specific programs faster. The API is simply how the program talks to the driver. All the optimizations occur at the driver level, AFTER the API. As such, the API is irrelevant, since Nvidia and ATI aren't optimizing the API itself.
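The "optimize for specific programs" idea can be sketched as a toy dispatcher (the executable names and code-path labels below are invented for illustration; real drivers do this with far more sophisticated detection and tuning):

```python
# Hypothetical sketch: a driver picks an optimized code path based on
# which application is running, not on which API that application speaks.
APP_OPTIMIZATIONS = {
    "maya.exe": "tuned_wireframe_path",      # made-up path names
    "3dsmax.exe": "tuned_viewport_path",
}

def pick_code_path(executable_name: str) -> str:
    """Return the tuned path for a recognized app, else a generic one."""
    return APP_OPTIMIZATIONS.get(executable_name.lower(), "generic_path")
```

Under this model, a workstation driver is essentially one with a much larger table of entries for CAD and DCC applications; an unrecognized program (OpenGL or DirectX alike) just gets the generic path.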
 
Thanks for your input- I really appreciate it. I'm not super concerned about clean images right now as I deal with Mechanical systems, so it's just a bunch of duct and pipe. I just want it to be as smooth as possible when I am manipulating the stuff in 3D.

It can be pretty annoying; stuff just disappears and there are selection bugs. If you are a casual user and/or this is your home system, I would just get a 3870 (GDDR4) gamer card and do the soft mod. If you are serious about this stuff and it is on a dedicated workstation, then get the workstation card.
 
My laptop has a workstation graphics card and it plays all the games just fine.
 
My laptop has a workstation graphics card and it plays all the games just fine.

That depends on your definition of "just fine"; some people want to have 90 FPS at 1900x1200 with everything set to max...
 