installing graphics drivers on a server

Do you load 3rd party graphics drivers on servers?


  • Total voters
    27

goodcooper

[H]F Junkie
so, do you or don't you?

i've come across quite a few boxes that have Standard VGA Adapter.... i can always tell because when i'm shutting down and the background grays out it stutters like crazy...

some people argue the fewer drivers and the smaller the memory footprint on a windows server desktop the better.... third-party driver code is one more thing that could drag down the stability of the OS....


what are your thoughts? i always thought of it as "no big deal"... load it up anyway... unless i was forced to get video drivers from somewhere sketchy like driverguide, and i didn't need any hardware acceleration anyway.

i guess it kind of depends on how much or what you're using that server desktop/gui for... administration? or cod4
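If you want to spot those Standard VGA Adapter boxes without clicking through Device Manager, here's a minimal sketch, assuming a Windows Server 2003/2008 box with WMIC available (field availability varies by Windows version):

```shell
:: List each display adapter with its driver version.
:: A Name of "Standard VGA Graphics Adapter" (or an empty
:: DriverVersion) means Windows fell back to the stock VGA
:: driver and no vendor driver was ever installed.
wmic path Win32_VideoController get Name,DriverVersion
```

Handy to run over psexec or a remote shell when auditing a pile of servers at once.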
 
Yes. Remote desktop applications work better, as do the higher resolutions you want on today's servers. Especially with LCD monitors being more common for servers.....including the 1U KVM panels for server cabinets.

Installing the proper driver really has little to do with a "memory footprint". Servers have (or should have) "Server approved hardware/drivers". Commonly onboard ATI with most servers I've dealt with.
 

Yea I always install the drivers as long as I'm running a card designed to run on a server.

Most server hardware I've dealt with has onboard ATI, although I've seen XGI used on a bunch as well.
 
I always install all hardware drivers on a server. However, I do often disable sound cards in the BIOS.
 
I normally do but I don't think my DL320 / Ubuntu has any drivers for the graphics.
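On the Linux side you can check whether anything actually bound to the graphics device. A minimal sketch, assuming `pciutils` is installed (the device name on a DL320 will differ):

```shell
# Show the VGA device and which kernel driver, if any, claimed it.
# No "Kernel driver in use" line under the device means the card
# is running on the generic unaccelerated framebuffer path.
lspci -k | grep -A 3 -i vga
```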
 
If it needs drivers it gets drivers. :)

I typically only deal with HP servers, so when it's time to gen one up I just use the latest SmartStart CD and it installs everything along with the OS automatically. Except for Server 2008; for some reason the v8 SmartStart CD doesn't install Server 2008 correctly. After the OS is installed it runs fine, however....
 
While I never go to nvidia/ati.com and get the latest for server chipsets, I always install the drivers provided by the server manufacturer.
 
Never seen a sound card on a server.

Setting aside what he was talking about, some of the lower end stuff does have sound onboard. Supermicro has some entry level server/workstation boards like that, for example.

Generally speaking I try to avoid using anything but server grade components for anything that will pull server roles.

In DeaconFrost's setup, if he is using it in a lab setting, then maybe. I've still seen issues pop up where Windows Server will not like something and it turns out to be a driver issue. Same with driver installers that refuse to run because they say 2k3 is unsupported.
 
Only on windows servers; i don't bother if they don't have a gui, obviously.
 
Supermicro has some that are entry level server/workstation boards for example.

:eek: Yeah, I suppose, in the cloner world....which I generally try to run from and get a good server in there soon.

I generally try to get a nice Dell or HP server installed as well, but a Supermicro or Intel server is good too. The mid to higher end stuff from both is very good. I just like my 4 hour response warranty.
 
I tend to only use Dell or HP as well. Dell do some fantastic deals and I had a 1600SC server running 24/7 for over 5 years!
 