Need DirectX 10 and 30" Monitor Advice

HRPC

Want to upgrade my system to play BioShock on a 30" at full resolution with DirectX 10.

I assume that as far as 30" monitors go, the Dell is the one to get.

As for video cards, I currently have a 7800 GT and want a single PCI-E card that will power a 30" monitor and run BioShock at 2560x1600 with DirectX 10.

What is the least expensive Nvidia card that will deliver that? Also, is my X2 3800+ going to cut it? Ah, I remember when my system was really powerful; it was only about 18 months ago ;-)
 
2560x1600 = 4,096,000 pixels
1920x1080 = 2,073,600 pixels

Thus, a 30" display is roughly twice as difficult to drive as the slightly smaller widescreens.
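As a quick back-of-the-envelope sketch (hypothetical Python, just re-doing the arithmetic above):

```python
# Compare how many pixels each resolution has to push per frame.
resolutions = {
    "30in (2560x1600)": (2560, 1600),
    "1080p (1920x1080)": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

ratio = pixels["30in (2560x1600)"] / pixels["1080p (1920x1080)"]
print(f"Ratio: {ratio:.2f}x")  # ~1.98x, i.e. roughly twice the work per frame
```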

You will /need/ SLI'd 8800 GTXs or Ultras in order to maintain playable framerates. It is not a question of "it would be a good idea" but rather "it will be completely and utterly necessary, and may still not work so well at times."

You're welcome to try with a single GTX or Ultra, but be sure to buy a high-quality SLI-capable board (the EVGA 122-CK-NF68-A1/T1 680i SLI board is about as good as it gets for Intel CPUs right now) and be ready to drop another $600 on a second card when you're disappointed by single-card performance... because you will be. Your current CPU is probably going to be insufficient.

It's going to take around $2500 to build a machine that can drive that display reliably, so unless you're willing and able to spend the cash, I'd recommend getting something smaller and considerably less demanding. Half of the $2500 is going to be in GPUs alone -- two 8800 GTXs/Ultras at around $600 each -- plus $300 for a Q6600, $200 for the EVGA motherboard I mentioned, $150-200 in RAM (4GB), $50-60 for CPU cooling, and $200 for a high-quality ~700-800 watt PSU. Case and drives I imagine you already have.

$1200 + $300 + $200 + $200 + $200 + $50 = $2150 + S/H, roughly. Another $200 if you need new drives, $150-200 if you need a new case. That's what it'll take. You could probably sell your current machine to help offset the cost, but I don't know what you'd get for it.
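Just to keep the arithmetic straight, here's a small sketch that totals up those same ballpark figures (the prices are the rough numbers quoted above, not current quotes):

```python
# Rough build estimate using the ballpark prices mentioned above (not live prices).
parts = {
    "2x 8800 GTX/Ultra": 1200,
    "Q6600":              300,
    "EVGA 680i board":    200,
    "4GB RAM":            200,
    "~700-800W PSU":      200,
    "CPU cooling":         50,
}

total = sum(parts.values())
print(f"Core upgrade total: ${total}")                     # $2150 + S/H
print(f"With new drives and case: ${total + 200 + 200}")   # roughly $2550 worst case
```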
 
30" monitors are great and all, and I don't mean to flame, but your system sig looks a little under powered for a 30" monitor and high end gaming.

The only single card that will play Bioshock well at 2560x1600 is an 8800 Ultra. However, I don't think that Bioshock is extreamly CPU intensive, so you'd probably get ok performance with your sig rig at 2560x1600 in Bioshock, though you might have to tweak the settings a bit.
 
I play all games on my Apple at 2560x1600 res with just one 8800GTX and it runs smooth as butter :):) WoW at that res with 4x multisample, all options maxed.

Fuk SLI, it sux and doesn't even work right in Vista x64. And then you get an SLI setup, and a few months later the next generation of cards comes out and your SLI is slow old school.

I would take one 8800GTX over two 7900GTXs any day, and soon a single 9800GTX will be out and beat two 8800GTXs.
 

....WoW is /anything/ but demanding. Of course you can push it at that res. Bioshock, Crysis, UT3, Gears of War, these are all an entirely different story. Please, make sure to at least read the whole discussion before making these sorts of claims. Also SLI has worked in Vista for well over a month now... more like 2, 3 months if I'm remembering correctly. Granted, it shouldn't have taken /nearly/ that long to get sorted out, and I am not apologizing for Nvidia in the slightest (though apparently a big part of the issue was due to something Microsoft did) but it does at least work now.

30" monitors are great and all, and I don't mean to flame, but your system sig looks a little under powered for a 30" monitor and high end gaming.

The only single card that will play Bioshock well at 2560x1600 is an 8800 Ultra. However, I don't think that Bioshock is extreamly CPU intensive, so you'd probably get ok performance with your sig rig at 2560x1600 in Bioshock, though you might have to tweak the settings a bit.

I've yet to see a single benchmark at that res -- pretty much all top out at 1080p (1920x1080) and as I pointed out before, 2560x1600 is actually twice as many pixels to push.
 
Granted, my computer is top of the line right now, but my single EVGA 8800 Ultra handles World in Conflict, Bioshock, and other games in DX10 mode on Vista Ultimate x64 with no problem. Yes, FPS ends up being around 30fps average, but I can max everything (except AA, but it's not as necessary given the resolution) and it is never unplayable. Even in the most demanding scenes, Fraps never reports below 26fps in Bioshock and WIC.
 

Many people don't consider 30 FPS playable, particularly in first person shooters.
 
jeez... it hurts to read stuff like this. I know it's hard struggling through life with your 24" monitor but this would be a purchase you will regret later on when you understand the value of a dollar and the power of compounding interest... unless you are just absolutely filthy rich and already have everything you need...
 

Sure, it's better to save than consume, for the most part. At the same time, when people stop spending, people lose jobs, and then they stop spending too, and then recession.

I'm not saying people should spend beyond their means, but at the same time, we're all going to get old and die someday. Live a little!
 
Many people don't consider 30 FPS playable, particularly in first person shooters.

Really depends on the game, and for a single-player game like Bioshock, 30FPS is fine, especially at 2560x1600. I'm only playing it at 1920x1200 and it's so beautiful that I just like to stop and look at stuff.
 
jeez... it hurts to read stuff like this. I know it's hard struggling through life with your 24" monitor but this would be a purchase you will regret later on when you understand the value of a dollar and the power of compounding interest... unless you are just absolutely filthy rich and already have everything you need...


Maybe they understand the value of a dollar and worked their ass off so they could buy this particular item(s)? This has nothing to do with the value of a dollar, because you know nothing about this person.

No one can judge another person's purchases when you know nothing about their finances. Just because you may be a penny pincher doesn't mean others are. You live life to enjoy it and you work to make money, so why not spend some now and then? You can't spend it once you're dead.

I know and learned the value of a dollar working my butt off as a kid. So given that I own a pair of $400 headphones, a $2000 computer system, and $3500 in camera equipment, you're saying I don't know what the value of a dollar is because "you" think it's something I will regret?


And what defines being rich? For what I make and the country I live in, I am rich; in Canada, I am not rich... Rich is what you want it to be. I feel rich because I have things I want, nice things. Others may define being rich as having a crapload of cash in the bank, but they live in some crap house with nothing nice....
 
I've never found anything above 30fps necessary, even online (of course, keeping it higher is always better), simply because I've never noticed TV or movies running faster, and they're locked to below 30fps, for example.

And I shouldn't have said 30fps average; it's closer to 40, just that in worst-case scenarios it will hit 26-30.
 
Want to upgrade my system to play BioShock on a 30" at full resolution with DirectX 10.

I assume that as far as 30" monitors go, the Dell is the one to get.

As for video cards, I currently have a 7800 GT and want a single PCI-E card that will power a 30" monitor and run BioShock at 2560x1600 with DirectX 10.

What is the least expensive Nvidia card that will deliver that? Also, is my X2 3800+ going to cut it? Ah, I remember when my system was really powerful; it was only about 18 months ago ;-)

In order to game at that res and get a good solid frame rate, you're gonna need a PC in this configuration:

Q6600
4 gigs of RAM
2 x 8800 GTX or Ultras in SLI
Creative X-Fi sound card

The 3800+ X2 isn't gonna cut it on the CPU side of things. Otherwise I would aim for a 24 inch and get a single 8800 GTX.
 
I did a quick run with Fraps and I can say that the FiringSquad numbers are accurate. I ran from 29 to the mid-50s FPS at 2560x1600 (max settings, no AA) depending on the area. I would not want to play at that res, though, because overall it is not nearly as smooth as if I scale it down to 1920x1200. You could get away with it, though, if you were tolerant of a bit of choppiness. BTW, the Q6600 is at 3.33GHz and the GTX is at 650MHz/1GHz.

http://www.firingsquad.com/hardware/bioshock_directx10_performance/page5.asp

Looks pretty playable on a higher-end rig at 2560x1600. On my stock Q6600, 4GB, 8800 Ultra, Vista x86 machine at max settings in DX10, it's smooth as 12-year-old Scotch.
 
I've never found anything above 30fps necessary, even online (of course, keeping it higher is always better), simply because I've never noticed TV or movies running faster, and they're locked to below 30fps, for example.

And I shouldn't have said 30fps average; it's closer to 40, just that in worst-case scenarios it will hit 26-30.

TV uses a "blurring" effect between frames; video games do not, so you see the frame transition. You cannot compare 30FPS on a TV or in a movie to a video game; they use different methods of showing the frames, which is why many people can see well above 60FPS in a video game, and see lagging and such, but you won't see it on a TV.

Watch a movie during a fast scene: things aren't clear, they get blurry, things aren't in focus.

Play a fast video game, and if you could stop the game mid-move, you would still see every detail sharply.

http://en.wikipedia.org/wiki/Frame_rate

The reason computer rendered video has a noticeable afterimage separation problem and camera captured video does not is that a camera shutter interrupts the light two or three times for every film frame, thus exposing the film to 2 or 3 samples at different points in time. The light can also enter for the entire time the shutter is open, thus exposing the film to a continuous sample over this time. These multiple samples are naturally interpolated together on the same frame. This leads to a small amount of motion blur between one frame and the next which allows them to smoothly transition.
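A minimal sketch of that idea (a made-up 1-D "scene" and NumPy, purely for illustration): a camera-style frame integrates several sub-samples across the shutter interval, so a moving object smears into motion blur, while a game-style frame is a single instantaneous sample with no smear.

```python
import numpy as np

def scene(t):
    # Hypothetical 1-D "scene": a bright object moving across 64 pixels.
    frame = np.zeros(64)
    frame[int(t * 60) % 64] = 1.0   # object position at time t (seconds)
    return frame

dt = 1 / 30                          # one 30 fps frame interval

# Game-style frame: one instantaneous sample -> the object is a single sharp pixel.
game_frame = scene(0.0)

# Camera-style frame: average several sub-samples across the open shutter
# -> the object smears across a few pixels, hiding the frame-to-frame jump.
subsamples = [scene(t) for t in np.linspace(0.0, dt, 8)]
camera_frame = np.mean(subsamples, axis=0)

print("lit pixels (game):  ", np.count_nonzero(game_frame))    # 1
print("lit pixels (camera):", np.count_nonzero(camera_frame))  # a few, i.e. motion blur
```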
 
What are you all smoking? A 30" display does NOT need SLI. Maybe Crossfire, because that actually works in Vista x64 and doubles performance in every game, but SLI sux.

I don't even like the idea of either Crossfire or SLI systems, though; it is just a way to steal more money from geeks. A single 8800 Ultra is more than enough for today's games -- not tomorrow's games, but any game available today will run smooth at 2560x1600 res on a 30" LCD with an 8800 Ultra or GTX. I play these games at my default res with my system in sig:

- WoW = 2560x1600 res, 4x multisample, all options maxed

- QuakeWars beta = 2560x1600 res, 2xAA, all options highest

- HalfLife2 = 2560x1600 res, 4xAA, all options maxed

- Supreme Commander = 2560x1600 res, no AA, all options maxed


These are the main games I play right now, and with a single 8800GTX they play smooth as heck. I just don't see the need for another GTX, because the shitty thing with SLI is that you pay 100% more but only get a 50-75% increase at most in games. It does NOT double the benchmarks in games and work perfectly all the time; if it did, then I would be all for SLI. And with Vista x64 right now, don't even think about it.

SLI is always one-upped by the next-generation video card, so soon a 9800GTX will be out that is the same speed as two 8800GTXs in SLI.
 
SLI isn't broken in Vista 64bit anymore.

Crossfire does not magically give you a 2x framerate boost. It works more or less exactly like SLI.

SLI is not some kind of conspiracy. You probably won't get a 2x framerate boost, but you will get a smoother overall experience -- higher minimum framerates, less fluctuation. A consistent 80 FPS is a much better experience than 120 FPS that occasionally drops into the mid 20s.
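To put numbers on that last point, here's a tiny sketch with made-up frame-time traces (not real benchmark data) showing how a spiky run can post a great average FPS and still a much worse minimum:

```python
# Hypothetical frame-time traces in milliseconds (made-up numbers, not benchmarks).
steady = [12.5] * 200                 # ~80 FPS, every single frame
spiky  = [8.3] * 197 + [45.0] * 3     # ~120 FPS most of the time, a few dips to ~22 FPS

def summarize(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000.0 / max(frame_times_ms)
    return avg_fps, min_fps

for name, trace in (("steady", steady), ("spiky", spiky)):
    avg, worst = summarize(trace)
    print(f"{name}: {avg:.0f} FPS average, {worst:.0f} FPS minimum")
```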

HL2 is aging. It's a great game, but it's not new. WoW is not at all demanding, as I've said before; you don't need an 8800 by any stretch of the imagination to run it well, even at max settings. Supreme Commander is newer, but much more CPU limited than GPU limited. Quake Wars is based on an aging engine with some new stuff bolted on.

A 9800 GTX may very well be just that fast... or it might not. Radeon HD 2900 XTs look a hell of a lot better on paper than they do in practice -- we may run into a similar situation with the 9800 GTX. A 9800 GTX also isn't available now. 8800s are.

Can you play Bioshock at that resolution? Call of Juarez in DX10 mode? Come September 25th will you be able to run the Crysis demo? I seriously doubt it.
 

Exactly, you will want at least a single Ultra, but will most likely need SLI'd GTX/Ultras to have very nice fps in these newer games / future games.
 
I agree with Silent-Circuit.

The 4MP 30" LCD is what SLI was made for... pushing tons of pixels at a fast rate.
 
First off, certain games are WORSE in SLI. Take WoW again, for example: a single card gets like 1-2% faster frames than an SLI or CrossFire setup. That engine doesn't take well to multiple video cards; it's like it just doesn't even enable them??

I just don't see the point in spending another 100% on a video card to get less than that back in performance. I could live with getting 80-90% back in performance with SLI, but that never seems to be the case; it is usually only a 50-75% increase in the real world. For that much cash I just don't see the point, especially if some games do not even take advantage of SLI, period.

I would rather wait a few months and get the next-generation card, where a single card beats two previous-generation cards in SLI.

But back to the main thread: I think an overclocked 8800 Ultra running at 650MHz (there are a few brands selling them like that stock) would be the best choice right now to run a 30" display. That would make it 15-20% faster than my card, and I can play all games fine at 2560x1600 res, with newer games at 2xAA.

My opinion: if you're getting a single card, only buy a superclocked 8800 Ultra; if you want two video cards, then buy two 1GB 2900 XTs. That would be a lot cheaper than two Ultras but still give you really nice performance.

Check this review out: in BioShock at 2560x1600 res, a single 8800GTX gets 41fps and two 8800GTXs in SLI get 50fps. That is only about 20% faster performance for a 100% increase in price. Umm, no thanks;
http://www.firingsquad.com/hardware/bioshock_directx10_performance/page6.asp
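For what it's worth, here's that same comparison as a quick sketch (the 41 and 50 fps figures are from that FiringSquad page; the ~$550-per-card price is just an assumption for illustration):

```python
# FPS numbers from the FiringSquad BioShock DX10 test at 2560x1600 (single GTX vs SLI);
# the per-card price is an assumption, only there to illustrate the scaling argument.
card_price = 550
single = {"fps": 41, "cost": card_price}
sli    = {"fps": 50, "cost": card_price * 2}

speedup    = sli["fps"] / single["fps"] - 1    # ~22% more frames
extra_cost = sli["cost"] / single["cost"] - 1  # 100% more GPU spend

print(f"SLI speedup:     {speedup:.0%}")
print(f"Extra GPU spend: {extra_cost:.0%}")
print(f"FPS per dollar:  single {single['fps'] / single['cost']:.3f}, "
      f"SLI {sli['fps'] / sli['cost']:.3f}")
```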
 
Scaling up core clocks does not equate to scaling up speed in real-world applications. My 8800 GTX at 675MHz is not 17% faster in framerate than it was at 575MHz.
 
Since Nvidia and ATi seem to hint the future is multi-chip GPUs, I hope they fix it.
Here's a thread where others aren't happy with their costly returns from running SLI/Crossfire:
Crossfire and Crysis?
 
It is not a 100% price increase. You should factor in the total price of the computer and understand that the video card is the main performance determinant.

And since you brought up this performance/price topic: your Q6600/4GB/Raptor doesn't bring a noticeable performance increase over my sig rig in games, so why did you still buy them???
 
I just don't see the point in spending another 100% on a video card to get less than that back in performance. I could live with getting 80-90% back in performance with SLI, but that never seems to be the case; it is usually only a 50-75% increase in the real world. For that much cash I just don't see the point, especially if some games do not even take advantage of SLI, period.

Okay, great, YOU don't see the point in spending 100% more, because the games YOU play don't need the power. But for people who play other games and NEED the power, maybe they DO see the point in SLI. The four games you listed aren't high end in ANY respect graphics-wise whatsoever!

Not everything is for everyone, and some people don't want to wait. But since you're going to wait a few months to buy a next-gen card, then why stop there? Why not wait another 6 months to get the card after that! Oh wait, no, might as well wait another 6 months for the card that will beat out the card you're waiting for... see the process?


Buy what you need/want when you want it; the waiting game is pointless because you will always be waiting and waiting and never use your computer!

Some people like a smooth 80FPS with little dropping and want to know that they can most likely play that game coming out in 3 months. Some people have a timeline and a budget to get parts within X time, and SLI/Crossfire is part of the plan.

So get over the fact that you don't want SLI/Crossfire and others do/will.


Last note: how long did it take for that Ultra card to come out? Before that, there was no single card to compare to SLI'd 8800 GTXs either. Now they have an Ultra out -- great, it performs close to or better than SLI -- but find more benchmarks and I am sure in places SLI will still come out on top of a single card...
 
Fine, go out and spend another $500-$600 on a second card that will give ya an extra 10fps in games at 2560x1600 res :rolleyes:

I have the money to spend on it if I want, but I am being logical, and spending that much money on a second video card that only gives a 20% increase in games is just crazy. I mean, why bother? You would be better off upgrading another part of your system instead. And WoW doesn't even take advantage of SLI, period, so the second card is useless in that game.

Show me some solid benchmarks where SLI gives even close to double the performance of a single card at 2560x1600 res with AA enabled.
 
Try running some Oblivion, GRAW2, Call of Juarez, UT3, or some Crysis on a single card and your tune shall change on the quick. World of Warcraft is hardly what I (or anyone else for that matter that actually knows anything) would call a graphically strenuous title. ROFL.

For some people who simply want the best, or who have really high resolution monitors and maxed-out details, SLI is the only option. Running at 2560x1600 with AA and without compromising on detail is such a case, unless you have an 8800 Ultra with LN2 on tap, and even that's stretching it.

Granted, nVidia screwed the pooch big time on SLi support for Vista, but sooner or later (hopefully sooner) it will get fixed. And yes, it isn't without its issues, but it does have a purpose. And for most games it does work well. Things will get better. The real question is, what are you SLI haters smoking? Or are you just jealous that you can't afford such an extravagant configuration, so you just knock it, tar and feather it, and throw it to the dogs?

Grow up and do some research before you start kicking around your two cents, eh?
 
Granted, nVidia screwed the pooch big time on SLi support for Vista, but sooner or later (hopefully sooner) it will get fixed.

Yeah, but when in the world will that be? I wouldn't have minded going SLI, but gambling that both Nvidia and Microsoft fix the problems on their ends seems like worse odds than hitting the jackpot in Vegas right now. I mean, the good old display-driver-stops-working issues still happen months later, and now I have to disable Vista Aero and lower the clocks on my cards just to make sure it doesn't happen.
 

...it's fixed and has been for over a month, as I've said several times now. I'm sure SLI setup owners will be glad to confirm this.
 
Really? I recently read that on certain games it wasn't working as well, but that might be the game itself.
 

Key words there being "On certain games". That's been the case with dual card setups pretty much since their inception. This is true of Crossfire and SLI both. The "default profiles" don't always work properly.
 
...it's fixed and has been for over a month, as I've said several times now. I'm sure SLI setup owners will be glad to confirm this.

So there ya go. My opinion stands. Some games greatly benefit from SLI; it just depends on how you write your game code. Like any technology, it isn't perfect, but SLI does have its place, and running at very, very high resolutions with maximum details and AA is perfect for dual-card applications.
 