Thinking of buying two 8800 Ultras. Is it worth it?

Here's somebody with a bit of sense in his head! How dare you put things in their proper context?

Nothing is obsolete. If the 3.5" floppy isn't obsolete, AMD isn't either. Here's to hoping AMD stays alive and saves us a lot of money.
 
Nothing is obsolete. If the 3.5" floppy isn't obsolete, AMD isn't either. Here's to hoping AMD stays alive and saves us a lot of money.

Yes, yes... but you're still missing the point.

Which is that, if this guy wants to spend 1.8 grand on an upgrade, he should at least be running the most current platform.

It's hilarious that you would lecture this guy (and me) about the "continued viability" of an old platform when he is trying to purchase two 8800 Ultra cards :cool:

"continued viability" is obviously not in his vocabulary, and that's why it shouldn't be in yours if you want to give him advice; it's in this context that my advice was given.

I dislike people like you who either can't or won't view things in their context.
 
Yes, yes... but you're still missing the point.

Which is that, if this guy wants to spend 1.8 grand on an upgrade, he should at least be running the most current platform.

It's hilarious that you are trying to lecture this guy (and me) about the "continued viability" of an old platform when he is trying to purchase two 8800 Ultra cards :cool:

"continued viability" is obviously not in his vocabulary, and that's why it shouldn't be in yours if you want to give him advice, and hence my advice was appropriate.

It's hilarious that you assume authority on the matter. If this were the Al Shades forum, you'd be uncontested. However, it's a discussion forum, and it's merely your opinion that "he should at least be running the most current platform."

He can spend his money however he wants. Right now, he isn't exactly sure, so he asked here. If he for some reason bows to your all-knowingness, more power to the both of you. If he's smart, he'll weigh it against the other ideas.
 
It's hilarious anyone is even thinking of buying an Ultra, let alone TWO. rofl
 
He can spend his money however he wants.
That's very true, but you also have to admit that Al makes a damn good point. It doesn't make much sense at all from a dollar-per-frame perspective to be running an Opteron or what have you when Intel's current line of procs is clearly superior. He states a pretty obvious alternative: Core 2 Duo and overclocked 8800 GTXs, which will give him better performance 9 times out of 10 (or better).

That being said, if the OP is partial to AMD for whatever reason, it seems almost insulting to call his procs "obsolete". They're obsolete in a particular sense, but not in the way Al made it seem. It's weird to get attached to a PC part in that way, but Al's way of stating the obvious was pretty over-zealous, and I'm pretty sure he knew he was striking a match with that post.

And there it is.
 
C2Ds may beat their AMD counterparts in most areas, but it isn't always by much. The X2 6000+ was barely beaten by the e6700, which costs about $100 more. If money is no object, sure, get the very best along with an 8800 Ultra.

If you'd rather spend less money and lose maybe 10FPS, why not go AMD? You'll save quite a bit of money for only a small loss of performance.
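For what it's worth, the "save money, lose a few FPS" trade-off can be put in rough numbers. This is only an illustrative sketch: the ~$100 CPU price gap and ~10 FPS delta come from the posts above, while the build totals and FPS figures are made-up placeholders, not benchmarks.

```python
# Rough dollars-per-frame sketch. The ~$100 price gap and ~10 FPS delta
# are from this thread; the build totals and FPS numbers are hypothetical
# placeholders, not measured results.
def dollars_per_frame(total_cost, avg_fps):
    """Cost efficiency: dollars paid per frame of average performance."""
    return total_cost / avg_fps

amd = dollars_per_frame(total_cost=1200, avg_fps=90)     # X2 6000+ build (placeholder)
intel = dollars_per_frame(total_cost=1300, avg_fps=100)  # e6700 build, ~$100 more (placeholder)

print(f"AMD:   ${amd:.2f}/frame, Intel: ${intel:.2f}/frame")
```

Whether the cheaper build actually wins on dollars-per-frame depends on the real numbers; with these placeholders the two land within a few percent of each other, which is the point: the absolute savings matter more than the ratio.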

I'd wait for AMD's new stuff. There's AM2+, R600, etc. Right now's a good time to buy, but if you want the latest and "greatest," you should wait a little longer.

http://techreport.com/reviews/2007q1/athlon64-x2-6000/index.x?pg=4

http://techreport.com/reviews/2007q1/athlon64-x2-6000/index.x?pg=5

http://techreport.com/reviews/2007q1/athlon64-x2-6000/index.x?pg=6
 
QFT (fucking hate qft) Well he speaks the truth!

Well, the 8800 Ultras aren't good enough to put in a sig rig or any system, for that matter. It won't matter during games, and it's not even cool enough to get props.

Just game!

I played Command & Conquer 3: Tiberium Wars the other day on the rig in my sig, all medium settings, with great FPS. It's not worth the dough.

Have a nice day!


Are you kidding? I have a rig just as old, if not worse, and I can play with every setting maxed out and it still runs great.
 
To the guy who is looking to upgrade from two 7900 GTX cards.

Wait until nV fixes Vista SLi, which should be right around the same time they break out an 8950 GX2.

Ultra cards have always been high-dollar but short-lived products. This just means that the 8800 is getting closer to end of life and the 8900 is getting closer. Just look at the timelines between the 7800 GTX, 7800 GTX 512 (Ultra), 7900 GTX, 7950 GX2 (Ultra), and the 8800 series.
 
If you'd rather spend less money and lose maybe 10FPS, why not go AMD? You'll save quite a bit of money for only a small loss of performance.
That would be a good point were it not for the fact that we're talking about the OP investing in a $1650+ graphics card setup, something that the majority of people around here spend roughly $200-450 on. That changes the way we think about the situation slightly.
 
Right now I have two eVGA Superclocked 7900GTXs running in SLI mode. I'm thinking these would probably still beat a single 8800 Ultra, so I was going to get two. Is this true?

Btw, will I even notice a difference in SLI with the 8800 Ultras with my somewhat outdated CPU?

Here are my specs:
AMD Opteron 170 @ 2.63GHz
2GB of Corsair PC4000 DDR RAM
750W PSU (certified for 7900GTX SLI, dunno about 8800 Ultra)

The 8800 Ultra is for stupid people with too much money.
 
Or people who have the money and feel like spending it on whatever they wish.

I'm thinking that perhaps the 8800 Ultra exists so that extremely envious poor people feel like they have something else to complain about.

Interesting play, NVIDIA. I hope it amuses you as much as it does me.
 
That would be a good point were it not for the fact that we're talking about the OP investing in a $1650+ graphics card setup, something that the majority of people around here spend roughly $200-450 on. That changes the way we think about the situation slightly.

Good point regarding the one sentence of mine that you quoted. As for the rest?
 
Two Ultras is idiotic. The premium on those cards is way too high to SLI them, unless you are freakin' rich and don't care. And I mean rich, because you don't need two Ultras. Two GTXs is OK because those cards at least have reasonable price/performance. The Ultra does not. I doubt there is anything you could do with two Ultras that you couldn't do with two GTXs, including at 1080p resolution. I wouldn't even buy one Ultra at those prices. But it isn't really a question of having the money for the Ultra, because a year or so from now you'll need new cards anyway, and you'll hate yourself for spending that much.

You would be better off putting the money into a new CPU. Much better off, no doubt about it.

The only exception is if you want to run some kind of extreme resolution and settings. You'd have to look at benchmarks and see if there is a material difference between the GTX and the Ultra for your intended purposes.
 
I still have 7900GTX SLI. I'm not going anywhere. Of course, I also use 1280x1024...

I don't see any indication of what res the OP is using. If he's under 1600x1200, then I would venture to say that the 7900s should hold a while longer. As for the Intel-AMD crap, it really depends on what resolution you're talking about. If he has the same res as I do, then yes the second Ultra will be wasted. But then again a single Ultra wouldn't get a workout at all.

To the OP: what the heck are you doing that you could even think you're going to need dual 8800 Ultras to get good frames?
 
No, it's not worth it. It's not even close to worth it. I mean, come on. Get an 8800GTX and upgrade your PC. Get a new CPU, some RAM, and a new mobo. What res are you playing at anyway that requires two Ultras? 3000x4500? Every DX9 game is being maxed out at 1600x1200 by one 8800GTX alone. DX10 games won't even be out for a while. Go with one GTX, then get another when DX10 games are out.
 
Which system will be faster in games?

1) C2D + 2x8800GTS / 1x8800GTX/Ultra
2) Opteron + 2x8800GTX/Ultra

IMO 2x Ultra is not worth it, just go with 2x GTX. However, I don't think it is a good idea to trade the second GTX just to go with a C2D. Changing your whole platform over to C2D from your current system is also not worth it imo.
 
Which system will be faster in games?

1) C2D + 2x8800GTS / 1x8800GTX/Ultra
2) Opteron + 2x8800GTX/Ultra

IMO 2x Ultra is not worth it, just go with 2x GTX. However, I don't think it is a good idea to trade the second GTX for a C2D.

It depends on facts we don't have. We need to know what size screen he is using and what resolution and settings he wants.

Depending on resolution and settings, he may be GPU bound, or he may be CPU bound. It really depends. It also depends on what games. If the game is a CPU hog then Core2Duo will help a lot, especially one of the 4MB cache models.

I think people tend to underestimate how much a faster CPU can improve performance. I notice significant overall system and gaming performance increases when I overclock my Core2Duo from 2.4 to 3.4 GHz. I use an X1800XT @ 1680x1050 with 4xAA/8xAF in all games. So I imagine that to get the full benefit from a GTX SLI setup you will want the latest CPU platform. You could try such an SLI setup with your existing CPU and see if it gives you the performance you are looking for. If not, you'll want to upgrade your platform.

There is always a point where you really would benefit from a faster CPU, but sometimes it is hard to tell when that is. A faster CPU can also speed up things like load times, which makes the whole experience that much better.
 
If he is using a small-resolution screen, 2x 8800GTX will push FPS high enough that I don't think you could notice the difference between an Opteron and a C2D. Normally a monitor only has a 60Hz refresh rate at its native resolution, so what is the difference between 120 FPS and 160 FPS?
 
If he is using a small-resolution screen, 2x 8800GTX will push FPS high enough that I don't think you could notice the difference between an Opteron and a C2D. Normally a monitor only has a 60Hz refresh rate at its native resolution, so what is the difference between 120 FPS and 160 FPS?

It isn't a question of 120 vs 160 FPS; it is a question of an additional level of antialiasing being available, or vertical sync, or other such features. Like I always say, first you figure out what your minimum standard is, and then you figure out what your hardware needs are. Nobody should be sitting at 120 FPS, because if there is that much power to spare they should kick the AA up another notch or whatever.
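The refresh-rate side of this exchange reduces to one line of arithmetic. A minimal sketch, using the 60Hz figure and FPS numbers from the posts above (the function name is just for illustration):

```python
# With vsync on, a fixed-refresh monitor can only show refresh_hz frames
# per second; rendering headroom beyond that is invisible. The argument
# above is that you should spend that headroom on higher AA instead.
def displayed_fps(render_fps, refresh_hz=60):
    """Frames per second actually shown on a fixed-refresh monitor."""
    return min(render_fps, refresh_hz)

for fps in (120, 160):
    print(fps, "rendered ->", displayed_fps(fps), "displayed")
```

Both 120 and 160 rendered FPS collapse to the same 60 displayed frames, which is why the two posters agree the raw number is meaningless and only disagree about what to do with the spare power.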

It could very well be that the Opteron is good enough for a while, at least until there are cheap quad-core CPUs and games that take advantage of them. I'm not all that familiar with that particular AMD chip. I do know that if the CPU is lacking, there will be bottlenecking. It doesn't always manifest itself as a low frame rate all the time. In the past, when I had CPUs that didn't quite cut it and graphics cards that more than fit the bill, frame rates were often good overall, but certain activity in-game would slow things down, like a barrel suddenly exploding or some kind of physics-related event. That is the kind of thing that really spoils a game.
 
I just don't think there is any situation in games where a single graphics card on a slightly faster CPU can outperform two of the best graphics cards on a slightly slower CPU (except for SupCom, I think). Sure, the Opteron is not the best CPU, but I don't think it will hurt the performance of two of the best cards to the point that they're slower than a single card. I also don't think it is worth changing the mobo, RAM, and CPU to upgrade to a C2D and run 2x 8800GTX, just like I don't think it is worth buying 2x 8800 Ultra over 2x 8800GTX.
 
QFT (fucking hate qft) Well he speaks the truth!

I left the internet for a few years, came back, and found QFT and FTW being used everywhere. I figured them out on my own, but I originally thought QFT = Quite Fucking True. I actually still prefer that version.
 
If it makes you happy to have the fastest setup, then yes, pull the trigger and order those two Ultras.

Now back to reality.

The 8800 Ultra is a big ripoff imo; it's barely faster than an 8800GTX. Shit, you can overclock the 8800GTX and I'm sure you wouldn't notice the difference between the two in games.

It depends on what games you're playing right now. Do they currently run like crap? If the answer is yes, then upgrade; if they run just fine, then keep your current setup.

I would wait for the next generation of cards toward the end of the year, once DX10 games actually start to hit the shelves.

I really can't believe how much that card costs; it's just retarded.
 
Or people who have the money and feel like spending it on whatever they wish.

I'm thinking that perhaps the 8800 Ultra exists so that extremely envious poor people feel like they have something else to complain about.

Interesting play, NVIDIA. I hope it amuses you as much as it does me.

And they will regret having bought the 8800 Ultra one day when they realize it wasn't even worth the money, just like some of the people who got the 7800GTX 512MB.

$830 for just an OCed 8800GTX? Give me a fucking break, nVidia.
 
I think 8800 Ultra SLI would barely cut it with that supar powerful CPU you're using. I mean it would work, but you'd definitely be GPU limited. You should probably wait until the next generation of video cards and then upgrade to "9800 Ultra" SLI. Then your system would be much more balanced.

^^This guy is wrong, depending on the resolution you're playing at.

Basically, yes, you can't get better GPU performance than 8800 Ultra SLI, period. If you have an extra $600 to drop on shit like this, go ahead. You can't match it by OCing two 8800GTXs. Why? Because you can just OC the Ultras exactly the same way.

But I would only do that if your monitor cost at least that much. ;) Essentially, unless you have a 30" monitor, a single one of those cards can run games at max settings. So get a 30" monitor first, then upgrade your video cards (if you don't have one, your current video cards can run games just as well as the Ultras will because of the resolution). At that point you're gonna want a new Conroe CPU too, which shouldn't be too expensive (compared to the monitor and video cards).

If money is a concern, just get SLI 8800GTXs and a 24" widescreen (and a Conroe CPU upgrade, you'll need that too). If you run at less than that, don't upgrade your GPUs.
 
Why does he need to upgrade the CPU too? Sure, the Opteron is not the best CPU, but I don't think the performance gain is worth changing the whole mobo, RAM, and CPU. If he just wants the best, he should get a QX6700 with 2x 8800 Ultra, no need to ask whether they are worth it or not.
 
Ummm, he was being very sarcastic? Earth humor, ar! ar! ar! [/Mork]

Hmm, you may be right, now that I look at it. Although I wouldn't really say a dual-core s939 Opteron is obsolete. They're still pretty good, just not for >1600x1200. Hell, I'm using an OCed 3000+.
 
Every DX9 game is being maxed out at 1600x1200 by one 8800GTX alone.

Except the PC's current killer app, but FSX is CPU-limited anyway... ;)

Dual 8800 Ultras will NOT run Flight Simulator X maxed out. Nothing will. So it is technically incorrect to say that a high-end 8800 can run "any game maxed out".

I think people tend to underestimate how much a faster CPU can improve performance.

Do you? I think people overestimate it.

As I have written before, I see the CPU increasingly becoming a component of secondary importance in today's systems. The performance improvements you describe would seem to be better attributed to high performing physical storage devices, rather than the processor. In fact, I see graphics and storage at the top of the performance hierarchy.

I didn't notice any difference in desktop-level performance when upgrading from a P4 Northwood to a C2D. In this, I know I'm not alone, for many other people have said as much. In fact, I have heard it mentioned as one of the downsides of the current generation of Intel CPUs that, "while they readily overclock far beyond their stock frequencies, the resulting performance gains are questionable".

I just don't think there is any situation in games where a single graphics card on a slightly faster CPU can outperform two of the best graphics cards on a slightly slower CPU (except for SupCom, I think).

There are plenty of situations where that's the case. SLI does not work with all games. SLI never doubles the performance of one card. Even when it does provide a performance advantage, it's only substantial about half the time. One fast card with an OC'd CPU is way better than SLI with a slow CPU.
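The "SLI never doubles the performance of one card" claim can be framed as a simple Amdahl-style bound. This is a back-of-the-envelope sketch, not a model of any real driver: the parallel fraction `p` is a made-up parameter standing in for whatever share of the per-frame work actually splits across the two GPUs.

```python
# Amdahl-style upper bound on SLI speedup: if only a fraction p of the
# per-frame work parallelizes across n GPUs (the rest is CPU/driver/serial
# overhead), the speedup can never reach n unless p = 1.
def sli_speedup(p, n_gpus=2):
    """Best-case speedup when fraction p of the work splits across the GPUs."""
    return 1.0 / ((1.0 - p) + p / n_gpus)

print(sli_speedup(1.0))  # perfect scaling: 2x, the best case
print(sli_speedup(0.5))  # half the frame time serial: only ~1.33x
```

This also squares with the later post reporting a 99% framerate increase in F.E.A.R. at specific settings: that is the p approaching 1 case, while games with heavy CPU or driver overhead sit much lower.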

I still have my FX60 on OH MY GOD s939 with DDR1 :D I'm happy as can be.

If you're running ancient shit (software), then there's no need to stay current on hardware. However, you are missing out on what's probably the best generation of consumer hardware ever (C2D + 600i platform, 8-series GPU).

At the end of last summer, the reports were all over the web: entry-level Conroes were beating top-class AMD FX CPUs in real-world benchmarks. Intel has been on top since that time, and legions of people upgraded to an LGA 775 platform last fall, myself among them (I was coming from s478). That is why DDR2 prices were so expensive from Sept-Jan; now they have started coming down. AMD is simply out of the picture for enthusiasts right now.

If you ask me, they are out for good, because processors have ceased to be the ultimate rubric of computer performance. Intel will simply coast on the success of Conroe; AMD will try to match that success with its own product, and if/when it does, it will coast as well. The trend is to keep adding more cores, which don't actually improve performance whatsoever in 90% of applications. This is a sign that the chip makers are running out of ideas. They've already tried the clock-frequency race, efficient architectures, and multi-core. Naturally, fabrication methods will continue to improve, but I think we have seen the best of the big chip makers. The future is in graphics and high-speed storage.
 
There are plenty of situations where that's the case. SLI does not work with all games. SLI never doubles the performance of one card. Even when it does provide a performance advantage, it's only substantial about half the time. One fast card with an OC'd CPU is way better than SLI with a slow CPU.

Care to elaborate? Have you ever tried SLI before? If you have, then you know that it can be disabled in the control panel. One fast card with an OC'd CPU is not way better than SLI with a slightly slower CPU in some games, and two fast cards with a slightly slower CPU are way better than one fast card with a slightly better CPU in other games.
 
Care to elaborate? Have you ever tried SLI before? If you have, then you know that it can be disabled in the control panel. One fast card with an OC'd CPU is not way better than SLI with a slightly slower CPU in some games, and two fast cards with a slightly slower CPU are way better than one fast card with a slightly better CPU in other games.

SLI is only worth it for people running at 1600x1200 and higher. And even then, not in all cases.

GTX SLI:
http://www23.tomshardware.com/graphics_sli2007.html?modelx=33&model1=804&model2=805&chart=353
vs.

Single GTX:
http://www23.tomshardware.com/graphics_2007.html?modelx=33&model1=706&model2=707&chart=293

In FSX, you will get a bigger performance gain going from an e6300 to an e6700 than from a single 7900 to 8800GTX SLI. This app is likely the most extreme example of SLI's mediocrity. Even in graphics-intensive apps, SLI's performance gain cannot justify the additional costs (cooling and power, apart from the second display adapter).
 
If you're running ancient shit (software), then there's no need to stay current on hardware. However, you are missing out on what's probably the best generation of consumer hardware ever (C2D + 600i platform, 8-series GPU).

At the end of last summer, the reports were all over the web: entry-level Conroes were beating top-class AMD FX CPUs in real-world benchmarks. Intel has been on top since that time, and legions of people upgraded to an LGA 775 platform last fall, myself among them (I was coming from s478). That is why DDR2 prices were so expensive from Sept-Jan; now they have started coming down. AMD is simply out of the picture for enthusiasts right now.

If you ask me, they are out for good, because processors have ceased to be the ultimate rubric of computer performance. Intel will simply coast on the success of Conroe; AMD will try to match that success with its own product, and if/when it does, it will coast as well. The trend is to keep adding more cores, which don't actually improve performance whatsoever in 90% of applications. This is a sign that the chip makers are running out of ideas. They've already tried the clock-frequency race, efficient architectures, and multi-core. Naturally, fabrication methods will continue to improve, but I think we have seen the best of the big chip makers. The future is in graphics and high-speed storage.

I agree with your point about the future being in high-speed graphics and storage, but I don't see how chip manufacturers are "running out of ideas" for processor designs. The processor will always be the heart of the consumer-level computer, and it doesn't make sense to assume that we will give up on CPU design once we hit the limits of what silicon will do. We can find other materials.

Furthermore, I don't see how you can claim that all AMD CPUs are "out of the picture" for computer enthusiasts right now. The last time I checked, the term "computer enthusiast" meant someone who was enthusiastic about computer technology, not someone with an unlimited supply of money for purchasing a Core 2, a 680i motherboard, and a G80 GPU. Enthusiasts on a budget should be glad that AMD is still competitive in the low-to-mid-range CPU market; that's how we get deals like the 5600+ for $180 that competes with an e6600 at stock speeds. The choice is clear-cut at the high end at present, but for the vast majority of people (myself included), money is the limiting factor.
 
LOL, but no no, lol, you mustn't have a gf or something if you've got that much money to just toss at a couple of graphics cards. Wow.
 
A single GTX barely makes any effort, in today's games, with all the eye candy on @ extremely high resolutions.

Not true. I know of at least three games where this isn't the case, and not only with one 8800 GTX but with two of them.

High resolution for me is 2560x1600, and "all eye candy on" is a minimum of 4xAA / 16xAF plus all other settings maxed.


SLI never doubles the performance of one card.

Never say never. From personal experience, F.E.A.R. is one game where this is possible; I have benchmarked a 99% increase in framerates using specific settings. Regardless, SLI with the most powerful GPUs is for those for whom one simply isn't enough at the settings/resolutions they want. It sure isn't for everyone, but that doesn't mean it is useless. We all have different needs, standards, and money to spend on hardware.


Sorry I didn't actually respond to this topic earlier, but I am running at 2560x1600 resolution.

Get the two Ultras then and OC them, but if you can, wait for the refreshed G80 / R600, or better yet the next generation.
 
No. I don't think it is worth it.

I would buy two 8800GTXs and use the leftover money to buy food for a week.

In fact, I am going to buy two 8800GTXs.
But I am waiting until November for two reasons:
1. The price drop of the Q6600.
2. I am hoping a new graphics card comes out.
 