No R600 for X-mas :(

The GX2 is a single card, but it's not a single GPU solution.
So if you compare dual GPU from nvidia to dual GPU from ATi, that's the proper way of looking at it. You should not compare a GX2 to a board that has just one GPU. The only reason it's done in reviews is to show that ATi's single GPU can compare with nvidia's dual GPU in some instances/games.
 
tornadotsunamilife said:
Correct me if I'm wrong (I very well may be) but doesn't the use of a GX2 require an SLi compatible mobo?
No. You just need a motherboard with an updated BIOS to support it. Even an ATI chipset board is supported.

psychot|K said:
The GX2 is a single card, but it's not a single GPU solution.
So if you compare dual GPU from nvidia to dual GPU from ATi, that's the proper way of looking at it. You should not compare a GX2 to a board that has just one GPU. The only reason it's done in reviews is to show that ATi's single GPU can compare with nvidia's dual GPU in some instances/games.

Then we should only compare cards that have the same number of pipes, with the same clock speed and the same memory. :rolleyes:

Spin it any way you want, it is still one card and it's valid to bench it against other single cards.
 
PRIME1 said:
No. You just need a motherboard with an updated BIOS to support it. Even an ATI chipset board is supported.

AND you must enable both GPUs in the drivers to get the performance, just as required for dual card solutions. You also lose dual monitor output just as dual GPU/SLI solutions do, so why bother with the "single card" talk? It is technically two cards in one slot.

PRIME1 said:
Then we should only compare cards that have the same number of pipes, with the same clock speed and the same memory. :rolleyes:

NV doesn't have anything to compare to the X1900/X1950 (single GPU solution), so yes, it is required that review sites use a dual GPU solution from NV. Maybe next generation we will see NV and ATI going at it on the single GPU front, but not this generation. Who knows, maybe it will take ATI sandwiching two PCBs/GPUs together to contend with the G80.

PRIME1 said:
Spin it any way you want, it is still one card and it's valid to bench it against other single cards.

How is a TWO PCB, TWO GPU solution considered a SINGLE CARD solution? To be honest it is a DUAL CARD SOLUTION IN A SINGLE PCI-e SLOT. It takes interaction with the drivers to enable both GPUs, so it can't be considered a single card. Single PCI-e slot YES, single card NO.

/on topic
 
I call it a single slot SLI solution. Until ATI develops their own single slot Crossfire solution, the GX2's competitors are any ATI cards in Crossfire configuration or ATI cards within the GX2's price range.
 
Well... ATI only has cheaper solutions right now; their highest end card is roughly around $450, whereas you can find GX2s for $500, the majority for $530+.

If you want to compare it to a Crossfire setup, no one's complaining. In all honesty, all you guys are really arguing about is semantics. It's a video card. The only gripes that can be made about it are that it's a single slot solution that takes away your multi-monitor support, and to get it you've got to cut back on your gaming power by quite a bit.
 
It's easy; the 7950GX2 is a single graphics adaptor. Now let's stay on topic guys, this isn't the NV forum.
 
That's why you don't play the waiting game on graphics cards. You'll be waiting a long time.
 
HighTest said:
OMFG! ATI is pwnd due to late release, G80 gonna Roxorz!! :rolleyes:

Seriously, regardless of who releases first, it will have less of an impact this time around than in the past. The most significant reason is that the non-hardcore, less informed (which is not us) will most likely purchase their card around the same time they purchase Vista. Major product forecasts for computer sales this Christmas are significantly down due to the delay of Vista, hence the industry's angst against MS at this point in time.

When Vista ships, hardware purchasers that have held off for its release will then be purchasing their GPU. This will provide plenty of time for both ATI and nVIDIA.

Secretly I suspect the following: ATI and nVIDIA, wishing to maximize sales of their latest current gen products and thus profit from that product line's R&D, will find ways to state delays on the R600 and G80, artificially encouraging purchase of the older gen. Then voila, release them at about the same time and garner additional revenue as card purchasers from the last couple of months decide to pony up more $ just to get DX10 support.

Why release it earlier when you can squeeze all of the [H] enthusiasts all over again closer to the Vista release?


Rather intelligent post.

I also think it's sad that people actually think ATI would fall off the planet; if anything I see them pushing further and further into the enthusiast market.
 
No joke, Christmas seems so far away at this point. I hope both companies release before then.
 
I personally will be holding on to my X1900 at least until the first DX10 refresh (equals new Intel comp for me :cool: :cool: :cool: ).

I'd like to see this next gen refresh to 65nm. Power requirements at 80/90nm seem ridiculous, and here's hoping that any DX10 teething problems will be sorted (a bit anyway) by then.


It will also give me a chance to see just how these cards handle Crysis, probably the first insight into just how well they can throw DX10 around.
 
I for one wish that DX10 had never been tied to Vista, as it is a bloated POS that I despise. I can't believe that an operating system consumes 43% of my RAM idling.

Don't tell me that I have to get 2GB to run Vista and a game like BF2, because with my 1GB I run BF2 medium maps with medium textures like butter on XP; not so with Vista.

Another thing: OS X seems to somehow manage with 512MB of RAM. I know this because when I first got my MacBook, 512 was just fine, though for the things I do it eventually wasn't enough. But I could multitask just fine, and I tested my 512 with all the apps open and everything worked and was very responsive.

Vista is a terrible, bloated, useless operating system. In fact, I believe that Windows XP is better, because it uses fewer resources, is snappier, and is much more logical to use.

Hopefully Microsoft realizes that not everyone needs a compositing interface for gaming and releases a Service Pack 3 for XP that includes the DX10 library.

I would be the happiest individual to run WinXP with a DX10 card and the DX10 library available to me... damn Vista to hell!
 
R1ckCa1n said:
Who knows, maybe it will take ATI sandwiching two PCBs/GPUs together to contend with the G80.
I highly doubt it, but I suppose it's feasible. I'd expect that ATi will, in some generation, have a GX2-like card, but I don't think it's going to happen this generation. I firmly believe that R600 and G80 are going to be tremendously well matched. I think that, for this coming generation, it's going to be more about the little things rather than raw performance. Power consumption, driver support, control panels, IQ features - these are going to be the big selling points next year, even more so than this last year. We'll see how the rumoured external PSU for the G80 pans out for nVidia if it happens, but that may end up being a big turn-off for some enthusiasts.

ManicOne said:
It's easy; the 7950GX2 is a single graphics adaptor
I'll go with that. I think we're getting really tripped up on semantics here, and I'm not sure if semantics really plays a big role in consumer opinion. Perhaps to some degree.

If nVidia could have slapped everything on one giant PCB, they probably would have done so (this has been done before a number of times, even way back in the days of 3Dfx). If nothing fundamentally changes going from one PCB to two, however, I still see it as a single card solution. If components were arranged across five stunted PCBs, I don't think I'd call it a five card video...card.

Endurancevm said:
OS X seems to somehow manage with 512MB of RAM. I know this because when I first got my MacBook, 512 was just fine, though for the things I do it eventually wasn't enough.
The core of OS X is so radically different from that of Windows that it's not quite comparable. OS X is a much more "process oriented" operating system, while Windows tends to be more "globally oriented". For most consumers, the latter probably tends to make more sense. I fully agree with you on Vista bloat and the accelerated compositing engine, though.
 
phide said:
A make-believe car has two engines (I recall one Ford concept supercar having two V10s, for instance, so this is a reasonable analogy). A competitor's car has one very large V12.

Again, the only problem is that the dual GPU solution does not work everywhere, all the time, without some user interaction. We know that there are some games where SLI just does not work, and most new games will take a profile update to make use of the 2nd GPU (this is easy to overcome but takes user interaction). Neither of which is an issue for the single GPU board. Granted, these cases are small. But they are differences nonetheless...
 
I'm actually amazed that we have little to no reliable info on either the R600 or the G80 at this point.
Both NVIDIA and ATI are keeping a big secret under their covers. Actually, I don't remember such secrecy / lack of reliable info in any other generation of graphics cards.
We might be in for a big treat or a big disappointment.
Let's hope it's a big treat...
 
PRIME1 said:
Then we should only compare cards that have the same number of pipes, with the same clock speed and the same memory. :rolleyes:

Spin it any way you want, it is still one card and it's valid to bench it against other single cards.

Really? Would you bench a dual CPU system vs. a single CPU system and call it valid? :rolleyes: :rolleyes: :rolleyes: You could do it for comparison's sake, but that's it.
 
psychot|K said:
Really? Would you bench a dual CPU system vs. a single CPU system and call it valid? :rolleyes: :rolleyes: :rolleyes: You could do it for comparison's sake, but that's it.
I think that comparing a single core CPU vs a dual core CPU is a valid comparison. Same motherboard, same chipset, same socket.
 
roflcopter said:
I think that comparing a single core CPU vs a dual core CPU is a valid comparison. Same motherboard, same chipset, same socket.

Yep, but not dual CPU. That's my point.
The 7950GX2 is not dual core.
 
psychot|K said:
Really? Would you bench a dual CPU system vs. a single CPU system and call it valid? :rolleyes: :rolleyes: :rolleyes: You could do it for comparison's sake, but that's it.
You can compare anything you'd like to any other thing. That's the beauty of comparisons.

Check out a good Tech Report evaluation. You're going to see dozens of cards being benched. Is it not valid because a 7600 is compared to an X1900? Not at all. A review comparing a Diamond Monster Voodoo 2 to an X1950 XTX is still valid, because consumers can figure out what's going on. They can look at two cards, look at the respective prices and say "Okay, this is what's going on."

It's the consumer's job to determine what they're going to be buying. It is not the reviewer's responsibility to make the choice for them. People know the GX2 is more expensive. They know it's a different card. That doesn't mean they can't be compared.
 
psychot|K said:
Really? Would you bench a dual CPU system vs. a single CPU system and call it valid? :rolleyes: :rolleyes: :rolleyes: You could do it for comparison's sake, but that's it.
I would bench a dual core system vs a single core system. The AMD X2 was benched against single core Intel systems and no one complained.

To me the GX2 is the same as a dual core CPU. They both fit in one spot.

I don't know why "certain" people call it anything but ONE CARD. It works in one PCIe slot. You don't need an SLI board and you cannot separate it into 2 cards. In fact, if you do have an SLI board you can run 2 of these.

It must be some sort of denial or mental block I guess.
 
PRIME1 said:
It must be some sort of denial or mental block I guess.

It's not black and white. It has the performance advantages of two cards, but some of the disadvantages of SLI as well. With vsync, for example, it cannot force triple buffering in D3D.
(Neither can Crossfire)
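For anyone wondering why the driver can't just force it: in D3D the back-buffer count is picked by the game itself when it creates its device, so triple buffering only happens if the game asks for it. Here's a rough D3D9 sketch of what that looks like (just my own illustration, the function name is made up):

```cpp
// Minimal D3D9 sketch: the application, not the driver, chooses the
// number of back buffers at device-creation time.
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

IDirect3DDevice9* CreateTripleBufferedDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed             = TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;           // match the desktop format
    pp.BackBufferCount      = 2;                        // 2 back + 1 front = triple buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // vsync on
    pp.hDeviceWindow        = hwnd;

    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
    d3d->Release();
    return dev;  // NULL on failure
}
```

With vsync on and the default single back buffer, the GPU stalls waiting for the flip; the extra back buffer is what lets it keep rendering, which is why it matters that the game has to request it.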
 
psychot|K said:
Yep, but not dual CPU. That's my point.
The 7950GX2 is not dual core.
But it behaves like a single card. Same motherboard, same chipset, same slot.
 
No, GX2s behave exactly like SLI: you need to have an SLI profile for it to work, and you lose multi-monitor support.

Again, everyone is arguing semantics. It's a VIDEO CARD, who cares, it's a beast and it works well!
 
I'm not arguing, I'm just seeing how worked up I can make this PRIME guy, he's hilarious :D

ATI PWNZORZ!
 
The GX2 is a terrible piece of hodgepodge engineering.

"Well OMG, ATi is releasing a single GPU that beats our 7900GTX, what do we do? Oh I know, let's take 2 cards and glue them together! That is a marvelous idea... now where did I put that Elmer's..."
 
Endurancevm said:
The GX2 is a terrible piece of hodgepodge engineering.

"Well OMG, ATi is releasing a single GPU that beats our 7900GTX, what do we do? Oh I know, let's take 2 cards and glue them together! That is a marvelous idea... now where did I put that Elmer's..."

QFT..... I doubt the bugs will be worked out before it is discontinued.
 
psychot|K said:
I'm not arguing, I'm just seeing how worked up I can make this PRIME guy, he's hilarious :D

ATI PWNZORZ!
Ah, I was right, it was a mental block :D

It does not bother me if you can't handle the truth.

Anyway, BACK ON TOPIC. I wonder when we will see some ACTUAL information regarding the R600 and/or G80 instead of the flood of random rumors.
 
Endurancevm said:
The GX2 is a terrible piece of hodgepodge engineering.

"Well OMG, ATi is releasing a single GPU that beats our 7900GTX, what do we do? Oh I know, let's take 2 cards and glue them together! That is a marvelous idea... now where did I put that Elmer's..."

Trust me, if ATI could do it they would. I just don't think the dongle will fit. :p
 
PRIME1 said:
Trust me, if ATI could do it they would. I just don't think the dongle will fit. :p

"Wow, nVidia had no choice but to glue two cards together...*ATi bursts out laughing*"
 
psychot|K said:
Really? Would you bench a dual CPU system vs. a single CPU system and call it valid? :rolleyes: :rolleyes: :rolleyes: You could do it for comparison's sake, but that's it.

If you're testing CS: Source or CS 1.6, go ahead; a 3800+ Venice will pwn a 3800 X2.
 
Mister E said:
Just put PRIME1 on your ignore list and be done with the biased pos.

I'm trying to figure out why he is so green when all he has is a 7800GS. Usually you own the products that you rant and rave about :rolleyes:
 
Way to continually derail a thread........

PRIME1, this is an explanation you might understand: the term "card" comes from the fact that these things come on their own PCB. Having a 2 PCB unit makes it hard for people to accept the term "single card".

That's why I suggested calling the GX2s "single graphics adaptors". Calling something a single card when it's clearly 2 together goes against most people's sense of logic.


Anyway who cares, this is an R600 thread.


EDIT: Oh, and another thing, if "certain" persons derail this again, trust me, it's faster to report posts than to respond.
 
Well, but is it two boards or two cards? I've never heard of individual PCBs referred to as "cards".

Regardless, it's just semantics. No need to get overly concerned about it. The product remains the same no matter how you choose to label it.
 
ATI has been winning for a while. The green heads are just too !!!!!! to notice.
9800p vs 5900xt.
x850xt vs 6800u
x1900xt vs 7900gt

The price vs. performance has continuously gone to ATI in the top-end segment... however, they do need to work on their midrange solutions :p
 
Oh4Sh0 said:
ATI has been winning for a while. The green heads are just too !!!!!! to notice.
9800p vs 5900xt.
x850xt vs 6800u
x1900xt vs 7900gt

The price vs. performance has continuously gone to ATI in the top-end segment... however, they do need to work on their midrange solutions :p

Completely agreed; however, now that you can get an X1900GT for the price of a 7900GS, ATi has the upper hand in the midrange too. There's also the fact that for all those generations the ATi cards were synonymous with great IQ.
 
Endurancevm said:
Completely agreed; however, now that you can get an X1900GT for the price of a 7900GS, ATi has the upper hand in the midrange too. There's also the fact that for all those generations the ATi cards were synonymous with great IQ.
Being an ATI fan myself,
I do have to mention that every kickass ATI card was a refresh that came out after the nvidia card,
i.e. X1900XT vs 7900GT.
The X1800XT blew chunks.

I am an ATI fan, but nvidia has been winning overall in the race, ppl.
 
X1800XT blew chunks?
It was late, but performance-wise it was at least the equal of, if not better than, a 7800GTX.
 