So nVidia is completely unchallenged this generation?

Obi_Kwiet said:
It used SLI. A more compact form of SLI doesn't count as a single card.

It is a single card because it takes ONE PCI-Express slot from my motherboard. Whatever else it is, or does, or has on it, it only takes one motherboard slot.
 
Commander Suzdal said:
Make you all a deal, then. When r600 debuts, if it's on 80 or 65nm process (and significantly outperforms g80), I'll run around in my frilly undies crying, "That's not fair--r600 has a smaller process! Don't compare the g80 to it until the g80 has a die shrink! Then and only then will it be fair!" ;)
That's fine :)

I myself am an ATi boy, but I can and will concede that the winrar of the video card race right now is the 8800GTX.

I do hope to god that the R600 will crush the 8800GTX, and as a matter of fact the 1900XTX still rocks because it can fold :)
 
Okay, let me try to rephrase what I had deleted above.

I was looking at some reviewers' sites before the 8800 was even released, and nearly all of them claimed the X1900XTX was the fastest single card out there, even though the 7950GX2 existed.

Keep in mind, the 1950XTX wasn't even out yet.

Does anybody know why the 1900XTX was considered the fastest single card even when the 7950GX2 existed? Is it because the 7950GX2 isn't considered a single card?

I know for a fact the 7950GX2 pwns the 1900XTX in nearly every area, if not all of them.

Does anybody know why it has been like that?
 
It's fair to compare these two cards as long as you don't let fanboyism get to your head and start spouting off about how much ATi sucks.
Keep in mind that even though the comparison is fair, one card is still older, and common sense should tell you the newer card will be faster.
Nvidia did a stand-up job, and the 8800GTX is a beast, with the GTS not trailing far behind.
Also, on the subject of the 7950GX2: I didn't really like that card. The fact is some games have issues with it, and its AF still sucked a bunch. Although it took one PCIe slot, it still has SLI issues. So while it did work for most things, I would rather have the ease of use of the X1950XTX.
 
The only difference between two cards and the GX2 is that it takes up one PCI-E slot. ATI could have easily bolted two of its cards together too, but fortunately did not perpetuate that stupid little contest.

The GX2 is two cards on a PCI-E splitter.
 
sam0t said:
Now there is a lot of misunderstood stuff in your post. The only spot-on point is the need for competition. My post was mostly directed at the utmost fan people who write in blood on these forums, "down with ATI" or "down with Nvidia". What good comes of it if the competition dies? We'd have another GeForce 2, 3, 4 on our hands: slow progress and nothing exciting.

There's no mystery about where ATI's response is; it's most likely off the drawing board and in test production. I don't think I'm alone when I say manufacturers should take their time with products, finish them, and not rush them. We have many examples of rushed products; take a look at the game industry, or at some previous 3D card products that have flopped big time. Better to take their time, take those couple of extra months to bring out a good product, than to rush it to meet the competition.

You're right, I did take your post out of context. After re-reading this thread I see your point now.

Sorry, I just got caught up in the !!!!!! attacks; I should have read your post more thoroughly.

LOL it replaces f.anboy with !!!!!!
 
fuelvolts said:
please tell me you're not 14 and your parents aren't buying it for you...

LOL, I wish my parents had bought me some cool shit like an 8800 when I was 14. Screw toys and clothes.
 
phide said:
Does it make a difference? In the end, it's just a product. We're all free to buy the products we want to buy, regardless of how they go about achieving the things we want them to achieve.


Yes, it makes a difference when a person claims product X is faster when it isn't, and that's why they bought it; clearly they didn't do their research.
 
ati should just glue two x1950xtx's together, call it a x1950xtx2 and call it a day until r600 is released.
 
Soymilk said:
ati should just glue two x1950xtx's together, call it a x1950xtx2 and call it a day until r600 is released.
Quad CrossFire them too :)
 
Soymilk said:
ati should just glue two x1950xtx's together, call it a x1950xtx2 and call it a day until r600 is released.
They would need some sort of "dongle glue" :p
 
fuelvolts said:
please tell me you're not 14 and your parents aren't buying it for you...

Hell! I'm 32, but if my parents will buy it, I'll damn sure let them. :D
 
MrGuvernment said:
Yes, it makes a difference when a person claims product X is faster when it isn't, and that's why they bought it; clearly they didn't do their research.
My comment was in reference to the GX2 and its supposedly being "two cards".

But, if product X is the GX2, then, yes, it is faster.

Obi_Kwiet said:
The GX2 is two cards on a PCI-E splitter.
I don't believe you're considering the totality of the product. This sentence boils the entirety down to one very technical distinction without taking into account the sum of all the other aspects, don't you think?

Perhaps you should define what you believe constitutes a "card" (taking into account that the two PCB distinction is irrelevant).
 
Obi_Kwiet said:
The only difference between two cards and the GX2 is that it takes up one PCI-E slot. ATI could have easily bolted two of its cards together too, but fortunately did not perpetuate that stupid little contest.

The GX2 is two cards on a PCI-E splitter.


The GX2 is considered a single card, as it only has one connector and only takes up one slot. Saying that ATI didn't want to compete in a stupid little contest is childish and makes no sense in this situation. NVidia used a different approach in making a new card, and it was a good approach. If ATI put two PCBs on what looked to be a single card you'd pat them on the back, but simply because Nvidia stacked what looks to be a full card on top of another, it's a stupid contest.
 
If it exists in the same market, it competes. Is anyone going to say, "Hmmm, the 7950 costs only a little more than the X1950 and is faster, BUT it's two cards so it's not really the same thing; I'll buy the X1950"? No, they won't.

As for this generation, I'll say what I said when the G70 was out but the R520 wasn't: if you have a decent DX9 card, WAIT out the R600. There are NO DX10 games out, so unless you're really hurting for performance, the wait for the R600 won't be bad. Either the R600 will be better than the G80, or if it isn't, NV will lower the price on the 8800 to be competitive. Last generation I'm glad I waited for ATI's offering instead of getting a 7800GTX, and I'm very sure it'll be worth the wait again.
 
KevinJ said:
The GX2 is considered a single card, as it only has one connector and only takes up one slot. Saying that ATI didn't want to compete in a stupid little contest is childish and makes no sense in this situation. NVidia used a different approach in making a new card, and it was a good approach. If ATI put two PCBs on what looked to be a single card you'd pat them on the back, but simply because Nvidia stacked what looks to be a full card on top of another, it's a stupid contest.

The whole idea of putting lots of resources into getting little one-ups on the competition irritates me. The 8,000 card refreshes irritate me too. The X1950XTX irritates me as much as the GX2. Still, the GX2 had a point: quad SLI. The X1950XTX was just stupid. The thing about the GX2 is that Nvidia didn't make a faster chip; it just stuck two PCBs in one slot. The R580 was the fastest GPU of the last gen. The GX2 is just another multiple-GPU configuration. What I'm saying is that whether you want to call the GX2 one card or two, ATI did make the fastest chip last generation; it just didn't let you use as many of them in tandem.
 
pfunkman said:
LOL, I wish my parents had bought me some cool shit like an 8800 when I was 14. Screw toys and clothes.

Yeah, childhood isn't even cool. Or wait...
 
The X1950XTX irritates me as much as the GX2. Still, the GX2 had a point: quad SLI. The X1950XTX was just stupid.

The X1950XTX did have a point... to test new memory and see how well it worked with the Ring Bus, which would be implemented on R600.
 
The thread creator makes me laugh. ATI has no match for Nvidia for ~3 months. So after those 3 months, when ATI releases a better card, 'ATI' will be unmatched... for what, the next year? It's a habit for ATI to release a better product at a later time; I don't see why people would suddenly expect something different.
 
chinesepiratefood said:
ehh...you on drugs?

it was more like:

7800 < X1800
7900 < X1900
7950 > X1950
8800 > X1950
8900 ? R600

No, looks like you're more on drugs than he is

7800 > X1800
7900 > X1900
7950 > X1950
8800 > X1950
8900 ? R600
Period. No arguing allowed, I won't tolerate it!
 
nvidiapwnsati said:
No, looks like you're more on drugs than he is

7800 > X1800
7900 > X1900
7950 > X1950
8800 > X1950
8900 ? R600
Period. No arguing allowed, I won't tolerate it!
Sorry to say, but after anyone sees your user name, they're immediately going to disregard your post.

X1800>7800
X1900>7900
7950>X1950
8800?R600
 
Talk about a ridiculous thread. Nvidia is unchallenged until the R600 comes out, end of story. All of you Nvidia and ATI !!!!!!s (apparently I can't say that word) can go back to your caves now.
 
Killa_2327 said:
Talk about a ridiculous thread. Nvidia is unchallenged until the R600 comes out, end of story. All of you Nvidia and ATI !!!!!!s (apparently I can't say that word) can go back to your caves now.

Amen

The End
 
Oh4Sh0 said:
The thread creator makes me laugh. ATI has no match for Nvidia for ~3 months. So after those 3 months, when ATI releases a better card, 'ATI' will be unmatched... for what, the next year? It's a habit for ATI to release a better product at a later time; I don't see why people would suddenly expect something different.

I think that puts an undeserved positive spin on the situation for ATi. Yes, G80 is unchallenged until R600 comes out. And with 3 months to tweak the 80nm process and a very clearly defined target to beat, I have no doubt that R600 will end up a little faster.

However, by that time, most people who were planning to spend $600+ on a video card will already have spent it. And while nVidia gets to enjoy 3-4 months of astronomical profit margins, ATi will be fighting an uphill battle against an entrenched enemy. In 3 months, 8800 GTX prices will have dropped substantially, and the drivers will be matured and well tuned in all major games. On the flip side, ATi will have to launch at a lower price with higher initial costs and will be lagging in driver quality.

So yeah, you can say that ATi typically releases a better card a little later. But you can also say that nVidia is executing their marketing strategy almost perfectly, and their stock price shows it.

ATi got spanked by the 7800GTX for nearly 6 months; then R520 showed up (a little faster, on a smaller process) and got whipped by the 7800 GTX 512 and later the 7900 GTX. Then ATi caught up with the X1900 series, but nVidia was still making major profits because their chips were so much smaller. Then the X1950XTX came out and took a clear lead, only to be stomped by the 8800GTX a month or so later.

And let's not even talk about the mid-range, where ATi has spent the last 3 years just trying to match nVidia.

ATi makes great hardware. But their timing sucks, and so do their marketing and mid-range execution.
 
FaRKle0079 said:
The X1950XTX did have a point... to test new memory and see how well it worked with the Ring Bus, which would be implemented on R600.
That could easily have been done in-house. But I guess the scope creep made them sell it, since the difference between building a couple of dev boards and a complete set of retail boards really isn't that large.

Unless the 8800GT... blows the X1900XTX out of the water in PPD on F@H, I'm likely going to wait for the R600 and make my choice at that point, since I don't have an immediate need for a new gfx card right now.
 
Arcygenical said:
Christmas is the only reason I'm getting mine...

My birthday is the 3rd of December, and of course Christmas is the 25th. I'm thinking/hoping that's a good enough reason to get an 8800 GTS.

Well, that and the fact that it's fast as hell and has great image quality. :D

I wouldn't buy a GTX because it's just too much to spend, and if I ever even considered SLI, I couldn't afford the PSU and the light bill to go along with two of them, even if I would drop 1300 bucks on video cards.

BUT somehow, I think I'll justify a GTS and under the tree is looking realllllyyyy reallllllyyy good... :eek: :D
 
skeeder said:
The average user probably will never notice a difference. Think about this.

I went from a 6800NU to the XTX, and I just swapped that for a GTX. I've seen no difference in frames in CSS, UT, Quake, Doom, etc. The only small differences are in Dark Messiah and FEAR. Most people won't even notice the detail in a picture because they're focused on the game. I find this race... ridiculous. It's for the hardcore: the people with bigger wallets and smaller brains.

Says the guy who went from a 6800 to an X1900 XT to the GTX...

So do you now have a smaller wallet AND a smaller brain :D ?

(I couldn't help it, you left yourself wide open)
 
^eMpTy^ said:
I think that puts an undeserved positive spin on the situation for ATi. Yes, G80 is unchallenged until R600 comes out. And with 3 months to tweak the 80nm process and a very clearly defined target to beat, I have no doubt that R600 will end up a little faster.

However, by that time, most people who were planning to spend $600+ on a video card will already have spent it. And while nVidia gets to enjoy 3-4 months of astronomical profit margins, ATi will be fighting an uphill battle against an entrenched enemy. In 3 months, 8800 GTX prices will have dropped substantially, and the drivers will be matured and well tuned in all major games. On the flip side, ATi will have to launch at a lower price with higher initial costs and will be lagging in driver quality.

So yeah, you can say that ATi typically releases a better card a little later. But you can also say that nVidia is executing their marketing strategy almost perfectly, and their stock price shows it.

ATi got spanked by the 7800GTX for nearly 6 months; then R520 showed up (a little faster, on a smaller process) and got whipped by the 7800 GTX 512 and later the 7900 GTX. Then ATi caught up with the X1900 series, but nVidia was still making major profits because their chips were so much smaller. Then the X1950XTX came out and took a clear lead, only to be stomped by the 8800GTX a month or so later.

And let's not even talk about the mid-range, where ATi has spent the last 3 years just trying to match nVidia.

ATi makes great hardware. But their timing sucks, and so do their marketing and mid-range execution.

I agree 100%.

ATI makes excellent hardware, but they suck at marketing and at anything other than high-end cards.
Their mid-range is laughable at best; only with the X1950 Pro do they finally have something very good at an appealing price, what we'd call a "best bang for the buck" card.
I have no doubt that the R600 will be faster than the current G80, but the problem for ATI is that they won't be competing with the G80 for much longer, but rather with its refreshed version, which will certainly be out if R600 really stomps the G80.
But this article from vr-zone:

http://www.vr-zone.com/?i=4293

Basically, it says they are re-designing the R600 card, which leads me to believe they are not just "shortening" it and improving its cooling solution. They probably weren't expecting G80 to be such a beast in terms of performance and IQ, so they are trying to improve R600.
 
Silus said:
Basically, it says they are re-designing the R600 card, which leads me to believe they are not just "shortening" it and improving its cooling solution. They probably weren't expecting G80 to be such a beast in terms of performance and IQ, so they are trying to improve R600.

Keep in mind that with the R600's supposed launch in about 3 months or less, there is very little they can do now. About the only thing they can play with is clock/memory speeds. It takes a very long time to make any changes to the actual R600 design (remember, changes to the silicon itself can take months before it's back from the fab).
 
Silus said:
http://www.vr-zone.com/?i=4293

Basically, it says they are re-designing the R600 card, which leads me to believe they are not just "shortening" it and improving its cooling solution. They probably weren't expecting G80 to be such a beast in terms of performance and IQ, so they are trying to improve R600.
Oh gosh. The card will require a 6-pin PCI-E power connector, but it will also require one of those 8-pin server connectors found on high-end Xeon and Opteron boards?! :eek: :eek: R600 is looking to be the new Prescott... you know, it really bugs me when these companies think they can just increase the MHz, glue on some extra RAM, and release it as a "next-gen" product. nVidia, ATI, AMD, and Intel are all guilty of this. The industry seriously needs to reevaluate its strategy... so far, the only company that has made an effort to fix this is Intel (which, I should probably add, has done a stellar job of it and proven that it can be done).
 
InorganicMatter said:
Oh gosh. The card will require a 6-pin PCI-E power connector, but it will also require one of those 8-pin server connectors found on high-end Xeon and Opteron boards?! :eek: :eek: R600 is looking to be the new Prescott... you know, it really bugs me when these companies think they can just increase the MHz, glue on some extra RAM, and release it as a "next-gen" product. nVidia, ATI, AMD, and Intel are all guilty of this. The industry seriously needs to reevaluate its strategy... so far, the only company that has made an effort to fix this is Intel (which, I should probably add, has done a stellar job of it and proven that it can be done).

I agree completely.

It's almost a relief that the GTX's aren't less expensive than they are. I can just see it now.


{"newscast"}
"Instead of a White Christmas, 2006 will forever be known for the Brown Christmas, as thousands of nVidia 8800 GTX owners fired up their computers and caused large portions of the country to go dark. Representatives for ATI, which recently merged with AMD and is nVidia's closest competitor in the graphics market, were quick to point out that their solution will draw less power while still delivering high frame rates. A spokesman for the company stated, 'We're confident that our graphics solution will be competitive, while requiring approximately a whole watt less power.'"

{end "newscast"}
 
Don't forget, the people who spend $600 on a new video card are such a SMALL % of the market, and they're also the same people who will usually dish out another $600 when the next fastest video card comes out.

Check the threads; most people getting 8800s already own 7900s...
 
MrGuvernment said:
Don't forget, the people who spend $600 on a new video card are such a SMALL % of the market, and they're also the same people who will usually dish out another $600 when the next fastest video card comes out.

Check the threads; most people getting 8800s already own 7900s...

I don't know how accurate that is. When the 6800 Ultra launched at QuakeCon a couple of years ago, I overheard the BFG rep saying they had sold $20,000 in video cards in the couple of hours since they had opened their booth. That was only 40 video cards.

No doubt, there are more people buying mainstream cards, but they make truckloads of money in the high end. I don't know what the percentages are, but the people that buy the high-end cards represent a disproportionate amount of money for their numbers.

J Macker said:
Oranges are BETTER !! :D

I agree. ;)
 
Mark_Warner said:
I don't know how accurate that is. When the 6800 Ultra launched at QuakeCon a couple of years ago, I overheard the BFG rep saying they had sold $20,000 in video cards in the couple of hours since they had opened their booth. That was only 40 video cards.

No doubt, there are more people buying mainstream cards, but they make truckloads of money in the high end. I don't know what the percentages are, but the people that buy the high-end cards represent a disproportionate amount of money for their numbers.



I agree. ;)

And let's not forget that the enthusiasts (and their money) are the reason we have such innovation right now. If there were no quest for the fleeting performance "crown", we'd be gaming on the equivalent of an MX420 today. The flagship cards are by far the most expensive to produce, from both a technological and a practical standpoint. If no one were willing to buy them, the pace of innovation would slow and stop. I'm giving everyone who upgrades every 3-6 months a well-deserved round of applause. clap*clap*clap*clap*clap*clap*clap*
I work hard so I can join the GPU innovation support group, and soon I'll be a member!
 
That could easily have been done in-house. But I guess the scope creep made them sell it, since the difference between building a couple of dev boards and a complete set of retail boards really isn't that large.

Oh yeah... companies should only test in house, just like EA does with their games, and then when they release them to the masses they're still basically betas that need 8 patches to fix...

It's a much more accurate study to give it to the masses and see if any problems arise than to screw yourself over later with your flagship product.
 
FaRKle0079 said:
Oh yeah... companies should only test in house, just like EA does with their games, and then when they release them to the masses they're still basically betas that need 8 patches to fix...

It's a much more accurate study to give it to the masses and see if any problems arise than to screw yourself over later with your flagship product.

You fail to realize that when a game is given to the masses to beta test, very few actually participate in the beta. Most people are just out to play the game before it's released and don't do their duty of reporting bugs/issues. Many are just out to get a copy to their favorite torrent site. And the rest just want bragging rights that they're playing the beta.

EA has the right idea with in-house testing, and to be honest, IMO their releases are far less buggy/problematic than those of most companies that do run open betas.
 