8800 GTX released too early??

5150Joker said:
They could've spent more time figuring out how to make it shorter, for a start. Second, they could've waited a bit longer so they could build it on a smaller process as well. It really is pointless to get an 8800 GTX right now if you already have a 7900/X1900-class card, because there really isn't any game the latter cards can't handle. Having an uber high-end DX10 card right now is just for the sake of a bigger e-penis and not much else. Who knows how well it will perform when actual games that demand its power, like Crysis, Quake Wars, BioShock etc., surface?

Please do try running Oblivion or FEAR (especially Extraction Point) @ something >= 1600x1200 with everything maxed, on any card that is not an 8800 GTX/GTS.
Quake Wars? Didn't you see Quake Wars running on a G80 card at the NVIDIA LAN? It was flawless and extremely smooth.
Crysis? Didn't you see the first DX10 footage of Crysis running on a G80? Framerates weren't amazing, but it ran pretty well, and the game isn't even finished or optimized.
Another upcoming title: UT 2007. I assume you've seen the trailers. Those must've been done on some SLI / Crossfire configuration. Now, if an 8800 GTX is usually faster than any SLI / Crossfire configuration, don't you think it will run UT 2007 even better?
G80 was not rushed, and there are already plenty of games out there that needed this extra boost in performance and image quality.
 
And just for the record, for those that wanted a "shorter" card.

http://techreport.com/onearticle.x/11236

This clearly says ATI's R600 will be longer than an 8800 GTX, which already says something...
IMHO, it means more than that. I think ATI wasn't expecting the amazing performance and IQ delivered by G80, so they are trying to improve their card to be at least competitive with G80.
That article also suggests that the rumor of 300W for JUST the card is true, which is insane. Instead of complaining about the 8800 GTX's length, I would be more worried about the fact that the R600 sucks up 300W.
 
Hulk said:
In my opinion, and this is only what I think, yes, the G80 was rushed. If G80 did not support DX10, no one would make such a big deal out of it. What is the point of having a video card that supports DX10 when there are no games that support it? Sure, you could play Oblivion at INSANE resolutions, but why did they have to stick DX10 in it and charge extra for that?

Just think about it for a second. People who bought the G80 for $650 are also the people who are going to be buying the next NVIDIA DX10 card, because chances are that when the first DX10 game comes out, the G80 will be too weak to play it.

Who says they "charged extra" for DX10? How do you know what the price would have been if they didn't add DX10? I am pretty sure it still would have been a $650 card.

As I recall, ATI's first DX9 cards played DX9 games pretty damn well. You don't think ATI and NVIDIA already have DX10 games to test with?
 
MrGuvernment said:
Who says they "charged extra" for DX10? How do you know what the price would have been if they didn't add DX10? I am pretty sure it still would have been a $650 card.

As I recall, ATI's first DX9 cards played DX9 games pretty damn well. You don't think ATI and NVIDIA already have DX10 games to test with?


Exactly. It's not like NVIDIA or ATI have to wait as long as we do to get apps to test their hardware. Software devs work very closely with IHVs to make sure their software works well on their hardware; it works out well for both parties, and for us.

DX10 support is more of a bonus right now. For $650 you get a card that makes all others look stupid in several different areas, plus the ability to run DX10 code in the near future.
 
3 ridiculous claims that really bug me about the ongoing debate:

1. "There are no games demanding enough to warrant the 8800 series cards."
Eh? Try running Oblivion or FEAR full tilt at medium (1600x1200/1680x1050) to high (1920x1200 and up) resolutions and tell me with a straight face that it's running smoothly on anything less than an 8800 card, so I can laugh at you.

2. "There are no DX10 games, so what's the point of getting a DX10 card?"
People are acting like the 8800s can ONLY run DX10. Can't you just be satisfied that they are the fastest DX9 cards on the market by a large margin? I personally won't be in any rush to Vista-fy my PC, and I'm guessing neither will many of us here. A blazing fast DX9 card to last another year, maybe a year and a half, until Vista/DX10 gets more established sounds like a great idea to me. The fact that it will also run DX10 games is a bonus.

3. "The 8800s are way overpriced."
I believe the MSRP for the 7800GTX was $650 when it debuted. The 8800GTX is at a comparable price. If the 8800s are overpriced, then so were all other recent flagship videocards at launch (and admittedly there's something to be said for that). But to claim that the 8800s are overpriced compared to previous offerings is ludicrous. The 8800GTX is overpriced for my budget, but that doesn't make it overpriced for what it offers or for current market conditions.
 
MentatYP said:
3. "The 8800s are way overpriced."
I believe the MSRP for the 7800GTX was $650 when it debuted. The 8800GTX is at a comparable price. If the 8800s are overpriced, then so were all other recent flagship videocards at launch (and admittedly there's something to be said for that). But to claim that the 8800s are overpriced compared to previous offerings is ludicrous. The 8800GTX is overpriced for my budget, but that doesn't make it overpriced for what it offers or for current market conditions.

Agreed. Matter of fact, I remember previous-generation cards being a fair amount more expensive at launch, and in short supply...
 
It's actually really good for everyone that the 8800 was released. By the time we really need it, the price will have gone down.
 
MentatYP said:
3 ridiculous claims that really bug me about the ongoing debate:

1. "There are no games demanding enough to warrant the 8800 series cards."
Eh? Try running Oblivion or FEAR full tilt at medium (1600x1200/1680x1050) to high (1920x1200 and up) resolutions and tell me with a straight face that it's running smoothly on anything less than an 8800 card, so I can laugh at you.

2. "There are no DX10 games, so what's the point of getting a DX10 card?"
People are acting like the 8800s can ONLY run DX10. Can't you just be satisfied that they are the fastest DX9 cards on the market by a large margin? I personally won't be in any rush to Vista-fy my PC, and I'm guessing neither will many of us here. A blazing fast DX9 card to last another year, maybe a year and a half, until Vista/DX10 gets more established sounds like a great idea to me. The fact that it will also run DX10 games is a bonus.

3. "The 8800s are way overpriced."
I believe the MSRP for the 7800GTX was $650 when it debuted. The 8800GTX is at a comparable price. If the 8800s are overpriced, then so were all other recent flagship videocards at launch (and admittedly there's something to be said for that). But to claim that the 8800s are overpriced compared to previous offerings is ludicrous. The 8800GTX is overpriced for my budget, but that doesn't make it overpriced for what it offers or for current market conditions.

Laziness is all that kept me from posting exactly what you have said here. I've been thinking the same things, in fact; those were the reasons that made me buy a G80.
 
I don't think the G80 was released too early. The image quality enhancements and raw power under DX9 are reasons enough to upgrade.

What I do think is that the 8800GTX will be short-lived. At 90 nm it's quite expensive to manufacture, even though the yields are supposed to be good. I'm not sure NVIDIA will move to 80 nm for its next G80; I think it'll move to 65 nm so we can have mainstream and budget cards based on G80, and maybe a GX2 version.
 
phide said:
Why give up? I find this whole nonsensical debate very amusing.

EDIT: Hell, might as well join the fray tonight.


This means you've confirmed that Crysis uses ~one gigabyte of surface data?

Do tell, sir!

More is better. They gave the 7950 GX2 1 GB, why not the 8800?
 
nobody_here said:
You know, it's really hard to take you seriously, man. You start a thread that is as worthless as every other post you have made: simple one-liner posts, avoiding the issues, and not providing any significant info. So I tell you what, I'm gonna put on my tinfoil hat just for you, since we can't seem to talk reason with you and you obviously don't understand things as well as you wish you did... k? ;)


To answer the GDDR4 vs. GDDR3 question: um, no, not now. GDDR3 is much more mature, cheaper, and easier to get at high speeds with stability and low latencies.

GDDR4 is new, harder to get in quantities to satisfy a hard launch I would think, has higher latency than GDDR3 (which negates the higher clock speed), is more expensive, etc.

Just like the move from DDR to DDR2 system memory: DDR2 sucked badly and was more expensive at first.

Eventually GDDR4 will replace GDDR3 in video cards, but by then NVIDIA will be using it too, I'm sure. They just decided to go with "what we know" instead of banking on GDDR4 being available and cheap enough not to affect a hard launch.

Here we go with the black helicopter theories...

If you believe that R600 is going to come out with GDDR4 and a 512-bit bus and is not going to be too expensive or too hard to get (or both), then you have to be open to the idea that NVIDIA will also have a card ready to launch at the same time with GDDR4 memory and a 512-bit bus. Did you ever stop to think that NVIDIA might have released this card at the perfect time to soak up the profits that ATI won't be getting? That means more money to pour into the next big thing, and maybe a non-neutered 8900GTX with GDDR4 and a 512-bit memory bus already waiting... just waiting for ATI to launch, then bam, steal their thunder... and be able to do it for cheaper because they reaped the benefits of the holiday shopping craze, so they don't have to charge an arm and a leg for the newer tech.

I hope ATI does just that: slams the G80 with a $500 stunner of a card with massive availability, because it would drive G80 prices down and force NVIDIA to show their cards, all of which is good for us, the consumers. But it just isn't going to happen unless AMD wants to bankrupt that part of their business, and trust me, with the state of affairs AMD is in right now in the desktop market, they don't want to lose any more mainstream market share.

My point is, all of this is speculative until it is released. Right now the R600 can't beat anything, because it cannot be bought by anyone. Do you see?

Also, we don't know if a 512-bit memory bus will be of any use at any time in the near future. Remember SM3.0? The NV 6 series featured SM3.0, but was it powerful enough to use it fully and really take advantage of it? No, that didn't come until the 7 series.

To be honest, a 512-bit bus and GDDR4 could be a really expensive waste in first-gen DX10 cards. We may not see a 512-bit bus or GDDR4 provide any advantage at all until, say, the end of 2007 or early 2008. So in the meantime, ATI will be passing on all of the costs that come with an expensive PCB to accommodate a 512-bit bus, all of the increased costs involved in going with GDDR4 memory, and all of the associated availability issues. And I am going out on a limb here...

I bet that if an ATI X2900XTX with a 512-bit bus and 1 GB of GDDR4 memory comes out and can barely be found and is selling for $800+ but is no faster than G80, you will be one of the ones in line waiting to get it just because "it has a 512-bit bus and GDDR4 memory, so it has to be better, right?" I mean... 512 is more than 384, and GDDR4 is newer than GDDR3, right??? So it has to be better, right??
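
For what it's worth, here is the back-of-the-envelope math behind the "bigger numbers" argument. The 8800 GTX figures below are its published specs (384-bit bus, 900 MHz / 1.8 GT/s GDDR3); the 512-bit GDDR4 clock is only a placeholder for the rumored R600, nothing confirmed. A rough sketch, not a benchmark:

    #include <cstdio>

    int main()
    {
        // Theoretical memory bandwidth (GB/s) = bus width in bytes * effective data rate (GT/s).
        double bw_8800gtx    = (384.0 / 8.0) * 1.8;  // 384-bit bus, 900 MHz GDDR3 (1.8 GT/s) -> ~86.4 GB/s
        double bw_r600_rumor = (512.0 / 8.0) * 2.2;  // hypothetical 512-bit bus, 1.1 GHz GDDR4 (2.2 GT/s) -> ~140.8 GB/s
        printf("8800 GTX:             %.1f GB/s\n", bw_8800gtx);
        printf("512-bit GDDR4 rumor:  %.1f GB/s\n", bw_r600_rumor);
        return 0;
    }

Of course, more theoretical bandwidth only matters if the GPU can actually make use of it, which is exactly the SM3.0-on-the-6-series point above.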

Why don't you go start a thread over at some other forum board where a bunch of "bigger is better" types hang out, so you can find some people who believe the crap you believe with nothing to base it on other than "it's newer and bigger, so it has to be faster"... :rolleyes:

Now why would ATI sell an X2900XTX for 800 dollars? That's just silly. BTW, I didn't read most of the crap you type because, for a person who types so much just to defend the 8800, I suspect that you are nothing more than an NVIDIA salesman or something.
 
phide said:
Why would you assume that performance in these games will be sub-par? Going from what I've seen, I expect performance in the titles you've listed to be quite excellent. G80 is more than two times faster than the already capable G71 in a number of scenarios, especially the nasty pixel-shader-intensive ones we're going to become very familiar with.


It couldn't possibly be for unparalleled performance and image quality in current games, of course. Why, buying a $650 graphics card for anything other than increasing the size of your so-called "e-penis" would be quite pointless.


Had they built it on a smaller process, clock speeds would be higher, and we could shrug off this totally pointless 2x+ performance boost over the previous architecture. Only two times faster? What a sham indeed!

There will never be a "perfect" graphics card, but having been a graphics card buyer for over eight years, I'd say this is as close as anyone has ever come. I don't think anyone would really disagree with that, and if they do, certainly they'd have a better reason than "it's too long" (or so I would hope).

And you're guaranteeing that the 8800 will perform well in DX10 games at high resolutions with high settings based on what? Your Christmas urge to buy?
 
MentatYP said:
3 ridiculous claims that really bug me about the ongoing debate:

1. "There are no games demanding enough to warrant the 8800 series cards."
Eh? Try running Oblivion or FEAR full tilt at medium (1600x1200/1680x1050) to high (1920x1200 and up) resolutions and tell me with a straight face that it's running smoothly on anything less than an 8800 card, so I can laugh at you.

2. "There are no DX10 games, so what's the point of getting a DX10 card?"
People are acting like the 8800s can ONLY run DX10. Can't you just be satisfied that they are the fastest DX9 cards on the market by a large margin? I personally won't be in any rush to Vista-fy my PC, and I'm guessing neither will many of us here. A blazing fast DX9 card to last another year, maybe a year and a half, until Vista/DX10 gets more established sounds like a great idea to me. The fact that it will also run DX10 games is a bonus.

3. "The 8800s are way overpriced."
I believe the MSRP for the 7800GTX was $650 when it debuted. The 8800GTX is at a comparable price. If the 8800s are overpriced, then so were all other recent flagship videocards at launch (and admittedly there's something to be said for that). But to claim that the 8800s are overpriced compared to previous offerings is ludicrous. The 8800GTX is overpriced for my budget, but that doesn't make it overpriced for what it offers or for current market conditions.

The 8800 isn't overpriced; you can get an 8800 GTS for $464 shipped at tigerdirect.com. But the question is how it will perform in DX10 compared to the X2900.
 
SX2233 said:
The 8800 isn't overpriced; you can get an 8800 GTS for $464 shipped at tigerdirect.com. But the question is how it will perform in DX10 compared to the X2900.
Sigh--the point being, it outperforms all current cards in DX9 now, has better image quality, and will outperform them all in DX10, since they can't run DX10 at all.

It works well, performs well, displays well, represents good value for money in every way. There is no reason at this time to buy anything else unless you can't afford it, or unless you can afford and want an 8800GTX. If all that is true, then how could it possibly matter how well it will compare to a card that won't be available for at least 3 months?

It may be "too early" if you're an ATi partisan and you're pissed that your favorite brand can't compete, but it's not too early if you want to buy one of the two best video cards ever made at a reasonable price, right now.
 
SX2233 said:
Now why would ATI sell an X2900XTX for 800 dollars? That's just silly. BTW, I didn't read most of the crap you type because, for a person who types so much just to defend the 8800, I suspect that you are nothing more than an NVIDIA salesman or something.

The answer to your question is that they might need to, in order to implement features such as 1 GB of GDDR4 on a 512-bit memory bus. In case you didn't already know, ATI doesn't exactly have the best track record of delivering the highest performance per dollar (just for you, that would mean biggest "bang for the buck").

I mean, come on, using your reasoning: if a 384-bit, 768 MB card sells for $650, surely a card that beats the above-mentioned $650 card while sporting "better" features such as higher memory capacity, more advanced memory modules, and a wider memory bus (hence a much more complex PCB, if you didn't know) would demand a price of at least $800, right?

No, the problem is you don't know half as much as you think you do. It's really simple: read and understand. Why don't you try countering my points, or the points of many others, with something of substance instead of one-liner, simplistic, narrow-minded responses?

You do read English, right?

I wish we could get a poll going to vote this guy for the asshat of the month award without getting the thread shut down...

Seriously, SX2233, why don't you take the time to give a rebuttal of any substance whatsoever?
 
MentatYP said:
medium (1600x1200/1680x1050) to high (1920x1200 and up) resolutions

Since when is 1600x1200 only a medium resolution? I know more people are running those crazy resolutions on big LCDs than ever before, but I think 95% of PC gamers are still at 1600x1200 or lower.

I know it's up to interpretation, but am I wrong? :p

But hey, I only use 1440x900 and Oblivion still runs like crap. (I blame the game, not my computer.)
 
If you want to know how the G80 will perform in DX10, you can get a teaser just by installing the latest DirectX SDK, which contains some demos using the latest DX features.
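
And if you'd rather poke at it yourself than just run the SDK demos, a minimal sketch along these lines (assuming Vista, a DX10-class card, and the Direct3D 10 headers/libs that ship with the SDK) is enough to confirm that the runtime will hand you a hardware Direct3D 10 device; it checks capability, not performance:

    #include <windows.h>
    #include <d3d10.h>
    #include <cstdio>
    #pragma comment(lib, "d3d10.lib")

    int main()
    {
        ID3D10Device* device = NULL;
        // Ask the runtime for a hardware Direct3D 10 device. This only succeeds on a
        // DX10-class card with a WDDM driver under Windows Vista.
        HRESULT hr = D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL,
                                       0, D3D10_SDK_VERSION, &device);
        if (SUCCEEDED(hr))
        {
            printf("Got a hardware D3D10 device - the card and driver are DX10-ready.\n");
            device->Release();
        }
        else
        {
            printf("No hardware D3D10 device available (hr = 0x%08lX).\n", (unsigned long)hr);
        }
        return 0;
    }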
 
Stoly said:
What I do think is that the 8800GTX will be short-lived. At 90 nm it's quite expensive to manufacture, even though the yields are supposed to be good. I'm not sure NVIDIA will move to 80 nm for its next G80; I think it'll move to 65 nm so we can have mainstream and budget cards based on G80, and maybe a GX2 version.

Yet another reason why NVIDIA will have a G81 ready at 80 nm (or smaller) as soon as R600 is out.
 