What happened to the 8900s???

Yep, I doubt higher core clocks for the R600 would have done much good, seeing how people OC these things very high without too much improvement in performance. :(

But all this R600 talk is pulling the thread off topic; this was about the 8900s, right? I really doubt we will see these cards; most likely Nvidia will go straight for the G90, which is of course based on the G80 architecture with some improvements. No matter what happens, the next-gen cards had better provide some solid DX10 performance, as currently the cards are DX9 with just a DX10 capability sticker slapped on them.
 
That's a pretty ignorant statement to make. All cards suck at DX10 games, and will surely get hammered by Crysis.

Crytek already said that Crysis can be maxed out with a single 8800 GTX, with smooth framerates.
 
I agree. While the HD 2900 XT isn't all we had hoped for, it's not a terrible and nearly useless piece of equipment. It's just not the best choice on the market.

Hotter, louder, more expensive, and dubious performance with AA enabled (basically lower for the most part), and this is in comparison to Nvidia's lower-end cards.

I agree that we should not say this card sucks unless it can't stand up to cards in the same price range, but it cannot even do that. Why can't we say a card sucks if it's beaten in basically every aspect? Just because it only loses slightly in every aspect doesn't mean there's any more reason to buy it than if it lost by a lot in every aspect. It's hardly a massive failure, and it's still an impressive piece of kit; there's just no reason why anyone would get it.
 
I think what Dan is saying is that while it can't compete with the current generation, that doesn't mean it's a crap piece of equipment. In truth, it still took a lot of incredible design and so on to pull off what it can do, and it can still easily outperform previous-generation cards. It also has a lot of features yet to be used. The problem is, it lost this round because the 8800s were just that much better. But in the big scheme of things, it still improved over the previous generation, something the FX series couldn't really claim. :mad:
 
Interesting, since I'm in the market for a new video card. Do they say how well it runs on a GTS?

So far, actually, I think the only thing that's been said is an "8800-class card," to be honest. The latest playable demo stands at some gaming conference that starts soon (forget the name right now) were reported by someone (at the incrysis.com forums) to be running on:

Q6700
4GB RAM (2 should be fine)
8800GTS 640MB

All the monitors are either 17- or 19-inchers, so you're talking about 1280x1024 max res. As was said before, this thing will most likely be running smooth. And I continuously love how so many people (not directed at samduhman) think these hackjob DX10 patches = DX10 games or DX10 gaming :rolleyes:. Makes me want to stay away from threads in general on here sometimes... I mean wow, look at the site they are visiting: HardOCP. There's a reason why the H hasn't done benchies on the wannabe DX10 titles :eek:. This is just like "wait for E3", but wait for Crysis; it will give a more accurate measurement.

EDIT: Personally, I think a 3GHz C2D dual core will net the performance of a 2.6GHz C2D quad core, and that 2GB of RAM with an 8800 GTS will be needed, at the least, to play the game at max. 640MB is probably better for AA and eye candy, as I guess the 320MB will be capped at 1280x1024 and the 640MB at --maybe-- 1600x1200. But that's just a guess; especially between dual and quad core, I wonder what the performance differences will be.
 
Crytek already said that Crysis can be maxed out with a single 8800 GTX, with smooth framerates.

And you believe them, right? I doubt it will play to my liking at 1920x1200, with everything "maxed out".
 
Crytek already said that Crysis can be maxed out with a single 8800 GTX, with smooth framerates.

You are incorrect, sir.

Crytek's claim was that Crysis was playable at 'ultra' settings with an 8800 GTS and E6600, with still higher settings available to future hardware.

http://www.gamesradar.com/us/xbox36...5081&sectionId=1001&pageId=200706121061280027

It's worth noting that no resolution is stated here. It would be overoptimistic to assume Yerli is suggesting a 1920x1200 resolution.
 
If they release an 8950GX2 without releasing an 8900GX2 that's available to the public, I'll go 100% ATI. Their naming scheme is always a little screwed up, but not that bad. O.O
I am sorry, but...
You are going to go with ATI, because of the name of a card?

What if the card performs like a beast?
And it is the bomb?
What if it is the Chuck Norris of graphics cards?

Guess you won't be getting it. The naming isn't good enough?

The 8900s were basically proof that the INQ isn't straight facts.
The 8900s seem to be a forgotten rumor.

G92 is where it's at.
 
Hotter, louder, more expensive, and dubious performance with AA enabled (basically lower for the most part), and this is in comparison to Nvidia's lower-end cards.

Now this is starting to annoy me: in no test or user-based comment have I read that the 2900 XT runs hotter than the competition. Please correct me if I am wrong about the heat issue. From my personal experience and from forum posts, I have gathered that the 2900 XT runs just as hot as the 8800 cards. Note that I am just talking about the heat, nothing else.

Crytek already said that Crysis can be maxed out with a single 8800 GTX, with smooth framerates.

I bet Crysis will have something like a 7600/X1650 Pro as minimum requirements, but we all know those cards won't run it, at least not the way gamers want. The same goes for the current high end; I do not believe the current high end can max out Crysis, especially when using DX10. I have gotten shafted by the PR so many times that I refuse to believe anything they say before I see it for myself, but this of course is only my personal belief. :)
 
OMG, you are so funny! You invented a card name by mixing higher numbers following the current numbering scheme together with acronyms from previous generations of cards, and then added a nonsense technology name preceded by the word hyper.

I'm not sure if that sentence was poorly assembled, but it took me a while to digest. :D
 
Going by nVidia's launch model naming in the last couple generations (G70 = high-end, G7x = midrange; G80 = high-end, G8x = midrange), wouldn't the G92 indicate a sort of next-generation midrange card?

Given the current HUGE gap between the 8800/2900s and... well, everything below them, getting something decent in the middle makes a ton of sense from a business standpoint, as opposed to a 'what I want' standpoint. :(
 
I thought it was confirmed a few days ago that G92 was actually the codename for the midrange card that would bridge the missing gap between the 8800s and the 8600. The original G92 rumor was just a rumor attributed to the wrong card, but the G90 high end is happening regardless.
 
I am guessing that maybe more power used = more heat.

Which leads to the conclusion that the 2900's heatsink is just more efficient than the one on the 8800, because it can handle the heat produced by the bigger power consumption and still run at roughly the same temperature in Celsius as the 8800, according to the heat measurement tests.

I do acknowledge that my reasoning might not be totally accurate, because the 2900 produces more noise than the 8800 at stock levels. But be that as it may, at stock levels the cards do produce roughly the same amount of heat.

So what I'm getting at is that I know you can say the 2900 eats more power and is louder, but you really can't say it's hotter. Gah, I now realize that you can twist this any way you want and be just as correct about the matter. :) :p
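
One rough way to frame that "same temperature at higher power draw" idea is effective thermal resistance (degrees of temperature rise per watt dissipated). The sketch below is only an illustration; every wattage and temperature in it is an assumed placeholder, not a measured figure for either card:

```python
# Back-of-envelope sketch: a cooler that holds the same GPU temperature while
# dissipating more watts has a lower effective thermal resistance (deg C per watt).
# All numbers below are assumed placeholders, not measurements of real cards.

def thermal_resistance(gpu_temp_c: float, ambient_c: float, power_w: float) -> float:
    """Effective thermal resistance = temperature rise above ambient / power dissipated."""
    return (gpu_temp_c - ambient_c) / power_w

# Hypothetical scenario: both cards sit in a 25 C room and load at ~80 C,
# but one is assumed to dissipate 160 W and the other 210 W.
card_a = thermal_resistance(80.0, 25.0, 160.0)  # stand-in for an 8800-class card
card_b = thermal_resistance(80.0, 25.0, 210.0)  # stand-in for a 2900-class card

print(f"Card A: {card_a:.2f} C/W, Card B: {card_b:.2f} C/W")
# Card B comes out with the lower C/W, i.e. its cooler moves more heat per degree
# of rise -- the "more efficient heatsink" point -- but both cards still dump all
# of that heat into the room.
```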
 
Okay, has anyone else noticed that the 8800 GTXs on Newegg have now dropped well below $500? In fact, some are already retailing below $500 before rebates. :eek: With Ultras dropping in price as well, is this a sign that a fall refresh may still occur? :D
 
Not necessarily, it could be a sign of slumping sales.
 
He's right, you know. Prices will stop falling/go back up at the end of August (back to skool sale time!!!).

As for the 89xx, I checked a couple of months ago, and the only place (besides forum posts) it was ever mentioned was in a very short article in the Inq, quoting "un-named Chinese sources".
 
Which leads to the conclusion that the 2900's heatsink is just more efficient than the one on the 8800, because it can handle the heat produced by the bigger power consumption and still run at roughly the same temperature in Celsius as the 8800, according to the heat measurement tests.

I do acknowledge that my reasoning might not be totally accurate, because the 2900 produces more noise than the 8800 at stock levels. But be that as it may, at stock levels the cards do produce roughly the same amount of heat.

So what I'm getting at is that I know you can say the 2900 eats more power and is louder, but you really can't say it's hotter. Gah, I now realize that you can twist this any way you want and be just as correct about the matter. :) :p

Something to remember: while the card itself may not get any hotter because of a good cooling design, that doesn't mean it isn't still putting out heat. It may not necessarily be putting it out inside the case, but it has to put it out somewhere. It's the law of conservation of energy at work: the energy has to go somewhere.

This isn't a concern in the dead of a northern winter, but in the south it is an issue (where I am, on the Gulf Coast, it's an issue at least 6 or 7 months out of the year!) because you have to crank the A/C just that much more to keep the room where the box sits at a reasonable temperature. I know this because I have an 8800 GTS, and while the GPU is at ~54C normally, the room where the computer stays gets warmer than other rooms in the house. It has nothing to do with the room either, because I moved it twice thinking it was a ventilation problem.

I think we can all accept that the 8800 GTS uses less power than a 2900 XT without screaming about nvidiot bias, right? :p Along with that, I'm sure a GTX gets hotter than a GTS. Personally, I would NOT want to share office space with a GTX, much less either of them in SLI (or 2900 XTs in CF). That is one issue with the heat. The other is that some *is* going to leak into the case; there aren't many ways around that with air cooling. That leads to higher ambient system temps, etc.

There is a higher cost to power consumption than just "heat": higher electric bills for powering the device and for the A/C, not to mention general things like more noise and discomfort if the device manages to overwhelm the A/C temporarily. Personally, I'd like both manufacturers to start taking power consumption and heat into consideration, because it is totally getting out of hand.
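
To put a very rough number on the "higher electric bills" point, here is a quick sketch; the 60 W delta, the hours of use, the $0.10/kWh rate, and the A/C overhead are all assumed figures for illustration, not measurements of any particular card:

```python
# Back-of-envelope cost of an assumed extra 60 W of card power draw.
# Every number here is an assumption chosen for illustration only.

extra_watts = 60          # assumed difference in power draw between two cards
hours_per_day = 4         # assumed daily gaming time
price_per_kwh = 0.10      # assumed electricity price in $/kWh
ac_overhead = 1.3         # assume ~30% extra A/C energy to pump the heat back out

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = extra_kwh_per_year * price_per_kwh * ac_overhead

print(f"~{extra_kwh_per_year:.0f} kWh/year extra, roughly ${cost_per_year:.2f}/year")
# ~88 kWh/year extra, roughly $11.39/year under these assumptions -- not huge,
# but it scales with the card, the A/C, and however many hours the box actually runs.
```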
 