90nm GTX to have 32 pipes

Lord_Exodia said:
They might be trying to charge everybody 2x the normal price to compensate, and this could have damaged relations with NVIDIA and possibly even ATI. This is only speculation on my part, but it seems to be the norm for these corporate giants to pass the cost of their wrongdoings on to others. If this is true and the 90nm GTX cards do have 32 pipelines, that could help, since NVIDIA would rely less on memory and more on raw GPU performance. Hopefully ATI and NVIDIA have other GDDR RAM manufacturers to go to; it's never safe to put all your eggs in one basket.

Doesn't work that way. GPU performance can never compensate for bandwidth, since they are responsible for different aspects of processing. Games are going to be more shader-heavy in the future, but that doesn't mean bandwidth requirements are going to stop growing. AFAIK only Hynix and Samsung have announced GDDR4 parts: 512Mbit from Hynix and 256Mbit from Samsung.
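
To put some rough numbers on the bandwidth point, here is a back-of-the-envelope sketch in Python; the bus width, clock, and per-pixel byte counts are illustrative assumptions, not the specs of any card in this thread.

```python
# Back-of-the-envelope GPU bandwidth check (all numbers are illustrative).

def memory_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth of a GDDR bus in GB/s."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

def framebuffer_traffic_gb_s(width, height, fps, overdraw, bytes_per_pixel):
    """Approximate color+Z read/write traffic, ignoring texture fetches
    and any compression the hardware applies."""
    pixels_per_sec = width * height * fps * overdraw
    return pixels_per_sec * bytes_per_pixel / 1e9

# Assumed 256-bit bus at 1700 MHz effective (GDDR3-era ballpark):
peak = memory_bandwidth_gb_s(256, 1700)

# 1600x1200 at 60 fps, overdraw of 3, ~16 bytes touched per pixel
# (color read+write plus Z read+write):
demand = framebuffer_traffic_gb_s(1600, 1200, 60, 3, 16)

print(f"peak bus bandwidth : {peak:.1f} GB/s")
print(f"framebuffer demand : {demand:.1f} GB/s (before textures and AA)")
# Texture fetches and antialiasing multiply the demand side. No number
# of extra pipelines helps once the bus saturates; more shader math per
# pixel only changes when you hit the wall.
```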
 
trinibwoy said:
Doesn't work that way. GPU performance can never compensate for bandwidth, since they are responsible for different aspects of processing. Games are going to be more shader-heavy in the future, but that doesn't mean bandwidth requirements are going to stop growing. AFAIK only Hynix and Samsung have announced GDDR4 parts: 512Mbit from Hynix and 256Mbit from Samsung.

It's true that it doesn't work that way. However, NVIDIA and ATI may have to take a different approach if things get too costly for them, other than raising the price to the consumer. Eventually people will stop buying; I know I will if the cards are $1000 apiece.

On the consoles I believe they have a technology called eDRAM or something like that. I believe this could help with memory issues, since it moves some of the bandwidth load off the main memory and closer to the GPU. Maybe ingenuity will win over raw horsepower. Even the fastest car can lose a world rally if it can't take the turns as well. Not to bring the car fanatics in here, but the Ford Focus beats the far superior WRX in rallying from time to time; a little creativity can go a long way, and the driver makes a difference too. My point is that if Samsung has to pay over $300 million for this lawsuit (the largest fine in the history of the semiconductor industry), they will try to make it back some way.
 
meatfestival said:
32 pipe 7800 gtx... oh my... a little bit of cum just came out.

Did anyone else hear C-3PO in their head when they read this, or was it just me?

Yeah, 16x3 in the R580 vs. 32x1 in the 7900 GTX; it's going to be an interesting spring for sure.
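
For what the 16x3 vs. 32x1 shorthand implies, here is a quick per-clock throughput sketch; only the pipe/ALU ratios come from the posts, and the shared 650 MHz clock is an assumption purely for comparison.

```python
# Per-clock throughput sketch for the rumored configs in this thread.
# The clock is a placeholder; only the pipe/ALU ratios come from the posts.

def per_second(units_per_clock, clock_mhz):
    """Operations per second for a given unit count and clock."""
    return units_per_clock * clock_mhz * 1e6

# "16x3": 16 texture pipes with 3 shader ALUs each (R580-style, per the thread)
r580_tmus, r580_alus = 16, 16 * 3
# "32x1": 32 pipes with 1 ALU each (rumored 90nm GTX, per the thread)
g71_tmus, g71_alus = 32, 32 * 1

clock = 650  # MHz, assumed identical for both parts

print(f"16x3: {per_second(r580_alus, clock)/1e9:.1f} G shader ops/s, "
      f"{per_second(r580_tmus, clock)/1e9:.1f} G texels/s")
print(f"32x1: {per_second(g71_alus, clock)/1e9:.1f} G shader ops/s, "
      f"{per_second(g71_tmus, clock)/1e9:.1f} G texels/s")
# One design bets on math per texel, the other on raw texel rate;
# which wins depends on how shader-heavy the game actually is.
```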
 
yevaud said:
Did anyone else hear C-3PO in their head when they read this, or was it just me?

Yeah, 16x3 in the R580 vs. 32x1 in the 7900 GTX; it's going to be an interesting spring for sure.

Which part, the 'oh my' or the 'a little bit of cum just came out'?
 
Marcdaddy said:
LOL, whoever said these are gonna sell for less than half their value is mistaken. I paid $549 for my XFX 7800GTX OCs when they came out and sold them for $480 apiece. When I sell these, I'll still get at least 65% to maybe 70% of the original price I paid, which was $699. So if people think you're gonna get a 7800 GTX 512 card for $350, flay is smoking some crack.

You're wrong... the GTX 512MB is at its high price because of lack of availability right now, but when the GTX 256MB was released, the card was available everywhere in retail. Getting a 512MB now isn't worth it because of all the damn price gouging.
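
For reference, the resale arithmetic behind the quoted claim works out as follows; all prices are the ones quoted in the post above.

```python
# Resale arithmetic from the quoted post (prices as stated there).
paid_256, sold_256 = 549, 480
paid_512 = 699

recovered = sold_256 / paid_256
print(f"7800 GTX 256MB: recovered {recovered:.0%} of purchase price")

# The poster's expected resale floor and ceiling for the 512MB card:
for pct in (0.65, 0.70):
    print(f"{pct:.0%} of ${paid_512}: ${paid_512 * pct:.0f}")

# A $350 resale would be well below that floor:
print(f"$350 is {350 / paid_512:.0%} of ${paid_512}")
```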
 
gtx4u said:
You're wrong... the GTX 512MB is at its high price because of lack of availability right now, but when the GTX 256MB was released, the card was available everywhere in retail. Getting a 512MB now isn't worth it because of all the damn price gouging.

Oh, it's worth it, considering my two 512MB cards will eat anyone else's gaming rig, overclocked or not, compared to mine stock :).

There is no one in this forum who can come close to me unless they have two of the same cards and can match me :D.

I got mine for $699 apiece the first week they came out, well, like the third day they shipped. Regardless, it's the hottest card on the market; the demand is huge for the amount of money it costs. I am surprised so many want it, but then again, I know why they do!

If NVIDIA's next gen is as great a performer as these, or ATI's is, they will be in high demand as well. ATI just needs to get CrossFire going; if they don't, then I guess we all stay with NVIDIA if we want the best. No biggie!
 
Let's just see if the 90nm process helps a lot in the energy consumption department. I hope so, and I hope the supposed 32 pipelines don't blow away the 90nm wattage advantage.
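
As a sanity check on that worry, here is the standard dynamic-power relation P ~ C x V^2 x f with made-up scaling factors; none of these numbers are real 110nm or 90nm silicon data, and the 24-to-32 pipe jump is only the thread's rumor.

```python
# Dynamic power scales roughly as P ~ C * V^2 * f per unit. All scaling
# factors below are assumptions for illustration, not measured data.

def relative_power(cap_scale, volt_scale, freq_scale, unit_scale):
    """Power relative to the old chip: per-unit capacitance, voltage,
    clock, and number of parallel units (pipelines) all multiply in."""
    return cap_scale * volt_scale**2 * freq_scale * unit_scale

# Assumed 110nm -> 90nm shrink: ~20% less capacitance per unit,
# ~7% lower core voltage, same clock, same pipe count:
shrink_only = relative_power(0.8, 0.93, 1.0, 1.0)

# Same shrink, but 24 -> 32 pipes (the rumor) plus a 10% clock bump:
more_pipes = relative_power(0.8, 0.93, 1.1, 32 / 24)

print(f"shrink alone      : {shrink_only:.2f}x the old power")
print(f"shrink + 32 pipes : {more_pipes:.2f}x the old power")
# Under these assumptions the extra units and clock hand back almost
# everything the process shrink saved, which is exactly the worry above.
```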
 
Raudulfr said:
Let's just see if the 90nm process helps a lot in the energy consumption department. I hope so, and I hope the supposed 32 pipelines don't blow away the 90nm wattage advantage.

QFT. I'd hate to see a 600W power supply become too small to run these high-end cards, geez.
 
Hell no, I'm not buying another card for this DX9 generation... it's already fleshed out. The only reason you would want a better card is for Unreal Tournament 2007 and such. I'll wait till DX10 cards start shipping.
 
We don't need a graphics card to play real-life games like... Hide and Seek. Besides, the graphics are uncanny!

Now let's go touch base before they find us... Tag, you're it.
 
NVIDIA didn't want to call the new video card "Ultra" because they were reserving that name for the new core with 32 pipelines, so they named it GTX instead. 'Nuff said.
 
To keep prices up. I don't know about you, but it's tiring to get hosed for a lot of money on a video card knowing it's price fixing. I have two 7800 GTXs from eVGA, and I'm gonna wait a while; besides, the CPU is going to bottleneck those new 90nm cards. Games still haven't used the full potential of the current cards on the market.
 
You don't think F.E.A.R., Call of Duty 2, and other similar recently launched titles are pushing the GPUs? There won't be a CPU bottleneck for a while. The upcoming cards will be far from creating any bottleneck situations.
 
Would NVIDIA's new drivers permit SLI between the upcoming G71 architecture and the current G70-based 7800-series cards, or are we limited to same-generation SLI only?
 
Lord_Exodia said:
On the consoles I believe they have a technology called eDRAM or something like that. I believe this could help with memory issues, since it moves some of the bandwidth load off the main memory and closer to the GPU.

eDRAM is an expensive solution for PCs. Consoles are targeted at specific resolutions, so it's an easy call there, but you'd have to drop in enough eDRAM to support the highest resolutions on a PC, and that would kill yields and send costs through the roof. The best we can hope for is that developers put enough strain on the shaders (with useful effects, not just bad code) to diminish the reliance on fast memory as much as possible.
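
To see why sizing eDRAM for the highest PC resolutions gets ugly, here is the framebuffer arithmetic; the 10 MB figure is the Xbox 360's eDRAM budget, while the formats and resolutions are common-case assumptions.

```python
# Framebuffer size vs. an on-die eDRAM budget. Assumed format: 32-bit
# color plus 32-bit depth/stencil per sample (8 bytes per sample).

def framebuffer_mb(width, height, aa_samples, bytes_per_sample=8):
    """Color + Z/stencil storage for one render target, in MB."""
    return width * height * aa_samples * bytes_per_sample / 2**20

# Console case: 720p. The Xbox 360 ships 10 MB of eDRAM and tiles
# anything that doesn't fit.
print(f"1280x720,  no AA : {framebuffer_mb(1280, 720, 1):.1f} MB")
print(f"1280x720,  4x AA : {framebuffer_mb(1280, 720, 4):.1f} MB")

# PC case: you'd have to size for the worst screen anyone plugs in.
print(f"1600x1200, 4x AA : {framebuffer_mb(1600, 1200, 4):.1f} MB")
print(f"2048x1536, 4x AA : {framebuffer_mb(2048, 1536, 4):.1f} MB")
# Tens of MB of on-die eDRAM is exactly the yield and cost killer
# the post above describes.
```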
 