G80 details in full @ the Inq

Nothing we didn't expect, but I'm concerned about availability. If the Inq is right and there are massive shortages at release, the prices are going to be absurd. I really hope not, since I was hoping to pay MSRP for a GTX or two; I refuse to pay $800 for one GTX. :mad:
 
If both GPUs are severely CPU-bound, then I think I'll go with the GTS, especially if it's at the $399 mark (here's hoping it stays there). At that price I might as well go with it rather than a 360, and hope/pray GoW comes out for PC soon.
 
not even worth upgrading until there is a CPU available that can max it out...
 
Inq said:
Anisotropic filtering has been raised in quality to match for ATI's X1K marchitecture, so now Nvidia offers angle-independent Aniso Filtering as well, thus killing the shimmering effect which was so annoying in numerous battles in Alterac Valley (World of WarCraft), Spywarefied (pardon, BattleField), Enemy Territory and many more. When compared to GeForce 7, it looks like GeForce 7 was in the stone age compared to the smoothness of the GeForce 8 series.

Hooray!
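
For anyone wondering what "angle-independent" actually buys you: the anisotropy of a texture fetch comes from how stretched the texel footprint is in screen space, and older hardware cut the sample count for footprints at awkward angles. Here's a rough, illustrative model of the difference (the function and the clamp values are made up, this is not NVIDIA's actual hardware algorithm):

Code:
import math

def aniso_taps(ddx, ddy, max_aniso=16, angle_independent=True):
    # Rough illustrative model, not NVIDIA's actual hardware algorithm.
    # ddx, ddy are the screen-space derivatives of the texture coordinate,
    # e.g. (du/dx, dv/dx) and (du/dy, dv/dy).
    lx = math.hypot(*ddx)                 # footprint length along screen x
    ly = math.hypot(*ddy)                 # footprint length along screen y
    major, minor = max(lx, ly), max(min(lx, ly), 1e-8)
    ratio = major / minor                 # how stretched the texel footprint is

    if not angle_independent:
        # Crude stand-in for the old "angle optimisation": footprints whose
        # major axis sits near 45 degrees get their anisotropy clamped harder,
        # which is what produced the flower-shaped AF tester pattern and the
        # shimmering on oblique surfaces.
        du, dv = ddx if lx >= ly else ddy
        angle = abs(math.degrees(math.atan2(dv, du))) % 90
        if 22.5 < angle < 67.5:
            max_aniso //= 2

    return min(ratio, max_aniso)          # taps taken along the major axis

With angle_independent=True the clamp is the same for every orientation, so a wall viewed at 45 degrees gets the same number of taps as a floor, which is what kills the shimmering.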
 
Haste266 said:
not even worth upgrading until there is a CPU available that can max it out...

That's not true at all, dude. You can still take advantage of it by running at high resolutions and keeping all the eye candy maxed.
 
I imagine CPU limiting will be almost non-existent when running DX10 with plenty of eye candy.
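
Back-of-the-envelope numbers on why high res shifts the load to the GPU: the CPU side (game logic, draw-call submission) barely cares about resolution, while fragment shading scales with pixel count:

Code:
base = 800 * 600
for w, h in [(800, 600), (1280, 1024), (1600, 1200), (1920, 1200), (2560, 1600)]:
    pixels = w * h
    print(f"{w}x{h}: {pixels:>9,} pixels (~{pixels / base:.1f}x the shading work of 800x600)")

1920x1200 pushes almost 5x the pixels of 800x600 every frame, so the GPU hits its wall long before the CPU does.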
 
Haste266 said:
not even worth upgrading until there is a CPU available that can max it out...

Silly statement. My 3.6GHz E6600 should do just fine.
 
Anyone know what 16xQ is? Is this 16-tap Quincunx AA? The only time I used Quincunx was back on my GF3 card, and I recall it being very blurry.
 
I would like to see some Oblivion benchies, NFS too =p maybe Black & White 2, seeing as those games stress vcards a lot.
 
Quincunx AA was introduced with the GeForce 3, essentially giving near-4xAA quality with almost no performance hit over 2x. It only worked at certain resolutions and didn't work in every game. Dunno where your blur came from; Quincunx worked and looked fine to me.

16xQ is the quality version; it's supposedly going to apply both AA and alpha (transparency) AA algorithms. Full details are still unknown, and this is coming from the Inq, so no idea how much of it is true, but I'm glad Fuad didn't write it :eek:
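
On the blur question: Quincunx stores two samples per pixel and resolves with a 5-tap kernel, 1/2 on the pixel's own sample and 1/8 on each of the four corner samples it shares with its neighbours. Since the corner taps belong to adjacent pixels, the resolve averages across pixel boundaries, which is exactly where the softness comes from. A toy sketch (the sample layout and helper names here are mine, just to show the weighting):

Code:
def quincunx_resolve(samples, x, y):
    # samples[y][x] = (center_sample, corner_sample) for each pixel; the
    # corner sample sits on the pixel's top-left corner and is shared with
    # the pixels that touch that corner.
    h, w = len(samples), len(samples[0])

    def corner(cx, cy):
        cx = min(max(cx, 0), w - 1)       # clamp at the image border
        cy = min(max(cy, 0), h - 1)
        return samples[cy][cx][1]

    center = samples[y][x][0]
    corners = (corner(x, y), corner(x + 1, y),
               corner(x, y + 1), corner(x + 1, y + 1))
    # 5-tap quincunx kernel: 1/2 centre + 4 x 1/8 corners. The corner taps
    # reach into neighbouring pixels, so edges get smoothed but textures and
    # text get low-pass filtered too -- hence the blur.
    return 0.5 * center + 0.125 * sum(corners)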
 
I don't care if it is CPU-bound. I don't believe there's a single graphics card setup, SLI or CrossFire, that can run Oblivion at fully maxed settings at 1600x1200 or higher outdoors in a fight. I'm hoping this card can, or at least at 1280x1024.
 
Brahmzy said:
Silly statement. My 3.6GHz E6600 should do just fine.

Yeah, it should run fine for me too; I have the same CPU at the same speed as yours.

Quick question: when are they coming out with GDDR4?
 
Haste266 said:
not even worth upgrading until there is a CPU available that can max it out...

Baloney. For those of us who play games at high res with max IQ, any fast modern dual core will be plenty to enjoy the increased speed the G80 will offer... yeah, those poor guys playing CS 1.6 at 800x600 don't need anything more, but I digress... :D
 
Agreed.

I'll eat my hat if the 8800GTX is CPU-limited when rendering to a 24" LCD at 1920x1200.
Unless you're pairing it with a Celeron D. :p

The unqualified assertion that the 8800GTX is CPU-limited is patently ridiculous. :rolleyes:
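
For anyone who wants to settle the CPU-limited argument with their own numbers instead of assertions, the classic test is to rerun the same scene at a much lower resolution and see whether the framerate moves (the figures and the 10% cut-off below are made up, just to show the rule of thumb):

Code:
def likely_bottleneck(fps_high_res, fps_low_res, tolerance=0.10):
    # If dropping the resolution barely changes the framerate, the GPU was
    # never the limit: you're CPU/engine bound. If it jumps, you're GPU bound.
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return "CPU/engine bound" if gain < tolerance else "GPU bound"

print(likely_bottleneck(62, 65))    # barely moved at low res -> CPU/engine bound
print(likely_bottleneck(34, 110))   # big jump at low res -> GPU bound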
 
Inq said:
With programmable units, the traditional pipeline died out, but many hacks out there are still using this inaccurate description.

He called me a hack :(

Well, my plan is still to jump on the suckers who rush to eBay to sell their X1950s/X1900s/7900s/7950s.

And did this dude tell us what the odd chip with the odd memory bus is for?
 
MrWizard6600 said:
He called me a hack :(

Well, my plan is still to jump on the suckers who rush to eBay to sell their X1950s/X1900s/7900s/7950s.

And did this dude tell us what the odd chip with the odd memory bus is for?

RAMDAC/HDCP
 
LOCO LAPTOP said:
Yeah, it should run fine for me too; I have the same CPU at the same speed as yours.

Quick question: when are they coming out with GDDR4?

Probably next year, when R600 is out.
 