9800GTX scores 14,000 in 3DMark 06

Shame on nVidia for using the GTX name on something that scores 14k with a quad.
 
Did they? They still own the $200-400 segment, and the 9600 GT already beats the HD 3870. They have it pretty much covered at the moment.

What I saw in the benchmarks is the opposite: the 9600GT is slightly faster overall than the HD 3850 but slightly slower than the HD 3870. It's a great feat for nVidia, but ATi has already lowered the HD 3870's price, which will put more pressure on nVidia's offering. Also bear in mind that the chip is still bigger than the HD 3870's, so the profit on that GPU will still be less than ATi's.
 
After going over some data, it seems the 9800GTX was not meant to be the new GeForce 8800GTX; the 9800GX2 is.
nVidia expects the 9800GX2 to be 30% faster than the 8800 Ultra.
I think the key GTX part of 9800GTX puts a wrinkle in that train of thought. ;)
 
I'm getting 15k with my GTS512 and a Quad.

If the 9800GTX ends up being better, hopefully they get it out before my Step-Up period ends in a couple months.
 
Get them to install RivaTuner and OC the core right up to 750. If my 8800GTS 512 can do it, this "new" 9800GTX can do it.
 
Well, one major difference between the GTS512 and the "9800GTX" is that it will have that extra SLI finger. That's worth another 100 bucks, right?....right?
 
Really sad.

Stock E8400 and Stock 8800 GTS (G92) = 12700 in 3DMark06...
 
You know, Nvidia wouldn't be getting nearly as much flak if they had just called these the 8900 series (starting with the release of the G92-based 8800GT and 8800GTS).

That's all this new card is really, an 8900GTX.


Actually I think it should be called the 8700GTX, seeing as how it will be slower than an 8800GTX at the resolutions everyone buying high-end cards plays at :D
 
Isn't the 9800 series coming out with the supposedly unlocked A3 revision of the G92 GPU? GPU-Z says their G92 is the supposedly locked A2 revision. I'm waiting for official numbers; these don't seem right to me.
 
Yeah, my Crossfired 3870s score more than 14,014, and that's without a quad core. Not too impressive for something that is supposedly the successor to the 8800 GTX.

Crossfired 3870s is TWO cards; this is ONE card. Not to mention 3DMark awards artificial points for things that don't even help framerates: a few hundred for a quad core, a few thousand for SLI or Crossfire.

All that matters in 3DMark are the framerate numbers it spits out, yet nobody ever compares them.
 
12660 3dmarks with rig in sig, video card @ stock settings.

13132 3dmarks with rig in sig.

A quad and a stock 9800GTX = 14000. That card has a much higher stock memory clock, a core that's 5MHz faster at stock, and a quad at 3GHz.

I'm stepping up to either a 9800GX2 or a 9800GTX... can't make a decision without any real specs/reviews! My step-up expires at the beginning of May; maybe something new in April? If not, I'll step up.
 
Wait a minute... why is everyone freaking out over a rumor? Also, I think Nvidia is realizing that more efficient architecture overcomes raw speed. The G94 in the 9600GT has something the G92 doesn't -- incredibly efficient data compression. At least double the shader performance for the 9800GTX and increase performance by a good... 40%? If a 9600GT performs within 5% to 10% of the 3870, and the 3870 performs within 5% to 10% of the 8800GT, then the 9800GTX should, at minimum, be 30% faster than the 3870 across the board. This isn't incorporating all the advantages Nvidia has in their designs. This puts the 9800GTX in the $349 price bracket. Even if that is the MSRP for this card, I have a feeling the HD3870 X2 will push it down into the sub-$320 area.
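The compounding in that estimate can be sketched in a few lines (a rough back-of-the-envelope check, not a benchmark; the percentages are the post's rumored figures and all names here are mine):

```python
# Back-of-the-envelope check of the "at minimum 30% faster than the 3870" claim.
# The percentages are the rumored figures from the post above, not measurements.

def relative(base, factor):
    """Scale a relative-performance figure by a multiplicative factor."""
    return base * factor

hd3870 = 1.00                     # HD 3870 as the reference point
gt9600 = relative(hd3870, 0.90)   # 9600GT within 5-10% of the 3870 (pessimistic end)
gtx9800 = relative(gt9600, 1.40)  # assumed ~40% uplift from doubled shader hardware

print(round(gtx9800, 2))  # -> 1.26, i.e. roughly the quoted ~30% floor
```

Even at the pessimistic end of both ranges the chain lands near the 30% figure, which is why the estimate reads as a floor rather than a prediction.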
 
3dmark isn't a game, but it is a pretty good indicator, especially when we're dealing with architecture that is pretty much identical to the previous generation.

You know when you buy the full version of 3D Mark it actually does have a game built in, and it's a hardware killer. Absolutely hands down the best game play I have ever experienced in my life too. There's nothing like that feeling I get piloting that little robot thru the canyon maze shooting everything that moves. And the graphics make Crysis look like a bitch. It's everything you'd ever need to check out your rig's performance.

:D
 
Those are pretty craptastic leaked results... I have seen many single-card scores higher than that with overclocking, way higher in some cases. Which begs the question: how will it OC? Maybe it will OC like a beast? :) Regardless, it would seem (if the source is accurate) that this will be no marvel over the 8800 series, which is a bummer! But I guess we'll have to wait and see how valid this info is. If it is valid, I may not be upgrading to the 9800GTX after all. 2x GX2 SLI might still be an interesting upgrade, I guess, or CF'd 3870X2s, and a dedicated 20A line for my gaming rig, lol. Plus I can ditch this shitty NV chipset too. lol, j/k... but I would like to go with an Intel chipset again.
 
Wait a minute... why is everyone freaking out over a rumor? Also, I think Nvidia is realizing that more efficient architecture overcomes raw speed. The G94 in the 9600GT has something the G92 doesn't -- incredibly efficient data compression. At least double the shader performance for the 9800GTX and increase performance by a good... 40%? If a 9600GT performs within 5% to 10% of the 3870, and the 3870 performs within 5% to 10% of the 8800GT, then the 9800GTX should, at minimum, be 30% faster than the 3870 across the board. This isn't incorporating all the advantages Nvidia has in their designs. This puts the 9800GTX in the $349 price bracket. Even if that is the MSRP for this card, I have a feeling the HD3870 X2 will push it down into the sub-$320 area.

All G92 cards have the new data compression technology that's used to eliminate the bottleneck that would normally be created by the 256bit memory bus. http://www.hardforum.com/showpost.php?p=1031972013&postcount=1

The most probable explanation for the 9600GT competing so well with the 8800GT is that the games we have now are not shader limited to a large degree.
 
My only complaint is that G92 wasn't labeled as the "9000" series from the beginning. It is a bit disappointing that the 384-bit bus is gone for good, but you have to understand that making 384-bit bus cards is just too damn expensive.

As for "new compression technologies," don't get excited. I can tell you with certainty that everything that can be compressed inside the video card has already been compressed (textures, Z-buffer, environment maps, etc), and that this improvement is incremental at best. You typically see large performance gains when you introduce compression (the easy 90%), but incremental changes usually only bring small performance improvements (the hard 10%).

In the meantime, I don't expect card makers to release a new high-end GPU until GDDR5 is available in-quantity. They really can't justify it with current memory bandwidths.
 
All G92 cards have the new data compression technology that's used to eliminate the bottleneck that would normally be created by the 256bit memory bus. http://www.hardforum.com/showpost.php?p=1031972013&postcount=1

The most probable explanation for the 9600GT competing so well with the 8800GT is that the games we have now are not shader limited to a large degree.

Which is true to a great extent, and sadly means 9600GT owners are going to be screwed if and when that changes and they find themselves in need of more stream processors.
 
It's like nVidia is leaving themselves open for ATI to kick them in the balls...

Then again maybe they're doing this as part of a larger plan. I'll just hold my opinion until [H] gets ahold of the new cards and shows us what they're actually capable of.
 
It's like nVidia is leaving themselves open for ATI to kick them in the balls...

Then again maybe they're doing this as part of a larger plan. I'll just hold my opinion until [H] gets ahold of the new cards and shows us what they're actually capable of.

QFT
 
I'll wait till [H] reviews it, and then I'll worry about whether it's actually worth it.
Rumors are just that!
 
I've been waiting for the 98xx's and this is a little discouraging... as everyone else says, let's wait and see what happens with some other proven results...
 
IIRC my 8800GTX gets 16000+ in 3DMark06 (650/1050) - don't quote me on that... I'm prolly wrong...

Are we ever going to be able to play Crysis in DX10/Very High/1680x1050 at more than 18-26 fps????? :(
 
How credible are pre-release results like this? The performance on that 9800GTX seems more like something that should be a 9800GT.
 
It is a bit disappointing that the 384-bit bus is gone for good, but you have to understand that making 384-bit bus cards is just too damn expensive.
And I suppose the successor to the Lamborghini Reventón will only have a V8 engine, 'cause a V12 is just too expensive... ???

They'll save the 384-bit bus for the 9900GTX.
 
It's like nVidia is leaving themselves open for ATI to kick them in the balls...

Then again maybe they're doing this as part of a larger plan. I'll just hold my opinion until [H] gets ahold of the new cards and shows us what they're actually capable of.


No company would have it in their plans to get "kicked in the balls" by their direct competitor. ;) This is the result of NV pulling on the R&D reins in mid 07, and because of that ATI gets back in the game.
 
No company would have it in their plans to get "kicked in the balls" by their direct competitor. ;)

I meant it more as nVidia purposely holding back to see if ATI will try to one-up this release, which is actually to bait ATI into showing their hand first.

or something...
 
nVidia doesn't need to release more powerful parts right now. Save for Crysis, Call of Juarez, and maybe a couple of others, my 9-month-old 8800 Ultra and stock Q6600 are tearing up all other games at 1920x1200. Hardware is outstripping software right now. By this time next year it might be really out of whack: 8-core Nehalems and G100s with three times the power of an 8800 Ultra?

So things are a little slow right now with hardware. That's okay, this generation of hardware is still plenty fast and it gives us all a little more time to save up for the killer stuff coming in the next year!
 
nVidia doesn't need to release more powerful parts right now. Save for Crysis, Call of Juarez, and maybe a couple of others, my 9-month-old 8800 Ultra and stock Q6600 are tearing up all other games at 1920x1200. Hardware is outstripping software right now. By this time next year it might be really out of whack: 8-core Nehalems and G100s with three times the power of an 8800 Ultra?

So things are a little slow right now with hardware. That's okay, this generation of hardware is still plenty fast and it gives us all a little more time to save up for the killer stuff coming in the next year!

Reminds me of the time when the 9800 Pro was king. I bought a 9700 Pro a while before that, and that card was able to play almost everything at high details until some time after the X800/6800 series came out. That card lasted quite a long time.
 
Which is true to a great extent, and sadly means 9600GT owners are going to be screwed if and when that changes and they find themselves in need of more stream processors.

The other shader processors on the 8-series cards can probably be used to run the PhysX API now that Nvidia owns it. Remember all those reports about them bringing Ageia's tech to 8-series cards.

Having the option to give up 10-15% performance (the difference between an 8800GT and 9600GT) for accelerated physics would be interesting.
 
Are we ever going to be able to play Crysis in DX10/Very High/1680x1050 at more than 18-26 fps????? :(

Already can with a 3870X2 and 08123a drivers, though 18 fps isn't very playable...

Crysis is screwed... I don't think any card in the next couple of years will be able to play it smoothly on Very High. It reminds me of Oblivion, another great-looking game for its time, but due to poor programming/optimization it ran horribly, and even with today's hardware it still struggles at times. These games run badly due to bad game engines, not a lack of power from graphics cards.
 
Reminds me of the time when the 9800 Pro was king. I bought a 9700 Pro a while before that, and that card was able to play almost everything at high details until some time after the X800/6800 series came out. That card lasted quite a long time.

Yes, the 9700 might very well be the best card ever in terms of its longevity. ATI's best work ever in my opinion.
 