R600 Benchmarks

anybody notice this?
If you just bought a GeForce 8800, especially the GTX version, you might want to return your card on January 22nd when ATI R600 cards become available throughout the US at 630 USD (MSRP), which roughly equals the current price for a GeForce 8800GTX.
January 22nd... I've only heard of Jan 20th, or Feb 14th

They also updated this article:
This article is authentic. Our R600 sample is an RTM sample used in the MS CERT process for driver validation (especially for Vista). We are not publishing pictures of the card itself right now, as the card contains major ID tags that we can not remove yet for source protection. We will add pictures of the card itself once we can remove these tags.

:rolleyes:
 
plus this one

Update - Oblivion Benchmark is TSSAA/AAA quality


Due to upcoming questions regarding the frame rates on our Oblivion test: We ran the Oblivion Benchmark with activated TSSAA (Transparency Super Sampling Anti-Aliasing) on nVidia and AAA (Adaptive Anti-Aliasing) on the ATI cards because Oblivion profits substantially from these advanced AA modes.
 
snowysnowcones said:
anybody notice this?

January 22nd... I've only heard of Jan 20th, or Feb 14th

They also updated this article:


:rolleyes:

Jan 20 is a Saturday...so I doubt Jan 20 is true.
 
Update - Authenticity Of This Article

This article is authentic. Our R600 sample is an RTM sample used in the MS CERT process for driver validation (especially for Vista). We are not publishing pictures of the card itself right now, as the card contains major ID tags that we can not remove yet for source protection. We will add pictures of the card itself once we can remove these tags.

1) They could just take pics and use Photoshop or any other image-manipulation tool to blur out or draw over any ID tags. What a BS excuse, which also leads me to think this is just a hoax/BS.

2) It beats Nvidia's cards by only a tiny fraction on most tests I saw while browsing through the first few pages. This card will come with a hefty price tag upon release, and the 8800 will drop in price to remain competitive. So even if there were a grain of truth to this, who would pay a bucket load for a card which will release with the usual high price tag when they can get one almost as good for a much better price?

3) I think Nvidia will be very close to a refresh range of cards by the time the AMD counterpart ships, which is likely to beat its ass into the ground. The refresh stage of the product life cycle has been getting more and more important with every generation; with the 7xxx generation there were several massive upgrades to the product range which vastly increased performance and gave good performance/price ratios (the 7950GX2, anyone?)

But yeah, probably BS :)
 
Frosteh said:
3) I think Nvidia will be very close to a refresh range of cards by the time the AMD counterpart ships, which is likely to beat its ass into the ground. The refresh stage of the product life cycle has been getting more and more important with every generation; with the 7xxx generation there were several massive upgrades to the product range which vastly increased performance and gave good performance/price ratios (the 7950GX2, anyone?)

But yeah, probably BS :)

Yep, a faster-core G80 version is expected in Feb. I was a little hesitant towards that thought when I found out R600 was delayed until March/April (since nV wouldn't launch a faster card before the competition even releases their parts), but hearing that an R600 version *might* be out in Jan. gives more credibility towards an 8850GTX (not sure if that'll be the name) in Feb., which is right when I'm buying my new system.

Also, let's not forget...
If Level505's comments seem a little too pro-ATI, don't be too surprised. When asked if the site was affiliated in any way to ATI or AMD, the owner replied to DailyTech with the statement that "two staff members of ours are directly affiliated with AMD's business dev division."

So I'd still be cautious with those numbers provided. You know how PR likes to embellish things.
 
oblivionwr2.jpg


for those wondering about oblivion (below is TR).
 
I think it could be real, not as fake as some other things that I have seen. Real or not, I can't wait to see ATi/AMD's new card :p .
 
Well, I can tell you that both the 06 and 05 numbers are too low. At 1600x1200, 4xAA, 16xAF, with HQ settings and no OC on the cards, with a QX6700 at 3.2 GHz on a 680i, my system does 7740 on 06 and 12300 on 05. The system specs they used are identical to mine, so I do not think they really tested anything. The story/excuses they mention seem to be possible, but, like prior posts indicate, tags and IDs can be blurred out quite easily.

Also, why are they using old versions of 3DMark05 and 06? If they ran at 1600x1200, you would think they had the Pro version and would consequently have the latest versions. The whole thing feels like a scam.
 
phez said:
for those wondering about oblivion (below is TR).

The performance difference between TRMSAA and TRSSAA is HUGE. Below they use TRMSAA, above TRSSAA. Besides this, Oblivion has so many different areas where you can bench with so many different outcomes. You cannot take a graph from one site and lay it next to a graph from another site and say "look this one must be bullshit".
 
Someone on B3D Forums just pointed out how they're running Crossfire on an nVidia chipset.
The following computer setup was used for testing:

* Processor: Intel Kentsfield Core 2 Extreme QX6700 (Quad Core) 2.66 GHz, overclocked to 3.2GHz
* RAM: OCZ Titanium 2048 MB (2×1GB) DDR2-800 PC2-6400 Dual Channel
* Motherboard: EVGA 122-CK-NF68-AR LGA 775 (Socket T) nVidia nForce 680i SLI
* Harddrive: 2x Seagate Barracuda 7200.9 ST3500641AS 500 GB @ 7200 RPM, SATA 3.0 Gb/s on RAID 0
* Dedicated Sound: Creative SoundBlaster X-Fi Platinum

* ATI R600: (Our Sample)
* nVidia GeForce 7950GX2: XFX PVT71UZDF9 GeForce 7950GX2 1 GB GDDR3 PCIe x16 Xtreme
* nVidia GeForce 8800GTX: XFX PVT80FSHF9 GeForce 8800GTX 768 MB GDDR3 PCIe x16
* ATI Radeon X1950 XTX CF: 2x Connect3D X1950 XTX 3060 512 MB GDDR3 PCIe x16 C

Just more evidence of BS....
 
Kliter said:
Someone on B3D Forums just pointed out how they're running Crossfire on an nVidia chipset.


Just more evidence of BS....

VERY good find.. I'm surprised that I didn't catch that one lol.
 
phez said:
oblivionwr2.jpg


for those wondering about oblivion (below is TR).

Apple740 said:
The performance difference between TRMSAA and TRSSAA is HUGE. Below they use TRMSAA, above TRSSAA. Besides this, Oblivion has so many different areas where you can bench with so many different outcomes. You cannot take a graph from one site and lay it next to a graph from another site and say "look this one must be bullshit".


Exactly, and especially when they aren't even benching at the same settings...such as resolution (something that is actually really, really important). Nice try at a good post there, bud; your efforts are appreciated.
 
Kliter said:
Someone on B3D Forums just pointed out how they're running Crossfire on an nVidia chipset.


Just more evidence of BS....

Can't use that. The article does state they used a Foxconn 975 board for Crossfire tests.

For the test of our X1950 XTX CF-pair we used the same configuration as mentioned above, but exchanged the motherboard to a Foxconn 975X7AA (CF Edition).
 
Apple740 said:
The performance difference between TRMSAA and TRSSAA is HUGE.

Also worth mentioning is that they use AAA for the R600, but which one? Performance ADAA or Quality ADAA? Just like the difference with TRMSAA/TRSSAA the difference is HUGE.
 
Also worth mentioning (again) is that site and review appear to be totally bogus. :p
 
The Level505.com domain is registered to someone who used domainsbyproxy.com to purposely avoid having their name and address associated with the domain registration entry.

It's too shady. I, as well, call shenanigans.
 
I always use private registration with my domains; I don't get why anyone would want their information out there if they can help it.
 
Apple740 said:
The performance difference between TRMSAA and TRSSAA is HUGE. Below they use TRMSAA, above TRSSAA. Besides this, Oblivion has so many different areas where you can bench with so many different outcomes. You cannot take a graph from one site and lay it next to a graph from another site and say "look this one must be bullshit".

You're right, but you also can't call bullshit without reading the articles.

Tech Report said:

edit: misread the CSAA part, so i guess this is weird. anyone want to explain this one to me?
 
That's the biggest pile of S*it if I've ever seen one. I mean, the ATI (AMD) R600 had better outperform the 8800GTX considering they had an extra 3-4 months to polish it up, but BS sites like this need to be ignored and nothing more, until some real evidence of these cards surfaces.

It usually goes like this: nVidia beats ATI to the market with the "fastest card," two months later ATI releases a card which outperforms it by 5%, then nVidia releases another revision of their flagship card which is overclocked and has minor tweaks in order to have the "fastest card"... it's all business and marketing...
 
Apple740 said:
The performance difference between TRMSAA and TRSSAA is HUGE. Below they use TRMSAA, above TRSSAA. Besides this, Oblivion has so many different areas where you can bench with so many different outcomes. You cannot take a graph from one site and lay it next to a graph from another site and say "look this one must be bullshit".

Except that you're wrong, my friend. Damage used TR-SSAA in that bench, as you can see here:

"I wanted to see what I could do to push the G80 with Oblivion, so I left the game at its "Ultra quality" settings and cranked up the resolution to 2048x1536. I then turned up the quality options in the graphics driver control panel: 4X AA, transparency supersampling, and high-quality texture filtering."

Now, while you're right in that there are many areas to bench in Oblivion, the fact stands that Damage pulled down 54.7 on average at 2048x1536 and these so-called R600 benches show G80 pulling down a meager 19.5 at 1600x1200. No matter where you're testing, that's a MAJOR difference. R600 benchmarks = bullshit? I think the answer to that question is an emphatic yes!
 
Ardrid said:
Now, while you're right in that there are many areas to bench in Oblivion, the fact stands that Damage pulled down 54.7 on average at 2048x1536

I own a 8800GTX myself. @2048x1536/4xTRSSAA/16xAF(HQ) i average ~25 on a horseback riding in a forest, with lots of grass etcetera.
 
Again, that just supports earlier statements that where you benchmark in Oblivion matters. But even your ~25 at 2048x1536 is more than those ridiculous R600 benches at 1600x1200. And as I'm sure you can see, you're not going to pull down less at a resolution two steps down with comparable settings, no matter where you bench. But yes, stick a fork in this because it's done. Everyone on B3D has pretty much shot this mess down.
 
Speed is not the only criterion. It is unfair to compare two video cards when the IQ is different (X1900XTX vs 7900GTX is the perfect example).
Image quality, price, availability, power consumption, noise level and driver stability are also very important.
Anyway, my eVGA 8800 GTX will arrive Tuesday. The only thing that I fear is the driver stability...
I will upgrade again in 6 months, and DX10 performance should matter much more by then.
 
RadXge said:
Speed is not the only criterion. It is unfair to compare two video cards when the IQ is different (X1900XTX vs 7900GTX is the perfect example).
Image quality, price, availability, power consumption, noise level and driver stability are also very important.
You are absolutely right, and by the looks of things, nVidia will hold the IQ and power draw advantage this generation.

RadXge said:
Anyway, my eVGA 8800 GTX will arrive Tuesday. The only thing that I fear is the driver stability...
There are no driver stability problems unless you are using the unreleased and therefore unsupported Windows Vista. The 8800GTS I played with ran FLAWLESSLY on Windows XP; it ran FEAR at 1680*1050, maximum everything, 4xAA, 16xHQAF, a solid 54FPS, and rock-solid stability.
 
InorganicMatter said:
You are absolutely right, and by the looks of things, nVidia will hold the IQ and power draw advantage this generation.


There are no driver stability problems unless you are using the unreleased and therefore unsupported Windows Vista. The 8800GTS I played with ran FLAWLESSLY on Windows XP; it ran FEAR at 1680*1050, maximum everything, 4xAA, 16xHQAF, a solid 54FPS, and rock-solid stability.

I'm not the only person sometimes seeing random BSODs using the 97.44s. Before that, up-and-down performance using the 97.02s. Rock-solid? Hardly.
 
I have two systems with the 8800 cards and I have seen no issues aside from the normal game bugs that can crop up. If you are getting BSODs, you should look elsewhere like your PSU, RAM, etc.
 
Ardrid said:
ridiculous benches at 1600x1200.

If you really search for it, you can get an 8800GTX totally on its knees @ 1600x1200.

Look and shiver:



Saved game here

So yes, the Level505 benchmarks could be valid.
 
InorganicMatter said:
You are absolutely right, and by the looks of things, nVidia will hold the IQ and power draw advantage this generation.


There are no driver stability problems unless you are using the unreleased and therefore unsupported Windows Vista. The 8800GTS I played with ran FLAWLESSLY on Windows XP; it ran FEAR at 1680*1050, maximum everything, 4xAA, 16xHQAF, a solid 54FPS, and rock-solid stability.
As an 8800GTS user, I can tell you this is untrue. My card is stable most of the time, but not all, and there are still some graphical issues in apps that need to be resolved.

I backed down from 97.44 to 97.02 drivers. It's a lot better, but I've still had issues with 3DMark crashing, and occasional graphics corruption in the games I've been running that I did not have with previous cards. 97.44 was far worse. In Oblivion, the tips of trees/plants would corrupt and stretch out into the sky, blocking out the sun. I had small lines in several games all the way across the screen, as if frames weren't being rendered properly, and texture corruption.

I still think the card is great; it's incredibly fast, and when it works right (the vast majority of the time) it looks great. But, I'd say nVidia's drivers are about 85% of the way there, and still have a ways to go. nVNews has some great screenshots in their forums of the issues users are experiencing that need to be resolved; worth checking out. To call things perfect, and/or perfectly stable would be premature.
 
HeavyH20 said:
I have two systems with the 8800 cards and I have seen no issues aside from the normal game bugs that can crop up. If you are getting BSODs, you should look elsewhere like your PSU, RAM, etc.


Right... the blame rests on the 8800 GTX, plain and simple.
 
LoneWolf said:
I backed down from 97.44 to 97.02 drivers.

So true. The 97.44s are definitely a step back. I don't see blue screens using the 97.02s but performance seems to vary sometimes. Blue screens from the 97.44s sometimes happen within minutes after exiting a game. It's always the nvidia display driver that causes the problem.
 
So the drivers still aren't completely ironed out? Damn.

Anyways, when I saw those "benchmarks", let's just say I haven't laughed that hard in a while.
 
Apple740 said:
If you really search for it, you can get an 8800GTX totally on its knees @ 1600x1200.

Look and shiver;



Saved game here

So yes, the Level505 benchmarks could be valid.

Is this the save point that is always talked about as bringing a GTX to its knees?
I'm getting mid 20s to low 30s at 1920x1080 with 8xQ AA, 16xAF, TR:MSAA.

Using TR:SSAA, performance drops to mid 10s, but I also see graphical anomalies, so for now I can't depend on that setting.

The article doesn't mention what AA settings, besides TR:SSAA, were used on the GTX. If 4xAA was used, my performance jumps back up to mid 20s to low 30s. Sitting 6 feet from my HDTV, I can't tell the difference between AA modes.

I do have some of the higher IQ packs installed, though. I also changed the grass settings (slightly thinner) to increase visibility. I'd like to test the more intense areas, so if anyone has game saves, fork them over please :D
 
I won't believe that review until I see pics of the actual card with a computer in the background.
 
Based on the horrible placement of their google ads, the site's sole purpose is to generate revenue for the website owners by falsifying information.

Please remove the link from the front page of the [H] so we can stop sending traffic to these assholes.
 
chiablo said:
Based on the horrible placement of their google ads, the site's sole purpose is to generate revenue for the website owners by falsifying information.

Please remove the link from the front page of the [H] so we can stop sending traffic to these assholes.

Agreed. Just copy and paste everything. Yeah, that means we'll be forced to look at it, but at least they won't get any hits.
 
Definitely fake; we all know that the X1900 cards in CrossFire are destroyed by the 8800GTX, yet those benchmarks say otherwise. BTW, I'm no fanboi, but I do prefer ATI.
 
Hey, for all those saying it's fake: there's been an update to the 1st page.


Update - Authenticity Of This Article

This article is authentic. Our R600 sample is an RTM sample used in the MS CERT process for driver validation (especially for Vista). We are not publishing pictures of the card itself right now, as the card contains major ID tags that we can not remove yet for source protection. We will add pictures of the card itself once we can remove these tags.


So there you have it, it's not fake. ;)
 