R520 does 10k+ in 3Dmark'05...

Status
Not open for further replies.
I should fucking well hope so; it'd be a pretty poor showing on ATI's part if their brand spanking new GPU couldn't trounce one that was a year old.
 
Gotta love this competition...what a great time to be building a second computer :D
 
Gotta love it even more when you want to buy all your components in a month and these new crazy parts won't quite be out yet, I bet. They'll be released soon after, though, and wreck the stuff I just bought :rolleyes:
 
Well, if two Ultras are bottlenecked by my FX-55, I can't imagine two R520s not being seriously bottlenecked even by a CPU that comes out in the next 6 months. I'd still take them, though :)
 
I don't trust the Inq, but if the numbers are true, that is very impressive. Wouldn't it be nice if that was with only 24 of the rumored 32 pipelines on the card :)

I am eagerly awaiting the release, and availability (biggest thing) of both of the next gen cards and am looking forward to seeing real numbers and real reviews. Until then I will take everything with a large boulder of salt.
 
Honestly sometimes I think the Inquirer's articles are written by chimps. The grammer is so bad.

Cool if true though :)

Shadow27
 
Hmmm it's Inq fodder. Fact? Fiction? Who knows but 'wowzers' if true........ :eek:
 
Honestly sometimes I think the Inquirer's articles are written by chimps. The grammer is so bad.

Thats what I love about the Inquirer. I can relate to them so much sometimes.
 
They were referring to the rumor flying around that the G70 is named the 7800 because it scores 7800 in 3DMark05. They were saying the G70 will score higher than the 7800 it was rumored to score.
 
When I am able to run 3DMark05 on a friend's computer with an R520 and see the score, it'll be QED, but until then it's BS.
 
Erasmus354 said:
They were referring to the rumor flying around that the G70 is named the 7800 because it scores 7800 in 3DMark05. They were saying the G70 will score higher than the 7800 it was rumored to score.

Haha. You mean the rumour they started? :rolleyes:
 
fiction....all fiction.

Actually I'd love to see a new card that can blaze scores like that....and also show comparable performance in games.
 
Well, they can't be sure that nVidia's offering won't beat that..damn Inquirer..but anyways...WOW hehe I might have to grab one... :)
 
We still don't know its clock speeds, but we know the score.

Yeah well, SLI'd 6800Us can do 14,000.... when run at subzero temps. Who knows what conditions this FUDO business is being run under. This is BS until Anandtech/xbit/Madshrimps has one in hand and benches it.
 
I don't know, but it just does not make any sense to me for Nvidia and ATI to release new cards that effectively double the speed of their current-generation cards. They have spent millions on R&D and production costs on the 6800 and X800 cards, and they have not even been around for a year yet.

Effectively, are they not cannibalizing sales of those products by releasing cards that are twice as fast? For example, when Nvidia and ATI released the GF4 and 9700, all subsequent cards were pretty much incremental speed bumps up until the X800 and 6800, which effectively doubled the performance of the previous generation of cards.

Now you have people saying that they are going to double performance again? While this will make the X800 and 6800 cheaper, sales would be substantially less than if an incremental speed bump was released. Why, you may ask? Price/performance ratios.
 
I really want to know the price of these cards. I got a feeling they might be on the high side though. :(
 
defiant said:
I don't know, but it just does not make any sense to me for Nvidia and ATI to release new cards that effectively double the speed of their current-generation cards. They have spent millions on R&D and production costs on the 6800 and X800 cards, and they have not even been around for a year yet.

Effectively, are they not cannibalizing sales of those products by releasing cards that are twice as fast? For example, when Nvidia and ATI released the GF4 and 9700, all subsequent cards were pretty much incremental speed bumps up until the X800 and 6800, which effectively doubled the performance of the previous generation of cards.

Now you have people saying that they are going to double performance again? While this will make the X800 and 6800 cheaper, sales would be substantially less than if an incremental speed bump was released. Why, you may ask? Price/performance ratios.

The X8xx series cards are still based on R300. While mildly modified over time, there have been no major alterations, and it's still the same core at heart. ATi has already recovered their R&D for this core.

I'm not sure about NV. ATi released the R300 core when NV was releasing the FX, then the 6xxx some time after that, so the 6xxx series has not been out nearly as long as the R300 core. For that reason I couldn't say whether NV has fully recouped their R&D costs for the 6xxx series.
 
Un4given said:
The X8xx series cards are still based on R300. While mildly modified over time, there have been no major alterations, and it's still the same core at heart. ATi has already recovered their R&D for this core.

I'm not sure about NV. ATi released the R300 core when NV was releasing the FX, then the 6xxx some time after that, so the 6xxx series has not been out nearly as long as the R300 core. For that reason I couldn't say whether NV has fully recouped their R&D costs for the 6xxx series.
It is possible that NVIDIA recouped their R&D because SLI really did take off more than a lot of people thought it would. There are a lot of people out there that bought 2 instead of 1. That helped a lot!
 
trudude said:
It is possible that NVIDIA recouped their R&D because SLI really did take off more than a lot of people thought it would. There are a lot of people out there that bought 2 instead of 1. That helped a lot!


Define "a lot"

You are simply not making any sense. High-end cards are ALWAYS the minority in total production. This is true for just about anything. You make it sound like everyone in the neighborhood is running SLI. SLI card setups range from $400-$1000 at retail prices. Not only is that high end, that is not appealing. In fact, I haven't seen anything saying SLI was causing Nvidia's cards to sell especially well; what I have been seeing is that the nForce 4 SLI boards are selling well. That says nothing, since it's basically the best and was for a time the only chipset combining AMD and PCI Express. Hell, I own one because it was my only choice! I won't be going SLI.
 
While these large speed bumps may cannibalize their sales, if they only incrementally improved and the other company blasted them out of the water, game over. Even with the FX fiasco, neither company has ever had a head-and-shoulders lead over the other (not counting SLI).
 
I'd be jumping for joy right now.....if I gave a damn about a benchmark :p What can it do in farcry @ 1600x1200 6xaa 16xaf...that's the info I'm looking for. =)
 
|0b0 said:
I'd be jumping for joy right now.....if I gave a damn about a benchmark :p What can it do in farcry @ 1600x1200 6xaa 16xaf...that's the info I'm looking for. =)
Well we can be pretty sure it's gonna kick HL2 in the arse, and probably Far Cry as well....my major question is if ATi will improve their openGL performance.
 
jebo_4jc said:
Well we can be pretty sure it's gonna kick HL2 in the arse, and probably Far Cry as well....my major question is if ATi will improve their openGL performance.

If you're referring to the doom 3 engine specifically, then yes.
 
FanATIc said:
Define "a lot"

You are simply not making any sense. High-end cards are ALWAYS the minority in total production. This is true for just about anything. You make it sound like everyone in the neighborhood is running SLI. SLI card setups range from $400-$1000 at retail prices. Not only is that high end, that is not appealing. In fact, I haven't seen anything saying SLI was causing Nvidia's cards to sell especially well; what I have been seeing is that the nForce 4 SLI boards are selling well. That says nothing, since it's basically the best and was for a time the only chipset combining AMD and PCI Express. Hell, I own one because it was my only choice! I won't be going SLI.
That is true that not everyone went out and bought 2 6800 Ultras, but NVIDIA prolly makes just as much profit when people buy 2 6600GTs. I know at least a dozen of my close friends who bought dual 6600GTs. I also know 3 others besides myself that bought 6800GTs. There are a lot more people out there than you think that went SLI with NVIDIA. Most of my friends are not even hardcore gamers.
 
jebo_4jc said:
Well we can be pretty sure it's gonna kick HL2 in the arse, and probably Far Cry as well....my major question is if ATi will improve their openGL performance.
Performance has been improving....ATi is pretty much tied with nV in high-end D3.
 
I would not say tied, but only a bit below nowadays.
Even still, both cards (6800U and X850XTPE) are good enough to run DOOM3 at insane settings.
 
{NG}Fidel said:
I would not say tied, but only a bit below nowadays.
Even still, both cards (6800U and X850XTPE) are good enough to run DOOM3 at insane settings.
They run at equal settings....is that not equal? Yes, the nV will run the higher FPS, but any higher settings and it becomes unplayable.
 
They run at equal settings....is that not equal? Yes, the nV will run the higher FPS, but any higher settings and it becomes unplayable.

Faster is faster, period.
There is no exception.
 
jebo_4jc said:
Dude....those DNA drivers are 3rd party modded drivers. They are like 10 fps faster than the latest nvidia beta. 3rd party drivers don't count.

Huh - did you not see the 76.44 WHQL score? The DNA drivers are based off of that driver.
 
trinibwoy said:
Huh - did you not see the 76.44 WHQL score? The DNA drivers are based off of that driver.
Doh....I just looked for the latest nvidia driver and saw that it was slow.

Edit: Wait, why are all the subsequent drivers much slower? Was 76.44 a bug?
 
jebo_4jc said:
Doh....I just looked for the latest nvidia driver and saw that it was slow.

Edit: Wait, why are all the subsequent drivers much slower? Was 76.44 a bug?

Yeah it's strange that the 76.44 is so much faster than the 76.45 in HL2. We'll see if the next official drivers (whenever they come out) have similar improvements.
 
So the R520 is on par with 6800 SLI?
If all this is true (and please note that I don't find 3DMark05 useful), then I have this conclusion for you.
Taken from this thread: Post your 3dMark 2005 scores here. (Link to list in 1st post)

1. Gundammit: 12,020
2. HeavyH2O: 11,715
3. Zxcs: 11,693
4. Eva2000: 11,574
5. spaceman: 11,453

And there are 17 more people on that list that are 10K+, so I would say this invalidates any performance arguments against SLI, as those 23 people (just on this board) have been having this kind of performance for a while...

Terra - The performance would-be buyers of the R520 are dreaming of...
 
Terra, those are all overclocked cards and systems. Who knows if the one the Inq is claiming broke 10k is. You can't compare them.

Shadow27 said:
Honestly sometimes I think the Inquirer's articles are written by chimps. The grammer is so bad.

Cool if true though :)

Shadow27

Oh the irony..
 