ATI 6xxx GPU-Z and 3DMark: real or fake?


bear jesus

Limp Gawd
Joined
Aug 7, 2010
Messages
166
Just noticed this on hexus.net, sourced from a Chinese forum. So what do you guys think, real or fake?
Going by the GPU name in GPU-Z, it would be the card listed as "233,CAYMAN XT (6718),NI CAYMAN" in the Catalyst 10.8 drivers.

http://img.hexus.net/v2/pmason/ati/6000/gpuz-2.jpg

It would be nice if it were real, as 6.2 GHz GDDR5 on a 256-bit bus giving 204 GB/s :eek: sounds pretty nice... but it needs to be overclocked just for fun :p

Source: http://www.hexus.net/content/item.php?item=26184 | original Chinese forum source: http://we.pcinlife.com/thread-1498296-1-1.html
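
For anyone wanting to sanity-check the bandwidth figure, here's a rough calculation. The clock and bus width are just the numbers from the leaked shot, not confirmed specs, and note that ~204 GB/s actually lines up with a 6.4 Gbps effective rate rather than 6.2:

```python
# Rough GDDR5 bandwidth check using the leaked figures (assumed, not confirmed specs):
#   bandwidth (GB/s) = effective data rate (Gbps per pin) * bus width (bits) / 8

def gddr5_bandwidth_gbs(effective_gbps, bus_width_bits):
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_gbps * bus_width_bits / 8

print(gddr5_bandwidth_gbs(6.4, 256))  # 204.8 GB/s -> matches the ~204 GB/s in the shot
print(gddr5_bandwidth_gbs(6.2, 256))  # 198.4 GB/s
print(gddr5_bandwidth_gbs(4.8, 256))  # 153.6 GB/s, a stock 5870 for comparison
```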
 
I love how ATI never really bragged about how good this architecture was, yet blew everyone away and took 10% market share in under a year.

NVIDIA bragged about Fermi, and it failed miserably when compared to the 5 series on every count: power consumption, heat, and the fact that the 5 series had no seven-month delay.
 
Not really sure. The whole foreign language thing is making it hard to read the thread :p

I love how ATI never really bragged about how good this architecture was, yet blew everyone away and took 10% market share in under a year.

NVIDIA bragged about Fermi, and it failed miserably when compared to the 5 series on every count: power consumption, heat, and the fact that the 5 series had no seven-month delay.

Last I checked, Fermi is faster. Also, your post is 100% irrelevant to the thread.
 
It's real, and I am 99% sure.

Seems their redesign of the shaders is paying off.
 
According to W1zzard over at TechPowerUp (creator of GPU-Z), he can't see anything in the GPU-Z part of the image to suggest it has been faked, so if the 3DMark Vantage part is also genuine then things look very promising.

It makes me wonder if the core could possibly overclock like the 58xx cards, so up to/around 1 GHz depending on the chip. I think it would help ATI gain some more market share and put pressure on NVIDIA to drop prices, meaning the consumer wins :D
 
Not really sure. The whole foreign language thing is making it hard to read the thread :p



Last I checked, Fermi is faster. Also, your post is 100% irrelevant to the thread.


Actually it's quite relevant. Really, how much did ATI talk about the HD 5k series before it was released? How much has ATI talked about the HD 6k series?
Now, how long did NVIDIA sit there bragging about the Fermi cards before they were released? Let me give you a hint: way the hell before the HD 5k series was ever released.

While to an extent the Fermi card is faster, the performance-to-power ratio is easily won by ATI in single-card form.

The numbers look good though, if they are legit. Now the question is whether the retail card is going to ship with the 6.4 GHz GDDR5 memory clocks, and whether this is a sign that they might be using the 7 Gbps GDDR5 chips. If that's the case, then these things would be overclocking beasts, but I guess we won't know until ATI releases some numbers on the cards. (This is where I wish ATI would hype their cards a little sooner than they do.)
 
His stuff is legit; it seems the less a company says, the more they deliver.

Remember the 8800? Rumors around the mill were that the 8000 series wouldn't be unified shaders; it was going to be the old pixel/vertex setup from before. DX10 allows for that; it just requires a unified interface. Everything floating around was not from nVidia and pointed to a somewhat mediocre showing. Then the 8800 came out of the sky like a lightning bolt and completely redefined high-end graphics.

It just seems like hype and delivery are somewhat inversely related. Fermi was kind of a letdown: it performs fine but runs hot and was late to the game. They sure talked it up a lot, though.

I just find that when the companies are more tight lipped, they are working on something much bigger.

At any rate, I'll be interested to see how the 6000s do. I have no need at all for a new card... but that doesn't mean I might not get one anyhow :D
 
Hopefully this scales better than the 58xx series.
 
Actually it's quite relevant. Really, how much did ATI talk about the HD 5k series before it was released? How much has ATI talked about the HD 6k series?
Now, how long did NVIDIA sit there bragging about the Fermi cards before they were released? Let me give you a hint: way the hell before the HD 5k series was ever released.
That is because when ATI released the 58xx, the 48xx parts were still competitive with NVIDIA's. Their goal was to sell as many 48xx parts as possible before the new line was released. When NVIDIA released Fermi, they were already behind, had pretty much killed the 2xx parts, and all that was left in the high end was ATI. They were playing damage control, trying to get people to hold off on upgrading until Fermi was released, something ATI didn't have to bother with during the 58xx launch. If NVIDIA had been able to launch Fermi on time, you wouldn't have heard much from official channels until the release.
 
Actually it's quite relevant. Really, how much did ATI talk about the HD 5k series before it was released? How much has ATI talked about the HD 6k series?
Now, how long did NVIDIA sit there bragging about the Fermi cards before they were released? Let me give you a hint: way the hell before the HD 5k series was ever released.

While to an extent the Fermi card is faster, the performance-to-power ratio is easily won by ATI in single-card form.

The numbers look good though, if they are legit. Now the question is whether the retail card is going to ship with the 6.4 GHz GDDR5 memory clocks, and whether this is a sign that they might be using the 7 Gbps GDDR5 chips. If that's the case, then these things would be overclocking beasts, but I guess we won't know until ATI releases some numbers on the cards. (This is where I wish ATI would hype their cards a little sooner than they do.)

The topic is an alleged picture of an HD6xxx series card. The discussion is "is this picture legit?". ATI being quiet about the launch of the HD5xxx cards has NOTHING to do with that.
 
Not really sure. The whole foreign language thing is making it hard to read the thread :p

Last I checked, Fermi is faster. Also, your post is 100% irrelevant to the thread.

What the shit?

Fermi had 7 months on AMD... seriously, it should've kicked AMD in the ribs with that much time.


Actually it's quite relevant. Really, how much did ATI talk about the HD 5k series before it was released? How much has ATI talked about the HD 6k series?
Now, how long did NVIDIA sit there bragging about the Fermi cards before they were released? Let me give you a hint: way the hell before the HD 5k series was ever released.

While to an extent the Fermi card is faster, the performance-to-power ratio is easily won by ATI in single-card form.

Thank you. Not a big member of [H], so thanks.

That's my point: ATI never really pushed the 5k series, same with the 6k series. Yet NVIDIA were pushing Fermi for months before it was a real GPU, failing A1 silicon and following that with the failure of A2 silicon.

Yes, as you said, Fermi is faster in some things... but when I think of:

188 W versus ~275 W
~50 °C versus ~90 °C

it's not really a comparison. NVIDIA released a GPU that uses close to 50% more power and runs heaps hotter - that's not competition - it's a release to combat a product they knew they couldn't beat.


Hopefully this scales better than the 58xx series.

Do you mean in CrossFire? With 200 GB/sec I think it will. Hopefully CrossFire itself is improved by then; I wouldn't mind an upgrade to CrossFire, maybe one which can COMBINE memory across the cards.

Imagine 2 x 1GB 6870s that could pool memory through CrossFire instead of effectively using 1GB total - this would be a HUGE win for AMD.


That is because when ATI released the 58xx, the 48xx parts were still competitive with NVIDIA's. Their goal was to sell as many 48xx parts as possible before the new line was released. When NVIDIA released Fermi, they were already behind, had pretty much killed the 2xx parts, and all that was left in the high end was ATI. They were playing damage control, trying to get people to hold off on upgrading until Fermi was released, something ATI didn't have to bother with during the 58xx launch. If NVIDIA had been able to launch Fermi on time, you wouldn't have heard much from official channels until the release.

When ATI released the 48xx parts, NVIDIA had the huge golden egg of the 2xx series, but constantly renaming it isn't how you do business - that is how NVIDIA fell SO far behind AMD.

This is why they discontinued the high-end 2xx products: obviously they thought Fermi would kick ass, but when it failed A1 and then A2 silicon, surprisingly the 2xx was put back into manufacturing for a while until the low-yield Fermi was released.

NVIDIA seem to be in hard times - but you only see it when you take yourself out of the equation, look at the situation from the outside, and put all the puzzle pieces together...
 
I have to laugh; I was only curious about some [H] opinions on whether people think the GPU-Z and Vantage shot was fake or real :p

Although I admit I hope that CrossFire scaling is better with the 6xxx cards, the fact that the 5770s scale better than the 58xx cards has me curious whether memory bandwidth could be holding the 58xx cards back. I guess if they do use the 7 Gbps chips then it would be easy to do some testing, if they can be taken all the way to 7 Gbps.

I admit I'm just kind of excited about the 6xxx release, both for the cards themselves and for the effect on 5xxx and GTX 4xx prices. With each generation I never know which side I'm going with until I'm ready to buy, and then I just buy from whoever gives me the most power within my budget, so having this many cards and generations out at once should mean I get more power for my money :D
 
I have to laugh; I was only curious about some [H] opinions on whether people think the GPU-Z and Vantage shot was fake or real :p

Although I admit I hope that CrossFire scaling is better with the 6xxx cards, the fact that the 5770s scale better than the 58xx cards has me curious whether memory bandwidth could be holding the 58xx cards back. I guess if they do use the 7 Gbps chips then it would be easy to do some testing, if they can be taken all the way to 7 Gbps.

I admit I'm just kind of excited about the 6xxx release, both for the cards themselves and for the effect on 5xxx and GTX 4xx prices. With each generation I never know which side I'm going with until I'm ready to buy, and then I just buy from whoever gives me the most power within my budget, so having this many cards and generations out at once should mean I get more power for my money :D


I'd say the GPU-Z shot is most likely legit, but the Vantage test is a little hard to validate.

But yeah, I agree, the 6k series looks to be shaping up pretty well. Now if only they would fix the CrossFire bottleneck in the drivers.
 
Do you mean in CrossFire? With 200 GB/sec I think it will. Hopefully CrossFire itself is improved by then; I wouldn't mind an upgrade to CrossFire, maybe one which can COMBINE memory across the cards.

Yes, I mean CrossFire. Why the hell do the 5770s scale better than the 58xx's? F'ing ATI drivers are F'ing over users who bought higher-end ATI cards. They need to fix the software so the hardware doesn't suck, because ultimately the drivers will make or break the card: either allowing its full potential or hindering it.

I don't think bandwidth has anything to do with CF scaling; it's all down to the drivers.
 
Real or not, I don't know anyone that plays 3DMark :D. For myself, I would want some other benches (preferably real-world ones); at this point it's all speculation and guesses. I do agree that if the next gen is really good, ATI will probably keep their mouths shut until the last moment. Right now they are still selling their cards at MSRP.
 
This always seems to be how it starts. Some Chinese website leaks a GPUz and 3dMark screenshot. Then come graphs of games benchmarked against a random smattering of other cards and resolutions (these graphs will be highly debated and ultimately found to be very close to the actual performance). Core specs will then be rumored and debated.

An internal slide will then be "leaked" showing some real info about the core...

All I know is, we're very close to some new GPUs to play with, debate, and troll over! :)
 
Yes, I mean CrossFire. Why the hell do the 5770s scale better than the 58xx's? F'ing ATI drivers are F'ing over users who bought higher-end ATI cards. They need to fix the software so the hardware doesn't suck, because ultimately the drivers will make or break the card: either allowing its full potential or hindering it.

I don't think bandwidth has anything to do with CF scaling; it's all down to the drivers.

After checking out the specs of the 5770 again I think you are right; I forgot the 5770 has a 128-bit bus and almost exactly half the memory bandwidth, along with half of everything else.

The important question really is how the drivers can work so well with the 5770 and not the 5870; I can't see any logical reason. I admit I'm just glad I don't intend to use CrossFire or SLI any time in the near future (at least 8 months), unless I get an NV Surround or Eyefinity setup.

Anyway, back to the point: I will be eagerly watching for further leaks until there is some solid info from ATI, as I'm really interested in the 58xx, 68xx and GTX 460 (the 460s because if I do go with one I know they scale well and handle NV Surround OK). I just want everything to be out and priced so I can hopefully get a great GPU at a good price :D
 
Weird how people take any chance to blow a few extra watts of power and a bit of heat out of proportion... I really hope the people who keep acting like it's the end of the world are using fluorescent light bulbs... actually, I hope they are using netbooks for the power savings.

I'm really looking forward to the next generation of cards, but seriously, heat/power has little effect on me unless it causes problems, which none of the Fermi cards have run into.
 
So for those of us that don't use 3DMark, how would that score compare to a 5870 or 480?
 
After checking out the specs of the 5770 again I think you are right; I forgot the 5770 has a 128-bit bus and almost exactly half the memory bandwidth, along with half of everything else.

The important question really is how the drivers can work so well with the 5770 and not the 5870; I can't see any logical reason. I admit I'm just glad I don't intend to use CrossFire or SLI any time in the near future (at least 8 months), unless I get an NV Surround or Eyefinity setup.

Anyway, back to the point: I will be eagerly watching for further leaks until there is some solid info from ATI, as I'm really interested in the 58xx, 68xx and GTX 460 (the 460s because if I do go with one I know they scale well and handle NV Surround OK). I just want everything to be out and priced so I can hopefully get a great GPU at a good price :D

With anything under, say, an i7 930 at 2560x1600, two 5870s will be somewhat CPU limited.
At 25x16 I've had 3870 CF, 4870 CF, 4890 CF and a 5970 all scale well.
 
After checking out the specs of the 5770 again I think you are right; I forgot the 5770 has a 128-bit bus and almost exactly half the memory bandwidth, along with half of everything else.

The important question really is how the drivers can work so well with the 5770 and not the 5870; I can't see any logical reason. I admit I'm just glad I don't intend to use CrossFire or SLI any time in the near future (at least 8 months), unless I get an NV Surround or Eyefinity setup.

Anyway, back to the point: I will be eagerly watching for further leaks until there is some solid info from ATI, as I'm really interested in the 58xx, 68xx and GTX 460 (the 460s because if I do go with one I know they scale well and handle NV Surround OK). I just want everything to be out and priced so I can hopefully get a great GPU at a good price :D

There's a user here with dual 5970s who swears the onboard PLX chip doesn't suffer the scaling issues dual 5870s or 5850s have.

I don't know why they don't scale; the percentage improvement going from one 5770 to two just isn't equal to going from a single 5850 to dual 5850s, for example...
 
I love how ATI never really bragged about how good this architecture was, yet blew everyone away and took 10% market share in under a year.

NVIDIA bragged about Fermi, and it failed miserably when compared to the 5 series on every count: power consumption, heat, and the fact that the 5 series had no seven-month delay.

Just think, they could have put NVIDIA down for the count if they could ever fix their driver issues...
 
Wonder if they will launch something that beats the GTX 470 for under $300? I am itching to go SLI 470s, but if ATI has a counter I might wait.

Just think, they could have put NVIDIA down for the count if they could ever fix their driver issues...

No, they couldn't have.
 
Just think, they could have put NVIDIA down for the count if they could ever fix their driver issues...

The same could be said for NVIDIA, even with their magical "perfect" drivers (that don't kill GPUs or overheat them, ya know...).

If NVIDIA had great hardware, they'd put ATI down for the count.

The mighty have fallen; people will see it when the 6xxx comes out and NVIDIA have nothing to compete with.
 
AMD is on a roll. I'm excited for the 6xxx release as well as Bulldozer in 2011. Good times ahead.
 
We will see when it actually launches. I don't believe shit until at least two of the more reputable sites have benches they ran themselves that match up pretty closely.

Not that 3DMark is useful beyond stability testing and running the bench once to make sure your score is in line with other people's on the same setup.
 
Anandtech forums say 20% faster than a GTX 480 in Unigine. ATI worked hard to improve its tessellator.
IIRC there wasn't much "wrong" with the tessellation unit in Cypress; rather, the unit was starved by a lack of setup rate.

From Beyond3D's analysis,
Moving onwards from the domain shader, we find that, on average, for 15% of the render time the pipeline is stalled by rasterisation (setup included here), meaning that the domain shader can output processed vertices and the primitive assembler can assemble them faster than they can be setup and rasterised.

We could also talk about how tessellation will never scale well in current graphics architectures, but that's another story.
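
To put some rough numbers on the setup-rate point, here's a back-of-envelope sketch. The figures are assumptions for illustration (roughly one triangle set up per core clock, an 850 MHz Cypress-like clock, a 60 fps target), not measured data:

```python
# Back-of-envelope look at why a fixed setup rate can starve the tessellator.
# Assumptions (not measurements): ~1 triangle set up per core clock, 850 MHz core, 60 fps.

core_clock_hz = 850e6
setup_tris_per_clock = 1.0
frame_budget_s = 1.0 / 60

max_tris_per_frame = core_clock_hz * setup_tris_per_clock * frame_budget_s
print(f"~{max_tris_per_frame / 1e6:.1f} M triangles/frame before setup alone eats the whole frame")
# ~14.2 M: crank the tessellation factor high enough and the domain shaders end up
# waiting on setup/rasterisation, which is the stall Beyond3D is describing.
```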
 
This always seems to be how it starts. Some Chinese website leaks a GPUz and 3dMark screenshot. Then come graphs of games benchmarked against a random smattering of other cards and resolutions (these graphs will be highly debated and ultimately found to be very close to the actual performance). Core specs will then be rumored and debated.

An internal slide will then be "leaked" showing some real info about the core...

All I know is, we're very close to some new GPUs to play with, debate, and troll over! :)

QFT. It actually looks sooner than I thought.
 
With the 6000, ATI is going to be copying Nvidia architecture again, as Fermi, no matter how you slice it, is simply better at tessellation. It'll be refined, however, and should run cooler.

It will be on 40 nm, as TSMC/GF gave up on anything smaller. I expect it to also have the same number of stream processors, as ATI won't want to lose their 'cool-running GPU' title. :p
 
With the 6000, ATI is going to be copying Nvidia architecture again, as Fermi, no matter how you slice it, is simply better at tessellation. It'll be refined, however, and should run cooler.

It will be on 40 nm, as TSMC/GF gave up on anything smaller. I expect it to also have the same number of stream processors, as ATI won't want to lose their 'cool-running GPU' title. :p

Yeah man, improving tessellation performance sure is "copying Nvidia architecture".

Where do these horrible posters come from, jesus.
 
Yeah man, improving tessellation performance sure is "copying Nvidia architecture".

Where do these horrible posters come from, jesus.


They lack the ability to comprehend the size of the lawsuits there would be were their twisted fanboy fantasies actually true.

If he had said "ATI is just following NV's lead and improving tessellation" instead of "copying Nvidia's architecture again", he would have a point that could at least be argued.
 
They will, just watch. They copied it with unified shaders too. Just like Intel copied x64 from AMD.

Companies do this stuff. Mark my words, the next ATI overhaul is going to have shader-based tessellation. And don't you dare call me a fanboy. I have MORE than my fair share of ATI/AMD systems. My big boy is just with those with the big chips.
 
They will, just watch. They copied it with unified shaders too. Just like Intel copied x64 from AMD.

Companies do this stuff. Mark my words, the next ATI overhaul is going to have shader-based tessellation. And don't you dare call me a fanboy. I have MORE than my fair share of ATI/AMD systems. My big boy is just with those with the big chips.

You claimed they were "copying Nv's architecture". Even if ATI comes out with improved shader-based tessellation on their next batch of cards, it will not be because they copied NV's architecture, an architecture they had no access to. In fact, the cards that ATI is supposed to be shipping in the next few months were being designed before Fermi shipped, before ATI knew what Fermi was going to be all about. And that's before we even talk about parallel development.

As for x64, Intel licensed the tech from AMD. If AMD/ATI were to license any tech from NV, we would know about it. Public companies report where the money goes, and typically put out a press release any time a licensing or cross-licensing deal is struck.
 
In fact, the cards that ATI is supposed to be shipping in the next few months were being designed before Fermi shipped, before ATI knew what Fermi was going to be all about. And that's before we even talk about parallel development.
Got any way to back that up?

In the same way, I could say that since AMD was building Bulldozer back in '03, they started multi-core and threading first.

Things can change quite significantly. Fermi is also an example of that, as for the first year of development, it was simply a doubled-up version of GT200b.

I suppose I could have made my initial post more clear, but what I was trying to say is that ATI is going to be going the same route as Nvidia. Chalk Fermi up as a fail all you want, but it obviously works.
 
They will, just watch. They copied it with unified shaders too. Just like Intel copied x64 from AMD.

Companies do this stuff. Mark my words, the next ATI overhaul is going to have shader-based tessellation. And don't you dare call me a fanboy. I have MORE than my fair share of ATI/AMD systems. My big boy is just with those with the big chips.

1.) AMD did not copy Nvidia with regard to a unified shader architecture. Both companies made that move independently, with AMD (ATI) actually releasing theirs first with the GPU inside the Xbox 360.


2.) Tessellation is not done in shaders for either company, and will not be done there; it's done in dedicated hardware. AMD does it in the setup engine, while Nvidia created what they call a PolyMorph Engine. This misunderstanding came about because Nvidia attached a PolyMorph Engine to each SM (Streaming Multiprocessor, a group of CUDA cores), so lowering the number of active SMs lowers the tessellation (and basic geometry) performance.
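
To illustrate that last point with a toy example (the per-engine rate, clock, and SM counts below are placeholders chosen for illustration, not published specs):

```python
# Toy illustration: on Fermi each SM carries its own PolyMorph Engine, so disabling
# SMs also removes geometry/tessellation hardware. Numbers here are placeholders.

def geometry_rate(active_sms, tris_per_clock_per_engine, clock_hz):
    """Aggregate primitive throughput, assuming it scales linearly with active SMs."""
    return active_sms * tris_per_clock_per_engine * clock_hz

full_chip = geometry_rate(16, 0.25, 700e6)  # hypothetical fully enabled GF100-style part
cut_down  = geometry_rate(14, 0.25, 700e6)  # same part with two SMs disabled
print(cut_down / full_chip)  # 0.875 -> tessellation throughput drops with the SM count
```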
 
I think these are true. The time frame seems about right too. Hopefully we can get something solid here soon.

I am in the market for a new computer, and I was about to pull the trigger last night on a pair of 5870s for FF14. Looks like I can wait a little longer :)
 
If it performs similarly to a 5970 then I think I might swap mine for one of these. Judging from the leaked benches that looks to be the case. Would be nice to ditch CrossFire, which is nice when it works.
 