R600 = WET DREAM

OK, who's saying the R580 is 2x bigger than a G71? I don't think it is.

And remember, the die shrink is going to take off some space.
 
5150Joker said:
razor:

Where's your game, man? I've been waiting since E3 for it. You said it would be out by now.

We're updating our physics engine at the moment for next-gen cards, so let's say it will be ready around the end of this year or by February. The demo will be released publicly, so don't worry ;)
 
Sharky974 said:
I'm not sure if your theory is even right, nor am I saying your arguments for why R600 won't have a 512-bit bus are wrong, but I'd just like to point out that R580 is ALREADY almost twice the die size of G71.

So that doesn't seem like much of an obstacle, since R600 will surely be larger than R580.


Going to 80 nm, the die size will shrink, but there will also be an increase in transistor count with the R600. The shrink gives roughly a 17% decrease in size (rough numbers are sketched at the end of this post), which is a small savings, so an R600 with 500 million transistors will end up being similar in size to, or a bit larger than, the R580 die. That should be enough to hold a 512-bit bus. Of course, this is assuming ATi doesn't increase transistor density; one of the major reasons nV's die is so much smaller than the R580's is that their density is higher.

I just see ATi taking a technical leap forward rather than a "brute force" approach. Brute force will only go so far; why would they do that with a new architecture? They haven't gone brute force with any new-generation chip yet. Granted, the X8xx series was kind of brute force, but there were substantial changes that made a great improvement in efficiency.
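(As a rough sanity check on the shrink math above, here's a back-of-the-envelope sketch. The ~352 mm² R580 die area and the straight optical shrink are my own assumptions; the transistor counts are the figures floating around this thread. A pure (80/90)² scaling gives about a 21% area reduction, in the same ballpark as the ~17% above.)

```python
# Back-of-the-envelope die-area scaling for a 90 nm -> 80 nm optical shrink.
# All figures are rough estimates/assumptions, not official ATi numbers.

R580_AREA_MM2 = 352.0      # assumed R580 die area at 90 nm (ballpark figure)
R580_TRANSISTORS = 350e6   # transistor count mentioned later in this thread ("IIRC ~350 million")
R600_TRANSISTORS = 500e6   # speculated R600 transistor count from the post above

# A perfect optical shrink scales linear dimensions by 80/90, so area scales by (80/90)^2.
area_scale = (80.0 / 90.0) ** 2                   # ~0.79, i.e. roughly a 21% area reduction

density_90nm = R580_TRANSISTORS / R580_AREA_MM2   # transistors per mm^2 at 90 nm
density_80nm = density_90nm / area_scale          # same layout density, just shrunk

r600_area_mm2 = R600_TRANSISTORS / density_80nm
print(f"Shrink area factor: {area_scale:.2f} (~{(1 - area_scale) * 100:.0f}% less area per transistor)")
print(f"Estimated R600 die: ~{r600_area_mm2:.0f} mm^2 vs R580 at ~{R580_AREA_MM2:.0f} mm^2")
```

On those assumptions the R600 comes out around 10-15% larger than the R580 die, which matches the "similar size or a bit larger" guess above.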
 
Moofasa~ said:
Actually, I thought that card just had two 256-bit buses, not a true 512-bit one?

IIRC, it's a 512-bit core and 256-bit memory interface.
 
merlin704 said:
ATi can't be the first-ever 512-bit card. Matrox beat them with the Parhelia-512. :) It wasn't that great of a performer, but feature-rich nonetheless.

http://www.tomshardware.com/2002/05/14/matrox_parhelia/index.html

True, the Matrox Parhelia-512 did have an internal 512-bit bus, good find :). But going to a 512-bit external bus causes a lot of problems with PCB complexity, and of course the pin count on the GPU itself will be rather high. Speculation puts the R600 pin count at close to 2000 pins, so it might be possible.
 
Edson Arantes do Nascimento, Pelé forever!

:D

With this nickname, it has to be the best!

:D
 
MrWizard6600 said:
OK, who's saying the R580 is 2x bigger than a G71? I don't think it is.

And remember, the die shrink is going to take off some space.

Not 2x, but it is bigger. G71 has 279 million transistors and IIRC R580 has ~350 million.
 
Silus said:
Not 2x, but it is bigger. G71 has 279 million transistors and IIRC R580 has ~350 million.


In surface area, the R580 is about two times bigger; put the other way, the G71 is roughly 40% smaller, because of its increased transistor density.
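(To put rough numbers on that: the die areas below are my own assumed ballpark figures, ~196 mm² for G71 and ~352 mm² for R580; the transistor counts are the ones from Silus' post above.)

```python
# Rough area/density comparison; die areas are assumed ballpark figures, not official numbers.
G71_TRANSISTORS, G71_AREA_MM2 = 279e6, 196.0      # transistor count from the post above
R580_TRANSISTORS, R580_AREA_MM2 = 350e6, 352.0    # "IIRC ~350 million"

area_ratio = R580_AREA_MM2 / G71_AREA_MM2         # how much bigger the R580 die is
density_g71 = G71_TRANSISTORS / G71_AREA_MM2      # transistors per mm^2
density_r580 = R580_TRANSISTORS / R580_AREA_MM2

print(f"R580 is ~{area_ratio:.1f}x the area of G71 "
      f"(G71 is ~{(1 - 1 / area_ratio) * 100:.0f}% smaller)")
print(f"G71 packs ~{density_g71 / density_r580:.2f}x the transistors per mm^2 of R580")
```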
 
nobody_here said:
To be honest, I think the 384-bit memory interface in the proposed G80 specs is going to be more than enough to handle things; 512-bit will be wasted, IMO.

Has anyone really looked into the GDDR4 thing? I read somewhere that it is based off the older GDDR2 spec, and that it is actually slower than mature GDDR3 memory.


Lol... GDDR2 is slower than GDDR per clock, GDDR3 is slower than GDDR2 per clock, GDDR4 is slower than GDDR3 per clock... why? Because with each new generation there is added latency. But they make up for the added latency by increasing clock speeds.

ATi engineered (with Samsung) both GDDR3 and GDDR4.
 
razor1 said:
In surface area, the R580 is about two times bigger; put the other way, the G71 is roughly 40% smaller, because of its increased transistor density.


There also seems to be a discrepancy between how NV and ATI count transistors in their cores. What it is though is beyond me.
 
ManicOne said:
There also seems to be a discrepancy between how NV and ATI count transistors in their cores. What it is though is beyond me.


True, transistor counts vary between the IHVs; I guess the only thing that really matters is the end results :)
 
Also just found this juicy bit over at B3D......

http://www.techreport.com/etc/2006q4/stream-computing/index.x?pg=1


"Orton pegged the floating-point power of today's top Radeon GPUs with 48 pixel shader processors at about 375 gigaflops, with 64 GB/s of memory bandwidth. The next generation, he said, could potentially have 96 shader processors and will exceed half a teraflop of computing power."

Mmmmm, seems like a fairly reliable hint that everyone so far has missed........

Given 48 PS units and 64 GB/s of bandwidth, wouldn't 96 units suggest ~120 GB/s? That seems to point to either 384-bit plus fast GDDR4, or 512-bit with more moderately specced memory (rough numbers sketched below).


Credit to Uttar, http://www.beyond3d.com/forum/showthread.php?t=34676
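
(For what it's worth, here's a quick sketch of that scaling. The per-unit GFLOPS split and the bus-width/data-rate combinations at the end are purely illustrative assumptions on my part, not leaked R600 specs.)

```python
# Scaling Orton's quoted figures for the current 48-unit part up to a speculated 96-unit part.
# The memory configurations at the bottom are illustrative assumptions, not R600 specs.

UNITS_TODAY = 48
GFLOPS_TODAY = 375.0          # "about 375 gigaflops"
BANDWIDTH_TODAY_GBS = 64.0    # "64 GB/s of memory bandwidth"
UNITS_NEXT = 96               # "could potentially have 96 shader processors"

gflops_per_unit = GFLOPS_TODAY / UNITS_TODAY                      # ~7.8 GFLOPS per unit at current clocks
gflops_next = gflops_per_unit * UNITS_NEXT                        # ~750 GFLOPS, i.e. "exceeds half a teraflop"
bandwidth_next = BANDWIDTH_TODAY_GBS * UNITS_NEXT / UNITS_TODAY   # ~128 GB/s if bandwidth scales with units

print(f"Scaled compute: ~{gflops_next:.0f} GFLOPS, scaled bandwidth: ~{bandwidth_next:.0f} GB/s")

# Two hypothetical ways to get there: bandwidth (GB/s) = (bus width in bits / 8) * data rate in GT/s.
for bus_bits, data_rate_gtps in [(384, 2.5), (512, 2.0)]:   # fast GDDR4 vs. more moderate memory
    print(f"{bus_bits}-bit bus at {data_rate_gtps} GT/s -> {bus_bits / 8 * data_rate_gtps:.0f} GB/s")
```

Either a 384-bit bus with fast GDDR4 or a 512-bit bus with slower memory lands in that ~120-128 GB/s range, which is the point being made above.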
 
Yep, that's why I didn't rule out the 512-bit bus, and I also expect the R600 to have quite a bit of computing power, much more than the R580 ;)
 
razor1 said:
Yep, that's why I didn't rule out the 512-bit bus, and I also expect the R600 to have quite a bit of computing power, much more than the R580 ;)

And far more than G80. But will it be faster than G80 in DX9 games? That remains to be seen.
 
ElMoIsEviL said:
And far more than G80. But will it be faster than G80 in DX9 games? That remains to be seen.

I don't think we can assume R600 will destroy G80. Given R600's release schedule, for ATI's sake you would hope so. And even if it does, NV is a master of the refresh...
 
ManicOne said:
I don't think we can assume R600 will destroy G80. Given R600's release schedule, for ATI's sake you would hope so. And even if it does, NV is a master of the refresh...

It's not an assumption, lol. It's an arrogant prediction.
 
ManicOne said:
I don't think we can assume R600 will destroy G80. Given R600's release schedule, for ATI's sake you would hope so. And even if it does, NV is a master of the refresh...

nVidia really is the master of the refresh, aren't they?
 
ElMoIsEviL said:
And far more than G80. But will it be faster than G80 in DX9 games? That remains to be seen.


I don't think it will have far more than the G80; they will probably end up being very close, and the only difference will come down to clock speeds.
 
Seriously, if it's Jan 20, I'll hold off on the stupid 8800.

But I thought someone else mentioned that ATI delayed the board a bit; I think the announcement came after the G80 was released?

Question is, what mobo would I use for this?
 
Arashi said:
Seriously, if it's Jan 20, I'll hold off on the stupid 8800.

But I thought someone else mentioned that ATI delayed the board a bit; I think the announcement came after the G80 was released?

Question is, what mobo would I use for this?

A mobo with a PCI-E slot.
 
I don't care if it has 100,000-bit memory or 10 gigs of it; if it isn't out soon, I'm not interested.
 
Who cares about Crossfire (unless you are using a 30" screen and want the best possible IQ)? One of these cards should be enough for some time. (6 months!)
 
Do Intel chipsets support Xfire? If you can't run Xfire with a Conroe, it might put a dent in the R600's appeal.
 
Obi_Kwiet said:
Do Intel chipsets support Xfire? If you can't run Xfire with a Conroe, it might put a dent in the R600's appeal.

The 975X supports 8x/8x Crossfire.

The 965 supports 8x/4x Crossfire, I think; I know it supports Crossfire now, I just don't know the speed it's able to operate at.

Well, supposedly this ATI chipset is supposed to be great... If it supports 16x/16x Crossfire, has asynchronous clock speeds, and is stable, then it might have a great future ahead of it.
 
R600 sure sounds droolworthy :D But.... How much? That's my question.... I'm hoping it will not be any more expensive than the nvidia product, otherwise I will have to remortgage my house. ;)
 
Yeah, the R600 will be drool-worthy,

and the RD600 will be drool-worthy,

but both are late! I'm done waiting... I ordered a 680i motherboard and an E6300 today. If the RD600 comes out and shatters the 680i in overclocking performance, well, I'll be pissed off. If the R600 comes out and owns my 8800 GTS, well, I'll be pissed off. If both happen, I'll be really pissed off. If both happen, and the Canucks don't do better than a 6-0 loss against Anaheim, well then I just might have to stab myself.
 