ATI R600 Confirmations on Size and Power

Let's not bring personal attacks into this. I am pissed at both ATI and NVIDIA right now, so no need to get personal. NVIDIA sucks because they've had 3+ years to make a decent Vista driver and failed miserably, and ATI sucks because they've had 18+ months to make a decent DX10 part and also failed miserably. Now, I want you to read the specs in my sig very carefully and look at what brand of video card I am using in my gaming machine, then I want you to read this thread before you start in with more childish insults.

I'm sorry, but didn't you call someone a "fanboy" in your last post? I'd say you brought in the personal attacks and childish insults before I did. ;)

All I asked is that you stop trying to make everyone feel miserable because you're disenchanted with what the big shots are doing lately.
 
Actually, that is what I said. The 6/8-pin connectors can physically handle more power than the specs supply them with, but they aren't being fed more power. If the R600 were to pull 300W at stock, then a whole new OC-limiting factor comes into play, independent of the actual adaptor hardware limitations. Thus my use of the term "un-overclockable". Sorry if my terminology causes you grief.

Okay, in one breath you say that power connectors can supply more power than the spec calls for, then in the next you say that they can't because the spec says they can't. That's nonsense. The whole idea of overclocking is taking something out of spec and pushing it to its absolute physical limits, not its specification limits. Someone said earlier that the 6-pin is physically capable of 288W each. So for two 6-pin connectors that's 576W available to one card (we'll assume for the sake of discussion that your PSU is capable of supplying that much power). That's a hell of a lot of headroom, and I bet that heat dissipation, absolute switching speed, or voltage tolerance will limit the R600's overclockability long before it runs out of power, specs be damned.
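
For anyone wondering where that 288W figure comes from, here is a rough back-of-the-envelope sketch. It assumes three live +12V pins per 6-pin PEG connector and roughly 8A per Mini-Fit Jr. style terminal; the exact per-pin rating varies by connector manufacturer, so treat these as illustrative numbers, not a datasheet.

```python
# Rough check on the "288 W per 6-pin" claim.
# Assumed figures (not from any spec sheet quoted here): 3 live +12 V pins
# per 6-pin PEG connector, ~8 A per pin for a Mini-Fit Jr. style terminal.
PINS_12V = 3
AMPS_PER_PIN = 8.0
VOLTS = 12.0

physical_limit = PINS_12V * AMPS_PER_PIN * VOLTS   # ~288 W per connector
spec_limit = 75.0                                  # what the PCIe spec rates a 6-pin for
print(f"one 6-pin: ~{physical_limit:.0f} W physical vs {spec_limit:.0f} W in-spec")
print(f"two 6-pin: ~{2 * physical_limit:.0f} W physical vs {2 * spec_limit:.0f} W in-spec")
```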
 
AAARGHH!!! The connectors are physically capable of handling more power than they are actually supplied with. Argue for argument's sake, m8, but it seems [H] these days is more about the arguing than the facts.

What happens when you try and run a card without plugging in the required power feeds? Does it just drag more power through the PCI-e slot? NO. It can't because that extra power isn't there. I've made my point abundantly clear; keep arguing for whatever reason you want but this is going abso-bloody-lutely nowhere.


God my head hurts, I think I need an aspirin....


Edit: And if I am wrong about something, I'd like to know it. Increases my knowledge. People butting heads because of the need to be right is, well........

2nd Edit: I'm basing my logic on the early claims that R600 could run on two 6-pin plugs, but that for OC'ing it would require an 8-pin as well. If it were as simple as just dragging more power through the 6-pins, why would this be the case? And yes, it does now look like an 8-pin might be mandatory.
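
To put rough numbers on why that extra 8-pin would matter under the spec, here is a quick sketch using the standard PCIe power-budget figures (75W from the slot, 75W per 6-pin, 150W per 8-pin); nothing here is a confirmed R600 figure.

```python
# Total in-spec power budget for a PCIe graphics card:
# 75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin connector.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

print("slot + two 6-pin     :", SLOT + 2 * SIX_PIN, "W")         # 225 W
print("slot + 6-pin + 8-pin :", SLOT + SIX_PIN + EIGHT_PIN, "W")  # 300 W
# A card that sits near 225 W at stock has no in-spec headroom left
# for overclocking unless the 8-pin feed is there.
```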
 
Ryan does bring up an interesting point, and one we'll all face as power requirements for PCIe cards keep growing.

What actually HAPPENS if you forget to plug in the power cables? I'm going to "assume" that the card will not power up without the voltage regulators fed by the cable connectors being active. I can't imagine the card would happily go on to TRY to draw power from the PCIe bus.... the connector pins would MELT.
 
Ryan does bring up an interesting point, and one we'll all face as power requirements for PCIe cards keep growing.

What actually HAPPENS if you forget to plug in the power cables? I'm going to "assume" that the card will not power up without the voltage regulators fed by the cable connectors being active. I can't imagine the card would happily go on to TRY to draw power from the PCIe bus.... the connector pins would MELT.

Usually what happens is a message comes up on your monitor, similar to a "No Signal" message, saying that you need to connect the power cable.
 
Some will also beep, but usually the system will not POST. Nothing appearing on the screen and an angry noise emanating from the box usually gets my attention! :)
 
I didn't want to come right out and say that, but since you did :) Many of them are just really biased, and several are also a little too full of themselves for my tastes.


I just don't believe in 'beating around the bush', as the saying goes. :D To hell with being politically correct and all that goes along with it. The egos and hot air are near suffocating in those places.
 
Yes, I am saying some of them are power hungry. I am aware of Intel's misadventures in this area, as I own a few Intel Northwood/Pressy systems. It had very little to do with user requests.



As have I, and I have never been banned; it's called being tactful and polite.



I have seen some very ignorant/fan boi type posts over on Beyond recently, as well as Kyle and [H] bashing, and none of it has been poo-poo'd :) "Kyle isnt the sharpest tool in the Shed"..... Anyone who knows of Google can find it, and the site it's on.... 3+ mods have seen it, and done nada. Abject rudeness does seem OK over there. :)


Prescott's heat issues were pretty much limited to the S478 versions (there have been few, if any, complaints regarding Prescott thermal performance in LGA775 trim; this leads me to suspect major differences between S478 and LGA775 from a heat-related POV); in the case of *Northwood*, the issues only occurred with the original version (amusingly, there have been few complaints regarding heat on the later Northwoods, whether the 533 MHz FSB B series or the 800 MHz FSB (200 MHz quad-pumped) C series).

Also, Prescott was primarily aimed at (and designed for) LGA775, *not* S478. (Question: did the backfit to S478 contribute to the thermal issues?) Starting with Northwood-B, even though FSB speeds increased over the original Northwood, absolute power draw dropped even when the core clock didn't (compare the power draw in watts for the original P4 2.4 Northwood, the 2.4B, and the 2.4C: with each revision the FSB went up, yet power consumption dropped in the B revision, and dropped again in the C revision, which also explicitly supported HT).

We have no idea what the true thermal performance of R600 is; however, we are well aware that all ATI GPUs back to R300 have not exactly been easy to cool when heavily taxed (case in point: the legendary thermal issues regarding AIWs, especially 9700 Pro and above in AGP trim).
 
I have no doubts that it will draw 300W. Look:


Source.

That is extremely black-and-white. It leaves no room for the +/- 30W people are talking about.

Is it possible that the words "high-end" and "configuration" in there mean the power draw refers to the whole system, including a fast Core 2, and not just the card?

The power draw charts on this page show a total system draw of 187 watts at idle and 277 watts under load for an 8800GTX with a Core 2 Extreme X6800, so I guess a total system draw as "low" as 300 watts for R600 would be possible. Now, if that's at idle, mother help!
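
As a rough sanity check on that guess, here is a sketch with explicitly assumed numbers: it takes the quoted 277W load figure, assumes the 8800 GTX itself accounts for roughly 145W of it, and then tries a few hypothetical R600 board-power values in the same rig. None of the R600 numbers are confirmed.

```python
# All assumed numbers: the quoted chart gives 277 W total system draw under
# load with an 8800 GTX; assume the GTX itself accounts for ~145 W of that,
# then try hypothetical R600 board-power figures in the same rig.
measured_total = 277            # W, quoted chart (8800 GTX + Core 2 Extreme X6800)
assumed_gtx_draw = 145          # W, assumed GTX board power under load
rest_of_system = measured_total - assumed_gtx_draw   # ~132 W

for r600_draw in (200, 250, 300):   # hypothetical R600 board power, W
    print(f"R600 at {r600_draw} W -> whole system ~{rest_of_system + r600_draw} W")
```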
 
ATI is suggesting to SIs that they provide a solid 300 watts of power per R600 card inside of any preconfigured systems.
 
I think we'll have to wait until someone uses their Kill-A-Watt on the card before we make any judgements. They always have ridiculous power "recommendations."

Yeah, they like to scare as many SIs as possible so they won't use their product. :rolleyes: ;) Spec on the "High End" R600 GPU power usage will be around 245W to 250W. Factor in 80% efficiency on a good PSU... well, you do the math.
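
Doing that math, with the 245-250W figure and the 80% efficiency treated as the poster's assumptions rather than confirmed specs:

```python
# "You do the math": wall draw needed to feed a ~250 W GPU through a PSU
# that is ~80% efficient (both figures as assumed in the post above).
gpu_draw = 250.0        # W, claimed high-end R600 board power
psu_efficiency = 0.80   # assumed PSU efficiency

wall_draw = gpu_draw / psu_efficiency
print(f"~{wall_draw:.0f} W pulled from the wall for the card alone")   # ~313 W
```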
 
300 watts just for the GPU, so the SI would have to take the rest of the system into consideration too, right? Or do you mean a dedicated 300 watts?


Well, if you look at what was originally posted.....

300W for ONE R600

600W for TWO R600s.

Now, logic tells me that is JUST for the video card. If it were not, the requirement for the second card would be much lower. From the sheet I am looking at, it means that each "high end" R600 will require 300W of dedicated power to operate safely.

I have a feeling that this product is about to be ATI's "GeForce 5800," but with just a bit less PR smoke and mirrors to go along with it.
 
Pardon the intrusion in this epic thread...

But fuck you, AMD. I'm Canadian, but I now study in Europe. I've always supported ATI, but the fact that they keep pushing their cards back while buying up all the GDDR4 memory to supposedly enhance their market strategy is BALONEY !!!!!!!

I've been holding off finishing my new system for 2 months, because I figured the R600 might be better by a significant margin, and if not, there would at least be a price battle. Where I live the X1950XT is about 100€ cheaper than the 8800GTX, and I could probably OC it pretty well, up to 660/2100. So why would I get a DX9 card that won't be anywhere near as good as the GTX?

Then I hear ATI fanboys (fanboyism is as ridiculous as Paris Hilton) saying, ''mwahahahaha, the R600 will kick the 8800GTX's ass, you should wait!''

Guess what. By the time the R600 is actually released, COMPUTERS WILL BE OBSOLETE, AS THE SUN WILL HAVE EXPLODED FROM ITS LONG LIFE, THE R600 BLUEPRINTS ALONG WITH IT!

Seriously, GDDR4 memory? This card will be expensive... especially with 1GB of it. I've built a rig with a CM Stacker 830 Evo and an XStream 700W in preparation for long cards. EVEN if the retail R600 is as long as the 8800GTX, the performance boost in fps had better be SIGNIFICANT (nobody gives a tiny rat's ass, I don't anyway, about scoring 3000 more points than the 8800GTX). If it gives me 2 extra FPS at 1680 resolution, who gives a crap? So your e-peen will be hurt because you didn't score as high in 3DMark06? Boohoo.

Sorry for the venting; as a proud Canadian I am hurt to see that AMD has messed up the good image of ATI. Pushing a product back once is OK, but twice? Only Blizzard has the right to push products back for years, because we actually know their games are all premium; ATI isn't on that level just yet.

I feel much better now.
 
I have been reading this thread for some time now... and all these rumors and hearsay are starting to get annoying.

Well, I have something that is FACT... nothing fake about it.

ATI does not have a DX10 card for the masses yet.

nVidia has had a DX10 card on sale to the masses for some time now.

I know the truth hurts... but sometimes those are the facts of life :D
 
[RIP]Zeus;1030764940 said:
I have been reading this thread for some time now... and all these rumors and hearsay are starting to get annoying.

Well, I have something that is FACT... nothing fake about it.

ATI does not have a DX10 card for the masses yet.

nVidia has had a DX10 card on sale to the masses for some time now.

I know the truth hurts... but sometimes those are the facts of life :D

Nvidia might have a DX10 card, but there are no DX10 games and their drivers are damn near deplorable. Moreover, it's looking more and more like the R600 will dominate Nvidia in the DX10 department when DX10 games actually show up.

But seriously, don't let the "facts" get in the way :rolleyes:
 
[RIP]Zeus;1030764940 said:
I have been reading this thread for some time now... and all these rumors and hearsay are starting to get annoying.

Well, I have something that is FACT... nothing fake about it.

ATI does not have a DX10 card for the masses yet.

nVidia has had a DX10 card on sale to the masses for some time now.

I know the truth hurts... but sometimes those are the facts of life :D

Nvidia has had a DX10 card on sale to the masses for some time now! Two months before a DX10-capable OS was even released, and now that the OS is out, the drivers don't work under it!

In all fairness, all Nvidia has done so far is release a faster DX9 card using a unified pipeline design =p
 
Nvidia might have a DX10 card, but there are no DX10 games and their drivers are damn near deplorable. Moreover, it's looking more and more like the R600 will dominate Nvidia in the DX10 department when DX10 games actually show up.

But seriously, don't let the "facts" get in the way :rolleyes:

What I find funny is that you claim it as FACT that ATI will dominate in the DX10 world... when you have no idea when ATI will release their FIRST, mind you, their FIRST DX10 card.

How can you even say "don't let the facts get in the way" when it's a fact that nVidia has a DX10 card out, or at least a NEWER card than the 7900 or X1900?

Who cares what is supported on it right now? In case you forgot, if the hardware isn't there, the software will not be built around it. So, as of right now, ATI has nothing to sell you; nVidia does. That is fact. Also, IMO (mind you, this is my opinion), by the time ATI releases a new card, nVidia will have a refresh out, or hell, even a newer card... but I don't know that, and neither do you or anyone else here. So let time take its toll and stop with the rumors and BS. Yeesh. Sometimes I wonder if you people who create these rumors even have a life ;)
 
Nvidia has had a DX10 card on sale to the masses for some time now! Two months before a DX10-capable OS was even released, and now that the OS is out, the drivers don't work under it!

In all fairness, all Nvidia has done so far is release a faster DX9 card using a unified pipeline design =p

Either way, it was released. Once again: newer things have bugs, and bugs are a part of troubleshooting.

Gotta understand there are millions of different configs in the world. Just because nVidia and MS tested a few thousand doesn't mean they got it to work on everyone's machine.
 
[RIP]Zeus;1030765008 said:
Either way, it was released. Once again: newer things have bugs, and bugs are a part of troubleshooting.

Gotta understand there are millions of different configs in the world. Just because nVidia and MS tested a few thousand doesn't mean they got it to work on everyone's machine.

I'm sorry, I didn't pay $450 to be a beta tester for Nvidia ;) thx

And EVERYONE'S machine? IT DOESN'T WORK ON ANYONE'S MACHINE UNDER VISTA! I have heard only complaints, not one "oh yeah, it works great for me!!!"

And "everyone's different config" doesn't even apply to video card manufacturers, because they follow standards: the standards set by mobo makers, OS makers, and PSU makers. They tell you what is required for their cards to work, e.g. one PCIe x16 slot, a 450W PSU, MS Windows XP / Vista / 2K, Linux, etc., and the rest doesn't matter. If something beyond those standards doesn't work, it doesn't fall on the shoulders of the card maker; it falls on one of the other components, like the motherboard, which has to worry a lot more about hardware configuration diversity.

Long story short, Nvidia and ATI don't care what sound card, network card, RAM, hard drive, floppy drive, or DVD±RW drive you're running, because it simply doesn't have enough influence on the video card.


And by the way, the "hardware has to be there before the software" argument is BS; Nvidia and ATI both send SDKs to game/software devs long before we ever see the hardware the SDK is for.
 
I'm sorry, I didn't pay $450 to be a beta tester for Nvidia ;) thx

And by the way, the "hardware has to be there before the software" argument is BS; Nvidia and ATI both send SDKs to game/software devs long before we ever see the hardware the SDK is for.

OK, great... they send SDKs. Big deal. How are they going to test those SDKs if there is no hardware to test them on?
 
I'm sorry, I didn't pay $450 to be a beta tester for Nvidia ;) thx

So you're saying I paid 450 dollars to be a beta tester for nVidia?

If that's what you are truly saying, then you, my friend, need to get the facts and stop assuming something you have no idea about.
 
Nvidia might have a DX10 card, but there are no DX10 games and their drivers are damn near deplorable. Moreover, it's looking more and more like the R600 will dominate Nvidia in the DX10 department when DX10 games actually show up.

But seriously, don't let the "facts" get in the way :rolleyes:

Oh really? It's looking more and more like R600 will dominate NVIDIA in the DX10 department...

May I ask how you reached that conclusion?
 
[RIP]Zeus;1030765041 said:
So you're saying I paid 450 dollars to be a beta tester for nVidia?

If that's what you are truly saying, then you, my friend, need to get the facts and stop assuming something you have no idea about.

So maybe you should buy an 8800 card and install Vista :) We'll see how your tune changes!

What I'm saying is that I don't care what bugs Nvidia has; if they have this many bugs, they should a) not release the product, or b) not advertise it as compatible with that software.

No one buys an 8800 just to run their Aero desktop.

You need to wake up and stop defending Nvidia for their shortcomings on the Vista driver front.
 
Oh really? It's looking more and more like R600 will dominate NVIDIA in the DX10 department...

May I ask how you reached that conclusion?

Not sure if you guys read the other thread or not, but an MS dev stated that the R600's GS (geometry shaders) are a fair bit faster than the G80's (working on MSX, I guess?). Since GS are one of the new features of DX10, that gives some sort of hint at the differences in DX10 performance.
 
[RIP]Zeus;1030765037 said:
OK, great... they send SDKs. Big deal. How are they going to test those SDKs if there is no hardware to test them on?

Of course they get hardware samples to test on, long before we or reviewers do.
 
Nvidia has had a DX10 card on sale to the masses for some time now! Two months before a DX10-capable OS was even released, and now that the OS is out, the drivers don't work under it!

In all fairness, all Nvidia has done so far is release a faster DX9 card using a unified pipeline design =p

Hey Digital Viper-X-

Just a simple question to satisfy my curiosity.
Wasn't it you who bought an 8800 GTX when it came out, then switched to an X1950 Pro because the GTX was too much for your setup at the time?
Now I'm spotting an 8800 GTS in your sig... Do you change sigs just because, or is that sig actually your current system?
 
Hey Digital Viper-X-

Just a simple question to satisfy my curiosity.
Wasn't it you who bought an 8800 GTX when it came out, then switched to an X1950 Pro because the GTX was too much for your setup at the time?
Now I'm spotting an 8800 GTS in your sig... Do you change sigs just because, or is that sig actually your current system?

Nope, it's actually in my system. I found a new game I wanted to play =) so I got a video card for it, and I got it at a great price, about $270 cheaper than the GTX I had and sold.

Actually, I found two games I wanted to play =p Supreme Commander and CNC3. Sure, CNC3 runs on a 1950 Pro just fine, but it's a lot more fluid on the 8800.

BTW, I sold that GTX because I had no use for it; the games I was playing at the time ran pretty much maxed out on my 1950 Pro.
 
so maybe you should to buy an 8800 card and Install Vista :) We'll see how your tune changes!

I'm saying is that I don't care waht bugs Nvidia has, if they have so many bugs they should a) not release that product, b) not advertise it as compatable with that software

no one buys an 8800 just to run their AERO desktop

you need to wake up and stop defending Nvidia for their shortcomings on the vista driver front

Sorry I haven't updated my sig...

But I do have a Vista box running an FX-60 with an 8800GTS.

I have no problems, so maybe you're just one of those unlucky peeps... And I'm not defending nVidia alone; I am also defending MS. I think you're the one that needs to wake up. If you're a PC guru or whatever, and you build your own machines, do you not expect to run into problems with newer hardware/software?

Then again, you might just have a narrow view of building/troubleshooting machines.
 
Not sure if you guys read the other thread or not, but an MS dev stated that the R600's GS (geometry shaders) are a fair bit faster than the G80's (working on MSX, I guess?). Since GS are one of the new features of DX10, that gives some sort of hint at the differences in DX10 performance.

Yes, I've read it. But again, you are missing the point. The point here is that the high-end R600 model MUST, and I'm sure WILL, be faster than the 8800 GTX, but by the time it's out it will no longer be competing with the GTX, but rather with a refreshed and improved version of it. Who knows what those improvements will amount to.
It's hardly relevant at this point to speculate that R600 is faster than G80, when R600 will no longer be competing with it. AMD/ATI simply didn't compete with G80. It's that simple...
 
So maybe you should buy an 8800 card and install Vista :) We'll see how your tune changes!

What I'm saying is that I don't care what bugs Nvidia has; if they have this many bugs, they should a) not release the product, or b) not advertise it as compatible with that software.

No one buys an 8800 just to run their Aero desktop.

You need to wake up and stop defending Nvidia for their shortcomings on the Vista driver front.

No one is defending NVIDIA on the driver front, but NVIDIA is in a unique situation right now: new hardware, new OS, new drivers. It's normal to have the problems they are/were having, just as it will be normal to see AMD/ATI having the same problems.
 
Yes, I've read it. But again, you are missing the point. The point here is that the high-end R600 model MUST, and I'm sure WILL, be faster than the 8800 GTX, but by the time it's out it will no longer be competing with the GTX, but rather with a refreshed and improved version of it. Who knows what those improvements will amount to.
It's hardly relevant at this point to speculate that R600 is faster than G80, when R600 will no longer be competing with it. AMD/ATI simply didn't compete with G80. It's that simple...

So what does that point have to do with what I'm saying? :< You asked him to prove why he thinks R600 will be faster, and I just posted a glimpse of that proof for you, that's all. I have no idea how R600 will do against a G81 or G90 :p nor do I care :D I'll buy whichever card WORKS and fits my needs! Just stating the point for you, that's all.

And to Zeus: I've built my own machines since my P3 500E, and I've never run into a problem this drastic. Do you play games on your 8800GTS? I actually find it hard to believe. Even when XP was released, the only real problem I had with XP was that some older games weren't designed to run on the NT/2K/XP kernel versus 98, and that was it. I've built bleeding-edge hardware systems and I've done value systems, and NEVER have I run into a wall as huge as this one, where I had to actually downgrade my OS to use a piece of hardware that was designed for that OS.

Honestly, I think you're one of those people who defend a company to justify their purchase. When I pay for something, I expect it to work, especially when it's such a high-end and expensive product. I didn't have a problem with my 7900, 7800, or 6800 cards under XP, nor did I have any problems with my X1800/X1900 cards under XP (and they were "new hardware").
I'm not attacking Nvidia for their products sucking, more for their claims.

AND

I'm not attacking MS, nor am I blaming them; I'm solely blaming NV for their DX10 hardware not working up to the MS DX10 and Vista spec. Who knows how ATI is doing... probably better, since they worked with MS on R500, or maybe not, who knows.

BTW, I'm happy with my purchase; it's by far one of the best cards (next to an 8800GTX) to buy for XP and gaming in general, as long as you stay away from Vista =)
 
And to Zeus: I've built my own machines since my P3 500E, and I've never run into a problem this drastic. Do you play games on your 8800GTS? I actually find it hard to believe. Even when XP was released, the only real problem I had with XP was that some older games weren't designed to run on the NT/2K/XP kernel versus 98, and that was it. I've built bleeding-edge hardware systems and I've done value systems, and NEVER have I run into a wall as huge as this one, where I had to actually downgrade my OS to use a piece of hardware that was designed for that OS.

Yes, I am a gamer... :D
 
So what does that point have to do with what I'm saying? :< You asked him to prove why he thinks R600 will be faster, and I just posted a glimpse of that proof for you, that's all. I have no idea how R600 will do against a G81 or G90 :p nor do I care :D I'll buy whichever card WORKS and fits my needs! Just stating the point for you, that's all.

And to Zeus: I've built my own machines since my P3 500E, and I've never run into a problem this drastic. Do you play games on your 8800GTS? I actually find it hard to believe. Even when XP was released, the only real problem I had with XP was that some older games weren't designed to run on the NT/2K/XP kernel versus 98, and that was it. I've built bleeding-edge hardware systems and I've done value systems, and NEVER have I run into a wall as huge as this one, where I had to actually downgrade my OS to use a piece of hardware that was designed for that OS.

Honestly, I think you're one of those people who defend a company to justify their purchase. When I pay for something, I expect it to work, especially when it's such a high-end and expensive product. I didn't have a problem with my 7900, 7800, or 6800 cards under XP, nor did I have any problems with my X1800/X1900 cards under XP (and they were "new hardware").
I'm not attacking Nvidia for their products sucking, more for their claims.

AND

I'm not attacking MS, nor am I blaming them; I'm solely blaming NV for their DX10 hardware not working up to the MS DX10 and Vista spec. Who knows how ATI is doing... probably better, since they worked with MS on R500, or maybe not, who knows.


For one, believe what you want. Two, I am not trying to justify my purchase. Third, I am trying to explain to you that you can't point the finger at something that's THIS NEW.

You know, back in the day when 2000 Server came out, you know how many people had problems moving from NT to 2000? And AD was a new thing back then, along with the new Exchange.

Same thing here. Newer hardware/software has been released, and newer bugs and troubleshooting tactics are created.

If you've only been building since the P3 days... you're way behind, IMO, or just starting out for whatever reason.

Welcome to the world of Info Tech. You have a long road ahead of you. Be ready for anything. Having problems with Vista/nVidia drivers is all part of this ever-changing world.


EDIT:
One last thing. IMHO, if you're having such problems with this and bitching up a storm, why are you still doing what you are doing? Why aren't you looking for fixes or workarounds instead of bitching? Do something productive and fix the problem. Nothing gets fixed on its own, and nothing gets fixed by complaining.
 
Your assumption that I'm new to the tech world is fairly inaccurate; I'm not going to get into how long I've been doing this or what I've been doing.

From a consumer standpoint it's a simple point:
I don't want to be tricked into buying something that DOESN'T work as advertised, or only MIGHT work as advertised. I tried out Vista on the assumption that the problem was being caused by people's configurations, some setting somewhere, OCing, or any other variable, but considering the fact that everything else works fine, I have to conclude that it's the shitty drivers (mind you, I haven't tried the latest set, as I don't feel like reinstalling AGAIN).

And I'm nowhere close to being "new" to the tech world; the number of PCs, hardware, software, and issues around now are worlds apart (I actually started by blowing up my 486DX4 100, which was built by AMD :D), so even comparing 15 years ago to now in terms of "experience" is not something one can really do.

Edit: and as an answer to your "one last thing" question: simple, because I've already spent countless hours doing exactly that, and I'm pretty sure nVidia won't pay me if I do figure out a solution =)

The 8800 cards are not new, and Vista is not new :/ so it's not "THIS NEW" anymore. I didn't really care that Nvidia had no drivers for Vista until the retail version was out to the masses. The 8800 has now been on the market for 5 months, and Vista was released to OEMs around the same time; I'm pretty sure, though, that Nvidia had Vista long before the OEMs did, to work on their hardware for it.

I think we've derailed this thread enough.



[RIP]Zeus;1030765174 said:
For one, believe what you want. Two, I am not trying to justify my purchase. Third, I am trying to explain to you that you can't point the finger at something that's THIS NEW.

You know, back in the day when 2000 Server came out, you know how many people had problems moving from NT to 2000? And AD was a new thing back then, along with the new Exchange.

Same thing here. Newer hardware/software has been released, and newer bugs and troubleshooting tactics are created.

If you've only been building since the P3 days... you're way behind, IMO, or just starting out for whatever reason.

Welcome to the world of Info Tech. You have a long road ahead of you. Be ready for anything. Having problems with Vista/nVidia drivers is all part of this ever-changing world.

BTW, comparing hardware-to-software problems with software-to-software and config problems is comparing two different things. The main issues with 2000 Server weren't about a video card not playing games; they were about migrating software over from NT to 2000 Server/AS (I didn't really work with 2K Server until much later, though).
 
[RIP]Zeus;1030765058 said:
I have yet to get an answer from him on that. ;)

Well, for one, I made my post only a couple of hours ago, so excuse me if I don't leap at the first opportunity to log in and answer you.

Like I said in my prior post, with some of the rumors floating around, it looks as though the R600 will be in a better position to handle intensive DX10 titles like Crysis. My opinion, based on those rumors, is that it's plausible considering the shading power and memory bandwidth of the R600.

If the 512-bit external ring bus interface to the framebuffer is confirmed to be true, then the suggested "half a billion+" shader ops/sec figure becomes believable. ATI's not stupid; they know that efficient utilization of shader resources will be critical for DX10 performance. I believe that to be true.

If the numbers are correct, the R600 will be LEAPS ahead of the 8800GTX in terms of memory bandwidth and sheer shading power.

Is that good enough, or should you pull the string on my back so I can sing "Mammy" for you?
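
For a sense of what that bandwidth gap would look like if the rumour holds, here is a quick sketch. The 8800 GTX numbers (384-bit bus, 1.8 GT/s effective GDDR3) are its known spec; the R600 line assumes the rumoured 512-bit bus with roughly 2 GT/s effective GDDR4, which is purely illustrative.

```python
# Memory bandwidth = (bus width in bits / 8) * effective data rate in GT/s.
def bandwidth_gb_s(bus_bits: int, effective_ghz: float) -> float:
    return bus_bits / 8 * effective_ghz

gtx_bw  = bandwidth_gb_s(384, 1.8)   # 8800 GTX: 384-bit, 1.8 GT/s -> 86.4 GB/s
r600_bw = bandwidth_gb_s(512, 2.0)   # rumored R600: 512-bit, ~2.0 GT/s assumed -> 128 GB/s

print(f"8800 GTX      : {gtx_bw:.1f} GB/s")
print(f"R600 (rumored): {r600_bw:.1f} GB/s (~{(r600_bw / gtx_bw - 1):.0%} more)")
```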
 
Well, for one, I made my post only a couple of hours ago, so excuse me if I don't leap at the first opportunity to log in and answer you.

Like I said in my prior post, with some of the rumors floating around, it looks as though the R600 will be in a better position to handle intensive DX10 titles like Crysis. My opinion, based on those rumors, is that it's plausible considering the shading power and memory bandwidth of the R600.

If the 512-bit external ring bus interface to the framebuffer is confirmed to be true, then the suggested "half a billion+" shader ops/sec figure becomes believable. ATI's not stupid; they know that efficient utilization of shader resources will be critical for DX10 performance. I believe that to be true.

If the numbers are correct, the R600 will be LEAPS ahead of the 8800GTX in terms of memory bandwidth and sheer shading power.

Is that good enough, or should you pull the string on my back so I can sing "Mammy" for you?


Not really good enough for me, as you are still basing it on a RUMOR and not fact.

Until you get hard proof that it's FACT, saying something like this doesn't help you, as you are still basing it on rumor.
 
Well, for one, I made my post only a couple of hours ago, so excuse me if I don't leap at the first opportunity to log in and answer you.

Like I said in my prior post, with some of the rumors floating around, it looks as though the R600 will be in a better position to handle intensive DX10 titles like Crysis. My opinion, based on those rumors, is that it's plausible considering the shading power and memory bandwidth of the R600.

If the 512-bit external ring bus interface to the framebuffer is confirmed to be true, then the suggested "half a billion+" shader ops/sec figure becomes believable. ATI's not stupid; they know that efficient utilization of shader resources will be critical for DX10 performance. I believe that to be true.

If the numbers are correct, the R600 will be LEAPS ahead of the 8800GTX in terms of memory bandwidth and sheer shading power.

Is that good enough, or should you pull the string on my back so I can sing "Mammy" for you?

You can bet that the difference between R600 and even the "old" 8800 GTX will be minimal, if any. Why? Because the 8800 has been used during Crysis development, ever since it was released, actually. So Crysis is definitely not a good example with which to praise R600, nor is any other game being developed right now on the fastest, most feature-rich card on the market, i.e. the 8800 GTX.
So again, all this is nothing more than speculation. That's why I asked for the damn full specs of R600 in a previous post. Don't you think it's about time for AMD/ATI to reveal them?
 
You can bet that the difference between R600 and even the "old" 8800 GTX will be minimal, if any. Why? Because the 8800 has been used during Crysis development, ever since it was released, actually. So Crysis is definitely not a good example with which to praise R600, nor is any other game being developed right now on the fastest, most feature-rich card on the market, i.e. the 8800 GTX.
So again, all this is nothing more than speculation. That's why I asked for the damn full specs of R600 in a previous post. Don't you think it's about time for AMD/ATI to reveal them?

I think we've received specs in rumour form :/
64 SPs
512-bit memory bus
512MB-1GB of GDDR3 or GDDR4
:D ?
 