Next Nvidia / Ati Cards? When?

Nirad9er

2[H]4U
Joined
Feb 18, 2004
Messages
2,956
I know there might have been a thread on this, but when does everyone think the next-gen Nvidia/ATI cards will be coming out, and what will they consist of? I know no one will know till it happens, but what are the rumors as of right now? Us desktop users now have some competition from DTRs equipped with X800 Go / 6800 Go Ultras. Desktops are supposed to be much more powerful and superior. SO... WHAT DOES EVERYONE THINK?
 
imo i think ati will bomb out on their high-end cards again, like they did with the XT PE cards
 
they'll show off their cards some time in the fall maybe,
but it will be like a year before they hit the stores.
ATI will have the better card
 
Don't be silly, everyone knows beyond a shadow of doubt that 3dfx is going to win this round with their Nvidia/ATI killer product lines. :rolleyes:
 
tornadotsunamilife said:
i'm interested in how you have reached this decision so early on
:D Reality doesn't matter, he's already made up his mind.
----

From the sounds of it, both should be previewing the next generation soon. ATI has already hinted that the R520 will be announced soon and NVIDIA has some new chips listed (G70, NV48) in the latest leaked drivers.

When will actual cards be available? I wouldn't hold my breath for either one.
 
sandbox said:
Don't be silly, everyone knows beyond a shadow of doubt that 3dfx is going to win this round with their Nvidia/ATI killer product lines. :rolleyes:
What's going on? 3dfx? Are you serious?
 
tornadotsunamilife said:
i'm interested in how you have reached this decision so early on

I believe it's because ATI's next card is supposed to be their first true nextgen card, rather than an update of the three year old 9700 core.

I believe it's also because the people designing this new core are the exact same ones who blew Nvidia out of the water with their last core, the 9700..... A core that Nvidia has found it impossible to definitively surpass, or even equal in most regards, in the last 3 years. And this is despite ATI having not released a truly new core in that time while Nvidia has done so.

Nvidia is seriously going to have to step up to the plate with their next card because it's pretty certain when ATI drops their next core on us it will be a true revolution, just like the original Geforce and the 9700 were.

(BTW, I am not saying I think that ATI will be faster, I really don't know. I just know ATI will be damn fast and Nvidia is going to have to bring something special to beat ATI.).
 
I think Nvidia does have something up their sleeve; that's why they didn't release an NV48. They are using the time to work on something else.
 
arentol said:
I believe it's because ATI's next card is supposed to be their first true nextgen card, rather than an update of the three year old 9700 core.

I believe it's also because the people designing this new core are the exact same ones who blew Nvidia out of the water with their last core, the 9700..... A core that Nvidia has found it impossible to definitively surpass, or even equal in most regards, in the last 3 years. And this is despite ATI having not released a truly new core in that time while Nvidia has done so.

Nvidia is seriously going to have to step up to the plate with their next card because it's pretty certain when ATI drops their next core on us it will be a true revolution, just like the original Geforce and the 9700 were.

(BTW, I am not saying I think that ATI will be faster, I really don't know. I just know ATI will be damn fast and Nvidia is going to have to bring something special to beat ATI.).

i wouldn't be so optimistic. ATI has had manufacturing problems since the X8xx launch and still suffers from them.
Even with the new R520 the world won't change immediately. They need a whole lineup of new products. Right now they are losing market share to Nvidia.
 
Ideas:

ATi will have the better card since nVidia doesn't have to have the best card with their SLI setup.

ATi might also have the best card, but one that's not available to purchase, so nVidia will have the best card you can actually buy.

that's it for now
 
rcf1987 said:
I think Nvidia does have something up their sleeve; that's why they didn't release an NV48. They are using the time to work on something else.


i would have to somewhat agree with you on that... but we won't really know till they start to show...

i just have this feeling ati will bomb out like they did with the x800 and x850... sorry, but telling your customers they're canceling your order is quite bullshit... but what can you do... ati is after the OEM market... and they couldn't care less about their [H] customers... oh well, i guess you can't please them all
 
[RIP]Zeus said:
i would have to somewhat agree with you on that... but we won't really know till they start to show...

i just have this feeling ati will bomb out like they did with the x800 and x850... sorry, but telling your customers they're canceling your order is quite bullshit... but what can you do... ati is after the OEM market... and they couldn't care less about their [H] customers... oh well, i guess you can't please them all


I love people who say this. When the X800 line was announced I did some comparisons between it and the 6800, made my choice, went to GameVE, got a VisionTek X800 XT PE, and had it on my doorstep in 4 days. This was way back in June. The cards were never ghosts. There was simply a limited supply due to the price range and the number ordered by each supplier.

You can't say ATI had problems with the X800 PE and not say Nvidia had the same with the 6800 Ultra. They both underestimated the market a little. If you compare the earnings for both companies they have switched places repeatedly, and ATI has made massive gains on Nvidia over the last 2-3 years. Technically, things couldn't be better for ATI.

ATI, I am quite sure, will hit the same mark they did last year for a new card, around the first of June. I think this because the R520 has already taped out. As far as the G70 goes, who knows. I'd hope not a month or two after June, as they will lose sales if ATI is the only thing available. Whoever said one more year needs to look at the trends over the last 4 years: neither company has ever gone two years, or even a year and a half, before releasing a brand-new chip.
 
arentol said:
I believe it's because ATI's next card is supposed to be their first true nextgen card, rather than an update of the three year old 9700 core.

I believe it's also because the people designing this new core are the exact same ones who blew Nvidia out of the water with their last core, the 9700..... A core that Nvidia has found it impossible to definitively surpass, or even equal in most regards, in the last 3 years. And this is despite ATI having not released a truly new core in that time while Nvidia has done so.

Nvidia is seriously going to have to step up to the plate with their next card because it's pretty certain when ATI drops their next core on us it will be a true revolution, just like the original Geforce and the 9700 were.

(BTW, I am not saying I think that ATI will be faster, I really don't know. I just know ATI will be damn fast and Nvidia is going to have to bring something special to beat ATI.).

After reading your post I had some questions and statements for you before you make your full decision on this.

Just because it's their first new core design since the 9700, you believe it will smoke the competition for that reason alone? :eek: With that in mind, remember that Nvidia had great success with the original GeForce cores like the GF3, and that their overall core design did not change until the creation of the FX core. That was also designed by the same group that did the earlier GeForce cores, and look how poorly the FX cores ran. (Not completely the fault of Nvidia, but that's another rant.)

As far as the 9700 core not being surpassed by anything Nvidia has made, that's incorrect. The newest of the cores, the NV40 (GeForce 6800/GT/Ultra), has well and truly surpassed the 9700 core. The NV40 is doing more with fewer clock cycles than any of the ATI cards, and it kicks the snot out of a 9700.

The next-gen cards from Nvidia will also be a new core based on a new process, not just a clocked-up version of cores that exist now. One other thing to remember is that Nvidia has done things ATI has never done: Nvidia pioneers and pushes new technology, things like Shader Model 3.0. As far as who will make the best or the fastest, we will see; only time will tell. Neither you nor anyone here can say what is to come or how the end product from either of these manufacturers will turn out. Just not enough facts. I think ATI will bring the heat, but I still foresee big issues with high-end cards from them. And don't worry, Nvidia has plenty up their sleeves to bring something to the table that could beat anything ATI does (or at least perform equal to it). The nice thing about competition like ATI and Nvidia have with one another is that it makes everything better for us, price- and performance-wise.
 
mohammedtaha said:
Ideas:

ATi will have the better card since nVidia doesn't have to have the best card with their SLI setup.

ATi might also have the best card, but one that's not available to purchase, so nVidia will have the best card you can actually buy.

that's it for now

I think the SLI part of your comment will mean nothing, because ATI is coming out with their own version of SLI. It will still boil down to who makes the baddest, fastest monster on the block while still being usable in SLI as well.
 
Captin Insano said:
I think the SLI part of your comment will mean nothing, because ATI is coming out with their own version of SLI. It will still boil down to who makes the baddest, fastest monster on the block while still being usable in SLI as well.


The other question is: what mobos will you have to buy to do ATI's SLI?
 
I don't know, I have a newfound respect for NVidia's engineering department.
The FX series wasn't the "inferior" architecture everyone thought it was. Rather, the two companies had different visions for their products: ATI thought simple but fast would be better, whereas NVidia put its money on slow and intricate, getting power out of its complexity. ATI's approach was easier, so it caught on. NVidia managed to completely turn around its approach in only two product cycles, the FX5900 and FX5950, while ATI "refined" the Radeon 9700/R300 core in the R350 and R360. Now NVidia gets to sit back and similarly "refine" the NV40 while ATI reinvents its architecture with Fudo. NVidia's been on top since the GeForce256, but 3DFX was also king of the hill with its Voodoos once; how many generations of inferior cards did ATI wade through before striking gold with R300?
 
Just wondering....

Are any of the next-gen cards going to be out for AGP, or is this gen's 6800 the last of the line?

-wil
 
This be my Theory.

People have complained about how ATI isn't making any cards (or at least, they're in low supply). In my opinion, this is actually somewhat of a cover-up. You see, the R520 supposedly has this amazing new tech, which of course could not have been developed within a year. So what does ATI do? Take one of their best cores (9700) and upgrade it, per se, into the X8** lineup. However, they are still in the process of developing this amazing new tech, hence a lot of their resources are put toward that rather than toward making the X8** lineup. So what do I think? In a few months ATI will announce the R520. And it will blow people away.

(For the record, I have always found it easier to find X8** cards here in Canada than it was to find Nvidia's 6 series (namely GTs and Ultras); also, the Nvidia cards always seem to be WAY above CDN MSRP, while I have found a lot of ATI cards at MSRP or even under.)
 
pigwalk said:
Just wondering....

Are any of the next gen card going to be out for AGP or is this gen's 6800s the last of the line?

-wil


AGP is dying and will be dead by this year's end; any new cards will most likely be PCI-e.
 
sandbox said:
Don't be silly, everyone knows beyond a shadow of doubt that 3dfx is going to win this round with their Nvidia/ATI killer product lines. :rolleyes:

Just got the Voodoo 3... can't wait to see how many 3DMarks I get... ;)
 
mavalpha said:
I don't know, I have a newfound respect for NVidia's engineering department.
The FX series wasn't the "inferior" architecture everyone thought it was. Rather, the two companies had different visions for their products: ATI thought simple but fast would be better, whereas NVidia put its money on slow and intricate, getting power out of its complexity. ATI's approach was easier, so it caught on. NVidia managed to completely turn around its approach in only two product cycles, the FX5900 and FX5950, while ATI "refined" the Radeon 9700/R300 core in the R350 and R360. Now NVidia gets to sit back and similarly "refine" the NV40 while ATI reinvents its architecture with Fudo. NVidia's been on top since the GeForce256, but 3DFX was also king of the hill with its Voodoos once; how many generations of inferior cards did ATI wade through before striking gold with R300?

Simple? Picture quality between the 5900 and the 9700, let alone anything else, was like night vs. day; ATI was that much clearer. And the problems with the 5200-5950 FX line are still apparent today. The D3D architecture on the card is fubared. They half-assed it, so every time vertex shaders were used the card's performance dropped massively, although it wasn't supposed to. The DX of choice for these cards to get the best performance is still 8.1 for games today, HL2 included. If this is intricate, then get it away from me.

Captin Insano said:
AGP is dying and will be dead by this year's end; any new cards will most likely be PCI-e.


AGP will be around for YEARS. Plain PCI cards, not PCI Express, are still among the biggest sellers in the UK. I'd bet my life and a new card of anyone's choice on AGP lasting at least another 2 years of manufacture. AGP is still MUCH MUCH more widely popular than PCIe in gaming computers today as well. Just because you don't use it doesn't make it obsolete for everyone. It hasn't even been saturated in terms of bandwidth.
 
mavalpha said:
I don't know, I have a newfound respect for NVidia's engineering department.
The FX series wasn't the "inferior" architecture everything thought it was. Rather, the two companies had different visions for their products: ATI thought simple but fast would be better, whereas NVidia put its money on slow and intricate, but getting power out of its complexity. ATI's approach was easier, so it caught on. NVidia managed to completely turn around their approach in only two product cycles, the FX5900 and FX5950, while ATI "refined" the Radeon 9700/R300 core in the R350 and R360. Now NVidia gets to sit back and similarly "refine" the NV40 while ATI reinvents its architecture with Fudo. NVidia's been on top since the GeForce256, 3DFX was also king of the hill with its Voodoos; how many generations of inferior cards did ATI wade through before striking gold with R300?
That's a load of bullshit.
Simple? Throw SM2 code at a focking 9600 Pro (purposely a midrange card, against a high-end card) and a 5900, and tell me which one is faster. And don't tell me about quality, since the 5900 is going to have to run FP16 for decent performance, and FP24 shows no artifacting even in HL2 and Far Cry.
ATI was obviously right, seeing as today's DX9 games either run like ass on FX cards or run them in DX8.1 mode since they're so slow.
The turning point for ATI was the 8500; by early 2002 the drivers were good (mine from January are fantastic) and it was faster than the Ti 500, with more features.
 
Shifra said:
That's a laugh. AGP will be around for YEARS.
True, but only new chips that are retrofitted with a bridge chip will have AGP support. From the looks of things so far, don't expect AGP to have *full* support for very long.
 
Hate_Bot said:
This be my Theory.

People have complained about how ATI isn't making any cards (or at least, they're in low supply). In my opinion, this is actually somewhat of a cover-up. You see, the R520 supposedly has this amazing new tech, which of course could not have been developed within a year. So what does ATI do? Take one of their best cores (9700) and upgrade it, per se, into the X8** lineup. However, they are still in the process of developing this amazing new tech, hence a lot of their resources are put toward that rather than toward making the X8** lineup. So what do I think? In a few months ATI will announce the R520. And it will blow people away.

(For the record, I have always found it easier to find X8** cards here in Canada than it was to find Nvidia's 6 series (namely GTs and Ultras); also, the Nvidia cards always seem to be WAY above CDN MSRP, while I have found a lot of ATI cards at MSRP or even under.)


That's a pretty big stretch on what you think ATI is/has done as far as a cover-up. I suppose you also don't believe we have been to the moon. (Must be a conspiracy theorist.) As far as prices, most of the high pricing you see is not because of the manufacturers; it's the retailers. They find a way to gouge and crank up the prices. If you're smart and look around, you will find plenty of them at MSRP, no problem. All we can say is we will see. But I don't expect much.
 
mavalpha said:
I don't know, I have a newfound respect for NVidia's engineering department.
The FX series wasn't the "inferior" architecture everyone thought it was. Rather, the two companies had different visions for their products: ATI thought simple but fast would be better, whereas NVidia put its money on slow and intricate, getting power out of its complexity. ATI's approach was easier, so it caught on. NVidia managed to completely turn around its approach in only two product cycles, the FX5900 and FX5950, while ATI "refined" the Radeon 9700/R300 core in the R350 and R360. Now NVidia gets to sit back and similarly "refine" the NV40 while ATI reinvents its architecture with Fudo. NVidia's been on top since the GeForce256, but 3DFX was also king of the hill with its Voodoos once; how many generations of inferior cards did ATI wade through before striking gold with R300?

One of the other things to remember, along with what you are saying here, is that Nvidia had a whole different language planned than what it ended up being forced into using. It would of course have had full OpenGL support, but Nvidia had originally intended for their own language, an N-language for the FX cards, to get support and be run. Instead MS decided at the last minute not to support it in the Windows environment, went with a newer version of DirectX, and left Nvidia out in the rain, while ATI got the inside track on DX9 support with MS. So besides the revisions in the actual hardware, there was also the extreme talent of Nvidia's software team in getting performance out of its drivers.
 
Captin Insano said:
One of the other things to remember, along with what you are saying here, is that Nvidia had a whole different language planned than what it ended up being forced into using. It would of course have had full OpenGL support, but Nvidia had originally intended for their own language, an N-language for the FX cards, to get support and be run. Instead MS decided at the last minute not to support it in the Windows environment, went with a newer version of DirectX, and left Nvidia out in the rain, while ATI got the inside track on DX9 support with MS. So besides the revisions in the actual hardware, there was also the extreme talent of Nvidia's software team in getting performance out of its drivers.


Not to be a jerk, but do you have any proof or a link that there was something MS said they'd do exclusively for Nvidia, or that these cards included something like this "N-language"? Cause that sounds like a load of BS to me, and it's definitely the first time I have ever heard that as an excuse for the bad performance of these cards, from anywhere. Forgive me if I don't take just your word =p
 
Shifra said:
Not to be a jerk, but do you have any proof or a link that there was something MS said they'd do exclusively for Nvidia, or that these cards included something like this "N-language"? Cause that sounds like a load of BS to me, and it's definitely the first time I have ever heard that as an excuse for the bad performance of these cards, from anywhere. Forgive me if I don't take just your word =p
QFT.
I call bullshit.
Even with a new language, you can't get floating-point precision performance from nowhere.
So stop being an nvidia whore; OGL or DX9, the FX is a flop, and no language is going to magically boost the FP performance.
The FX seems to have been designed as a DX8.1 video card with DX9 as an afterthought.
 
Shifra said:
Cause that sounds like a load of BS to me, and it's definitely the first time I have ever heard that as an excuse for the bad performance of these cards, from anywhere. Forgive me if I don't take just your word =p
Sounds the same to me.

There is a little truth in it though. For example, nvidia practically dictated what new DX8.0 features were because they were the first to implement pixel shaders and programmable vertex shaders in hardware for the PC (or at least actually make cards with those chips). Of course nvidia didn't invent PS or VS, but implementing them in hardware, creating a specific shader language and having MS adopt it made it a standard. Others would have to follow that as a minimum. But that's not the end... MS added ATI specific upgrades to DX8.1 (PS1.4, nvidia only supported 1.0-1.3).

And in the same way, MS pushed aside some suggestions from nvidia for PS2.0 and adopted ATI's suggestions instead. Why? Maybe because nvidia messed up, or ATI's specifications were better? Who knows. And it's not exactly like MS shut out nvidia. DX9.0 also contained nvidia specific features like PP and 32-bit FP support. And like before with ATI, MS adopted more nvidia specific features in an update (9.0c).

The NV35 and higher FX series (5700/5900/5950) were ok, but DX9 performance was still not up to par. There probably is a low level reason why the NV35/38 lagged so badly in DX9 performance (the NV35 was significantly faster than the NV30), and likely that is caused by the lack of influence they had over the DX9 spec.
 
Shifra said:
Not to be a jerk but do you have any proof or link that there was something MS said they'd do exclusively for Nvidia, or that these cards included something like this "N-language"? Cause that sounds like a load of BS to me and its definitly the first time i have ever heard that as an excuse for the bad performance of these cards from anywhere. Forgive me if i dont take just your word =p


Not that I would expect you to take my word for it, but I'm pretty sure my good friend isn't lying to me. He is the general manager for the entire OpenGL code at Nvidia. But if I can still find some info on it, I will post it. Just remember that just because you didn't hear about it doesn't mean it's bullshit. I have seen several threads where everyone yells BS and then out of nowhere comes a link or an article that proves it, and no one had a clue. But I will see what I can find.
 
Captin Insano said:
That's a pretty big stretch on what you think ATI is/has done as far as a cover-up. I suppose you also don't believe we have been to the moon. (Must be a conspiracy theorist.) As far as prices, most of the high pricing you see is not because of the manufacturers; it's the retailers. They find a way to gouge and crank up the prices. If you're smart and look around, you will find plenty of them at MSRP, no problem. All we can say is we will see. But I don't expect much.

I'm not a conspiracy theorist; this is more like business. I can't really find any reason why ATI isn't making as many cards as they could except for this sole one. Nvidia's chipset is more complex, so it doesn't have to do with the complexity of the card. Plus, when you actually know about marketing schemes and big business cover-ups, it doesn't seem that complex. After all, Windows ME was a mockup of 98; they released it because it was taking longer than they thought to release XP (Windows, I mean).
 
pxc said:
Sounds the same to me.

There is a little truth in it though. For example, nvidia practically dictated what new DX8.0 features were because they were the first to implement pixel shaders and programmable vertex shaders in hardware for the PC (or at least actually make cards with those chips). Of course nvidia didn't invent PS or VS, but implementing them in hardware, creating a specific shader language and having MS adopt it made it a standard. Others would have to follow that as a minimum. But that's not the end... MS added ATI specific upgrades to DX8.1 (PS1.4, nvidia only supported 1.0-1.3).

And in the same way, MS pushed aside some suggestions from nvidia for PS2.0 and adopted ATI's suggestions instead. Why? Maybe because nvidia messed up, or ATI's specifications were better? Who knows. And it's not exactly like MS shut out nvidia. DX9.0 also contained nvidia specific features like PP and 32-bit FP support. And like before with ATI, MS adopted more nvidia specific features in an update (9.0c).

The NV35 and higher FX series (5700/5900/5950) were ok, but DX9 performance was still not up to par. There probably is a low level reason why the NV35/38 lagged so badly in DX9 performance (the NV35 was significantly faster than the NV30), and likely that is caused by the lack of influence they had over the DX9 spec.


See, I think your idea of one card being favored is skewed. Until someone shows otherwise, MS has never favored anyone. These things you name, for example Shader Model 3.0 and 32-bit floating point, are things Microsoft chose to implement; Nvidia simply went and used them faster. If you'll notice, the next ATI card will be using the exact same two things, as well as HDR, which you left out. These are Microsoft DirectX features, not Nvidia features. Nvidia is simply the only company that chose to use them as soon as possible. Personally I think it cost them the performance lead, and they really didn't gain much visually either. Was it worth it? Right now I think not. Do not forget Microsoft is a multi-BILLION dollar company; in terms of the DirectX engine, they offer things up and the graphics card companies collectively follow, not the other way around. I personally see OpenGL dying as a graphics engine for computer games, given the features DX continues to support. OpenGL can sure as hell look pretty, that's for damn sure as Doom 3 proves, but one engine for all is simpler, and let's face it, DX9 isn't looking bad either; take a look at F.E.A.R. or Far Cry. Within this last year, more and more upcoming games, including ones on Unreal Engine 3, are making the transition to DX, and fewer and fewer still use OpenGL. This means Nvidia had better start taking up the slack, because I personally want to be playing the hottest titles at the fastest frames, and in DX that has been ATI for long enough to prove they know what they're doing.
 
Shifra and Moloch, you may find this article interesting; it's a "post mortem" of R3x0 and NV3x, if you will.
If you don't want to read the whole thing, a few highlights:
_2.htm said:
The NV3x architecture is built for maximum flexibility, and that flexibility comes at the price of overall speed in standard pixel shading operations...
The ability to handle both FP16 and FP32 does give a very significant degree of flexibility to NVIDIA hardware. I do agree with NVIDIA that partial precision is a valid function. Why waste bandwidth and space internally in a chip if a shader will not benefit from higher precision? From a developers view this is a bit different, as they have to code their shaders to do either FP16 or FP32, depending on the needs. ATI and DX9 basically allow the developers to code entirely in FP24, which gives a good tradeoff in current generation applications with speed and precision.
_3.htm said:
So, in current applications, the overall VS throughput using VS 2.0 shaders will be significantly faster on the ATI parts, not only due to the extra VS unit, but also to the decreased complexity of the ATI units being able to process the VS 2.0 based shaders faster. The more complex NVIDIA VS units are not as fast per clock, but their complexity approaches that of the VS 3.0 specification. Here the NVIDIA part is more future proof, but due to this increased complexity, the performance is not as fast as the simpler and more focused ATI part...
ATI’s approach is simply brute rendering force, with lots of pixel shaders, vertex shaders, and bandwidth. NVIDIA appears to be laying the foundation for future products that can handle highly complex shaders, unfortunately that philosophy is hurting them in the short run. Right now ATI is running with its advantage in PS/VS 2.0 performance. This may not be the case in the future though.
Now I just need to find articles that explain how NV40 is a simpler architecture than NV3x. :rolleyes:
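To put a rough number on the FP16/FP32 tradeoff the article describes, here's a minimal Python sketch (not from the article; `to_fp16` is just an illustrative helper that round-trips a value through IEEE 754 half precision, which is what FP16 partial precision uses, and this is obviously not shader code):

```python
import struct

def to_fp16(x):
    # Round-trip a value through IEEE 754 half precision (FP16):
    # an 11-bit significand, vs. 24 bits for FP32.
    return struct.unpack('<e', struct.pack('<e', x))[0]

# A sub-texel offset on a 4096-wide texture, added to a coordinate
# near 1.0, is smaller than half of FP16's spacing there (2^-10),
# so it rounds away entirely.
x, step = 1.0, 1.0 / 4096

print(to_fp16(x + step) == x)  # True: the offset is quantized away in FP16
print((x + step) == x)         # False: higher precision keeps it
```

That kind of quantization is exactly what a developer has to weigh when marking a shader for partial precision, which is why the flexibility NVIDIA offered came with a per-shader judgment call.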
 
mavalpha said:
Shifra and Moloch, you may find this article interesting; it's a "post mortem" of R3x0 and NV3x, if you will.
If you don't want to read the whole thing, a few highlights:


Now I just need to find articles that explain how NV40 is a simpler architecture than NV3x. :rolleyes:
Um, no thanks.
I don't care what some article says; I can look at benchmarks and see how shitty the card is.
What's the point of being able to run more instructions if you can't even run the standard amount with any kind of speed?
Linking to articles about the FX is focking pointless; it sucks no matter what anyone says.
 
Don't get me wrong, I never said the FX series was good. I just said it was innovative and a clear step in one direction, while the 9700/9800 was innovative for different reasons and a step in the opposite direction. I am not trying to defend NVidia's decisions behind NV3x; I am defending them for an amazing turnaround, making a full 180 between NV3x and NV4x in a meager year and a half.
 
mavalpha said:
Don't get me wrong, I never said the FX series was good. I just said it was innovative and a clear step in one direction, while the 9700/9800 was innovative for different reasons and a step in the opposite direction. I am not trying to defend NVidia's decisions behind NV3x; I am defending them for an amazing turnaround, making a full 180 between NV3x and NV4x in a meager year and a half.
Innovative doesn't make up for slow.
There is nothing positive about the FX, except perhaps people seeing that ATI can whup Nvidia.
 
And another !!!!!! thread.

F-A-N-B-O-Y is a forbidden word around here? lol
 
ati had a separate company design their last nvidia-killer core... ok, they're making this one themselves; anyone remember the last time ati made their own core that did any good? :rolleyes:
i am rooting for ati this time because i want the Xbox 2 to own the PS3, with its super-hyped Cell processor and nvidia core
 