RIP ATi...

ryan_975 said:
first Celerons with no L2 cache
then Celerons with full-speed L2 cache while P2s had only half speed.
RDRAM
Prescott. Stupid business decisions from a company that had the brand recognition and loyalty to get away with them.
QFT

Now, what was that about Intel not making stupid business decisions? ;)
 
I can't believe no one is excited about the prospect of very powerful FPUs from the GPU that the CPU may have direct access to, if AMD decides to throw a GPU onto their wafers.
 
eno-on said:
I can't believe no one is excited about the prospect of very powerful FPUs from the GPU that the CPU may have direct access to, if AMD decides to throw a GPU onto their wafers.

Ahh, they're too busy fighting each other. That's why politicians always have a good laugh bending us over the table without lubricant.

*****

I mentioned socketed GPUs and so forth... right now ATI GPUs have something like 20x the performance in Folding@home (correct me if I'm wrong) as compared to a CPU.

Can anyone see the bigger picture? AMD has been speaking about, and leaking documents about, their ability to use HyperTransport for add-in math coprocessors and the like (certain math operations being the weakness of AMD CPUs in comparison to Intel). Think about it for a second. Now, I know having more interconnects that aren't hard-wired is a bad thing, but technology will overcome this (the new flip-chip land grid array cheese sandwich thing is nice too, ain't it?). We all know how powerful HyperTransport is. Now think about putting HyperTransport 2 links between an R600 chip and an AM3 socket. Yeah, are our pants getting a little wet now? XD
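Just to make the coprocessor idea concrete, here's a rough sketch of what it could look like from the software side. Every name in it (coproc_saxpy and friends) is made up for illustration - nobody has published a real API for this - and the "coprocessor" is stubbed out as a plain C loop so the thing actually compiles and runs:

Code:
/* Sketch of a CPU-side view of offloading a big FP kernel to a
 * hypothetical HyperTransport-attached coprocessor. The names are
 * invented; in a real Torrenza-style system the dispatch would go
 * over an HT link instead of being a local function call. */
#include <stdio.h>

#define N 1000000

/* Stand-in for the coprocessor: y = a*x + y over n floats.
 * On a GPU-style part this loop would be spread across dozens
 * or hundreds of FP pipes. */
static void coproc_saxpy(float a, const float *x, float *y, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

static float x[N], y[N];

int main(void)
{
    for (int i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    /* The CPU hands off the FP grunt work and keeps going. */
    coproc_saxpy(3.0f, x, y, N);

    printf("y[0] = %f\n", y[0]); /* expect 5.000000 */
    return 0;
}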

Not only this, but each socket will get its own memory interface, DDR2 or DDR3. Integrating and combining the A64's interface with ATi's Ring Bus architecture... wow. Think bandwidth here, people. STAGGERING BANDWIDTH. This is probably more of a server application at first.
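For a rough sense of scale (assuming I have the HT 2.0 spec right): a 16-bit link double-pumped at 1.4 GHz moves 2.8 GT/s, and 2.8 GT/s x 2 bytes = 5.6 GB/s in each direction, or 11.2 GB/s aggregate per link. Put a couple of those next to a dedicated DDR2/DDR3 interface per socket and the numbers add up fast.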

Also, consider the fact that more cores are now thought to be better. Intel is experimenting with 80-core chips. Look at GPU architecture now: each one of those pipelines is really nothing more than a simple multi-purpose FP CPU. Has anyone thought that AMD's Fusion might actually be just that? Creating a massively multi-threaded, multi-core CPU/FPU/GPU.
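The "lots of simple FP pipes" model is easy to sketch on an ordinary CPU today. Here's my own toy version (not anything from AMD), with four POSIX threads standing in for a GPU's hundreds of pipes - compile with -lpthread:

Code:
/* Each thread grinds through its own slice of an FP workload,
 * the way each GPU pipe owns a slice of the pixels. */
#include <pthread.h>
#include <stdio.h>

#define N        1000000
#define NTHREADS 4

static float data[N];

static void *fp_worker(void *arg)
{
    int id    = (int)(long)arg;
    int chunk = N / NTHREADS;

    for (int i = id * chunk; i < (id + 1) * chunk; i++)
        data[i] = data[i] * 0.5f + 1.0f;
    return NULL;
}

int main(void)
{
    pthread_t th[NTHREADS];

    for (long t = 0; t < NTHREADS; t++)
        pthread_create(&th[t], NULL, fp_worker, (void *)t);
    for (int t = 0; t < NTHREADS; t++)
        pthread_join(th[t], NULL);

    /* data starts zeroed, so every element becomes 0*0.5 + 1 = 1.0 */
    printf("data[0] = %f\n", data[0]);
    return 0;
}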

What CPU does this remind you of? Cell. Can anyone deny the awesome power of a Cell processor? AMD is thinking much, much larger than the computer graphics market. Architectures, both computer and structural, go through periods of increasing and decreasing complexity. Right now AMD is looking to decrease system complexity. It's going to merge these two components together for more than their two respective uses. AMD can use these GPUs as FPU co-processors just as IBM is using the FPU units in Cell.

AMD is trying to do something that is almost ESSENTIAL for its survival. With all these guys gunning for the CPU spot (IBM, Intel, and now even Nvidia and Microsoft), buying ATi is the smartest thing Daamit ever did. At least in this humble man's opinion.

Where's my bacon?
 
dderidex said:
Upside: Like the integrated memory controller, it's nice having an integrated graphics option with the CPU. It means there is less opportunity for the motherboard manufacturer to screw things up, and better CPU -> GPU communication.

Downside: Integrated graphics on the mobo are fine, as they are not (much) wasted space when you DO have a discrete graphics card in place. They are just a nice bonus when you are 'between cards'. Having it on the CPU, though... I mean, it's always going to *be* there... taking up space... creating heat... even when not really in use!

Upside?: Now, with an integrated graphics chip presumed in 100% of systems (instead of the current ratio)... could they do something like Intel is doing with laptops? I.e., have a low-power integrated solution AND a discrete GPU, using one in 2D mode and the other in 3D mode? It would be interesting to see the solution needed to arrange that, BUT... it would also be very cool. PCs, when idle, would drop power consumption to a FRACTION of what they are now. It has been said that replacing 3x 60W incandescent lights with corresponding-brightness CFLs would - if done in each US household - reduce pollution by the same amount as taking 3.5 million cars permanently off the road. If PCs could, instead, reliably reduce idle usage by 100W or more? Niiiiiiiiiiiiiiice....

Roger that!
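For scale: 100 W saved around the clock is 100 W x 8,760 hours, or about 876 kWh a year per PC - somewhere around $80 at typical US electricity rates, times however many million machines sit idle all day.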
 
byne said:
Aye, I have also heard some talk about CPU/GPU unification bringing PCs closer to the performance of consoles, since the biggest difference today is the physical distance between CPU and GPU.

You mean laughably bad performance? Yep, I'm looking forward to that.

My view on the situation:
The companies must be in dire straits. Both of them completely missed the boat on this generation of products. AMD is getting their ass handed to them on a silver platter by the Conroe series of CPUs in the desktop market and has yet to have any physical product to compete. ATI, while their cards offer the same great performance they always have, completely (IMO) missed the boat with Crossfire. nVidia made SLI simple: you flip a switch in the BIOS and use a bridge clip. With Crossfire you need to make sure you have the right board for it (the Xpress 200 chipset if you want to skip the master card), then wade through the depths of the internet trying to find something that is actually labeled as a master card, and then connect the cards with an external dongle, as if you didn't already have enough crap coming out of your computer.

Anyway, that's my 2 cents :)
 
With Crossfire you need to make sure you have the right board for it (the Xpress 200 chipset if you want to skip the master card), then wade through the depths of the internet trying to find something that is actually labeled as a master card, and then connect the cards with an external dongle, as if you didn't already have enough crap coming out of your computer.

I love my new X1950 Pros: single-slot design... nice two-way interconnects on the INSIDE... no external dongle. And in WoW at 1920x1200 with 4x AA (temporal enabled), adaptive AA set to quality, and 16x HQ AF, I get 25 to 60 FPS, so I think it works pretty nicely :D
 
I appreciate you two voicing opinions without insulting each other. :)

I'm really happy this thread got off its feet. 2000 views, keep it up!
Mm... I can't wait to fold on a Fusion.


While I agree they are in dire straits, I definitely see this being a good move. What do you think of the theories I suggested a few posts up?
 
DaBoonies said:
I love my new X1950 Pros: single-slot design... nice two-way interconnects on the INSIDE... no external dongle.

Now that's music to my ears; however, it's still too little, too late, IMO.


psikoticsilver said:
While I agree they are in dire straits, I definitely see this being a good move. What do you think of the theories I suggested a few posts up?

I'm way too tired to be able to make sense of that right now; however, expect a post from me in a few hours after I take a nap :)

That's_Corporate said:
So... if I buy a PS3 (nVidia makes the GPU), will my X800 XT go on strike and explode itself?

Yes.
 