RIP ATi...

psikoticsilver

Weaksauce | Joined: Oct 6, 2004 | Messages: 92
www.ati.com --> www.ati.amd.com

RIP ATi...
I hope your graphics cards continue to put little wet spots in my pants. <3
Let's all bow our heads and mourn. Let us pray that when the revolution comes, our new rulers at Nvidia Corporation treat us nicely even though we liked HL2. :)

Jokes aside now (I love my ATi and Nvidia cards, so this is not a flame thread), I'll move on to my real point. I read an article today about AMD's new Fusion program. What do you guys think about the ramifications of AMD incorporating graphics chips into their CPUs? Personally I think, at least for the initial flavors, they could only be comparable to the integrated chips we have at the moment (up to date for the time at which they're released, of course).

Power specs could get better, fewer clunky PCI slots, smaller PCs... there are some good possibilities here.
Imagine the bandwidth you could provide a GPU over HyperTransport... *drool* (rough numbers in the sketch below)
Also imagine what sorts of memory we'll have to supply these new CPU/GPU cores...
On the flip side, it could hurt overclockers. We'll adapt, as always. :)
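
To put some rough numbers on the HyperTransport point (just a back-of-the-envelope sketch; the 16-bit HT 2.0/3.0 link clocks and the PCIe 1.x figures below are my own assumptions from the published specs, not anything AMD has said about Fusion):

Code:
# Rough per-direction link bandwidths. A HyperTransport link is double-pumped
# (2 transfers per clock) and 2 bytes wide per direction in its common 16-bit
# form; a PCIe 1.x lane runs 2.5 GT/s with 8b/10b encoding overhead.

def ht_gb_per_s(clock_ghz, width_bytes=2):
    return clock_ghz * 2 * width_bytes          # GHz * transfers/clock * bytes

def pcie1_gb_per_s(lanes=16):
    return lanes * 2.5 * (8 / 10) / 8           # GT/s * coding efficiency / 8 bits

print(f"HT 2.0 (1.4 GHz, 16-bit): {ht_gb_per_s(1.4):.1f} GB/s each way")   # 5.6
print(f"HT 3.0 (2.6 GHz, 16-bit): {ht_gb_per_s(2.6):.1f} GB/s each way")   # 10.4
print(f"PCIe 1.x x16:             {pcie1_gb_per_s():.1f} GB/s each way")   # 4.0

So even a single 16-bit HT link is in the same ballpark as, or ahead of, a PCIe x16 slot, before latency even enters the picture.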

How long do you guys think ATi will actually continue to make GPU chips specifically designed to compete in the add-in market?
 
WOW. I'm just going to sit back and watch this thread horribly deteriorate and get locked...
 
Well, it's nice that you have an opinion. That was a rough-draft post; I was getting kicked off my computer for a minute. It's been revised.
 
Holy FUD Batman!

1) AMD will continue pushing boundaries in the discrete video card sector.

2) AMD will begin working on graphics embedded within the processor for mainstream OEM PCs and notebooks.

This merger is a great thing for both companies. Moreover, if anyone ends up in the hot seat, it will be nVidia.
 
I like the fact that I might only have to start buying ONE waterblock ;).

Eh, I don't think we're gonna see anything "high end" for QUITE some time... An R600 plus an FX-62 or something would generate MASSIVE amounts of heat... Like, 300W of heat...


Which, of course, is bad!
 
To answer your last question first: they'll be making expansion graphics boards for a long time to come. AMD would be shooting themselves in the foot if they stopped the number one or number two graphics card maker from making cards; that's not why they bought ATI. It will be strange seeing AMD-branded video cards, but I'm looking forward to (hopefully) seeing a DirectX 10 AMD X2000. ATI isn't gone, just reborn. And it's better for both companies; I fully expect to see more efficient, cooler processors now. Perhaps we could even get a dual-core GPU?

As far as building the GPU into the CPU goes, it would be good for budget rigs, but I'm scared of how much it would cost to upgrade those. You would essentially be making two upgrades at once. On the positive side, you wouldn't have bandwidth problems, as the GPU is right on the CPU die. I like that idea. I see the first ones maybe having the performance of the 9200 or 9600 chipset and moving up from there. With the requirements of Vista, I would keep a graphics chip that can run Aero as the bare minimum for the integrated part.
 
the Fusion processor will definitely usher in a new generation of low-power, high-performance laptops. Crysis 2 over lunch, anyone?
 
Arcygenical said:
I like the fact that I might only have to start buying ONE waterblock ;).

Eh, I don't think we're gonna see anything "high end" for QUITE some time... An R600 plus an FX-62 or something would generate MASSIVE amounts of heat... Like, 300W of heat...


Which, of course, is bad!

not for me! i don't have a heater in my room; i rely on my computer to keep me from dying. 3DMark at night keeps me toasty.

j/k :p
 
psikoticsilver said:
Yanno byne, I didn't even think of its use in laptops.

Holy crap this could be kinda cool.
aye, i have also heard some talk about CPU/GPU unification bringing PCs closer to the performance of consoles, since the biggest difference today is the physical distance between the CPU and GPU.
 
Kind of maybe a stupid comment, but did you notice the URL? www.ati.amd.com. Sounds like ATI isn't going to die but become one with the universe. It is exciting to see AMD building their own chipsets, and the CPU/GPU chip is an intriguing possibility.
 
psikoticsilver said:
Jokes aside now (I love my ATi and Nvidia cards, so this is not a flame thread), I'll move on to my real point. I read an article today about AMD's new Fusion program. What do you guys think about the ramifications of AMD incorporating graphics chips into their CPUs? Personally I think, at least for the initial flavors, they could only be comparable to the integrated chips we have at the moment (up to date for the time at which they're released, of course).

Upside: Like having an integrated memory controller, it's nice having an integrated graphics option with the CPU. It means there is less opportunity for the motherboard manufacturer to screw things up, and better CPU -> GPU communication.

Downside: Integrated graphics on the mobo are fine, as they are not (much) 'wasted space' when you DO have a discrete graphics card in place. They are just a nice bonus when you are 'between cards'. Having it on the CPU, though... I mean, it's always going to *be* there... taking up space... creating heat... even when not really in use!

?Upside?: Now, with an integrated graphics chip presumed (in 100% of systems, instead of the current ratio)... could they do something like Intel is doing with laptops? I.e., have a low-power integrated solution AND a discrete GPU, using one in 2D mode and the other in 3D mode? It would be interesting to see the solution needed to arrange that, BUT... it would also be very cool. PCs, when 'idle', would drop power consumption to a FRACTION of what it is now. It has been said that replacing 3x 60W incandescent lights with CFLs of corresponding brightness would - if done in every US household - reduce pollution by the same amount as taking 3.5 million cars permanently off the road. If PCs could, instead, reliably cut 'idle' usage by 100W or more? Niiiiiiiiiiiiiiice... (quick numbers below)
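
Just to gut-check the scale of that idle-power idea (all the figures below are my own assumptions, not measurements or anything from a study):

Code:
# Quick sanity check on idle power savings (assumed figures, not measured data).
idle_savings_w     = 100   # watts shaved off idle draw, per the post above
idle_hours_per_day = 20    # assume the box sits idle most of the day

pc_kwh_per_year = idle_savings_w * idle_hours_per_day * 365 / 1000    # ~730 kWh

# Compare: swapping three 60W incandescents for ~18W CFLs, run 5 hours a night.
cfl_kwh_per_year = 3 * (60 - 18) * 5 * 365 / 1000                     # ~230 kWh

print(f"PC idle savings:  ~{pc_kwh_per_year:.0f} kWh/year")
print(f"CFL swap savings: ~{cfl_kwh_per_year:.0f} kWh/year")

Under those assumptions, a 100W idle cut is worth roughly three times the CFL swap per household, so the comparison isn't crazy.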
 
dderidex said:
Downside: Integrated graphics on the mobo are fine, as they are not (much) 'wasted space' when you DO have a discrete graphics card in place. They are just a nice bonus when you are 'between cards'. Having it on the CPU, though... I mean, it's always going to *be* there... taking up space... creating heat... even when not really in use!
That's why I'm not looking forward to it; it's one thing if it's going to actually be used (like you outlined with the Intel example), but if it's going to be sitting there doing nothing, then I don't want it.

Now, if the graphics are on par with what the 6150 IGP offers and manufacturers start making small Mini-ITX mobos for the chip, then yeah, I'd be interested in it for HTPC builds, but I don't really see that happening.
 
The graphics card war zone is going to be confusing if ATI cards are made green to match AMD's color scheme. :eek:
 
This is a good business move with Vista setting 3D card base requirements so high.
 
Arcygenical said:
I like the fact that I might only have to start buying ONE waterblock ;).

Eh, I don't think we're gonna see anything "high end" for QUITE some time... An R600 plus an FX-62 or something would generate MASSIVE amounts of heat... Like, 300W of heat...


Which, of course, is bad!
If "quite some time" means never, then you're correct. On-chip GPU's are not performance parts by any stretch of the imagination. It's a cost saving technology for PDA's, cell phones, and such. Until we get to the point where we can generate graphics that are indistinguishable from real life, onboard and especially on-chip GPU's will never make sense as a high-end part. To enthusiasts like us, this technology means absolutely nothing. It doesn't mean better graphics and smoother FPS, nor does it mean that ATI will never make another high-end card.

Something that might matter to enthusiasts, however, is the fact that AMD typically beats either graphics company to market with smaller die processes, so ATI might get smaller dies than they would have otherwise.
 
I'm hoping maybe AMD will revise the Catalyst Control Centre... I don't really like how long it takes to load, and it really isn't that good. 'Advanced mode' has features for 8-year-olds. Maybe the new driver updates will be better for me, considering I'm running an AMD/ATI system. :p
 
Lsv said:
This is a good business move with Vista setting 3D card base requirements so high.

Whaaat? Graphics card requirements high? Where do you get that from? I just put Vista on my IBM T60, and all it has is the integrated graphics of the i945GM chipset. You could fart and produce more highly rendered textures than that thing... yet it runs Aero just fine, smooth as silk, and overall Vista runs really nicely on it.

btw, specs for the T60:
Intel Core Solo T1300 (1.66GHz)
1GB DDR2 @ 333MHz
60GB SATA AHCI HDD
DVD writer
Bluetooth
802.11a/b/g
and some other crap that doesn't really matter.
 
You guys are missing the point.
With a halfway decent GPU on the die, it can be used for things OTHER than graphics. Parallel, multiple high-powered FPUs come to mind (see the sketch below).
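
For anyone wondering what that would even look like: the kind of work that maps well onto a pile of on-die FPUs is any big, embarrassingly parallel floating-point loop. A trivial NumPy sketch of such a kernel (this just runs on the CPU; the "hand it to the on-die GPU" part is hypothetical, since there's no public Fusion API):

Code:
# Sketch of the kind of data-parallel FP work an on-die GPU could soak up.
# Runs on the CPU with NumPy here; the offload itself is hypothetical.
import numpy as np

n = 1_000_000
a = 2.5
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

# SAXPY (y = a*x + y): every element is independent, which is exactly the
# shape of problem a wide array of GPU FPUs chews through.
y = a * x + y

print(y[:5])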
 
|CR|Constantine said:
Holy FUD Batman!

1) AMD will continue pushing boundaries in the discrete video card sector.

Why do they call them "discrete" when there is nothing discreet about an X1950XTX?
 
economies of scale... better able to absorb market shocks - something AMD needed...

anyway, think about this: AMD's older fab plants can switch to making graphics chips, since they're on the same process size. This saves money and opens new markets for the old fabrication lines... meaning more money out of "old, outdated" fabs that would otherwise need expensive upgrades for the next-generation CPU process - now they can do GPUs without much updating and expense. Brilliant. Also, excess fab capacity can be swapped over to GPU or CPU production depending on demand. (Yeah, yeah, oversimplification, but the concept is true...)
 
personally i'm surprised (ok, i'm not that surprised) that the GPU market hasn't gone the route of the CPU yet..

maybe AMD will make it happen now that they have ATI..

and that is: the GPU goes into a socket on the motherboard instead of onto a dedicated card..

of course the issue of standardizing sockets and chips may be a logistical nightmare (a s939 CPU motherboard with an s874 GPU)...

and i'm sure memory bandwidth is a huge issue (do you have a dedicated DDR slot just for the GPU socket, while keeping the option to just allocate system memory?)

but the idea might be right up AMD's alley: go to a much more modular design instead of a fixed approach..
 
HvyMtl said:
....
(Yeah, yeah, oversimplification, but the concept is true...)
....

horribly oversimplified, but the concept is in fact true. ATI has already placed orders with TSMC for all its R600 chips, and possibly the R620 (if, as I suspect it will be, it's just a 55nm die shrink of the R600).
 
Also, let's not forget that AMD has been talking about making additional AM2 sockets available for co-processors and other add-in chips over HyperTransport. Could AMD offer a GPU core with on-chip memory as an AM2/AM3-compatible chip?
 
psikoticsilver said:
Also, let's not forget that AMD has been talking about making additional AM2 sockets available for co-processors and other add-in chips over HyperTransport. Could AMD offer a GPU core with on-chip memory as an AM2/AM3-compatible chip?

that would be interesting indeed. imagine paying $150 for the equivalent of an X1950XTX just because all you'd have to buy was the core. I for one wouldn't mind a graphics board you could swap the GPU on, since most graphics boards use the same generation of memory across a few core revisions (for example, my X1800XT with GDDR3... if it had a socket, I'd be willing to buy the core of the X1900XT and drop it in there).
 
HvyMtl said:
economies of scale... better able to absorb market shocks - something AMD needed...

anyway, think about this: AMD's older fab plants can switch to making graphics chips, since they're on the same process size. This saves money and opens new markets for the old fabrication lines... meaning more money out of "old, outdated" fabs that would otherwise need expensive upgrades for the next-generation CPU process - now they can do GPUs without much updating and expense. Brilliant. Also, excess fab capacity can be swapped over to GPU or CPU production depending on demand. (Yeah, yeah, oversimplification, but the concept is true...)
It is mostly about economies of scale, yes, but I don't know about using the old fabs for GPUs. CPUs went straight from the 90nm to the 65nm process, while the new GPUs are 80nm, so the old fabs would be pretty useless. But if they build enough fab capacity to produce both the CPUs and the GPUs, it will be cheaper per unit for both, and the GPUs will see smaller dies sooner than they would have otherwise.
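
For a sense of why the process node matters so much to cost: ideal die area scales with the square of the feature size (a simplification on my part; real chips don't shrink perfectly, and this ignores yield):

Code:
# Ideal die-area scaling between process nodes (real shrinks are less than this).
def area_ratio(new_nm, old_nm):
    return (new_nm / old_nm) ** 2

print(f"90nm -> 80nm: {area_ratio(80, 90):.0%} of the original area")   # ~79%
print(f"90nm -> 65nm: {area_ratio(65, 90):.0%} of the original area")   # ~52%

Roughly half the area at 65nm means roughly twice the dies per wafer, which is where the per-unit savings come from.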
 
I thought AMD was going to keep ATI as a separate brand. I don't like that they changed their site... too much green... I like it red.
 
Dammit, why are the companies I like getting killed!? ATi, RoXor (In The Groove, it's a game), all so recently...
 
i so SAW this coming from the day it started. (no more comment on this post.)
 
Wouldn't socketing the GPU be a generally bad idea? A socket introduces another point of failure on an already stressed conductor. The biggest reason, I thought, that GPU memory can reach such high speeds is that it's a permanent, short connection to the GPU.
 
ryan_975 said:
Wouldn't socketing the GPU be a generally bad idea? A socket introduces another point of failure on an already stressed conductor. The biggest reason, I thought, that GPU memory can reach such high speeds is that it's a permanent, short connection to the GPU.
Exactly. It limits upgradability and adds stress on the mobo, for NO gain.
 
CrimandEvil said:
And Intel's ego is too big (both companies' egos, actually).

Maybe, but Intel doesn't generally make stupid business decisions. That's why they make so much fricken money.
 
Dan_D said:
Maybe, but Intel doesn't generally make stupid business decisions. That's why they make so much fricken money.

First, Celerons with no L2 cache.
Then Celerons with full-speed L2 cache while the P2s had only half speed.
RDRAM.
Prescott. Stupid business decisions that only got by on brand recognition and loyalty.
 