AMD & ATI - Some Questions Answered

Mike160 said:
The one thing I hope will happen is that ATI's Linux drivers improve, as AMD supports open source software.

Exactly.

Their Linux drivers are horrible. They're so bad, I normally don't even bother installing them.

Nvidia is just overall more stable IMHO.

This is horrible news. I always felt like AMD + Nvidia was like an Intel CPU + an Intel chipset-based mobo.
 
I too am concerned about how this will affect AMD CPUs and Nvidia graphics cards, as this has been my favorite combo. Will Nvidia be able to keep their licensing for AMD chipsets?

I wish Intel would have bought ATI :p
 
2 concerns:

1. It doesn't make sense for ATI to continue making chipsets for Intel CPUs. Isn't it a conflict of interest? Will Intel fans constantly fight bugs that never get fixed in ATI chipsets, with support calls to ATI met with "well, it's an Intel CPU, why not go with AMD?" I don't see this lasting long. I believe the statements in the press refer to existing/currently developed (but perhaps unreleased) chipsets for Intel, but I think there will be a deadline: after the last Intel chipset already in R&D makes its financial returns, AMD will want to close this loop.

2. What's HT3's memory bandwidth vs. onboard GDDR4's spec'd potential bandwidth? Isn't video card memory bandwidth much higher than HT3's? Would a GPU in Socket AM2 or AM3 be limited by HT's bandwidth ceiling? Show me some numbers. And would that mean that if you had 8GB of RAM, 1 to 2GB of system RAM would now belong to the GPU?
 
I think the #1 question I have is how this will affect the ATI + Intel relationship. With Intel supposedly giving up on the high-end gaming chipset market, and supposedly pulling ATI's bus license, that leaves only nVidia chipsets for high-end Intel enthusiast motherboards.
 
I cannot say I am terribly enthusiastic about this; I honestly think ATI benefits far more than AMD will.
While I have owned and will continue to own various ATI cards, from a chipset standpoint I would not own an ATI chipset motherboard if you paid me. With that in mind, I sincerely hope this move is not AMD attempting to get an in-house chipset to compete with Intel. Right now my setup is Nvidia/nForce/SLI in my gaming rigs, and ATI in my Media Center. Both do a stellar job at the tasks I use them for.

The end result for me is: if everything basically remains the same, with the nForce platform maintaining the performance standard it has set, I am perfectly happy. I just do not think AMD is doing themselves any favors here. I could be wrong, I have been before, but at a glance this does not thrill me.
 
I'm worried that it could become an AMD/ATI solution vs. an Intel/Nvidia solution, giving us (the consumers) less choice in our CPU/GFX part combinations. However, this could also be a huge advantage for the consumer: since AMD now owns ATI, they should work flawlessly together, probably pushing Intel/Nvidia into a stronger partnership to make their products work seamlessly. Could go either way. Regardless, in the next few years I don't think we're going to see many AMD/Nvidia or Intel/ATI boxes...
 
I can see an on-chip GPU and PPU right around the corner. :)
 
An in-house chipset development team is just what AMD needed. I do feel kind of bad for nVIDIA though ;)
 
Hell, I'll take a swing for the fences.

Who's to say that in a few years Intel doesn't decide to lure away all the top talent at Nvidia and enter the discrete graphics market? Surely Intel has the capacity and financial resources to do it themselves.
 
JediFonger said:
2. What's HT3's memory bandwidth vs. onboard GDDR4's spec'd potential bandwidth? Isn't video card memory bandwidth much higher than HT3's? Would a GPU in Socket AM2 or AM3 be limited by HT's bandwidth ceiling? Show me some numbers. And would that mean that if you had 8GB of RAM, 1 to 2GB of system RAM would now belong to the GPU?

Hmm, HyperTransport has an aggregate bandwidth of up to 41.6 gigabytes/sec according to the HyperTransport.org page. That's for a single 32-bit 2.6GHz link.

The peak memory bandwidth of an X1900XTX is 49.6GB/s, so the difference there isn't that great. Plus you could have more than one link. System memory is the really slow part, though; its bandwidth is an order of magnitude lower.


However, if you use an HTX slot, it's severely limited: only 16 bits wide and capped at 800MHz (6.4GB/s). That won't be used for anything but the low end for the time being.
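For anyone who wants the arithmetic behind those figures: link bandwidth is just width in bytes x clock x 2 for double data rate, and the "aggregate" number counts both directions. Here's a quick sanity-check sketch in Python; the X1900XTX breakdown (256-bit bus at 1.55GHz effective) is my recollection of the card's specs, not something from the HyperTransport page:

Code:
def ht_aggregate_gb_s(width_bits, clock_ghz):
    # HyperTransport links are double-data-rate (2 transfers per clock)
    # and bidirectional (2 independent unidirectional links),
    # hence the factor of 4 on top of bytes * clock.
    return (width_bits / 8.0) * clock_ghz * 2 * 2

print(ht_aggregate_gb_s(32, 2.6))  # 41.6 -- single 32-bit 2.6GHz HT3 link
print(ht_aggregate_gb_s(16, 0.8))  # 6.4  -- HTX slot: 16 bits at 800MHz

# X1900XTX local memory for comparison: 256-bit bus at 1.55GHz effective
print((256 / 8.0) * 1.55)          # 49.6 GB/s

So a full-width HT3 link really is in the same ballpark as a high-end card's local memory, while HTX isn't even close.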
 
griff30 said:
I can see an on-chip GPU and PPU right around the corner. :)


Precisely. This is the center of the entire business deal.


Intel is currently going apeshit working on new types of massively parallel processors. At this point, upping transistor density on CPU dies is showing diminishing returns. Getting more bang for your future x86 buck means running a lot of cores together, a la Sun's Niagara. The advantages of this are fourfold.

The idea is you get a few dozen very simple mini-cores in a CPU, run properly compiled software, and the result is so fucking fast everything else looks like it's moving backwards. Parallelism is creeping its way into everyday software, and in a couple of years, by the time the mini-cores hit the street, it will be the de facto standard.
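To put a rough number on why the software side matters so much here, Amdahl's law is the usual back-of-the-envelope tool: speedup is capped by whatever fraction of the work stays serial. A minimal sketch; the 90%-parallel workload is purely an assumed figure for illustration:

Code:
def amdahl_speedup(parallel_fraction, n_cores):
    # Upper bound on speedup when only `parallel_fraction` of the
    # work can be spread across cores; the rest runs serially.
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_cores)

for cores in (2, 8, 32):
    print(cores, round(amdahl_speedup(0.90, cores), 2))
# prints: 2 1.82 / 8 4.71 / 32 7.8

Even 32 mini-cores top out below 8x on a 90%-parallel workload, which is exactly why the mini-cores only look like magic once everyday software is properly parallel.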

As a bonus, doing all the R&D for a mini-core and then pasting it 32 times onto a die is a piece of piss compared to engineering a modern, more massive core. It takes fewer staff to achieve more real work in less time. Win, win, win. The third knock-on bonus is that Intel is looking at being able to do a complete product cycle every 12-18 months, like today's GPU market. You're about to witness processor technology go nova.

And the fourth and final bonus of the new x86 revolution: with the simpler cores, adding GPU functionality to each core is too easy not to do. Never mind a GPU on die; a more generic CPU can do the same amount of work with a quarter of the raw power. Once you start seeing the new mini-core CPUs on store shelves, graphics cards will become dinosaurs.

Now, with Intel going full steam in this direction, AMD is looking pretty much fucked. They need to overcome a head start, they need to make up some GPU functionality, and they need to pick up their development pace from a jog to a flat-out sprint. Either they build all this up from the ground, which takes forever and loses even more time to Intel, shooting themselves in the foot, or they buy it straight up. Not a hard choice. Compared to traditional CPU design timelines, the GPU guys can run circles around them with time to spare for dinner and a movie.

With the purchase of ATI, AMD is looking to take the fight against Intel well into the new era of mini-core CPUs. What nVidia's plans are I can't really tell, but I suspect Intel couldn't care less. Intel does not have the same urgent need to accelerate its abilities as AMD does, and nVidia is not worth the price to Intel. Mark my words: you will never see the two of them merge or cooperate in a manner similar to ATI-AMD.
 
You know what would be really cool? A motherboard, with an AMD socket, and integrated ATi graphics, except you can swap out just the GPU or graphics memory when new stuff comes out.

So, instead of buying a whole new card when new stuff comes out, just swap out little processors and ram thingies.
 
psychot|K said:
You know what would be really cool? A motherboard, with an AMD socket, and integrated ATi graphics, except you can swap out just the GPU or graphics memory when new stuff comes out.

Except that the socket would always have to change to accommodate new designs... the memory traces would be too long/too short for a given chip, etc.

From what I have heard, the high speeds at which these chips connect to memory are the sticking point and are hard to get right. I am not really sure this would work out... ;)

Maybe if the chips were pin compatible, and the memory stayed the same?
 
So, instead of buying a whole new card when new stuff comes out, just swap out little processors and ram thingies.

Hehe, I like the way you think, and it's entirely doable, but you'd only see upgradable integrated GPUs and GDDR memory in low-end solutions. Ultimately this would help production and cut costs, not to mention ATI could rely entirely on its own fab process instead of third-party PCB support. It would be a good money maker, though it would also increase the size of your motherboard.

High end would still be on its own slotted PCB, for the lowest possible latency and the best possible cooling of the card.
 
Unless you upgrade within a year or two, you just bought your last GPU ever. The need for an extra appendage on your machine is coming to a close. AMD and ATI are not partnering up to make extra neat-o add-ons for your machine. They need each other because, before your testicles have time to drop, both the traditional x86 CPU and the add-on graphics chip will be extinct, and the only way either can adapt is to complement the other's strengths.
 
I just hope Nvidia continues to make chipsets for AMD; I love the nForce 4 and 5 series boards.
 
Green Genes said:
Morons,

Unless you upgrade within a year or two, you just bought your last GPU ever. The need for an extra appendage on your machine is coming to a close. AMD and ATI are not partnering up to make extra neat-o add-ons for your machine. They need each other because, before your testicles have time to drop, both the traditional x86 CPU and the add-on graphics chip will be extinct, and the only way either can adapt is to complement the other's strengths.

:mad: Nobody on here is a moron. We (at least most of us) know the industry is going to move in that general direction. Calling everyone a moron and then flaming us with a point that was already touched on earlier in the thread makes YOU look like the moron. Be mindful of the fact that most computer enthusiasts are intelligent people before you make an ass clown out of yourself next time. (Insert the line about some 30-year-old living in his parents' basement... etc.)

;) Thanks
 
News flash! You can know lots about ANYTHING and still be a total idiot!

Every single human being on the planet is fucking stupid. You, me, every single chip head on these boards, your mom, and the King of Spain. Everybody generally knows a tiny bit more than somebody else on one topic or another, but averaged out over the breadth of their knowledge, the entire human population is pretty much the same temperature, which is about a tenth of a thousandth of a degree above absolute zero.

Relax, you're stupid. So what? That's like saying you're alive, you're human. You're born with that trait, you share it with the world, and you might as well fucking be happy about it instead of getting upset when people remind you every so often. Thank them instead. It's the internet. It was created so humorless techie morons like us can call each other out with no chance of broken teeth. If you don't like it, go snitch to a mod; they'll protect you from the harsh, cruel world.

Thank you.


Now, back to the topic, I can't see where it's been discussed at all. I'm too stupid, give me a hand. I'm just confused about why people are getting excited about imaginary products that will never exist when we already can make an accurate guess as to what the future holds. People seem more worried that their favourite brand won't be around any more, but that doesn't matter because by the time we see the true results of this decision our favourite brands won't even exist. We know they're not teaming up to maintain the status quo. Trying to project today's product lines three or four or five years into the future is missing the point entirely. You don't make multi-billion dollar purchases to just keep on keepin' on. Watch! Wait! Feel the excitement!

Remember the very first Voodoo? Or your first CD-ROM drive or your first sound card? Remember the GeForce 256? Who here can remember a time before the x86? Unless you're old and flabby, this is probably the first time you'll see anything actually change in the tech world instead of only improving. Be fucking thankful; don't waste the energy on imaginary corporate rivalries and bullshit viral marketing. It took humans a few thousand generations just to figure out how to move from catching and taming wild fire to rubbing two sticks together. Civilization is defined by technology, and in our lifetimes we're seeing progress jump billions of times faster than it ever has or is ever likely to again in the entire history of our planet.

As a demographic, the nerds are positioned at the crest of the wave; they're the fucking leading edge. But even though we're huge nerds, we're all pretty fucking dumb. When we hear the winds of change, the best our imaginations can do is regurgitate marketing strategies and mistake them for dreams.
 
Well, now the underdogs are licking each other's nuts. How cute. Am I the only one here who would rather have seen AMD acquire nVidia? Yes, I have always favoured them. I know, I know, corporate loyalty over product loyalty is retarded, but I still do it. I guess I like orange and green more than orange, green, and red. Maybe it's the Irish in me.

EDIT: I haven't heard anyone mention that nVidia is also in bed with Sony for the PS3, and with Apple. Not too shabby.
 
It seems to me that AMD wanted ATI for ATI's graphics technology. I believe, correct me if I'm wrong, that Nvidia's mobo chipsets have done MUCH better than ATI's, while ATI and Nvidia are neck and neck in the graphics chip race. AMD already has a partnership with Nvidia to produce motherboards with the Nvidia chipset, so to me it seems AMD didn't purchase ATI to develop ATI's mobo chipset, but rather to integrate ATI's graphics technology into AMD CPUs/chipsets. That being the case, AMD could continue to partner with Nvidia on motherboard bus chipsets and use ATI's technology for integrated graphics. Not sure how the ATI/Nvidia technology would play together on one mobo, though.
 
Mega2 said:
but Nvidia doesn't cost 5.4 billion to buy out.

Yeah, nVidia is worth a lot more as a company than ATI, so I really don't think AMD could have pulled that one off. I'm surprised, though, that Intel hasn't stepped up and outbid AMD, since ATI and Intel have always had closer ties and Intel is marketing Crossfire on its boards whereas it doesn't market SLI. I think this merger can go one of two ways: either it will be semi-successful and ATI and AMD will remain independent of each other, or AMD will try to change ATI's strategy and possibly lead to ATI's destruction in the graphics world against nVidia. Either way, the CPU realm isn't looking very bright for AMD over the next year or two.

This post here details what the possibilities are for AMD with the K8L release next year.

savantu said:
What will they fight K8L with?

If K8L arrives in the summer, Intel will have the following:

Conroe DC
-3.33GHz/1333MHz or 3.46GHz/1066MHz for XE (that's conservative)
-3.2GHz E6900 (again conservative, since this is/was a Q1 2007 part)
Kentsfield QC
-2.66GHz for XE
-2.4GHz normal

If K8L arrives in Oct-Nov, Intel will have the following:
-The previous parts, plus
-Conroe 2 or Penryn close to release (Nov-Dec 2007 or January 2008)
-45nm, 6MB L2, 1333MHz FSB
-3.33GHz/1333MHz or 3.46GHz/1066MHz top desktop
-3.66GHz/1333MHz XE

If something really, really bad happens with K8L and it slips into 2008, it will face Penryn until the summer, when Nehalem appears.
What do we know about Nehalem?

Not much. Brand new microarchitecture (completely different from Conroe, IIRC), an IMC (dual-channel DDR3), CSI with at least 9.6GB/s of I/O bandwidth, and 4 cores. Most likely it will have a shared cache (6-12MB) and special-purpose accelerators in it.

Even if by some miracle K8L recaptures the performance crown between summer and the Penryn launch, I doubt it will hold for more than 4 months.
 
I'm really looking at this whole thing with mixed feelings. It's been a while since I've bought new hardware, and it will probably be a little while more until I can. But the new Core CPUs from Intel look rather promising, and it will be interesting to see if ATI takes away Crossfire like some other people in this thread have been hypothesizing. Personally, I have yet to be swayed to the ATI camp, and I hope that AMD will stay open to nVidia. As much as I have been thinking about switching over to AMD (especially with the new price cuts), I have been given pause by Intel's success in lowering its processors' power consumption and the heat problems currently plaguing me with my P4. It would be disappointing to see ATI pull the plug on Intel and leave it without a dual video card option just as Intel releases what could be a great line of processors.

 
I don't much care for AMD. The last time I built myself a new system, AMD was neither reliable nor fast. In my opinion AMD has since cured the speed issues, but never really remedied the reliability. That is arguable...
On the same note, I felt the same about Nvidia. To me their output looks fuzzy, and I don't think them reliable. Again, just my opinion...
My opinion of Intel and ATI is the opposite: stability before speed, but keep performance competitive.

I don't think ATI and AMD currently share the same enthusiast base (for the most part). That being said, I didn't believe this merger was true until I saw it on the front page of [H]. I think both ATI and AMD acknowledge that their respective customers would be offended at being forced to use their partner's hardware. Provided Intel and Nvidia allow it, I think you will be able to use ATI and Nvidia GPUs on either Intel or AMD architecture. I do find it likely that AMD will produce all-in-one solutions (integrated GPUs) for low-end and mobile applications, but I don't think add-on cards will die.

I also don't think Intel and Nvidia will make any binding agreements. SLI doesn't work on Intel's 975 because Nvidia refuses to release drivers supporting the 975 chipset; hacked drivers are available to use SLI on the i975X. I do believe they will form better relations in the short term, but I don't think Intel is going to bind itself to Nvidia.

You don't think Intel can make a GPU? They already make one. It's not that impressive, but it doesn't need to be. It doesn't currently make sense for Intel to make a top-of-the-line GPU because they wouldn't get any return on it. Intel probably wouldn't get any more money for the chip whether they sold a bottom-end or a top-end GPU on their boards, so why spend the R&D and resources on a high-end GPU? Intel could surprise us with a badass GPU if they so desired.

And a note to those mentioning VIA and S3: VIA bought S3 to get better integrated video on their mobos. [H] linked to a review of their latest SLI-capable "Chrome" GPU. Just a side note...
 
nhusby said:
I don't much care for AMD. The last time I built myself a new system, AMD was neither reliable nor fast.

AMD has never really been about speed, IMO, so much as about power. If it's speed you're after, Intel is ideal.
 
I wonder if AMD will help ATI develop some dual-core GPUs... because from the looks of things, GPUs are what's holding gaming back (unless I'm mistaken).
 
Isn't Intel's color red, and so is ATI's, vs. Nvidia's and AMD's green?

Based on colors alone, wouldn't it make more sense for AMD to buy Nvidia and Intel to buy ATI?
 
nhusby said:
I don't much care for AMD. The last time I built myself a new system, AMD was neither reliable nor fast. In my opinion AMD has since cured the speed issues, but never really remedied the reliability. That is arguable...

I have had the opposite experience. I have been building, modifying, and overclocking PCs since the 486, and in that time I've never had an Intel CPU that was as reliable or fast as the Pentium Pro was when it first came on the market. Pretty much since the advent of competition, I have yet to own an Intel processor that gave me the warm and fuzzies about anything. I maintain a variety of PCs in my home, and right now my Media Center is Intel powered. To be honest, I have little nice to say about it; if I had the spare money I would dump it in a heartbeat.

nhusby said:
On the same note, I felt the same about Nvidia. To me their output looks fuzzy, and I don't think them reliable. Again, just my opinion...
My opinion of Intel and ATI is the opposite: stability before speed, but keep performance competitive.
Cannot argue that point much; Nvidia's image quality has never been their strong suit. I currently run an Nvidia card as my gaming card and ATI for my MC. When I game, despite having a powerful enough system, I still turn half the eye candy off since I compete. It is a bad habit, but I cannot seem to break it, and outside of IQ I would never notice anyway. Nvidia has never given me reason not to use their cards as my primary gamer, so I continue to use them. I personally have no objections to ATI cards; they are generally very good. Their MB chipset, on the other hand, I have a fairly low opinion of.

nhusby said:
I don't think ATI and AMD currently share the same enthusiast base (for the most part). That being said, I didn't believe this merger was true until I saw it on the front page of [H]. I think both ATI and AMD acknowledge that their respective customers would be offended at being forced to use their partner's hardware. Provided Intel and Nvidia allow it, I think you will be able to use ATI and Nvidia GPUs on either Intel or AMD architecture. I do find it likely that AMD will produce all-in-one solutions (integrated GPUs) for low-end and mobile applications, but I don't think add-on cards will die.
Agree. I would be extremely irritated if I were forced to use ATI's cards with AMD. I doubt AMD is dumb enough to do that, however.

nhusby said:
I also don't think Intel and Nvidia will make any binding agreements. SLI doesn't work on Intel's 975 because Nvidia refuses to release drivers supporting the 975 chipset; hacked drivers are available to use SLI on the i975X. I do believe they will form better relations in the short term, but I don't think Intel is going to bind itself to Nvidia.
Cannot argue this much.

nhusby said:
You don't think Intel can make a GPU? They already make one. It's not that impressive, but it doesn't need to be. It doesn't currently make sense for Intel to make a top-of-the-line GPU because they wouldn't get any return on it. Intel probably wouldn't get any more money for the chip whether they sold a bottom-end or a top-end GPU on their boards, so why spend the R&D and resources on a high-end GPU? Intel could surprise us with a badass GPU if they so desired.
I do not think they can make a good GPU, precisely because they do in fact make one: they advertise the hell out of it as the greatest integrated thing since sliced bread, and it is the shittiest GPU on the market. I do not think they are capable of surprising us, simply because they still cannot grasp that their GPU is an absolute piece of crap.

nhusby said:
And a note to those mentioning VIA and S3: VIA bought S3 to get better integrated video on their mobos. [H] linked to a review of their latest SLI-capable "Chrome" GPU. Just a side note...

On that note, I find it rather amusing that prior to the nForce 2 platform, VIA was a top chipset developer. Post-nForce 2 you nearly never hear about them, and you laugh at anyone who buys a VIA mainboard. Sadly, I still have more respect for VIA than I do for the Crossfire platform.
 
Dekoth said:
I do not think they can make a good GPU, precisely because they do in fact make one: they advertise the hell out of it as the greatest integrated thing since sliced bread, and it is the shittiest GPU on the market. I do not think they are capable of surprising us, simply because they still cannot grasp that their GPU is an absolute piece of crap.

Yeah, an integrated graphics solution that can run Aero on Vista and is comparable to a 6800 is shitty. :rolleyes: The past couple of generations of Intel graphics have been VERY good for their target audience, and the latest GPU will hit the "light gamer" as well.

Dekoth said:
On that note, I find it rather amusing that prior to the nForce 2 platform, VIA was a top chipset developer. Post-nForce 2 you nearly never hear about them, and you laugh at anyone who buys a VIA mainboard. Sadly, I still have more respect for VIA than I do for the Crossfire platform.

Yeah... because those 4-in-1 drivers were so much fun to use. LOL... come on now. VIA and SiS have a history of releasing products WAY before they're ready, letting their customers do the debugging, and then releasing software to patch around hardware problems. Yeah... pass on that.
 
I like Crossfire, but I'd never buy an ATI Crossfire board... I'd rather use the Intel 975X. As for AMD Crossfire... I guess you might be stuck. I wouldn't count out the possibility of running future GPUs on an nF chipset, though. A chipset sale to someone who uses an ATI GPU pulls in the same $$$ as a chipset sale to someone who uses an Nvidia GPU.

EDIT: posted while I was typing...

Poncho said:
Yeah, an integrated graphics solution that can run Aero on Vista and is comparable to a 6800 is shitty. :rolleyes: The past couple of generations of Intel graphics have been VERY good for their target audience, and the latest GPU will hit the "light gamer" as well.
That was the point I was after... it wouldn't pay for Intel to produce top-of-the-line graphics.

Poncho said:
Yeah... because those 4-in-1 drivers were so much fun to use. LOL... come on now. VIA and SiS have a history of releasing products WAY before they're ready, letting their customers do the debugging, and then releasing software to patch around hardware problems. Yeah... pass on that.
I have never liked any mobo with a VIA chipset, neither for Intel nor for AMD.
 
So will Intel work with ATI?

And will Nvidia still work with AMD?

:confused:

What's to come?!
 
ATI chipsets, save for the 3200, are crap.

I hate the fact that AMD chose ATI. I hope ATI wakes up and realizes Linux is a real OS, and that it's time to support it on both their VPUs and chipsets. I am tired of modding or going without drivers in Linux for ATI hardware. When I think of ATI video cards, I think of crappy third-party cards with cheap parts and no support. I think of the time I wasted in Linux before throwing out a 9600XT and buying a lesser Nvidia card simply because it worked without a hassle. (Actually, when 2.6.x came out, Nvidia was pretty much ready, while ATI took ages just to come out with a driver, however crappy it was.)

Nvidia is just way better IMHO.

Stable - Intel CPU + Intel Mobo
Stable - AMD CPU + Nvidia Mobo
Unstable - Any CPU + VIA Shitset
Unacceptable Support - Any CPU + ATI Chipset || ATI VPU + Linux

Nvidia already has Linux support for most of their hardware. I remember having to mod the fglrx driver(s) in Linux just to be able to use my 9600... that was a royal PITA. I hate ATI.

I remember once having to recompile my kernel and mod several files so I could get DMA working on an Xpress 2000 chipset in a friend's laptop. Nvidia chipsets worked out of the box. (I compile my own kernels anyway, but nonetheless, ATI sucks.)
 
Josh_B said:
ATI chipsets, save for the 3200, are crap.

I hate the fact that AMD chose ATI. I hope ATI wakes up and realizes Linux is a real OS, and that it's time to support it on both their VPUs and chipsets. I am tired of modding or going without drivers in Linux for ATI hardware. When I think of ATI video cards, I think of crappy third-party cards with cheap parts and no support. I think of the time I wasted in Linux before throwing out a 9600XT and buying a lesser Nvidia card simply because it worked without a hassle. (Actually, when 2.6.x came out, Nvidia was pretty much ready, while ATI took ages just to come out with a driver, however crappy it was.)

Nvidia is just way better IMHO.

Stable - Intel CPU + Intel Mobo
Stable - AMD CPU + Nvidia Mobo
Unstable - Any CPU + VIA Shitset
Unacceptable Support - Any CPU + ATI Chipset || ATI VPU + Linux

Nvidia already has Linux support for most of their hardware. I remember having to mod the fglrx driver(s) in Linux just to be able to use my 9600... that was a royal PITA. I hate ATI.

I remember once having to recompile my kernel and mod several files so I could get DMA working on an Xpress 2000 chipset in a friend's laptop. Nvidia chipsets worked out of the box. (I compile my own kernels anyway, but nonetheless, ATI sucks.)

I agree with 99% of what you said; I just don't agree with "Unstable - Any CPU + VIA Shitset." My VIA chipset board runs great, or maybe it's just my luck. I have been using it for a year now for my gaming server and it runs like a champ. I'm not sure about gaming though, I didn't test that much.
 
(cf)Eclipse said:
gah, delete this post :p

why??

AdamRigs: im thinking hes saying peanutbutter and chocolate go well together
AdamRigs: and oil and water dont
Ozzimark: yeah, i missed the or

That's right, you did miss the "or."
 
Here's what I'm hoping we see out of this:

Short term - a full AMD system: AMD CPU, AMD chipset, AMD video. There should be no trade secrets between the two companies once merged, so a reliable, fast chipset shouldn't take long to develop.

Standardized dual-video format - meaning if a board supports SLI, it supports Crossfire, with no add-on cables, etc. Possibly even to the point that you can run SLI-Fire with an AMD/ATI + nVidia GPU.

HT becoming the communication system for AMD/ATI GPUs, which leads to the GPU+CPU on one chip, each with its own memory access. That will, of course, pave the way for even more integrated CPU/GPU combinations.
 
Josh_B said:
I hate the fact that AMD chose ATI. I hope ATI wakes up and realizes Linux is a real OS, and that it's time to support it on both their VPUs and chipsets. I am tired of modding or going without drivers in Linux for ATI hardware. [...]

Nvidia already has Linux support for most of their hardware. I remember having to mod the fglrx driver(s) in Linux just to be able to use my 9600... that was a royal PITA. I hate ATI.

I've heard a great quote: Linux is free so long as your time is worthless. ATI has products for office systems, game systems, and multimedia systems; Linux lives mostly in the server segment. The market share of Linux multimedia systems was probably considered negligible by ATI.
 
Dekoth said:
I do not think they can make a good GPU, precisely because they do in fact make one: they advertise the hell out of it as the greatest integrated thing since sliced bread, and it is the shittiest GPU on the market. I do not think they are capable of surprising us, simply because they still cannot grasp that their GPU is an absolute piece of crap.

Intel probably knows better than anyone else what a piece of crap they're advertising. That's the whole point of advertising: you make shit look like gold. Intel has the brains and the brawn to make products that outperform the entire microchip industry; where and when it exerts that ability is determined purely by business, not by entering a pissing contest.
 