Did AMD just forfeit its high-end graphics division?

No, they just fired some of the cleaners and the assistants of their assistants. They're getting rid of the part-time hired help, that's all. It could mean a lot of things. Maybe they have some Einstein with big salary demands lined up, or maybe it's just time for a spring clean in an underperforming R&D department. I know the first guy I would fire is the mug who came up with the naming schemes. Why not just cut off your foot before competing in the 100m dash? They billed Bulldozer as 4 modules/8 cores. If they had done it right and called it 4 modules/8 threads, people would have compared thread-to-thread instead of core-to-core performance, but they didn't, and up went the core-vs-core tests.
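For context on why that naming gripe matters: each Bulldozer module pairs two integer cores with one shared floating-point unit, so "8 cores" invites core-vs-core comparisons the FP side can't win. Here's a toy Python sketch of that layout; the unit counts match the marketing, but the "throughput" logic is a deliberate oversimplification, purely for illustration:

```python
# Toy model of Bulldozer's module layout: two integer cores per
# module share a single floating-point unit. Counts mirror the
# marketing ("4 modules / 8 cores"); the throughput logic is an
# oversimplification for illustration only.

MODULES = 4
INT_CORES_PER_MODULE = 2   # marketed as "8 cores"
FPUS_PER_MODULE = 1        # the shared resource

def busy_integer_units(threads):
    # Integer work can spread across all 8 integer cores.
    return min(threads, MODULES * INT_CORES_PER_MODULE)

def busy_fp_units(threads):
    # FP-heavy work bottlenecks on the 4 shared FPUs.
    return min(threads, MODULES * FPUS_PER_MODULE)

for t in (4, 8):
    print(f"{t} threads -> integer units busy: {busy_integer_units(t)}, "
          f"FP units busy: {busy_fp_units(t)}")
# 4 threads -> integer units busy: 4, FP units busy: 4
# 8 threads -> integer units busy: 8, FP units busy: 4
```

Sell it as 8 threads and nobody blinks at the shared FPUs; sell it as 8 cores and every review pits it core-for-core against chips with a full FPU per core.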
 
I'm not worried about it yet. Times are tough and they're likely just trimming some fat. They're making plenty of money off their GPUs and their server line of processors, so I think AMD can weather this out. Besides, how much demand is there for super-high-end, $600+ video cards when a $200 6950 can max out anything at 1920x1080? The video card market is fast outpacing what the video game market requires, so maybe it's not so bad if AMD slows its video card development down a little to help make ends meet.
 
Well, their senior engineer, Carrell Killebrew, is gone from AMD.

For me, I'd like to say "No" as well to AMD killing off their graphics card division. Why? It would be idiotic to kill off what was, in all honesty, doing much better than their desktop CPU division.

We have until November 9th to know for sure what Read is planning for AMD's future. Like I said in the other thread, I'm looking forward to the GCN (7900-series) cards next year. If that's gone, then all we have left is Nvidia.

I'm sure [AMD's CEO] Read won't do something that stupid. I'm worried about the company, though. His remarks in that email notifying employees of his intentions for the company, with its focus on "low power, cloud computing and emerging markets", don't give me much hope.

I'm crossing my fingers hoping he doesn't do anything too drastic. Laying off 10% of your employees is drastic already, but I'm sure it was needed to save AMD money. Cutting off your desktop performance CPUs and GPUs to save more money is just too drastic, and too stupid. It's basically corporate suicide if he does this.
 
I think what Read is trying to do is focus AMD on what it is currently competitive at: that's low power and cloud computing. I think it's a mistake to give up on high-end GPUs, since that research can someday find its way into gaming consoles, or even low-power computing (APUs, etc.).

The company I work for has shed 50% of its 10,000 employees over the past five years, and our products have actually gone up in quality and innovation... so that doesn't have to be completely negative. I'm more worried that Read's focus isn't in line with my buying demands.
 
I'm betting they do give up on the high-end market (in fact, I've been saying it since AMD bought them). It's very expensive to compete in, and they are not making enough money to hang on. NVIDIA has the luxury of selling high-end chips at a huge profit with Tesla and Quadro, a market that AMD has never competed well in. NVIDIA has also had the top chip for several generations and a larger market share, which is all necessary to fund a large R&D budget. Market-cap-wise, AMD is worth half of NVIDIA and one sixth of Intel. That makes it tough to compete against both of them at the same time.
 
One of the speculations surrounding Dirk Meyer's resignation involved AMD's seeming lack of a direction/plan for tackling the growing handheld mobile market.
 
And, according to this, AMD has a larger share in the (overall) GPU market.

http://news.softpedia.com/news/Nvidia-Takes-Back-Graphics-Market-Share-From-AMD-214876.shtml

From your link....

According to the latest data provided by Market Watch, Nvidia has managed to secure in the second quarter of this year 54.6% of the total graphics card market,

Now if you want to talk about the overall GPU market, Intel still holds that crown at about 60%. But we are talking about "video cards". Also, your link is for the 2nd quarter. Here is an article for the 3rd quarter.

http://jonpeddie.com/press-releases...up-16.7-over-last-quarter18.4-over-last-year/

Intel led the quarter with 36.5% growth, Nvidia had 30% growth

Also

Nvidia gained 10.9% in desktop discrete.

NVIDIA exited the integrated market (forced out by Intel). But if you want to talk just discrete GPUs (video cards), then NVIDIA is looking to be up around 60% or more right now (54.6% from the previous quarter from your link, plus an additional 10.9% gain this quarter). I'm pretty sure they are up around 90% for the professional/HPC market.
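Back-of-envelope on that, with the caveat that combining a share figure with a growth figure is a simplification either way: if the 10.9% is relative growth in shipments, 54.6% x 1.109 ≈ 60.6%; if it's meant as share points, 54.6% + 10.9 ≈ 65.5%. Either reading lands at roughly 60% or more.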
 
AMD just has to hold out till François Piednoel retires. Then they might have a chance again.
 
Now if you want to talk about the overall GPU market, Intel still holds that crown at about 60%.

I thought we were talking about Nvidia and AMD and their respective market shares? Who cares if Intel leads?

But we are talking about "video cards". Also, your link is for the 2nd quarter. Here is an article for the 3rd quarter.

You said market share; you never specified what exactly. I assumed you meant overall. But now it's just video cards? Well, okay then.


NVIDIA exited the integrated market (forced out by Intel). But if you want to talk just discrete GPUs (video cards), then NVIDIA is looking to be up around 60% or more right now (54.6% from the previous quarter from your link, plus an additional 10.9% gain this quarter). I'm pretty sure they are up around 90% for the professional/HPC market.

That wouldn't surprise me, and I don't need to verify it. However, according to this, AMD's discrete sales also increased. To what, I don't know.

http://www.pcper.com/news/Editorial/AMD-Reports-Q3-2011-Results

HPC numbers? Not sure. It's hard to verify that market since it's lumped in with the discrete numbers. I'm sure NVIDIA is winning, though.
 
Maybe HP should buy the GPU division from AMD.
 
I don't think they need to sell off anything. I just don't think they're in that deep a pile of shit. Yes, Bulldozer was a dumpster fire, but Llano has been a success, they make a ton on the server side of the market, and their GPU line is doing great. Bulldozer sucking really only hurts them in the enthusiast market, which is 5-10%. There is still a lot of potential there. If they get some new leadership and proper restructuring, they can be very competitive again.
 
I don't think they need to sell off anything. I just don't think they're in that deep a pile of shit. Yes, Bulldozer was a dumpster fire, but Llano has been a success, they make a ton on the server side of the market, and their GPU line is doing great. Bulldozer sucking really only hurts them in the enthusiast market, which is 5-10%. There is still a lot of potential there. If they get some new leadership and proper restructuring, they can be very competitive again.

They were actually very profitable last quarter, to the surprise of many. Bulldozer, apparently, will be a beast on the server platform, which is all that AMD cares about. I seriously don't see them exiting the high end, particularly when it does so well for them. I think Read knows something. What exactly? I'll be damned if I know.

HP is the largest PC manufacturer.

LOL! Make a GPU and force it on the consumer, huh? I guess that could/would work. Hehe.
 
Maybe HP should buy the GPU division from AMD.

In August, HP was considering getting out of the PC business.
http://articles.latimes.com/2011/oct/28/business/la-fi-hp-pc-20111028


HP is probably the largest consumer of AMD chips (I rarely see them in Dells or other computers). They should just buy all of AMD. They could be like Apple back when Apple was using PowerPC.

It would not surprise me if all the recent management changes and layoffs are prepping AMD for sale.
 
Public perception (no matter how far off it is) does a lot for sales. For instance, the "Intel Inside" stickers, which said nothing about performance, were a marketing scheme that seemed to work pretty effectively. We all know it doesn't matter who makes the best single/dual-GPU high-end graphics card if you are buying a card with just enough power to play Flash games and watch 1080p videos. But most people don't have a clue what the difference is, or how big it is, between a 6750 and a 4890, or a GTX 580 and a GT 520.

So nVidia and ATI both do the same thing the car manufacturers do. Notice the "build and price your Fiesta" link on that page. Having the fastest car(d) on the market automatically means all your car(d)s, even the lower-end ones, are faster than the competition. So if the Ford Mustang wins a race over a Chevy Camaro, that automatically means a Ford Fiesta is a better car than a Chevy Sonic ;) . Or so enough people apparently believe, because car companies sink a ton of cash into racing. And it's the same thing with these overclocking contests sponsored by Intel, AMD, whoever. No matter how well your 2500K overclocks, your Celeron is still a Celeron.

So although AMD dropping high-end graphics cards would make no physical difference to their mainstream and lower-end cards, they could lose buyers' perception that they have a card worth buying at any price range.
 
I'm not worried about it yet. Times are tough and they're likely just trimming some fat. They're making plenty of money off their GPUs and their server line of processors, so I think AMD can weather this out. Besides, how much demand is there for super-high-end, $600+ video cards when a $200 6950 can max out anything at 1920x1080? The video card market is fast outpacing what the video game market requires, so maybe it's not so bad if AMD slows its video card development down a little to help make ends meet.

Because I play at 6100x1200
 
So although AMD dropping high-end graphics cards would make no physical difference to their mainstream and lower-end cards, they could lose buyers' perception that they have a card worth buying at any price range.

Well, for years AMD was often seen as the "budget brand", and in reality they are one. I bought an Athlon II system because it was incredibly well priced. Maybe this perception is carrying over to their video cards, so it may make sense for them to just move in that direction. It certainly seems that way based on the AnandTech article.
 
AMD just needs to find their place in the market and stick to it. Not every company needs to be in the #1 slot. I see no reason why they would back out of making GPUs.
 
I hope not. I just got my system and haven't run much on it yet. I wanted a 570, but the price/performance and the unlock potential of the 6950 were too much to pass up. If I had just gone by reviews and not looked at it from all angles, a 570 is most likely what I would have bought.

And also, much of the high-end stuff dictates the rest of the line. Build a chip that can run at decent clocks, within tolerance, 50% of the time; then take the slower ones, or the ones with bad shaders or cores, and make them the lower-end cards.
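That binning process, sketched as a toy simulation (the yield rates, clock distribution, and SKU cutoffs below are all invented for illustration, not real figures):

```python
import random
from collections import Counter

# Toy die-binning simulation: dice from one wafer get sorted into
# SKUs by max stable clock and working shader count. All numbers
# are invented for illustration.

random.seed(42)
SHADER_CLUSTERS = 24  # hypothetical full-die shader cluster count

def fabricate_die():
    """Each die gets a random max stable clock (MHz) and a random
    number of working shader clusters (defects disable some)."""
    defects = sum(random.random() < 0.03 for _ in range(SHADER_CLUSTERS))
    return {"max_clock": random.gauss(850, 60),
            "good_shaders": SHADER_CLUSTERS - defects}

def bin_die(die):
    # Fully working, fast dice become the flagship; slower or
    # partially defective dice get units fused off and sold cheaper.
    if die["good_shaders"] == SHADER_CLUSTERS and die["max_clock"] >= 880:
        return "flagship: full shaders, high clock"
    if die["good_shaders"] >= SHADER_CLUSTERS - 2 and die["max_clock"] >= 800:
        return "midrange: units fused off and/or lower clock"
    return "low-end: heavily cut down"

wafer = [fabricate_die() for _ in range(100)]
print(Counter(bin_die(d) for d in wafer))
```

The point being: the lower-end cards are often literally the same silicon as the halo product, so designing for the high end is what makes the cheap SKUs possible at all.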
 
AMD killed off their high-end CPU market, so why not kill off their high-end GPU market? Makes sense to me. Those dummies will all be on the unemployment line. Intel & NVidia will buy them out for nothing & laugh all the way to the bank. What a shame.
 
AMD killed off their high-end CPU market, so why not kill off their high-end GPU market? Makes sense to me. Those dummies will all be on the unemployment line. Intel & NVidia will buy them out for nothing & laugh all the way to the bank. What a shame.


Rigggghhtttt......... And Walmart is going to buy out Intel to create a budget-minded computer line called "Cheapos."

I like how this topic has descended into the realm of the ridiculous. It was bound to happen, though. BTW, I thought it was illegal for Intel to buy AMD? Guess not.
 
I still think AMD and nVidia will merge. The logic of such a marriage is pretty intense.

AMD has an x86 license and still has world-class CPU engineers. nVidia has a market-leading ARM CPU product for everything below Bobcat wattage requirements. nVidia has more developed GPU-based (CUDA) compiled software tools. AMD also has the technology ironed out (no mean feat) for fusing CPUs and GPUs into the same die. This same AMD experience could also be used to build better ARM APUs than nVidia is capable of doing right now.

Furthermore, the writing seems to be on the wall for desktop CPUs: we don't really need any more performance on the desktop. (No, I'm not talking about gaming computers/game consoles - markets that AMD and nVidia would have locked up entirely if they merged, at least for the GPU portion in the near future, and probably for the whole CPU/GPU package further down the road). That means the way forward in desktop/server computing will be building code that uses GPUs for pretty well any floating point instructions, and much more massively parallel integer instruction architectures for CPUs. AMD's Bulldozer is exhibit 'A' for this type of x86 CPU, a chip designed almost entirely with integer-based supercomputing in mind.
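To make that concrete, here's a minimal sketch of the "FP work moves to the GPU" model using OpenCL from Python. This assumes PyOpenCL is installed and an OpenCL device is present; the kernel and variable names are mine, purely for illustration:

```python
import numpy as np
import pyopencl as cl

# Minimal sketch: the floating-point math lives in a GPU kernel,
# while the host CPU just orchestrates buffers and queues.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void fma_like(__global const float *a,
                       __global const float *b,
                       __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] * b[i] + b[i];  /* FP instructions run on the GPU */
}
""").build()

prg.fma_like(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
print(out[:4])
```

In that world the CPU's job shrinks to integer-heavy orchestration and control flow, which is exactly the workload Bulldozer's module design favors.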

The alternative for both companies is virtual extinction at the hands of Intel. So, all Jen-Hsun Huang (a former AMD CPU engineer) has to do is make the proposal, with reasonable terms, before AMD falls too far behind Intel to catch up again.

Mark my words. We'll be hearing about an nVidia/AMD marriage soon...
 
NO WAY! Team Red 'til I die!!!

Or until nVidia comes out with something better. ;)

On a serious note this would be very, very bad for us consumers. :(
 
I still think AMD and nVidia will merge. The logic of such a marriage is pretty intense.

NVIDIA has moved on from x86 and is developing a full-fledged ARM CPU. The only company that does well in the x86 market is Intel. At best, NVIDIA would buy AMD's corpse on the cheap, like they did with 3dfx.

However, I could see AMD just shrinking to end up like VIA. Maybe not a big player but still in business and making money.
 
I still think AMD and nVidia will merge. The logic of such a marriage is pretty intense.

AMD has an x86 license and still has world-class CPU engineers. nVidia has a market-leading ARM CPU product for everything below Bobcat wattage requirements. nVidia has more developed GPU-based (CUDA) compiled software tools. AMD also has the technology ironed out (no mean feat) for fusing CPUs and GPUs into the same die. This same AMD experience could also be used to build better ARM APUs than nVidia is capable of doing right now.

Furthermore, the writing seems to be on the wall for desktop CPUs: we don't really need any more performance on the desktop. (No, I'm not talking about gaming computers/game consoles - markets that AMD and nVidia would have locked up entirely if they merged, at least for the GPU portion in the near future, and probably for the whole CPU/GPU package further down the road). That means the way forward in desktop/server computing will be building code that uses GPUs for pretty well any floating point instructions, and much more massively parallel integer instruction architectures for CPUs. AMD's Bulldozer is exhibit 'A' for this type of x86 CPU, a chip designed almost entirely with integer-based supercomputing in mind.

The alternative for both companies is virtual extinction at the hands of Intel. So, all Jen-Hsun Huang (a former AMD CPU engineer) has to do is make the proposal, with reasonable terms, before AMD falls too far behind Intel to catch up again.

Mark my words. We'll be hearing about an nVidia/AMD marriage soon...

Not going to happen. That would create a virtual monopoly in the graphics card market. AMD would have to spin ATI off before an AMD/NVidia merger would be possible.
 
Because I play at 6100x1200

Yes, but the VAST majority, like 99% according to the last Steam survey, don't. The bulk is at 1920x1080 and below. So yeah, there is very little demand for $600 video cards these days.
 
Public perception (no matter how far off it is) does a lot for sales. For instance, the "Intel Inside" stickers, which said nothing about performance, were a marketing scheme that seemed to work pretty effectively. We all know it doesn't matter who makes the best single/dual-GPU high-end graphics card if you are buying a card with just enough power to play Flash games and watch 1080p videos. But most people don't have a clue what the difference is, or how big it is, between a 6750 and a 4890, or a GTX 580 and a GT 520.

So nVidia and ATI both do the same thing the car manufacturers do. Notice the "build and price your Fiesta" link on that page. Having the fastest car(d) on the market automatically means all your car(d)s, even the lower-end ones, are faster than the competition. So if the Ford Mustang wins a race over a Chevy Camaro, that automatically means a Ford Fiesta is a better car than a Chevy Sonic ;) . Or so enough people apparently believe, because car companies sink a ton of cash into racing. And it's the same thing with these overclocking contests sponsored by Intel, AMD, whoever. No matter how well your 2500K overclocks, your Celeron is still a Celeron.

So although AMD dropping high-end graphics cards would make no physical difference to their mainstream and lower-end cards, they could lose buyers' perception that they have a card worth buying at any price range.

"Intel Inside" worked because of the company's sheer pervasiveness at the time the slogan launched. Besides their utter dominance in CPUs, they also were (and still are) a major telecom player - I still think of Intel as the largest telecom company that nobody's heard of. You could find Intel hardware in a PC without an Intel CPU - while the same is true of AMD, they, in a way, are not as well-known outside of the two areas they compete with Intel in directly. (CPUs and telecom are historical competition areas for the two companies - oddly enough, AMD is better known in telecom than Intel, despite Intel, not AMD, being one of Ethernet's Old Men.)

nVidia isn't exactly profitable in the desktop GPU space; their cash cow is tablets and smartphones. (This is according to their most recent earnings reports; in other words, straight from the horse's mouth, via CNBC.) They are best known to the *consumer* for desktop graphics; however, that isn't where NVDA (NASD NMS) makes its money (nor has it been since even before the economy went south).

One thing I think AMD is looking to do is put the squeeze on NVDA in that space (tablets and smartphones), since nVidia is trying to leverage Tegra and poach on AMD's advantages in netbooks and notebooks. Also, I'm quite certain that CEO Read follows the earnings reports of all the public companies AMD competes with (and that includes both nVidia and Intel), and one of his strengths has been capitalizing on competitors' weaknesses. NVDA is vulnerable because they are trying to upscale Tegra and likely aren't focused as much on the desktop-GPU space; however, AMD can't afford to overlook Tegra either, due to its encroachment on AMD's to-date strength, the netbook/notebook space. In short, neither company is putting its major assets into the desktop-GPU space, and the war between them won't be won OR lost there. However, both companies will still maintain a presence there, if for no other reason than inertia.
 