Should AMD consider bowing out of the CPU market?

If they are turning a profit and providing humanity with a decent product, I'm not sure why they "should" bow out, unless they stand to somehow increase their potential or limit their risk by doing so.
 
Yes, the high-end market, but who said Intel wouldn't still offer the 2600K?

I am fairly certain Intel wouldn't offer a 2600k.

Let's see, things AMD did first:

- Unlocked multiplier Black Editions (so maybe a 2600, no K)
- x86_64 (without it, the chip probably wouldn't be 64-bit)
- Memory controller on die
...who knows what else.

Ivy Bridge will offer TDP configuration on the fly, something Bulldozer is introducing.
 
If AMD's goal is to compete at the high end, they should get out. That's not their aim, though. Most people don't buy CPUs; they go to Best Buy or dell.com and buy a retail machine, and more often than not they go for the cheapest option that meets their needs, which many times has an AMD processor inside.
 
Should Lowe's bow out of the home improvement warehouse business?
Should Apple bow out of the computer business?
Should Wendy's bow out of the hamburger business?
Should Target bow out of the retail superstore business?
Should Miller bow out of the beer business?

The answer is "no"

Plain and simple... You see AMD running dead last in the PC CPU race, while I see them running second... both are right, because there are really only two companies. Just because a company isn't currently the best doesn't mean they should just give up. There's a market for AMD CPUs, whether it's Llano, Phenom II, or BD. All of them fill a gap that Intel doesn't.
 
Why would they? Just because they are not the fastest doesn't mean they don't sell. We sell plenty of AMD systems, CPUs, and motherboards at work. Even though Intel is way faster in benchmarks, you can't really see the difference in the everyday things most people do. Power users will always go Intel; budget users will most likely go AMD.
 
Just because you are a monopoly doesn't mean you gained it illegally. There are plenty of monopolies around with little or no competition that don't have legal problems (of course, we know Intel is no saint).

Now, if Intel were to force AMD out of the market, it would be illegal. But currently, AMD can't put together a competitive product. And it's not like the new CEO can change anything, given that AMD has changed CEOs like underwear lately. The CEO isn't the one doing the work; that falls to the many people in R&D working with too little money.

And I thought there was a big legal settlement recently to exactly that effect. You can be sure that in the event AMD bows out of the market (extremely unlikely to happen since their products still sell very well), those files will be dug up and revisited in detail.

There are very few economic and legal minds out there who honestly believe a monopoly is a good situation for a market to be in. Are you attempting to argue otherwise?
 
There are very few economic and legal minds out there who honestly believe a monopoly is a good situation for a market to be in. Are you attempting to argue otherwise?

No, certainly not. I'm arguing that it may not be the devil incarnate if Intel got a monopoly, though they probably won't be saints either. One thing is for sure: the only sane people who would want that are people with Intel stock.
 
Being a Phenom II and SNB user, I can pretty much say that my 1090T was equal to or better than my 2500K. The only thing that's really leaps and bounds better than the 1090T/1100T is the 2600K/990X. I got better gaming and overall system performance out of my 1090T @ 4.2GHz than I did with my 2500K clocked the same. I was expecting some nice gains but was actually incredibly underwhelmed; it wasn't until I got my 2600K that I said, "yeah, this is an upgrade."

So you claim your 2500K was slower than your 1090T, but a 2600K was a huge upgrade over your 2500K for gaming? That's interesting, considering that Hyper-Threading is almost the only difference between the 2500K and 2600K, and it does nothing for you in games. In fact, there are many cases where a 2500K outperforms a 2600K by a few FPS at the same clocks due to the lack of overhead from not having HT. The only way a 1090T would come anywhere near a 2500K or 2600K is if you were severely GPU-limited or something else was terribly wrong.
 
I am fairly certain Intel wouldn't offer a 2600k.

Let's see, things AMD did first:

- Unlocked multiplier Black Editions (so maybe a 2600, no K)
- x86_64 (without it, the chip probably wouldn't be 64-bit)
- Memory controller on die
...who knows what else.

Ivy Bridge will offer TDP configuration on the fly, something Bulldozer is introducing.

The DEC Alpha in the '90s had an on-die memory controller. Fairly certain the original Pentiums had unlocked multipliers too; I think it was around '98 that Intel began locking all their multipliers.

So apart from AMD64, yeah...
 
The DEC Alpha in the '90s had an on-die memory controller. Fairly certain the original Pentiums had unlocked multipliers too; I think it was around '98 that Intel began locking all their multipliers.

So apart from AMD64, yeah...

From what I read, a fair bit of the DEC Alpha design team went over to AMD during the creation of the Athlon 64, hence the IMC.

We might all be on Itaniums now :p
 
:rolleyes: Geez, AMD would not get out of the CPU business!

They are going to cash in when MS/Sony/Nintendo make next gen consoles!

If it's as big a coup as I think it might be, I hope they invest that money well; they may yet be able to compete with Intel on more level footing.
 
Intel retained the DEC Alpha team, the fab, and the campus in Hudson. They are working on Itanium; their creation, Poulson, will see the light in Q1 2012: 8 cores, 12-wide issue, 4-way SMT, 50MB cache, 32nm.
 
Did anyone ever imagine what would happen if AMD joined Intel, much like 3dfx did Nvidia? We might move along even better. AMD has some good ideas!
 
Intel retained the DEC Alpha team, the fab, and the campus in Hudson. They are working on Itanium; their creation, Poulson, will see the light in Q1 2012: 8 cores, 12-wide issue, 4-way SMT, 50MB cache, 32nm.

Maybe they licensed it, I forget, but they took something from the Alpha design.
 
I'd have preferred that. A much, much better architecture. If Intel had invested heavily in it (instead they went for AMD64), we would have ditched the junk called x86 a long time ago.

I've seen this said many times and don't buy it. If IA-64 were so amazing, it would have some actual market share instead of being a minute presence.

I just don't buy it. Not saying x86 is great or anything, but IA-64 has always been a dog.
 
I'd have preferred that. A much, much better architecture. If Intel had invested heavily in it (instead they went for AMD64), we would have ditched the junk called x86 a long time ago.

They did invest heavily in it =p Too bad they priced it about 10x more than a comparably performing Opteron at the time.
 
The 2600K has 100MHz on it too... 8MB cache... and Hyper-Threading...

The extra cache doesn't seem to help in games, and HT definitely isn't doing anything for game performance. The 100MHz advantage would only come into play at stock, as they both have similar overclocking potential. Even with the 100MHz advantage the 2600K has at stock speeds, the 2500K is still faster in many games:
http://www.anandtech.com/show/4083/...core-i7-2600k-i5-2500k-core-i3-2100-tested/20

But my point wasn't to knock the 2600K. The user I originally quoted claimed that the 2500K was slower than his 1090T but that the 2600K represented a huge upgrade for gaming; that just doesn't jibe with reality.
 
I've seen this said many times and don't buy it. If IA-64 were so amazing, it would have some actual market share instead of being a minute presence.

I just don't buy it. Not saying x86 is great or anything, but IA-64 has always been a dog.

Itanium CPUs have always been the best performing in a given process technology: best at 180nm, 130nm, 90nm, and 65nm. Too bad they were always competing with CPUs one node ahead. :p
 
Intelligent discussion, no troll postings. Debate this. Phenoms aren't bad, and Bulldozer isn't what we thought but is still a good chip. But they can't break the stranglehold Intel has on the high-end market, and it's only going to get worse. I remember when my Opteron, Athlon 64, and X2 were the shit. Those days are gone, so should AMD consider just focusing on Llano APUs and GPUs (Radeon 7000, etc.), or continue to vie for the high end?

I personally think they either need new management or should bow out.

They have less than 40% of the GPU market, and their graphics division lost money last quarter. I think they would be better off ditching their graphics cards.
 
So you claim your 2500K was slower than your 1090T, but a 2600K was a huge upgrade over your 2500K for gaming? That's interesting, considering that Hyper-Threading is almost the only difference between the 2500K and 2600K, and it does nothing for you in games. In fact, there are many cases where a 2500K outperforms a 2600K by a few FPS at the same clocks due to the lack of overhead from not having HT. The only way a 1090T would come anywhere near a 2500K or 2600K is if you were severely GPU-limited or something else was terribly wrong.

I probably should have been a little more clear. The 2600K was a huge upgrade overall; I wasn't specifically saying it was a big upgrade in gaming, just in general.

At the time of testing, and in the games I was playing, the 2500K didn't offer me an upgrade from what I was running: a 1090T @ 4.2GHz/3.0GHz CPU-NB. It was more of a side-grade. I saw better framerate stability and had a smoother experience with my 1090T in Black Ops, CSS, and a few others compared to the 2500K. I think a lot of that could be due to having two extra cores for all the "background noise" while the others work the games as needed.

I don't really care if you don't believe me. I don't really have any [H] numbers to prove it other than some 3DMark runs (which did put the 1090T/2500K neck and neck, but 3DMark is irrelevant, imo). I'm just sharing my overall, subjective, end-user experience with the two.

I'm sure there are plenty of other people who either have, or have had both a 2500K/1090T+ that can comment.

At stock, obviously, the SNB-based parts will tear it apart, but when you've got the X6 overclocked and configured properly on the RAM/CPU-NB side of things, it actually holds its own quite well against the i5 750/2500K and, in certain instances, even the i7 9**/2600K. Which is why I can't understand why BD would have lower IPC per core than Phenom II, especially when JF specifically stated that BD would have IPC INCREASES.

If the above is true, they'd have been much better off doing something like taking the current Llano core, adding all the cache the Ph2s had, and putting out a true 8-core design instead of the shared-module approach BD takes.

The end of this evening is going to be very interesting.
 
The DEC Alpha in the '90s had an on-die memory controller. Fairly certain the original Pentiums had unlocked multipliers too; I think it was around '98 that Intel began locking all their multipliers.

So apart from AMD64, yeah...

The Intel 386SL in 1990 had an on-die memory controller.

http://books.google.com/books?id=eVcEWDIeTYcC&pg=PT209&lpg=PT209#v=onepage&q&f=false
The 386SL which was introduced in 1990 is a highly integrated microprocessor developed for small, portable laptop and notebook computers. It consumes less power. The Intel 386SL microprocessor contains a 386 CPU, memory controller, cache controller, etc.

And if you google for 386SL datasheets, you'll find block diagrams indicating the RAM was hooked up directly to the 386SL chip.
 
Itanium CPUs have always been the best performing in a given process technology: best at 180nm, 130nm, 90nm, and 65nm. Too bad they were always competing with CPUs one node ahead. :p

Based on what metric? MHz? Socket/core count?

It sure as hell isn't price/performance ratio, 'cause you can build four x86_64 systems for the same price and destroy the Itanic.
 
Itanium CPUs have always been the best performing in a given process technology: best at 180nm, 130nm, 90nm, and 65nm. Too bad they were always competing with CPUs one node ahead. :p

Itanic is great... if you can optimize your application for its custom compiler in order to actually use its parallel architecture. That's its problem.
 
Based on what metric? MHz? Socket/core count?

It sure as hell isn't price/performance ratio, 'cause you can build four x86_64 systems for the same price and destroy the Itanic.

Performance/power.

Cost is massive because demand is low and the chip is focused only on the extreme server market.
 
Suddenly this thread is more relevant. Well, maybe they can bottom-feed for seven more years and see what comes out next. But two major architectures in a row being complete duds is kind of ridiculous.
 
After the Bulldozer review, AMD should consider selling their CPUs at $150 tops.
 
Performance/power.

Cost is massive because demand is low and the chip is focused only on the extreme server market.

It makes more sense to purchase 10x the x86_64 resources and distribute/HA/mirror/hot-failover/raw-brute-force the problem.

You can argue TCO with the power draw, but then I can counter with virtualization and powering down resources when they're not required.

Further, outside of HPC, which still favors x86_64 (or hybrid setups with GPUs), the applications that really shine on IA-64 are few and far between and are being dropped by all major vendors (Oracle, MS, Red Hat).

This is why I don't buy this argument. Itanic was HP's baby; Intel tried to get it to go somewhere, it fell on its face, and now everyone is abandoning it. If it were such a great platform, it would have gained more traction. The fact is, though, it really isn't a good platform. Not that x86_64 is the bee's knees or anything, but it is more than sufficient.
 
For those of you talking Itanium, I was shocked to find out that Intel actually makes more money on Itanium than AMD does with its entire CPU business.
 
For those of you talking Itanium, I was shocked to find out that Intel actually makes more money on Itanium than AMD does with its entire CPU business.

lol. Now I'm curious how much IBM made off of POWER6 and 7... brb

EDIT: whatever it is, IBM is seemingly bigger than AMD :eek:
 
Suddenly this thread is more relevant. Well, maybe they can bottom-feed for seven more years and see what comes out next. But two major architectures in a row being complete duds is kind of ridiculous.

Indeed. I want to support AMD but they're making it really hard for us gamers to do so.
 
After the Bulldozer review, AMD should consider selling their CPUs at $150 tops.

I don't think they need to drop them that much.

We'll see soon, as sales (or lack thereof) will really dictate where Bulldozer settles in pricing.
 
lol. Now I'm curious how much IBM made off of POWER6 and 7... brb

EDIT: whatever it is, IBM is seemingly bigger than AMD :eek:

POWER7 is used for massively parallel computing/supercomputing.

IBM zSeries uses similar but smaller processors that are CISC in nature; I think the ISA is called z/Architecture. They're used in mainframes, which can trace their lineage back to OS/360 and are still alive and well in all ages, shapes, and sizes. By the way, fuck MVS. Sorry, I just hate using MVS under z/VM and having to do crap like JCL and all this ancient mainframe OS magic.

AMD doesn't need to duck out just because they're not top dog; they just need a pricing strategy and to focus on their APUs while working to fix and tweak Bulldozer. There is another thread on here saying the single-threaded performance could be explained by thrashing at the cache level.

But Trinity is supposed to have a Northern Islands-class GPU. If we're stuck with the current consoles for another three years, we ought to be happy. The whole concept of an APU raises the lowest common denominator of computing, and that is the mainstream market, if only they could somehow win OEM hearts.
 
Can this thread be converted to a POLL? 'Cause I sorta want to vote yes right now after taking in the BD review numbers.
 
Given those numbers, where in many cases it's slower than the 1100T and, in some, even the X4 980... AMD should get the fuck out of the processor business and sell it to VIA or someone else who can actually PROPERLY SPEND the little R&D money they had (compared to Intel) on something that ISN'T SLOWER than their existing stuff in MOST usage scenarios.

Sigh...
 