Will AMD ever get their act together once again?

Will AMD ever get their act together once again when it comes to their CPUs?

For a short period of time, AMD CPUs were ahead of Intel when it came to affordable overall performance, w/multi-core CPUs, 64-bit software support, and better memory performance from an on-chip memory controller. Their CPUs nowadays aren't exactly Model T's by any means when compared to Intel's products, but they aren't exactly Ferraris either.

AMD's (ATI) GPUs, on the other hand, appear to be keeping up w/Nvidia, w/each company coming out w/a better product/technology to compete against the other, thus helping us, the customers. The competition also helps keep prices under control.

With less competition, we are seeing fewer innovative CPU products coming out. With Intel and their Tick-Tock formula, it's one slow, minor upgrade after another from one CPU generation to the next, rather than a Tick and a big bell ringing. With AMD, it's more of a flat Tick, Tick, w/nothing really new coming out in a very long time. They seem to keep a steady pace behind Intel, w/o doing anything to equal or pass them.

With no real technology jumps, does it make sense to upgrade as often as we once did? Hence the slowdown in desktop purchases.

Memory technology, on the other hand, seems to be really behind the curve, most likely due to Rambus, whom no one wants to do business with.

.
 
I don't know if you're speaking of gaming and general PC use or what, but I just built an AMD rig for my friend and I was surprised how well it did. An 8320 at 4.2 got the same FPS in most games as a 2500K at 4.8 with the same GPU and speeds... nothing wrong with AMD CPUs from where I sit.
 
Enthusiasts aren't their only market, and thus price per chip is more important when selling en masse.
It's like cars: I know full well any Ferrari, AMG, etc. is going to blow my Impala away when it comes to performance. However, price aside, they're not what I need for me and my family. The same goes for CPUs. An A4 is not a beast, but if someone basically surfs the internet and uses a word processor, it's more than enough.

High-end performance is a small market compared with general usage for the majority, so that may not be as big a loss as people on this forum may think.
 
I don't know if you're speaking of gaming and general PC use or what, but I just built an AMD rig for my friend and I was surprised how well it did. An 8320 at 4.2 got the same FPS in most games as a 2500K at 4.8 with the same GPU and speeds... nothing wrong with AMD CPUs from where I sit.

First of all, that's overclocked.

Second, you've got to remember that the 2500K, which is a Socket 1155 CPU, only has two memory channels vs. the four of Socket 2011. That should impede its performance against the 8320, which I would imagine is a four-channel part. How would an AMD 3.5 GHz 8320 do against a Socket 2011 Intel 3.4 GHz i7-4930K, w/o overclocking?

.
 
AMD has been focusing on low power CPUs, and Intel is doing the same. Intel even went as far as to say ARM is their primary threat, and AMD was smart in moving to low power CPUs earlier than Intel.

Besides, AMD's short-term dominance was a fluke, and not something likely to be repeated.

First of all, that's overclocked.

Second, you've got to remember that the 2500K, which is a Socket 1155 CPU, only has two memory channels vs. the four of Socket 2011. That should impede its performance against the 8320, which I would imagine is a four-channel part. How would an AMD 3.5 GHz 8320 do against a Socket 2011 Intel 3.4 GHz i7-4930K, w/o overclocking?

.

He compared an 8320 at 4.2 with a 2500K at 4.8, and didn't even mention 2011.
 
AMD has been focusing on low power CPUs, and Intel is doing the same. Intel even went as far as to say ARM is their primary threat, and AMD was smart in moving to low power CPUs earlier than Intel.

Besides, AMD's short-term dominance was a fluke, and not something likely to be repeated.



He compared an 8320 at 4.2 with a 2500K at 4.8, and didn't even mention 2011.

Be it a fluke or simply being ahead technology-wise, either way it helped us, the consumers, at the time, and it would be nice if they kept that pressure on Intel. This would help us all, both in price and in new technology, whether you like AMD or Intel. Like I said earlier, look at AMD's GPUs vs. Nvidia's: they are pretty much neck and neck, which helps us, the consumer.

.
 
Will AMD ever get their act together once again when it comes to their CPUs?

For a short period of time, AMD CPUs were ahead of Intel when it came to affordable overall performance, w/multi-core CPUs, 64-bit software support, and better memory performance from an on-chip memory controller. Their CPUs nowadays aren't exactly Model T's by any means when compared to Intel's products, but they aren't exactly Ferraris either.


.

Hell, for several years the K7 and then the K8 were awesome. Not sure I'd say it was short...

Remember the Thunderbird series being $100-$140ish and being faster than the sad, sad NetBurst?
 
Intel handed out fistfuls of cash to OEMs who didn't use AMD products. That's what happened.

It's hard to 'get your act together' when a corporation 10x your size with 10x your net income bullies you out of market share.
 
AMD's "dominance" wasn't very short. AMD was neck & neck or faster than Intel ever since the K6-2/3 and up until Intel released their Core 2 series.

Anyway, the CPU speed wars are a thing of the past. Intel and AMD are focusing more on low-power products because that's where the money is. AMD sells many more APUs than they do FX CPUs, and Intel sells more Pentiums and i3's than anything else.

This false notion that Intel has become "lazy" due to "lack of competition" needs to die out already. Intel gets roughly a 5% perf increase with each new uarch. This isn't because AMD isn't a "threat" to them; it's because the uarch is old and is reaching its performance wall. Intel's focus is on lowering power consumption and further optimizing their uarch for mobile products. This is why Haswell wasn't a big hit with "enthusiasts" but was actually really impressive on the mobile front.

AMD's current CPUs are in a decent spot. Sure, Vishera isn't much faster than Phenom II, an architecture from 2009, but it's still competitive with Intel. All AMD really needs to work on is their single-threaded perf, which will be improved with Steamroller. Bulldozer was a completely fresh uarch, while Intel's current uarchs are heavily modified P3 derivatives. AMD has huge room for improvement with Bulldozer, whereas Intel is just squeezing out small gains to make theirs last as long as possible. Not that it matters, seeing as how their R&D budget blows AMD's out of the water. The fact that AMD even survived this long, let alone remained competitive when their resources are microscopic compared to Intel's, is nothing short of amazing.

That being said, the current AMD is not the same AMD from yesterday. The Bulldozer fiasco was a product of many things: completely fresh uarchs are bound to have kinks that need to be worked out; the uarch was originally planned for release around 2008 or 2009 but got scrapped for whatever reason; the uarch was hyped to the moon by BOTH AMD and enthusiast forums all over the place (it gets annoying when enthusiasts try to play innocent by denying this), and many of the nonsensical claims came from the enthusiasts themselves; the product was rushed out, much to the chagrin of the engineers who wished to further tweak it but were ignored thanks to corporate meddling; etc.

Had Bulldozer released when it was initially projected to, it would have been mature by now and realistically would have caught up to or overtaken Intel in performance. Intel still has other advantages that AMD could only dream of, such as having their own dedicated fabs and the money to spend on getting to new process nodes. AMD is at the mercy of GloFo and TSMC.

Even so, the new management at AMD are awesome and have turned the company around for the better. AMD has secured console wins and has their tech inside all three of the new consoles. The custom APUs have eight low-power Jaguar cores, which will force developers to make their engines properly multi-threaded, which in turn will translate to better performance for everyone on the PC ports/versions. The new Mantle API, which is basically the same thing as the low-level APIs in the consoles that let devs code straight "to the metal," will provide performance boosts to anyone with a GCN GPU, remove CPU overhead and bottlenecks, and speed up porting, as well as potentially bring more games to Linux, which is something Valve's Gabe Newell has been pushing for ages due to his hatred of Windows.

AMD has been doing a lot right recently. Things will just get better from here on out.


About Intel's unethical practices:
[image: CBjZBa5.png]
 
Hell, for several years the K7 and then the K8 were awesome. Not sure I'd say it was short...

Remember the Thunderbird series being $100-$140ish and being faster than the sad, sad NetBurst?

At the time, Intel's were quite expensive. It was the AMD x64 dual cores that were a threat to Intel, w/Intel not having anything to compete against them in either price or technology. Remember too, they ended up licensing AMD's x64. The only 64-bit chip Intel had at the time was their very expensive Itanium, and for whatever reason, they went w/AMD's. It was probably less complicated, and it could run 32-bit software, which was the norm at the time and which may have been the problem w/their own 64-bit technology.

.
 
At the time, Intel's were quite expensive. It was the AMD x64 dual cores that were a threat to Intel, w/Intel not having anything to compete against them in either price or technology. Remember too, they ended up licensing AMD's x64. The only 64-bit chip Intel had at the time was their very expensive Itanium, and for whatever reason, they went w/AMD's. It was probably less complicated, and it could run 32-bit software, which was the norm at the time and which may have been the problem w/their own 64-bit technology.

.

Yeah, I just wish CPUs were cheaper these days. I remember when a 1.4 GHz T-Bird was $150 and Intel had nothing faster...
 
AMD was first to move the L2 cache onto their CPUs and got huge gains.
AMD was the first to provide dual cores, REAL dual cores, not two chips on the same socket.
AMD gave us 64-bit instructions for our x86 chips.
AMD integrated the memory controller into their CPUs, providing huge memory bandwidth gains.
I was all in on AMD from the K6-2 to Socket 939.
Really, quite possibly the best days ever for us enthusiasts as far as bang for the buck goes.
A Barton 2500+ and an Abit NF7-S motherboard = insane gaming performance, with some of the best 5.1-channel sound ever made added onto an excellent board as the mother of all bonuses.
I still miss my Socket 939 DFI SLI board and dual-core 2.2 GHz chip. It absolutely SLAYED anything Intel had and was hundreds less.
It was all downhill from there: cutting cache, losing performance, ramping clock speeds. Those were the SAME THINGS that made Intel's P4 a crappy chip.
Went Intel Core after 939 was done.
Probably going to do a build some time next year and I'll re-evaluate, and if AMD is close, I'll go AMD to keep competition alive and save a few bones in the process.
Kind of miss those days. AMD was crazy awesome.
 
AMD's "dominance" wasn't very short. AMD was neck & neck or faster than Intel ever since the K6-2/3 and up until Intel released their Core 2 series.

Anyway, the CPU speed wars are a thing of the past. Intel and AMD are focusing more on low-power products because that's where the money is. AMD sells many more APUs than they do FX CPUs, and Intel sells more Pentiums and i3's than anything else.

This false notion that Intel has become "lazy" due to "lack of competition" needs to die out already. Intel gets roughly a 5% perf increase with each new uarch. This isn't because AMD isn't a "threat" to them; it's because the uarch is old and is reaching its performance wall. Intel's focus is on lowering power consumption and further optimizing their uarch for mobile products. This is why Haswell wasn't a big hit with "enthusiasts" but was actually really impressive on the mobile front.

AMD's current CPUs are in a decent spot. Sure, Vishera isn't much faster than Phenom II, an architecture from 2009, but it's still competitive with Intel. All AMD really needs to work on is their single-threaded perf, which will be improved with Steamroller. Bulldozer was a completely fresh uarch, while Intel's current uarchs are heavily modified P3 derivatives. AMD has huge room for improvement with Bulldozer, whereas Intel is just squeezing out small gains to make theirs last as long as possible. Not that it matters, seeing as how their R&D budget blows AMD's out of the water. The fact that AMD even survived this long, let alone remained competitive when their resources are microscopic compared to Intel's, is nothing short of amazing.

That being said, the current AMD is not the same AMD from yesterday. The Bulldozer fiasco was a product of many things: completely fresh uarchs are bound to have kinks that need to be worked out; the uarch was originally planned for release around 2008 or 2009 but got scrapped for whatever reason; the uarch was hyped to the moon by BOTH AMD and enthusiast forums all over the place (it gets annoying when enthusiasts try to play innocent by denying this), and many of the nonsensical claims came from the enthusiasts themselves; the product was rushed out, much to the chagrin of the engineers who wished to further tweak it but were ignored thanks to corporate meddling; etc.

Had Bulldozer released when it was initially projected to, it would have been mature by now and realistically would have caught up to or overtaken Intel in performance. Intel still has other advantages that AMD could only dream of, such as having their own dedicated fabs and the money to spend on getting to new process nodes. AMD is at the mercy of GloFo and TSMC.

Even so, the new management at AMD are awesome and have turned the company around for the better. AMD has secured console wins and has their tech inside all three of the new consoles. The custom APUs have eight low-power Jaguar cores, which will force developers to make their engines properly multi-threaded, which in turn will translate to better performance for everyone on the PC ports/versions. The new Mantle API, which is basically the same thing as the low-level APIs in the consoles that let devs code straight "to the metal," will provide performance boosts to anyone with a GCN GPU, remove CPU overhead and bottlenecks, and speed up porting, as well as potentially bring more games to Linux, which is something Valve's Gabe Newell has been pushing for ages due to his hatred of Windows.

AMD has been doing a lot right recently. Things will just get better from here on out.


About Intel's unethical practices:
[image: CBjZBa5.png]


Certainly there is no question about it when it comes to their GPUs. Their R9 290X is supposed to be faster than Nvidia's GeForce GTX Titan. It'll be interesting to see what they price it at. Rumors have it selling for around $600, while Titans presently run around $1,000.

I look forward to them working on their CPUs as well.

.
 
My point was for normal gaming and general use, meaning 1080p with a single card, an 8300-series CPU is all you really need. If you're gonna run quad SLI or whatever, then I'm sure Socket 2011 and quad-channel memory might come in handy, but it doesn't mean shit in the scenario that 99% of even the enthusiast-type crowd, like myself and a lot of people here, are actually in.

I consider myself an enthusiast, or [H] or whatever, lol, but I do not need more than one card; in fact, I steer people away from it. I build for a lot of beginners and even some seasoned PC gamers, and I always try to save them the most money. I always have them sit at my PC and compare, and then ask: you think it's worth $800 more? Usually I get a smile and a response along the lines of, I'll take the $800 and this one, thanks!
 
First of all, that's overclocked.

Second, you've got to remember that the 2500K, which is a Socket 1155 CPU, only has two memory channels vs. the four of Socket 2011. That should impede its performance against the 8320, which I would imagine is a four-channel part. How would an AMD 3.5 GHz 8320 do against a Socket 2011 Intel 3.4 GHz i7-4930K, w/o overclocking?

.
The 8320 uses dual-channel RAM too, not quad, and the i7-4930K costs more than double what AMD's 8320 does.
 
My point was for normal gaming and general use, meaning 1080p with a single card, an 8300-series CPU is all you really need. If you're gonna run quad SLI or whatever, then I'm sure Socket 2011 and quad-channel memory might come in handy, but it doesn't mean shit in the scenario that 99% of even the enthusiast-type crowd, like myself and a lot of people here, are actually in.

I consider myself an enthusiast, or [H] or whatever, lol, but I do not need more than one card; in fact, I steer people away from it. I build for a lot of beginners and even some seasoned PC gamers, and I always try to save them the most money. I always have them sit at my PC and compare, and then ask: you think it's worth $800 more? Usually I get a smile and a response along the lines of, I'll take the $800 and this one, thanks!

Sure, if you want to trick people into thinking that AMD is the better solution, then comparing against S2011 and talking about an $800 difference is a great way to do it.

Meanwhile, speccing the same system with an i5 would cost around $100 more, depending on whether they take a K chip or not.
 
The 8320 uses dual-channel RAM too, not quad, and the i7-4930K costs more than double what AMD's 8320 does.

Actually, it costs about 5 times more here for the 4930K...

Sure, if you want to trick people into thinking that AMD is the better solution, then comparing against S2011 and talking about an $800 difference is a great way to do it.

Meanwhile, speccing the same system with an i5 would cost around $100 more, depending on whether they take a K chip or not.

Depends what games you play. If Mantle moves the overhead off the CPU and your main game is BF4, then an AMD system is the preferred choice in that game.
Normally you get slightly better FPS with an Intel system today, but this is changing with the Mantle API.
All in all, many who changed from an overclocked Intel system to an overclocked 8350 system say the same thing: it works great. And if you're able to change systems and not be concerned about the gaming, I'd say it works fine.

I play BF4, and if Mantle works great there, well, I might opt to go with an AMD CPU as well, in spite of a 4.7 GHz Haswell.
 
With less competition, we are seeing fewer innovative CPU products coming out. With Intel and their Tick-Tock formula, it's one slow, minor upgrade after another from one CPU generation to the next, rather than a Tick and a big bell ringing. With AMD, it's more of a flat Tick, Tick, w/nothing really new coming out in a very long time. They seem to keep a steady pace behind Intel, w/o doing anything to equal or pass them.
.

What does it really matter right now how slow or fast CPU innovations are coming out?

For the people who do encoding and heavily stressed multi-core applications, I can see CPU innovations being something to think about. But for average PC users, and probably most every gamer out there, what we have in the CPU department is more than enough power. I remember looking on the back of game boxes all the time to make sure my machine could handle the game, and it wasn't long before what I had was obsolete. Now I see a game I might like and don't worry about system requirements, still to this day, and my current primary rig is going on four years old and is, what, three or four architectures back from Haswell?
 
If Mantle removes CPU overhead then Intel CPUs will benefit too.

This. It's good for both Intel and AMD.

AMD might benefit a little more, at least when comparing an 8-core AMD to a 4-core Intel, because Mantle should scale better with an 8-core CPU than DX has in the past.

So it might close the gap a little.

At least that's what I would expect to happen, but I don't think we will see Mantle allow AMD to blow Intel out of the water when it comes to CPU performance.

If anything, it might make a 6-core Intel CPU look more attractive despite its price premium over 4 cores.
 
Master [H] said:
Enthusiasts aren't their only market, and thus price per chip is more important when selling en masse.
It's like cars: I know full well any Ferrari, AMG, etc. is going to blow my Impala away when it comes to performance. However, price aside, they're not what I need for me and my family. The same goes for CPUs. An A4 is not a beast, but if someone basically surfs the internet and uses a word processor, it's more than enough.

High-end performance is a small market compared with general usage for the majority, so that may not be as big a loss as people on this forum may think.

^ This. I just upgraded my brother's computer on a budget. Went from an AM2 X2 5400+ and HD 4850 to an FX-6300, a budget mobo, and an HD 7770 for < $250. Got a small rebate as well as a free game. (Thanks, Microcenter bundle! At least I think it was a great deal.)

He occasionally plays some LoL, WoW, and other light gaming at 1080p. He was happy with his old system, except the mobo died and I didn't feel like replacing/fixing it. The new system is more than enough for him.
 
What does it really matter right now how slow or fast CPU innovations are coming out?

For the people who do encoding and heavily stressed multi-core applications, I can see CPU innovations being something to think about. But for average PC users, and probably most every gamer out there, what we have in the CPU department is more than enough power. I remember looking on the back of game boxes all the time to make sure my machine could handle the game, and it wasn't long before what I had was obsolete. Now I see a game I might like and don't worry about system requirements, still to this day, and my current primary rig is going on four years old and is, what, three or four architectures back from Haswell?

I love gaming on my PC. I love watching others game on Twitch.tv. When I stream to Twitch.tv, I have to encode and render on the fly. That's enough reason to justify upgrading to an 8-core AMD or a 6-core Intel. And I can't live without a second monitor to keep all my streaming apps and stream chat open on. Enthusiast gaming is requiring more and more, yet Intel and AMD CPUs are becoming stagnant.

Just speaking from a gamer's perspective. :)
 
I don't know if you're speaking of gaming and general PC use or what, but I just built an AMD rig for my friend and I was surprised how well it did. An 8320 at 4.2 got the same FPS in most games as a 2500K at 4.8 with the same GPU and speeds... nothing wrong with AMD CPUs from where I sit.

This was my experience with all the FX CPUs. They ran very well, but the benchmarks stunk, except the multithreaded ones. With that said, I don't know why people are so bent on increasing CPU performance. How often are we CPU bottlenecked with a Core i7 or an FX-8350? The FX-8350 matches the FPS of an i7 in BF3/BF4 for about 60% of the cost.

I think we should focus more on SSD and GPU performance.
 
Signs say that AMD's lead in x86 was a one-time thing. Hey, it was good while it lasted! ;) Economies of scale show that AMD's ever-shrinking size is not the way to dig its way out of a gigantic performance and manufacturing-process deficit. AMD can still scrape by, nibbling at the low end and selling some mainstream and some higher-end processors.
 
I love gaming on my PC. I love watching others game on Twitch.tv. When I stream to Twitch.tv, I have to encode and render on the fly. That's enough reason to justify upgrading to an 8-core AMD or a 6-core Intel. And I can't live without a second monitor to keep all my streaming apps and stream chat open on. Enthusiast gaming is requiring more and more, yet Intel and AMD CPUs are becoming stagnant.

Just speaking from a gamer's perspective. :)

Yeah, exactly. But with doing those kinds of things... are big innovations needed from CPU developers because it's too demanding on the current lineup of processors?

This was my experience with all the FX CPUs. They ran very well, but the benchmarks stunk, except the multithreaded ones. With that said, I don't know why people are so bent on increasing CPU performance. How often are we CPU bottlenecked with a Core i7 or an FX-8350? The FX-8350 matches the FPS of an i7 in BF3/BF4 for about 60% of the cost.

I think we should focus more on SSD and GPU performance.

^This is my point. Who cares about the current lack of innovation? It's not like we're in dire need of better CPUs all the time right now. If something comes up that calls for a much better architecture/CPU, I'd be surprised if both AMD and Intel didn't work on it.
 
Games that use all 8 cores of my FX-8120, like BF3, won't allow me to turn up the graphics and stream/render at the same time. My game will start lagging from the CPU load. Single-threaded stuff like Skyrim is no problem. With the new consoles coming out, hopefully we will have more multithreaded games to make consumers desire more performance out of their PCs.
 
The FX-9590 is the fastest CPU in Battlefield 4.

AMD destroys Intel in Battlefield 4 benchmarks. When games start using all 8 cores, we will see AMD CPUs start to perform better.
 
I don't know if you're speaking of gaming and general PC use or what, but I just built an AMD rig for my friend and I was surprised how well it did. An 8320 at 4.2 got the same FPS in most games as a 2500K at 4.8 with the same GPU and speeds... nothing wrong with AMD CPUs from where I sit.

An 8320 at 4.2 GHz keeping up with a 2500K at 4.8?... I'd love to see that.
 
Signs say that AMD's lead in x86 was a one-time thing. Hey, it was good while it lasted! ;) Economies of scale show that AMD's ever-shrinking size is not the way to dig its way out of a gigantic performance and manufacturing-process deficit. AMD can still scrape by, nibbling at the low end and selling some mainstream and some higher-end processors.

People forget how AMD achieved that lead. The K6 was based on technology acquired from NexGen Systems, and the Athlon was derived from the DEC Alpha, as AMD hired a crap ton of former DEC engineers after DEC was bought by Compaq and shut down. The K5 and earlier CPUs are the track record of the real AMD prior to these events. Phenom and all subsequent CPUs show AMD is back to its status quo now that the technology from the Athlons has been played out and taken as far as it could go. This is somewhat of an oversimplification of history, but you get the idea.

Unless another great buyout opportunity comes around, AMD will remain "that other CPU maker," as they were before the Athlon, and one day be relegated to the realm of Centaur and Cyrix when it comes to CPUs.
 
People forget how AMD achieved that lead. The K6 was based on technology acquired from NexGen Systems, and the Athlon was derived from the DEC Alpha, as AMD hired a crap ton of former DEC engineers after DEC was bought by Compaq and shut down. The K5 and earlier CPUs are the track record of the real AMD prior to these events. Phenom and all subsequent CPUs show AMD is back to its status quo now that the technology from the Athlons has been played out and taken as far as it could go. This is somewhat of an oversimplification of history, but you get the idea.

Unless another great buyout opportunity comes around, AMD will remain "that other CPU maker," as they were before the Athlon, and one day be relegated to the realm of Centaur and Cyrix when it comes to CPUs.

I remember those days well. If you wanted both good integer performance AND a strong FPU, Intel was the ONLY way to go. Cyrix had competitive integer performance, but their FPUs stunk and sucked for gaming. AMD was just all-around "meh" until the Athlon hit the scene. Then they got lucky with Intel riding out "NetBust" for 5 years...

It's simply a return to the status quo. Their processors are adequate, but can shine if you bang on all 8 cores with heavy INT workloads. If you need strong INT performance in addition to a beefy FPU, Intel is where it's at. Unless AMD gets bought out by a company like Samsung with a massive R&D budget to play catch-up, don't plan on the Athlon 64 glory days returning again.

*waits for Intel to start charging $500+ for mainstream CPUs again*
 
Well, that's nice, Dan_D, but Intel hasn't done a thing to warrant a purchase. I mean, what's the difference in raw power between a 2700K and a 4770K? The only thing I see them bragging about is "we use less power over the years!" I suspect that they have better tech in waiting, but looking at their current offerings in the $145-$199 range, using Amazon pricing since that's a place everyone can order from, I'm just not impressed.

For $150 you can have an FX-8320 and overclock the snot out of it. In BF4 you'll certainly exceed anything that Intel has in that price range. Why spend more? For extra frames in single-threaded, ancient engines? All of the ancient apps also run fine in Win 8.1 64. If you want to try for FX-9590 speeds, grab a motherboard like the ASRock Extreme9 for $170.

I don't hate Intel. I owned a Q6600 back in the day, and others. I just don't see what they are trying to accomplish right now. They need to chuck some rocks into their pond and make some waves.
 
*waits for Intel to start charging $500+ for mainstream CPUs again*
What are you, high? :p

I don't think Intel ever charged $500 for mainstream CPUs. As far back as I can remember, working for PC companies in the 1980s, mainstream models were priced around $120-$130. Top-speed models were introduced at $500 and $1,000 price points in the 386 and 486 era, but that was before Intel decided to realign prices so Windows PCs could be sold more cheaply, since that was when the PC industry really started to take off.

If you have some particular insight into the economics of CPU pricing, please share. $500, which is 7-8x higher than current ASPs, doesn't make sense at all for a volume processor, and Intel isn't stupid enough to price processors so far out of reach if it wants to sell millions of units a quarter.
 
This was my experience with all the FX CPUs. They ran very well, but the benchmarks stunk, except the multithreaded ones. With that said, I don't know why people are so bent on increasing CPU performance. How often are we CPU bottlenecked with a Core i7 or an FX-8350? The FX-8350 matches the FPS of an i7 in BF3/BF4 for about 60% of the cost.

I think we should focus more on SSD and GPU performance.

Actually, in my opinion, it is memory that is way behind, technology-wise. SSDs and GPUs are moving right along.

Memory chips don't presently run at the same speeds the CPUs do. Intel's latest officially supported memory speeds run from 1600 to 1866 MHz, or 1.6 to 1.866 GHz, vs. what, around 3.9 GHz for their CPUs? That's roughly half the processor speed. It would be nice if they ran at the same speed: 4000 MHz memory for a 4.0 GHz CPU.
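
Just as a rough back-of-the-envelope sketch (my own assumed numbers: dual-channel DDR3 w/a 64-bit bus per channel, which is what Socket 1155 and AM3+ give you), the theoretical peak bandwidth at DDR3-1866 works out to about 30 GB/s:

    # Hypothetical back-of-the-envelope figures, not measurements from this thread:
    transfers_per_sec = 1866e6    # DDR3-1866 moves 1866 MT/s (the DRAM clock itself is 933 MHz, double data rate)
    channels = 2                  # dual-channel, as on Socket 1155 / AM3+
    bytes_per_transfer = 8        # each channel is 64 bits wide
    peak_gb_per_sec = transfers_per_sec * channels * bytes_per_transfer / 1e9
    print(round(peak_gb_per_sec, 1))   # -> 29.9, i.e. ~29.9 GB/s theoretical peak

And since 1866 is really the transfer rate rather than the clock, the gap to a ~3.9 GHz core is even wider than the "half" it looks like on paper; the big on-die caches are what hide most of it.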

I realize that there are people who are happy w/what they have, and that's fine. By the way, at some point in time, those very systems you have were top of the line, performance-wise.

Systems nowadays are surpassing the fastest supercomputers of the past.

.
 
People forget how AMD achieved that lead. The K6 was based on technology acquired from NexGen Systems, and the Athlon was derived from the DEC Alpha, as AMD hired a crap ton of former DEC engineers after DEC was bought by Compaq and shut down. The K5 and earlier CPUs are the track record of the real AMD prior to these events. Phenom and all subsequent CPUs show AMD is back to its status quo now that the technology from the Athlons has been played out and taken as far as it could go. This is somewhat of an oversimplification of history, but you get the idea.

Unless another great buyout opportunity comes around, AMD will remain "that other CPU maker," as they were before the Athlon, and one day be relegated to the realm of Centaur and Cyrix when it comes to CPUs.

DEC was way ahead of everyone else, and they were taken over and torn apart solely for their technology, w/everyone bleeding them of it and of their staff. One can only imagine where they would have been today if they had continued on. 128-bit, perhaps? They were already running 64-bit when they were taken over back in the '90s.

.
 
Games that use all 8 cores of my FX-8120, like BF3, won't allow me to turn up the graphics and stream/render at the same time. My game will start lagging from the CPU load. Single-threaded stuff like Skyrim is no problem. With the new consoles coming out, hopefully we will have more multithreaded games to make consumers desire more performance out of their PCs.

And Battlefield is about the only game I hear about that uses multiple/all cores on the CPU, as well as about the only benchmark I see AMD fanbois use to prove AMD is faster.

I know I made it seem like I assumed just one piece of software that stressed the CPU would have AMD/Intel looking to make some new innovations and further increase what their respective parts could do... but I don't see that happening; I don't believe this one example is going to do it. I mean, aren't we finally starting to see some software developers really use multiple cores? If not, I question why single-threaded performance is always benched and talked about. And even then, how many programs take advantage of 4+ cores? I may be way off base here, but that's what I'm talking about. There's simply not enough demand for it. Or am I completely wrong?
 
What are you, high? :p

I don't think Intel ever charged $500 for mainstream CPUs. As far back as I can remember, working for PC companies in the 1980s, mainstream models were priced around $120-$130. Top-speed models were introduced at $500 and $1,000 price points in the 386 and 486 era, but that was before Intel decided to realign prices so Windows PCs could be sold more cheaply, since that was when the PC industry really started to take off.

If you have some particular insight into the economics of CPU pricing, please share. $500, which is 7-8x higher than current ASPs, doesn't make sense at all for a volume processor, and Intel isn't stupid enough to price processors so far out of reach if it wants to sell millions of units a quarter.

Please forgive my wording. I meant that the upper mid-range models we are currently paying $200-$300 for would be more likely to go up in price if AMD were to falter or offer ZERO competition. An exaggeration, certainly, but monopolies are a bitch.
 
Well, that's nice, Dan_D, but Intel hasn't done a thing to warrant a purchase. I mean, what's the difference in raw power between a 2700K and a 4770K? The only thing I see them bragging about is "we use less power over the years!" I suspect that they have better tech in waiting, but looking at their current offerings in the $145-$199 range, using Amazon pricing since that's a place everyone can order from, I'm just not impressed.

For $150 you can have an FX-8320 and overclock the snot out of it. In BF4 you'll certainly exceed anything that Intel has in that price range. Why spend more? For extra frames in single-threaded, ancient engines? All of the ancient apps also run fine in Win 8.1 64. If you want to try for FX-9590 speeds, grab a motherboard like the ASRock Extreme9 for $170.

I don't hate Intel. I owned a Q6600 back in the day, and others. I just don't see what they are trying to accomplish right now. They need to chuck some rocks into their pond and make some waves.

Like many AMD people say about Intel winning the vast majority of all the other benchmarks: "so what, AMD will still get the job done." In this case, so will Intel. And as I mentioned in my other recent comment, it's the SAME goddamn program/game that gets brought up every time: Battlefield. For the record, not everyone plays that one particular game. (Not trying to come off as an ass, just a statement.)

As for Intel doing something to warrant a purchase, as many have mentioned, their size and profit margins compared to AMD's say it all. If they aren't worthy of a purchase, why are they doing so well?

Now, I'm not shitting on AMD, or at least not purposely shitting on them, as I am a fan (for the record, I have four AMD machines in my house compared to my one Intel in my sig). But come on... why shit on Intel when their stuff IS superior, bottom-line performance-wise? For real-world use like web surfing, email checking, and word processing, I can agree the difference is negligible. But that's why we have benchmarks: a way of showing us what kind of headroom we have on hardware.
 
Naw, not seeing that happen really. Price elasticity doesn't favor such a move. I know it's hyperbole, but some people take it as fact.

AMD isn't really important anymore as a CPU competitor. It has to stick to the low-cost ghetto, where it ships 2x as much silicon to remain competitive against Intel's low-end and mid-range chips. The CPU portion of the company isn't doing well now, as sales fell 2x faster than PC sales as a whole. Margins are also down, meaning that even discounted the chips didn't move, and today's unsold processors are tomorrow's write-down.

Obviously it's not all negative. AMD processors work fine, and there are good deals on certain performance levels.
 
An 8320 at 4.2 GHz keeping up with a 2500K at 4.8?... I'd love to see that.

Well, I said gaming at 1080p with a single GPU.

Test systems each had a 7950 at 1050/1250; one rig had an 8320 @ 4.2 and one had a 2500K @ 4.8. In the Tomb Raider bench the FPS were identical, and we saw almost identical results in Valley as well.
 