What's the problem with AMD?

I was considering moving from my X58 platform (which had an un-overclockable 920 C0) to an FX-8350 and overclocking the shit out of it. But then I found a Xeon X5650 for $65: six cores, now overclocked to 4.18GHz (need to update my sig). It absolutely stomps the AMD offerings.

That said, I think AMD brings good value to the table for budget and mid-tier systems. The APU setup is a good alternative for those with modest gaming needs. However, with Broadwell releasing in June and incorporating Iris graphics, the APU setup may become rather irrelevant except for those on a tight budget.

Actually, Intel iGPUs are incredibly bad. And don't post crap average-fps numbers. Look at any frame-time graph and you quickly see why they are terrible. AMD is light years ahead in iGPUs, so no, they will never be irrelevant in that regard.
 
Which is why most fair-minded people aren't arguing that AMD is irrelevant or that all of their offerings are bad. But if you want a high-end gaming system, there is absolutely no reason whatsoever to use an AMD CPU as your cornerstone. None.

APUs have their use case, but they are not (nor are they intended to be) a high-end option. AMD FX chips are great CPUs if you want throughput on a budget. I have often fantasized about putting together a machine based on an FX-8370e, running Linux on it, and hosting a Java application server and a database server on it. Intel simply doesn't have an option for that at AMD's price point.

But in general Intel simply has higher IPC and the better CPUs.
 
You are wrong, there was a compelling reason. Just not a compelling reason for you. Speak for yourself and stop stating your opinions as fact. An Athlon II X4 was half the price of a Core 2 Quad and matched its performance.

Lol, get your facts in order please....
 
It seems it's always important to some Intel owners to point out to AMD owners that their purchase or machine is somehow inadequate. I've been using an 8350 for quite some time at 4.7GHz; it does the job, and the 290X keeps my games running smooth.

My AMD A8-5600K seems to be doing just fine for a $120, two-year-old CPU... My GPU is about 5 years old so I'm looking into a new one, but I have never had an issue with my PC since installing the AMD chip...
 
Which is why most fair-minded people aren't arguing that AMD is irrelevant or that all of their offerings are bad. But if you want a high-end gaming system, there is absolutely no reason whatsoever to use an AMD CPU as your cornerstone. None.

So did you read the post I responded to? Because your point is not even in response to either. The debate was Intel's Iris Pro, which, despite being costly, is still terribad at gaming at any level. AMD's iGPUs perform quite well; granted, not at top-tier GPU level, but for small form factors, admirably so.
 
Per Notebookcheck, the Iris Pro 6200 beats out the top-ranked AMD APU based on their hierarchy charts, and falls in line with the R7 250 dedicated GPU:

http://www.notebookcheck.net/Intel-Iris-Pro-Graphics-6200.125593.0.html

According to wccftech, the Iris Pro 5200 (which has 8 fewer execution units than the 6200) is about even on the benchmarks, but loses to AMD in terms of performance per dollar (which is likely the case with the 6200 as well).

http://wccftech.com/amd-kaveri-apu-a10-7850k-intel-i5-4570r-iris-pro-graphics-showdown/

In pure, raw processing numbers, the Intel option is quite impressive if you don't factor in cost. Folks with deeper pockets who don't mind spending the extra coin can't go wrong with the Intel setup if they are looking for an integrated solution. For those folks, AMD is irrelevant, except for the fact that AMD has pushed Intel to deliver a high-quality product.
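
To make the performance-per-dollar point concrete (the fps and price figures below are placeholders for illustration, not the numbers from those reviews), the math looks something like this:

Code:
# Toy performance-per-dollar comparison. The fps and price figures are
# placeholders for illustration only, not results from the linked reviews.
parts = {
    "AMD A10-7850K (iGPU)": {"avg_fps": 30.0, "price_usd": 170.0},
    "Intel i5-4570R (Iris Pro 5200)": {"avg_fps": 31.0, "price_usd": 290.0},
}

for name, p in parts.items():
    print(f"{name}: {p['avg_fps'] / p['price_usd']:.3f} fps per dollar")

Roughly even raw numbers, but the cheaper part wins comfortably once you divide by price.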
 
So I am going to go with a hunch and say you didn't read my previous post. Reviews tend to stick to fps and not show frame-time graphs. All Intel iGPUs are terrible and hardly even decent compared to AMD or NVIDIA. And drivers? Even worse. Funny that with CrossFire the naysayers wanted to talk frame times because they were losing on fps, but when frame time doesn't garner them a win they want to talk fps.
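
Quick illustration of why average fps hides this (the frame times below are completely made up, not from any review; this is just the math):

Code:
# Made-up frame-time trace in milliseconds: mostly smooth ~16 ms frames
# with a handful of 80 ms hitches thrown in for illustration.
frame_times_ms = [16] * 95 + [80] * 5

# Average fps over the whole run.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Simple nearest-rank 99th-percentile frame time: the threshold that
# 99% of frames finish under.
sorted_times = sorted(frame_times_ms)
p99_ms = sorted_times[int(0.99 * len(sorted_times)) - 1]

print(f"average fps: {avg_fps:.1f}")               # ~52 fps, looks fine
print(f"99th percentile frame time: {p99_ms} ms")  # 80 ms spikes = visible stutter

The average looks playable, but the 99th-percentile frame time shows the stutter that the fps number hides.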
 
Feel free to send your links to your "frame-time" charts and info then. It's funny to me how defensive you are about all this. I have concurred with others that AMD brings good value, but when it comes to enthusiasts, they simply are not there.

Hopefully AMD will get their heads out of their asses and look at releasing something worth talking about. Otherwise they will eventually go the way of the dodo, which is to say extinct, unless they just want to release stuff for the budget-minded and hope their dedicated graphics cards make up for the mounting losses from here on out.
 
If Samsung bought them and brought the smaller die sizes, then they might survive.
Otherwise I see them slipping fast.
 
All Intel iGPUs are terrible and hardly even decent compared to AMD or NVIDIA. And drivers? Even worse.
http://techreport.com/review/26166/gigabyte-brix-pro-reviewed/3

Iris Pro 5200 D3D frame times are better than Kaveri's GPU in both D3D and Mantle. No doubt the TBR method Intel uses will be more prone to glitches than Nvidia or AMD GPUs, but that's not "even worse" than the frame times, since frame times aren't an issue here. The performance of eDRAM-equipped Intel CPUs shouldn't be so easily dismissed. It's a niche product, priced relatively high, which makes it better suited to compact all-in-one systems than as a volume product.
 
Comparing Iris Pro w/ any regular iGPU is ridiculous. They are completely different products for completely different uses, and they are priced nowhere near each other for a reason. Any GPU that's bandwidth-starved (and just about any iGPU is) will show dramatic results when you increase bandwidth by including on-die memory. It's not anything new either. AMD and Nvidia have done similar products.

It would be like comparing an i7 w/ an i3 without acknowledging that they are not competing against each other and are in price brackets that are hundreds of dollars apart.
 
Feel free to send your links to your "frame-time" charts and info then. It's funny to me how defensive you are about all this. I have concurred with others that AMD brings good value, but when it comes to enthusiasts, they simply are not there.

Not defensive, just pointing out that you are talking about something completely different from the original argument. If you wish to quote someone for debate, then at least have the sense to debate the topic you quoted.
 
http://techreport.com/review/26166/gigabyte-brix-pro-reviewed/3

Not too hard, and I am tired, so I didn't spend a lot of time looking.

http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/12

Look at each chart: as the resolution rises, the fps fall sharply on the Iris Pro. But what you really want to look at is the minimums; just click the Min button below each chart. The Iris Pro, just like any Intel iGPU, just plain sucks: it's trash, a waste of money and die space...

If I feel like it later I will try to find the frame times I was referring to. Your link seems very fishy, considering no other frame-time results on AMD iGPUs to date have ever looked like that.
 
Intel iGPUs have a place: workstations where graphics performance is not a real requirement.
 
Intel fanboys should be the ones hoping that AMD pulls out of its funk... competition keeps prices low and drives innovation. For a fanboy of either side, domination by "their team" would be a Pyrrhic victory (they "win" but end up paying higher prices for increasingly fewer features).
 
http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/12

Look at each chart: as the resolution rises, the fps fall sharply on the Iris Pro. But what you really want to look at is the minimums; just click the Min button below each chart.
The performance characteristics are IMHO quite OK. Only BioShock shows an alarming drop in minimum fps; the other games do not.

Intel drivers might suck in general, but they have pretty good silicon. AMD should really get to work because they are starting to lose. Intel had the first APU (Clarkdale), and now they seem to have better performance (especially with the upcoming Broadwell).

The fact is that with their current technology Intel could easily make an i7 with an iGPU that has PS4-level performance. I actually wonder why they are not doing just that...
 
But then I found a Xeon X5650 for $65: six cores, now overclocked to 4.18GHz. It absolutely stomps the AMD offerings.
Even a Nehalem/Lynnfield i7 @ 4GHz is better for games and most programs than a 5GHz FX.

That dirt-cheap 32nm 6c/12t Xeon was a good choice :)
 
I think AMD is a perfectly viable option instead of Intel these days; there can be many reasons to choose AMD. Here are a few -

4. You don't mind an extra $150/month power bill

At $0.13/kWh for me, that's about 1,154 kWh in a month. Really inefficient space heater, though. Barely heats my room in the winter.
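
Quick back-of-the-envelope check on that claim, assuming my $0.13/kWh rate and a 30-day month (numbers are rough):

Code:
# Rough sanity check on the "$150/month power bill" claim. Assumes a
# $0.13/kWh rate and a 30-day month; purely illustrative.
monthly_bill_usd = 150.0
rate_usd_per_kwh = 0.13
hours_per_month = 30 * 24

energy_kwh = monthly_bill_usd / rate_usd_per_kwh      # ~1154 kWh
average_draw_w = energy_kwh * 1000 / hours_per_month  # ~1600 W, around the clock

print(f"{energy_kwh:.0f} kWh per month, about {average_draw_w:.0f} W continuous")

That works out to roughly 1.6 kW of continuous draw, 24/7, which no desktop CPU comes anywhere near.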
 
Pretty sure the i5 has better kWh than the FX, don't think kWh matters in gaming though, whatever it is
 
Look at each chart: as the resolution rises, the fps fall sharply on the Iris Pro.
All those GPUs are unplayable at higher settings and resolutions, and it's more of a special-olympics competition at that point. Your criticisms don't really relate to the segment, and goalpost moving is so last year. Scroll back up if you forgot the incorrect claims of yours that I responded to.
 
Pretty sure the i5 has better kWh than the FX, don't think kWh matters in gaming though, whatever it is

So, you've found a way to use a computer without it consuming electricity? You'll become the next trillionaire.
 
I used AMD way back in the day: the K6-2, K6-III, K6+. I've had an FX chip. My buddy in 6th grade swore they were the shit back then, so I stuck with it. Then I noticed people with Celerons at half the clock speed just utterly demolishing my system in every way possible. Then I got a CHEAP Celeron system in '05. What a fucking difference. I didn't have to worry the first time I'd run a game. If you don't know what you're missing then I guess ignorance is bliss. I thought being the underdog was cool. But when you are getting frame-rate stomped by processors that are under $100, I wouldn't really call that being the underdog. The underdog surprises people by winning when they thought you would lose. When you think you are winning but are actually losing and refuse to accept the fact, I'd call that being delusional and just straight up lying to yourself. AMD is a piece of shit for nearly everything but the lightest computing. You have double the cores and half the performance at the same price point, why the hell would I want that? I'm an Intel convert for life.
 
Look at more reviews and you will better understand why people have the opinions they do:

[CPU benchmark charts from gamegpu.ru: Thief, Dying Light, Grand Theft Auto V]

Late to the party, but I know for a fact those numbers are not right.
 
Most of the people who spent over $500-700 USD on their CPU will say that AMD and price/performance don't matter, just to justify their purchase. :)
 
Pretty ignorant comment. A $250-330 stock-clocked Intel quad can compete with a moderately overclocked FX-8/9.

Anyone who knew what they were buying with an Intel-E platform probably never considered an AMD solution in the first place.
 
Hmm, I left this one several pages ago and it looks like it's gone round the block three times now.

Walk away, folks, unless you have nothing better to do.
 
Then I noticed people with Celerons at half the clock speed just utterly demolishing my system in every way possible. Then I got a CHEAP Celeron system in '05. What a fucking difference.

A Celeron from 2005? A Celeron kicking butt in 2005? LOL! OK, if you say so. :D
 
It is sad that an 8-core 5GHz chip ends up having to compete with a fucking i3, and even then it still trades losses and wins here and there.

I don't know if these chips performed near the i5s back in the Sandy Bridge era? Did they?
 
FX 8320 and FX 8350 going fast and strong here. Games play well and the computers are stable and fast. Oh well, a lot of us folks do more than one thing at a time with our computers.
 
Then I noticed people with Celerons at half the clock speed just utterly demolishing my system in every way possible. Then I got a CHEAP Celeron system in '05. What a fucking difference.
In 2005 there were only Prescott-based Celeron Ds. They were not that bad compared to older Socket A Athlons/Durons (test), but certainly not at half the clock speed. Celerons needed 1GHz more to keep up. Power consumption of those Celerons sucked big time.

But we are talking 2005 here, and in 2005 you could buy a Sempron 2600+ which, after overclocking to 2.5GHz, obliterated any Celeron D, especially in games. Those Semprons cost less than you could get for selling a used Athlon XP or Celeron, so an upgrade from e.g. an Athlon XP 2500+ to that Sempron was often not only free, there was some money left over to buy e.g. some RAM. Celeron Ds were considerably more expensive, with generally pricier motherboards, especially the LGA775 ones: really bad performance/price and performance/power.

From the whole Netburst family, the only good performance/price CPU was the Pentium D 805, which came in 2006, but it was not that good in games. I bought one in 2006 and OC'd it to 3.6GHz. It was 2.5 times more expensive than a friend's Sempron [email protected] and performed visibly worse in games. The Windows performance, and not actually needing to shut down any programs while playing (because all games ran on a single core and all the other programs together used the second core; total CPU utilization was still far from 100%), was well worth it. Athlon X2s were much pricier then. Celeron Ds at that time were, from what I remember, slightly more expensive than Semprons.

AMD was much better on a budget from the first Athlons/Durons up to the single-core Athlon 64s, and then had a good thing going with Phenom II and its derivatives. The whole issue is that Bulldozer just sucks balls and can hardly be recommended in any scenario :eek:

BTW, the AMD K6s were rather poor processors due to a very weak FPU. Kinda the same situation as we have now with Bulldozer, actually :D
 
Hardly. :) The AMD FX-8350 and 8320 are fantastic chips that work great. Quad cores are not going to be very effective for my daily use case. Also, if I were to build a gaming-only PC, I would just go ahead and build an 860K with an mITX board and an R9 280. There would be no reason for me to build a super-high-end gaming-only PC since I do not game all that much anymore.

I do enjoy gaming, since I have an R9 290, an original Xbox, an Xbox One, and a 360, but I prefer competitive running now. I see no real ROI in going from what I have to anything Intel offers at this time. If I were to build from scratch and go with Intel, the 5820K would be the bare minimum I would go with. ;)
 
Pretty ignorant comment. A $250-330 stock-clocked Intel quad can compete with a moderately overclocked FX-8/9.

I don't understand that argument, and it seems to be a pretty popular topic. I think it's ridiculous to assume Intel users spend $500-$700, if not more, on just the CPU. My last two Intel purchases (including the 4770K I'm currently using) haven't topped $300, and I consider both chips to have been near the top of the line. This whole exaggeration of what Intel users spend on their CPU alone is beside the point.

I have two machines running, one of each brand. Yes, I'm of the mindset that my AMD, even the older 1055T, would easily be enough to get me by. It's a good solid machine and I'm very happy with it. But I certainly won't discount just how much better the 4770K is, or even the older i7 930 I had before. They both run circles around it.

I got my 4770K for $250 and the motherboard for $150; I'm thinking an AMD counterpart like the 8320E would have been an entire $150 less or so. Considering just what my build consists of... $150 wouldn't have made much difference to my budget. And if CPUBoss is anywhere near accurate, it looks as though the 4770K is quite a bit better, which to me is just future-proofing.

I like AMD, and I like making budget builds with them. And I would love for them to release something competing with Intel all around, including TDP. Because I would seriously consider going AMD as my primary rig again. But I swear I see more AMD users defending spending less than Intel users defending spending more... *shrugs*
 
Must be nice having a Microcenter around where you live. :D I bet you have two of them close by. I would probably spend more than I do if I had one, but it is probably best that I do not. (Although I do want one.)
 
The only good things about a Microcenter are their CPU deals and sometimes their motherboard deals.
 
I don't understand that argument, and it seems to be a pretty popular topic. I think it's ridiculous to assume Intel users spend $500-$700, if not more, on just the CPU.
Generally the point of those arguments is to deride the tiny fraction of Intel purchasers who opt for the Extreme Editions. AMD fans like to declare themselves superior consumers since their three-year-old bargain chips give better performance "for the price," and they can't seem to comprehend that there are those out there who just want the best performance without qualifiers. Or want lots of PCIe 3.0 lanes. Or just want PCIe 3.0 in general. Or want higher RAM capacities. Or want 10 SATA 3.0 connectors attached to the chipset. OK, I may be getting silly with that last one, but you get the point. To this tiny subsegment of the Intel owner base, AMD was never even an option.

Honestly, I'm of the opinion that AMD's biggest problem is that whole phrase "for the price".
 
Must be nice having a Microcenter around where you live. :D I bet you have two of them close by.

Hah, no I don't. I know the point is new vs. new, but I bought my 4770K from a forum user. I was going to go with a 4670K but couldn't pass up that deal. (I know that doesn't matter.)

I wish we had one around where I'm at. But I'm with you - I spend enough on PC purchases online that a store would only make it worse.
 
Still don't understand the debate here -
People are claiming their 2.0L Civics run just as well as the next guy's V8 in the city and for daily tasks.

Technically that's correct, since you have a speed limit in the city, but saying you "saved" money by going with the Civic is hilarious, just as saying you saved money by going for an 8320 instead of a 4790K.

Because they are not the same products, they are not intended for the same audience, and they do not even perform in the same league. While gaming, both will probably give you identical performance at 60fps 1080p, but so will a Pentium in a lot of games.
 
Honestly, I'm of the opinion that AMD's biggest problem is that whole phrase "for the price".

If that's the world AMD and AMD users want to live in, I have no problem with that. But I don't think Intel is that far behind on budget builds either. And that's the problem I see.

But, you know... if these typical Intel chips that I see a lot of people using cost $500+ (like the 4770K or a 3770K, etc.), you're damn right I would have gone with AMD.
 