AMD FX-8150 Multi-GPU Gameplay Performance Review @ [H]

I agree on all parts. Until there's an apples-to-apples comparison between AMD CFX and SLI, it's kind of hard to say if the problem really is the processor itself. For all we know the SLI drivers are broken as hell for the 990 chipset (and honestly I wouldn't doubt that from Nvidia). All we got was half the cookie; now we need the rest of it. Without it the review doesn't really tell us as customers/readers anything.




Sure, the processor may be cheaper, but the entire platform sure as hell isn't. Yet people seem to love ignoring that fact.

I'm not sure about that. You can get into a GOOD 2500K + mobo + RAM setup for under $400.

That's a pretty damn good mainstream PC....

P.S. When building a PC for a customer, it's really hard not to recommend an Intel-based system.... AMD just dropped the ball on Bulldozer (I was actually waiting for this proc to upgrade to).
 
Easily, but some people love to make those ignorant claims about an Intel platform being so much more expensive. It's nonsense; you can easily build a 2500K system for a similar price to an 8150 system. You do not have to get the really crazy high-end boards unless money is no concern. Now overclock both CPUs and you will need almost a 200-watt-larger PSU for the 8150 system, and you'll deal with all the additional heat and the cost of all that extra power.
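To put a rough number on the power argument (a back-of-envelope sketch: the ~200 W delta comes from the post above, while the daily hours at load and the electricity price are my own assumed figures):

```python
# Back-of-envelope: extra running cost of an overclocked 8150 rig that
# draws ~200 W more under load than a 2500K rig. All inputs are assumptions.
extra_watts = 200        # claimed extra draw under load
hours_per_day = 4        # assumed hours at load per day
price_per_kwh = 0.12     # assumed electricity price in $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year, about ${extra_cost:.2f}/year")
```

Small money per year on its own, but it stacks on top of the bigger PSU and extra cooling the post mentions.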
 

Exactly. I love AMD and Intel both, but come on, it's so cheap to build a 2500K system. I mean, even Microcenter has those deals on the procs for like $170.

I am sorry, but if you're telling people Bulldozer is worth the $65 more.... you really are lying to them.

The 2500K IS the best price/performance CPU money can buy right now, period.
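Claims like that are easy to sanity-check. A sketch with hypothetical street prices and a made-up relative performance score (not measured data), just to show the arithmetic:

```python
# Hypothetical numbers only: price in $, score = assumed relative performance.
chips = {
    "2500K":   {"price": 220, "score": 100},
    "FX-8150": {"price": 245, "score": 85},
}
for name, c in chips.items():
    perf_per_100 = c["score"] / c["price"] * 100
    print(f"{name}: {perf_per_100:.1f} performance per $100")
```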
 
Is the CPU's role more integer- or FPU-related?
Which games use multiple threads... or ???

It's not great, but using older games for this CPU is not going to work well. It's been pretty much established that this is a multitasking/multithreading CPU design with a so-so FPU, so how did you really think the CPU would do on older games?

Not very well; no surprises here. The CPUs should also have been run at their native speeds.
With its supposedly longer pipeline, the BD is clocked higher to make up for part of this, right? Clocking them the same really isn't fair in this case. Test them as they come out of the box for real-world results if you want people to see what they can expect.

And even then, the design is future-minded, aimed at better-optimized software that can make use of its new (not old) design. You might as well have used a 6-core AMD CPU as well, since that is more representative for current and older programs. This has been pretty much the accepted status of the BD CPU. It's way too new of a design to be compared this way. This is how AMD planned its future; they sure didn't design this for past software. Even though it would be nice for it to run older single-threaded software better, it is what it is.

I really think the benchmarks should wait a few months, either for new software (including patches for Win7, or Win8 itself) or for newer software that can make better use of the new CPU's design. It's so different that there have to be programs designed with it in mind to give a more realistic view of its true performance, if that performance is there to be coaxed out of this CPU. It's too soon to be running benchmarks on it, I think.
It may be bad timing by AMD putting it out before Windows 8, or they may want it out so software designers have more of a chance to work with it, so that when Win8 (or a patched Win7) and better software arrive, it will really start to show what it can do.
This is my view on this CPU anyway. AMD must have known how it would perform, has better info on what direction future software is going to ask of its hardware, and designed it to take this into account. I doubt they would be that dumb not to know these things, or its design would be much different.
Also, with their plans to go with APUs, they may have designed it for a future that uses the video part to offset the FPU workloads, so they knew they didn't have to design a better FPU than the one it has. But that is just guessing. It makes a little sense, but only time will tell with this CPU. It's too early to know.
From programs that have been optimized for this CPU, it can perform very well. I will not write this CPU off based on current software.

Sounds like a whole lot of excuses for a bad product. We heard the same rhetoric from the other side when the Pentium 4 came out...
 
Rather eye-opening review! I am going to hate having to ditch my 8120 rig if dual cards are that terrible :mad:. I can't rule out that the problem seen in this review is an 8150/990/SLI-combination problem, a.k.a. driver issues. It almost looks like SLI was not working, or was working partially and sporadically. Since SLI is a new thing on AMD chipsets, I am not sure if the rather poor performance is an Nvidia issue versus an AMD CPU issue. Probably both. It would have been nice to eliminate or minimize possible software issues by testing AMD CFX as well, maybe using a Phenom II CPU; that could show a real problem with the FX if the Phenom II blows it away in SLI/CFX. Great start, but it does lead to more questions that are not answered yet. The FX is not 72% slower than a 2500K... well, I sure hope not :(.

Has HardOCP contacted AMD over this? I really wonder if they have a good reason or response, or will they blow it off, which may mean the FX indeed has some major problems. How do you make a 2-billion-transistor, 8-core chip (over twice the transistor count and double the core count of a 2600K) consume much more power when overclocked and still be 72% slower???
 

Or it really could be that Bulldozer is this bad.... and no drivers are going to fix crappy silicon.
 

Yes, it could mean that, or it could mean otherwise; while the review is good, I don't think it answered that 100% by being limited to Nvidia SLI. Will CFX be much better? I do not know, but I am sure that if HardOCP does a CFX review with similar results, it will make this review look like cotton candy for the kiddies. In other words, there will be no mercy for AMD :D. You don't fool around with hardcore enthusiasts, with OC speed records, fancy marketing, etc., and then deliver a turd in the end.
 

Either way, the 2500K is still a better buy.... It doesn't matter what the Crossfire results are.

It's not like Intel is going to get worse results because you went with an AMD card.

Seriously, grasping at straws.... I want Bulldozer to be badass.... but it's just not. Not even Windows 8 will save it.

Maybe the B3 stepping will.
 

Grasping at what straws? It is what it is. Fortunately or unfortunately, I have one of these; I want the facts, not drama. Premature buying on my part? You bet! While I am having fun with my new rig, that doesn't mean I will sugar-coat it either. One of the reasons (or needs, I should say) is a future dual-card setup and three monitors (probably Eyefinity), but if it is this much of a turd I will have to ditch it, as in sell it, take my loss, and get something better. At this point it is working well. From what I've seen, though, you are right: the 2500K system is the better deal, whereas usually AMD did better in the bucks/performance ratio. Bulldozer is beginning to feel like Duke Nukem Forever: glad it was released, but when you play it, WTF?
 

Exactly. I love AMD so much. When it came to the better buy, I usually steered people toward an AMD chip....

It's just that the 2500K is such a damn cheap and efficient CPU that it's really hard to justify recommending a Bulldozer.

Like I said, let's hope the B3 stepping fixes the issue.
 

The overclocked i7 9x0 that [H] used previously for SLI tests was also bottlenecking the triple-SLI config.

I wouldn't be surprised if Ivy Bridge showed even higher FPS in triple-SLI configs.
 
So why would anyone buy Bulldozer? Anyone?

The only motivation these days is to avoid buying chips from the Evil Empire and thus supporting them.



In every other measure they have won this round, to the point where I don't see AMD ever coming back.

You know it's bad when even I am considering switching to Intel for the next round...

...just can't force myself to do it though...
 
Yeah, as Zarathustra says, non-logical reasoning is the only way you can arrive at the conclusion to buy BD ;)
 

People have short memories and believe wonders will happen... Wait for Win8 in late 2012 or 2013! :p

..
Maybe the B3 stepping will

Man... do yourself a favour and stop engaging in wishful thinking. A new stepping can bring you 1-2% from tweaks, improve power consumption, and yield better. But it won't fix a massive performance deficit.

The Phoronix Linux results aren't bad...

Yes, doing 10 archiving tests and 10 video encoding tests (since that's where BD does best) out of 30 tests is enough to call the CPU decent...

Zarathustra[H];1037976342 said:
The only motivation these days is to avoid buying chips from the Evil Empire and thus supporting them.
..

I find your whole "Evil Empire" crusade a bit tiring. Every day or two I stumble upon a post by you where you engage in long tirades about how Intel is responsible for AMD's failures. I suggest you set aside the AMD Kool-Aid kit (if you haven't had enough from the pre-BD launch days) and accept that reality is different.

Intel tried to curb AMD's rise in 1999-2005. The OEMs knew that and made huge profits by getting discounts from Intel while threatening to use AMD products. Even so, demand drives supply, and OEMs supported AMD, some enthusiastically (Sun). Even Dell jumped in when AMD had the production capacity in place. The market worked: AMD enjoyed significant market-share wins, profitability boomed, and they were sold out. They couldn't sell more because they did not have the production capacity. By the time the production capacity was in place, with a new fab in Dresden and a deal with Chartered, Core 2 arrived and demand for AMD products fell. And from then on, they were on a downward path.

It isn't Intel's fault that AMD wasn't able to capitalize on K8's success and produced a lemon, Barcelona. Nor is it Intel's fault that God knows how many projects were canned until they delivered another turd, BD. It isn't Intel's fault either that they spent on ATI the equivalent of building TWO state-of-the-art fabs, and ended up selling theirs to GF.
With all eyes on Intel since 2006, it didn't make a difference. The last 10 years show that the market works: performance, quality, and execution are rewarded. With the exception of two years, 2004-2005, AMD didn't deliver.

But like you say, it must be Intel's fault that BD is a turd. It's easier to accept that way.
 
Man... do yourself a favour and stop engaging in wishful thinking. A new stepping can bring you 1-2% from tweaks, improve power consumption, and yield better. But it won't fix a massive performance deficit.




Actually, you are wrong. The C0 to D0 stepping for the i7 (socket 1366) was a huge improvement.

Before, you were lucky to get 3.8-4.0GHz on the C0 core; with the D0 core, 4.4-4.8GHz was possible.

Not only that, they also lowered the TDP of the D0 core, and it ran a lot cooler.

A new stepping could easily help Bulldozer out.
 

And how much did Intel gain with D0 in stock frequency? 200MHz? You see, OC ability doesn't translate into equivalent headroom for new bins.

Second, D0 did nothing for IPC. Everybody is hoping BD will somehow magically get an IPC increase with a new stepping.

Third, BD gaining 200-400MHz is also dependent on the process. From the looks of it, BD has all it needs to ramp up in frequency: a longer pipeline, relaxed instruction timings, increased cache latencies. Unlike Nehalem, which wasn't designed for large frequency increases, BD has all the knobs in place. What it lacks is a worthy 32nm process.

I'd say BD will gain in frequency as the 32nm process improves, and that isn't something which will be solved by new steppings.
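The stepping argument comes down to single-thread performance being roughly IPC times frequency. A sketch with assumed numbers (the ~50% Sandy Bridge IPC lead here is an illustration I made up, not a measurement) shows why a few hundred MHz from a new bin can't close a large IPC gap:

```python
# Rough model: single-thread perf ~ IPC * clock. All numbers are assumptions.
def perf(ipc, ghz):
    return ipc * ghz

fx_ipc, fx_ghz = 1.0, 3.6    # FX-8150 as the baseline
sb_ipc, sb_ghz = 1.5, 3.3    # assumed Sandy Bridge relative IPC and clock

gap_before = perf(sb_ipc, sb_ghz) / perf(fx_ipc, fx_ghz)
gap_after = perf(sb_ipc, sb_ghz) / perf(fx_ipc, fx_ghz + 0.4)  # +400 MHz bin
print(f"gap before: {gap_before:.2f}x, after a +400 MHz stepping: {gap_after:.2f}x")
```

The gap shrinks a little, but the deficit stays dominated by the IPC term, which steppings historically don't touch.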
 

The new stepping could also fix the module issue, which is the real issue.

They could also have an engineer hand-draw the CPU layout instead of having a machine do it. That was one of the issues with BD that a former AMD engineer pointed out.

BD is just plain bad. It is possible a new stepping could fix issues, but increase IPC? Not sure. If they fixed a few things here and there, it's possible to get 5-10% (I'm talking out of my butt).
 
I find a bit tiring your whole "Evil Empire" crusade. Every one or two days I stumble upon a post by you where you engage in long tirades about how Intel is responsible for AMD's failures.

AMD are responsible for their own failures, but having a competitor 10 times their size that doesn't play by the rules sure doesn't help.

Intel didn't agree to pay $1B in out-of-court settlements to AMD for no reason. Unfortunately this is about a tenth of what analysts feel an actual court settlement would have yielded, and that - in its turn - may have been low compared to the actual losses they sustained from these practices.

You - however - miss my point.

I am not trying to find a scapegoat for AMD's failures. I am not an AMD fanboy. I really don't care if I have an AMD CPU in my system.

I am an "anyone but Intel" zealot. I'd go with Cyrix, or someone else, if they released a chip that was competitive.

I used to just buy whatever gave me the best performance for the amount of money I had to spend. Sometimes that was Intel, sometimes it was AMD. As I read about Intel's shady business practices over time, it angered me more and more, to the point where I would feel dirty if I bought anything from them.

I am a fierce defender of competitive markets, and as such I find it disgraceful that the 800lb gorilla in the microchip market has such a tainted record in this regard.

I am also a strong supporter of the concept that no matter what you buy or when you buy it, you are never just a customer. Every dollar you give to a company - any company - is a vote of support for them and what they do.

I also don't shop from BP. I don't do any business with Bank of America, including avoiding their ATMs. I haven't bought anything from Apple since I found out about the sweatshop-like conditions the people who manufacture their stuff endure. I refuse to buy GM products as I have a lot of contempt for that organization and how they've treated their subsidiaries in the past. I also refused to buy anything from Nike back when they had a lot of sweatshop issues.

Every time you open your wallet, it's not just about you. It's incredibly naive and selfish to think that way. Every time you open your wallet, think about the impact of what you are doing on other people and the world as a whole. Then buy the ethically best product.

This isn't always an easy choice. Often you need a certain product, and none of the suppliers of it are of particularly high moral standing. Then you have to pick the lesser evil, and hope that if enough people do, it sends a message.

Remember: it's not all about you. Sure, I could go out and get an Intel CPU that would outperform what I currently have for the same (or in some cases less) money, and have lower power usage to boot. Doing so, however, would entail betraying everything I believe in by supporting a bad company.

I hold myself to a higher standard than that. And I hold everyone in the world to the same standards I hold myself.
 
Zarathustra[H];1037976342 said:
The only motivation these days is to avoid buying chips from the Evil Empire and thus supporting them.


I used to think the same way, but when you consider the crap AMD has been feeding us in terms of "performance increases" relative to the old Phenom IIs and the older Core i3/i5/i7s, you have to question whether there's a good guy in this fight at all. Their marketing really set this up for a massive fail and swindled those of us who bought into their performance estimates. They delayed the processor for years and it still feels like an unfinished and unpolished product.

They've had their bad products before, too, but never did they trail so far behind, delay a product so long, and frankly flat-out lie to consumers quite like they did with Bulldozer. This chip has no place in any segment except maybe a Linux workstation crunching integers. It fits nowhere else. ~10 years ago AMD was about making chips the smart way with fewer resources and getting the best bang for your buck. I think those days are long gone, and we'll all suffer for it.

Piledriver promises a 10-15% performance increase, which means they may have a shot at catching the Phenom II in IPC soon. Unfortunately, considering how small that estimate is (really, I think we'd have to see at least a 30-40% increase in performance to make them competitive), we can conclude that they'll just increase the clock speed once the 32nm process at GloFo matures. This, of course, doesn't bode well for power consumption figures, or for the piss-poor scaling we've seen when these chips are overclocked, because I don't think they'll be addressing the architectural and design flaws.

In short, yeah, I agree with you. If AMD doesn't pull an AMD64 out of its pocket within the next couple of months and admit they clearly made a catastrophic error here, I don't think we'll see AMD last very long in the desktop space. Judging by their rhetoric, though, my guess is that they'd rather abandon the desktop altogether than admit they really fucked it up with the doodoodozer. They've got a lead on Intel in the graphics and APU space (at least on the graphical performance side; the CPU side of that remains poor), and they've made a good amount of money on Llano and Brazos. I'd love to power my rig with an AMD Fusion chip and a graphics card to Crossfire it with, but the tech, or the software, just isn't there yet.

some links
http://www.xbitlabs.com/news/cpu/di..._Market_Shifts_to_Hybrid_Microprocessors.html
http://www.xbitlabs.com/news/other/...ces_Set_to_Unveil_New_Strategy_Next_Week.html
 
Zarathustra[H];1037976579 said:
...snip....

Well, "naive" wouldn't be the proper word, but it is the closest I can think of. In a perfect world, what you say would work. In this real world, however, anyone who operates a business knows that competition can be hard sometimes.

I don't hold Intel's practices as significantly damaging. Competing too aggressively isn't as harmful in my book as bribing politicians and the state, or forming cartels to keep prices high and supply limited.

After all, with their rebate scheme, they lowered CPU prices significantly. AMD couldn't prove the consumer was harmed through pricing; the best they could do was harm by lack of choice. But with the lack of choice comes the issue of OEMs. Did the OEMs care about what they were selling? Not really. HP couldn't care if it sold chips with dung inside as long as it made them a profit.

Going back to the moral issue of supporting illegal practices: pretty soon you'd have to surround yourself in a crystal ball. You don't get an iPhone for $200 if the workers producing it are paid $30k a year. We like the benefits globalization has brought us. Is it moral, is it OK? Remains to be seen. Even in the sweatshops, you earn something and can eat and buy clothes. The Western working class did not fare much better 100 years ago; the situation was similar. Could it be a needed step in the evolution?
 
The new stepping could also fix the module issue, which is the real issue.

?!?! That's set in stone. That will be fixed with a new uarch, 4-5 years from now. Until then, CMT is here to stay.
They could also have an engineer hand-draw the CPU layout instead of having a machine do it. That was one of the issues with BD that a former AMD engineer pointed out.

That's out of the question on any sensible time horizon. And it's not just about hand optimization; you need experienced people for that. Machines probably do it better than rookies would. And AMD has had an exodus of experienced engineers over the past 5 years.
BD is just plain bad. It is possible a new stepping could fix issues, but increase IPC? Not sure. If they fixed a few things here and there, it's possible to get 5-10% (I'm talking out of my butt).

The new iteration, Piledriver, will bring 10%:
-3-5% uarch-wise
-5% frequency

That's what AMD has in their slides.
 
Going back to the moral issue of supporting illegal practices: pretty soon you'd have to surround yourself in a crystal ball. You don't get an iPhone for $200 if the workers producing it are paid $30k a year. We like the benefits globalization has brought us. Is it moral, is it OK? Remains to be seen. Even in the sweatshops, you earn something and can eat and buy clothes. The Western working class did not fare much better 100 years ago; the situation was similar. Could it be a needed step in the evolution?

Y'know, that's exactly what Marx said.

But yes. I, for one, don't mind paying a higher price if it means I'm better off in the long run. I buy clothes and sneakers made here. I buy from local mom-and-pop hardware stores rather than Home Depot. I buy food that's grown locally (and I live in NYC, so I'm probably eating sewer meat). I understand your argument, and in a time when the country isn't doing well economically, you generally pick what's cheapest, regardless of where it came from or who made it. But you have to consider: maybe... just maybe, that's why the country's not doing so well?

China loves it. And by the way, they're still technically run by socialists. It's quite amazing to think that a country run by a bunch of socialists will be, economically, the strongest nation in the world. Talk about back-asswards.

EDIT: Inb4 hipster. I don't live in brooklyn and don't wear flannel. Also, i fucking hate scarves.
 
That would have been wishful thinking at best. We've seen nothing to indicate Bulldozer would be especially good at anything compared to Sandy Bridge, Sandy Bridge-E or Ivy Bridge. Well perhaps encoding, but that's about it. It may be compelling for owners of AM3 / AM3+ motherboards since that's the only upgrade path they have after Phenom II. I'm sure those stuck in that upgrade path (without spending the cash on an Intel chipset based board) wish it was better, but wishing for a thing does not make it so.

I know it's wishful thinking :p but as of right now that's the only upgrade path I have. Oh well, I've got my fingers crossed...

I can only hope that it's bad BIOS coding for the board; from what I was reading on the Asus ROG forums, the CHF5 has been getting CPU microcode updates.
 
Zarathustra[H];1037976579 said:
AMD are responsible for their own failures, but having a competitor 10 times their size that doesn't play by the rules sure doesn't help.


Yeah...right. Blindly and continuously rewarding a company for making a worse product is not being a "fierce defender of competitive markets"; it's the opposite (and it's just plain fanboyism).

"buy the ethically best product"...you mean rewarding a company who has a good record of spreading FUD & lying to customer about their CPU product before launch ?

Every time you reward a company for a worse product, it's not just about you or the engineers...it's about throwing away money (a much larger percentage of it) on the lackluster CEO and the people at the top.

If you hold "higher standard" for Intel, you should also hold "higher standard" for AMD too (...oh it's inconvenient that way, since you only recall the bad bad thing that Intel did in the past)
 
Because I don't use multi-GPU and I never will, this has little effect on my gaming.

That's correct, if you're playing only the newest DX11 games that virtually guarantee, at settings anywhere near highest, that your GPU will be running at 100%, 100% of the time. Run Far Cry 2's benchmark at the lowest resolution and settings, with v-sync on, and monitor CPU and GPU usage. You likely won't maintain 60 fps without dropping, even though your GPU will be using 20% or less of its capacity (assuming you're running at least a 560 Ti). This demonstrates why IPC is absolutely essential for gaming (unless you're intentionally 100% [including minimums] GPU-limited). Hell, toss in Doom 3 and see if you can maintain 60 without dropping (my X4 810 [a 2009 processor, I believe, that scored over 3000 in PassMark] couldn't do it...and that's when I said "enough is enough").

Additionally, there are the concerns of your chip running much hotter and drawing much more power to produce the same framerate...so one has to ask: why? "The future?" I might be missing something here, but why would someone spend $60 more for far less current performance and the (presumably) false promise of eventually overtaking the 2500K (by the time that would supposedly happen [thanks only to software advances], Ivy Bridge will have been released)?

Also, if you're entirely GPU-limited (as "unwittingly" implied by your comment), or weren't planning on upgrading your GPU (to boost framerates at given settings)...why did you upgrade to this chip? :)
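Not exactly forum-standard, but the CPU-vs-GPU bottleneck argument above can be sketched in a few lines. This is a toy model with made-up frame times, not measured data:

```python
# Toy model: frame rate is capped by whichever per-frame stage is slower.
# The millisecond figures below are illustrative, not benchmarks.
def effective_fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the CPU and GPU each need the given
    milliseconds per frame and the slower one sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound (max settings): a faster CPU buys you nothing.
print(effective_fps(cpu_ms=8.0, gpu_ms=20.0))   # 50.0

# CPU-bound (low res/settings, like the Far Cry 2 test): only
# higher IPC or clocks -- i.e., a lower cpu_ms -- raises the fps.
print(effective_fps(cpu_ms=22.0, gpu_ms=5.0))   # ~45.5
```

This is why lowering resolution and settings is the standard way to expose a CPU bottleneck: it shrinks gpu_ms until cpu_ms is the limiting term.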
 
Well, on this topic of big screens for the CPU wars: any chance someone can confirm this with AMD's new driver release?

"AMD Eyefinity enhancements

Enables support for Eyefinity 5x1 display (portrait and landscape) configurations
Maximum supported resolution has been increased to 16000 x 16000 pixels on the AMD Radeon HD 6000 Series. (limited to DX11 applications only)
Bezel compensation is now possible when using sets of displays that have mismatched pixel densities."

Since you've got beefy 2- and 3-way setups working, how does 16000x16000 look?
Post a pic to show whether this type of thing is real or not.

See if you can even get a single frame with AA and such on at the 16k resolution, lol.
 
Well, on this topic of big screens for the CPU wars: any chance someone can confirm this with AMD's new driver release?

"AMD Eyefinity enhancements

Enables support for Eyefinity 5x1 display (portrait and landscape) configurations
Maximum supported resolution has been increased to 16000 x 16000 pixels on the AMD Radeon HD 6000 Series. (limited to DX11 applications only)
Bezel compensation is now possible when using sets of displays that have mismatched pixel densities."

Since you've got beefy 2- and 3-way setups working, how does 16000x16000 look?
Post a pic to show whether this type of thing is real or not.

See if you can even get a single frame with AA and such on at the 16k resolution, lol.

Heh, that has to be some kind of typo. I'm guessing one frame every four to five seconds (triple-SLI'd 580s, Battlefield 3, highest settings). Is that playable? :) Wtf...
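For a sense of scale, some back-of-the-envelope arithmetic on what a 16000 x 16000 surface actually means (assuming a plain 32-bit color buffer; a real renderer adds depth, AA, and intermediate render targets on top of this):

```python
# How big is 16000 x 16000, really?
width = height = 16_000
pixels = width * height
print(pixels)                      # 256000000 -- 256 megapixels

# Compared to a single 1920x1080 display:
print(pixels / (1920 * 1080))      # ~123x the pixels of one 1080p screen

# A single 32-bit (4 bytes/pixel) color buffer at that size:
bytes_needed = pixels * 4
print(bytes_needed / 2**20)        # ~977 MiB for ONE buffer -- roughly
                                   # an entire 2011-era card's VRAM,
                                   # before depth, AA, or other targets
```

So even if the driver accepts the mode, filling ~123 screens' worth of pixels per frame makes single-digit framerates about the best one could hope for.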
 
Well, on this topic of big screens for the CPU wars: any chance someone can confirm this with AMD's new driver release?

"AMD Eyefinity enhancements

Enables support for Eyefinity 5x1 display (portrait and landscape) configurations
Maximum supported resolution has been increased to 16000 x 16000 pixels on the AMD Radeon HD 6000 Series. (limited to DX11 applications only)
Bezel compensation is now possible when using sets of displays that have mismatched pixel densities."

Since you've got beefy 2- and 3-way setups working, how does 16000x16000 look?
Post a pic to show whether this type of thing is real or not.

See if you can even get a single frame with AA and such on at the 16k resolution, lol.

I assumed the 16000 x 16000 was for some sort of multi-monitor workstation, but it's limited to DX11 applications only, so dunno.
 
I have an i7-920 and an AMD 1100T, and in daily use the 920 just seems a lot more responsive and spunkier on the desktop and in games, with less delay in most actions, both using the same video card setup.

An earlier article here at [H] even expressed a similar feeling between using the two different systems, and I have to concur that such is the case; I see similar results, and the same effect with older CPUs too.

AMD is for basic desktop use, where performance is not a mandatory issue over cost, as their chips are a lot cheaper.
 
I assumed the 16000 x 16000 was for some sort of multi-monitor workstation, but it's limited to DX11 applications only, so dunno.

Well, exactly. They are running multi-monitor / multi-GPU here, and some of the games here do DX11, so let's see the goodness of 16k x 16k!!!
 
Zarathustra[H];1037976579 said:
Sure, I could go out and get an Intel CPU that would outperform what I currently have for the same (or in some cases less) money, and have lower power usage to boot. Doing so, however, would entail betraying everything I believe in

I must say I strongly agree with this. I'm willing to hold out for a little while before upgrading my CPU to see what AMD can do.

Frankly, I think AMD and nVidia should stop fighting each other and merge. Lord knows that as a combined company, leveraging their synergies, they'd have a MUCH better chance against their true mutual enemy - Intel.
 
AMD is for basic desktop use, where performance is not a mandatory issue over cost, as their chips are a lot cheaper.

This isn't true anymore...a G620 ($70 or less), and debatably a G530 ($50 or less), will defeat far more expensive AMD Phenom II (and therefore Bulldozer) quads (running much higher clocks) in 99% of "daily" PC tasks (because those tasks use two or fewer cores). IPC is most important, by far, to nearly all PC users, *especially* those who don't need "performance" machines for the sake of video editing and encoding. There is currently no PC segment that AMD is leading (at least in terms of value, in my impression)...and that's a big problem for the industry.
 