FX 8350 best gaming chip thread

Coming from an X2 6400+, I was champing at the bit for the day BD was released and the official [H] review went up. I was sorely disappointed by the single-threaded performance results and by the sheer amount of power it drew and heat it put off, especially while OC'd. I put off my upgrade for IB, since it was 77W vs. 95W for SB. Had the performance and every other aspect of the 8350 been what was delivered with the original 8150, I'd be continuing my trend of AMD system lineage. The 8350 is one hell of a nice chip for the money if the higher power draw and heat aren't a concern. 8150 -> 8350 brings some decent improvements, but the entire new FX lineup is just not what I grew to love about and expect from an AMD CPU product. I'm anxiously awaiting Steamroller results when it's released.

Oh, and kudos to AMD for scoring the deals for the next-gen consoles. Keep that R&D going [H]ard & [H]eavy!
 
This is a forum, which is for discussion :) Would you prefer no one posts anything unless they need something?

Well, to an extent, yes. Sure, I'll post a hey-what-do-you-recommend for this or that, but when it comes to a chip? All else equal (same GPU, same RAM, SSD, etc.), pick whichever chip has the best performance for the budget you have. From what I can tell, the i5/i7-based Intel chips are just quicker for games and for anything needing single-threaded performance.

The idea that AMD chips will look better as threaded performance becomes more important might have worked a few years ago, but Intel keeps making strides in threaded performance, taking what works in single-threaded, IPC-sensitive applications and porting it over to multi-core setups (ring bus, lots of cache, etc.).

But I did just pick up a 955BE and a board from this forum for a buddy who does a lot of Photoshop on a tiny, almost nonexistent budget. AMD has its use cases, Intel has theirs. It's my opinion that if money were no issue, the Intel route would be the way to go, but in the end, this is my ultimate point: we're all [H] readers, right? Informed, no less. Pick whatever your research says is best, but don't try to skew shit to make me believe what you believe just so you can feel better about your purchase and fulfill some fanboi agenda.
 
LOL, what's funny is that most if not ALL AMD purchases are the BEST.

Why?

Because you don't get price-raped. You bought an Intel 6-core for $499-599 for the CPU alone.

You can get an 8-core FX-8320 for $139.99 plus tax at Microcenter, or $175.00 online with no tax.

You went Intel, so you got price-raped and can always say to yourself and others, "Look, I have the single-core performance advantage!"... lulz

While the other person says, "I have $350-450 more to spend on a GPU." ;)

Just my honest opinion.
 
And for a lot of software that is a valid point.

There are some applications that just won't run acceptably on any AMD chip out there, overclocked on water or not. There are fewer of them now than when BD first launched, but they are out there...
 
Performance is what matters, not number of cores.

http://www.hardocp.com/article/2012...ver_processor_ipc_overclocking/6#.UXdAvKJwfh0
 

QFT.

I was an AMD fanboy and just swapped from an 8350 to a 3570K. There's no comparison; the Intel chip smashes it in overall performance, and with less power consumption.

Not sure if some of you understand that more cores does not equal more power, lol. Especially in gaming, where very few games use more than 4 cores, let alone 6 or 8.
 
While the other person says, "I have $350-450 more to spend on a GPU." ;)

No, you're right. The 3930 probably doesn't justify its price: it's likely faster than the 8350 at stock speeds, but not $300 better in performance. Still, it is faster, and I use it for more than gaming, so I needed the extra threads and the IPC advantage.
 
Ah, I'll take the opportunity to ask something I have been asking myself while watching you newer gamers: the "fps" issue. I remember clearly that in my teenage years, when I was an enthusiast gamer, PC magazines and game reviewers kept repeating the same mantra: if it's above 30 fps, the game is fluid. Which is not irrational, considering that cinema movies are shot at about 24 fps and the human eye can't catch stuttering.

So, what is different with video games today? If a game plays at 50 fps with AMD and at 70 fps with Intel, do you actually notice it? What happens? Do the armies on the battlefield start running like rabbits at 70 fps? I have only seen that with old DOS games, where, if you run them on new CPUs, for programming reasons they go "crazy" and the sprites become "supercharged", and you can't play the game unless you use an emulator to slow it down.

But does that happen in Windows games today, too?
 
I had a chance to test the FX-8350 for a short while, and while I can say it is not a bad processor, I found it not noticeably better than the i5-750 I have at home. It also reached about 60°C under moderate load, which is A LOT for a current-gen CPU, considering my i5-750 is overclocked to 2.9 GHz and only reaches those temps in IntelBurnTest. I'd say Intel Ivy is the better choice even on price/performance once you take power usage and heat into account.

BTW, I have a Radeon card and want AMD to be serious competition in order to drive Intel's prices down, so I'm definitely not an Intel/nVidia fanboy.
 
Just wondering, but how far did you have the FX-8350 overclocked to hit 60°C? My FX-8120 hits 60°C around 1.5 V when I'm trying for 4.6 GHz+. If you're hitting 60°C at stock voltage, then you probably need to reseat the heatsink; either it's misaligned or you have air pockets in the thermal paste. If you did have the FX-8350 overclocked to 4.5+ GHz, how did it compare to your i5-750 @ 2.9?
 
Wyqtor, what coolers are on the 750 vs. the 8350? It's not like all coolers perform the same.
 
So, what is different with video games today? If a game plays at 50 fps with AMD and at 70 fps with Intel, do you actually notice it?

That's the argument a lot of AMD fans, and especially their fanboys, make: if AMD gets the job done, what does it matter? And if I see flaming going on by Intel fanboys, I'll even throw that argument out there. But that still doesn't change the fact that Intel bests AMD. Whether Intel costs more or not (and they do, and that's another typical argument), it's still the better chip from a performance standpoint. I'm not talking about bang for the buck or anything like that, just strictly performance.

I don't know how accurate it was, but I ran a benchmark this past weekend on my two machines because I was curious to see the difference. There are two big differences between them: the CPU and the video card. The video card result was obvious, albeit I wasn't expecting the 680 to be around three times as fast. The CPU, on the other hand, surprised me a bit. The 930 (granted, it has a higher overclock) was anywhere from half again to twice as fast as the 1055T. Of course, I spent about $290 on my i7 and around $110 on the 1055T, but I also bought the 1055T a couple of years later, so I'm assuming it had dropped in price pretty drastically by then.

Anyway, the big deal for an enthusiast like me in Intel being that much faster is longevity. I'll have had the i7 for three years once May hits, and I see no problem at this point in getting another two or so years out of it, if not longer. Am I saying the 1055T couldn't go that long? No. But it's quite obvious that, as long as nothing breaks, I should get more time out of the i7.

As for the price raping: I asked questions, did some research, etc. before I went with the i7. I had an idea what I was getting myself into, so I don't consider myself raped by Intel's prices. And as time goes on, the more I think I made the right choice at that particular time. If I were building a new machine now, I'd seriously weigh AMD against Intel based on the price difference and the argument that AMD will get the job done at a lower cost.
 
That's the argument a lot of AMD fans, and especially their fanboys, make: if AMD gets the job done, what does it matter?

Ah, so I deduce that things haven't changed much since the time I was a gamer. It was just a curiosity of mine, because I don't play modern games anymore and I am of the generation of gamers who played a game for the gameplay, not the graphics.

Thanks.

P.S.: In my time, the question with a new game was:
- Does it run?
- Phew, yeah, runs fine / no, it's "sticky" (stuttering).
Nobody really measured how many fps he was getting :)
 
You're definitely not alone as far as playing games for the gameplay over the graphics. As powerful as technology has become these days, I don't see why graphics have to suffer, but gameplay is definitely what I look for as well. And I haven't been overly happy with any modern games for some time now. The last one I thought was pretty well done, and not over the top, was Deus Ex: Human Revolution.

P.S.: In my time, the question with a new game was:
- Does it run?
- Phew, yeah, runs fine / no, it's "sticky" (stuttering).
Nobody really measured how many fps he was getting :)

Lol at that P.S., because I know exactly what you're talking about. Of course, in those times I wasn't the enthusiast I am now. Either way, that's the same way I was: if the game didn't play smooth, I knew I needed a better machine in some way. That was about it.
 
Look, I am an AMD fan from WAAAAY back. I had an 8-core Bulldozer. They are great. But for BF3 on multiple monitors, my Ivy Bridge quad RAPED the Bulldozer. People don't want to realize IPC is EVERYTHING. It's the basis for all core performance.

I had an 1100T at 4.0 and an 8150 at 4.5 GHz, and a 3570K at 4.5 GHz walked away with the performance in BF3. I love AMD, always have... I'm dying for them to come out with something competitive. When I build regular computers for other people, I use AMD! It's just that at the high end AMD has nothing to compete with. Not to mention that now, with the frame-time stuttering bullshit on CrossFire, their GPUs are in question too! AMD needs to wake up and move quickly.
 
You're definitely not alone as far as playing games for the gameplay over the graphics.

RANT MODE ON

I know exactly the feeling... But I haven't decided yet whether it's just me getting older or whether games really have abandoned gameplay for "wow graphics". I am a particularly bad case to judge, because I arrived at your state, more or less, at the time that every game went to a first-person 3D point of view. Meaning that if you put me in front of Skyrim now, I physically can't play it. I will vomit (motion sickness). I slowly start to get dizzy and sweaty, my hands get cold, and eventually I will vomit. I don't have this problem with 3D games that have a top-down view, like Shogun 2. So I have pretty much abandoned hope that I can play new games (I've read about some "tricks", but...). Anyway, this alone tells me my body doesn't have "fun" anymore. A game that makes you vomit isn't something I consider a "game". I tried to play Thief and Morrowind; nausea came on in record time. In Morrowind, the blurry text made things even worse.
I grew up with 2D games: bad graphics, great gameplay, so much fun. I have no idea how much time I spent as a little boy on MicroProse's 1987 Sid Meier's Pirates!. I learned the Caribbean by heart; I could recognize any point on the map just by looking at the shape of the coast.
I also liked RPGs, but the last ones I enjoyed without vomiting were Wizardry 8 and the Might & Magic series. The only "modern" game I play now is Shogun 2. I enjoyed Shogun 1 too, but not the rest of the TW games; I am not into firearms, and even in Shogun I don't use gunpowder weapons.
Same with simulators: I only play Total Air War 2.0, because I can play with great precision using just the keyboard. I grew up playing with a keyboard. At some point even my "traditional joysticks" disappeared, and now you have these awful "thumb controllers", which I don't even want to touch.

/RANT MODE OFF

If the game didn't play smooth I knew I needed a better machine in some way. That was about it.

Exactly... I remember when I bought what I thought would be the "ultimate gaming solution", the 3dfx Voodoo Banshee, so that I could play DID's Total Air War. Woohooo! I was in seventh heaven! But it didn't last as long as I had hoped :D

Still, good times. Kids today have way too much money to spend. Part of the joy of the upgrade has been lost, too; it happens so often now that it's become almost a "normal procedure".
 
Here is an interesting bit of info.

AMD is estimating a 15% IPC increase per generation. Intel hasn't given any numbers, but theirs have been around 7-10%, so let's use 10%.

If we assume those IPC increases and everything else stays the same, here are projected Cinebench scores for the next three generations, based on the Cinebench 11.5 single- and multi-threaded scores and scaling from AnandTech's Vishera review.

AMD single -> multi (6.26x scaling):
1.10 -> 6.89
1.32 -> 8.26
1.58 -> 9.92

Intel single -> multi (4.58x scaling):
1.66 -> 7.60
1.83 -> 8.36
2.01 -> 9.20

In the next generation alone, AMD would catch up to Intel, and the generation after that it would surpass Intel by about 10% in threaded workloads (ones similar to Cinebench).

Of course, this is only a small piece of the picture. I guess we will have to wait and see what actually happens :)

Totally off topic, I know!
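
If you want to play with the assumptions yourself, here's a rough Python sketch of the compounding arithmetic behind a projection like this. The base single-threaded scores and multi-threaded scaling factors are the ones quoted above; the per-generation gains are just inputs you can change, so treat the output as back-of-the-envelope numbers, not a prediction.

Code:
# Back-of-the-envelope projection: compound an assumed per-generation
# IPC gain on top of a base Cinebench 11.5 single-thread score, then
# apply the multi-thread scaling factor from the same review.

def project(base_single, mt_scaling, gain, generations=3):
    """Yield (single, multi) score estimates for each generation."""
    single = base_single
    for _ in range(generations):
        yield round(single, 2), round(single * mt_scaling, 2)
        single *= 1.0 + gain  # compound the assumed per-gen gain

# Base scores and scaling factors from the table above; the gains are
# the 15% / 10% assumptions stated in this post.
for name, base, scaling, gain in [("AMD", 1.10, 6.26, 0.15),
                                  ("Intel", 1.66, 4.58, 0.10)]:
    print(name)
    for gen, (st, mt) in enumerate(project(base, scaling, gain)):
        print(f"  gen {gen}: single {st}, multi {mt}")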
 
I know exactly the feeling... if you put me in front of Skyrim now, I physically can't play it. I will vomit (motion sickness).

Gaming has never had an effect on me like that. I don't get dizzy or anything; I can play for hours and hours without problems. But each person is different. I really started my gaming on consoles, and I continue on them today, although I love gaming on the PC as well. I see the pros and cons of console and PC gaming, and that's nothing I'm going to get into right now.

Anyway, as time progresses and computers get more and more powerful, I'm always looking for games to have "more" to them. A linear storyline doesn't cut it for me anymore. I like open worlds, where you actually have to know what you're doing and where you're going. I know these are games we're playing, but real life doesn't play out that way. Or at the very least, give me a destination and multiple paths I can take to get there. Very rarely do I find games like this, and that's why I liked Deus Ex. I just want more added to these games, rather than them all playing the same or having things taken away.
 
People don't want to realize IPC is EVERYTHING. It's the basis for all core performance.

IPC isn't everything. On its own it means nothing; IPS (instructions per second, i.e. IPC x clock speed) is what matters.
 
I had an 1100T at 4.0 and an 8150 at 4.5 GHz, and a 3570K at 4.5 GHz walked away with the performance in BF3.

You sure about that? Because here:
http://www.hardocp.com/image.html?image=MTMxODIzNDM3ODFEM0pGUjlMaUhfMl8zX2wuZ2lm

an 8150 @ 4.6 GHz is about 8% faster than a 2600K and a 2500K @ 4.8 GHz.

Soo.....
 
Isn't that benchmark from a single monitor?

Yes, but he didn't mention that he used Eyefinity or CrossFire. I don't think there are BF3 numbers for those at the moment, are there?

Anyhoo, I think the point of BF3 is that it is a game that utilizes multiple execution cores on a chip.
 
Game reviewers were repeating the same mantra: if it's above 30 fps, the game is fluid. Which is not irrational, considering that cinema movies are shot at about 24 fps and the human eye can't catch stuttering.

Well, the human eye CAN distinguish stuttering at the 24 Hz (more precisely, 23.976 Hz) of movies. That's part of what gives films their "movie look" and makes them feel more "epic" than sitcoms.

Framerate tolerance differs from person to person. My experience is as follows.

I can see stuttering at framerates below 30 fps. That is, however, typically fine for me in strategy games like Civ 5 and the like, but not in FPS games.

While things LOOK good at 30 fps, the mouse feel is just completely off to me until about 60 fps; things don't feel responsive and fluid. In single-player FPS games this doesn't matter much, but in fast-paced multiplayer shooters it can make a huge difference.

Above 60 fps, I can no longer tell the difference.

Keep in mind that traditionally, back in the day, we always measured AVERAGE framerate. To me that is the wrong number. The lowest framerate usually happens at the most intense scene and gets you killed. To me, MINIMUM framerate is what really matters, and it can be much lower than the average (there's a quick illustration of this at the end of this post).

Because of tearing and things like that, I tend to want vsync on too, which adds to the need for high framerates.

I am usually not happy unless I have the following MINIMUM frame rates with vsync on:

Turn Based Strategy Games: 15fps
Real time strategy Games: 30fps
Single player FPS Games: 30fps
Multiplayer FPS Games: 60fps

This is what I could not accomplish with my Phenom II X6 1090T at 4.1 GHz, and why I didn't buy a Bulldozer: at launch, the FX-8150's single-core performance at max overclock was worse than the Phenom II's, despite much higher clocks.

At the time, my favorite game was Red Orchestra 2, and it was very dependent on per-core speed. As all the artillery started falling during a massive offensive on a 64-player server (in other words, at exactly the wrong time), the framerate would often drop into the 30s, and sometimes below, when I wanted everything above 60. And I knew it wasn't my video cards, as the framerate was unchanged even if I lowered the resolution and all the settings to their minimums. In other words, a classic CPU bottleneck.

Since then, however, RO2 has been patched many times and now runs better on AMD chips, but I based my decision on wanting to be able to play my favorite game even if it depended on single-core performance, and Intel was the safer bet at the time.

If I were shopping today, I'd probably wind up with an FX-8350: while it is slower than the Intel chips, it's more than fast enough for 60 fps in anything I play, and I like supporting the underdog.
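
To put the average-vs-minimum point in concrete terms, here's a tiny Python sketch with made-up frame times: a couple of long frames (think the artillery barrage) barely move the average but drag the minimum way down, and the minimum is the moment that gets you killed.

Code:
# Made-up frame times (ms) for illustration: mostly ~15 ms frames with
# two long spikes, like an artillery barrage hitting the CPU.
frame_times_ms = [14, 15, 16, 15, 14, 45, 50, 16, 15, 14]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000.0 / max(frame_times_ms)  # worst single frame

print(f"average: {avg_fps:.0f} fps")  # ~47 fps, looks fine on paper
print(f"minimum: {min_fps:.0f} fps")  # 20 fps right when it matters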
 
So, you must own a 15-inch CRT with a maximum resolution of 1024x768, eh? :D

This should be obvious but....

It tells you in general which processor is faster, especially if a trend is established across multiple games. Testing at low res essentially removes the GPU as an FPS cap, so the limiting factor ends up being the CPU, which tells you how good the CPU's performance is in that game.

So say the CPU maxes out at 55% usage for the Intel and 70% for the FX-8350 when running at a normal 1920x1080 resolution; in that case the FPS is going to be basically identical. While that may not be relevant *now* for current games, it will be a factor in system longevity when, later on down the road... say two years... you drop in a GTX 970. If the Intel chip then hits 90% usage in normal games, it's possible the 8350 wouldn't be able to keep up, and *then* your FPS would suffer.
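
A minimal way to picture it (hypothetical numbers only): treat the framerate you actually get as the lower of what the CPU and the GPU can each sustain. Low-res testing pushes the GPU ceiling sky-high, so what you measure is the CPU ceiling, and that ceiling only starts to show at normal settings once a much faster card raises the GPU ceiling above it.

Code:
# Hypothetical ceilings: delivered FPS is whichever limit is lower.
def delivered_fps(cpu_ceiling, gpu_ceiling):
    return min(cpu_ceiling, gpu_ceiling)

cpu_fast, cpu_slow = 140, 95      # made-up CPU-limited framerates
gpu_now, gpu_future = 70, 120     # today's card vs. a much faster one

for label, gpu in [("today's GPU", gpu_now), ("future GPU", gpu_future)]:
    print(label,
          "- fast CPU:", delivered_fps(cpu_fast, gpu),
          "fps, slow CPU:", delivered_fps(cpu_slow, gpu), "fps")
# today's GPU: both CPUs deliver 70 fps (GPU-bound, scores identical)
# future GPU: 120 fps vs. 95 fps (now the CPU ceiling shows)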
 
Is the OC headroom for the 8350 better than for the 8150? I bought a Bulldozer at launch and was very disappointed in the performance, but I am seeing a major difference in reviews of the 8350 vs. Bulldozer.
 
Overclock headroom is similar. However, the 8350 runs cooler than the 8150, so it's easier to get higher clocks with less cooling.
 
Testing at low res essentially removes the GPU as an FPS cap, so the limiting factor ends up being the CPU, which tells you how good the CPU's performance is in that game.

I never really bought into that testing methodology. I prefer to actually see what I would get with realistic settings on realistic hardware. If you want to test with those low settings, put in an old FX 5200 64MB 64-bit card and see what you get. Otherwise, those results do not tell me what I need to know.

Enjoy.
 
It can be a useful angle for investigating system performance, for those who have a penchant for profiling details. Completely blowing off the value of such data is perhaps a bit narrow/shallow, or are you just projecting your preferences onto everyone else?
 
I think both are valuable.

Testing with current, realistic hardware at the settings you'd actually play at gives you the most realistic sense of what performance will be right now.

Testing while eliminating the video card as a bottleneck gives you some understanding of how future-proof the CPU will be.

I don't know about you guys, but my CPU/motherboard cycle is usually double the length of my video card cycle, so when I buy a new CPU and motherboard, I have to plan for it to last not just the current generation but the next one as well.

That being said, as far as games and current performance go, the whole "testing with super low settings to eliminate the video card" thing just runs you into pissing matches like the bad old days, where "my CPU is better than yours because I get 500 fps in Q3A vs. your 430 fps". Useless indeed.
 
I also wanted to comment that this chip is working great with VirtualBox virtual machines. I have the VHDs on a 500GB Seagate hybrid hard drive and they load quickly from it. (A straight-up mechanical drive was slower for VHD use.) I can assign 2 CPU cores to each VM and still play Crysis 3 without issue while running 4 VMs at the same time.
 
For people who bring up what the human eye can "see", I say:

I did not think anything over 60 FPS/Hz would make a difference. When I got my 2B Catleap @ 96 Hz, I could tell the difference even in the Windows environment, and in gaming just as well. You won't believe it until you see it yourself. I went from an HP LP3065 30" 2560x1600 to the Catleap 2B at 2560x1440 and it was noticeable in every way! If you're wondering why 96 Hz: I wanted 24*4=96 so movie frames get split equally across refreshes.
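
The 24*4=96 bit is just about getting a whole number of refreshes per film frame; a quick check over a few common refresh rates (my own list) shows which ones divide evenly:

Code:
# Which refresh rates give an integer number of refreshes per 24 fps
# film frame? (Refresh-rate list is just a few common values.)
for hz in (60, 72, 75, 96, 120, 144):
    note = "even pulldown" if hz % 24 == 0 else "uneven (judder)"
    print(f"{hz} Hz: {note}")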
 
Back in the CRT days it would hurt my eyes to look at a monitor running at 60 Hz. If I was at work walking past somebody's computer, I could tell if it was set to 60. I would always stop and change it to 70 or 75, whatever it could do without a problem.
 