AMD Zen Performance Preview

Case in point:

The GTX 1080 can barely max the game at 1080p, but the FX-4320? No issues whatsoever. CPU performance certainly helps, but games have been GPU-limited for ages now. Upgrading CPUs for gaming purposes is a thing of the past; they're fast enough to get all the necessary work done, and performance is almost entirely dominated by the GPU's ability to pump out frames.

Be careful with possible pre-scripted benchmarks. They will show artificially low CPU usage.
 
Interesting assessment; do you have information on that?

As in? You can test it by comparing the built-in benchmark against in-game play.

Warhammer is an example of it. The previous Tomb Raider is another. Alien vs. Predator (this one is really bad), and so on. You only find out when you test with actual in-game play.

If we look at the previous Tomb Raider:

Built-in benchmark, i5 Haswell with 7950 CF:
800 MHz: 74 FPS
4900 MHz: 104 FPS
Difference: ~40% gain from 800 MHz to 4900 MHz

In-game play with the same system:
800 MHz: 22 FPS
4900 MHz: 105 FPS
Actual difference: ~380% gain from 800 MHz to 4900 MHz
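The scaling gap above is easy to sanity-check. A throwaway Python sketch (FPS figures copied from this post; nothing else is measured):

```python
# Percentage gain between two CPU-clock data points, using the
# FPS figures quoted above (Tomb Raider, i5 Haswell + 7950 CF).
def pct_gain(fps_low, fps_high):
    """Performance gain of fps_high over fps_low, in percent."""
    return (fps_high / fps_low - 1) * 100

benchmark = pct_gain(74, 104)   # built-in benchmark, 800 MHz -> 4900 MHz
ingame    = pct_gain(22, 105)   # actual in-game play, same clocks

print(f"benchmark: +{benchmark:.0f}%")  # ~+41%
print(f"in-game:   +{ingame:.0f}%")     # ~+377%
```

The built-in benchmark barely reacts to a 6x clock change, while real play scales almost linearly with it, which is exactly the "scripted benchmark hides the CPU" effect being described.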

Why buy an FX, for example, when you can get 50-60 FPS with a Kabini as well?
AMD A6-5200 (Kabini)

Perhaps because you can't when actually playing ;)
 

And this, right here, is why I put sole faith in [H] because of the real-world gaming reviews for CPUs and GPUs that Kyle and team undertake and publish.
 
You've been here since 2008? What forum sections are you visiting? lol... AMD's been on the shit list for the last 5-6 years; there's literally nothing they can do, right or wrong, where they don't get bashed into the ground by people here.
How many years has AMD been filling their marketing spiel with obfuscation, bullshit and lies? My issue is not, nor will it ever be, with AMD's products. My issue, as I've said multiple times, is with their marketing. They know full well that their products cannot perform to the level that they say they can, yet they still say they can. Speaking for myself, if and when AMD stop with the bullshit then I will give them a fair go. While they continually blatantly lie in their marketing I am not interested. It comes down to trust - if a company is lying so obviously about their performance then what else are they lying about, and how can I ever trust them?
 
The same way you trust all of the other manufacturers that lie. AMD doesn't have a lock on BS or lying about performance or features.
 
[chart: CPU_01.png]

Are you being sarcastic? Because it is really tough to tell online.

If we look at the minimum framerates (the only measure that really matters, as they determine the gameplay experience; average framerates are a useless measure), it doesn't match what you are saying at all.

I look at this chart, and only the top six CPUs are not holding the GPU back, and they are all Intel chips.


The top AMD chip on there has a minimum framerate of 56. That's not awful, but it's not great either. I usually demand that my minimums never drop below 60. The FX-4320 you are talking about has a minimum framerate of 37, compared to a non-bottlenecked 61 on the top CPUs. That's pretty awful. It means that during the most heated, most intense scenes, where you need the performance the most, the system will drop down to 37fps. That is completely and totally unacceptable.
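A single minimum-FPS number can be set by one outlier frame, which is why many reviewers report percentile lows instead. A rough sketch of that computation (the frame-time data here is invented for illustration, not taken from the chart):

```python
# Compute average FPS and "1% low" FPS from a list of frame times (ms).
# The frame times below are made-up sample data, not chart values.
def fps_stats(frame_times_ms, low_pct=1.0):
    n = len(frame_times_ms)
    avg_fps = 1000 * n / sum(frame_times_ms)
    # 1% low: average FPS over the slowest 1% of frames
    worst = sorted(frame_times_ms, reverse=True)[:max(1, int(n * low_pct / 100))]
    low_fps = 1000 * len(worst) / sum(worst)
    return avg_fps, low_fps

times = [16.7] * 95 + [27.0] * 5   # mostly 60 FPS with a few slow frames
avg, low = fps_stats(times)
print(f"avg {avg:.0f} FPS, 1% low {low:.0f} FPS")
```

A run like this can average near 60 FPS while the slow frames drag the 1% low far below it, which is the "during the most intense scenes" problem in numbers.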
 
You really need more than minimums; you need to know how often it spends there.
 

You really should invest in a proper monitor with G-Sync. You demand the best, but you're not really doing your system justice if you don't have one. Keeping your framerate high is one thing, but running that Titan XP without G-Sync is unacceptable in itself. It will give you the gameplay experience you are looking for, and you will have no stutters.
 
Gsync is pointless if you have enough GPU power to vsync at an acceptable refresh rate.

I wouldn't mind going with Gsync if it came on any monitors I want, but I've never liked any monitors I've seen with Gsync.

I'll look into it when 40"+ 4k gsync screens with IPS (or something better) and good color calibration exist.

Only about 2% of my computer time is spent in games, so I'm not going to make any sacrifices at all for desktop use, if I get a gsync screen. It has to be awesome at everything.
 
If the minimums are 60+fps I don't give a rat's ass how often they spend there :p
 
I'm still rockin a Phenom II 1045t. I'm hoping that Zen will be the chip that I feel like I can finally upgrade to.
 
2% and you have a titan XP! damn! Must be Nice! LOL
 
My rationale is that I have so little time for games, that I want the little time I do have to be as good as possible.

I justify it the same way, but I can't justify a Titan XP, lol. I'd probably have buyer's remorse quickly. I don't game much either, but I want a system good enough to game on. I have a 4K monitor, but I at least want to game well at 1440p.
 
Well, G-Sync is not pointless, to be honest. If you dig deeper into it, it matches your monitor's refresh rate to the framerate, so no matter how the FPS drops, you get a consistent gaming experience. You have to see it to believe it. I don't have G-Sync, but I have seen it in action and it's smooth. My monitor does have FreeSync, which I am about to try out soon.
 
lol, I often wonder if I could really detect the whole sync-display thing... mainly because I can't see any issue currently. (I never think to myself "this is so not smooth".) I remember playing BF3 with my Nvidia GTX 280s, and the VRAM stutter limit drove me nuts... that kind of issue I notice! :woot:
 
Well, I currently run with adaptive vsync on, and my monitor refresh set to 60hz.

So as long as my GPU can push 60fps, everything is perfect. (I know people will disagree, but personally I'm of the opinion that anything above 60fps puts you in placebo territory.)

The problems occur if scenes get busy enough to drop my frame rate below 60fps. Then I will see longer frame times, a lower frame rate, more input lag and tearing, as I am no longer vsynced.

If I had a gsync screen I'd still see the higher frame times, lower framerates and more input lag, but the gsync monitor would take care of matching the refresh rate, so I'd no longer have tearing.

Now I hate tearing as much as the next guy, but I'm just not sure that's enough of a benefit to give up my 48" 4k JS9000.

The exception is with titles like Deus Ex Mankind Divided which is unusually hard on the GPU. I'm hoping that by the time I get around to playing it, game patches and driver optimizations will have improved it to the point that any dips below 60fps will be rare and short, and I won't have to worry about it.
 
How is text on this screen? Personally it would be too big for me. I do have a 4K Samsung TV in my room, a little over a year old, so not the latest model. But damn, 48" is just big. I have a 49-inch Samsung 4K; I can't imagine having it 3-4 feet away from me, lol.
 
I was the same way. I started noticing that shit more when I started looking into G-Sync and FreeSync. I was like, why did I ever? Now I notice tearing and shit, damn it! rofl.
 
How exactly do you have tearing when the frame rate is not above the refresh rate? (Aside from certain games.) Pretty sure I can only see tearing when I don't have a frame rate cap or vsync... and even then, in "some" games I still don't see any! (Alien Isolation.) Maybe I just never notice it, lol.
 
I sit with my head approximately 2.5 feet away from it. When I bought it I was concerned it was a little too big, but I wanted the higher quality of the JS9000 model, and 48" was the smallest size it was offered in.

Overall the quality is quite fantastic. True 4:4:4 chroma in PC mode results in good fonts too. The DPI is a little low for my viewing distance. According to my calculations, I would find this screen absolutely perfect if it were ~42", but as it is, its size results in a slightly lower DPI than I would like.

That being said, the extra screen real estate is a godsend when working with many open windows, and playing games on it is effin amazing.
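For what it's worth, the pixel-density math here checks out. A quick sketch using the standard PPI formula for a 3840x2160 panel (the 42" figure is the poster's preference, not a real product):

```python
import math

# Pixels-per-inch for a 3840x2160 (4K UHD) panel of a given diagonal.
def ppi(diagonal_in, w=3840, h=2160):
    # diagonal resolution in pixels divided by diagonal size in inches
    return math.hypot(w, h) / diagonal_in

print(f'48": {ppi(48):.0f} PPI')   # ~92 PPI
print(f'42": {ppi(42):.0f} PPI')   # ~105 PPI
```

So dropping from 48" to ~42" at the same resolution would raise the density by roughly 14%, which matches the "slightly lower DPI than I would like" complaint.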

Pic of setup:



Screenshot of work arrangement:

 
You can have tearing any time your frame rate isn't exactly synced to your screen refresh rate.

With vsync on, this doesn't happen, because it syncs the framerate to the refresh rate of the screen (60fps for me). If you can't render fast enough, it syncs the framerate to every other refresh of the screen (30fps for me), and so on.

With Nvidia's adaptive vsync, it syncs to the refresh rate of the screen unless you can't render fast enough. Then the framerate drops below the refresh rate of the screen and is unsynced. This is when tearing can occur.
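The stepping behaviour described above can be sketched as a tiny model. This is a simplification: classic vsync steps down to integer divisors of the refresh rate, adaptive vsync instead unsyncs below the top rate (tearing), and G-Sync would simply match the render rate:

```python
# Effective display rate under classic vsync on a fixed-refresh panel:
# the framerate locks to refresh, refresh/2, refresh/3, ...
def vsync_fps(render_fps, refresh=60):
    divisor = 1
    while refresh / divisor > render_fps:
        divisor += 1          # step down to the next divisor the GPU can feed
    return refresh / divisor

for fps in (75, 60, 45, 25):
    print(fps, "->", vsync_fps(fps))
```

For example, a GPU rendering 45fps on a 60Hz panel gets locked to 30fps by classic vsync; that halving is exactly what adaptive vsync avoids by unsyncing instead.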
 
That is a nice setup, do ya do music or voice overs?
 
pretty dope setup. and I don't think you have enough monitors! ;)
 
Thanks!

It's an old leftover microphone from when I tried to build a mini-home studio on the cheap for my non-serious post-college band. After not messing with music for years, I started using it again for gaming, as I got so fed up with the crappy mics that kept breaking. :p
 
Lol. I have a collection of monitors in the basement I could use. (they were throwing out a shit ton of them at work, and I saved them from the recycling bin)

What I don't have more of - however - is desk space :p
 
It's nice, but I won't lie... it's a tad busy! But I guess it works if you really use the side displays for work or whatever... It might irk me after a while! I couldn't use all that information anyway!
 
I only really look at one source of information at a time.

What irks me is when I have to cycle between windows to find the two I need information from.

Much nicer to just arrange them beforehand, and only have to glance back and forth.
 
lol.

The question I have is why aren't more people up in arms about this? If Intel's or Nvidia's marketing programs were as full of blatant lies, obfuscation and deception as AMD's have been for the past several years, you would hear the outcry from the South Pole. But AMD somehow gets a pass? Not from me they don't.
You can't be serious, right? You mean the same Intel that's appealing in the European courts over bribing OEMs, or the same Nvidia that just settled the class-action lawsuit over the 4GB 970 fiasco?
 
Nice setup. Are those DT-770s? Unless you're worried about isolation, closed headphones aren't that great for gaming.
Headphones are VERY subjective! They absolutely are good; it just depends on one's taste!
 
And they have and will pay for it because of the outcry, lol.
 
I can't wear them; well, I can, but they give me a headache. I have 5.1, so I like it better anyway. My wife has some 5.1 headphones (not sure on the .1).
 
I am not talking about subjective traits of a headphone. Like some people like darker headphones, and others like brighter sounding headphones. I am talking about the width of the imaging sometimes referred to as sound stage.

It's the ability of a headphone to give you the illusion of sounds happening around you, not inside your head. This is not just subjective; it can be measured (although not easily) and is heavily influenced by the phase response. While closed-back headphones offer sound isolation, they have a hard time providing a wide sound stage (that's the main difference between the types of over-ear headphones). This is why Beyerdynamic and other manufacturers make both types of headphones, closed and open.

I own a pair of DT-770s, among others. They are good headphones, but they aren't ideal for gaming because they have a very narrow sound stage. They have other qualities; for instance, they are popular in studio work because they are fairly neutral (flat frequency response curve) while offering great isolation, which is clearly a must for any band recording with live instruments.

If you care about isolation, both egress and ingress (other people hearing your sound, or blocking outside noise), then closed-back headphones are your only choice. But for gaming, which generally relies on binaural audio to provide spatial imaging, open-back headphones have a very clear advantage.

If you want a good open back gaming headphone look into: AKG 7XX (darker) or Philips SHP9500 (brighter)
 
They are. I originally bought them for music, but I have been using them for games, just because I have them.

I know a set of open-back headphones would provide me a much better sound stage with better positional audio, but I haven't gotten around to investing in a pair yet. I tried a friend's set of supposedly 5.1 "gaming" headphones, and while the positional audio might have been better, I found the sound quality abysmal and couldn't stand them.

Surprisingly, it is also very game dependent. I find that I get excellent positional audio with them in Counter-Strike and other source based titles. In Red Orchestra 2 - however - which is based on Unreal Engine 3, the positional audio was awful compared to what other people I was playing with were getting. I couldn't tell what direction sounds were coming from at all, whereas others had no problem with this. Maybe the source engine just exaggerates positional sounds to the point where even my headphones that are ill suited to it do an OK job?

Either way, I've gone back and forth on getting a second set for games for a while now. Not sure if it's worth it, since I play so few competitive multiplayer games these days. Most of my time in Steam is in Civilization :p I guess they could be helpful in the likes of Fallout, but I'm not sure I want to spend the money for single-player games...

That, and I could also use a new set for music. The DT770 Pros have been with me for 6 years now and are starting to feel a little tired.

And wow, have we gone off topic...
 
I feel terrible about taking this thread so horribly off-topic (maybe I should PM you instead), but I am interested in your recommendation of those AKG 7xx headphones. The only thing I can find called 7xx is a limited-edition Massdrop buy. Is the "xx" just "insert number here", as I can find K701 and K702 models? They seem nice, but they are all very low impedance (62 ohm?). Do you know if they come in a 250+ ohm version?
 