AMD Ryzen 1700X CPU Review @ [H]

120 fps is meaningless overkill in every scenario. I know lots of people think they can tell the difference. They can't. It's all placebo over 90, and it's mostly placebo even over 60.

I honestly wasn't sure about this, so I decided to run a test while playing WoW. I have a 144 Hz monitor.

One thing I noticed when I first got the monitor was how much easier it became to read moving text clearly. So I figured I could do a simple test in WoW: position my camera straight up and down (maximizing fps and keeping all tests equal), stand right next to an NPC, strafe left and right, and note how smooth and readable his name was as it moved. For each test, I'd lock my FPS.

At 144: name scrolled smoothly, easy to read.
At 120: name still scrolled smoothly, easy to read. Couldn't tell it apart from 144.
At 90: name was noticeably harder to read; the text wasn't moving nearly as smoothly.
At 60: not even in the same world as the 120 test. The text was nearly impossible to read.

I tried the tests again in different locations, looking at objects instead of names. Those are a bit harder to judge, and 90 fps was harder to tell apart from the rest, but 60 was still clearly not as smooth as 120.

Now, were any of these game breakers? Not really. Would I have noticed while playing, without actively looking for it? Probably not. But it's definitely not placebo; it's very, very real.
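For reference, the arithmetic behind those caps is simple: frame time is just 1000/fps, so the steps above go from roughly 6.9 ms per frame at 144 down to 16.7 ms at 60, and a moving name tag jumps proportionally further between frames. A minimal sketch of that math, where the 1000 px/s strafe speed is an assumed value rather than anything measured in WoW:

```python
# Plain arithmetic: frame time at each fps cap, and how far an on-screen object
# moving at an assumed 1000 px/s slides between consecutive frames.
STRAFE_SPEED_PX_PER_S = 1000  # hypothetical on-screen speed of the name tag

for fps in (144, 120, 90, 60):
    frame_time_ms = 1000 / fps
    step_px = STRAFE_SPEED_PX_PER_S / fps
    print(f"{fps:3d} fps: {frame_time_ms:5.1f} ms/frame, ~{step_px:5.1f} px jump per frame")
```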
 
I agree. I have not compared 120 vs 144, but I have gone from 90 to 105 to 144. There is a huge, very noticeable difference. Anyone who believes you can't perceive anything past 60 fps is crazy. I see that there is a 240 Hz monitor coming out. Some people will say it's marketing and useless, but I believe there are people who CAN tell a difference and others who won't notice.

But to say that people only "think they can tell a difference" is absurd and pretty embarrassing for a "staff member" to state.
 
My opinions when I am posting in the forums are my own and do not reflect those of HardOCP, unless I am explicitly posting news, and even then they are really my own.

I appreciate you taking the time to look at it from a rational, testing-based perspective.

I'm having a difficult time picturing what you are saying, as I can't imagine moving text not being legible at any refresh rate of 60 or above unless there are some serious flaws in the engine. I certainly have not seen this in my 25 years of building systems.

I'll agree to keeping an open mind, though.

My experiences are pretty old at this point. I used to game with vsync on at 100 Hz in the original Counter-Strike days (back before CSGO, or even Source). When my 22" Iiyama Visionmaster Pro 510 died and I decided to get a flatpanel (a Dell 2405 FPW), I was very concerned that the drop in refresh rate from 100 Hz to 60 Hz would be a big deal and ruin my ability to play. I was a huge believer back then that flatpanels sucked and that the only good screens were CRTs, but I decided to "embrace the future" by being an early adopter of 1920x1200 widescreen flatpanels.

I spent a ton of time testing, going back and forth between my new screen and a friend's CRT (similar to my dead one) at 100 Hz, and in every test I ran, every experiment I tried, I could not find anything whatsoever to confirm my fears. That was, what, like 12, maybe 13 years ago now? So I was a man in my mid-20s, and my responses and senses were probably sharper then than they are now in my late 30s.

The acknowledgment that should matter here, though, is that it was a LONG time ago. So I'll admit I might be wrong. It goes against everything I've experienced and tested, but there is always a possibility. I'll see if I can get my hands on a high-refresh screen at some point to try it myself.
 
I'd agree that it's basically impossible to tell the difference on lower refresh rate monitors. I've only been able to tell the difference since switching to a 144 Hz one.

I'm completely confident, though, that if somebody ran the test I performed above blind with me, I'd get 100% of them correct when switching between 60 fps and 120 fps. The difference is that glaring. I'm not as confident I'd do perfectly between 90 and 120.
 
https://www.testufo.com/#test=framerates&count=3

I think that's the best I can offer to try and describe what I see. So in that test above, the 120fps UFO looks nearly perfect on my screen, as clear as if it weren't moving. The 60fps UFO has a bit of "double vision" going on. The 30fps UFO is a mess. (iirc the 120fps one only appears if you have a 120hz monitor).

The same thing happens to NPC names while I strafe left and right very close to them. At 120fps, they're fairly smooth and readable. At 60fps, the double vision effect makes them very hard to read while they're moving left and right very quickly across the screen.

At the end of the day, I do think certain people are more susceptible to being bothered by it than others. The whole thing is some weird combination of your eyes and brain and I don't think everybody sees it exactly the same.
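One common explanation for that "double vision" (hedging here, since it depends on exactly how the frames land on the panel): on a sample-and-hold display, each rendered frame stays lit until the next one arrives, so while your eye tracks the moving name it smears across your retina by roughly tracking speed times frame duration, and when the game runs at half the refresh rate each frame gets flashed twice while the eye keeps moving, which shows up as two offset copies. A rough sketch of the smear estimate, with the 960 px/s tracking speed being an assumed value:

```python
# Sample-and-hold blur estimate: while the eye tracks a moving object, each
# frame stays visible for its whole duration, so it smears across the retina
# by roughly (tracking speed) x (frame persistence). Assumed speed below.
TRACKING_SPEED_PX_PER_S = 960  # assumed on-screen speed of the UFO / name tag

for fps in (144, 120, 90, 60, 30):
    persistence_s = 1 / fps            # full persistence, no strobing / ULMB
    blur_px = TRACKING_SPEED_PX_PER_S * persistence_s
    print(f"{fps:3d} fps: ~{blur_px:5.1f} px of perceived smear")
```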
 
Z and FS, let me add an addendum:

Going from a slower ~16 ms 60 Hz panel, my ZR30w (a contemporary of Dell's U3011, just with far less input lag), to a 165 Hz Predator (IPS), literally everything was smoother, from snapping around in BF games to moving the mouse cursor and application windows around, but one thing really stood out:

Flying a little bird (helicopter) in BF4. Those things are nimble and also very deadly against infantry, if you can see them. On my 60Hz monitor, I had to slow down; the image just wasn't 'clear' enough to track small targets while pushing the chopper as fast as it could go. At 165Hz, man oh man, you can pick the little bastards out and do quick snap kills.

To put this in perspective: while my Acer monitor does have G-Sync, the single feature that I will not be without on a gaming monitor is 120Hz+ support. High refresh rate is simply where it's at. I'd rather live with tearing than live at 60Hz.

This current monitor is a 1440p 27" and is noticeably smaller than the 1600p 30" that now sits beside it. I've considered replacing it with one of the 34" ultrawides, but those are stuck at 100Hz and have iffy game support for 21:9, so I'm waiting for the high-refresh 4K monitors in the 32"+ class (and appropriate GPU(s)).
 
Monitor response time also plays into this. In the above test at 120 pixels/sec, the top UFO at 60 Hz is as clear as can be; at 240 pixels/sec it starts to get a slight shimmer (ghosting), and above that it's worse. Also, most games have motion blur, which blurs anything moving relative to the view, so unless you turn that off it will be blurry anyway (or the monitor will blur it anyway if the motion is fast enough). Having text go across the screen is not the same as a game with motion blur, depth of field, etc. Have you turned off motion blur in your games? I'm just not sure of the full impact of 100 Hz versus 144 Hz on a monitor. In VR, 90 fps versus 45 fps is in a whole different league.

Anyway, the problem I had before with higher-refresh monitors was the lack of color, which for me is worse than less accurate motion. Nowadays you can fortunately have both. My next monitor upgrade is most likely 2018 at the earliest, if not 2019. It will need to be HDR and high resolution first, with the highest refresh rate second. I just put more priority on the quality of the pixel than on motion quality at the extremes. I am glad there are options out there to meet other folks' preferences and needs.
 
Three additional points, noko:

1. I always turn motion blur off when fast reactions are required, whether it's a multiplayer shooter like a BF game or something like the Mass Effect games where I like to turn the settings up and missing usually means failure.
2. I'm not sure if I could reasonably tell the difference between 100Hz and 144Hz+, but I'll take better where I can get it.
3. I will say that I do not notice color degradation on my Predator at 165Hz in games versus running the desktop at 120Hz (for video). I believe that this is mostly tied to strobing, something I haven't tried.
 
Very cool. I think you will get the most out of the much better, more accurate motion with the higher refresh rate.

The second issue would be how many games can be driven at 100 Hz or higher at max or near-max settings at high resolutions, and what hardware that takes, leaving out some of the dumb settings like chromatic aberration and most games' depth of field (though for me Doom did it right; it actually looks better and isn't just a distance blur).

FreeSync does a great job of giving a smoother, tear-free feel, but it will not clear up text or objects in fast motion either. Getting a consistent 100 Hz in fast titles does require a lot of fine-tuned components. VR, I think, is pushing this more than anything else; it treats 90 fps as the minimum baseline, and the next generation may push that higher.
 
And again, show me a system that gives that kind of super-high FPS in all games, not just CSGO; CSGO can get its FPS on a potato of a computer...
Try getting 144 fps in GTA5 on the highest settings, and in all the latest titles.
Most games are already GPU-bottlenecked below what a 144 Hz screen needs unless IQ is lowered.

I totally get the appeal of 144 Hz screens. I have one, and I can see what it does, but it's not "night and day" going back to a 60 Hz screen, and I rarely achieve the fps where 144 Hz really comes into play unless I'm playing competitive games that could usually run on a 2010 AMD CPU.
You of course have those who buy the best and greatest every hardware generation, who can usually push even new games just barely to the point where a 144 Hz screen really shines, but you're talking about 800-2000 bucks a year to be able to do that currently.

Competitive games are usually not very demanding; the biggest ones out there are so popular precisely because so many people can run them.
LoL, StarCraft, Warcraft 3, CS, and CSGO, just to name a few: all of them ran well on midrange laptops in their prime.
 
But to say that people only "think they can tell a difference" is absurd and pretty embarrassing for a "staff member" to state.
No, it is scientific fact, and no, most cannot TELL the difference. Most scientific tests show that most people cannot tell the difference over 60. Then there is a small number who can, up to maybe 90. Now here is the kicker, and the issue with most people claiming superhuman abilities:

Take two frames of 10 ms (100 fps) each, one yellow and one red (not certain of the exact colors; it was a few years ago that I read the findings). In the test, not one person saw the individual colors; they only saw orange.

The problem is most people can FEEL the difference, not SEE the difference. If one of those 120 fps frames were black, you would not see it, but you might feel its existence as something being off. NO ONE can discern a single frame above 90 fps. They might be able to feel the smoothness, but that is about it.
 
Your explanation is bullshit.

You cannot compare simple tests to games that simulate real activity, which isn't binary and includes degrees of clarity as well as responsiveness to input.
 
Your explanation is bullshit.

You cannot compare simple tests to games that simulate real activity, which isn't binary and includes degrees of clarity as well as responsiveness to input.
So are you saying the findings are BS and so is the scientific study? Come on, if you can rebut them with findings that hold up to scientific standards for research, then by all means show us what you've got.
 
The study that you didn't link?

Yes.
 
You can google frame rate tests; there are a lot of them. I think the test I speak of may be referenced on Wikipedia, so you could try there. I have neither the time nor the desire to go floundering for the information right now.

Best I can tell, most of you high-refresh-rate believers are just trying to justify your purchases. All the scientific tests prove otherwise, and yet you defiantly reject their findings based on personal feelings.
 
Show a single one or quit your bullshit.
Seriously. It isn't BS, and the fact is I work for a living and therefore can't spend decades looking for it right now. Now feel free to post a link in support of your point until I can, or you too can quit.
 
Yup, he's done. Thanks for admitting you don't have a clue as to what you're talking about. Please refrain from repeating this nonsense in the future.
 
I like how the defence of the poor gaming performance (yes, it is poor, whether you compare it to the chip's strong multithreaded performance or to the fact that its competitor has a chip that's both faster and cheaper) has shifted from "it's good enough" to "you can't tell the difference over 90 anyway".

I mean, for fuck's sake. This is a tech forum, and you've got shit like that coming out here that I'd expect to see in a console gaming forum instead.

If you are primarily doing productivity work that needs more cores and your budget is in that range, get a 1700 and overclock it. If you are primarily gaming and your budget is in that range, get a 7700K and overclock it. Why does this need 25 pages of arguing?
 
Of course people can see a difference between 60 and 100 in games. At 60 Hz, quick pans of the viewport will move objects inches on the screen from frame to frame. I don't know how anyone would perceive that as smooth.

60 Hz looks smooth in ideal material, such as a static camera pointed at actors' faces, where the frame-to-frame delta is very small. In anything with larger movement, they use horrific amounts of blurring, which is anathema to twitch gamers.

I concur with others that over 100 it gets very hard to tell even in the worst cases (again, quick viewport pans). It would be interesting to get a military pilot or similar and see what they can perceive. I suspect some humans are astonishingly good at this, but they would of course be the exception.
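To put rough numbers on "inches on the screen": with a crude linear approximation (screen width divided by horizontal FOV, ignoring perspective projection), a fast pan shifts the scene by a sizeable fraction of the screen every frame. The pan rate, FOV, and screen dimensions below are assumptions for illustration, not measurements:

```python
# Rough linear estimate of how far the scene shifts per frame during a fast
# viewport pan. Pan rate, FOV, and screen size are assumed values.
PAN_DEG_PER_S = 360      # a full turn in one second
HFOV_DEG = 90            # assumed horizontal field of view
SCREEN_W_PX = 2560
SCREEN_W_IN = 23.5       # approximate width of a 27-inch 16:9 panel

px_per_deg = SCREEN_W_PX / HFOV_DEG   # crude: ignores perspective projection
for hz in (60, 144):
    px_per_frame = (PAN_DEG_PER_S / hz) * px_per_deg
    inches = px_per_frame * SCREEN_W_IN / SCREEN_W_PX
    print(f"{hz:3d} Hz: ~{px_per_frame:4.0f} px (~{inches:.1f} in) per frame")
```

Under those assumptions that works out to roughly 170 px (about 1.6 in) per frame at 60 Hz versus roughly 70 px (about 0.7 in) at 144 Hz.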
 
If you are primarily doing productivity work that needs more cores and your budget is in that range, get a 1700 and overclock it. If you are primarily gaming and your budget is in that range, get a 7700K and overclock it. Why does this need 25 pages of arguing?

Because in the last two weeks everyone has decided they now play games and do content creation (on the CPU instead of a GPU).
 
It doesn't, you're right. The only caveat I'd give is that there's probably 10-15% gaming performance left on the table in the badly performing games due to the CCX issue, but even then it's still slower than the 7700K in 3/4 of games and marginally better, if at all, in the rest. It does look to be at parity or a bit better for minimum frame times in VR, where fps is capped at 90, though.
 
Basically.


Just watch that video. Gamer Nexus says it straight out: the boards are a mess right now. AMD didn't ship final code until three weeks out, by which time reviewers had already gotten their samples. So I think it's an unfinished product on the microcode side. Given how memory is locked down, there will be lots of BIOS updates in the near future.


AMD not being finished with its code or reference UEFI/BIOS code is not the same as the motherboard manufacturers being done with their code. The motherboard manufacturers were probably not done with their code until a week to ten days prior to launch. At the very least, it seems to take the motherboard manufacturers about a week or so to turn what the CPU companies give them into something they can launch a motherboard with. When Z170 launched, GIGABYTE had finished their code about a week beforehand. I think they received their code about two weeks before that. ASUS had their UEFI BIOS in shipping/retail-ready form about two weeks in advance.

I think this CPU gaming performance issue is overblown. What I suspect is happening is that we are starting to see that we are CPU limited at 1080P when testing with monster cards like the GTX 1080 and Titan X (Pascal). I've been saying for a couple of years now that 1080P is not a demanding resolution. As GPUs get faster and faster for things like multimonitor, 4K, and VR gaming, we will see pedestrian resolutions like 1080P become more CPU bound than GPU bound. I think this is primarily why we see the differences we do with Ryzen vs. Intel CPUs. I'm speculating, of course. A lot of testing has to be done with Ryzen and various Intel CPUs to see if that's really the case. This doesn't change the fact that Ryzen may indeed be considerably weaker at gaming than Intel CPUs, but so far this issue seems to clear up at 2560x1440 and higher resolutions. To me that has "CPU bound at 1080P" written all over it. I don't see any other way the architecture, the Windows scheduler, or SMT could hurt Ryzen at 1080P only.

At least that's the impression I'm getting when looking at data like this.
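A toy model makes the "CPU bound at 1080P" reading easy to visualize: treat the CPU and GPU as two independent frame-rate ceilings and take the delivered frame rate to be roughly the lower of the two. Every number below is invented purely to illustrate the shape of the effect, not a benchmark result:

```python
# Toy bottleneck model: delivered fps is roughly min(CPU ceiling, GPU ceiling).
# All numbers are invented for illustration only, not measurements.
cpu_limit = {"CPU A": 130, "CPU B": 160}            # hypothetical CPU ceilings
gpu_limit = {"1080p": 200, "1440p": 120, "4K": 60}  # hypothetical GPU ceilings

for res, gpu_fps in gpu_limit.items():
    delivered = {cpu: min(cpu_fps, gpu_fps) for cpu, cpu_fps in cpu_limit.items()}
    print(res, delivered)
```

With those made-up ceilings the two CPUs separate at 1080p but collapse to the same number once the GPU ceiling drops below both of them at 1440p and 4K, which is the pattern described above.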
 
It looks like moving threads across CCXes causes a significant performance hit. The benchmarks in 'bad' games on 4 cores show it pretty well: 2+2 (two cores on each CCX) was 10% or so slower than 4+0 (all four cores on one CCX and none on the other), even though 4+0 effectively has only half the L3 cache. If they can keep threads from moving across CCXes in Win10, it should provide a boost back to where it should be. Win7 doesn't seem to have the issue; there's a long thread on Anandtech with a lot of that info.
 
Hmm, it is disheartening that the R7 processors are somewhat "weaker" than Intel's CPUs in games, but let's not forget that most of the games out today have been tuned for Intel's offerings. Let's just wait until everything's patched up, and we might see anywhere from slight to huge improvements.
 
People need to stop with that "tuned for Intel" nonsense. The compilers often do come from Intel SDKs, but the fact is that instruction sets are standard and are shared across both processor companies' product lines. AMD and Intel have a cross-license agreement for instruction sets. If an AMD processor sucks ass at AVX or SSE4, it's their own fucking fault for not creating a design that handles it better. Games aren't written for specific hardware; they are written for specific APIs. The importance of and reliance on the Windows scheduler shouldn't be undervalued either. There is room for improvement on that front, but there aren't "driver paths" in games or game engines like you have for GPUs.
 
Games aren't specifically tuned for Intel CPUs, Intel just happens to handle them better. I think the Ryzen performance you see today is going to continue in the future. AMD FX chips had much worse performance issues and they never got fixed. People sat around for years waiting on games to require more cores or be better optimized for FX multi core CPUs. This never happened because the issue is with the CPU, not the "optimization".
 
People have a hard time understanding that CPU architectures handle some tasks better than others. It just so happens that AMD is weaker on the gaming front than Intel. This isn't actually news as this has been the case for the bulk of the last 25 years. The Athlon / Athlon 64 are the exception, not the rule.
 
I think what's causing the disbelief is that one generally follows the other, and Ryzen is demonstrably at the top in some tasks while falling significantly behind in gaming.

I'm guessing that Ryzen will be a capable gaming processor going forward as devs test on it to ensure there aren't the massive deficits some games currently show.
 
With the current performance numbers, devs aren't going to blink an eye and try to make things run better on Ryzen. They are way more concerned with getting the game finished than with changing up their game engine to run better for the tiny number of people using Ryzen.
 
To be honest, Ryzen does look to benefit from hand-tuning thread affinities in games and a few other workloads of that sort, both to use the cache available and to keep the inter-CCX communication issues from getting too rampant. That is not an optimization so much as a workaround, but whatever.

That almost certainly won't happen in this generation of games, but who the hell knows what happens further out. For all we know, AMD goes under, or fixes the main uarch issues and makes the workaround obsolete.
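As a concrete example of that kind of workaround, a running game can be pinned to the logical processors of a single CCX from outside the game. This is only a hedged sketch: which logical CPU indices map to which CCX depends on how the OS enumerates them, so the 0-7 range is an assumption you would want to verify on your own system, and "game.exe" is a placeholder name:

```python
# Hedged sketch: pin every process named "game.exe" to one CCX's logical CPUs.
# Assumes logical processors 0-7 map to the first CCX on an 8-core Ryzen with
# SMT enabled (4 cores x 2 threads); verify the actual mapping first.
import psutil

ONE_CCX = list(range(8))   # assumed: logical CPUs 0-7 = CCX0
TARGET = "game.exe"        # placeholder process name

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == TARGET:
        proc.cpu_affinity(ONE_CCX)   # restrict the scheduler to that CCX
        print(f"Pinned PID {proc.pid} to logical CPUs {ONE_CCX}")
```

The same 4+0 versus 2+2 comparisons mentioned above can be approximated this way, or from the command line with something like Windows' start /affinity mask.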
 
At what point does market share become something game developers pay attention to? 10%? 20%? 30%? 50%? Just curious what level you think Ryzen as a platform will stay below (including the forthcoming low-priced quad- and hex-core models) when you make statements like that.

Also, I doubt devs will concern themselves with 10% or even 20% deficits in games that are still playable unless Ryzen literally becomes too big not to make a primary optimization target, but when I say devs will test, I mean for those relative corner cases like Total War: Warhammer, where Ryzen, stock for stock, is only about 60% as fast as an i7-7700.

I just don't expect games to be released (if Ryzen isn't a complete failure in the market) with that kind of deficit going unnoticed for long.
 
I can see some long hours testing whether different thread configurations make a significant difference: for example, the game on one CCX at higher priority, with everything else allowed on both but the game dominating its CCX. Where would the GPU driver operate from? Does the GPU driver operate across both CCXes? Can the driver behave inconsistently depending on how it is configured between the CCXes?
 
I expect Ryzen will be optimized for, since it will most likely be in the Xbox Scorpio. The 35 W performance from Ryzen is in a league of its own (very strong); it is almost as if the processor was designed for servers at 2.2 GHz and then pushed up for the desktop.
 
Yep, and this is why I stated it's not an easy problem to fix.
 
Users can turn down settings to make games run better. Look at Warhammer: the game runs like hell on high settings for existing AMD FX users (of which there are millions). Did the developer "fix" it before they launched the game? Nope. They launched the game as is, with some great-looking visuals and terrible performance for AMD users compared to Intel.
 
The 60 Hz thing is that people cannot pick out a single crazy frame in a 60 Hz stream; at most we sense a flash or glitch but cannot identify what was shown. That has nothing to do with sensing the fluidity from 30 to 60 to 90 to 144, which is easy.
 