AMD Ryzen 1700X CPU Review @ [H]

(Seen on reddit /r/amd) pcgameshardware is saying the successor to Summit Ridge will be called "Pinnacle Ridge". I cannot remember if Pinnacle Ridge has been mentioned before, so I decided to post this. The ETA according to PCGH is 2018; tbh that's a little too broad to be useful in my opinion.
 
I have no problem with people saying they can in fact feel the smoothness, and to some degree, up to 100 fps, I feel they can tell the difference between widely spaced frame rates, i.e. 30, 60, 90, but not 65, 70, 92, 95. Unfortunately, most claim they can in fact see individual frames, as in discern the actual image that lasted just one frame.
No, most do not, and most discussions revolve around smoothness, which your study doesn't talk about. People can tell the difference above 60 FPS, and you're building a straw man by referencing single-frame recognition.


My experiences are pretty old at this point. I used to game with vsync on at 100 Hz in the original Counter-Strike days (back before CSGO, or even Source). When my 22" Iiyama Visionmaster Pro 510 died and I decided to get a flatpanel (a Dell 2405FPW), I was very concerned that the drop in refresh rate from 100 Hz to 60 Hz would be a big deal and ruin my ability to play. I was a huge believer that flatpanels sucked back then, and that the only good screens were CRTs, but I decided to "embrace the future" by being an early adopter of 1920x1200 widescreen flatpanels.

I spent a ton of time testing, going back and forth between my new screen and a friend's CRT (similar to my dead one) at 100 Hz, and in every test I ran, every experiment I tried, I could not find anything whatsoever to confirm my fears.
Man, you must have some "slow ass eyes" if you couldn't tell a huge difference between a 100 Hz CRT and an old slow 60 Hz LCD.
 
More rumors that the Win 10 scheduler is totally messed up, i.e., looking at all 16 logical cores as physical cores and also somehow thinking that the CPU has over 130 MB of cache available to it. Rumors are that AMD/MS are looking into it.
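
For anyone who wants to check what their OS actually reports, here is a quick sketch (assuming Python 3 with the third-party psutil package, nothing more official than that):

[code]
# Check how the OS reports CPU topology.
# Requires: pip install psutil
import psutil

physical = psutil.cpu_count(logical=False)  # real cores only
logical = psutil.cpu_count(logical=True)    # includes SMT threads

print("Physical cores:", physical)
print("Logical cores: ", logical)

# A Ryzen 1700X with SMT on should report 8 physical / 16 logical.
# If everything showed up as 16 physical, that would match the rumor above.
[/code]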
 

...and back then I was actually good at fast twitchy shooters. I was usually one of the best players in any server I joined, and was very frequently falsely accused of hacking.

This is why my inclination is to believe that there is at least some element of placebo involved at higher framerates. If I could still perform as well as I did and not notice the difference, then is there really a difference? :p
 
I can read an 80cd/m2 monitor in the dark just as well as a 300cd/m2 one, but the experience is quite different :)
 

I intentionally keep my monitors dim, as when I turn up the brightness it feels like they are going to set fire to my eyeballs.
 

Egads....

I always run with v-sync and anytime I'm on Quake Live I'm top three.

For god's sake we used to play UT at 30fps and we were damned happy to have that!

Kids. Easily marketed to.
 
Get out your tinfoil hats and strap in. :rolleyes: This has been present for a very long time on Intel's platform, including any Xeon-based server used in government and Fortune 500 companies. I've never heard of any case where the Management Engine interface was used maliciously. That doesn't mean there aren't any such cases, but were it a commonly exploited flaw, I think people would be up in arms about it. Something like that would be a PR nightmare worse than the Intel FDIV bug on the Pentium 60/66 MHz CPUs ever was. This is why Intel guards the ME source code so closely.


Yep, this has been around for some time. Let's say the recent CIA leaks uncover that Intel has provided this backdoor to the CIA? Maybe the NSA has it? And now, with the leak, it is in the wild? Chinese hackers using it to steal confidential company IP?

These are all theoretical, but they really go to show that backdoors, any backdoors, are always a bad idea. It's not a matter of if, but a matter of when, they get compromised.
 
Not a straw man, and seriously, what is with the attacks on both of us?

Fact: when both frame rates are metered, say with perfect frame times like Vsync, then no, you will not be able to tell the difference between 60 Hz and 120 Hz.

Proof: ask anyone with G-Sync or adaptive sync.

As far as smoothness, no, that was not the bulk of the discussion. It was in fact about the difference between 60, 90, 120 and 144 Hz and the ability to discern it, with 90 Hz being the maximum where one MAY be able to tell, and most barely able to tell the difference above 60 Hz, which is what the study I posted showed.

Now if you have some study that proves your point, then by all means post it. But if all you are going to do is cry and moan because you don't want to believe it, well then, I don't expect to hear any more from you on the subject.
 

Yup, I cannot use my machine without G-Sync on (getting it to work is a mission all on its own). Not sure if it's the 1070 or if MSI really skimped on the monitor, but the screen tearing is so unbearable it's unplayable. With G-Sync on I don't notice anything bad at all. No idea why the screen tearing was so apparent; my desktop has a 7870 in it and has never had the issue. Perhaps the 24" vs 15" screen gives my eyes a much smaller area to focus on, but most likely it was the games I played (Overwatch, CSGO, MWO, Witcher 2, etc.), which all ran at quite high frame rates. So I'd say G-Sync/FreeSync replicates the high-FPS experience.
 
So there is a recent PCGamer article on the refresh rate perceptible by the human eye.

http://www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/

It appears from that (and my research on Wikipedia) that the human visual system operates at around 13 Hz, but smoothness of motion can still be perceived up to around 200 Hz.

I know personally I saw a big difference going from a 60Hz LCD screen to 120Hz, but almost no difference upgrading to 144Hz.
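
That matches simple frame-time arithmetic, for what it's worth: a frame lasts 1000/60 ≈ 16.7 ms at 60 Hz, 1000/120 ≈ 8.3 ms at 120 Hz, and 1000/144 ≈ 6.9 ms at 144 Hz. Going from 60 to 120 shaves about 8.3 ms off every frame, while 120 to 144 only shaves another 1.4 ms, which would explain why the second jump is so much harder to notice.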
 


Yeah, the eye can see around 120 Hz, depending on the person though; how fast their nerves transmit and how fast their brain can analyze the information is where perceptibility comes in. But 120 Hz is about the max.
 
The Windows scheduler has been confirmed as a major problem, as it doesn't assign tasks to CPU cores efficiently. The Stilt reported about 17% better, and more stable, performance on Win7.

We managed to get Win 7 on, and the OS feels snappier. We tried an RE7 run while observing core usage; in Win 10 there are instances of usage being fine, then a single core fully loaded, then back to normal.
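
If anyone wants to watch for the same pattern themselves, here is a rough sketch of that kind of per-core monitoring (assuming Python 3 with psutil; the 95% threshold is just an arbitrary pick):

[code]
# Print per-core utilisation once a second so the
# "fine -> one core pegged -> back to normal" pattern is visible.
# Requires: pip install psutil
import psutil

try:
    while True:
        # Blocks for one second, returns a % figure per logical core.
        usage = psutil.cpu_percent(interval=1, percpu=True)
        pegged = [i for i, pct in enumerate(usage) if pct > 95]  # arbitrary threshold
        line = " ".join("%5.1f" % pct for pct in usage)
        print(line, "| pegged:", pegged if pegged else "-")
except KeyboardInterrupt:
    pass  # Ctrl+C stops the sampler
[/code]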
 

Not really shocked; I kind of figured the erratic performance at times felt like a scheduler issue. Hope Microsoft and AMD fix it quickly. Kind of sad that Windows 7 supports it better than Windows 10.
 
The cache problem, if it really does exist, sounds like a plausible reason why disabling SMT helps somewhat, but not by that much. If more data is being sent to the CPU than the CPU can actually handle, I would assume it would cause all sorts of odd issues.
 
https://link.springer.com/article/10.3758/s13414-013-0605-z
77 FPS recognition, plus the fighter pilots' 220.

That is single-frame recognition, a much harder task than smoothness perception.

https://us.hardware.info/reviews/45...fers-120-hz-monitors-results-86-prefer-120-hz
Not vsynced, but 200 FPS average and >140 minimum.

"As far as smoothness, no, that was not the bulk of the discussion. It was in fact about the difference between 60, 90, 120 and 144 Hz and the ability to discern it"
No, but yes?
Yes: the ability to discern the difference in motion/smoothness, not single-frame differences/recognition.
 
Had you read my post in its entirety, you would have seen me address the fighter pilot test, which is bunk. It is a plain setup, and they are told what they will be looking for, only having to identify the model. Motion pictures with subliminal messages ran at rates far lower than 200 Hz, even less than 60 Hz, yet almost nobody was ever actually aware of them, i.e., able to discern the frame.

I have no doubt at all that in blind tests most may perceive higher frame rates as better/smoother. But that doesn't mean that, given perfect frame timing, they would choose 120 over 90 or even 60. Like I said before, most people using G-Sync/Adaptive-Sync speak of how low frame rates still feel like the higher frame rates they might have used before.
 
Ok, Ryzen is a relative failure so everyone just continue to buy Intel.........problem solved.

I own both, and my 7700K isn't as good as Ryzen. My 7700K is at a constant 95-99% usage while my 1700 is at 35-50%. This is in games. Also, if I stream and play at the same time, Ryzen can do it, but the 7700K can't; it just dips the frame rate a lot and sometimes cripples it.

The 1700 is the superior processor, hands down. If you want a fair comparison for the 7700K, wait for the Ryzen 1600.
 
Why do I not feel like your opinion is trustworthy... But I digress; the 1700 is great as long as you do not care about fps.
 

Which goes against the results of every single review on every single respectable tech site I've come across. Needless to say I'm going to treat your information with just a hint of skepticism.
 
Most reviews only touch min and avg as single numbers. He may very well be having a better gaming experience with the Ryzen based on sustained frame rates. There are a number of reviews pointing out that real-world gameplay is fine, and in some situations better than on the 7700K. Of course, many of these are at stock settings, but that doesn't detract from the general point of how good frame times tend to be on Ryzen thus far, even in its infancy.
 
You can easily settle this using frametime analysis from the people who pioneered it.

Go to the bottom of the page (and subsequent pages) and compare results for time spent beyond 16.7 ms and 8.3 ms. 16.7 ms corresponds to a solid 60 FPS; here, Ryzen is very close to the Intel CPUs. 8.3 ms corresponds to a solid 120 FPS, and here Ryzen doesn't always put on a good showing.

This is really all you need to know about Ryzen and gaming at release: 60 FPS good, 120 FPS not as good.
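
If you want to run the same kind of numbers on your own captures, here is a minimal sketch of the "time spent beyond X ms" metric (assuming Python 3 and a hypothetical frametimes.csv with one frame time in milliseconds per line):

[code]
# "Time spent beyond X ms": accumulate only the excess over the threshold,
# so a 20 ms frame counts 3.3 ms against the 16.7 ms (60 FPS) budget.

def time_beyond(frame_times_ms, threshold_ms):
    return sum(ft - threshold_ms for ft in frame_times_ms if ft > threshold_ms)

# frametimes.csv is a made-up name: one frame time (ms) per line.
with open("frametimes.csv") as f:
    frame_times = [float(line) for line in f if line.strip()]

avg_fps = 1000.0 * len(frame_times) / sum(frame_times)
print("Average FPS: %.1f" % avg_fps)
print("Time beyond 16.7 ms: %.0f ms" % time_beyond(frame_times, 16.7))
print("Time beyond  8.3 ms: %.0f ms" % time_beyond(frame_times, 8.3))
[/code]

Two runs can post the same average FPS and still split badly on the 8.3 ms number, which is exactly the 60-versus-120 FPS gap described above.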
 
No one needs to settle anything. The only conclusion to all of this is that your 6700K/7700K four-cores have become modern-day Celerons. Deal with it :D
 
I ended up buying a 1700X this weekend, and a B350 board with some cheap DDR4. I plan to get a better board when stock evens out, and use the B350 for a Ryzen 5 when they eventually come down the pipe.

Running on an FX8350 with 8GB of RAM right now is really killing me in games; everything is laggy and frame rates suck in the games I play. I also don't need the fastest CPU out there, so I don't mind giving Ryzen a try.
 
Reading some of these posts, you'd get the impression that ryzen could barely play minesweeper.
lol, the stages of grief for some folks; denial I think is first. Ryzen came out strong. The platform is shaky but should mature.
 
I can confirm that Ryzen struggles to run minesweeper at 1080p, but I hear if you disable SMT and park 7 of the 8 cores, it blows the i7 7700K out of the water.
 
I'm leaning more towards optimization than the bottleneck theory... and while these are hand-picked, it does leave questions.

[three attached benchmark images, r_600x450.png]
 