Rockenrooster:
Wow, such utter destruction in IPC, single-core perf, obviously multi-core too, AND at much lower power! Insane!!!
> Genuine Intel(R) CPU 0000: Toppc from MSI claims 98% of chips can do 2000.

So, 98 percent of Ryzen 5000 series can do Infinity Fabric 2000? That's what you're saying?
> From the reviews it looks like all of the 5000 series chips are roughly the same for gaming, ultimately.

Reviews for the 5600X/5800X make it look like the i5-10600K is usually faster when gaming. Of course, at 1440p+ it's only a few frames, but the 5950X looks to be much faster than the lower SKUs and more on par with Intel. I was excited for an upgrade from my 3700X, but for gaming it looks like it's practically the same unless I want to drop $800...
> So, 98 percent of Ryzen 5000 series can do Infinity Fabric 2000? That's what you're saying?

Only issue is, those kits are about double the price of the 3600 MHz kits.
That's refreshing. I've had two 3900x and one 3950x and none could do 1900. Of course, that was on the December 2019 BIOS. Maybe I should test on the latest.
> From the reviews it looks like all of the 5000 series chips are roughly the same for gaming, ultimately.

I will finally say it: "RIP Intel."
> Reviews for the 5600X/5800X make it look like the i5-10600K is usually faster when gaming. Of course, at 1440p+ it's only a few frames, but the 5950X looks to be much faster than the lower SKUs and more on par with Intel. I was excited for an upgrade from my 3700X, but for gaming it looks like it's practically the same unless I want to drop $800...

I found it really odd that the whole 5000 series lineup poops all over Intel in low-res benches but not at Ultra 1440p in some games. Sometimes Intel is a frame or two ahead.
> I found it really odd that the whole 5000 series lineup poops all over Intel in low-res benches but not at Ultra 1440p in some games. Sometimes Intel is a frame or two ahead.

Probably because at that point you are GPU bound; that's why they test games at 1080p for CPU reviews.
Interesting.....
It's been exactly one day and I'm already bored with the usual talking heads on YouTube getting hard-ons for 1080p gaming... Jesus Christ guys, we get it, these things DOMINATE at low-res 2010-era gaming! =|
1440p/4K numbers would be less thumbnail-worthy but more relevant. VR numbers and multi-monitor setups as well. Is there any real reason for us to upgrade from 8000-, 9000-, or 10000-series Intel parts (or even Ryzen 2000/3000) if Minecraft at fucking 300 fps is not what we are after?
Please. Thank you.

> It's been exactly one day and I'm already bored with the usual talking heads on YouTube getting hard-ons for 1080p gaming... we get it, these things DOMINATE at low-res gaming!

Because when gaming at 1440p/4K you are more GPU bound, which means the CPU used doesn't mean much at all.
> Also appears to be double the price of the 3600.

Serious question: is it fair to compare the 3600, instead of the 3600X, to the 5600X? My gut says no.
> Serious question: is it fair to compare the 3600, instead of the 3600X, to the 5600X? My gut says no.

He was talking about RAM speeds, 3600 vs 4000.
> We need better than 3600 to get the gains out of the 5900X and 5950X, you're saying?
> 4000 with 1:1 IF is required?

I have no idea, I just know that my first thought was definitely not "these are cool and all, but how will I afford the RAM?"
> He was talking about RAM speeds, 3600 vs 4000.

Oh, my bad.
> Probably because at that point you are GPU bound; that's why they test games at 1080p for CPU reviews.

Yes, but you would think it would be the AMD chips that would be 1 or 2 frames ahead, though...
> So getting into 1440p and 4K gaming, it seems pretty close between AMD and Intel. Are any review sites showing minimum FPS? That would show more detail as to which CPU is more powerful once you get into GPU-limited resolutions...

GamersNexus shows average, 1%, and 0.1% lows.
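For anyone wondering how those "lows" are derived, here is a minimal sketch with hypothetical numbers; reviewers differ on the exact method, so treat this as an illustration of the common "average the slowest N% of frames" convention, not GamersNexus's actual pipeline:

```python
# Minimal sketch of deriving "1% low" and "0.1% low" FPS from frame times (ms).
# One common convention: average the slowest N% of captured frames.

def low_percentile_fps(frame_times_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (e.g. 0.01 for 1% low)."""
    slowest = sorted(frame_times_ms, reverse=True)   # worst frames first
    count = max(1, int(len(slowest) * fraction))     # at least one frame
    avg_ms = sum(slowest[:count]) / count
    return 1000.0 / avg_ms

frame_times = [6.9, 7.1, 7.3, 8.0, 6.8, 25.0, 7.0, 7.2, 6.9, 7.1]  # hypothetical capture
avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print(f"avg: {avg_fps:.0f} fps")
print(f"1% low: {low_percentile_fps(frame_times, 0.01):.0f} fps")
print(f"0.1% low: {low_percentile_fps(frame_times, 0.001):.0f} fps")
```

The point of the lows is visible even in this toy capture: one 25 ms stutter barely moves the average but dominates the 1%/0.1% figures.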
> As others have said, 1920x1080 is a CPU-bound resolution. As you increase resolution, you become more GPU bound. When reviewing CPUs, you want to remove other variables as much as you can and showcase what the CPU can do.

At the same time, I think you definitely want both.
You want to isolate the CPU as much as possible to see what is likely to happen in some not-yet-known gaming scenario where you would not be GPU bound. But look at how misleading those 400p-resolution tests showing a +50% gaming boost would be if we had only those: a CPU's ability to "feed" a video card is an essential part of its gaming performance in almost all the scenarios that someone buying an expensive new CPU actually cares about, so showing how good the chips are at that is essential.
Including a Flight Simulator 2020 type of game, which is known to be CPU-heavy even under 70 FPS, would be a good example, as would testing how much DirectX 12 / Vulkan multi-core scenarios help.
Many potential buyers have the question: "I have a 2070 Super, or I just got a new 3080, so what do I gain as a gamer by upgrading my CPU?" That is what they want to see, and if they are playing at 1440p or 4K, it is absolutely those resolutions that will be telling for them; otherwise they can obviously wait until a scenario exists where it matters at those resolutions (by which point an even better alternative or a better price point will likely exist).
I suspect there may also be a bit of "this would make CPU choice virtually irrelevant and boring to gamers, who are a large part of the audience, and cost reviewers views" going on sometimes.
All those benchmarks will need to be redone with the 6800/6800 XT as well, I imagine, and it would be ridiculous if, when they compare a 5950X vs. a 10900K with a 6800 XT, they only showed low-detail 1080p benchmarks to isolate the CPU.
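To make the GPU-bound argument above concrete, here is a toy bottleneck model with entirely invented numbers: delivered FPS is roughly the minimum of what the CPU can prepare and what the GPU can render, which is why a big CPU gap at 1080p can vanish at 1440p/4K.

```python
# Toy bottleneck model: delivered FPS is roughly limited by whichever of
# the CPU or GPU is slower. All numbers below are made up for illustration.

CPU_FPS = {"CPU A": 220, "CPU B": 180}             # frames/s the CPU can prepare
GPU_FPS = {"1080p": 250, "1440p": 160, "4K": 90}   # frames/s the GPU can render

for res, gpu_cap in GPU_FPS.items():
    a = min(CPU_FPS["CPU A"], gpu_cap)
    b = min(CPU_FPS["CPU B"], gpu_cap)
    gap = 100.0 * (a - b) / b
    print(f"{res}: CPU A {a} fps vs CPU B {b} fps ({gap:+.0f}%)")

# 1080p: CPU A 220 fps vs CPU B 180 fps (+22%)
# 1440p: CPU A 160 fps vs CPU B 160 fps (+0%)
# 4K:    CPU A 90 fps vs CPU B 90 fps (+0%)
```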
> You don't have to tell me. I include 4K testing in my CPU reviews. Unfortunately, AMD decided to sample us late and didn't provide us with all the available models. My reviews will be delayed quite some time, as we'll have to purchase the 5900X and 5950X ourselves.

That sucks.
> So tempted to get the 5900X... but I have a 3900X now. Could give the wife the 3900X, but she's got a 3800X. Decisions!

Wait for AM5.
"The term sweetspot is used incorrectly. It referes to the max speed that you can slap the fastest sticks in and get them to run which is 3600mhz for Zen 2 and now 3800mhz for Zen 3. The difference between the sweetspots is that you can still clock higher but you have to manually do it. Thus on Zen 2 going from 3600mhz to 3800mhz you have to manually set the IF to 1900mhz to support 3800mhz memory and then proceed to tweak the timings. Conversely, you'd have to do the same with Zen 3 when going to 4000mhz ram and 2000mhz IF. Essentially what it means is that the maximum speed the Infinity Fabrick (IF/fclk) will run went from 1900mhz on Zen 2 to 2000mhz on Zen 3. The sweet spot of each is 100mhz below the maximum."Oh, my bad.
> We have a COVID retail restriction here of 25% capacity... which means about 5-6 people in our local parts shop. I drove by this morning and the line for their door was almost 2 blocks long. lol

Just checked the Chicago Micro Center: all SKUs, in store only, 10+ in stock, 3950X, $649.99.
I hope stock is good... AMD is killing it.
> Just checked the Chicago Micro Center: all SKUs, in store only, 10+ in stock, 3950X, $649.99.

Of the 5000 series? Because I don't show any availability at either the Chicago or Westmont store. It said 10+ for the 5900X at Westmont before launch this morning, and now it doesn't say anything.
[3:04 PM]"The term sweetspot is used incorrectly. It referes to the max speed that you can slap the fastest sticks in and get them to run which is 3600mhz for Zen 2 and now 3800mhz for Zen 3. The difference between the sweetspots is that you can still clock higher but you have to manually do it. Thus on Zen 2 going from 3600mhz to 3800mhz you have to manually set the IF to 1900mhz to support 3800mhz memory and then proceed to tweak the timings. Conversely, you'd have to do the same with Zen 3 when going to 4000mhz ram and 2000mhz IF. Essentially what it means is that the maximum speed the Infinity Fabrick (IF/fclk) will run went from 1900mhz on Zen 2 to 2000mhz on Zen 3. The sweet spot of each is 100mhz below the maximum."
> Just saw a Flight Simulator test in the LTT review... amazing.

That 1920x1080 ultra-settings result is interesting.
> Thanks fellas, I understand and I'm aware of why testing at 1080p is done... my issue was that all of the "known news sources" for this release kind of wasted our collective time by including "WOW OMG!" thumbnails and then telling us shit we expected: "Ryzen is finally beating Intel at gaming..."
> But gaming at 1080p is the only thing they focused on (I KNOW)... to stress the impact of the CPU. Thing is, you've been able to game at 1080p for a decade or so on chips you can buy for $50. So while the graphs are great, and those chasing 200+ FPS at 1080p for competition or whatever are relevant, it's not really answering the *other half* of the question I think we all want answered: what does this mean for people running 1080 Tis or 2070s or Maxwell 980s? Does this CPU magically give you "moar framez" somehow, thanks to its multi-core and faster design, at more modern resolutions? Does your brand new 3080 or 3090 suddenly pick up 20+ more frames than it did on an Intel part "because AMD's sub-processor scaling array with added moxie is now operating at 3.3 gigaflops versus last gen's 2.1 whatevers", etc.?
> It's just one of those irritating omissions... I know there will be a smaller impact, but I'm sure all of us would like to know "OK, how small". But you know, that takes away from the *drama*... and drama gets clicks.

I think after today, almost all of the GPU and gaming benchmarks going forward will be run on X570/5950X test benches, since that is now the top gaming setup. So you will eventually see 1440p/4K results.
> Of the 5000 series? Because I don't show any availability at either the Chicago or Westmont store. It said 10+ for the 5900X at Westmont before launch this morning, and now it doesn't say anything.

I don't know why I checked the 3000 series; maybe I had that search saved. Checking the 5000 series, you are correct: 0 in stock, status "LIMITED AVAILABILITY". Sorry for the false hope.
I would call ahead. I have high doubts that they're in stock.
That sucks.
> Thanks fellas, I understand and I'm aware of why testing at 1080p is done... my issue was that all of the "known news sources" for this release kind of wasted our collective time by including "WOW OMG!" thumbnails and then telling us shit we expected: "Ryzen is finally beating Intel at gaming..."

There are plenty of reviews with 1440p and 4K tests.
Holy cow! Look at those differences in frames per second. Those CPUs really did add a lot of performance in GPU-bound benchmarks...