> What is the 5800x? I saw that the 5700x is the 8 core sku.

I've seen rumors calling it the 5700X and rumors calling it the 5800X. Either way, whatever they end up calling the 8-core SKU, I think the price will be $350-400.
> I've seen rumors calling it the 5700X and rumors calling it the 5800X. Either way, whatever they end up calling the 8-core SKU, I think the price will be $350-400.

If the 12-core comes in at $499 I will probably spring for that; otherwise, 8 or 10 cores for me.
> That's almost certainly not going to happen. Expect AMD to charge performance-leadership prices.

3900x/xt were $499 at launch, weren't they? And $749 for the 3950x, I believe. I guess we will see in a week.
Still going Intel. I have no idea why.
> 3900x/xt were $499 at launch, weren't they? And $749 for the 3950x I believe. I guess we will see in a week.
They were $499 at launch and did not have comprehensive performance leadership. Zen 3 likely will.
> Yea coming from a 3900XT no point in upgrading atm. Only way i'm going to really go for a new cpu is if im getting 5ghz on all cores while gaming. Fell for that "4.7ghz" boost marketing trick, when in reality for a gamer its like 4.2ghz, so not gonna fall for that scam again.

what cooler are you running?
> Mindshare. Same reason people buy Blose speakers.

ftfy
> Yea coming from a 3900XT no point in upgrading atm. Only way i'm going to really go for a new cpu is if im getting 5ghz on all cores while gaming. …

Yeah, the max boost for CPUs has always been a bit of a marketing trick. The "max" boost advertised is usually for 1 or 2 threads. Intel has been doing that forever, and I don't think they even let you see any sort of official boost table anymore, since the 7 or 8 series.
> Clockspeeds don't matter, performance does. If these things came out topping out at 3 GHz but demolished a 10900K in gaming, would you really care about the number?

Yeah, but clockspeed currently correlates with more game performance. The two architectures are more or less equal in performance, so clockspeed marks the winner.
> That's not how that works at all. They are roughly equal in terms of IPC, so higher clocks can make a difference, but that's actually not the reason Intel leads in low-resolution gaming. That actually has to do with the increased latency of AMD's design due to having to shuttle data all over the place (over the substrate!) via the Infinity Fabric. Hopefully this is one area where Zen 3 will offer solid improvements.

It seems to be title-specific and not a general rule for games. TechPowerUp tested 10 games for the 3600XT and 3900X, and only two showed indications of such an issue. Even then, there are only a couple of standout Intel chips which seem to have something going on that belies their lower clock speed (i5-10400). Overall, I'm having a tough time finding numbers which support the general idea that AMD's supposed latency issues are the underlying problem. 10 games isn't a huge sample size, but it seems like latency inside the CPU isn't that big a deal for some games, maybe most games. I'm not saying it technically isn't an issue, but most games seem to be mostly clockspeed-sensitive.
If you look closely, some games do better on Ryzen with an all-core overclock that is much lower than their boost clock, which further suggests clock sensitivity for gaming. We would need to examine the frequency graphs for CPUs on a game-by-game basis to really see which ones are actually latency-sensitive, and not still clock-sensitive and suffering because Ryzen boost tables don't scale well with some games.
> Clockspeeds don't matter, performance does. If these things came out topping out at 3 GHz but demolished a 10900K in gaming, would you really care about the number?
Weird, when I do a manual overclock on this 3900XT to 4.65GHz all-core, my fps goes up nearly 30 on the low end and top end. My score in Cinebench skyrockets as well (so do temps lol). So to say that clockspeed doesn't matter is so weird lol, when it's a night-and-day difference when I increase it. In this case, the clockspeed does matter, but for what's coming out I guess we will have to see what's in store. But I'm not falling for their marketing scam of "max boost clock" lol, I'll wait for GN, Jay2Cents, and a few others to do some testing.
> Quite obviously I mean "clockspeed alone doesn't matter." Clock any chip higher and performance will go up, but it doesn't matter one bit if a 10900K hits 5 GHz and a 5900X doesn't if the 5900X performs better.

Sure, but I think the underlying point is that no consumer CPU architecture is good enough right now to dramatically offset clockspeed. At least, that was the point I wasn't really communicating well. And I seriously doubt we will see a situation from AMD where they release a massively efficient architecture which still doesn't clock well. All indications point to them having figured out their clockspeed issues with their fab processes. So even if the latency issue is only partially addressed, I still expect to see nice gains simply from more megahertz. And if we do get an amazing new architecture, I think it will still clock well.
105w vs 150w for overclocking? Why would something pushed to 150w overclock better than something that's sipping power at 105w? If anything, there is more room at 105w to overclock because you can turn up the voltage. If it's 150w just to hit stock clocks, that doesn't leave much room for OC. I think you've got it backwards.
My mini-ITX build has a 1000w PSU; it's not the PSU I'd be concerned about, it's the amount of heat I have to remove from my case. I mean, really, I'm not that concerned personally, but in general that's the concern.
Because AMD's TDP ratings are 100% garbage and everyone's known that for years. They come up with their TDP based on an algorithm, not on actual power draw; the 3900x was a 150w TDP chip with stock boost, 115w at base clock. The only hope would be that AMD finally listened to all the criticism they've been receiving for the last 10 friggin' years and finally switched to a true power-draw rating, but I'm not going to keep my hopes up on that one. Intel's are just as garbage, since their TDP is power draw at base clock.
At the very least it depends on what CPU you're talking about. My Ryzen 2600X, running with all threads maxed out, will occasionally go above 95w but, as far as I've seen, has never hit 100w. That's with the boost on all cores sitting around 4100 to 4150. I imagine if I could keep the CPU cooler it would go a bit higher yet, and I'd still be surprised if it went over 100w.
> because AMD's TDP ratings are 100% garbage and everyones known that for years.. they come up with their TDP based on an algorithm, not on actual power draw.. …

AMD's is very simple to figure out; while still annoying, it's not some magical thing that nobody knows about, and they give you a TDP for your cooling solution, not as a number for how many watts the CPU may draw at some point.
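For what it's worth, the "algorithm" really is a thermal spec, not an electrical one: per GamersNexus' reporting on Zen 2, AMD derives TDP from a maximum case temperature, an assumed ambient temperature, and a required cooler thermal resistance. A minimal sketch of that arithmetic, using the reported figures as assumptions (not official datasheet values):

```python
def amd_tdp_watts(t_case_max_c: float, t_ambient_c: float, theta_ca_c_per_w: float) -> float:
    """AMD's stated TDP formula (as reported by GamersNexus):
    TDP (W) = (tCaseMax - tAmbient) / thetaCA,
    where thetaCA is the required cooler thermal resistance in degC per watt.
    This rates the cooler, not the CPU's actual electrical power draw."""
    return (t_case_max_c - t_ambient_c) / theta_ca_c_per_w

# Reported Zen 2 inputs: 61.8 degC case max, 42 degC ambient, 0.189 degC/W cooler.
print(round(amd_tdp_watts(61.8, 42.0, 0.189)))  # prints 105 -> the "105 W TDP" figure
```

Which is why a "105 W" part can legitimately draw ~145 W under boost: actual package power (PPT) is a separate, higher limit.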
> I've seen rumors calling it the 5700X and rumors calling it the 5800X. Either way, whatever they end up calling the 8-core SKU, I think the price will be $350-400.

Well, seeing as both the 3700x/3800x are 8-core, there is a good possibility the 5700x and 5800x are both going to be 8-core, so it may not be the rumors calling it one or the other; both may exist.
3700x was $329 and 3800x $399, so I'm hopeful that we'll end up with an 8-core under $400. I'm mostly curious whether both the 5700x/5800x are going to be a single CCD, or if the 5700x might be dual CCDs that couldn't make the full core count on each. That would be a good product segmentation/differentiator: one with dual CCDs of 4 cores each vs a single 8-core CCD, but no clue. I hope it's a single CCD, but could see it either way.
That's almost certainly not going to happen. Expect AMD to charge performance-leadership prices.
Still have my hopes set that maybe the 5800x will be a 10-core chip, given the changes to the chiplets (using 2x5-core chiplets), just as an ultimate middle finger to Intel, with the 5700x being the single 8-core chiplet.
mmmm nope, Intel has way too much market share.
>> They most likely ask $499. Maybe even $529.99 at the most.
> You've been around long enough to remember the Athlon 64, right? The situation then was quite a bit worse than it is now for AMD and they still released a processor that was 20% more expensive than Intel's halo part because it was comprehensively faster while also being more efficient.

These are different economic times than in 2005.
> Clockspeeds don't matter, performance does. If these things came out topping out at 3 GHz but demolished a 10900K in gaming, would you really care about the number?
> LMAO I have been literally made fun of on this forum for making that same suggestion.

I mean, I could get an FX8350 to run at 5GHz. Didn't mean much, however, lol.
People who don't understand how to look at the big picture are the ones who get hung up on one single stat as the deciding factor in their information-less decisions.
> the sad reality is the numbers on the box sell before the benchmarks.. OEM's have proven that time after time and that's why they convinced AMD to release the 550M which was just a rebranded b450 because consumers thought they were buying something out dated and slower because b450 didn't have pcie 4.0. yet even in an enthusiast forum that shit still happens which is depressing.

I mean, I kind of agree, but who was buying 550Ms because they were rebranded B450s? They are new chipsets with more features (2.5GbE, more/faster USB, PCIe 4.0 for both the GPU and the first NVMe slot). Or are you talking about the OEM 550A? The 550M is the B550M, which we all know is not a rebranded B450 and does have additional features. The 550A was the B450 rebrand for the OEM market, which is probably what you're thinking of, but those were not sold directly to consumers, so I'm not sure who on an "enthusiast" forum was buying prebuilts just for a motherboard.