AMD’s Zen 2 Will Reportedly Offer Higher Core Counts, Major IPC Gains

I found this one. Hard to read the OSD on mobile, but it looks like around 50% usage across most of the 8 threads.



It fluctuates quite a bit, and it's on an AMD FX CPU as well. Going from 50% to around 80-ish on all cores.
 
Guys, just face it, we hit peak PC many years ago. Just be okay with a 15% increase in performance each generation until we see a major paradigm shift (moving away from silicon). I remember the 90s and 00s when we'd get double the performance each generation for the same price; it was glorious, of course! But now it's time to wait and save money. Then you'll have the cash for when the revolution happens and PCs suddenly become 10s or 100s of times faster.
 
I've yet to see some of these magical games that can load up 8 cores almost fully. I've already said that a few games can put more load on a core, but I've yet to see one maxing a core out or coming close to it; usually the load will be mostly on one core, with the rest of the cores slacking off at much smaller percentages. (Funny how you ignored that part.) Instead of flapping your gums, give some examples that can "nearly peg" all 8 cores of a CPU.
Okay, so you have been living under a rock. When I get home from work in about 3 hours I'll post you screenshot after screenshot showing pretty much every modern game using way over 50% of my overclocked i7, and some even peg or nearly peg all eight threads. Of course, at that point you will probably already have a new excuse ready. I'm sure you already have the unoptimized-game excuse in your holster. I'll go ahead and cut you off before you pull out that nonsense, as it is completely irrelevant to the real world. Whether you like it or not, plenty of games are unoptimized and you have to deal with that reality when you buy your hardware.
 
Don't be a cheeky little boy, just post your evidence and take a prozac ffs. :rolleyes:
 
You are the one that made a sweeping, ignorant comment here. You're also the one that told me not to flap my gums, so maybe you can roll those eyes right up your rear end. It's not my fault you've been oblivious for the last couple of fucking years.
 
Right because everyone plays exactly the same games as you do? LOL Get a clue ffs.
 
Wow, you are reaching new levels of ignorance. Every fucking game I'm referring to is a well-known and popular game. These aren't some strange games that no one plays, but at least I see you've already prepared ridiculous excuses, just like I said. Maybe crawl out from under your little fucking rock and actually look at CPU utilization in modern games before flapping your gums.
 
Like I said, list them. You're very fond of this crawling-out-from-under-a-rock saying; speaking from experience?
 
Unlike you, I actually pay attention to things before talking out of my ass. But I'll go ahead and give you a short list right now, since you were even more clueless than I anticipated.

Deus Ex: Mankind Divided
Rise of the Tomb Raider
Mafia 3
Watch Dogs
Watch Dogs 2
Assassin's Creed Origins
Crysis 3
The Witcher 3


Those are some right off the top of my head that use well in excess of 50% of my CPU, and Mafia 3 and Watch Dogs 2 will fully peg my CPU. Rise of the Tomb Raider will go over 80 and even 90% sometimes. Deus Ex: Mankind Divided I have seen hit over 80%. The Witcher 3 I've seen hit 75 to 80% just the other night when I was playing.

Those are facts that you clearly know nothing about, as you don't pay attention to anything. Maybe if you actually played some games you could test for yourself and see. But again, please go ahead and start with those excuses, because I know they're coming.
 
Assassin's Creed Origins does better on an 8 core Ryzen when compared to a 6 core:

[Image: Assassin's Creed Origins CPU core scaling benchmark on Ryzen, via PCGH]




Then these guys found that a Threadripper beat an 1800X:


[Image: Threadripper vs. Ryzen 7 1800X benchmark]
 
I don't understand why they would put 16 cores on a non-Threadripper CPU. Doing so would overlap with the high-end market. What purpose would Threadripper serve at that point? More PCIe lanes?

(I know next to nothing about Threadripper)

The two platforms overlap already. A Ryzen 7 2700x should be faster than a TR 1900x at everything. The difference is that if you go with the Ryzen you are buying at the very top end of the platform, so a need for more performance means a total upgrade.

So what will TR offer? Well, if there is a 16-core Ryzen, then my assumption is that the top-end TR part will be at least 24c/48t if not 32c/64t, keeping or expanding the gap that exists today. TR also gives you more PCIe lanes, as you pointed out, as well as 2x the memory bandwidth and ECC support. There's also the fact that TR motherboards are a clear step up over Ryzen-based ones, containing multiple M.2 slots and multiple full x16 PCIe slots, etc.

Bottom line, I really don't see a 16 core Ryzen eating into the TR market any more than a 2700x does already.
 
unlike you I actually pay attention to things before talking out of my ass. But I'll go ahead and give you a short list right now since you were even more clueless than I anticipated.

Deus Ex mankind divided
Rise of Tomb Raider
Mafia 3
Watch Dogs
Watch Dogs 2
Assassin's Creed Origins
Crysis 3
Witcher 3


That's some right off the top of my head that use well in excess of 50% of my CPU. and Mafia 3 and Watch Dogs 2 will fully Peg my CPU. Rise of the Tomb Raider we'll go over 80 and even 90% sometimes. Deus Ex mankind divided I have seen hit over 80%. The Witcher 3 I've seen hit 75 to 80% just the other night when I was playing.

Those are facts that you clearly know nothing about as you don't pay attention to anything. Maybe if you actually played some games you could test for yourself and see. But again please go ahead and start with those excuses cuz I know they're coming.

Could you be a bit more angry please? I love how you come into this thread proclaiming people to be "living under a fucking rock" then get all uppity when you get told to stop flapping your gums.


And those are how many games compared to what's available out there? You seem to be stating I made a blanket statement that games that can use a lot of cores don't exist, when in actuality what I said was:

for most games there's a lot of untapped processing power that isn't used, you might get the odd one that really taxes the CPU but they're in the minority.
Apparently that makes me wrong somehow....

As for Assassin's Creed Origins, a lot of that CPU usage was apparently down to Denuvo. I don't have the game, so I can't say if that's the case or not.

https://www.extremetech.com/gaming/258173-assassins-creed-origins-redlining-cpus-100

Googling Mafia 3 CPU usage seems to imply it's a problem with the game.

There are plenty of threads online on Watch Dogs 2 saying it's unoptimized.

Crysis 3, I'll give you, used quite a lot of the CPU, though that's a staple of Crysis games being hard on the system.

So some you're right on; others, according to sources, are issues with DRM or shitty ports.
 
Yep, plenty of excuses, just like I said. I don't give a crap what the actual reasons for the games being CPU intensive are, because that doesn't change the facts about what I said one single bit. Do you want to sit in a corner and pout, saying you are not going to play this game because you think it is unoptimized or has protection that uses too much of the CPU? If you want to play those games and get over 60 fps, then you're going to have to have more than 4 cores and in some cases more than 8 threads. That is just the reality of gaming, so you have to deal with it.
 
I've yet to see some of these magical games that can load up 8 cores almost fully. I've already said that a few games can put more load on a core, but I've yet to see one maxing a core out or coming close to it; usually the load will be mostly on one core, with the rest of the cores slacking off at much smaller percentages. (Funny how you ignored that part.) Instead of flapping your gums, give some examples that can "nearly peg" all 8 cores of a CPU.


Yep.

This is because most of the game engine code cannot be multithreaded. So they cheat a little. The main game thread goes in one process. The audio processing goes in a different process. The bullet physics in a third. CPU tasks supporting the rendering pipeline in a fourth, and so on.

It's still mostly single threaded code, but now it is split up into many processes, and as such it can run on different threads.

Usually though, you'll have the main game thread heavily load or pin one core, and have the other processes take up anywhere from a few to like 30 percent of other cores.

It still helps, but it is not true multithreading where one process can spread itself across any amount of cores.

There is a limit to how much of this can be done though, and true multithreading is usually not possible. It would at worst cause thread locks, causing the game to freeze, and at best require so many watchdogs and workarounds that the multithreaded code would actually run slower than the single-threaded version.

People on these forums are fond of throwing out multithreading like it's a simple thing to do, or even a choice at all, as if devs just needed to plan carefully and hire a few more software engineers and multithreaded gaming bliss would be upon us, but they're just lazy and cheap. The people who claim this don't have a clue what they are talking about. True multithreading is a rare unicorn, and usually only works well in mathematical simulations, rendering, and encoding, where batch jobs of many similar tasks are run one after the other and can easily be puzzled back together at the end.
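To make that concrete, here is a minimal sketch of the subsystem-per-thread pattern described above. The subsystem names and timings are invented for illustration and aren't taken from any real engine; the point is only that each piece is still ordinary serial code, it just gets its own thread next to the main game thread:

```cpp
// Minimal sketch of the "one thread per subsystem" layout described above.
// Each subsystem is still ordinary serial code; it just gets its own thread.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

std::atomic<bool> running{true};

// Side subsystems: each typically sits anywhere from a few percent to ~30% of a core.
void audio_worker()   { while (running) { /* mix audio buffers */ std::this_thread::sleep_for(std::chrono::milliseconds(5)); } }
void physics_worker() { while (running) { /* step rigid bodies */ std::this_thread::sleep_for(std::chrono::milliseconds(8)); } }

int main() {
    std::thread audio(audio_worker);
    std::thread physics(physics_worker);

    // The main game thread does the bulk of the work and tends to pin one core.
    for (int frame = 0; frame < 120; ++frame) {
        /* game logic, AI, draw-call submission ... */
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    running = false;
    audio.join();
    physics.join();
    std::printf("shut down cleanly\n");
    return 0;
}
```

This is why the usage pattern looks the way it does: the side threads give you some extra utilization on other cores, but the frame time is still bounded by the one heavy main thread.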
 
Yep, plenty of excuses, just like I said. I don't give a crap what the actual reasons for the games being CPU intensive are, because that doesn't change the facts about what I said one single bit. Do you want to sit in a corner and pout, saying you are not going to play this game because you think it is unoptimized or has protection that uses too much of the CPU? If you want to play those games and get over 60 fps, then you're going to have to have more than 4 cores and in some cases more than 8 threads. Those are just the facts, so you have to deal with it.

LOL, you're a riot: "it doesn't matter that a DRM program was making a game peg CPU cores, I was right regardless." That's basically what you just said. The article I linked is stating that, not me, as I don't have the game. And I think you're confusing "excuses" with the facts at hand talked about in the article. We're talking about GAMES using multiple cores to their full extent, not some piece-of-shit DRM that Ubisoft shoehorned in that is having an adverse effect on performance.

May as well throw a coin miner into a game and show that as "proof" of multi-core usage.
 
Yep.



Usually though, you'll have the main game thread heavily load or pin one core, and have the other processes take up anywhere from a few to like 30 percent of other cores.


That's been more or less my experience in games as well for the most part: 1 or 2 cores being used a lot, then the rest hanging at much lower percentages. It's not like I constantly open Task Manager to see what is going on with usage, but generally if something is awry there will be threads about it. Some games will use the cores much more, but as I said before, they're generally in the minority. Battlefield 1 on 64-player multiplayer can load up a few cores, but it's far and away from being anything significant over all 8 cores and 16 threads, and that's a game that in multiplayer can have a ton of stuff happening at once.
 
LOL, you're a riot: "it doesn't matter that a DRM program was making a game peg CPU cores, I was right regardless." That's basically what you just said. The article I linked is stating that, not me, as I don't have the game. And I think you're confusing "excuses" with the facts at hand talked about in the article. We're talking about GAMES using multiple cores to their full extent, not some piece-of-shit DRM that Ubisoft shoehorned in that is having an adverse effect on performance.

May as well throw a coin miner into a game and show that as "proof" of multi-core usage.
You seem to have trouble with reality. You want to blame DRM or poor optimization when there's nothing that can be done about that on your end anyway. Again, it is what it is, and if you want to play those games then it requires that much CPU power. I didn't make the goddamn games, and I can't optimize them or remove the DRM.
 
There you go again, why so mad about this?

In the past, Ubisoft eventually relented and removed some DRM that was causing issues with games. I'm pretty sure at one point the DRM in their games was cracked as well.

The reality is that in some cases it's the DRM that is causing the CPU to load up much more than it should (in the case of Assassin's Creed Origins, at least). I don't consider that to be a game taking advantage of multiple cores for the betterment of the final product. It's more like some parasite leeching CPU cycles in the background that's nothing but a detriment. Yes, there's nothing currently that can be done about it unless Ubisoft releases a patch to disable it or someone finds a way to remove it. But it's hardly a good example of a game making use of multiple CPU cores.

Like I said, there are some games out there that do make decent use of multiple cores, but they're in the minority. My opinion on that hasn't changed.

And now, bedtime, after 3am here.
 
I tend to think that multithreading has now gotten about as good as it ever will be, without some kind of major technological innovation that can break down instructions and execute them across multiple cores or subcores.

The truth is that true multithreading is not possible on all types of code.

Gamer fanboys dating back to AMD's early days of "MOAR CORES" always like to blame "lazy devs" for not making it happen, but the truth is that only a small portion of code (maybe ~30%) can ever be effectively multithreaded due to a variety of dependency issues, thread locking, and the like.

To a certain extent you can fake it, and have multiple single-threaded processes running in parallel utilizing more cores, but there are limits to how much you can break up a game engine effectively.
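Taking that "maybe ~30%" figure at face value, Amdahl's law gives a quick back-of-the-envelope ceiling on what extra cores can buy (this worked example is an illustration, not something from the post itself):

$$S(n) = \frac{1}{(1-p) + \frac{p}{n}}, \qquad p = 0.3 \;\Rightarrow\; \lim_{n \to \infty} S(n) = \frac{1}{0.7} \approx 1.43$$

In other words, if only 30% of the work can run in parallel, even an unlimited number of cores tops out at roughly 1.4x the single-core speed.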

You're wrong.

Yes, there is a limit. Yes, some tasks can be threaded better than others. Some tasks will exhibit negative scaling. The kind of stuff you do in games programming, especially since you tend to have many, many instantiations of similar tasks and systems, as well as highly parallelizable systems in general (graphics, sfx), is not a good example.

It's not really an issue with lazy devs, it's that there's so much work to make a game from scratch, and while game making tools haven't completely caught up to a multithreaded future yet, they're good enough for 99% of cases. This strongly disincentivizes building multi-threaded games, especially for PC developers who aren't juggernauts with their own special in-house tech.

For example, currently multithreading in Unity is an exercise in frustration because of how the engine is structured. But they're in the process of reworking the fundamental aspects of the engine to make multithreading extremely straightforward for a huge number of tasks. Google the Unity ECS system sometime and you'll see what I'm talking about. When the tools catch up, then so will the games.

(As an aside, I'm an indie developer, and I have a ton of use cases in my current project that are single threaded but ideally shouldn't be, where we're talking about the potential for order of magnitude performance increases)
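Unity's actual ECS and job system are C# and aren't shown here; the following is only a rough C++ sketch of the underlying idea the post is describing: thousands of identical per-entity updates, chunked across however many cores are available. The component type and numbers are invented for illustration.

```cpp
// Rough sketch of a data-parallel, ECS-style batch update: the same tiny task
// applied to many entities, split into chunks across the available cores.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Position { float x, y, z; };

// One small, independent update per entity -- the kind of work that scales with cores.
void integrate(std::vector<Position>& pos, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) pos[i].y -= 9.81f * dt;  // toy gravity step
}

int main() {
    std::vector<Position> positions(100000, Position{0.0f, 100.0f, 0.0f});
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const size_t chunk = positions.size() / workers + 1;

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        const size_t begin = w * chunk;
        const size_t end = std::min(positions.size(), begin + chunk);
        if (begin >= end) break;
        pool.emplace_back(integrate, std::ref(positions), begin, end, 1.0f / 60.0f);
    }
    for (auto& t : pool) t.join();

    std::printf("entity 0 y after one step: %.2f\n", positions[0].y);
    return 0;
}
```

The design point is the same one the post makes: when the work is structured as many small, independent chunks of the same task, splitting it across cores is trivial; when it's one tangled serial update, it isn't.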
 
I know it's been discussed to death, but personally, for me, a 10-15% IPC gain per generation is acceptable. Combine that with higher clocks and higher core counts (I'd be happy with just 12 cores, a nice medium) and you've got a real winner. Eventually I'd like to see 5 GHz on Ryzen though; if it can ever hit that at an appropriate power draw and heat, it'll be clear-cut for sure.
 
I'd much prefer them to focus on IPC improvements and clock speed improvements; they have the opportunity to stick the boot into Intel big time if they can get IPC up a decent percentage, and more clock speed is always nice. 16 cores is Threadripper territory, and for most users that's the route they will go.

I bet you're the type of person who would say car manufacturers should concentrate on getting better MPG instead of moving to fully electric vehicles. Intel has stagnated the computing landscape for over a decade. It's about god damn freaking time there was some innovation on the desktop.
 
Assassin's Creed Origins does better on an 8 core Ryzen when compared to a 6 core:

[Image: Assassin's Creed Origins CPU core scaling benchmark on Ryzen, via PCGH]



Then these guys found that a Threadripper beat an 1800X:


[Image: Threadripper vs. Ryzen 7 1800X benchmark]
It really comes down to whether there is good programming support or not. Intel stagnated core count for so long that more cores aren't as well supported as they could or should be. AMD Ryzen's overall core count, with acceptable enough IPC, is just too good to pass up. The end result, I think, is that smart developers are taking notice and trying to utilize the cores as much as possible, within reason and where it's worth the effort to do so. The big difference with IPC gains for Ryzen and its high core counts, though, is that the net result is bigger because it's applied across more cores. Core count has much more headroom than clock speed right now, so they can continue to push it hard for a while for decent gains and do some minor refinements to IPC as well. I think we may see instruction sets split up between some of the additional cores within a CCX cluster in the future. That way you could turn off the cores whose instructions you don't need, to save power and energy, while pushing clock speed higher on the ones you do need. Essentially power gating for performance, I think?
 
Didn't this rumor come out 5 days ago? And how much do we put on unverifiable information from Chiphell? Oh well, gotta feed the current front-page news focus with something to get the fanboys all riled up!

If they hit a 15% IPC improvement it would make AMD a true threat in the mainstream market, and they could increase their pricing to help that bottom line... or better yet, truly force Intel's hand to move theirs down. But the profit motive tells me they will shift strategy at 15% gains.
 
I bet you're the type of person who would say car manufacturers should concentrate on getting better MPG instead of moving to fully electric vehicles. Intel has stagnated the computing landscape for over a decade. It's about god damn freaking time there was some innovation on the desktop.
Innovation as in more cores? Until software is designed to utilize that, it's a nice gimmick and a differentiating advertising piece. The only thing that is gonna make AMD more money in this rumor is the 15% IPC gain. That would put them at Intel's pace and create an entirely different marketplace with a legitimate competitive landscape.

Otherwise, why would I get a Ryzen 2700 when I can get a 7820X for nearly the same price, with the 7820X not only faster but also overclocking better and on a better platform?
 
I know it's been discussed to death, but personally, for me, a 10-15% IPC gain per generation is acceptable. Combine that with higher clocks and higher core counts (I'd be happy with just 12 cores, a nice medium) and you've got a real winner. Eventually I'd like to see 5 GHz on Ryzen though; if it can ever hit that at an appropriate power draw and heat, it'll be clear-cut for sure.

If I had to pick between current clocks and 12 cores, or 5 GHz and 6 cores, I'd pick the latter every time.

I just don't do that much rendering/encoding, and all my VMs are on my dual-socket Xeon server...
 
Innovation as in more cores? Until software is designed to utilize that, it's a nice gimmick and a differentiating advertising piece. The only thing that is gonna make AMD more money in this rumor is the 15% IPC gain. That would put them at Intel's pace and create an entirely different marketplace with a legitimate competitive landscape.

Otherwise, why would I get a Ryzen 2700 when I can get a 7820X for nearly the same price, with the 7820X not only faster but also overclocking better and on a better platform?
This. More cores is nice, but we haven't seen a change of landscape... yet. What AMD did is move core count up at all levels. I'm sure if Intel could have released higher-IPC cores they would have; in the past, when they were essentially the only player in the market, they did this to move product. When they hit a wall, we saw what happened: Sandy Bridge and the constant re-iterating.

Another bonus for AMD is that they added the 4 direct PCIe lanes to the CPU; to me this was bigger than the additional cores.
 
As I told Juan, with offset tiling of core/L3 one would have 3-way cache coherency, leading to lower latency between any three cores within a CCX.
 
No no no.

It's 2018, almost 2019.

Tired of the shitty crappy games from 1995. The last real improvement in game AI happened in Half-Life 1 when the marines picked up your grenades and tossed them back and used cover.

NOOOOOTHING has changed. It's because of you 640K-ers.

Get off my lawn. Dangit.

Higher IPC is a crutch for lazy devs. We have decent enough IPC. Need more computational power for everyone.
Don't forget about F.E.A.R. from 2005 - that game had/has groundbreaking AI that has been almost unheard of before or since, and that was a single-threaded game.
I don't think it is the hardware that is lacking; it is more that the software/coding techniques potentially aren't being either written or utilized.
 
I don't understand why they would put 16 cores on a non-Threadripper CPU. Doing so would overlap with the high-end market. What purpose would Threadripper serve at that point? More PCIe lanes?

(I know next to nothing about Threadripper)
16 cores is great for higher-end workstations or single-purpose servers. But with virtualization happening more and more for servers, Hyper-V, Citrix, and VMware all recommend 32+ cores; 64 tends to be the sweet spot for most applications.

There are large cost savings with higher core counts, specifically with licensing, as most vendors use the socket count as a price multiplier. 64 cores on 1 socket is cheaper, licensing-wise, than 64 cores spread over 2 sockets. I am sure this will change in the future, but for now that is the way it goes.
 
To the two guys arguing multithreading... the Frostbite engine is a good example. BF4 would peg all eight cores on my old [email protected] during 64-man large maps. Old game.

I think some people are getting soggy panties over word definitions. I think, more often than not, when we say multithreaded, we are talking about an .exe that has multiple threads spread out over multiple cores: a main thread/process, followed by x, y, z, a, b, & c tasks that each have their own thread(s) phoning home to the main. In reference to gaming, I really don't think we are talking about taking a process/task that runs well serially and then, for no good reason, turning it into a big parallel CF of threads because moar cores.
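For what it's worth, a minimal sketch of that shape, per-frame tasks forked off and "phoning home" to the main thread, might look like this (the task names are made up for illustration and aren't any particular engine's code):

```cpp
// Minimal fork-join sketch: the main thread forks a few per-frame tasks,
// each runs on its own thread, and the results come home via futures.
#include <cstdio>
#include <future>

int simulate_ai()      { return 1; }  // stand-ins for real per-frame work
int update_particles() { return 2; }
int cull_scene()       { return 3; }

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        // Fork: each task gets its own thread.
        auto ai        = std::async(std::launch::async, simulate_ai);
        auto particles = std::async(std::launch::async, update_particles);
        auto culling   = std::async(std::launch::async, cull_scene);

        // ...the main thread keeps doing its own serial work here...

        // Join: everything phones home before the frame is submitted.
        const int total = ai.get() + particles.get() + culling.get();
        std::printf("frame %d finished, %d work units\n", frame, total);
    }
    return 0;
}
```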
 
I've yet to see some of these magical games that can load up 8 cores almost fully. I've already said that a few games can put more load on a core, but I've yet to see one maxing a core out or coming close to it; usually the load will be mostly on one core, with the rest of the cores slacking off at much smaller percentages. (Funny how you ignored that part.) Instead of flapping your gums, give some examples that can "nearly peg" all 8 cores of a CPU.

Even if you aren't at 100% CPU usage (for each core), you can still be CPU limited.
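As a rough worked example (numbers invented for illustration): if the main thread needs 16 ms of a 16.7 ms frame budget while the other seven threads of an 8-thread CPU average 10% each, overall usage reads roughly (96% + 7 x 10%) / 8 ≈ 21%, yet the frame rate is still capped by that one nearly saturated core.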
 
I bet you're the type of person who would say car manufacturers should concentrate on getting better MPG instead of moving to fully electric vehicles. Intel has stagnated the computing landscape for over a decade. It's about god damn freaking time there was some innovation on the desktop.

Hardly. I just think that 16 cores is verging into Threadripper territory, and they would be clashing with their own product lines in that respect. Also, I wouldn't really say that more cores is necessarily innovating anything; Intel has had 6-core chips and AMD has had 8-core chips in the past. What's kicking Intel in the balls currently is that Bulldozer etc. fell well short of performance expectations, and for a while the whole "more cores" thing was ridiculed; Intel rested on their laurels and was content to crank out hyperthreaded quad cores for the most part. Now that AMD's IPC with Ryzen is comparable to Intel's, suddenly it's a wake-up call, and what was ridiculed in the days of Bulldozer is now getting some traction.
 
I'd much prefer them to focus on IPC improvements and clock speed improvements; they have the opportunity to stick the boot into Intel big time if they can get IPC up a decent percentage, and more clock speed is always nice. 16 cores is Threadripper territory, and for most users that's the route they will go.

I much prefer they focus on both, since that seems to be working quite well for them. :)
 
Many people can't remember how awful a single core was... and thanks to AMD, in a few years 4 cores will feel awful too. First the cores, then slowly the software will adapt; that is how this works.

I usually upgraded my computer every 2-3 years; however, I am still on a seven-year-old i7 HEDT, which replaced a Phenom II X4, and nothing that came after it was compelling... until last year.

I don't even care about the cores. I remember when I was able to plug half a dozen PCI cards into my motherboard. Now try to plug in a GPU, an M.2 SSD, perhaps a PCIe SSD too, a 10Gb network adapter, and a RAID controller... Intel HEDT? What a joke.

But Threadripper? Oh fuck yeah. Can't wait to get a 2000 series in a couple of months; that will be my main desktop until next year, when I'll upgrade to a 3000 series and pass the 2000 down to my home server.
 
I don't understand what the fuss is about.

Both AMD and Intel will do work and game. However, Intel at the higher end of the scale will favor gaming, while AMD will favor production. Have a game that needs that last 5 fps to keep 144 Hz? Get Intel. Want a great gaming rig that can also do production work faster? Get AMD.
 
LOL at the people that say quad cores are enough. Those are the same people that would rather have had dual cores back when quad cores first came out.
Try to use a dual-core PC from that era on Windows 10...
I have an 11-year-old T7400 Precision workstation with 2 quads (8 cores total, Core 2 Quad-based Xeons) that I am still using as a workstation at my work. 11 years later... and I can game on it too!!! A higher-clocked Core 2 Duo would be unbearable and would absolutely suck at gaming, if it could game at all!!!
Some people actually keep their PCs longer than a year...
We will be saying the same thing 10-15 years from now, and I'll probably still have my lowly 1700 while a 7600K/7700K will be almost unusable, even though back in their day they were "faster" at gaming.

I welcome more cores! (provided they have decent IPC)
 
I much prefer they focus on both, since that seems to be working quite well for them. :)
How so? They are still far behind Intel in every segment. Regardless of the AMD love that [H] has recently been giving ad nauseam, there is a long way to go. I know that the reports are very positive and they have gained some traction, but to truly compete in the consumer (and, in many regards, enterprise) market, IPC is king. We here are all in the <1% and basically using enterprise-level technology. 99.8% of the consumer market couldn't give a damn about 16 cores; they just want to have 6 tabs open in Chrome while playing FreeCell and listening to music without slowdown... hell, maybe add a second monitor and get crazy with some Netflix playing.

AMD has to get the IPC up or there still won't be true competition, as Intel will win until a major breakthrough in the way software is designed occurs (full disclosure: I'm looking at this purely from the business side). I'm glad that AMD may become competitive, but this forum has gotten way ahead of itself, with a bit of promotion and a lot of pent-up fans finally getting some very positive news after a very, very long time in the gutter...
 
Intel brainwashed me with the Retail Edge program. I honestly think Ryzen and Zen are comparable chips; they were rocky at first, but if games or programs don't make use of them it's just for show. I do favor Intel chips simply because they really went down in price over a 3-4 year period.
 
LOL at the people that say quad cores are enough. Those are the same people that would rather have had dual cores back when quad cores first came out.
Try to use a dual-core PC from that era on Windows 10...
I have an 11-year-old T7400 Precision workstation with 2 quads (8 cores total, Core 2 Quad-based Xeons) that I am still using as a workstation at my work. 11 years later... and I can game on it too!!! A higher-clocked Core 2 Duo would be unbearable and would absolutely suck at gaming, if it could game at all!!!
Some people actually keep their PCs longer than a year...
We will be saying the same thing 10-15 years from now, and I'll probably still have my lowly 1700 while a 7600K/7700K will be almost unusable, even though back in their day they were "faster" at gaming.

I welcome more cores! (provided they have decent IPC)
This is a horrible argument. Why would they be referencing 11-year-old processors? And who here holds onto a CPU for 11 years?

I replaced an AMD quad with an i3 for my mother's build last year and it was incredibly faster. I am not a huge fan of static benchmarks to gauge real-world performance, but it was outperforming the 4-year-old quad you obviously believe is very important by a factor of 2-3 in nearly every one. I actually think the PassMark score was over 3x, and that piece of software loves MOAR CORES.
 
Innovation as in more cores? Until software is designed to utilize that, it's a nice gimmick and a differentiating advertising piece. The only thing that is gonna make AMD more money in this rumor is the 15% IPC gain. That would put them at Intel's pace and create an entirely different marketplace with a legitimate competitive landscape.

Otherwise, why would I get a Ryzen 2700 when I can get a 7820X for nearly the same price, with the 7820X not only faster but also overclocking better and on a better platform?


LOL, X299 and "better platform"..... (HEDT vs. consumer????? Really?)
LOL, a 2700 and a 7820X at the same price..... This is funny.

You forgot you have to buy a dang mobo, that Intel changes sockets every year, and that AM4 is going to be supported for at least a few more years.
Longevity of a platform: AMD > Intel.
 