Discussion in 'Intel Processors' started by LaCuNa, Jan 2, 2019.
What do you guys think?? I wanna hear it ALL!! All your reasons <3
edit to add: 14nm++++++++ and shitty thermals to boot.
Higher core counts usually mean lower clocks and more cost. If the applications you're running don't scale well with more cores, then you're just spending more money for less realized performance.
The thing is, you can't ever ignore the cost, and very few people are going to pony up $1000+ for stuff they don't need.
Unless you really need a HEDT platform, the average user is much better served by a mainstream platform from the cost/performance perspective. Even high core count mainstream platforms (8C/16T) are pushing it for the average user.
Nobody can deny what's been said. For me, coming from an X58 + Xeon, and having a really competitive asshole 'acquaintance', it's making me want to adopt an X299/i9 combo. Srsly.
At the very least, I wouldn't buy anything until you see what AMD has to say at CES.
The X58 with a 6C/12T Xeon is still pretty good bang for the buck. Mine does 4.2GHz all day.
The Windows scheduler has problems with high core count platforms. Even the i9's not-quite-uniform cache architecture is enough to cause issues: Windows likes to bounce threads between cores to balance loads (on average this leads to more uniform core temperatures and therefore more aggressive turbo), but migrating an entire thread leads to dips in FPS in games.
Getting consistent performance is also trickier. i9s nominally have incredibly high turbo ratios, but in practice the maximum ratios are rarely achieved because of background processes. You really need good knowledge of your motherboard's BIOS tweaks and power limit behavior to dial in a usable overclock; merely typing in 49x and 1.3V won't do it on the larger parts, because you end up looking at 500+W under heavy loads, a point at which things like ambient temperature become an issue for stability.
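To put rough numbers on that 500+W figure, here's a back-of-the-envelope sketch using the usual dynamic-power scaling (power roughly proportional to V² · f). The baseline figures (an 18-core part around its 165W TDP at a 2.6GHz base and ~1.0V) are illustrative assumptions, not measurements:

```python
# Rough dynamic-power scaling: P ~ cores * C * V^2 * f.
# Baseline numbers are assumptions for illustration, not measured data.
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale power linearly with frequency and quadratically with voltage."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

base_watts = 165.0  # assumed all-core package power at stock base clocks
oc_watts = scaled_power(base_watts, f_base=2.6, v_base=1.0,
                        f_new=4.9, v_new=1.3)
print(f"Estimated all-core power at 4.9GHz / 1.3V: {oc_watts:.0f}W")  # ~525W
```

Even with generous assumptions, a naive 49x / 1.3V overclock on a big die lands past the 500W mark, which is why the power limits matter so much.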
Just get your dick out and show them it's bigger. No need to spend a grand to prove it.
My workloads are different than most. My daily drivers are all quad core (overkill, but eh...). Gaming is quad as well, in the 4+ GHz range. My 3x VMware hosts are 6-core with HT.
We just put together a streaming box. The quad core struggles to put out 2x 1080p@60 with two webcams when the source is 4K. But it barely struggles, in that we only see ~10% frame drops every 30-45 minutes, for less than 30 seconds at a time. Since we're not Ninja yet... that little pig will do for now.
Unless you have friends who share the "bigger is better" need, I agree with the others... it's not needed or wanted (largely due to price).
I doubt I'll ever have use for more than an 8C/16T CPU. I may buy a 12C/24T one if the AMD leaks are true, though, just because why not? Well, mainly for the higher clock speed of the 3700X.
Because I'm apparently the only person on the planet who can't notice FPS differences beyond a stable 60fps, and my i5 6600k still does that with no problems.
I really need a new rig for gaming. A stable 60fps sounds great. The last time I played an FPS (I don't game often) it was Shadow Warrior 2, and the framerate tanked to 0 for around 4 minutes lol
I gotta admit I'm still put off by Intel's shifting definition of TDP here.
I mean, I can understand it - but it definitely breaks with tradition and wasn't what anyone expected.
Now, is that reason enough to not buy a chip, especially when I was probably going to overclock it just like Boost is doing now automatically? No, probably not. But that, coupled with the chipset bullshit, certainly makes me pause before throwing my money at a company that practices like this.
And I'm a sucker for the underdog...
Not trying to call you out, but that's a really short sighted way to view computers and technology.
You really couldn't see ANY reason at all to get more than 8 cores... even in an incredibly long time like say 25 years? NEVER?
You might just be speaking hyperbolically, but I think a lot of people don't actually think about this stuff. Modern computing is barely 25 years old (the first generation Pentium came out in 1993, just barely over 25 years ago at this point). There has been a massive quantum leap in computing power and capacity on multiple levels since then.
I might be cynical, and obviously programming has been slow as all get out to utilize cores and get more applications to be multi-threaded. But I'm still 100% certain that eventually it will happen as adding more cores becomes more and more viable over frequency (at least for the time being). Certainly far past the point of using 8c/16t.
As for the OP: no. Because as others have stated, cost is always a factor. Until Intel is giving me processors for free, the hypothetical doesn't matter. It doesn't matter if Intel makes an 18-core desktop "non-HEDT" part if I can't afford it. Desire and viability are always linked to cost. Intrinsically.
If cost really wasn't an issue, then obviously we'd all just get the HEDT processors now and skip desktop processors. Skylake-X might be "slower" in frequency-bound apps, but if you actually use multi-threaded applications, then it's the clear winner.
I think there would need to be some new compelling use case for me to use more than 8 cores at home. My first home PC was a 486DX2, so I'm aware of the increases in computational power. The only use I have now for a faster PC is gaming, which I find myself not as interested in these days. I dabble in a little C# programming at the moment, but Visual Studio seems to run alright on my i5-3570k. Other than that, the usual web browsing and Netflix streaming is all I really do on my PC. I could probably make a case for not needing 8C/16T at all lol.
There's a difference between not needing something and never needing something. You were shown to be wrong and now you're backtracking.
I have my opinion and you have yours. This reminds me of the debates when the Q6600 came out. It wasn't very interesting then and it isn't now.
Q6600 was the bomb what U talkin' bout!
EPEEN chip <3
As long as I can keep single-thread performance up, sure.
Long live my Ivy Bridge i7. I haven't had to upgrade since 2012.
The best way to do it is get a mainstream computer, and show them in benchmarks that yours runs things faster and you paid half for it.
Or the AMD X2 4400+ Did most people need two cores back then? Probably not, but it was pretty awesome to have. I think that was my last AMD build, Zen 2 is long overdue.
This is the correct answer. We're deep into the core count wars, but what out there actually scales past 2 or even 4 threads? We are just starting to get games that utilize 6 or more, and they're still few and far between. From a productivity perspective, I don't use anything currently that scales past 8.
It is always pointless to have more cores than the software you're running can utilize.
What if the software only uses 20 -50% of one core?
Like my apps.
I need the low latency of a Dual Core but a Quad is what I use.
6 Cores is where latency starts to take its toll in real time usage.
This is why I just bought 2 more ASRock Z97m WS/i7 4790k combos and Modded them for Supermicro CSE 500f-441B 1U Chassis.
Once I see "new" cores instead of these incremental moves requiring new chips and boards, I'll upgrade. Until then, I'm 100% stable with the lowest latency possible and ZERO driver issues.
I really think by the time this happens I will need an i3 Quad, because AMD doesn’t look like it needs to “stoop” like that.
The latest i3-9350K has a 4GHz base at 65 watts. That's pretty nice, but I want a 4.5GHz base (all cores) at 65 watts.
Guess 10nm can do this.
If not I’m making plenty of cash on my 4790k rigs.
$550 for the CPU, 32GB of DRAM, and the motherboard.
More money for bigger Samsung Pro SSDs...
Intel can't even keep 8 core CPUs cool. CPU temperatures are absolutely absurd. Really, a lot of computer technology is basically in a place where it isn't even acceptable as a consumer product. It's basically prototype hardware that gets sold like it's actually ready for prime time.
Intel has been keeping eight-core CPUs cool for years
Because the biggest difference in system performance is the GPU. It is a waste of money, like going to 64 GB of RAM in a gaming system.
AMD Faildozer's "eight-core processors" ran much hotter, especially the ones with a 5GHz turbo XD
Saying 'biggest' throws off the perspective; having not enough memory or CPU makes a huge difference. It's just that once you do have enough (capacity and cores), the GPU starts to scale.
The other side of this is how it is tested: we should be looking at maximum frametimes, which correlate with minimum framerates but allow us to identify real issues that will be 'felt' by a player.
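A minimal sketch of that frametime idea, on a made-up frame capture (all numbers hypothetical): the average FPS looks passable, while the percentile and maximum frametimes expose the stutter a player would actually feel.

```python
# Why frametimes beat average FPS for spotting stutter.
# Hypothetical capture: mostly 16.7ms frames (60fps) plus a few spikes.
frametimes_ms = [16.7] * 97 + [50.0, 80.0, 120.0]

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

# 99th-percentile frametime: the value 99% of frames come in under.
ordered = sorted(frametimes_ms)
p99_ms = ordered[int(0.99 * len(ordered)) - 1]

print(f"average: {avg_fps:.1f} fps")           # ~53 fps, looks tolerable
print(f"99th percentile: {p99_ms:.1f} ms")     # 80ms: a visible hitch
print(f"worst frame: {max(frametimes_ms):.1f} ms")
```

The average barely budges from the spikes, but a 120ms worst frame is an eighth of a second of nothing on screen.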
If it's just a dick measuring contest then get the 32c/64t Threadripper? x299/i9 can't match that, and you can brag about all 64 PCI-E lanes you have as well. Fuck it, run 12 NVME drives in raid 0 because you can. That's your untouchable bragging rights there if you don't care about price at all and don't have any particular workload in mind.
Otherwise money & usage definitely matters, and just to hit 12c+ you may be compromising on the things you actually care about. Intel's X-series don't clock as high, primarily, meaning things like gaming will be slower on the more expensive chip. As will many other things that don't scale out as well.
At this point unless sheer desperation plagues your veins I would wait until Zen 2 is released with its beloved 7nm process. It claims to make massive gains in performance etc...
I have a 2950x and love my 16 cores. I can crunch videos at max while playing Oculus games and it doesn't even feel like my processor is being worked.
I would wait and get as many cores as you can for what you can afford. Unless you're purely gaming; then I would go with a really high clock speed 6- or 8-core chip rather than a lower-clocked, high core count monster like a Threadripper or the coming $SOUL-priced 28-core Intel.
I also have a 2600x 6-core and it feels lovely. I can't imagine what a 3600x is going to feel like with a brand new process, lower power usage, high clocks, higher IPC, etc... but I find that 6 cores is the minimum I would get in a gaming-centric processor moving forward.
At least they list it as a 225W processor, unlike Intel's "95W".
You still need at least a 2-core processor, otherwise overall system performance will be terrible.
Modern OSes (the whole Windows NT family) have terrible performance on single-core processors in general, imho.
Completely agree, and that will actually continue to be the case. As we've discussed before, there are a few main categories:
1) Single thread
2) Lightly parallel
3) Embarrassingly parallel
Once you truly have a workload which scales up well past 8-wide, you're realistically heading for GPU territory. Batch processing of huge data sets, no data dependencies between workloads, and usually very little branching. Perfect for a GPU.
For other things, Amdahl's Law hits you very hard, very fast. Yes, some software is getting better at using more cores, so it isn't like more cores has no value. But for most home users, 4 is decent, 6 is good, 8 great. Beyond that, you're getting sharply diminishing returns, and should consider strongly what your workloads are and where that money is best spent.
For example - maybe an ultra-low latency SSD, or a beefier GPU. Or heck, RGB lights on your fan - that makes things faster I Read Somewhere.
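The Amdahl's Law point above can be put in numbers. A quick sketch, assuming an 80% parallel fraction (an assumed figure for a "lightly parallel" desktop workload, not a measurement):

```python
# Amdahl's Law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the parallel fraction of the workload.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

P = 0.8  # assumed parallel fraction for illustration
for cores in (2, 4, 6, 8, 16, 64):
    print(f"{cores:>2} cores: {amdahl_speedup(P, cores):.2f}x")
```

With p = 0.8, going from 1 to 8 cores gets you about 3.3x, but even 64 cores never reaches 5x: the serial 20% caps the whole thing, which is exactly the diminishing-returns wall described above.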
massive security holes?
unknown performance degradation from fixing them?
Fixed it for you.
<nod> Fair enough, that's a good generalization.
You could use it to play Witcher 3. (Of course, the RAM controller was on the chipset, and it only supported DDR2, which had bad write rates.) But it was great for emulation. Those on a budget overclocked E5xxx chips, or an E7200/E7300; those with the money for custom water cooling overclocked the Qxxxx quads. PS2 emulation required CPU power, and Intel CPUs allowed non-standard-compliant handling of denormals (aka DAZ).