Intel Core i9-10900K - 10 Core Is Coming

Very possible. If they were on top of their game, they'd call it the "KO"

Honestly, if they wanted to go that route, they could adapt their HEDT platform to do the job. They have enough core density there to compete with AMD on the regular desktop. At this point, AMD's X570 platform is well into and even past HEDT motherboard price territory. Intel has parts like the Core i9 9980XE that have the core count, IPC and the clocks to beat or match AMD's offerings. They could just rebrand those chips and cut the prices down. Sadly, that's not their style, and I don't know what the cost is on those parts. It may be extremely rare to get chips that bin well enough to become 9980XEs. The reason this isn't a viable option for people right now isn't the platform cost as much as the price of the CPUs themselves. $2,000+ for a CPU is way beyond the budget of even many of the most hardcore enthusiasts. I've spent some coin on processors without a second thought and even that's too rich for me.

Intel also doesn't really market its HEDT line as gaming processors, but they could rebrand and do just that. It wouldn't be the first time Intel passed workstation hardware off as a gaming solution.
 
Even IF this was real... New socket, forced upgrades, when I can throw a 3950X into a B350 board I bought 2 years ago and call it good. No thanks, Intel. I took a break from AMD with my 8700K and 9900K. I'm going back home to AMD and will ask forgiveness.
 
Note that AMD is going to change sockets either at Zen3 or Zen4.

It'll depend on whether they do a Zen 2+ or a Zen 3 in 2020; whichever happens will be the last Zen architecture on AM4. Whatever they decide to do from there will be interesting to see, though.
 
Wow, does the OP really believe this? Only with a grain of salt? How about truckfuls of salt!

Like others have said, the prices are not even close, but the new socket I believe. I guess it was good for a laugh, though.
 
Is this a real one or the fake one that was debunked yesterday?

I know it's fake, but is it a real fake, or a false fake? And if it's a false fake, is it a real false fake? That's what we don't know.

Tech reporting is hard. But the truth is out there.


P.S. Even so, I'm a little curious what kind of fake benchmarks some of those processors have. After reading tech websites for 25+ years, finding out how close fake benchmarks come to real benchmarks is almost a thing in itself.
 
Meanwhile Ryzen 3000 is actually out. I'll take launched products over future promises any day.

This here, BIAAAAICHES. Note to Intel fanboys: Ryzen 3000 you can buy right now. Or wait for this Intel CPU for God knows how long, however long it takes them to figure out how to manufacture these.

I betcha Kyle is glad he stopped working at Intel.
 
Forbes just reported on this...pretty lulz if it is indeed fake.
Forbes is meaningless. Every single week they have 2 articles about how the LATEST PHONE is having issues!! Delayed!! Or some other bullshit. I see these articles daily in my news feed: iPhones, Galaxy Note, Galaxy Fold, and Galaxy S. It's the same content copied and pasted from last week's article, and they're just hoping to get clicks.
 

Forbes copy-pastes shit all the time. I doubt Intel will comment on it, because they are probably hoping it spreads more. Ever wonder why this pops up right after the Zen 2 launch? lol. Intel can barely keep the 9900K's TDP in check, and now we are supposed to get 10 cores at 65W? Fishy as hell. I am sure they are happy this leak is circulating, though.
 
Forbes is meaningless. Every single week they have 2 articles about how the LATEST PHONE is having issues!! Delayed!! Or some other bullshit. I see these articles daily in my news feed: iPhones, Galaxy Note, Galaxy Fold, and Galaxy S. It's the same content copied and pasted from last week's article, and they're just hoping to get clicks.

Yeah, their tech writers are desperate for anything and everything lol. I feel like they are very noob at times lol.
 
65W 8- and 10-core chips? What a load of bullshit. I think Intel has found that you can rate TDP straight off the base clock, then double the power on the back end, and no one will notice. HAHA


Well, isn't that what they're already doing with their "95W" 9900K horseshit :D
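As a back-of-the-envelope illustration of why a TDP rated at the base clock says little about actual boost power, here's a rough sketch using the common dynamic-power approximation P ∝ V²·f. The clocks and voltages below are made-up illustrative numbers, not measured or official figures, and real chips add static/leakage power on top of this.

```python
# Rough dynamic-power scaling sketch: P ~ C * V^2 * f.
# All numbers here are illustrative assumptions, not measurements.

def scaled_power(base_power_w, base_ghz, boost_ghz, base_v, boost_v):
    """Scale power from base to boost clocks using the P ~ V^2 * f model."""
    return base_power_w * (boost_ghz / base_ghz) * (boost_v / base_v) ** 2

# Hypothetical 10-core part rated 65 W at a 3.0 GHz base clock,
# boosting all cores to 4.6 GHz with a modest voltage bump:
boost_power = scaled_power(65.0, base_ghz=3.0, boost_ghz=4.6,
                           base_v=1.00, boost_v=1.25)
print(f"Estimated all-core boost power: {boost_power:.0f} W")  # ~156 W
```

Even with these generous assumptions, the chip would pull roughly 2.4 times its rated TDP at all-core boost, which is the kind of gap people keep measuring between rated TDP and actual power draw.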

We've known this was due out, but not until the end of the year/next year. And by that time, AMD will already have 16 cores out.
 
The 4-core i3 has a 91W TDP but the 10-core i9 part has a 65W TDP; this seems totally legit.

Ignoring that it's fake, the wattage would make sense, since they would just be recycled Kaby Lake chips.


Forbes also posted about AdoredTV's leaks; mainstream media can be duped.

Most online news sites these days are just copy-pasta sites duking it out for the 3 cents in ad revenue from every idiot that views their site. No one cares about doing real journalism anymore.
 
Intel also doesn't really market its HEDT line as gaming processors, but they could rebrand and do just that. It wouldn't be the first time Intel passed workstation hardware off as a gaming solution.

Reminds me of a certain hexacore CPU review FrgMstr got blacklisted over :p
 
Fake, but Intel really needs to come up with a new naming scheme... "10900KF" doesn't roll off the tongue...
 
Yeah, I doubt this very much, as far as the TDP and prices go, anyways.
 
Reminds me of a certain hexacore CPU review FrgMstr got blacklisted over :p

True. I didn't say it was a particularly good idea. I said Intel could do something like that and say: "We are faster again." They still have strong enough IPC that all it would need to do is equal AMD's core counts and maintain the 9900K's frequencies. The problem is, that's extremely tough to do at 14nm and it leaves them nowhere to go. Intel flat out needs to get its 10nm and 7nm processes squared away ASAP.

Again, I think that Intel could do something like that, but it's also in line with brute-force tactics like AMD's 4x4 Gaming Platform. It's inelegant and it really would probably get laughed at. However, if they got the prices of the CPUs down low enough to where they matched AMD in multi-threaded workloads and kept the lead in gaming without being more than 25% more expensive than a comparable Ryzen 9 3900X / 3950X based setup, then I think few would care that it came from the HEDT product stack originally. It would take rebranding any HEDT-derived parts and perhaps ditching two RAM channels, but I think it's feasible if getting the CPU prices down that low is possible. I don't know what Intel's actual margin is on chips like the 9980XE.

You also have to consider that rebranding HEDT or server hardware for gaming back then was ridiculous because games couldn't do anything with more than four cores in those days. So having some Xeon-derived HEDT part with more cores and threads heralded as the fastest gaming processor on the planet, when we all knew it wasn't, is what got Intel called out. The solution I mentioned isn't needed for gaming. Intel still has that. What it would be needed for is competing at the desktop level in multi-threaded workloads. Right now, Intel can't do that. They need more cores, and at 14nm, that's tough to do.

Sure, Intel could develop a new chipset and a new socket with a higher core count and maybe maintain the clocks they have on the 9900K. The problem is going to be the time it's going to take to do that. Rebranding and adapting existing HEDT stuff for the desktop market would be much quicker. I'm not saying it's a good idea, or that it's right, but it could be done relatively fast.
 
My 2 cents:

In all likelihood, Intel can do literally nothing for the time being and still maintain significant sales, because of name brand recognition. The DIY market is relatively small. AMD needs OEM wins on Ryzen 3000. If that happens, perhaps Intel will react.
 
My 2 cents:

In all likelihood, Intel can do literally nothing for the time being and still maintain significant sales, because of name brand recognition. The DIY market is relatively small. AMD needs OEM wins on Ryzen 3000. If that happens, perhaps Intel will react.

Not only that, but as I've said before: most DIY PCs are built for gaming these days. On that front, Intel still rules the roost. Sure, there are also content-creation machines out there and AMD has a strong lead, but as you said, brand recognition counts.
 
Looked fake the first time I saw it. No way in hell will Intel release a CPU that matches the price of an AMD top-dog processor; Intel is known to overprice their products regardless of whether they're faster or not.

Also, as several people mentioned, the "14nm+++" was a dead giveaway that it is fake. I don't think labeling their existing 14nm process with extra pluses would make much sense for Intel.
 
Looked fake the first time I saw it. No way in hell will Intel release a CPU that matches the price of an AMD top-dog processor; Intel is known to overprice their products regardless of whether they're faster or not.

Also, as several people mentioned, the "14nm+++" was a dead giveaway that it is fake. I don't think labeling their existing 14nm process with extra pluses would make much sense for Intel.

P4s and Pentium Ds were cheaper than the A64 and X2 back in the day. Since then, Intel hasn't had a slower processor than AMD's, so it hasn't had to worry about what to price its products at.
 
For Ryzen and Ryzen+, the general consensus was that AMD is the best all-round chip if you were more focused on workloads, and while that is still true with Ryzen 2, I think the gaming differential has closed a lot in many games, and I was quite surprised reading PCPer's review given their history of being pro-Intel in reviews. They did DX12 testing, which is good because it is better and it is the future despite gaming being slow in adopting it; it is more scalable than DX11 and that has shown massive performance gains.

[Image: metro-exodus-fps.png]


[Image: sottr-fps.png]


The results are very interesting; it has reached the point where gaming recommendations don't just default to Intel, as there is still copious gaming performance on AMD now. What is actually missing from reviews is multiplayer testing; even Battlefield is tested on single player, and nobody buys Battlefield to play single-player mode. Multiplayer is difficult to benchmark consistently, but I think more objective multiplayer experience reviews are needed, as online gaming is by far the biggest game market; nobody really gives a crap about single player anymore.
 
An alleged leak of Intel Comet Lake comes courtesy of ComputerBase and, if it is to be believed, shows an array of 13 different chips, with the flagship 10C/20T part weighing in with a maximum single-core boost clock of 5.2GHz and an all-core boost of 4.6GHz, featuring 20MB of Intel Smart Cache, a 105W TDP, and a $499 price. The Core i7-10700K is an 8C/16T chip with a 5.1GHz single-core boost and 4.8GHz all-core boost, a 65W TDP, and integrated graphics for $398.

New LGA 1159 socket with new Z490 motherboards, and supports DDR4-3200 natively.

Recommend taking this with a grain of salt until confirmation or denial from Intel.

The model name alone is a dead giveaway that this is fake. Keeping this thread open makes HardForum look bad.
Pretty sure the old staff would have closed this.
 
For Ryzen and Ryzen+, the general consensus was that AMD is the best all-round chip if you were more focused on workloads, and while that is still true with Ryzen 2

Up until the latter half of the Ryzen 2 release, that is, with far better memory support to get the Infinity Fabric up to speed, the cost of a board with good VRMs + the cost of exotic memory needed to get Ryzen 'up to speed' was the same or worse. Once those issues were resolved, a CPU like the 2600 (X) became essentially the best basic recommendation for gaming, with recommendations diverging into higher Intel or AMD parts depending on the user.

I think the gaming differential has closed a lot in many games, and I was quite surprised reading PCPer's review given their history of being pro-Intel in reviews.

The gap has closed massively; in general, unless responsive gaming is the priority, there isn't really an argument for an Intel CPU for enthusiasts. That's not true for the broader market of course, but for us here, honest recommendations for Intel are going to be few and far between.

They did DX12 testing, which is good because it is better and it is the future despite gaming being slow in adopting it; it is more scalable than DX11 and that has shown massive performance gains.

I like to see DX12, but as with Vulkan, developer support just isn't there yet. There are only a few games that benefit from DX12 over DX11 on any system configuration. We need both.

The results are very interesting; it has reached the point where gaming recommendations don't just default to Intel, as there is still copious gaming performance on AMD now.

Please do remember to link frametimes / 0.1% lows alongside average framerates. This isn't to detract from AMD CPUs- in general, they do very well here, but the basic reality is that the only story that average FPS tells is academic and isn't useful for describing the end-user experience alone.

What is actually missing from reviews is multiplayer testing; even Battlefield is tested on single player, and nobody buys Battlefield to play single-player mode. Multiplayer is difficult to benchmark consistently, but I think more objective multiplayer experience reviews are needed, as online gaming is by far the biggest game market; nobody really gives a crap about single player anymore.

I'll say that single player is still important, and DICE / EA put significant effort into their single player campaigns, but if benchmarked they should be benchmarked in addition to multiplayer, not in place of it. I am with you 100% that multiplayer testing should take precedence.
 
For Ryzen and Ryzen+, the general consensus was that AMD is the best all-round chip if you were more focused on workloads, and while that is still true with Ryzen 2, I think the gaming differential has closed a lot in many games, and I was quite surprised reading PCPer's review given their history of being pro-Intel in reviews. They did DX12 testing, which is good because it is better and it is the future despite gaming being slow in adopting it; it is more scalable than DX11 and that has shown massive performance gains.


The results are very interesting; it has reached the point where gaming recommendations don't just default to Intel, as there is still copious gaming performance on AMD now. What is actually missing from reviews is multiplayer testing; even Battlefield is tested on single player, and nobody buys Battlefield to play single-player mode. Multiplayer is difficult to benchmark consistently, but I think more objective multiplayer experience reviews are needed, as online gaming is by far the biggest game market; nobody really gives a crap about single player anymore.


My experience with DX12 is that it generally benefits low-end CPUs and low-power CPUs, but that if you have a high-end desktop CPU with very high single-thread performance, performance is often actually worse with DX12 than with DX11, due to the inherent cache thrashing caused by DX12 spreading out work over all the cores.
 
My experience with DX12 is that it generally benefits low-end CPUs and low-power CPUs, but that if you have a high-end desktop CPU with very high single-thread performance, performance is often actually worse with DX12 than with DX11, due to the inherent cache thrashing caused by DX12 spreading out work over all the cores.

Both DX12 itself and games using it need more time in the incubator. It delivers, but there are still little annoyances that are out of whack.
 
Please do remember to link frametimes / 0.1% lows alongside average framerates. This isn't to detract from AMD CPUs- in general, they do very well here, but the basic reality is that the only story that average FPS tells is academic and isn't useful for describing the end-user experience alone.

Agreed. I tend to like 0.1% lows more than I like frametimes, but one of them has to be there. I'd argue that if you have to leave anything out, keep the 0.1% lows and drop the average FPS. The average is meaningless.
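For anyone curious how these metrics fall out of a frametime log, here's a minimal sketch. The data and the exact percentile convention are my own illustrative assumptions; real benchmarking tools differ in the details.

```python
# Compute average FPS and 1% / 0.1% low FPS from per-frame render
# times in milliseconds. Sample data is made up for illustration.

def fps_metrics(frametimes_ms):
    """Return (average FPS, 1% low FPS, 0.1% low FPS)."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    worst_first = sorted(frametimes_ms, reverse=True)

    def low(pct):
        # Average FPS over the slowest pct% of frames (at least one frame).
        k = max(1, int(n * pct / 100))
        return 1000.0 * k / sum(worst_first[:k])

    return avg_fps, low(1.0), low(0.1)

# 2000 frames: mostly smooth 10 ms frames with five 40 ms stutters.
times = [10.0] * 1995 + [40.0] * 5
avg, low1, low01 = fps_metrics(times)
print(f"avg {avg:.0f} fps, 1% low {low1:.0f} fps, 0.1% low {low01:.0f} fps")
# avg 99 fps, 1% low 57 fps, 0.1% low 25 fps
```

A handful of 40 ms stutters barely moves the average (100 down to 99 fps) but craters the 0.1% low to 25 fps, which is exactly why the average alone hides what the experience actually feels like.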

I'll say that single player is still important, and DICE / EA put significant effort into their single player campaigns, but if benchmarked they should be benchmarked in addition to multiplayer, not in place of it. I am with you 100% that multiplayer testing should take precedence.

Well, I'm not the market. In fact, judging by how dumb I think either streaming or watching a stream is, I'm about as different from the market as you can get these days, but I have completely stopped playing any multiplayer games. For me, PC gaming is single player and single player only. I'm looking for a good story in a title, coupled with amazing graphics and sound, to tie it together into an immersive experience. I really don't care about framerates that much as long as my minimums stay above 60fps (I'll take more if I can get it, but cranking up visual details is a more important way to use the finite GPU capacity to me).

Multiplayer games have become stupid and uninteresting to me. Often unrealistic, and just full of kids spouting off at the mouth. To me, what killed multiplayer gaming was a combination of games becoming silly and the rise of automatic matchmaking and official servers. Now that official servers are ubiquitous, no one is dependent on community servers anymore, and when no one is dependent on community servers, they have nothing to lose by behaving like total tools.

Back before official servers existed, if you wanted to play online, you had to find a community server. It took a long time to find one you liked, and once you did, you definitely didn't want to get banned from it, so you behaved, stayed around, got to know the regulars, and it was great. Now, with official servers and automatic matchmaking, there is no policing of anything, people are total asshats all the time, and no one cares about anything but becoming famous for their stream, so rather than just playing the game they are running around and doing stupid shit for attention.

Multiplayer gaming in 2019 is completely uninteresting. It is dead to me.
 