Jeez 10900k really?

I don't know who this processor is for.

Gamers. Intel is marketing the CPU directly to gamers as the fastest gaming processor in the world. Which it is. Now, Intel is only doing that because in multi-threaded workloads it gets its ass handed to it by a 3900X or 3950X.
 
I guess I'm not loaded enough to buy much of anything to fulfill one purpose and one purpose only.
 
But how much do you really do with your computer that's demanding enough to need more than 10c/20t, or 8c/16t for that matter? Unless you're doing a lot of video editing, computer animation/rendering, or VM work, gaming is probably the most demanding thing you do. In which case, Intel has the stronger option for that purpose, and it still does other things just fine. AMD does beat it at multi-threaded workloads, sure. But those extra cores and threads simply go unused much of the time.

It was a bit different earlier this year, or late last year, when Intel only had the 9900K and AMD had both the 3900X and 3950X. The aggregate difference in gaming performance was virtually meaningless. Now, Intel has increased its lead enough that people with the budget for it can justify building a machine with an emphasis on gaming, as there is a larger gap between Intel and AMD on that front. Although, there are a lot of variables: some of the games I tested showed no difference at 4K, and others showed much larger gains than just 5-6%.
 
You're probably right for those that really want the edge.

But even for me, and all my desktop does now is game, if I thought about Intel I'd probably buy the 10700K, which is $409 at Microcenter. But then I look at the 3800X next to it for $309, and for me, for the extra $100, I'm not worried about 20 extra FPS. But that's just me at nearly 40. Maybe at 28 I would have thought differently.
 
When you are at 4K, or even 3440x1440, or really high refresh rates, 20 FPS could be huge. A 10% increase in FPS can make or break whether a game is playable at some settings.
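For a sense of scale, here's the frame-time arithmetic behind that claim, as a rough sketch. The FPS figures below are illustrative assumptions, not benchmark results:

```python
# Frame time is the per-frame budget: 1000 ms divided by FPS.
# The FPS numbers here are made up purely to show the scaling.

def frame_time_ms(fps: float) -> float:
    """Average milliseconds available to render one frame."""
    return 1000.0 / fps

base = frame_time_ms(100.0)      # 10.0 ms per frame at 100 FPS
boosted = frame_time_ms(110.0)   # ~9.09 ms per frame after a 10% uplift

print(f"{base:.2f} ms -> {boosted:.2f} ms per frame")
```

At a 144 Hz target the budget is only ~6.9 ms per frame, so a saving of even a millisecond can decide whether a machine holds the refresh rate or dips below it.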
 
It's not one purpose only. There is nothing the CPU does poorly.

That would be like saying the only purpose of AMD CPUs is for render farm activities.

A lot more people game than run render farms. :D
 
That's not correct. There are users like me who go for "best in class" components. Price is still important, but secondary. That meant that since the AMD Athlon days I've been with Intel, because Intel's technology was clearly superior. And yes, during that time AMD was for people looking to save a buck.

But it's now 2020 and that's no longer the case. I'm not going to get "old technology" 14nm components because Intel messed up their manufacturing and is stuck on 14nm. That means the 10900K is out of the question. I'm going with a modern 7nm CPU from the guys with the better technology. And I don't care about the price. Sadly, that means no Intel this time.

I suppose the 10900K is for two groups: "pure gamers" looking for the highest clock speed, who don't mind CPUs built on older technology that are slightly less capable (as an overall CPU), and who have plenty of money; and people who buy Intel because they like the brand.
And why would ANYONE mind whether a CPU is made with caveman tech or today's tech, if the "ancient" one is still outperforming the newer one? When your game, Photoshop, etc. are delivering less performance on the fancy 7nm(ish) AMD CPU, does it make you feel better that you can say "but at least it's not ancient tech"? I don't care if AMD goes from 1nm to 100nm... if their performance can't match or beat Intel for what I NEED, then it has zero importance which tech is newer. Considering that even at 7nm, where they really should be able to, they still cannot beat Intel's 14nm "ancient" tech, when Intel hits 10nm or 7nm it will be a scary day for AMD ;)
And as far as the 10900K not being best in class, or being only for pure gamers, that is just silly. Is AMD JUST for Blender? Because apparently every AMD user is a 100% 3D renderer these days... lol
 
Honestly it's the 1% and 5% lows that make Intel the undisputed choice, if you're after gaming numbers and the tangible advantage of keeping those 1% FPS figures as high as possible.
Eurogamer's numbers for BF5 and The Witcher 3 - the leads are frankly quite large. All this while still on 14nm. Yeah, Intel doesn't have the price advantage, but then Microcenter hasn't offered the same discounts they have in the past either (e.g. an i5 that's at least $40 cheaper than online, a $40 mobo combo discount).

I'd prefer that when things are GPU-limited (1440p/4K, depending on the game), my installed CPU can still stretch its legs once more capable GPU upgrades become available. I'm a cheapass (and my vision isn't what it used to be), so I'm perfectly fine staying on a 32" 1440p monitor.
 
Yes - if you're looking for a cost-effective Intel high-end gaming solution, the 10700K is pretty good, or if you can find a cheap 9900K and overclock it, that works pretty well too. Take the money you save and invest it in a high-end 2080 Ti... You can't expect the 10900K to be cost-effective; those top-end CPUs are the ones Intel used to make for "bragging rights" as the best overall CPUs, except that approach doesn't work quite as well nowadays. But yeah, the 10700K works.
 
I agree with you at a tactical level. When the 9900k came out, it was "the best CPU" you could buy in that general class, damn the price. And yes, it made sense to buy it for that. So if you had money back in 2018 and wanted the best CPU, it absolutely made sense to get the Intel i9-9900k.

But the 10900k cannot claim the same. But I will agree with you that it's the "best for gaming".
 
Also best for Photoshop, so it's not ONLY gaming; there are other apps too. That is my point: you cannot just crown this or that as best overall, unless the whole world suddenly uses 3D rendering and Cinebench 24/7 and I missed the memo lol
 
Ok, the "best at photoshop and gaming", lol. And yes, agree that nowadays it's not so easy to crown any one CPU as the best overall. And that's the market the 10900k has come out in hence this thread where things aren't so black and white. A couple of years ago it was easy: the top Intel was always going to be the "best CPU" (if you ignore price). Competition is good.
 
Ok, the "best at photoshop and gaming", lol. And yes, agree that nowadays it's not so easy to crown any one CPU as the best overall. And that's the market the 10900k has come out in hence this thread where things aren't so black and white. A couple of years ago it was easy: the top Intel was always going to be the "best CPU" (if you ignore price). Competition is good.
Competition is good indeed. The 10900k would have been over $1000 without AMD where they are today, or close to it.
 
We wouldn't have a 10c/20t mainstream-segment CPU without AMD. We would be at 4c/8t or 6c/12t at most in that segment. That said, I disagree with the idea that Intel's mainstream CPUs would be $1,000 without AMD, as that wasn't the case prior to the launch of Zen-based AMD CPUs. The 4790K and its predecessors stomped all over Bulldozer and were never close to $1,000.
 
Well, we will never know who is right about the pricing.
 
Buying based on process technology is among the most irrelevant reasons there is.

If AMD was selling 7nm Dozer derived parts, how much would 7nm vs 14nm matter to you?

You do know that a LOT of the IP in Zen is Bulldozer-derived, right? It even has FMA4 and some other Bulldozer-specific extensions, though undocumented.
They just fixed it. There would be no Ryzen without Bulldozer.

From reddit
Zen inherits quite a bit from Bulldozer, actually! And it greatly benefits from it.

Zen uses Bulldozer's front end almost as a direct copy with a uop cache bolted on.

Commonalities from the top of my head:

  • Decoupled prediction and prefetch pipelines
  • Predictor directs prefetch
  • Four identical decoders
  • Branch fusion
  • 64KiB Instruction Cache
  • Decodes two CPU threads at once
  • and much more...
On top of that, Zen's FPU is a natural progression from the FlexFPU found in Bulldozer.

  • Implements SMT directly including prioritization
  • Treated as a coprocessor
  • Independent scheduler
  • Multiple pipelines work together on the same instructions
  • Dual 128-bit wide
... make no mistake, though... Zen's FPU is quite an upgrade.

Further, AMD learned that they can get away with just two AGUs if they have enough ALUs and how to share the backend (retirement pipeline) efficiently.

Zen would not be possible without Bulldozer's failure and AMD's frantic efforts to recover from it. As Jim Keller said, AMD had the technology already; they just needed to put it all together in a single package via a data fabric and then work out the platform. AMD was able to do so well on the platform as a whole because they didn't have to waste time redesigning everything from scratch. Heck, even Infinity Fabric is just a special optimization of HyperTransport 3.1.
 
Irrelevant. Bulldozer is a significantly different architecture, if it was running on 7nm, it would still suck.

Process isn't the defining feature here, it's the architecture. That was the point.

Being fixated on process is having blinders on.
 
Seems questionable. Are you going to change the CPU config for each load/game?

No, I assume it would be an experiment and then probably undone. Like low graphical settings and no AA in a single-player game just to see FPS, not long term once all the jaggies are there.
 
At least the Intel motherboards are not sold out like AMD's. Someone said the 10700K runs cooler than the 9900K despite the higher TDP. You can always upgrade to the next-gen chip after Comet Lake.
 
If I remember correctly, der8auer's delidding video for the 9900K was quite interesting. It took a combination of both delidding and lapping the die to get a serious reduction in temperatures (15-20C).
In his Comet Lake preview he also notes the difference in die height again.
  • 8700K = 0.44mm
  • 9900K = 0.88mm (removed 0.20mm)
  • 10900K = 0.58mm - no need to perform die lapping now? Delidding got him a 4-5C reduction, but I don't believe he suggested lapping Comet Lake for additional improvement.
 
Ultimately, we never will. However, factoring in Intel's pricing strategies over the last decade and the external factors which led to those strategies, I think it's unlikely Intel would have priced any mainstream CPU as high as $1,000. Without AMD, on the other hand, Intel's HEDT market would be another matter. It would still be selling its 18c/36t parts for $2,000 each. I think there is no question of that, given that Intel's significant price reduction for those offerings is a direct result of AMD offering substantially better-performing options for the same or less than Intel was charging.

It's obvious that competition has lit a fire under Intel that it hasn't had in over a decade. Unfortunately, that fire is slow to get going given Intel's size and the long development cycle for CPUs. But, product pricing isn't simply a matter of what a competitor does with competing products. It's about market conditions and what your customers can afford or are willing to spend. As I've said, AMD hadn't been competitive for a decade prior to the launch of the Ryzen 1000 series. In all that time, Intel's mainstream processor prices never reached anywhere near $1,000. Sure, AMD was there but it wasn't remotely competitive.
 
With Intel turning their process lead into a process deficit right at the time of AMD's resurgence, Intel is kind of competing with one hand tied behind their back. We really won't see much competition from Intel until they get the process mess straightened out, which still looks to be a couple of years away.
 
I couldn't agree more. My personal estimate would be late 2021 at the earliest and that's admittedly optimistic.
 
It's actually surprising how well they are hanging in with the old process and designs. I am interested to see how Rocket Lake works out - the first serious desktop architecture update in years, but still on 14nm. A more complex architecture probably won't clock as high as the Skylake derivatives, and the old designs might still prevail for gaming.
 
It's interesting you say "We really won't see much competition from Intel". Anyone reading this would think AMD is dominating Intel at everything, when that couldn't be further from the truth. Check Digital Foundry's realistic gaming tests, which clearly show Intel is the best choice for gaming, and more noticeably so with non-canned benchmarks. And besides gaming, there are other apps where speed is still king, so I find it funny you say they are not competitive when they still clearly dominate gaming - and that on the old 14nm versus AMD's much newer process. So I can look at this in a very different way: I can say it's sad that even at 7nm AMD STILL cannot hit 5GHz nor match Intel's 14nm gaming performance (or even Photoshop). So yeah, for some things like 3D rendering Intel cannot match the 16 cores of a 3950X, but to claim Intel is not competitive when it clearly beats AMD at other things is just nonsense.
 
It's also interesting that you think people only play games or use Photoshop on a computer because those are essentially the only things Intel is winning at right now.

More to your point, you're making a grand assumption that Rocket Lake is going to clock like Skylake (e.g. your "speed is still king" statement). Even the most optimistic Intel supporters generally agree that Rocket Lake clocks aren't going to be as high as Skylake-derived ones; the idea is that the improvements to the architecture will make up the difference. Are you also going to claim that "its sad" when Intel can't reach 5GHz with a new architecture?
 
While it is true that Intel is behind in the nm war, it is also common knowledge that Intel's 14nm is pretty close to TSMC's 10nm tech; it's more a matter of how it is measured. I am curious to see what Intel's new architecture will bring, and also AMD's Zen 3, but for now my 9900K will do nicely. The only thing that might make me consider upgrading is the security issues with current Intel chips - Spectre, Meltdown and the other stuff.
 
My work workload is something like the following:
Office apps, Fusion 360, Inventor, KiCad/Altium, occasional DaVinci and a fair bit of CubeIDE compiling. I do occasionally use HandBrake, but that is becoming a rare thing these days.

CubeIDE compiles quickly, typically in under 10 seconds on my box, sometimes as low as 5, and while I could net gains from more cores, shaving 1 second off compiles is not high on my list of priorities.

Fusion could benefit from AMD’s 3900x or 3950x, but not by a lot and rendering is only one part of the work pipeline. Typically, for me, rendering speeds are not the limitation, my ability/skill is.

On the other hand, I game. I play Assassin's Creed Origins at the moment, and a fair bit of Doom Eternal, flicking back to older and/or less demanding titles like Ori and StarCraft 2. I play these at 4K/60Hz. Despite my best efforts I haven't gotten into the Witcher series.

These titles typically don't stress a CPU, and only use up to 12 logical threads.

In my case, memory capacity is important, so Intel wins here, and compatibility/stability is paramount.

If I had to prioritise:
Idle power usage and stability are right up there as the most important things, the former because I predominantly work from home, and with my work, the computer is typically waiting for me.

Memory compatibility and size is probably secondary, so I can have apps open without having to close stuff mid way through something.

Storage speed/general speed - so long as I have "enough" that I am not having to wait for general stuff, I am happy. This goes for gaming too; things need to be pretty enough.

I am pretty happy with my 9900K choice, and if I were buying today I would probably choose Intel again (10700K or 10900K), mainly because of platform stability.
 
I don't see what the big deal is with this conversation. Buy what you want. Me, I prefer middle of the road, not 10,000 FPS or a 2-second encode, etc. I'm quite happy with what I bought. I say be happy with what you have, or throw it in the garbage and redo.
 
Honestly, I haven't really noticed a big difference since 1st-gen Ryzen in terms of platform stability and memory compatibility issues. I played through AC Origins and Odyssey on a Ryzen system, and at no point was I thinking to myself that I needed more CPU performance to make them more enjoyable.

I wish there were a QuickCPU-like program for Ryzen as I'd be interested to see what the results would be for lowering the idle power draw.

The biggest thing for me is socket intercompatibility. With AM4, my mother has a X370/2400G, my brother has a B450/3600, my uncle has a X470/2600X, my gf has a B450/3600, and I have a X570/3600X (for now). I can upgrade my CPU and step everyone down to get an upgrade. It just doesn't work that well with Intel. I definitely thought about a 10700k mostly just to try something new, but decided to just stick it out with my Ryzen system.
 
First, did I ever say those are the only things people use? Please point me to exactly where I said that. YOU were the one making definitive statements about Intel not being able to compete, which are outrageous and silly.
Also, please show me the report you have that shows Photoshop is essentially the only thing where Intel wins. Cause you LOVE to make bold statements with zero evidence to back you up. LOL

Regarding the speeds for Intel's next chips, NOBODY knows, so any assumption you think I made is as valid as yours, unless you have a time machine and already saw it.
And about the 5GHz comment, let me put it in simple terms for you. I am talking about 5GHz as it pertains to TODAY's technology, where AMD is unable to beat Intel mostly due to Intel's superior speed. But 5GHz is just a NUMBER, and if tomorrow Intel or AMD can perform better running at 1MHz than what we do today at 5GHz, then why would I give a damn what the number is? Do you get it now ;)
 
Have you ever read a 3900x review? Techradar, Techpowerup, Tom's Hardware, Anandtech, Digital Trends, Techspot, etc. It's not just making things up. It's actually reading reviews and looking at numbers.

You didn't make any qualifying statements with your 5Ghz comment, so there's no need to attempt to be condescending toward me when you didn't explain yourself. You simply said that "AMD STILL cannot hit 5Ghz" and "Speed is still king."

I should also point out that you were responding to an Intel fan saying that Intel is going to be behind due to the process technology deficit, and you went all "BUT MY PROCEZZUR HAS 5 GIGAHURTZ!!!!!!!!!" on him.
 
You must be trolling, as there is way too much arrogance in that post.

It's like you are bitter that 'kids with less money' can get within 10% of your gaming performance ... using a 2080 Ti at 1080p.

Then the node stuff. First off, nobody except you was surprised that AMD did not shoot up to 5GHz with a node shrink. A smaller node does NOT usually mean higher clocks.

Glad you have been doing a lot of reading, but nobody believes that nonsense about Intel's 10nm.

Your entire post does not seem genuine. "Serious about gaming and will not accept second best":
Bro, tone it down a bit. Even the biggest nerds on here had to cringe a little when they read that.

No real person talks like this.

No?

It really is that simple. I do not care if it's only 10 - 15 - 30 - 40 frames of difference, I want the best. Intel is the best at gaming. I am not a benchmarker like ... seemingly all the AMD guys are, with their praise of "we have more cores, and we spend $1.50 less a month on electricity, and it was cheaper to purchase." I mean, really?

Are you sure you just don't like people who aren't on the AMD bandwagon? That seems more likely than you genuinely being unsure whether I am trolling. And no, I am not trolling.

I'm not a fanboy of brands. I couldn't care less what branding or color my box has. I'm all about performance. If AMD were in Intel's shoes in regards to the absolute best performance in games, I would be talking up my AMD CPU. Trust me. Some of us do have the money to throw at performance, regardless of how silly that must seem from others' POV.

If anything, I see and hear more from AMD people speaking out when Intel people praise their platform than the other way around.

Just because your CPU cost less, has more cores and uses a bit less electricity really doesn't mean anything to me. One thing is for sure: none of that helps deliver the best gaming performance.
 
Once again, "Best" means different things to different people, and not everybody builds gaming centric builds. What is best for you is not best for everybody else and vice versa. For some people the power draw is important especially in SFF systems for example. I was just specing out an ITX build and I don't want to deal with the power issues. So the 10900k is not for me.
 
delivering it through a plate of copper and 500-600 pins is easy. The fact that those 200 amps are running through a piece of glass not much bigger than the end of my thumb in a controlled, predictable, and useful manner, instead of melting it, is what's awe-inspiring to me.

That does seem amazing to me, that a little plate of silicon can carry that much current. The copper isn't so surprising, since those big currents only exist between the VRMs and the socket, and there are plenty of pins and traces there to handle it. From the power supply to the VRMs it's not a big deal, since the voltage is ten times higher, so the current is ten times lower.

I'm not a big gamer; I just play a few older titles here and there. The main use for my computer is productivity stuff, so AMD is a good deal for me. Thermal performance is a consideration as well. I'm using a 3700X, which doesn't have big power requirements and doesn't need anything fancy to run relatively cool. I would not be happy if I had to deal with some kind of extreme cooling solution and live with a CPU that runs at higher temperatures. So the fact that AMD CPUs run cooler is a big plus for me.
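The "ten times the voltage, a tenth of the current" point above is just P = V x I rearranged. A quick sketch, using assumed (not measured) figures for the power draw and voltages:

```python
# For a fixed power draw P, current I = P / V, so current scales
# inversely with voltage. Illustrative assumptions: a 250 W CPU load,
# a 12 V PSU rail, and a ~1.2 V core voltage out of the VRMs.

power_w = 250.0
psu_rail_v = 12.0   # power supply -> VRMs
core_v = 1.2        # VRMs -> CPU

i_psu = power_w / psu_rail_v   # ~20.8 A on the 12 V side
i_core = power_w / core_v      # ~208 A delivered to the die

print(f"PSU side: {i_psu:.1f} A, CPU side: {i_core:.1f} A")
```

Same power either way, but the step-down from 12 V to ~1.2 V multiplies the current by roughly 10x on the CPU side, which is why the socket needs so many power pins while an ordinary 8-pin EPS cable handles the PSU side.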
 