Intel Internal Document Details How It Will Counter the AMD Threat

Pretty cool to hear an honest response from them, even if it was an internal memo. They definitely know they need to step up their game.

I think they really need to start looking at their whole stack, especially now that their offerings just don't line up.
$220 just to get a 6/6 chip that can actually overclock? That's not going to fly anymore.
 
The document mentions 15K developers.

I'd like to know what software these developers are making. Is what they're doing mostly server-side stuff we don't see?

I think about 98% of those are focused on implementing and changing crap in Windows 10 that nobody asked for.

And they have maybe 5 people doing any type of testing before they release patches or new features.

Yes, I like Windows 10, but their process for actually testing stuff seems to rely almost entirely on people on the "fast ring" for updates.
 
The Fast Ring is for beta testing the newest updates. I actually think it was smart for MS to do it this way instead of relying on internal testing only, because there are literally millions of different hardware combinations that need to be tested, and there is no way MS could cover that internally.
 
I know... and I run the fast ring.

The number of bugs reported in fast ring builds that still make it into the final builds kinda makes me think that MS is not even bothering to take the time to fix a bunch of the stuff that has been reported.
 
I had a bug that took 2 normal ring releases to solve.
 
Plan to Destroy AMD:

Phase 1: Collect Underpants
Phase 2: Claim that Intel is still better than AMD
Phase 3: ?
Phase 4: PROFITS!


P.S. If it's still working after 40 years, you don't change the secret sauce....
 
I know... and I run the fast ring.

The number of bugs reported in fast ring builds that still make it into the final builds kinda makes me think that MS is not even bothering to take the time to fix a bunch of the stuff that has been reported.

Like the latest June patches that broke Event Viewer on our 2012R2/2016 servers at work?...
 
Intel's plan is always the same: pull out the gaming benchmarks.

AMD is great for content creation and multi-tasking, but until they catch up in gaming, Intel will always have the lead.
 
In fairness, most of the gaming benchmarks don't really mean anything. If 3000 series chips are within 5% in gaming performance, it shouldn't make a difference. The closest directly comparable Intel CPU is still $480. I can't imagine that the extra 5% is worth $150 to the general masses.
 
That depends on the game, to be completely honest. In any fast-paced, highly competitive game such as CS, OW, Apex Legends, etc., where every extra fps gives you an advantage over your competition, I wouldn't dream of using anything but the best. And I would dare to say that those whose livelihood depends on such games, like streamers and professional gamers, should absolutely always game on the fastest hardware to make sure they have every advantage they can get.

For everyone else, it doesn't really matter. The vast majority game at 1080p, 60 Hz and don't need the very best. That vast majority is also buying pre-built systems, where Intel still dominates in sales.
 
For streamers, more cores could be more important than a 5% (basically within margin of error in many benchmarks) difference in IPC.
 
For streamers that stream for a living, a secondary dedicated streaming machine is worth infinitely more than more cores on one computer. This is also not an opinion, it's a fact.
 
True, though not all of them want to go the dual computer route or have the space for it.
 
The one area where Intel really let me down was their entry into microcomputing and controllers. They had such a good initial showing with the Edison, and then they just... dropped it all. This was a good area for them, and they dropped the ball.
 
That depends on the game, to be completely honest. In any fast-paced, highly competitive game such as CS, OW, Apex Legends, etc., where every extra fps gives you an advantage over your competition, I wouldn't dream of using anything but the best.
Honest questions: (I'm thinking LoL here after watching my nephew play a match last night, but also e.g. CS:GO.)

1) How, exactly, is it advantageous to have the engine run at, say, 510 FPS on Intel vs 490 FPS on AMD when your monitor is running at, let's be extreme, 200 Hz?

2) Additionally, what is the polling rate of their network servers? BF1 was 60x per second, so let's double that and say that it's 120x per second.
 
3) How many people are actually making money off of competitive gaming and streaming vs the masses who just buy computers and can't tell the difference between Product A and Product B in real life?
 
A lot. It’s a growth industry worldwide.
 
Honest questions: (I'm thinking LoL here after watching my nephew play a match last night, but also e.g. CS:GO.)

1) How, exactly, is it advantageous to have the engine run at, say, 510 FPS on Intel vs 490 FPS on AMD when your monitor is running at, let's be extreme, 200 Hz?

2) Additionally, what is the polling rate of their network servers? BF1 was 60x per second, so let's double that and say that it's 120x per second.

1) The higher the fps, the lower the input lag, regardless of monitor refresh rate.

2) Depends on the game; I know CS:GO has 64- and 128-tick servers.
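
To put rough numbers on point 1 (a back-of-the-envelope sketch, not a measurement; real input lag also includes engine, driver, mouse polling, and display latency), frame time is simply 1000/fps milliseconds:

```python
# Illustrative frame-time math only; actual end-to-end input lag has
# many more components (engine, driver, display, mouse polling).

def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

for slow, fast in [(490, 510), (120, 240), (60, 120)]:
    delta = frame_time_ms(slow) - frame_time_ms(fast)
    print(f"{slow} vs {fast} fps: "
          f"{frame_time_ms(slow):.2f} ms vs {frame_time_ms(fast):.2f} ms "
          f"(difference: {delta:.2f} ms)")
```

Going from 490 to 510 fps shaves off about 0.08 ms of frame time, while going from 60 to 120 fps shaves off over 8 ms, which is why the gains flatten out so quickly at high frame rates.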
 
A few percent difference in input lag at those frame rates is imperceptible.
 
....I'm not generally a conspiracy theorist, however this entire format is questionable to me, coupled with the lack of certain pertinent information....

It's a morale booster and some damage control. Not sure "conspiracy" is even remotely an accurate description.

So... translation: "we don't have a fucking clue how to fight against AMD, we lost this fight, we are fucked for a couple of years, and we are going to try some shady things to force companies onto Intel stuff instead of AMD, as we did in the past."

Force companies to do what?? The document has nothing to do with that. Intel has been guilty of a lot of things, but that isn't what this document is about (and that was a long time ago).
 
In professional competitions by professionals, every % is perceivable. I'm sorry, but you're wrong here.

No, every percentage is NOT perceivable above a certain level of PC performance.
That's the point some posters are trying to make.

If a professional wants to believe it's perceivable and purchase accordingly, that's of course their choice, but once above this very high threshold in performance, which AMD and Intel can both reach, there's no perceptible gain given to a player. Zero.

Professional competitors typically also run games at the lowest possible graphical settings that don't impact their gameplay.
You do not need a high-end PC to achieve that.

It's a morale booster and some damage control. Not sure "conspiracy" is even remotely an accurate description.

Hence why I only asked the question and mused that it was a curious document to be reading.
 
This reminds me of the horseshit claims that people couldn't possibly discern anything over 30 fps, then 60 fps, and most recently 120 fps and 144 fps.

Guess what: you might be oblivious to the differences, but other people are more sensitive to them.

Claiming something is imperceptible is your opinion, not a fact.
 
No, every percentage is NOT perceivable above a certain level of PC performance.
That's the point some posters are trying to make.

If a professional wants to believe it's perceivable and purchase accordingly, that's of course their choice, but once above this very high threshold in performance, which AMD and Intel can both reach, there's no perceptible gain given to a player. Zero.

Professional competitors typically also run games at the lowest possible graphical settings that don't impact their gameplay.
You do not need a high-end PC to achieve that.

Agreed. Above a certain level it is all placebo effect. Same as with the audiophile community. At least high-end PC hardware costs less than some of the ridiculous car-priced audio jewelry those clowns convince themselves they can hear the difference in. :whistle:

Now don't get me wrong. If I had GPU performance coming out of my wazoo, sure, I'd go for a little higher framerate/refresh just to be on the safe side, but as it stands, even the fastest GPU on the market, overclocked on water, isn't really fast enough for 4K Ultra at 60 Hz, and I know which matters more to me. I'm definitely choosing the 4K and Ultra settings.

That said, I don't really play competitive titles online anymore. I've lost interest. If I did, I'd probably focus a little more on framerate, just to be on the safe side. Still, 240 Hz seems a little nuts. What even renders that fast on any hardware? Q3A? Is anyone still playing that?
 
This reminds me of the horseshit claims that people couldn't possibly discern anything over 30 fps, then 60 fps, and most recently 120 fps and 144 fps.

Guess what: you might be oblivious to the differences, but other people are more sensitive to them.

Claiming something is imperceptible is your opinion, not a fact.

You must agree that at some point humans can no longer tell a difference, right? Or is it just "the faster the better"? Maybe we should have MHz refresh rates?

Why is it so difficult for you to accept that at some point we reach the limits of human abilities, above which there is no more perceivable difference, and that this level is likely somewhere under 240 Hz?

I don't claim to know exactly where that limit is, but it is probably slightly different from person to person, and I'd argue it's probably below 120 Hz.


I really wish I had the time and money. I'd run a scientific experiment.

I'd construct two identical PCs with high-end hardware and 240 Hz G-Sync screens, and set them up side by side.

Next I'd install a game on them. Something old, so that it can hit all the way up to 240 Hz consistently. It would also have to be something with a frame limiter. Again, Q3A maybe? Fans would be set to 100% so as not to give away which system was working harder.

Then I'd create a random test schedule. Each of the framerates (24, 30, 60, 75, 90, 120, 240 should do it) would be compared against each of the others, three times per matchup.

I'd recruit a few hundred test subjects. As part of the recruitment process we would note their age, gender, how much they play games, and their preferred platform (mobile, console, PC).

Then the test would go something like this:

1.) Test tech sets one refresh rate on one screen and one on the other, per the predetermined test schedule.
2.) Test tech leaves the room, and the test observer enters.
3.) The observer is just there to make sure the test subject does not cheat and look at settings. The observer does not know the settings of each of the systems, so as not to bias the test subject.
4.) The test subject gets to move back and forth between the two systems as many times as they want, until they are ready to answer two questions: which system they believe has the higher framerate, and whether they thought it was a small, moderate, or large difference.
5.) Test subject and observer leave the room.
6.) Test tech enters and sets up the two machines again with the next two refresh rates to be compared.
7.) Tech leaves, and observer and subject enter again.

Repeat until all matchups have been done three times for all subjects.
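
For concreteness, here is a minimal sketch of how such a schedule could be generated (Python, standard library only; the rates and repeat count are the ones proposed above, and randomizing which screen gets the higher rate is an added assumption to avoid one screen always being the faster one):

```python
import itertools
import random

RATES = [24, 30, 60, 75, 90, 120, 240]  # refresh rates to compare
REPEATS = 3                             # each matchup tested three times

def make_schedule(seed: int) -> list[tuple[int, int]]:
    """Randomized per-subject schedule of (screen A Hz, screen B Hz) trials."""
    rng = random.Random(seed)
    schedule = []
    for a, b in itertools.combinations(RATES, 2):  # 21 distinct matchups
        for _ in range(REPEATS):
            # Randomize which physical screen gets the higher rate.
            schedule.append((a, b) if rng.random() < 0.5 else (b, a))
    rng.shuffle(schedule)  # randomize trial order as well
    return schedule        # 21 matchups * 3 repeats = 63 trials per subject

print(len(make_schedule(seed=1)))  # 63
```

That works out to 63 trials per subject, which also gives a feel for how long each session would run.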

Analyze results. Heck, maybe even write up a paper and submit to journals.

I think the results would surprise many of you, just like the HydrogenAudio.org results regarding MP3 sound quality did, when it turned out that audiophiles on high-end equipment couldn't pick which sample was an uncompressed WAV file and which was an MP3 compressed with alt-preset standard VBR settings more than 50% of the time (essentially the same as chance).
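
As a side note on what "essentially the same as chance" means: in a blind trial you check whether the number of correct picks is significantly above 50%. A tiny sketch (the trial counts here are made up purely for illustration):

```python
from math import comb

def p_value(correct: int, trials: int) -> float:
    """Chance of getting at least this many right by pure guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

print(round(p_value(9, 16), 3))   # 0.402: 9/16 right is consistent with guessing
print(round(p_value(14, 16), 3))  # 0.002: 14/16 right would be real discrimination
```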


Placebo is HUGE in these types of decisions, especially as everyone feels the need to defend their choices for some silly reason. Our brains experience what we want them to experience, not what really happens. This is the same for all humans. You cannot trust your own experiences, no matter who you are.
 
You are missing the fact that I specified professionals playing in a highly competitive environment in very high-paced games. It is definitely not always the case; 120 Hz may very well be above and beyond what a professional needs in a slower-paced game such as, say, LoL or DOTA2. But high-paced FPS games are totally different beasts. The other fact I clearly stated is professionals, not your average Joe Schmoe with average skill. I'm talking about the cream of the crop, the Shrouds if you will. You can ask any one of them and they will all tell you: 144 Hz minimum and as high an FPS as you can afford, and almost all of them game at 1080p or 1440p, not a single pixel more.
 
You must agree that at some point humans can no longer tell a difference, right? Or is it just "the faster the better"? Maybe we should have MHz refresh rates?

Human perception is fluid. I think we'll eventually be able to pin visual response down to a point beyond which differences are imperceptible, just like CD audio has done, but we're not there yet.

Also note that with MP3s, you're adding another layer on top of the complexity that mastering already is. And while with audio we have a subjective 'better/worse' perception, the reality is that it's just 'different' unless the encoding is exceedingly poor.

If I were to guess, something around 1000 Hz might do it, and 200 Hz is still not 'enough'.
 
Intel's plan is always the same: pull out the gaming benchmarks.

AMD is great for content creation and multi-tasking, but until they catch up in gaming, Intel will always have the lead.

The very, very weak, low-margin desktop market is not where the money is. In fact, even at its height it was small potatoes... however, it certainly doesn't hurt marketing.

When you see AMD hardware take over entire datacenters, then, if I were Intel, I'd be very, very concerned. That's the high-volume, big-dollar market.
 
So this is where all the tech press acquisitions went.

And it's nice to see how Intel "works with" tech sites, having its new employees from those sites write the articles.

[Attached image: intel-vs-amd-gaming.jpg]


“Performance results are based on testing as of May 17, 2019 and may not reflect all publicly available security updates”
“Software and workloads used in performance testing may have been optimized for performance only on Intel microprocessors”

So much for REAL WORLD commitment.
 
Because no professional would ever use FreeSync or G-Sync. What kind of trolling question is this, even?
The additional NVIDIA lag observed there is in fixed-refresh-rate mode. And the bigger your FPS number, the bigger the additional input lag. So it is very much an esports question.
 
That depends on the game, to be completely honest. In any fast-paced, highly competitive game such as CS, OW, Apex Legends, etc., where every extra fps gives you an advantage over your competition, I wouldn't dream of using anything but the best. And I would dare to say that those whose livelihood depends on such games, like streamers and professional gamers, should absolutely always game on the fastest hardware to make sure they have every advantage they can get.

For everyone else, it doesn't really matter. The vast majority game at 1080p, 60 Hz and don't need the very best. That vast majority is also buying pre-built systems, where Intel still dominates in sales.


CS:GO and Overwatch run at frame rates beyond the refresh of high-end monitors even on low-end hardware; the determining factor is human twitch reflexes, not hardware. Shroud on a Core 2 Duo and an 8800 GTX will destroy you on a 9900K and 2080 Ti SLI, because he is simply much faster.

We had a gaming team for Battlefield 4; one of our players used an i7 920 and a GTX 560 and would finish 30-minute rounds with 200+ kills without using vehicles. A pro gamer will not be held back by hardware. "More FPS" is the unjustified excuse online gamers cry. As a top-50 player in Battlefield 4, I played with garbage hardware for most of the years the game dominated: an i5 2400 and a GTX 670, which was mediocre hardware. I got a 5960X in 2016, which was the final year the game was prominent before Battlefield 1 and later 5 released.

As for pre-builts, it depends on where you buy: if you buy Dell etc., those aren't gaming machines, while custom rig stores sell both AMD and Intel customisable rigs.
 
Against scrubs, maybe, but pit a pro gamer against a pro gamer, and that extra fps matters.
 
1) The higher the fps, the lower the input lag, regardless of monitor refresh rate.

2) Depends on the game; I know CS:GO has 64- and 128-tick servers.
Thanks for the reply.

Would you mind explaining how the server, at 128 ticks, interacts with the local game engine running at 500 FPS? Again, I'm thinking of LoL and CS:GO as they are popular.
- Data has to be sent to the server; that takes time.
- Data has to be processed on the server; that takes time.
- Data has to be sent to the client; that takes time.
- How would the engine, running at 500 FPS, react to having its state synced up to the state the server sends it?*


* I realise that this is game specific; some engines might just alter their local state with input from the server, and occasionally reset it entirely. (Hello, lag!)
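
For anyone curious, the usual answer is client-side prediction with server reconciliation. A heavily simplified, hypothetical sketch (real engines such as Source add entity interpolation, lag compensation, and delta-compressed snapshots on top of this):

```python
# Hypothetical illustration of client-side prediction / reconciliation.
# The client renders and predicts at full frame rate; whenever an
# authoritative server snapshot arrives (at tick rate), it rewinds to
# the server's state and replays the inputs the server hasn't seen yet.

from dataclasses import dataclass

@dataclass
class Input:
    seq: int     # sequence number so the server can acknowledge it
    move: float  # simplified 1-D movement command

class Client:
    def __init__(self) -> None:
        self.pos = 0.0
        self.pending: list[Input] = []  # inputs sent but not yet acked

    def local_frame(self, seq: int, move: float) -> None:
        """Runs every rendered frame (e.g. 500 FPS): predict immediately."""
        self.pending.append(Input(seq, move))
        self.pos += move  # apply input locally without waiting for server

    def on_server_snapshot(self, server_pos: float, acked_seq: int) -> None:
        """Runs per received snapshot (at most tick rate, e.g. 128 Hz)."""
        self.pos = server_pos  # snap to authoritative state
        self.pending = [i for i in self.pending if i.seq > acked_seq]
        for i in self.pending:
            self.pos += i.move  # replay inputs the server hasn't processed
```

If the replayed prediction disagrees with what was drawn in earlier frames, the player sees a correction snap, which is exactly the "hello, lag" moment described above.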
 