Intel Internal Document Details How It Will Counter the AMD Threat

You are missing the fact that I specified professionals playing in a highly competitive environment in very fast-paced games. It is definitely not always the case; 120Hz may very well be above and beyond what a professional needs in a slower-paced game such as, say, LoL or DOTA2. But fast-paced FPS games are totally different beasts. The other fact I clearly stated is professionals, not your average Joe Schmoe with average skill. I'm talking about the cream of the crop, the Shrouds if you will. You can ask any one of them and they will all tell you: 144Hz minimum and as high an FPS as you can afford, and almost all of them game at 1080p or 1440p, not a single pixel more.

And I think you're missing the point that for every professional who needs that extra boost Intel might provide, however small it might be, there are thousands upon thousands of amateurs who could benefit from saving a few bucks by buying a 3700X over a 9900K. That was the whole point.
 
Against scrubs, maybe, but pit a pro gamer against a pro gamer and that extra FPS matters.

Proper pro gaming events are all localised, and the machines are the same hardware and software so as to rule out unfair advantages. Online play is almost exclusively scrubs, and when you run into pro gamers you tend to know the difference. When I played open servers I would run around with the Mare's Leg and get called a hacker all the time; it was funny.
 
So this is what all the tech press acquisitions went into.

And it's nice to see how Intel "works with" tech sites from their new employees to write articles.

View attachment 170905

“Performance are based on testing as of May 17, 2019 and may not reflect all publicly available security updates”
“Software and workloads used in performance testing may have been optimized for performance only on Intel microprocessors”

So much for REAL WORLD commitment.

Looks right to me. Game developers are going to optimize for Intel first and AMD second, and a 9900K should be 20%-30% faster than a 2700X.
 
It all comes down to personal needs, and that is subjective, with many smaller factors playing into it; not everyone's needs are for the utmost FPS or the highest system cost. Statistically, the middle ground is the most likely sales point, where people blend needs and affordability, an area where AMD is soaking up sales.
 
Thanks for the reply.

Would you mind explaining how the server, at 128 ticks, interacts with the local game engine running at 500FPS? Again, I'm thinking of LoL and CS:GO as they are popular.
- Data has to be sent to the server; that takes time.
- Data has to be processed on the server; that takes time.
- Data has to be sent to the client; that takes time.
- How would the engine, running at 500FPS, react to having its state synced up to the state the server sends it?*


* I realise that this is game specific; some engines might just alter their local state with input from the server, and occasionally reset their state. (Hello, lag!)
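For a rough sense of the timescales involved, here's a back-of-the-envelope sketch in Python. The numbers are just the rates mentioned above (128 tick, 500FPS); real engines vary, and this ignores network transit time entirely:

```python
# Hypothetical back-of-the-envelope numbers: a 128-tick server sends a
# snapshot roughly every 7.8 ms, while a client rendering at 500 FPS
# draws a frame every 2 ms -- so about 4 frames are drawn per server
# update, and those in-between frames must be predicted/interpolated.

TICK_RATE = 128    # server simulation steps per second
CLIENT_FPS = 500   # local render rate

snapshot_interval_ms = 1000 / TICK_RATE   # ~7.81 ms between server updates
frame_interval_ms = 1000 / CLIENT_FPS     # 2 ms between rendered frames

frames_per_snapshot = snapshot_interval_ms / frame_interval_ms

print(f"snapshot every {snapshot_interval_ms:.2f} ms")
print(f"frame every {frame_interval_ms:.2f} ms")
print(f"~{frames_per_snapshot:.1f} frames rendered per server snapshot")
```

So at these rates the client draws roughly four frames for every authoritative update it receives; everything in between is the engine's own guess.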

All decent games nowadays have a ton of client-side prediction. This means having more FPS feels smoother. It also means you see anything you receive from the server as fast as possible.

For instance, Apex Legends only runs at a tick rate of 20, and PUBG has run as low as 11, but you won't hear anyone say it's a good idea to cap your FPS at 20.
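To make "client-side prediction" concrete, here is a minimal Python sketch. All the names (`PredictedClient`, `apply_input`, `Snapshot`) are illustrative, not from any real engine; real games predict full physics, not a single coordinate:

```python
# Minimal client-side prediction with server reconciliation: the client
# applies its own inputs immediately, then replays any inputs the server
# hasn't confirmed yet whenever an authoritative snapshot arrives.

from dataclasses import dataclass

@dataclass
class Snapshot:
    last_acked_seq: int   # last input sequence number the server processed
    x: float              # authoritative position

def apply_input(x: float, move: float) -> float:
    return x + move       # trivial stand-in for the movement model

class PredictedClient:
    def __init__(self):
        self.x = 0.0
        self.seq = 0
        self.pending = []  # inputs the server hasn't acknowledged yet

    def local_input(self, move: float):
        # Apply immediately so the player sees the result this frame,
        # long before the 20 Hz (or 128 Hz) server confirms it.
        self.seq += 1
        self.pending.append((self.seq, move))
        self.x = apply_input(self.x, move)

    def on_snapshot(self, snap: Snapshot):
        # Rewind to the authoritative state, drop confirmed inputs,
        # then replay the unconfirmed ones to hide round-trip latency.
        self.x = snap.x
        self.pending = [(s, m) for s, m in self.pending
                        if s > snap.last_acked_seq]
        for _, move in self.pending:
            self.x = apply_input(self.x, move)

client = PredictedClient()
client.local_input(1.0)
client.local_input(1.0)
client.local_input(1.0)   # predicted position is now 3.0
# Server snapshot confirms the first two moves; the third is replayed.
client.on_snapshot(Snapshot(last_acked_seq=2, x=2.0))
print(client.x)           # 3.0
```

This is why a low tick rate doesn't cap how responsive the game feels locally: your own inputs never wait for the server.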
 
All decent games nowadays have a ton of client-side prediction. This means having more FPS feels smoother. It also means you see anything you receive from the server as fast as possible.

For instance, Apex Legends only runs at a tick rate of 20, and PUBG has run as low as 11, but you won't hear anyone say it's a good idea to cap your FPS at 20.

And PUBG @ 11 sucks. Haven't tried Apex, but a tick rate of 20? This sounds like we are going backwards.
 
All decent games nowadays have a ton of client-side prediction. This means having more FPS feels smoother. It also means you see anything you receive from the server as fast as possible.

For instance, Apex Legends only runs at a tick rate of 20, and PUBG has run as low as 11, but you won't hear anyone say it's a good idea to cap your FPS at 20.

Yeah, I hate this and stopped playing seriously because of it... I miss CS 1.x (hey, I want to include everyone, even if the earliest versions were the best)...
The new games with auto matchmaking, gathering a party of "equal" skills scattered around the country, with prediction, are killing it for me... We should be able to select our server / host it, and I would only play on servers in my country (unless competitive).

But I get it, they need to go for the mass market and not the niche one... I do something else with my time now lol.

EDIT: I guess it's "fair" because everyone suffers from the same thing (the level to which you're affected is random, I guess), but nope, I can't play a game seriously hoping I'm on the good side of luck. I'm not playing the lotto...
 
You are comparing a 490 euro CPU to a 290 euro CPU; I should hope it's a lot faster.
  • One includes a GPU, the other doesn't.
  • One is made on a more advanced process node and can OC to 5GHz on air; the other cannot.
  • One includes a pair of 256-bit AVX units and 256-bit datapaths, the other doesn't.
  • Performance is a nonlinear function of price. That is also why the R5-3600X is 20% more expensive than the R5-3600 but isn't 20% faster, or why the R9-3950X is 50% more expensive than the R9-3900X but isn't 50% faster.
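To put the diminishing-returns point in numbers: only the price ratios below echo the ones above; the prices and performance indices themselves are made up purely for illustration:

```python
# Hypothetical (price in euro, relative performance index) pairs.
# The perf numbers are invented for illustration; only the price
# steps (+20% for the X, +50% for the 3950X) mirror the argument.
parts = {
    "R5-3600":  (200, 100),
    "R5-3600X": (240, 105),   # 20% more money, ~5% more perf
    "R9-3900X": (500, 180),
    "R9-3950X": (750, 210),   # 50% more money, ~17% more perf
}

for name, (price, perf) in parts.items():
    print(f"{name}: {perf / price:.3f} perf per euro")
```

Each step up the stack buys proportionally less performance per euro, which is exactly why "it costs more" doesn't translate linearly into "it's that much faster".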
 
trolling
Someone has brought out the bold to make a statement.

AVX for general use is meh

OCs to 5GHz on expensive third-party cooling but fails to operate at stock with a generic Intel cooler

Most expensive node with the most expensive backdoor intrusions

The 9900K is more expensive than the 9700K but offers negligible gains, so you are right on the last point
 
CS:GO and Overwatch run at beyond the refresh rates of high-end monitors on low-end hardware; the determining factor is human twitch reflexes, not hardware. Shroud will destroy you on a 9900K and 2080 Ti SLI while he's on a Core 2 Duo and an 8800 GTX, because he is simply much faster.

We had a gaming team for Battlefield 4; one of our players used an i7 920 and a GTX 560 and would finish 30-minute rounds with 200+ kills without using vehicles. A pro gamer will not be held back by hardware. More FPS is the unjustified excuse online gamers cry. As a top-50 player in Battlefield 4, I played with garbage hardware for most of the years the game dominated; I used an i5 2400 and a GTX 670, which was mediocre hardware. I got a 5960X in 2016, the final year the game was prominent, before Battlefield 1 and later 5 released.

Suppose we bribe Scotty or Chief O'Brien to spawn two teams' worth of copies of you, or any awesome player to your liking. For the sake of argument, let's be generous, consider Battlefield a competitive game, and pit said two teams against each other in BF5.

Team Red
- gets low-end membrane keyboards with a two-key limit
- is equipped with old laser mice and no mouse pads
- gets gorgeous 10-bit HDR 60Hz LCD designer screens

Team Blue
- gets to play with whatever they want

My question to you is: who would win in this hypothetical scenario?


p.s. You're making the type of argument pay2win guys make when explaining how, if you're good enough, you can destroy idiots who have sunk thousands into in-game gear. That may be true. But what about the times when you're not playing noobs? Think, McFly, think!
 
Suppose we bribe Scotty or Chief O'Brien to spawn two teams' worth of copies of you, or any awesome player to your liking. For the sake of argument, let's be generous, consider Battlefield a competitive game, and pit said two teams against each other in BF5.

Team Red
- gets low-end membrane keyboards with a two-key limit
- is equipped with old laser mice and no mouse pads
- gets gorgeous 10-bit HDR 60Hz LCD designer screens

Team Blue
- gets to play with whatever they want

My question to you is: who would win in this hypothetical scenario?


p.s. You're making the type of argument pay2win guys make when explaining how, if you're good enough, you can destroy idiots who have sunk thousands into in-game gear. That may be true. But what about the times when you're not playing noobs? Think, McFly, think!


As I stated elsewhere, competitive gaming is on fixed hardware provided at the venue; there is no personal computer or hardware, so as to negate unfair advantage. The only things they allow you of your own are keyboard and mouse, due to comfort levels, but they are stringently checked for macros. Other than that, the systems and monitors are all the same. In the DGL, the teams with the better teamwork win.
 
As I stated elsewhere, competitive gaming is on fixed hardware provided at the venue; there is no personal computer or hardware, so as to negate unfair advantage. The only things they allow you of your own are keyboard and mouse, due to comfort levels, but they are stringently checked for macros. Other than that, the systems and monitors are all the same. In the DGL, the teams with the better teamwork win.

Ah, but I thought we're doing our own hypothetical competition just to prove a point. We are pitting exact copies against each other.

But ok. Even with just peripherals being different, who would win in our scenario?
 
AMD has always been about smoke and mirrors after 2005 or so. People buy them anyway because they have cooler heat spreaders.
 
Ah, but I thought we're doing our own hypothetical competition just to prove a point. We are pitting exact copies against each other.

But ok. Even with just peripherals being different, who would win in our scenario?


If you purposefully give one side junk hardware, then you are skewing the outcome. If you are inferring that the junk hardware is AMD, that is also incorrect, as current AMD parts game well enough not to be a limiting factor, most well over 100FPS; even with a high-quality 60Hz monitor it will not be a hindrance to a professional. But if we are saying one team gets Pentium Ds with GTX 260s and the other team gets new hardware, then the better hardware will win, just down to unplayability on the other system.
 
  • One includes a GPU, the other doesn't.
  • One is made on a more advanced process node and can OC to 5GHz on air; the other cannot.
  • One includes a pair of 256-bit AVX units and 256-bit datapaths, the other doesn't.
  • Performance is a nonlinear function of price. That is also why the R5-3600X is 20% more expensive than the R5-3600 but isn't 20% faster, or why the R9-3950X is 50% more expensive than the R9-3900X but isn't 50% faster.

Lol at the iGPU, really... o_O
Yes, it can hit 5GHz spewing out up to 250W of heat; that's pretty toasty. Did you say throttle?
Either way, another week or so till the Zen 2 launch; I think the only way Intel will be able to counter is with some more drastic price cuts.
 
Lol at the iGPU, really... o_O
Yes, it can hit 5GHz spewing out up to 250W of heat; that's pretty toasty. Did you say throttle?
Either way, another week or so till the Zen 2 launch; I think the only way Intel will be able to counter is with some more drastic price cuts.

Lower prices? Shareholders would hate that. It would be like admitting defeat, or that AMD hardware is competitive to "that" point. But I agree with you; we shall see!
 
Lower prices? Shareholders would hate that. It would be like admitting defeat, or that AMD hardware is competitive to "that" point. But I agree with you; we shall see!

Intel is already, supposedly, going to lower prices a bit. Nothing significant, but just by making a small cut they're basically admitting that AMD is competitive and that they have nothing else to counter them with.
 
Intel is already, supposedly, going to lower prices a bit. Nothing significant, but just by making a small cut they're basically admitting that AMD is competitive and that they have nothing else to counter them with.

Yes, but those price adjustments are minimal compared to what they would "need" to be from an average Joe's perspective, though I agree they're reacting.
 
Yes, but those price adjustments are minimal compared to what they would "need" to be from an average Joe's perspective, though I agree they're reacting.

I'll agree they are reacting when there are actual price adjustments.
 
This is why I used the term "supposedly". His response to me has that "supposedly" already implied, due to my use of it.

I wasn't replying to you, and threads aren't logically consistent enough here to assume that every reply assumes agreement with everything in the previous post.
 
I wasn't replying to you, and threads aren't logically consistent enough here to assume that every reply assumes agreement with everything in the previous post.

When discussing something theoretical, the fact that it is theoretical needs to be established only once. Every person in the theoretical discussion does not need to constantly reestablish the theoretical nature of that discussion. The fact that the discussion is based on a theoretical topic is implied in every response after the fact was established.

I established that the price cuts are theoretical, he responded directly to me making that statement. Therefore, the discussion is still theoretical in nature. He did not need to say "if they happen" to reestablish that the topic is theoretical as that is already implied by what he is responding to. You decided to willfully ignore the context of the conversation you were replying to in order to put up some kind of lame defense of Intel.
 
Take your bickering elsewhere, and let's get back on topic.
 
Lol at the iGPU, really... o_O
Yes, it can hit 5GHz spewing out up to 250W of heat; that's pretty toasty. Did you say throttle?

No.
The integrated GPU is not horrible. It's more than sufficient for some users. I use it mainly for Quick Sync.

250W? You have to be kidding. More like 160-185W (that's a good 65W less than you quoted). The KS is likely to do better.
...
 
So this is what all the tech press acquisitions went into.

And it's nice to see how Intel "works with" tech sites from their new employees to write articles.

View attachment 170905

“Performance are based on testing as of May 17, 2019 and may not reflect all publicly available security updates”
“Software and workloads used in performance testing may have been optimized for performance only on Intel microprocessors”

So much for REAL WORLD commitment.

Yeah.

Publishing an article with a performance comparison against Zen+ a week before Zen 2 launched is disingenuous at best. Add that they are testing without all hardware mitigation patches, and it is downright misleading.

They also don't include raw framerate numbers. If this is a "120fps vs 144fps" issue, who cares? It's irrelevant. It's like the old Q3A CPU benchmarks: "Mine is faster because I get 400fps in Q3A and you only get 380..."

At least they had the common decency to start the y-axis of the chart at zero. Truncated axes are the number one way disingenuous marketeers overstate their benefit.
 