Not really sure what you're on about with regard to FreeSync/G-Sync. With 144Hz, 165Hz, and 240Hz monitors you are probably going to need that extra 12%.
"I'm just really hoping Intel realizes they are soon to be no longer the top dog and adjusts pricing accordingly. Even HP Enterprise is dropping the Xeon in favor of Epyc chips."

They aren't dropping Xeons (unless I missed a news item); they're just recommending Epyc as an alternative to (some) customers in the face of current supply constraints.
And these people are not going to notice this extra performance compared to the cheaper CPU from the next tier down.
Yeah, if you are going for frames at 1080p. Let me know when you see a card push 144fps, let alone 240fps, consistently at ultra, even at 1080p. That is the whole purpose of FreeSync and G-Sync. I don't get this idea of chasing Hz! How many gamers actually drop quality on purpose to increase their frames? "Hey, let's build a 4K gaming rig, drop the quality, and chase that fps" makes no sense.
"Paper tape? What luxury, I still have the cards. You fatcats enjoy your paper tape!"

80-column cards were far better than paper tape; you just had to remember to set up the program ("drum") card on your IBM 029 properly.
"I used to have to load a 16K program into a PDP-11 with no console; that was a piece of test equipment."

I used to have to enter the boot loader into a PDP-11 using the front-panel toggle switches, one word at a time. Every time the power failed.
G-Sync with a lower framerate doesn't look or feel anywhere near as smooth as just having a higher framerate/refresh rate in the first place, at least for me. It's a nice thing to have if your framerate is dropping low, but it's no substitute for just having more frames. I agree pushing for 144Hz and 240Hz is a bit pointless, though. I love gaming at high refresh rates (which is why I'm still using a 1080p G-Sync monitor), but beyond maybe 100Hz there are seriously diminishing returns.
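The diminishing returns are easy to see if you compare frame times rather than refresh rates. A quick back-of-the-envelope sketch (plain Python, using only the refresh rates mentioned in this thread):

```python
# Frame time (ms) at each refresh rate, and the time saved by each step up.
rates_hz = [60, 100, 144, 240]

for prev, curr in zip(rates_hz, rates_hz[1:]):
    t_prev = 1000.0 / prev   # ms per frame at the lower rate
    t_curr = 1000.0 / curr   # ms per frame at the higher rate
    print(f"{prev:>3} Hz -> {curr:>3} Hz: "
          f"{t_prev:.1f} ms -> {t_curr:.1f} ms per frame "
          f"(saves {t_prev - t_curr:.1f} ms)")
```

Going from 60Hz to 100Hz shaves about 6.7ms off every frame, while 144Hz to 240Hz shaves only about 2.8ms, which is why each step up is harder to notice than the last.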
Had to periodically clean the fingers on the cards with a pencil eraser too.
"…beyond maybe 100Hz there are seriously diminishing returns."
I completely agree. My other rig is a 2600K/1080 Ti/1440p 144Hz G-Sync setup. I play with everything at ultra/manually maxed. Most modern games I play are so demanding they average 60-110fps; very, very rarely do I see them hit 144fps. I can't really tell the difference above 120, but the sweet spot seems to be ~90-110. SOTTR gave occasional drops into 45-55, and even though there was no noticeable tearing, the drop was definitely perceivable. At this point I'd rather see increased color accuracy/HDR/contrast ratios than Hz.
"How many gamers actually drop quality on purpose to increase their frames?"
My clients do, if I read your comment and phrasing correctly.
If asked to build a rig without any preferences except that it just runs, then I can build one with the goal of 90% of the performance at 70% (or less) of the price.
Last year and this year I built such rigs: a $1000 build that performs at 90% or better of many $1400 builds.
So I just saved my client $400 on something where they will hardly notice the difference. It adds up: you can build four competitive rigs for the price of three bought elsewhere.
This is why I choose certain parts at certain price points: I know the value they deliver to my customer base, and everyone is happy.
This info does not come free or easy. It's 28 years of building rigs and going on 32 years overall (cry), longer if you count me messing with Trash-80s back in late '79.
Or messing with teletypes (a landline to the local university at the time) trying to learn computer programming... god, this dates me...
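To make that value argument concrete, here is a minimal sketch using only the $1000/$1400 prices and the 90% figure quoted above:

```python
# Value comparison for the two builds described above:
# a $1000 build at ~90% of the performance of a $1400 build.
budget_price, budget_perf = 1000, 0.90
flagship_price, flagship_perf = 1400, 1.00

print(f"budget perf per $1000:   {budget_perf / budget_price * 1000:.2f}")
print(f"flagship perf per $1000: {flagship_perf / flagship_price * 1000:.2f}")

# The "four rigs for the price of three" claim:
print(f"4 budget rigs:   ${4 * budget_price}")    # $4000
print(f"3 flagship rigs: ${3 * flagship_price}")  # $4200
```

Under those numbers the budget build delivers roughly 26% more performance per dollar, and four of them do come in under the cost of three flagship builds.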
"Which people? You cannot speak for everyone, as everyone has different goals."

Gamers, as this CPU is obviously aimed at gamers, since the test results are all games. Obviously a 9900K is good for productivity as well.
"I'm just really hoping Intel realizes they are soon to be no longer the top dog and adjusts pricing accordingly. Even HP Enterprise is dropping the Xeon in favor of Epyc chips."

I'm not happy with AMD either, as AMD didn't exactly keep their CPU prices in line with their own previous generation. AMD is pricing their products based on what Intel charges, and Intel long ago left the realm of reasonable prices. You can't expect game developers to use more than 2-3 cores if the majority of people have 2-4 core CPUs. Anything above $200 is something people avoid paying for a CPU, and the benefit to games beyond 4 cores diminishes greatly.
Which people? You cannot speak for everyone as everyone has different goals.
This isn't new though, right? Incremental performance at the top end is never on a linear perf/$ line. You want top dog? Pay out the arse.
Yeah, other people would be much better served with a 2700X, a 9700, or a rung down from those, if they give a crap about perf/$. Some don't.
...which is what we expected?
Yes, you're going to pay for that extra 12%+. That's also to be expected, and as alluded to above, you cannot get this performance in an AMD product so Intel is free to price as they wish.
"Unless you buy a 2080 Ti."

That is not entirely accurate. Simply slamming the sliders to the right and maxing things out, at least at 4K, does not automatically mean the game looks better. Reducing some settings to get playable framerates while still looking good is the thing to do. At 1080p, higher fps with a monitor that can match probably does make a difference; I just have never tried it myself.
Somehow I don't think paying 66% more money for only a 12% performance difference is very economical.
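Put in performance-per-dollar terms, a rough sketch using just the 66% and 12% figures from the post above:

```python
# Marginal cost of the last few percent of performance, using the
# 66%-more-money / 12%-faster figures from the post above.
price_ratio = 1.66   # the top chip costs 66% more
perf_ratio = 1.12    # ...and is 12% faster

print(f"relative perf per dollar: {perf_ratio / price_ratio:.2f}x")  # ~0.67x
# You keep only about two thirds of your performance-per-dollar
# when you pay up for the last 12%.
```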
Seeing as only those with money to burn upgrade every single cycle.
Go AMD... save enough money to upgrade to a new AMD chip next cycle, because all you will have to do is pick up a new CPU and plop it in. Counting the eBay resale value of your still-current, year-old AMD chip, it's likely you will have enough money to do it again the cycle after.
No doubt your twice-upgraded, equal-money AMD chip will destroy your now 2-3 year old Intel chip.
Intel systems are buy-complete-once type deals. Chances are the next 2-3 Zen revisions will be drop-in upgrades.
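A rough sketch of that upgrade-path math; note that every dollar figure below is a hypothetical placeholder, not a price from this thread:

```python
# Two-upgrade-cycle cost sketch. ALL dollar figures are hypothetical
# placeholders chosen for illustration, not prices from this thread.
cpu_price = 300        # hypothetical price of each new AMD CPU
resale_value = 150     # hypothetical eBay resale of the outgoing chip
platform_price = 800   # hypothetical Intel CPU + motherboard, bought once

# AMD path: keep the same board, sell the old chip each cycle.
amd_total = cpu_price + 2 * (cpu_price - resale_value)
print(f"AMD, initial chip + two drop-in upgrades: ${amd_total}")  # $600

# Intel path: one buy-complete system kept for the whole 2-3 years.
print(f"Intel, one complete system: ${platform_price}")           # $800
```

Under those made-up numbers, the AMD box ends the period with a current-generation chip for less total spend, which is the point the post is making.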
"Yes, but that is only 12% in gaming performance. I would bet it isn't 12% better in other, non-gaming tasks."

If you look at the Principled Technologies report, a number of the games chosen are old and favor Intel a lot, like World of Warcraft, which is a 2004 game engine with updates that runs 30fps faster on Intel vs. the 2700X. Also, that's not an easy game to benchmark, as it's an online game with no built-in benchmark. They do have DX12 right now, and multithreading will be added in patch 8.1, but for now the game has always favoured Intel. Just ask the MMO-Champion guys, as they circle-jerk Intel chips for this reason.
"I'm just really hoping Intel realizes they are soon to be no longer the top dog and adjusts pricing accordingly. Even HP Enterprise is dropping the Xeon in favor of Epyc chips."

LOL right.
"LOL right."

Any links for this?
HPE is full-on XEON. They’re moving so many XEONs out the door they can’t keep up with current orders.
Very few Epyc offerings right now - hopefully that’ll change soon.
Check your facts.
"Gamers, as this CPU is obviously aimed at gamers since the test results are all games. …"
The same people you referenced in your original quote: "if there is only one product at the top, is how much extra are people willing to pay for that last % of performance."
I didn't say everyone.
"Any links for this?"

Yeah, like I'm right in the middle of buying a bunch of Gen10 servers right now.
"Yeah, like I'm right in the middle of buying a bunch of Gen10 servers right now."

Sure sure, I understand.
Been in tons of roadmap meetings, Cascade Lake meetings, etc...
"Sure sure, I understand."

Then why are you so clueless?
"Then why are you so clueless?"

Other way around; you post something and provide no facts.
"Other way around; you post something and provide no facts."

Uh, YOU posted your false nonsense first with no facts. Where's your proof HPE is "dropping" XEONs? You got called out on that twice and still haven't provided any proof. We're all waiting for this great industry insight you have.