Gideon
2[H]4U
- Joined
- Apr 13, 2006
- Messages
- 3,008
He is not coming back. The former review staff have a site, though you might have to wait a bit to get a review until they get established enough to get sampled.
ssshhhhhhttt, we can always hope.
Opinions are biased by definition. Calling GN "Intel biased" is a new level, though!
What are you even talking about? There is no limitation on the hardware at all and no restrictions. You are obviously talking out of your ass.
Which is why "pro" gaming is rule-bound: they limit it to 120hz so that hardware is not the limitation but the individual. About 95% of gamers are not "pro", and the need for 240hz is merely an excuse to spend money on something. 60 to 120hz was major; from 120 to 144hz I noticed no real perceivable difference.
And in the end AMD or Intel its normally going to be a Graphics card limitation.
How many vulnerabilities and fixes have been put in place since then that possibly affect streaming? How many did GamersNexus have in place when they tested? What streaming conditions did they use, what resolution, etc.? There are so many variables at play here.
I also have the opinion, just from various pieces GamersNexus has produced, that they are very biased towards Intel. So, it might as well be Intel making that claim; it means about as much to me.
My Commodore 64 had a 1 MHz CPU... No overclocking though.
What are you even talking about? There is no limitation on the hardware at all and no restrictions. You are obviously talking out of your ass.
That "about 95%" figure is way off; the pro scene is even smaller than that, and it's hard to make it to the top.
The 240hz is not "merely spend money on something" as you don't have to be a pro to take advantage of the monitor. Any high competitive player can take advantage of 240hz.
I can easily tell the difference between 120hz, 144hz, 165hz and 240hz. I also have a custom 270hz monitor, and it's possible to buy custom 300hz monitors now.
Just because you did not see or feel any perceivable difference it doesn't mean others can't.
It depends on the game. The CPU matters for frame dips and minimum framerates.
You really have no clue about what you are talking about.
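For what it's worth, the diminishing returns both sides are arguing about can be put in plain numbers; this is just a quick arithmetic sketch (not from any post in the thread) of the frame-to-frame interval at common refresh rates:

```python
# Frame interval at common refresh rates: each step up in Hz buys a
# smaller absolute latency reduction than the previous one.
for hz in (60, 120, 144, 165, 240, 300):
    print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms per frame")
```

The jump from 60 to 120hz cuts over 8 ms per frame, while 120 to 144hz cuts under 1.5 ms, which is consistent with the "60 to 120 was major" observation even if people disagree about whether the smaller steps are perceivable.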
You haven't been reading too many threads around here the past few years, have ya?
And how exactly would that be a "real world test", being that in the "real world" no one cares about trying to match their CPU and GPU brand? Otherwise Nvidia would be in trouble, I guess?
I remember the pencil trick.
Says the kid...
Was that on the AMDs? I don't remember. Jumpers were fun. Oh man, I woke up again.
Oh. That's low.
My Commodore 64 had a 1 MHz CPU... No overclocking though.
All your points are correct.
I would make that choice anyway. AMD is just not giving an alternative for it, and it's going to be a long uphill battle on this one due to market share.
If this holds in 3rd party testing, Ryzen 3000 just crushes Intel on the desktop, where Intel were already losing the DIY desktop market to Ryzen 2000.
Their Intel compiler fuckery.
AMD has, it’s called AOCC. The differences in speed are small though, and no one in their right mind will optimise just for AMD for general apps (or games) with Intel’s install base
Thank you for updating me. I didn't know they finally made one.
In this case developers could make two .exe's, one compiled with ICC and one with AOCC. But I'm wondering if they have to adjust the code for the different compilers, and then we're back to the effort needed for improving a small install base.
No, you generally don't need to tweak the source. But even dual-compiling and supporting multiple binaries is a hassle. I work on big data-crunching libraries, and even with that, ICC is just a couple percent faster on Intel processors. Not worth the deployment and support hassle.
We're even less likely to do such things for AMD, of course. But luckily, in my experience it doesn't matter very much at all. Note: I do not currently work in game development. It's possible there is a different picture there (but I doubt it).
There was a time when ICC had a bigger lead, and processors in general had less relative oomph so it was more appealing, but that ship has sailed IMO. The general compiler output from the agnostic compilers is quite good.
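The "two .exe" idea above could be as simple as a small launcher that checks the CPU vendor and starts the matching build. A hypothetical sketch; the binary names are made up, and the vendor strings are the standard x86 CPUID identifiers:

```python
# Hypothetical launcher for the dual-binary idea: one build compiled with
# ICC for Intel CPUs, one with AOCC for AMD. Binary names are illustrative.
def pick_binary(vendor_id: str) -> str:
    # x86 CPUID vendor strings are "GenuineIntel" and "AuthenticAMD"
    return "app_icc.exe" if vendor_id == "GenuineIntel" else "app_aocc.exe"

print(pick_binary("GenuineIntel"))   # app_icc.exe
print(pick_binary("AuthenticAMD"))   # app_aocc.exe
```

As the posts above note, even this trivial dispatch doubles the build, test, and support surface, which is why most shops just ship one agnostic binary.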
Bottom line, we seem to agree. Compiler optimization for Intel's 80% of the market (BS number) is going to be done way before AMD optimization for AMD's 20% of the market (again, a BS number).
I like both Intel and AMD, but am rooting for AMD right now. Zen 2 is history repeating itself again. This kind of competition gives Intel the kick in the pants it needs to actually innovate and be competitive pricewise again. Unless you are a pro gamer, I would argue that those extra couple of frames you *might* get from an overpriced Intel part will not be worth it. Intel, factor price into your statement and there really is no fight. The only other advantage Intel holds right now is Quicksync and some AVX512 workloads. That said, Intel will definitely be a consideration after my Zen 2 upgrade *when* they bring the competition back in both price and performance.
And for all you people that might try to argue that you need a new motherboard to get the gains from Zen 2, you have your head in the sand. This is not the case and has been verified. You get PCIe 4.0 from x570, but that is pretty much it and has a minuscule impact on almost all real world workloads.
So, this is what Intel looks like without Kyle Bennett: desperate.
Nope, they're just still ahead of AMD in gaming. Maybe someday AMD will catch up to Intel's six-year-old architecture.
I can’t stop losing IQ points when I read your posts.
Nah, this is just Intel using AMD's hard-acquired marketing team. No one believes you anymore.
Nope, they're just still ahead of AMD in gaming. Maybe someday AMD will catch up to Intel's six-year-old architecture.
Are there 3xxx benches you guys are bickering over?
Intel w/ Nvidia = Marks of the Beast
Probably more than you, but if you want to actually elaborate on your comment, and what you think I missed, feel free.
Of course, Intel hasn't done anything except tweak their 6-year-old architecture either. I see very little reason to spend $500 on a 9900k in light of a $329 3700X. I don't think you'll see 40% more gaming performance for 40% more price. I can't imagine the difference will be more than +/- 10%.
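Taking the prices quoted above at face value, the premium is easy to check with plain arithmetic (and comes out a bit higher than the 40% figure):

```python
# Price premium of a $500 9900K over a $329 3700X, per the post above.
premium = (500 - 329) / 329
print(f"{premium:.0%}")  # roughly 52% more expensive
```

So even a 40% gaming lead, which no review showed, would not fully cover the price gap at those numbers.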
Intel w/ Nvidia = Marks of the Beast
AMD w/ AMD = Enlightenment achieved
But what if I run an AMD GPU next to an Nvidia GPU on my Intel desktop? Next-level enlightenment?
Of course, Intel hasn't done anything except tweak their 6-year-old architecture either. I see very little reason to spend $500 on a 9900k in light of a $329 3700X. I don't think you'll see 40% more gaming performance for 40% more price. I can't imagine the difference will be more than +/- 10%.
The 2600 / 2600X are the 'sweet spot' for gaming until these Ryzen 3000-series CPUs hit. If you need higher framerates, Intel has you, and below that, it's a bit of a tossup depending on system budgets and applications.
Then you sound like a crypto-miner which might get you taken outside the city limits and stoned by the general population around here.
I don't waste cycles or power on imaginary money.
Nope, they're just still ahead of AMD in gaming. Maybe someday AMD will catch up to Intel's six-year-old architecture.
I didn't know Amazon was the only place to buy computer parts.
We should tell Intel that whatever they bring to the party has to be available at retail... i.e., in stock.
...and you somehow think AMD is the loser here. It's amazing how some people's thought process works. Simply amazing.
You're going to have to quote me on saying "AMD is the loser here". You're trolling and stating things that I did not say.
[hint: you won't find the quote]