AMD Running DOOM Vulkan Benchmarks On Intel Hardware

Bullshit.

I kept the same hardware: the same video card, SSD, and memory. The only thing I changed was the CPU. Anyone here will tell you Intel can double your frame rates in World of Warcraft over AMD, at least compared to the 1090T I had.

Even the settings in WoW were the same. In fact, if you go back and look at posts I made here on HardOCP back in 2010 or 2011, you can see where I talked about the ordeal.

So if the CPU didn't have anything to do with it, you're saying the chipset and motherboard doubled my frame rates?

lol

Did the CPU have any other problems besides that one online game? If not, what does that tell you? He said it runs well enough, not faster...
 
Bullshit.

I kept the same hardware: the same video card, SSD, and memory. The only thing I changed was the CPU. Anyone here will tell you Intel can double your frame rates in World of Warcraft over AMD, at least compared to the 1090T I had.

Even the settings in WoW were the same. In fact, if you go back and look at posts I made here on HardOCP back in 2010 or 2011, you can see where I talked about the ordeal.

So if the CPU didn't have anything to do with it, you're saying the chipset and motherboard doubled my frame rates?

lol

I'm assuming you had to reinstall Windows, so his point about the issue being drivers or malware is still valid.
 
To be honest, WoW problems are really boring. Raiding Molten Core and Blackwing Lair, we had severe issues for some players. Something as silly as walking up the stairs in Karazhan would cause disconnects and shit frame rates to boot.

Back then, problems were most commonly fixed by WoW updates rather than hardware changes. I never heard of anyone solving those problems by dumping an AMD CPU.
 
Not any kind of a big deal at all. Most benchmarks are published against Intel CPUs, so the numbers AMD produced wouldn't be comparable if they came from AMD CPUs. And for the record, while it isn't the fastest CPU ever made, my FX-8350 (and AMD R9 290X) plays every game I have at full detail at 1080p (my monitor's resolution) just fine.


There is no game out there right now that gets any meaningful benefit from the glut of CPU power we have all been sitting on for the last five years or so. A faster CPU is nice, but it is hardly critical anymore. God, I miss the old days when even a simple OS upgrade necessitated more CPU power and RAM :)
 
Bullshit.

I kept the same hardware: the same video card, SSD, and memory. The only thing I changed was the CPU. Anyone here will tell you Intel can double your frame rates in World of Warcraft over AMD, at least compared to the 1090T I had.

Even the settings in WoW were the same. In fact, if you go back and look at posts I made here on HardOCP back in 2010 or 2011, you can see where I talked about the ordeal.

So if the CPU didn't have anything to do with it, you're saying the chipset and motherboard doubled my frame rates?

lol

You changed the CPU, therefore you had to reinstall Windows, which points it back to a software or motherboard/NIC issue. And read my post again: I said "WELL ENOUGH." Of course the Intel will run it a helluva lot faster, but I've never seen a top-level AMD chip like yours run it so poorly it was unplayable, like you are suggesting. So you are either using hyperbole to slam all AMD CPUs because you have a grudge, or you had another unresolved issue unrelated to the CPU's ability to process code. As someone else hinted earlier in this thread, there was a possible SOFTWARE issue with the game and AMD CPUs, possibly related to multithreading. Again, that is a SOFTWARE issue, not a CPU hardware issue. As I also mentioned, my fiance runs a 1075T right now in 2016, raids in WoW, and has zero issues hitting 100+ fps. My suspicion, then, is that it was a software issue that has long since been patched out by Blizzard.

So I don't doubt you had issues back then, but my point is that if you tried the same setup again today, I doubt you would recreate them.
 
I wanted to ask you guys what your thoughts on this were. According to AMD, Radeon graphics takes DOOM to the "next level with Vulkan implementation," but they are using Intel CPUs during their tests. More than a few of you feel that it is a bit odd to see AMD using Intel hardware when it has CPUs / APUs of its own. So, is this a big deal to you? Not a big deal? I enlarged the text from the link above so you could read it.

"More than a few of you feel that it is a bit odd to see AMD using Intel hardware when it has CPUs / APUs of its own."

Only AMD fanboys feel this way.

Anyone that knows their shit knows better.

AMD hasn't produced a competitive gaming CPU in years, and if you want to show your product in the best light, you choose an Intel CPU.

Even AMD knows this... the only people that don't are those in denial.
 
Using AMD CPUs would make their GPUs look bad and hurt sales.

It's that simple. I don't care. Frankly, you want the CPU and GPU divisions to operate independently except for making APUs.
 
B-b-b-b-but AMD claimed to have Doom running on Summit Ridge. lol, AMD's still not showing the processor running, suggesting it's still about a year away.
 
Actually, what's even more interesting than AMD using an Intel CPU in this announcement is that they're using a $1k Intel CPU to do it. I think few people would pair a 480 with a 5960X.
 
True Story - Using AMD got me kicked out of my WoW Guild.

I was running an AMD CPU in a new build, I think it was the 1090T, with a video card I can't remember. This was around 2010 or 2011. Anyway, I was in a high-end guild doing heroics (now called mythic), and my frame rates were causing all kinds of shit during boss pulls. They had to pull me from the raid, then I was moved to inactive, and so I was basically forced to quit the guild.

I had to sell the system and rebuild using Intel, a 2600 or a 920 or 940, I can't remember. Long story short, I saw my frame rates double and triple.

AMD makes horrible gaming CPUs. I don't care what people say.

That's a lot you can't remember. Until you can post exactly which CPU, motherboard, and GPU you swapped out and what you swapped them with, you might as well be comparing apples and broomsticks.
 
I wanted to ask you guys what your thoughts on this were. According to AMD, Radeon graphics takes DOOM to the "next level with Vulkan implementation," but they are using Intel CPUs during their tests. More than a few of you feel that it is a bit odd to see AMD using Intel hardware when it has CPUs / APUs of its own. So, is this a big deal to you? Not a big deal? I enlarged the text from the link above so you could read it.


AMD has been using Intel hardware for their benchmark material for the last three to four years; if you go back and look at the small print in their benchmark announcements, it's almost always an Intel motherboard/CPU. At least they acknowledge that they don't have the top-of-the-line hardware and understand not to hold back the product they're selling, so it's a smart move on their part to make sure it shows up in the best possible light. Nothing wrong with it at all, in my opinion.
 
AMD has been using Intel hardware for their benchmark material for the last three to four years; if you go back and look at the small print in their benchmark announcements, it's almost always an Intel motherboard/CPU. At least they acknowledge that they don't have the top-of-the-line hardware and understand not to hold back the product they're selling, so it's a smart move on their part to make sure it shows up in the best possible light. Nothing wrong with it at all, in my opinion.

I agree, I don't think there's anything wrong with using Intel CPUs. But I think using a top-of-the-line $1k Intel CPU is wrong for this kind of card, as almost no one buying a 480 is going to put it in that kind of system. Not that the numbers would probably be much different using a more modest i7-6700K; it's just not a real-world pairing of hardware.
 
I agree, I don't think there's anything wrong with using Intel CPUs. But I think using a top-of-the-line $1k Intel CPU is wrong for this kind of card, as almost no one buying a 480 is going to put it in that kind of system. Not that the numbers would probably be much different using a more modest i7-6700K; it's just not a real-world pairing of hardware.
I don't think it's about "real-world pairing" per se. I think it has to do with eliminating any chance of a CPU bottleneck at all costs, to show the true potential of the GPU for marketing reasons. Throwing the fastest CPU with the most physical cores/threads at the problem would be ideal, especially for DX12 multithreading.
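
To put some toy numbers behind the bottleneck argument, here's a minimal sketch (my own illustration, not anything from AMD's footnotes; all the per-frame costs are invented) that models each frame as limited by whichever of the CPU or GPU finishes last. With a slow CPU the benchmark ends up measuring the CPU; only once the CPU is fast enough does the GPU's real throughput show up in the number.

```python
# Toy model: a frame can't finish faster than the slower of the two stages.
# All millisecond figures below are invented purely for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 8.0  # hypothetical GPU cost per frame (~125 FPS potential)

for label, cpu_ms in [("slow CPU", 14.0), ("mid CPU", 9.0), ("fast CPU", 5.0)]:
    print(f"{label}: {fps(cpu_ms, GPU_MS):.0f} FPS")

# slow CPU: 71 FPS   <- the benchmark is really measuring the CPU
# mid CPU: 111 FPS
# fast CPU: 125 FPS  <- the GPU's full potential is finally visible
```

Under this (admittedly simplified) model, dropping the fastest CPU you can buy into the test rig is just a way of making sure the max() is always the GPU term, which is exactly what a GPU marketing slide wants.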
 
Current year and we're still having this discussion? No one pairs a mid-range GPU with a mid-range CPU to show just how mid-range their performance is. That said, the difference is negligible between "Extreme" CPUs and your typical unlocked K processor for gaming these days.
 
Nothing really new; even the 7990 release video mentioned at the end that it was running an Intel CPU. Go to 2:24 in the video.
 
I'm cool with the problem not being AMD, but if it wasn't AMD and I didn't change any hardware other than the CPU to Intel... what was it?

I've heard other similar stories of people moving from AMD to Intel and seeing huge frame rate increases.

If changing the OS and motherboard doubled my frame rate and not the Intel CPU, then fine, so be it. Pretty amazing if you ask me. Like I said, I'm OK with whatever happened as long as I got away from that AMD CPU, that's for sure.
 
I agree, I don't think there's anything wrong with using Intel CPUs. But I think using a top-of-the-line $1k Intel CPU is wrong for this kind of card, as almost no one buying a 480 is going to put it in that kind of system. Not that the numbers would probably be much different using a more modest i7-6700K; it's just not a real-world pairing of hardware.
Devil's advocate here. The reason they may want to pair it with such a high-end CPU would be to eliminate any possible bottlenecks, thus ensuring a consistent and pure performance metric.
 
Devil's advocate here. The reason they may want to pair it with such a high-end CPU would be to eliminate any possible bottlenecks, thus ensuring a consistent and pure performance metric.

And I understand that. But that doesn't change the fact that few people buying a budget card are going to pair it with a $1k CPU. It makes absolutely no sense to spend $240 on a GPU and pair it with a $1k CPU if the idea is to maximize gaming performance on a tighter budget. It just seems that a much more reasonably priced i7-6700K would be a more sensible pairing, and I'm guessing it wouldn't change the results significantly. Indeed, if going to an i7-6700K would make a big difference, that calls into question the value proposition of the 480, if it actually needs an Intel Extreme-class CPU to flex its muscles.
 
And I understand that. But that doesn't change the fact that few people buying a budget card are going to pair it with a $1k CPU. It makes absolutely no sense to spend $240 on a GPU and pair it with a $1k CPU if the idea is to maximize gaming performance on a tighter budget. It just seems that a much more reasonably priced i7-6700K would be a more sensible pairing, and I'm guessing it wouldn't change the results significantly. Indeed, if going to an i7-6700K would make a big difference, that calls into question the value proposition of the 480, if it actually needs an Intel Extreme-class CPU to flex its muscles.
I do agree it would be nice to see it benchmarked with mid-tier CPUs, maybe even an AMD APU, to see how it does in a budget build.
 
Who knows... at the time, the 1090T had just come out and I might have gotten caught up in all the marketing for the CPU: how great it was, six cores, affordable, etc. The peanut gallery was echoing this, and I just got caught up in the moment and decided to buy the parts and build it. At the time I do remember having some serious concerns. I also run high-end GPUs, memory, SSDs, etc., within reason.

It's not important now. Like everyone else here, I am waiting on the new AMD CPUs to see how they perform. If they review well, who knows, I might build another AMD-based system. :)
 
Played through a bit using Vulkan on my FX-8320 and R9 380; it felt really smooth and fast at 1080p. I also switched the FOV to 120 (I had no idea that was even an option) and the game felt great.
 
I do agree it would be nice to see it benchmarked with mid-tier CPUs, maybe even an AMD APU, to see how it does in a budget build.

I've not looked at AMD CPUs in forever and just went on Newegg to price AMD's most expensive desktop CPU. I guess that's the FX-9590 at $200. The last AMD CPU I bought, I think in 2005, cost me over twice that.
 
I've not looked at AMD CPUs in forever and just went on Newegg to price AMD's most expensive desktop CPU. I guess that's the FX-9590 at $200. The last AMD CPU I bought, I think in 2005, cost me over twice that.
They have allowed their CPUs to lag considerably over the years, which has lots of AMD fans on edge to see what Zen does. As a diehard Athlon 64 fan whose original dual-core lasted me a good long time until I upgraded to an i7-3770, I am eager to see AMD return to competition.
 
I don't care what CPU they use, I just want to see Vulkan succeed and DX12 fail. :)

As long as there is an Xbox and about 95% of PC clients run Windows, with close to half of those on Windows 10, DX12 isn't going to fail. Maybe Vulkan support will become more prevalent, however.
 
They have allowed their CPUs to lag considerably over the years, which has lots of AMD fans on edge to see what Zen does. As a diehard Athlon 64 fan whose original dual-core lasted me a good long time until I upgraded to an i7-3770, I am eager to see AMD return to competition.

Sure, it would be nice to see AMD compete at the higher end. It's not healthy for the industry when AMD has to sell its best CPUs for $200 while Intel can charge much more simply because they have a much better product.
 
I think it depends on what we want to prove with Vulkan: is it more efficient on the CPU, the GPU, or both?

IIRC, AMD's Mantle was geared towards being more efficient on the CPU, and in that situation it would make sense to use an AMD CPU to show that these games work well on a low-cost CPU.
 
The real sad part is that [H] hasn't kept up with the rest and doesn't have performance tests of its own, so they are left talking about the small print in an AMD article. Others already beat them to the punch and have found truly interesting things, like Nvidia gaining NOTHING from Vulkan, which is far more interesting to discuss.

Those are lucky numbers; on my GTX 1080 I lose 20-25% on average using Vulkan vs. OpenGL.
 
It's really simple, I think: most reviews are done with Intel CPUs. So why would you release your latest benchmarks using an AMD CPU that performs worse and isn't comparable to the 95% review baseline, i.e. an Intel CPU?

If you want to belittle AMD's gains, I guess you nitpick and find this as a negative. I just think it's common sense.

Fury X on par with a 1070: likely in this game. RX 480 beating a GTX 1060 in this game: likely (we will see). It's another win for AMD and another net-zero gain for NV with a modern API.

I wonder if [H] will now use this game in their 1060 review? I won't hold my breath.
 
No, AMD knows Bulldozer sucks. I mean, hell, we don't have to tell them that; just look at benchmarks around the web. Hell, look at their stock!

But if they had Zen out, I bet you they would be using it in the benchmarks, unless they have another faildozer.
 
This doesn't surprise me. If they used one of their own CPUs, it would be damn near free press for Intel once someone paired an LGA 2011 chip with the same Radeon card and showed its superiority. Smart of AMD to leverage what is marketed as the fastest available consumer parts right now. They can't afford to hold off until Zen to push their new GPU parts.

Barton, anyone? That's the last one I remember that was a better option than Intel... I liked my 2500+.

I'd beg to differ because of the first A64 X2 Manchester offerings in 2005... AMD completely owned Intel for a small jaunt until the "Core"-based Conroe came out in 2006.

Ah, the days of the $800+ Athlon 64 X2 4800+... so glad those CPU price points are gone from the mainstream and performance segments.
 
I don't think it's about "real-world pairing" per se. I think it has to do with eliminating any chance of a CPU bottleneck at all costs, to show the true potential of the GPU for marketing reasons. Throwing the fastest CPU with the most physical cores/threads at the problem would be ideal, especially for DX12 multithreading.

That is the exact opposite of what DX12 is supposed to accomplish. DX12 allows performance improvements for lower-end CPUs to help alleviate that bottleneck. In fact, the faster the processor, the less improvement you'll see, although it does tend to help alleviate CrossFire and SLI bottlenecks as well.
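
As a rough illustration of that last point, here's a toy model (my own sketch with invented per-draw-call costs, not measurements from DOOM, DX12, or any real driver): if a low-overhead API mainly cuts the CPU cost of submitting draw calls, a slow CPU that was CPU-bound sees a big jump, while a fast CPU that was already GPU-bound sees almost none.

```python
# Toy model: the only thing the "low-overhead API" changes here is the CPU
# cost per draw call. All numbers are invented for illustration.

DRAW_CALLS = 5000
GPU_MS = 8.0  # hypothetical GPU cost per frame
US_PER_CALL = {"high-overhead API": 4.0, "low-overhead API": 1.0}

def fps(cpu_speed: float, api: str) -> float:
    cpu_ms = DRAW_CALLS * US_PER_CALL[api] / 1000.0 / cpu_speed
    return 1000.0 / max(cpu_ms, GPU_MS)  # frame limited by the slower stage

for label, speed in [("slow CPU", 1.0), ("fast CPU", 3.0)]:
    old, new = fps(speed, "high-overhead API"), fps(speed, "low-overhead API")
    print(f"{label}: {old:.0f} -> {new:.0f} FPS ({new / old - 1:+.0%})")

# slow CPU: 50 -> 125 FPS (+150%)  <- the low-end CPU gains the most
# fast CPU: 125 -> 125 FPS (+0%)   <- already GPU-bound, nothing left to gain
```

Which is also why, if the goal of a vendor slide is to show off the GPU rather than the API's CPU savings, the $1k CPU is the safe choice.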
 
Actually, what's even more interesting than AMD using an Intel CPU in this announcement is that they're using a $1k Intel CPU to do it. I think few people would pair a 480 with a 5960X.

Maybe they wanted something slower than the upcoming Zen, and to say goodbye to using Intel hardware in a ceremonial way. :)
 
As long as there is an Xbox and about 95% of PC clients run Windows, with close to half of those on Windows 10, DX12 isn't going to fail. Maybe Vulkan support will become more prevalent, however.

Just what I expected from you.

id Software Dev Puzzled By Devs Choosing DX12 Over Vulkan, Claims Xbox One DX12 Is Different Than PC « GamingBolt.com: Video Game News, Reviews, Previews and Blog

"Speaking on Twitter, Gneiting said that developers using DirectX 12 over Vulkan ‘literally makes no sense.’"
 
As long as there is an Xbox and about 95% of PC clients run Windows, with close to half of those on Windows 10, DX12 isn't going to fail. Maybe Vulkan support will become more prevalent, however.

If that were true, you wouldn't be sweating it enough to MS-patrol any comment about Vulkan appearing on this board. If we assume your best-case hypothetical is true and there's a 50-50 split, a developer still needs to support DX11 + DX12. Or... just use Vulkan. Gee, which one is simpler, hits more platforms, and benefits more people?

Xbox? A non-factor, since the proprietary "DX12" of the Xbox One is DX12 in marketing name only; it's a separate codepath.

I don't think we'll see any DX12-only games any time soon, at least not from non-MS studios; it makes no financial sense for them.
 
Just what I expected from you.

id Software Dev Puzzled By Devs Choosing DX12 Over Vulkan, Claims Xbox One DX12 Is Different Than PC « GamingBolt.com: Video Game News, Reviews, Previews and Blog

"Speaking on Twitter, Gneiting said that developers using DirectX 12 over Vulkan ‘literally makes no sense.’"

Just what you expected of me, and this guy is complaining about devs supporting DX12. That's too funny. What I find interesting from that post is this:

Speaking on Twitter, Gneiting said that developers using DirectX 12 over Vulkan ‘literally makes no sense.’ Elaborating on his stance, and in response to some questions, Gneiting pointed out that with Windows 7 forming a major chunk of the PC gaming market, and with DirectX 12 being incompatible with Windows 7, using DirectX in an attempt to have ‘one codebase’ makes no sense, since developers would need to create two separate ones anyway. He pointed out that the argument that programming for Xbox One and Windows 10 becomes easier by using DirectX 12 is moot too, because DirectX 12 on Windows and on Xbox is very different, necessitating two separate code paths anyway.

Read more at id Software Dev Puzzled By Devs Choosing DX12 Over Vulkan, Claims Xbox One DX12 Is Different Than PC « GamingBolt.com: Video Game News, Reviews, Previews and Blog

So a seven-year-old OS that's on its way out, and that may very well be unsupported by the time many of these titles hit the market, is the reason devs should use Vulkan. No mention of Linux or Android here. I'm not saying he doesn't have a point with the cross-platform argument, just that Windows 7 is going to decline more and more in the coming years, and I doubt most gamers buying new Windows games will be on 7.
 