Here are AMD's Radeon VII Benchmarks

AlphaAtlas
AMD announced their Radeon VII card yesterday, and made some interesting performance claims to go with it. AMD says their 7nm Vega Radeon is at least 25% faster than Vega 64, and showed a few benchmarks with even higher gains during their CES presentation. But in footnotes buried in their presentation and their press release, AMD spelled out the testing methods behind those claims. On January 3rd, AMD benched Vega 64 and Radeon VII on an Intel i7-7700K rig with 16GB of 3000MHz DDR4. They didn't mention the motherboard or any other hardware specifics, but they did say they used "AMD Driver 18.50" and Windows 10. All the games they tested were run at "4K Max Settings," and they averaged the framerate of 3 separate runs to get a single FPS measurement for each game.
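For the curious, that methodology boils down to a simple mean over three runs per game. Here's a minimal Python sketch of it; the per-run numbers are made up for illustration, since AMD only published the final averages.

# Minimal sketch of AMD's stated method: average the framerate of
# 3 separate runs to get one FPS figure per game, per card.

def average_fps(runs):
    """Mean FPS across benchmark runs."""
    return sum(runs) / len(runs)

# Hypothetical per-run results at "4K Max Settings" -- AMD only
# published the averages, not the individual runs.
vega64_runs = [60.2, 61.5, 61.3]
radeon_vii_runs = [86.4, 87.9, 86.7]

print(f"Vega 64:    {average_fps(vega64_runs):.1f} FPS")
print(f"Radeon VII: {average_fps(radeon_vii_runs):.1f} FPS")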

I've tabulated the benchmark data here.

Most games do indeed show a 25% gain or better, but there are clear outliers in the data. Performance in Fallout 76, for example, seems to have increased a whopping 68%. Strange Brigade saw a similarly disproportionate 42% increase, while Hitman 2 only ran about 7% faster. The Strange Brigade data is particularly interesting, as that was the game AMD seemingly used to compare performance to Nvidia's RTX 2080. Using what appears to be the same test setup as above, AMD claims Strange Brigade ran at 73 FPS on an RTX 2080, compared to 61 FPS on Vega 64 and 87 FPS on Radeon VII. Regardless, benchmarks from manufacturers have to be taken with a grain of salt, and as AMD mentions, performance on pre-release drivers can be sketchy anyway. We'll be sure to verify some of AMD's claims ourselves soon.
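If you want to sanity-check the outlier math, the uplift is just (new - old) / old. A quick sketch using the Strange Brigade averages from AMD's slide:

def uplift(old_fps, new_fps):
    """Percent performance gain going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

# Strange Brigade averages published by AMD at "4K Max Settings"
print(f"Radeon VII vs Vega 64:  {uplift(61, 87):.1f}%")   # ~42.6%
print(f"Radeon VII vs RTX 2080: {uplift(73, 87):.1f}%")   # ~19.2%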
 
*Edit: OK, working now. Thanks OP.

I'm not.

 
Not the best PR move that they used an Intel CPU to bench their new GPU.
Not that it matters to me: I'm not buying it until I see an [H] review, if even then.
 
So it's basically a 1080Ti at $700? How many years late? :(
No, it's RTX 2080 performance about three months later, at the same price or less.

This is one of the best sets of manufacturer-released performance numbers that I have seen, in that there are specifications as to what was done, how it was done, and a large number of games involved. Most of the time, you only get one or two best-case scenarios. I don't expect a well-set-up test from a reviewer to show significantly different numbers. The [H] review will do gameplay tests, so it will likely show some different numbers, but I expect the performance increase over Vega 64 will be similar.

Edit: I also don't expect to see as good an improvement at 1080p or 1440p, as I imagine the 16GB is really helping this card at 4K.
 
RTX 2080 Ti performance at RTX 2080 prices, and then we're talking. RTX 2080 performance at slightly less than RTX 2080 prices... OK, but not a game changer like Ryzen was.
Without any of the other features of RTX. Oh well, there's always next time I guess. After seeing the specs I became very interested but this is underwhelming.
 
Not the best PR move that they used an Intel CPU to bench their new GPU.
Not that it matters to me: I'm not buying it until I see an [H] review, if even then.

Ryzen 2 is not out yet, and most benchmarkers use the Intel chip for its higher (current) IPC.

This is the current industry standard. Not sure what you're crying about.
 
Without any of the other features of RTX. Oh well, there's always next time I guess. After seeing the specs I became very interested but this is underwhelming.

I am underwhelmed as well. But I am interested in seeing if AMD comes up with some type of inexpensive ray tracing solution, using a $50 GPU for example. Many of the ray tracing haters on these forums don't really understand the big-picture benefits of moving to ray tracing over traditional rasterization. Or they just hate anything that comes from Nvidia.
 
Meh. While I welcome the card's introduction, and I think it will compete quite nicely with the 2080, I am going to wait for the next gen. My Vega 64 already maxes my 3440x1440 monitor at 75 Hz with ultra settings in most games. I will wait and get something like Samsung's new 49" 32:9 1440p 144 Hz monitor and a new graphics card to drive it in a year or so. Until then, I am perfectly happy with what I have.
 
I am underwhelmed as well. But I am interested in seeing if AMD comes up with some type of inexpensive ray tracing solution, using a $50 GPU for example. Many of the ray tracing haters on these forums don't really understand the big-picture benefits of moving to ray tracing over traditional rasterization. Or they just hate anything that comes from Nvidia.

We just hate paying 1300 big ones to play at 1080p. There's no dressing that one up.
 
I am underwhelmed as well. But I am interested in seeing if AMD comes up with some type of inexpensive ray tracing solution, using a $50 GPU for example. Many of the ray tracing haters on these forums don't really understand the big-picture benefits of moving to ray tracing over traditional rasterization. Or they just hate anything that comes from Nvidia.

"Ray tracing" is a nice marketing buzz word which is rich with many ways to implement it and at its low end can look far worse than anything DirectX or OpenGL can spew out. Ray tracing does not magically guarantee an accurate image.

I do a lot of ray tracing work and I am not impressed with what I have seen from the NVidia implementation so far. Then again, my expectations are pretty high.
 
Well, this is just a node shrink, so 25% with double/triple patterning is actually good, especially since Vega was a train wreck. As others have stated, it's merely a stopgap until Navi; as long as Navi is a better arch, you should see gains from it. And from there, EUV/UVL is a lot more precise, so there should be gains from that as well, whether it's raw power, better thermals, or a lot more efficiency.

Honestly, at first I thought releasing what amounts to a 1080 Ti close to two years later was stupid, but after some thought: the 1080 Ti is sold out and price-jacked with limited quantities, and due to the higher-tier pricing on the RTX line, I can see Nvidia being forced to do a $50 price cut or concede to AMD, as the 2080 is not really an upgrade over the 1080 Ti and the 2080 Ti is way too expensive for most.

Mind you, this is expecting the Radeon VII to ship and sell at MSRP in acceptable quantities with no price gouging, which, if AMD's history is any indication, it won't. If e-tailers gouge these by even $50, it makes them a poor substitute for just buying a 2080.

2020 will be the year of sparkle and amazement; I wouldn't bother buying anything until 7nm EUV/UVL, as the most gains will come from there. And honestly, until we see 4K ultrawide 120 Hz or 4K 144 Hz monitors with OLED/MicroLED/QLED at reasonable cost and without the drawbacks, it's really just the stupidity of chasing frames for no reason, as most people are on 1080p or 1440p at 100-144 Hz. On a side note, for 1080p the RTX 2080 Ti is currently perfect for the 60-100 FPS range. I imagine as we get more games with better optimization, we might see a playable shift to 1440p at 60 FPS.
 
This old man is out of the GPU game... I miss "great GPUs for $130-170" from 2001-2015 or so. Those days seem to be long gone.
I got my GTX 1060 6GB new for $190 in June of 2017. I get ~60 FPS at 4K in most things with some settings tweaked. I'm waiting for the next card to replace this at the $200 price point.
 
I repeat, it is an awkward time to be a GTX 1080 owner. The only value upgrade out of either of these is an RTX 2080 Ti, and it doesn't remotely compare to the value I'd get spending that money on bourbon. I could shower in Woodford at these prices.

 
This old man is out of the GPU game... I miss "great GPUs for $130-170" from 2001-2015 or so. Those days seem to be long gone.
If you are only playing at 1080p on a 60 Hz monitor or slower, there are a lot of great parts out there in that range that do really well. It's once you step out of that zone that prices take off.
 
Min FPS will be the significant number to watch for. AMD did a good job here; not sure you can say it competes with the Nvidia 2000 series, but RTX is limited to a handful of games and will probably take a generation or two before the masses care. If Nvidia could get DLSS working on old titles, that would be a game changer (pun intended). If I were buying now, the bang-for-buck choice would be a 1080 Ti or Radeon VII; tough decision, as there are usually kinks to work out with new tech. I'll probably wait for the next-gen monitors to show up and keep rockin' my measly 1070 @ 1080p.
 