Here are AMD's Radeon VII Benchmarks

Looking forward to getting this card. It's a better option than a 1080 Ti simply because of performance issues on the Nvidia card while running HDR. I own an HDR FreeSync monitor, and BFV looks great in HDR; the difference is most noticeable on the Arras map.
[H] should do an article on HDR, as it's not covered that widely yet. I'm confident HDR adds more value than the gimmicky RTX, simply because it already works great in so many titles.
 
This is the way I see it... this is the high-end segment. People have been yelling and screaming for months about how AMD doesn't have a product to compete with Nvidia in the high-end segment. AMD invested a ton of money in their Vega architecture and it underperforms. Raja, who was heading the project, left because his best engineers were moved to the much more successful and profitable Zen architecture and to Navi, which has its R&D costs shared by both Sony and Microsoft. More than likely there are contractual agreements where AMD can't announce Navi until Sony or Microsoft are ready to announce their next-gen consoles. In the meantime, you've invested heavily in 7nm. Why not recoup some of your investment and answer the public demand for a product that can compete with Nvidia's second-highest product offering?

At $700 you're paying 30% more for 30% more performance, plus three AAA games, on a card that can drive 4K/60, 1440p/120, or 1080p/240. This will satisfy all the AMD fanbois who want to play at these resolutions and refresh rates with an AMD product.
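To put that value claim in concrete terms, here's a minimal sketch of the arithmetic (Python; the 30% figures are the post's numbers, not benchmark results):

```python
# Hedged sketch of the "30% more money for 30% more performance"
# claim above. The 1.30 factors come from the post, not benchmarks;
# "baseline" is whatever card the poster is comparing against.

price_ratio = 1.30  # Radeon VII costs ~30% more than the baseline card
perf_ratio = 1.30   # ...and is claimed to perform ~30% better

value_ratio = perf_ratio / price_ratio
print(f"perf per dollar vs. baseline: {value_ratio:.2f}x")  # -> 1.00x

# A ratio of 1.00x means price/performance is flat: you pay linearly
# for every extra frame, with the three bundled AAA games as the
# only sweetener.
```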

No, it doesn't do 4K/120, but then again, how many players have $3K to spend on just the monitor and GPU? Yes, I know I'm on [H]ardForum, but even most of us aren't gonna spend that much on just those two pieces of hardware.

Instead, and now I'm getting into pure theory territory, if I were AMD, Vega 2/RX 590 is a stopgap solution until I can deliver Navi. Remember, AMD bet the house on Ryzen and they won. They won big, but that just means they can finally start adding to their graphics R&D budget. It's not realistic to think they'd have a product ready for launch within six months to a year of getting that additional funding. If we look at their model, Zen/Epyc is based on a scalable architecture where all you need to do is throw more cores at it to get more performance. You get economies of scale, since you don't need different dies for different segments of the market, so I would surmise that Navi is designed the same way: a scalable GPU chiplet connected through their Infinity Fabric. You can add more chiplets for more rendering power, or drop one into an APU.

By doing this you can stay on one platform for years, with ticks being smaller silicon and tocks being small design improvements for ~10% gains. This allows them to be more competitive without a matching R&D budget.
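If that theory holds, the appeal is that one small die could cover every segment, from APU to high-end card. Here's a toy model of that scaling (every number is an assumption for illustration; nothing here comes from AMD):

```python
# Toy model of the chiplet-scaling theory above. All figures are
# assumptions for illustration -- none of this comes from AMD.

def effective_throughput(chiplets: int, fabric_efficiency: float = 0.92) -> float:
    """Raw throughput of N identical chiplets, where each additional
    chiplet contributes a bit less due to an assumed Infinity Fabric
    communication overhead."""
    return sum(fabric_efficiency ** i for i in range(chiplets))

# The same die scales from an APU part up to a high-end card:
for n in (1, 2, 4):
    print(f"{n} chiplet(s): {effective_throughput(n):.2f}x throughput")
```

The point isn't the exact numbers; it's that one tape-out amortizes its R&D across every market segment, which is exactly what Zen/Epyc does with CPU chiplets.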

Now, between the Radeon VII and the RTX 2080: ray tracing is far too immature, and Nvidia is trying to recoup some of their R&D by gouging Nvidia fanbois. Until at least 50% of AAA games support it, I won't even bother to use it in a feature comparison. DLSS, on the other hand, is much more interesting and, I think, much more easily implemented. AMD, meanwhile, offers 16GB of RAM, which I prefer as I think it's more future-proof. I think it'd be easier and more useful for game developers to increase texture detail. Some titles already use up 11GB.
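For a sense of why texture detail eats VRAM, here's a rough back-of-the-envelope sketch (the formats and counts are illustrative assumptions, not figures from any game):

```python
# Back-of-the-envelope texture memory math for the 16GB argument
# above. Formats and counts are assumptions, not from any game.

def texture_mib(side_px: int, bytes_per_texel: float) -> float:
    """Size of a square texture with a full mip chain (~4/3 overhead)."""
    return side_px * side_px * bytes_per_texel * (4 / 3) / 2**20

uncompressed = texture_mib(4096, 4.0)  # RGBA8: ~85 MiB each
bc7 = texture_mib(4096, 1.0)           # BC7-compressed: ~21 MiB each

for vram_gib in (8, 11, 16):
    budget_mib = vram_gib * 1024
    print(f"{vram_gib} GiB fits ~{budget_mib / bc7:.0f} BC7 4K textures "
          f"(~{budget_mib / uncompressed:.0f} uncompressed)")
```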
 
I actually trust the numbers somewhat, as their previously released CPU numbers matched up with real benchmarks. I also think the increase in VRAM will make the card more future-proof than the two utterly useless features on the RTX cards.

If you've seen Brent's recent article, VRAM usage is already 8GB without RTX.
 
I'll say it again, maybe everything I've read is wrong, but Navi is a mid-range Polaris replacement. This is it for a while on the high end from AMD - Radeon VII / Vega 2, whatever you want to call it. I was surprised, as the rumor was that 7nm Vega would be a datacenter/compute release only. That was Lisa's surprise for CES.
 
I am underwhelmed as well. But I am interested in seeing if AMD comes up with some type of inexpensive solution to ray tracing, using a $50 GPU for example. Many of the ray-tracing haters on these forums don't really understand the big-picture benefits of moving to ray tracing over traditional rasterization. Or they just hate anything that comes from Nvidia.
You mean the benefits of needing completely different workarounds and hacks for the things RT can't do that rasterization can? People are acting like RT is some savior; it's not, and never has been or will be. It's DOA tech that only makes the current problems worse, not better.
 
Not the best PR move that they used an Intel CPU to bench their new GPU.
Not that it matters to me: I'm not buying it until I see an [H] review, if even then.
They've said before that the reason they do that is that it's the most commonly used high-end processor on the market, just like they previously used the 6700K for their test benches before the 7700K.
 
I got my GTX 1060 6GB new for $190 in June of 2017. I get ~4K 60FPS in most things with some settings tweaked. I'm waiting for the next card to replace this at the $200 price point.

"some settings tweaked" = running shit at low/mid for anything that actually stresses the video card. I'm running on a 2080 and it's only just acceptable at 4k resolution in any graphic intensive games. So I'm gonna have to call bullshit on your 1060 being able to push 4k/60 solid.
 
Anyone willing to spend $700 on a GPU probably already owns something powerful enough to make this card unattractive.
 
3dfx for life.

Man, do I miss buying my Voodoo3 2000 for $89 at Software Etc. Damn, what a value!

For people saying to take these results with a grain of salt: the Vega 64 numbers match up with review-site benchmarks almost identically (Guru3D, AnandTech, etc.).

You do realize we are posting on a hardware review site. Of course they say wait for our benchmarks - they want the clicks. I would be more surprised if there were a blurb in the review-site reviews saying that AMD was actually spot-on with their numbers for once. When/if they match up.

This old man is out of the GPU game... I miss the "great GPUs for $130-170" days of 2001-2015 or so. Those days seem to be long gone.

You and I both, sir. Quake 2 at 1024x768 @ 40 FPS was the bee's knees.

"some settings tweaked" = running shit at low/mid for anything that actually stresses the video card. I'm running on a 2080 and it's only just acceptable at 4k resolution in any graphic intensive games. So I'm gonna have to call bullshit on your 1060 being able to push 4k/60 solid.

Agreed. I'm running an MSI Gaming X RX 470 4GB on a BenQ 4K monitor, mostly RPG and RTS games, and at 4K I get 45-50 FPS at best on high settings, let alone higher settings.
 
Unfortunately, Jensen Huang is right to call it underwhelming. GTX 1080 Ti levels of performance for $700? It's not out now either, but coming sometime around June. By June, Nvidia will have an RTX 2070 Ti and will have lowered the price of the RTX 2080. This shit is so predictable it hurts. Their CPUs have the right price and the right performance, but their graphics card pricing is just as bad as Nvidia's RTX 2060.

This Radeon VII shouldn't be more than $550. What are their new mid-range Radeon cards going to cost, $350 like Nvidia's? I'm just going to sit here with my RX 480 and watch both Nvidia and AMD have poor sales. I'm not paying $350 for a $200 graphics card that can't do ray tracing, and it fucking can't, and AMD's alternative is probably going to be just as bad if not worse. Meanwhile, RX 480s are going for around $100 used, while used 1060s are ~$150. AMD and Nvidia can both eat a bag-o-dicks.
 
I cannot hold my laughter any more; I've got to get this off my chest. The people who have been paying Nvidia for overpriced GPUs are telling us this card is overpriced. I mean, this is hilarious.

BTW, not just in this thread.
 
Anyone willing to spend $700 on a GPU probably already owns something powerful enough to make this card unattractive.

Isn't this how it has been recently? Going by my subjective memory:

IE: Bulk Mainstream . . . One off . . . Bulk Mainstream . . . One off
IE: 270/280/380/390 . . . Fury . . . RX 460/470/480/580/590/Vega 56/64 . . . Radeon VII . . . Future
 
Doesn't change what he said. You can grab a 1080Ti for $500 easily.

AMD zealots are equally useless. Both sides ultimately are, once it turns into a red-herring fest that ignores facts.

A used 1080 Ti, if you are lucky. Good luck getting a new one with a new warranty. Worse are the Nvidia zealots making claims about others.
 
Unfortunately, Jensen Huang is right to call it underwhelming. GTX 1080 Ti levels of performance for $700? It's not out now either, but coming sometime around June. By June, Nvidia will have an RTX 2070 Ti and will have lowered the price of the RTX 2080. This shit is so predictable it hurts. Their CPUs have the right price and the right performance, but their graphics card pricing is just as bad as Nvidia's RTX 2060.

This Radeon VII shouldn't be more than $550. What are their new mid-range Radeon cards going to cost, $350 like Nvidia's? I'm just going to sit here with my RX 480 and watch both Nvidia and AMD have poor sales. I'm not paying $350 for a $200 graphics card that can't do ray tracing, and it fucking can't, and AMD's alternative is probably going to be just as bad if not worse. Meanwhile, RX 480s are going for around $100 used, while used 1060s are ~$150. AMD and Nvidia can both eat a bag-o-dicks.

How can you even run ray tracing at this point, with the performance hit overwhelming even the 2080 Ti...
Unless you're running at 1440p/1080p...

1080 Ti performance is near-identical to the 2080. The Radeon VII is cheaper and runs about the same, so it seems like a competitive choice to me.
 
Radeon VII probably doesn't matter much.

AdoredTV (and others) have suggested that it's only being produced to sell off a limited number of excess Radeon Instinct chips. And of course it gives AMD the right to claim the "first 7nm consumer graphics card"; fair enough. But look at the zero-dollar aesthetics... no bling investment at all, which would be (?) unprecedented for a normal release. This seems consistent with a very small production run.

Real-world reviews will be interesting, though AMD has benchmarked so many games that the general picture seems clear. But they didn't mention power draw, likely for a reason - one informed estimate has it close to 300W. If so, and at the MSRP, there won't be a lot of demand anyway.

So I doubt we'll hear much about this card going forward.
 
So basically, this is the card AMD should have released instead of the Vega 64 back in 2017. It's about 10% faster than a 1080 Ti, and on par with a 2080 minus ray tracing, PhysX, and CUDA, at about the same price.

At this rate, AMD is stuck at low- and mid-range status and needs a generational leap to catch up. Navi needs to get out of the gate fast, and when it does, it had better beat the latest Nvidia Titan by double digits just so they can hang with the 3080 Ti when it's released.
 
Navi needs to get out of the gate fast, and when it does, it had better beat the latest Nvidia Titan by double digits just so they can hang with the 3080 Ti when it's released.

Relax

Navi is 2H 2019

EDIT: I should qualify this: the R7 is it for high-end AMD for now. Navi is mid-range. AMD has ceded the high-end market for these generations.
 
So basically, this is the card AMD should have released instead of the Vega 64 back in 2017. It's about 10% faster than a 1080 Ti, and on par with a 2080 minus ray tracing, PhysX, and CUDA, at about the same price.

At this rate, AMD is stuck at low- and mid-range status and needs a generational leap to catch up. Navi needs to get out of the gate fast, and when it does, it had better beat the latest Nvidia Titan by double digits just so they can hang with the 3080 Ti when it's released.

Correct me if I am wrong, but I thought Navi was more of a replacement for the RX 500 series?
 
Correct - Navi is the Polaris replacement.
You're right, I screwed up on that part.

Still, whatever replaces Vega needs to make that generational leap if AMD wants to compete in the high-end market again, or at the very least be price-competitive. Releasing a card that matches last year's specs at an equal price isn't getting them any more market share with gamers. If this card were $600 or even less, they would fly off the shelves (not that it matters anymore because of crypto; this card is guaranteed to sell even at $1000 on perf-per-watt alone, as long as AMD can produce enough of them for the crypto channel, which they can at 7nm).
 
They've said before that the reason they do that is that it's the most commonly used high-end processor on the market, just like they previously used the 6700K for their test benches before the 7700K.
They use Intel because it paints the AMD GPU in the best possible light.

If a Ryzen could get equal or better FPS out of the GPU than an Intel chip, do you really think they'd stick with Intel anyway because "it's the most commonly used"?

Get out of my office.
 
And if they had used an AMD CPU, people would bitch about that as a one-off, skewed rig.

 
I'm looking at the cheapest regular-priced Vega 64 on Newegg, not even looking around for sales.

Vega 64: $399

Radeon VII: $699

2080: $699

So a 30% increase in performance, as of right now, for $300 more. Overclocked and tweaked a bit, Vega 64 performance has been shown to really increase. At the same price point as the newer-tech 2080... I think the Radeon VII will be the best thing to ever happen to Vega 64 sales! Come oooooonnnn Canadian prices :)
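Running the quoted numbers through a perf-per-dollar lens (a sketch using the Newegg prices above and the thread's ~30% uplift figure, not measured data):

```python
# Value comparison using the prices quoted above; the 1.30 perf
# factor is the thread's claim, not a benchmark result.

cards = {
    "Vega 64":    {"price": 399.0, "perf": 1.00},  # baseline
    "Radeon VII": {"price": 699.0, "perf": 1.30},  # ~30% faster, per thread
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price'] * 100:.2f} perf per $100")

# Vega 64:    0.25 perf per $100
# Radeon VII: 0.19 perf per $100
# That's ~75% more money for ~30% more performance, which is why
# the post above expects the Radeon VII to boost Vega 64 sales.
```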
 
I'm looking at the cheapest regular-priced Vega 64 on Newegg, not even looking around for sales.

Vega 64: $399

Radeon VII: $699

2080: $699

So a 30% increase in performance, as of right now, for $300 more. Overclocked and tweaked a bit, Vega 64 performance has been shown to really increase. At the same price point as the newer-tech 2080... I think the Radeon VII will be the best thing to ever happen to Vega 64 sales! Come oooooonnnn Canadian prices :)

But can the Vega 64 do acceptable 4K?

 
Just standing back and watching this unfold... if anything positive came of it, the Radeon VII was the first nail in the coffin of G-Sync.
 
They use Intel because it paints the AMD GPU in the best possible light.

If a Ryzen could get equal or better FPS out of the GPU than an Intel chip, do you really think they'd stick with Intel anyway because "it's the most commonly used"?

Get out of my office.
No shit, Sherlock, if it were 1080p. I'm just relaying the reasoning they gave. Either way, at 4K an Intel or AMD processor means absolutely nothing, so take that crap somewhere else.
 
Just standing back and watching this unfold... if anything positive came of it, the Radeon VII was the first nail in the coffin of G-Sync.

The first nail was actually put in by Nvidia themselves when they decided to start "supporting" VRR on certain FreeSync monitors.
 
No shit, Sherlock, if it were 1080p. I'm just relaying the reasoning they gave. Either way, at 4K an Intel or AMD processor means absolutely nothing, so take that crap somewhere else.

If the CPU really meant absolutely nothing, then AMD would absolutely use a Ryzen CPU in their benchmarks. It's really that simple. There'd be no business reason not to.

The reason they choose the CPU that benchmarks their GPUs best is that they know the GPU numbers are what's going to be splashed everywhere. The test config is but a footnote.
 
Ryzen 2 is not out yet, and most benchmarkers use the Intel chip for its higher (current) IPC.

This is the current industry standard. Not sure what you're crying about.


Pretty sure it's the fact that they - who make said Ryzen CPUs - aren't even bothering to promote their CPUs with their GPUs for best performance...

Which, IMO, is kinda funny: they used Intel over their own product to show off how good their other product is... as if saying their own isn't worthy. But that's how I interpreted his comment, anyway...
 
"some settings tweaked" = running shit at low/mid for anything that actually stresses the video card. I'm running on a 2080 and it's only just acceptable at 4k resolution in any graphic intensive games. So I'm gonna have to call bullshit on your 1060 being able to push 4k/60 solid.

I game at 4K/60 on a 1070 and an R9 290. Most games I play run great at 4K/60 on the R9 290. Reduced settings really help, but it still looks better than 1080p.

There's actually only a handful of games I own that will stress my 1070 at 4K.

I don't care about ambient occlusion, AA, depth of field, or motion blur, so turning those off automatically gives me a huge performance lift.
 
Pretty sure it's the fact that they - who make said Ryzen CPUs - aren't even bothering to promote their CPUs with their GPUs for best performance...

Which, IMO, is kinda funny: they used Intel over their own product to show off how good their other product is... as if saying their own isn't worthy. But that's how I interpreted his comment, anyway...

Or, outside the techies "in the know", most gamers have Intel, are using Intel, and it's a good baseline to compare against.
 
So it's basically a 1080 Ti at $700? How many years late?
At least they can make a "1080 Ti", and yes, it is late and that's not optimal, but the main issues here are how relevant 1080 Ti performance is now, and at what price the competition sells that performance.

I am wondering why Hitman and Forza see such low gains.
Hitman in particular - but my grey goo doesn't store game benchmarks any more, as no one makes games that interest me and I am no longer a tech writer. Maybe AMD in general takes a hard hit on those two titles.
 