AMD Radeon VII Benchmarks Leak Out

Sure, just overclock the AMD card - if it's +50W versus its closest competitor at stock, then it's approaching +100W versus that competitor with both overclocked :D
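(Back-of-the-envelope on why the gap can widen under overclocking - a minimal sketch assuming the usual dynamic-power approximation P ~ f * V^2; all the wattages and clock/voltage bumps below are made-up illustration numbers, not measurements:)

# Dynamic power scales roughly with frequency * voltage^2.
# Hypothetical stock draws: competitor 120 W, AMD card 170 W (+50 W).
# Assume both get a 10% clock bump, but the already-pushed AMD card
# needs a bigger voltage bump (+8%) than the competitor (+4%).
def oc_power(stock_w, f_scale, v_scale):
    return stock_w * f_scale * v_scale ** 2

competitor = oc_power(120, 1.10, 1.04)  # ~143 W
amd_card = oc_power(170, 1.10, 1.08)    # ~218 W
print(f"stock gap: 50 W, overclocked gap: {amd_card - competitor:.0f} W")
# -> ~75 W here; with a heavier voltage bump it heads toward +100 W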

I have had no issues overclocking any of my AMD cards, and they perform very well. Of course, I do not skimp on my power supplies and have not done so since 2006 or earlier. But hey, this is even in an ITX build which is fully AMD. (650-watt Thermaltake power supply.)
 
Ask that again in 5-10 years when the majority of games use real ray tracing. Nvidia's hybrid approach exists in one game with an abysmal implementation.

Unless the next-gen consoles are going to do ray tracing, I don't think there's much to worry about with regard to the value of ray tracing in a card. So not a biggie until at least, what, 2025? 2026?
 
I have it on good authority that extra power consumption only matters in certain situations.

You mean when the performance envelope is being pushed and the closest competition is in the dust?

Who would have thought that context matters!

Please keep defending +50W for the same performance.
 
Unless the next-gen consoles are going to do ray tracing, I don't think there's much to worry about with regard to the value of ray tracing in a card. So not a biggie until at least, what, 2025? 2026?
I think next-generation consoles will have ray tracing abilities. But I believe they will only use ray tracing for certain in-game effects and not for completely ray-traced games. Fully ray-traced games at high resolutions and frame rates are, I think, still about a decade away.
 
Fully ray-traced games

I don't know if we're ever likely to see this; we're very good at rasterization, and it is more efficient at a great number of tasks. Further, we're not likely to see raster hardware disappear soon.
 
You mean when the performance envelope is being pushed and the closest competition is in the dust?

Who would have thought that context matters!

Please keep defending +50W for the same performance.

1. Nobody knows the actual performance or power draw yet; we've only seen what AMD published, which shows more performance for more power.

2. You didn't seem to think that was the case the last time I mentioned you talking out both sides of your mouth in the 590 review thread.

The 590 was 18% faster on average vs. the 1060, OC and stock, and it had a "blown efficiency curve."

The 9900K used 31% more power for 16% faster performance, and it got the quote I posted above.

Which one is it: power doesn't matter as long as you're substantially faster than the competition, or every watt matters?
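(The perf-per-watt math is easy enough to sanity-check. A minimal sketch: the 9900K deltas are the ones quoted above; the 590's power delta isn't quoted here, so the +40% figure is a placeholder purely for illustration:)

# Perf/watt of the faster part relative to the baseline (1.0 = equal).
def rel_perf_per_watt(perf_gain, power_gain):
    return (1.0 + perf_gain) / (1.0 + power_gain)

# 9900K, from the numbers above: 16% faster for 31% more power.
print(round(rel_perf_per_watt(0.16, 0.31), 2))  # 0.89 -> ~11% worse perf/watt

# 590 vs. 1060: 18% faster; assumed +40% power (placeholder).
print(round(rel_perf_per_watt(0.18, 0.40), 2))  # 0.84 -> ~16% worse perf/watt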

In the end, people will buy what works best for them. We certainly don't need daddy idiot in here protecting us from ourselves; for me, a lateral move from my 1080 Ti is worth it for working FreeSync and fewer driver issues.
 
It won't be that long; most of the RTX features are already part of the DX12 spec or being implemented in later DX12 revisions. I would say it will be 3 years at most until they are in both DX12 and Vulkan as an open spec, and by that time we should have hardware on the market that can actually handle it without shitting the bed.

Well, I've heard that song one too many times now; seeing is believing...
 
Which one is it: power doesn't matter as long as you're substantially faster than the competition, or every watt matters?

It's both.

One is a lower-midrange product (granted, the best AMD is willing to do for consumers). Pushing power draw versus the competing part for a small increase in performance isn't helpful. The target market is largely made up of people putting these things in budget systems, where power is coming from el cheapo PSUs, be they Rosewill or whatever Dell and HP are slinging. Or they're enthusiasts who actually have a power budget for an ITX build. Either way, extra power usage for no gain does nothing for them.

The other is a halo part.

These are two different parts with two different target markets. So yes, two different standards apply, as you see in reviews.

We certainly don't need daddy idiot in here protecting us from ourselves

Oh no, I just laugh. I can only point out ignorance; you can choose to remain so if you like.

for me, a lateral move from my 1080 Ti is worth it for working FreeSync

And this is just forum gold. It's clear why you don't mind inefficient parts: you're trading out for more power draw and less performance, and paying for the privilege!
 
It's both.

One is a lower-midrange product (granted, the best AMD is willing to do for consumers). Pushing power draw versus the competing part for a small increase in performance isn't helpful. The target market is largely made up of people putting these things in budget systems, where power is coming from el cheapo PSUs, be they Rosewill or whatever Dell and HP are slinging. Or they're enthusiasts who actually have a power budget for an ITX build. Either way, extra power usage for no gain does nothing for them.

The other is a halo part.

These are two different parts with two different target markets. So yes, two different standards apply, as you see in reviews.


More bullshit doublespeak: it's only OK on the high end because you say that midrange parts are either on a garbage-bin PSU or in an ITX build. Glad you were here to clarify the use cases for us.

In both cases there was extra performance for extra power draw. You claimed one was OK, but the other was not. Now you're trying to frame it as extra power for no gain.

It's right above, from [H] reviews.

18% performance increase: AMD gets "blown efficiency curve."
16% performance increase: Intel "should do all that and more if it means extra performance."


Oh no, I just laugh. I can only point out ignorance; you can choose to remain so if you like.

As far as sitting back and laughing: hardly. You're a prolific poster, reminding everyone of what should be obvious, depending on which of your two standards we're applying.

And this is just forum gold. It's clear why you don't mind inefficient parts: you're trading out for more power draw and less performance, and paying for the privilege!

On the last point: I have a FreeSync panel, and I've had it for years. There's still no G-Sync equivalent, or I'd have one to replace it. FreeSync doesn't work well with my 1080 Ti - tearing and juddering depending on frame rate. Plus, the drivers that enable it cause a black screen on display wake.

My position on buying a Radeon VII is based on my use case, and yes, I'm willing to pay something to use FreeSync (which worked fine on my Furys) and live with the extra power draw.

It's baffling that you're so concerned for everyone that you spend so much time in this and other threads making sure they know your view.

Edit: typo
 
Damn, are people still arguing about 50 watts of power? Can we just agree that in certain situations 50 watts is nothing, and in others 50 watts can be an issue? For the majority of us power users, 50 watts means nothing, but for a PC manufacturer, sure, that 50 watts can matter, since it means they have to spend more on the PSU. Seems like a silly thing to argue over.
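(For scale, a quick sketch of what an extra 50 W costs a home user - the 4 h/day load and $0.12/kWh rate are assumptions for illustration, not figures from this thread:)

# Rough annual cost of an extra 50 W of GPU draw.
extra_watts = 50
hours_per_day = 4     # assumed gaming load
rate_per_kwh = 0.12   # assumed USD electricity rate

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year, about ${kwh_per_year * rate_per_kwh:.2f}/year")
# -> ~73 kWh/year, about $8.76/year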
 
My position on buying a Radeon VII is based on my use case, and yes, I'm willing to pay something to use FreeSync (which worked fine on my Furys) and live with the extra power draw.

And your specific edge case, supposing that a driver or firmware isn't developed to deal with the quirks of your specific display, makes sense. How you got to the point that it was even a consideration, though, is on you.

More bullshit doublespeak

You're saying that [H] reviews that consider target markets are 'doublespeak'?

Of course I'm going to consider the target market.
 
Damn, are people still arguing about 50 watts of power? Can we just agree that in certain situations 50 watts is nothing, and in others 50 watts can be an issue? For the majority of us power users, 50 watts means nothing, but for a PC manufacturer, sure, that 50 watts can matter, since it means they have to spend more on the PSU. Seems like a silly thing to argue over.


99% of what people argue over for GPUs is dumb; it's literally a game of one-upmanship, trying to find some way that X is "better" than Y. Same situation on most tech forums; it gets old fast.
 
And your specific edge case, supposing that a driver or firmware isn't developed to deal with the quirks of your specific display, makes sense. How you got to the point that it was even a consideration, though, is on you.

Now THAT's forum gold. How could I ever have gotten to the point where I own a large 4K panel? Well, I wanted one to replace my portrait Eyefinity setup, and it was one of only a couple of choices on the market. I suppose I could have not bought it almost 3.5 years ago and just waited for.....the still-to-be-released BFGD? The fact that it supports VRR was a plus, and now it will be again. You've got a set of balls on you, I'll give you that.



You're saying that [H] reviews that consider target markets are 'doublespeak'?

Of course I'm going to consider the target market.


Overclocking midrange parts to meet or exceed high-end parts is literally what this site is about. You're only pretending that the extra power use out of midrange parts is unacceptable because people who use midrange parts only run a Dell or use garbage PSUs, while high-end parts can consume whatever power they can, for whatever performance increase they can, without regard.

That's the doublespeak: your crazy excuse for talking out both sides of your mouth, this arbitrary delineation between midrange and high-end parts to suit statements you've already made.
 
How could I ever have gotten to the point where I own a large 4K panel?

But you got one without G-Sync - and you bought a 1080 Ti. You made the decisions that got you to the point of having a hardware incompatibility.

Overclocking midrange parts to meet or exceed high-end parts is literally what this site is about

Sure, now you're +100W over the competition - if the card doesn't croak (or take something else out!) because it's already cranked over the efficiency curve at stock :ROFLMAO:
 
But you got one without G-Sync - and you bought a 1080 Ti. You made the decisions that got you to the point of having a hardware incompatibility.



Sure, now you're +100W over the competition - if the card doesn't croak (or take something else out!) because it's already cranked over the efficiency curve at stock :ROFLMAO:

You have a serious comprehension issue. I'll take it slow: there is no G-Sync equivalent monitor today, and there certainly wasn't in 2015 when I bought it. If I could have kept FreeSync compatibility when I upgraded GPUs, I obviously would have. Now that I can, I'm clearly not bothered by the extra power vs. my existing GPU. It's baffling why you would be.

We're way OT; suffice it to say everyone gets your point. Power consumption matters. Unless it doesn't, because it's high end. Also, 50W is a serious amount of extra power to worry about. Unless it's not. Who the fuck knows.
 
there is no G-Sync equivalent monitor today, and there certainly wasn't in 2015 when I bought it. If I could have kept FreeSync compatibility when I upgraded GPUs, I obviously would have

So you'd rather have a faster GPU than variable V-sync - why does it matter so much now?

everyone gets your point

Everyone reasonable does - it's not hard to understand that different target markets have different priorities. Probably not so much for the Radeon VII (look! the topic!), or a 9900K (or a 2950X...), but for mid-range stuff? It matters.

More so if there is extra power consumption versus the competition with no gain, and/or if the difference is especially large or pushes a product into a different power bracket.
 
So you'd rather have a faster GPU than variable V-sync - why does it matter so much now?

I'm pretty sure I explained it. Let's try again.

" If I could have kept freesync compatibility when I upgraded gpus, I obviously would have."

I bought my Ti at launch; the fastest AMD GPU was the Fury, which I already owned two of. (There's a bug with some Crosshair VI boards and some Sapphire Fury Nitros where DP 1.2 won't work. Asus acknowledged the bug but couldn't fix it.) There was no other option.

So you'd rather have a faster GPU than variable V-sync - why does it matter so much now?



Everyone reasonable does - it's not hard to understand that different target markets have different priorities. Probably not so much for the Radeon VII (look! the topic!), or a 9900K (or a 2950X...), but for mid-range stuff? It matters.

More so if there is extra power consumption versus the competition with no gain, and/or if the difference is especially large or pushes a product into a different power bracket.

Now we're back to your arbitrary decision on what's acceptable for each market segment. Right now a VII is a lateral move, not a slower one, and it's not for no gain: I get VRR back (which I like having a lot) and AMD drivers. That's two upgrades, ones that I've articulated more than once, and ones that I'm more than willing to pay for.
 
There was no other option.

You just noted a number of options. Hell, a motherboard would have been the most economical. Waiting for Vega, given what you're willing to give up to get FreeSync on your apparently unique monitor, would have been another. As I said, your choices put you in the position to even be considering putting down cash for a side-grade in GPU performance.

Now we're back to your arbitrary decision on what's acceptable for each market segment.

It isn't arbitrary; we wouldn't call a GT 1030 or an RX 560 'high end' any more than we'd call a Vega or 1080 Ti 'low end'. These are in different market segments. And at the very least you should be able to understand the power brackets between cards that need no PCIe power connector, those that need one, those that need two... in 75W increments. Each jump limits potential host systems a bit more in terms of power draw and thermal capacity. Reviewers regularly note this.
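(The bracket math being described, as a minimal sketch - these are the PCIe spec ceilings of 75 W from the x16 slot, 75 W per 6-pin, and 150 W per 8-pin; the example card classes are illustrative:)

# Maximum board power by connector loadout, per the PCIe spec.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def max_board_power(six_pins=0, eight_pins=0):
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(max_board_power())              # 75 W  - slot only (GT 1030 class)
print(max_board_power(six_pins=1))    # 150 W - one 6-pin
print(max_board_power(eight_pins=1))  # 225 W - one 8-pin
print(max_board_power(eight_pins=2))  # 375 W - two 8-pin, halo-card territory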
 
You just noted a number of options. Hell, a motherboard would have been the most economical. Waiting for Vega, given what you're willing to give up to get FreeSync on your apparently unique monitor, would have been another. As I said, your choices put you in the position to even be considering putting down cash for a side-grade in GPU performance.

As I said, why would you care? We're down to the point where I need to pull up time frames for hardware purchases to prove a point.

Ryzen motherboards were rare as hen's teeth at launch; I had my CPU in hand a week before I got my board. There were no more to be had, and it wasn't confirmed where the issue lay for at least a couple of months. I waited for Asus and Sapphire to figure it out; the 1080 Ti launched a month later and I bought one. There wasn't a launch date for Vega at that point (which ended up being in August).

I've already said that I was OK with giving up VRR in order to get back to playing; at that point I had been stuck for a month on either 1080p or 4K30.

I'll do the same thing I did then: sell my existing hardware and move on.

It isn't arbitrary; we wouldn't call a GT 1030 or an RX 560 'high end' any more than we'd call a Vega or 1080 Ti 'low end'. These are in different market segments. And at the very least you should be able to understand the power brackets between cards that need no PCIe power connector, those that need one, those that need two... in 75W increments. Each jump limits potential host systems a bit more in terms of power draw and thermal capacity. Reviewers regularly note this.

Clearly there are market segments; what's arbitrary is your decision that power only matters for everything below high end.

Does it matter more at the super low end? Maybe - it depends on whether the GPU you pick can take in more power than the PCIe slot can provide.

But otherwise, it's a decision made by the purchaser, based on their other hardware. Plenty of people who buy midrange GPUs have PSUs that can handle another 50 or 100W, just like people who buy halo parts.

This isn't a real position; you're taking it so you can hold two different opinions on the same topic, depending on the vendor.
 
This isn't a real position; you're taking it so you can hold two different opinions on the same topic, depending on the vendor.

No, depending on target market. But you're determined to show that it's vendor-based for some reason, despite my including AMD CPUs and GPUs in the same 'high performance' category in previous posts.

On the other hand, you appear to be willing to sacrifice quite a bit to support a particular vendor, to your own financial detriment, for some unnamed reason - and that wouldn't be a big deal if you weren't using your own misguided purchases as examples ;).
 
No, depending on target market. But you're determined to show that it's vendor-based for some reason, despite my including AMD CPUs and GPUs in the same 'high performance' category in previous posts.

On the other hand, you appear to be willing to sacrifice quite a bit to support a particular vendor, to your own financial detriment, for some unnamed reason - and that wouldn't be a big deal if you weren't using your own misguided purchases as examples ;).

Again, in any target market - high, mid, or low - there will be people who want the lowest power consumption, some who want the most performance regardless of power, and all sorts in between.

YOU were the one who claimed that case two, max performance, was only OK for high-end parts. It's a bullshit argument that lets you hold two different standards on the same issue.


I'm supporting the vendor that can give me the performance I want, with the features I want. Lol. It's not like I'm throwing a 1080 Ti in the trash, so the little bit it costs me is worth it.

I eagerly await your upcoming book, An Idiot's Guide to Building Computers. This thread has given me confidence that you're just the right guy for the job.
 
On the face of it, that sounds bad. On the other hand, it's Gibbo saying it, which makes me take it with more than just a pinch of salt. He's just trying to drum up sales, as per usual.

"Hey guys, this card is available in VERY LIMITED NUMBERS. Get your order in with OCUK as soon as possible!"

Now, it probably will be of limited availability, but I don't trust his numbers at all :D
 
It's an interesting card, but at $700 I'll be keeping the water-cooled V56 I paid $349 for.
 