Radeon 7 (Vega 2, 7nm, 16GB) - $699 available Feb 7th with 3 games

Aye, both of the AIB versions of the VII that I've seen images of look to be absolute reference versions, using the reference AMD triple-fan cooler. In fact, the only difference between the AIB and AMD cards seems to be the sticker on the card.


The cooler looks good for reference so let's see how it performs.
 
Except that Nvidia uses compression techniques that AMD does not. That is one of the reasons that side by side, AMD's final onscreen image looks better than Nvidia's. Therefore, AMD needs greater bandwidth.
They BOTH use memory compression, and while nvidia likely does use better memory compression than AMD, they both use entirely lossless compression. There is absolutely no change in image quality from memory compression. Any on screen image quality differences come from other factors, like different mip-mapping algorithms and whatnot.

NO, NO, NO!!!

Bandwidth compression is LOSSLESS!
What you are describing is akin to "loudness" in music... aka "vibrance".

Again, it has nothing to do with bandwidth or compression!!!

And the "AMD has better colors" or "AMD looks sharper" has been utterly debunked a looooooong time ago:
https://hardforum.com/threads/a-real-test-of-nvidia-vs-amd-2d-image-quality.1694755/

Yeah, this guy is right. GPU memory compression is lossless, period.
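For anyone curious what "lossless" means in practice: delta color compression stores an anchor value per tile plus small per-pixel differences, and decompression reproduces the original bytes exactly. A toy sketch of the idea in Python (illustrative only; the actual AMD/Nvidia hardware formats are not public):

# Toy delta compression: store one anchor value plus per-pixel deltas,
# then reconstruct. The round trip must be bit-identical or the scheme
# would not be lossless.
tile = [118, 119, 119, 121, 118, 117, 120, 122]  # 8-bit values in one tile

anchor = tile[0]
deltas = [v - anchor for v in tile]              # small numbers, cheap to store

reconstructed = [anchor + d for d in deltas]
assert reconstructed == tile                     # exact round trip, zero quality loss
print("bit-exact:", reconstructed == tile)

So any real on-screen difference has to come from somewhere other than the bandwidth compression itself.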
 
The cooler looks good for reference so let's see how it performs.
Of course, let's wait and see how it performs. I wasn't commenting on the quality or otherwise of the reference cooler, just pointing out that the AIB cards all appear to be identical.

At least it's not yet another AMD blower!
 
Damn it. I was going to get this, but I had $50 in eBay Bucks and, while browsing, saw someone selling brand new ROG Strix OC 2080s for $700 (he had more than 10 available). Got it for $650 out the door. AMD is getting my money for Zen 2 for sure, though! Unless the dude is trying to scam me, I think I pulled off a good deal. He did have 311 (100%) feedback. Paid with a CC, so I'm not worried about losing out. Should have it this upcoming week.

Hopefully the Strix passes the random black screen test; that's the reason I returned the Zotac. Hey, either way, if I go for the Radeon VII, one will be up for sale here in the forums for sure. Haha.
 
Your eyes do lie... hence why we use equipment to measure... and have sites like HardOCP to run tests... because your eyes are simply too bad to base anything on... keep denying facts, what is next... the earth is flat?

No, my eyes do not lie, at all. LOL! Next thing you will be telling me my ears, taste buds and nose lie as well. :D Nope, someone else or a machine should tell me what is real, not what is actually in front of me with my own two eyes. :D
 
No, my eyes do not lie, at all. LOL! Next thing you will be telling me my ears, taste buds and nose lie as well. :D Nope, someone else or a machine should tell me what is real, not what is actually in front of me with my own two eyes. :D

The funny part is, you're being /s but objective measurements have shown that all of those things in fact do lie. And everything that goes through your brain is subject to your subjective opinion.
Try talking to a trial attorney about how unreliable eyewitness testimony is. Do you know how many people have been wrongly sent to prison who didn't see what they thought they saw? My dad does appellate law. It's magical how a person's shirt or eyes or hair can keep changing color in someone's memory. (Because people "know what they know what they know" and it was "that guy that done did it".)
Do you understand what double blind testing is and why they do it? Because things like confirmation bias are big problems. The mind wants to see patterns and therefore does everything it can to fill in information in order to create one. Even in the absence of objective facts or proof.
Even memory is significantly more ethereal and fragile than your comment suggests. The fact that you would make statements like not wanting or needing "objective measurements" attests to it.
---
To be clear, I don't have a dog in this fight. I don't know whether the performance advantages nVidia gets come at the cost of visual fidelity or not. But I would say it's very safe to say that if there WERE differences, there would be proof of said differences, and they would be more apparent under objective testing than not.

There are at least a half-dozen high-profile websites that all deal with multiple AMD and nVidia GPUs simultaneously, day in and day out. Some have spent hundreds of hours on these cards in the process of benchmarking them. I think it's more than reasonable to say that if this does exist, you should be able to link at least some sort of objective proof from one of them. Kyle, as an example, has been doing this ish for well over 20 years. Never said a single word about it. And I would say that it's "fair to say" that he has handled more AMD and nVidia cards than you by a wide, wide margin (not even including variants of the same card by different AIB manufacturers).
 
People did say the image quality was better on the AMD Freesync setups, IIRC, in the blind testing hardforum did.

I have Nvidia now, but there's no doubt in my mind that AMD has better black levels when connected to my Panasonic AE8000U projector. It's not a slight difference, and I've never figured it out. I have tried the following cards on my current projector, and way more than that, going back about 15 years, on three other projectors I've owned (Panasonic AE500u, Panasonic AE700u, Epson 8350). It's like there's a problem with the HDMI color space setting on the setup. Nvidia cards have gray blacks on my projectors; AMD cards have correct blacks. I have my Nvidia cards set to 0-255 in the control panel, but 16-235 looks worse.

Nvidia GTX 560
Nvidia GTX 670
AMD 285
AMD 380
AMD Fury X
AMD Vega 56
AMD Vega 64
Nvidia 1080ti


It’s frustrating.

Also, I've read over the years that if you set Nvidia color saturation to 55% instead of the default 50%, it looks nearly identical. I'd say that holds true in my experience for the colors, but I've never figured out the black level issue. The black levels are not a problem on a monitor.
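Since a few of us have chased gray blacks over HDMI: the symptom matches a full-range signal being squeezed into limited video range somewhere in the driver/display chain. A rough sketch of the math, assuming the usual 0-255 full / 16-235 limited video levels (the function name is just for illustration):

# Why a range mismatch lifts black levels. Assumes the standard video levels:
# full range = 0-255, limited range = 16-235.
def full_to_limited(v):
    # remap a full-range 8-bit value into limited (video) range
    return round(16 + v * (235 - 16) / 255)

print("black sent as:", full_to_limited(0))     # -> 16, not 0
print("white sent as:", full_to_limited(255))   # -> 235, not 255

# If the display still expects full range, code 16 is rendered as dark gray
# instead of true black, which is exactly the washed-out look described above.

None of this proves which side is at fault on any given projector, but it's the first thing worth ruling out.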
 
People did say the image quality was better on the AMD Freesync setups, IIRC, in the blind testing hardforum did.

They did, I agree with you there. But that was in reference to smoothness, not image quality. And that was also done with A/B testing, with monitors that were identical (except for Freesync vs Gsync), setups that were identical, and resolutions and settings that were identical.

This article: https://www.hardocp.com/article/2018/03/30/amd_radeon_freesync_2_vs_nvidia_gsync

It does mention that HDR made a difference on AMD's side. But that isn't even what we're talking about here, because there should be a difference whether HDR is on or off (on the AMD side) if nVidia's compression lowers visual fidelity.
 
The monitors weren't identical, actually- that was one of the issues I had with the test given the highly subjective nature of the conclusions.

I guess since nVidia supports VRR they could do an apples-to-apples comparison now.
 
I guess since nVidia supports VRR they could do an apples-to-apples comparison now.

Sort of?

There are 'identical' monitors from some companies now; the monitors used for the last test had different panels (with different panel types).
 
The funny part is, you're being /s but objective measurements have shown that all of those things in fact do lie. And everything that goes through your brain is subject to your subjective opinion.
Try talking to a trial attorney about how unreliable eyewitness testimony is. Do you know how many people have been wrongly sent to prison who didn't see what they thought they saw? My dad does appellate law. It's magical how a person's shirt or eyes or hair can keep changing color in someone's memory. (Because people "know what they know what they know" and it was "that guy that done did it".)
Do you understand what double blind testing is and why they do it? Because things like confirmation bias are big problems. The mind wants to see patterns and therefore does everything it can to fill in information in order to create one. Even in the absence of objective facts or proof.
Even memory is significantly more ethereal and fragile than your comment suggests. The fact that you would make statements like not wanting or needing "objective measurements" attests to it.
---
To be clear, I don't have a dog in this fight. I don't know whether the performance advantages nVidia gets come at the cost of visual fidelity or not. But I would say it's very safe to say that if there WERE differences, there would be proof of said differences, and they would be more apparent under objective testing than not.

There are at least a half-dozen high-profile websites that all deal with multiple AMD and nVidia GPUs simultaneously, day in and day out. Some have spent hundreds of hours on these cards in the process of benchmarking them. I think it's more than reasonable to say that if this does exist, you should be able to link at least some sort of objective proof from one of them. Kyle, as an example, has been doing this ish for well over 20 years. Never said a single word about it. And I would say that it's "fair to say" that he has handled more AMD and nVidia cards than you by a wide, wide margin (not even including variants of the same card by different AIB manufacturers).

The only way to objectively see the difference is with your own two eyes in front of the computers themselves. You cannot subjectively or objectively copy and paste images onto the internet and show anything, because by that point everything about them has changed. As for your comments on the sites that do not say anything, that is not their job, nor is it their primary focus. They all know when it is good to get involved in something and when it is better to just leave well enough alone.
 
The monitors weren't identical, actually- that was one of the issues I had with the test given the highly subjective nature of the conclusions.

Well, only because Nvidia did not win. :rolleyes:

Edit: Also, since Nvidia did not support Freesync and needed G-Sync, of course there were going to be different monitors. Not AMD's fault that Nvidia was using proprietary tech.
 
The only way to objectively see the difference is with your own two eyes in front of the computers themselves.

We covered this. Your eyes aren't objective.


You cannot subjectively or objectively copy and paste images onto the internet and show anything, because by that point everything about them has changed.

Print screen is more than accurate enough. Even if it weren't, one could film the screen.
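And the comparison itself is trivial to script once you have two lossless captures of the same frame. A sketch, assuming Pillow and NumPy are installed; the file names are just placeholders:

# Compare two lossless screenshots pixel-by-pixel (file names are made up).
# Both captures must be the same resolution.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("frame_card_a.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("frame_card_b.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)
print("identical:", bool((diff == 0).all()))
print("max per-channel difference:", int(diff.max()))
print("pixels that differ:", int((diff.sum(axis=2) > 0).sum()))

If the compression were lossy, you would expect nonzero differences to show up in captures like this, not only on a live screen.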


As for your comments on the sites that do not say anything, that is not their job, nor is it their primary focus. They all know when it is good to get involved in something and when it is better to just leave well enough alone.

I tend to think that if there were such an important difference in rendering and visual fidelity, there would be sites talking about it. You might be "right" that it "isn't the focus of hardware gaming sites" (although the only thing ANY of them talk about all day is min-maxing settings and trying to get things to look as good as possible while rendering as many FPS as possible. We can also ignore the fact that you "notice" these differences but, for some reason, people with thousands of hours don't, which I would say is an insult to people whose profession it is to test video cards). But even if that IS the case, you're ignoring a massive segment of the market in professional users. You could easily post any of them discussing how there is a big visual difference in any 3D rendering program as a result of compression.

Hell, if this were a real issue, the difference between AMD and nVidia cards would have to be a massive factor when making purchasing decisions for render farms for visual effects teams in movies, as well as for fully 3D rendered films (Pixar etc.). I've seen zero discussion there either. This is a lot of people's jobs. As in, something that affects their bottom lines. Professional colorists, as an example.

People spend $2000+ on monitors in this market. They're willing to spend $6000+ on video cards. And that's for single systems; they might buy a dozen. This is taken incredibly seriously. And I've seen no discussion about a visual fidelity difference between the two.
 
Why do you have to do color testing at all with Freesync or Gsync monitors? Color isn't affected by either so the only objective way to do it is if the monitors are truly identical.

The only way to objectively see the difference is with your own two eyes in front of the computers themselves.

No, the only way to be objective is to use a spectrophotometer. Everyone's eyes (technically, our interpretation of what our eyes send to our brain) are subjective, because interpretation depends on the perceiver, i.e. it is subjective by definition.
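To put a number on "objective": a meter reading gives you Lab values, and the gap between two readings is a Delta E figure rather than an impression. A minimal sketch using the simple CIE76 formula; the measured values below are made up for illustration:

# CIE76 Delta E between two measured Lab colors. The patch values are
# hypothetical; a real workflow would take them from a colorimeter or
# spectrophotometer reading of each card driving the same display.
import math

def delta_e76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

patch_card_a = (53.2, 80.1, 67.2)   # L*, a*, b* of a red patch on card A
patch_card_b = (53.0, 80.4, 67.0)   # same patch on card B

print(f"Delta E: {delta_e76(patch_card_a, patch_card_b):.2f}")
# Roughly: below ~1 is generally invisible, ~2-3 is where trained eyes notice.

That is the kind of evidence that would settle the "better colors" argument either way.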
 
Quote inside a quote which is unquotable.

This is how/why you sound like a conspiracy theorist: because you have zero proof. Only YOU can see the difference. Anyone else whose profession it is "can't" (including industries that produce products that cost $100 million+). And any discussion to the contrary is wrong.


EDIT: Just thought of another point. If there were a visual fidelity difference, AMD themselves would make note of it, because any advantage you can advertise is marketable. Being able to literally print "Looks better than nVidia!" on their boxes, and be legally correct and defensible, would be huge.
 
I tend to think that if there were such an important difference in rendering and visual fidelity, there would be sites talking about it. You might be "right" that it "isn't the focus of hardware gaming sites" (although the only thing ANY of them talk about all day is min-maxing settings and trying to get things to look as good as possible while rendering as many FPS as possible. We can also ignore the fact that you "notice" these differences but, for some reason, people with thousands of hours don't, which I would say is an insult to people whose profession it is to test video cards). But even if that IS the case, you're ignoring a massive segment of the market in professional users. You could easily post any of them discussing how there is a big visual difference in any 3D rendering program as a result of compression.

Hell, if this were a real issue, the difference between AMD and nVidia cards would have to be a massive factor when making purchasing decisions for render farms for visual effects teams in movies, as well as for fully 3D rendered films (Pixar etc.). I've seen zero discussion there either. This is a lot of people's jobs. As in, something that affects their bottom lines. Professional colorists, as an example.

People spend $2000+ on monitors in this market. They're willing to spend $6000+ on video cards. And that's for single systems; they might buy a dozen. This is taken incredibly seriously. And I've seen no discussion about a visual fidelity difference between the two.

Except that I would imagine there is a huge difference between a consumer-level card and a professional-level card.
 
We covered this. Your eyes aren't objective.




Print screen is more than accurate enough. Even if it weren't, one could film the screen.




I tend to think that if there were such an important difference in rendering and visual fidelity, there would be sites talking about it. You might be "right" that it "isn't the focus of hardware gaming sites" (although the only thing ANY of them talk about all day is min-maxing settings and trying to get things to look as good as possible while rendering as many FPS as possible. We can also ignore the fact that you "notice" these differences but, for some reason, people with thousands of hours don't, which I would say is an insult to people whose profession it is to test video cards). But even if that IS the case, you're ignoring a massive segment of the market in professional users. You could easily post any of them discussing how there is a big visual difference in any 3D rendering program as a result of compression.

Hell, if this were a real issue, the difference between AMD and nVidia cards would have to be a massive factor when making purchasing decisions for render farms for visual effects teams in movies, as well as for fully 3D rendered films (Pixar etc.). I've seen zero discussion there either. This is a lot of people's jobs. As in, something that affects their bottom lines. Professional colorists, as an example.

People spend $2000+ on monitors in this market. They're willing to spend $6000+ on video cards. And that's for single systems; they might buy a dozen. This is taken incredibly seriously. And I've seen no discussion about a visual fidelity difference between the two.

Actually, I have seen quite a few comments from people jumping from Nvidia to AMD who have commented on visual quality. Placebo or not, it deserves further investigation. Before the 1000 series, Nvidia only gave you 8-bit output outside of DX windows, but most people were not running 10-bit screens until recently, which could have been part of it; then again, I have also seen people jumping from the 1000 series with the same comments.
 
Actually, I have seen quite a few comments from people jumping from Nvidia to AMD who have commented on visual quality. Placebo or not, it deserves further investigation. Before the 1000 series, Nvidia only gave you 8-bit output outside of DX windows, but most people were not running 10-bit screens until recently, which could have been part of it; then again, I have also seen people jumping from the 1000 series with the same comments.
Most games still use the sRGB color space, so using 10-bit would just make the image appear oversaturated. An oversaturated image looks impressive to untrained eyes, but it is not at all the intended appearance of the content. To get a proper image, the game would need the option to render in a different color space, and that is mostly limited to HDR.

I have read other things about the difference in image quality that is more believable, though. It could be that the color compression that NVIDIA uses is less accurate than what AMD uses.
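The oversaturation effect is easy to show with chromaticity numbers, and it is really a gamut question rather than a bit-depth one: send sRGB-encoded pixels straight to wider primaries without conversion and every saturated color lands further from white. A sketch using the standard published RGB-to-XYZ matrices for sRGB and Display P3 (values rounded):

# Where does "pure red" (1, 0, 0) land if the same RGB triplet drives sRGB
# primaries vs. Display P3 primaries with no gamut conversion in between?
SRGB_TO_XYZ = [[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]]
P3_TO_XYZ   = [[0.4866, 0.2657, 0.1982],
               [0.2290, 0.6917, 0.0793],
               [0.0000, 0.0451, 1.0439]]

def xy_chromaticity(matrix, rgb):
    X, Y, Z = (sum(row[i] * rgb[i] for i in range(3)) for row in matrix)
    return (X / (X + Y + Z), Y / (X + Y + Z))

red = (1.0, 0.0, 0.0)
print("red on an sRGB panel:", xy_chromaticity(SRGB_TO_XYZ, red))  # ~(0.64, 0.33)
print("red on a P3 panel:   ", xy_chromaticity(P3_TO_XYZ, red))    # ~(0.68, 0.32)
# Same pixel value, a visibly more saturated red on the wide-gamut panel.

Whether any driver default ever does this silently is a separate question; the point is just that "more saturated" is not the same thing as "more accurate".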
 
Except that I would imagine there is a huge difference between a consumer-level card and a professional-level card.

Not as much as you'd think, or as you'd like to think. This Radeon VII, as an example, is just a cut-down Radeon Instinct card. nVidia does the same thing.

So, why hasn't AMD made note of their visual fidelity difference and placed it on all of its marketing?
 
For reference, a 6800 graphics score in Fire Strike Ultra is about what a Titan X does at stock. The RTX 2080 FE does around 6600.

Another point of reference: my V64 Liquid does 6380 in Fire Strike Ultra with an undervolt and the HBM at 1100MHz.
 
Most games still use the sRGB color space, so using 10-bit would just make the image appear oversaturated. An oversaturated image looks impressive to untrained eyes, but it is not at all the intended appearance of the content. To get a proper image, the game would need the option to render in a different color space, and that is mostly limited to HDR.

I have read other things about the difference in image quality that is more believable, though. It could be that the color compression that NVIDIA uses is less accurate than what AMD uses.
10-bit was never an issue for games, but outside of DX windows it was (I guess that includes OGL etc.), and of course this impacts some other uses too. Most decent cameras these days can work in 10-bit or beyond if you're lucky, and yes, colour spaces become an issue, but the 10-bit train has been around for a while if you have a use for it. I work with lasers and they make even the best screens look pretty crappy in a gamut comparison, so when I have to work with those colours while selecting other colours, it becomes pretty obvious. Tried to do it on an older TN panel recently and it was just frustrating to work with. Banding, though... banding is easy to see for most. More screens have a 10-bit LUT now, though, and seem to fudge it with the 8-bit + FRC stuff... they look OK.
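On the banding point, the raw step counts explain most of it: 8-bit gives 256 levels per channel versus 1024 at 10-bit, and 8-bit + FRC fakes the in-between levels by flickering between the two nearest codes. A quick sketch of both ideas (just the arithmetic, not any particular panel's implementation):

# Quantise a smooth 0..1 ramp at 8 and 10 bits and count the distinct shades,
# then show how 8-bit + FRC (temporal dithering) approximates an in-between level.
N = 2000
ramp = [i / (N - 1) for i in range(N)]

def quantise(values, bits):
    q = (1 << bits) - 1
    return [round(v * q) / q for v in values]

print("8-bit shades: ", len(set(quantise(ramp, 8))))   # 256 -> visible bands
print("10-bit shades:", len(set(quantise(ramp, 10))))  # 1024 -> much finer steps

# FRC: show code 100 for three frames and 101 for one; the eye averages them.
frames = [100, 100, 100, 101]
print("perceived level:", sum(frames) / len(frames))   # 100.25, between the codes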

Supposedly the compression is lossless, but Nvidia seems to have had a big leg up on AMD in compression efficiency over the past few generations, so maybe we will see a Spectre-esque performance reduction if some sort of lossy compression is ever uncovered. The problem with GPUs is there are so many layers to 'cheat' on... shaders and the whole nine yards, beyond my level of comprehension... leave it up to the experts ;)

Another point of reference: my V64 Liquid does 6380 in Fire Strike Ultra with an undervolt and the HBM at 1100MHz.
Interesting. I wonder if all that memory bandwidth can be used on the VII; I have a feeling not for current gaming, but the VRAM will definitely be of use looking forward.
 
For reference, a 6800 graphics score in Fire Strike Ultra is about what a Titan X does at stock. The RTX 2080 FE does around 6600.

My 1080 Ti FTW3 at "stock" does 7321... but it's probably boosting to almost 2GHz because it's under water.

Hopefully the R7 does better in real games, but at least the scores are around 2080/1080 Ti level, which was the target.
 
Going to 1180MHz on the HBM, my score goes to 6517. This should be fun!
What voltage?! Sounds like you got some great HBM there. Take it easy with those volts though; I've heard it's not the hardest thing to kill lol.
 
What voltage?! Sounds like you got some great HBM there. Take it easy with those volts though; I've heard it's not the hardest thing to kill lol.
Stock volts on the HBM. I set the P3 state to 1000mV, which is just the voltage floor. I run it daily in BFV at 1050MHz.
 
Stock volts on the HBM. I set the P3 state to 1000mV, which is just the voltage floor. I run it daily in BFV at 1050MHz.
Jesus, that's a good result for benching at those volts. Looks like I'll need to tinker when I'm back home in a few months.
 
So do we think this will perform better than a 1080Ti @ 4k? That's the main thing I'm concerned with.

I'd like to upgrade my 1080Ti for better 4k performance, but no way in hell am I paying $1k for a videocard.
 
So do we think this will perform better than a 1080Ti @ 4k? That's the main thing I'm concerned with.

I'd like to upgrade my 1080Ti for better 4k performance, but no way in hell am I paying $1k for a videocard.

Unless you can get close to $700 for your 1080 Ti, it looks like it won't be worth it. From the little we know, the Radeon VII will be around the same performance as what you have already. Some games it will win and some it will lose.
 
Unless you can get close to $700 for your 1080 Ti, it looks like it won't be worth it. From the little we know, the Radeon VII will be around the same performance as what you have already. Some games it will win and some it will lose.

Well, the 2080 is generally as good as an OC'd 1080 Ti, so if the Vega 7 is on par with it, it should be about as fast as... an overclocked 1080 Ti. What that is worth to you is up to you, e.g. do you have a monitor that supports Freesync? (Or perhaps a TV?)
 
Well, the 2080 is generally as good as an OC'd 1080 Ti, so if the Vega 7 is on par with it, it should be about as fast as... an overclocked 1080 Ti. What that is worth to you is up to you, e.g. do you have a monitor that supports Freesync? (Or perhaps a TV?)

That's what I mean, it really is a sidegrade performance-wise; it would only really be worth it if he sold his 1080 Ti for close to what the Radeon 7 costs.
 