Aye, both of the AIB versions of VII that I've seen images of look to be absolute reference versions, using the reference AMD triple-fan cooler. In fact, the only difference between the AIB and AMD cards seems to be the sticker on the card.
They BOTH use memory compression, and while Nvidia likely does use better memory compression than AMD, they both use entirely lossless compression. There is absolutely no change in image quality from memory compression. Any on-screen image quality differences come from other factors, like different mip-mapping algorithms and whatnot.

Except that Nvidia uses compression techniques that AMD does not. That is one of the reasons that, side by side, AMD's final onscreen image looks better than Nvidia's. Therefore, AMD needs greater bandwidth.
NO, NO, NO!!!
Bandwidth compression is LOSSLESS!
What you are describing is akin to "loudness" in music... aka "vibrance".
Again, it has nothing to do with bandwidth or compression!!!
And the "AMD has better colors" or "AMD looks sharper" has been utterly debunked a looooooong time ago:
https://hardforum.com/threads/a-real-test-of-nvidia-vs-amd-2d-image-quality.1694755/
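For anyone unclear on what "lossless" means here: a lossless codec reproduces its input bit-for-bit, so by definition it cannot change the image. A quick sketch in Python, using zlib purely as a stand-in (GPUs use their own delta color compression hardware, not zlib, but the round-trip property is the same):

```python
import zlib

# A fake 8-bit framebuffer tile. zlib stands in for the GPU's delta
# color compression -- the point is the lossless round trip, not the codec.
tile = bytes(range(256)) * 64  # 16 KiB of "pixel" data

compressed = zlib.compress(tile)
restored = zlib.decompress(compressed)

# Lossless: the decompressed data is bit-identical to the original,
# so there is zero image-quality change -- only bandwidth is saved.
assert restored == tile
print(f"{len(tile)} bytes -> {len(compressed)} bytes, bit-identical: {restored == tile}")
```

Less bandwidth consumed, identical pixels out. That is the whole trick.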
Of course, let's wait and see how it performs. I wasn't commenting on the quality or otherwise of the reference cooler, just pointing out that the AIB cards all appear to be identical.

The cooler looks good for reference, so let's see how it performs.
Your eyes do lie... hence why we use equipment to measure... and have sites like HardOCP to run tests... because your eyes are simply too bad to base anything off. Keep denying facts; what is next... the earth is flat?
Has Kyle commented on if we'll see a review before launch?
Thanks! I've been waiting for Kyle's game performance data to tell me how I should feel about this card.

It'll probably be 6 am PST or CST day of launch... usually how it's been for AMD/Intel/Nvidia for a while now.
No, my eyes do not lie, at all. LOL! Next thing you will be telling me my ears, taste buds and nose lie as well. Nope, someone else or a machine should tell me what really is, rather than what is actually in front of me with my own two eyes.
People did say the image quality was better on the AMD Freesync setups IIRC in the blind testing hardforum did.
With monitors that were identical
The monitors weren't identical, actually- that was one of the issues I had with the test given the highly subjective nature of the conclusions.
I guess since nVidia supports VRR they could do an apples to apples now.
The funny part is, you're being /s but objective measurements have shown that all of those things in fact do lie. And everything that goes through your brain is subject to your subjective opinion.
Try talking to a trial attorney about how unreliable eye-witness testimony is. Do you know how many people have been wrongly sent to prison who didn't see what they thought they saw? My Dad does appellate law. It's magical how a person's shirt or eyes or hair can keep changing color in someone's memory. (Because people "know what they know what they know" and it was "that guy that done did it".)
Do you understand what double blind testing is and why they do it? Because things like confirmation bias are big problems. The mind wants to see patterns and therefore does everything it can to fill in information in order to create one. Even in the absence of objective facts or proof.
Even memory is significantly more ephemeral and fragile than your comment suggests. The fact that you would make statements like not wanting or needing "objective measurements" attests to it.
---
To be clear, I don't have a dog in this fight. I don't know whether these performance advantages nVidia uses come at the cost of visual fidelity or not. But I would say it's very safe to say that if there WERE differences, there would be proof of said differences, and it would be more apparent under objective testing than not.
There are at least a half-dozen high-profile websites that all deal with multiple AMD and nVidia GPUs simultaneously, day in and day out. Some have spent hundreds of hours on these cards in the process of benchmarking them. I think it's more than reasonable to say that if this does exist, you should be able to link at least some sort of objective proof from one of them. Kyle, as an example, has been doing this ish for well over 20 years. Never said a single word about it. And I would say that it's "fair to say" that he has handled more AMD and nVidia cards than you by a wide, wide margin (not even including variants of the same card by different AIB manufacturers).
The monitors weren't identical, actually- that was one of the issues I had with the test given the highly subjective nature of the conclusions.
The only way to objectively see the difference is with your own two eyes in front of the computers themselves.
You cannot subjectively or objectively copy and paste images onto the internet and show anything because by that point, everything about them has changed.
As for your comments on the sites that do not say anything: that is not their job, nor is it their primary focus. They all know when it is good to get involved in something and when it is better to just leave well enough alone.
We covered this. Your eyes aren't objective.
Print screen is more than accurate enough. Even if it's not one could film themselves.
Nope, for this, you are wrong.
The only way to objectively see the difference is with your own two eyes in front of the computers themselves.
Quote inside a quote which is unquotable.
For reference, 6800 graphics score in Fire Strike Ultra is about what a Titan X does stock. The RTX 2080 FE does around 6600.

Benchmark leaked at videocardz:
https://videocardz.com/79870/amd-radeon-vii-benchmarks-leak-ahead-of-launch
I tend to think that if there were such an important difference in rendering and visual fidelity, there would be sites talking about it. You might be "right" that it "isn't the focus of hardware gaming sites" (although the only thing ANY of them talk about all day is min-maxing settings and trying to get things to look as good as possible while rendering as many FPS as possible; we can also ignore the fact that you "notice" these differences but, for some reason, people with 1000's of hours don't, which I would say is an insult to people whose profession it is to test video cards). But even if that IS the case, then you're ignoring a massive segment of the market in terms of professional users. You could easily post any of them discussing how there is a big difference visually in any 3D rendering program as a result of compression.
Hell, if this is a real issue, the difference between AMD and nVidia cards would have to be a massive factor when making purchasing decisions for render farms for visual effects teams in movies, as well as fully 3D-rendered films (Pixar etc.). I've seen zero discussion there either. This is a lot of people's jobs, as in, something that affects their bottom lines. Professional colorists, as an example.
People spend $2000+ on monitors in this market. They're willing to spend $6000+ on video cards. And that's for single systems; they might buy a dozen. This is taken incredibly seriously. And I've seen no discussion about a visual fidelity difference between the two.
Most games still use the sRGB color space, so using 10-bit would just make the image appear oversaturated. An oversaturated image to untrained eyes looks impressive, but it is not at all what the intended appearance of the content is. To get a proper image the game would need the option to render in a different color space, and that is mostly limited to HDR.

Actually, I have seen quite a few comments from people jumping Nvidia > AMD that have commented on visual quality. Placebo or not, it deserves further investigation. Before the 1000 series, Nvidia only gave you 8-bit outside of DX windows, but most people were not running 10-bit screens until recently, which could have been part of it. I have also seen people jumping from the 1000 series with the same comments.
Except that I would imagine there is a huge difference between a consumer-level card and a professional-level card.
For reference, 6800 graphics score in Fire Strike Ultra is about what a Titan X does stock. The RTX 2080 FE does around 6600.
10-bit was never an issue for games, but outside of DX windows it was (I guess that includes OGL etc.), and of course this impacts some other uses too. Most decent cameras these days can work in 10-bit or beyond if you're lucky, and yes, colour spaces become an issue, but the 10-bit train has been around for a while if you have a use for it. I work with lasers and they make even the best screens look pretty crappy in gamut comparison, so when I have to work with those colours whilst selecting other colours it becomes pretty obvious. Tried to do it on an older TN panel recently and it was just frustrating to work with. Banding though... banding is easy to see for most; more screens have a 10-bit LUT now though and seem to fudge it with the 8-bit + FRC stuff... they look ok.

Most games still use the sRGB color space, so using 10-bit would just make the image appear oversaturated. An oversaturated image to untrained eyes looks impressive, but it is not at all what the intended appearance of the content is. To get a proper image the game would need the option to render in a different color space, and that is mostly limited to HDR.
I have read other things about the difference in image quality that is more believable, though. It could be that the color compression that NVIDIA uses is less accurate than what AMD uses.
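On the banding point: the visible difference between bit depths comes down to how coarse the steps between adjacent tones are. A rough back-of-envelope (plain arithmetic, nothing vendor-specific):

```python
# Step size between adjacent gray levels on a normalized 0..1 signal.
# Coarser steps are what shows up as visible banding in smooth gradients.
for bits in (8, 10):
    levels = 2 ** bits
    step = 1.0 / (levels - 1)
    print(f"{bits}-bit: {levels} levels, step = {step:.5f}")
# 8-bit: 256 levels, step = 0.00392
# 10-bit: 1024 levels, step = 0.00098
```

10-bit gives four times the levels per channel, which is exactly what the 8-bit + FRC panels mentioned above are faking: they temporally dither between adjacent 8-bit values to approximate the intermediate steps.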
Interesting. Wonder if all that memory bandwidth can be used on the VII; have a feeling not for current gaming, but the VRAM will definitely be of use looking forwards.

Another point of reference: my V64 Liquid does 6380 in Fire Strike Ultra with an undervolt and HBM at 1100 MHz.
Going to 1180 HBM, my score goes to 6517. This should be fun!

Interesting. Wonder if all that memory bandwidth can be used on the VII; have a feeling not for current gaming, but the VRAM will definitely be of use looking forwards.
For reference, 6800 graphics score in Fire Strike Ultra is about what a Titan X does stock. The RTX 2080 FE does around 6600.
What voltage?! Sounds like you got some great HBM there. Take it easy with those volts though; heard it's not the hardest to kill, lol.

Going to 1180 HBM, my score goes to 6517. This should be fun!
Stock volts on the HBM. I set the P3 state to 1000 mV, which is just the voltage floor. I run it daily in BFV at 1050 MHz.

What voltage?! Sounds like you got some great HBM there. Take it easy with those volts though; heard it's not the hardest to kill, lol.
Jesus, that's a good result for benching at those volts. Looks like I'll need to tinker when back home in a few months.

Stock volts on the HBM. I set the P3 state to 1000 mV, which is just the voltage floor. I run it daily in BFV at 1050 MHz.
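Out of curiosity, the V64 numbers quoted in this thread scale sub-linearly with memory clock. A quick sanity check (scores and clocks taken from the posts above):

```python
# Fire Strike Ultra graphics score vs HBM clock on the V64 Liquid,
# using the figures posted in this thread.
base_clock, base_score = 1100, 6380
oc_clock, oc_score = 1180, 6517

clock_gain = (oc_clock - base_clock) / base_clock
score_gain = (oc_score - base_score) / base_score

print(f"HBM clock +{clock_gain:.1%}, score +{score_gain:.1%}")
```

Roughly +7.3% memory clock for +2.1% score, so Vega 64 looks only partly bandwidth-bound at these settings. How that translates to the VII's much larger bandwidth is anyone's guess until the reviews land.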
So do we think this will perform better than a 1080Ti @ 4k? That's the main thing I'm concerned with.
I'd like to upgrade my 1080Ti for better 4k performance, but no way in hell am I paying $1k for a videocard.
Unless you can get close to $700 for your 1080 Ti, it looks like it won't be worth it. From the little we know, Radeon VII will be around the same performance as what you have already: some games it will win and some it will lose.
Well, the 2080 is generally as good as an OC'd 1080 Ti, so if the Radeon VII is on par with it, it should be about as fast as... an overclocked 1080 Ti. What that is worth to you is up to you, e.g. do you have a monitor (or perhaps a TV) that supports FreeSync?