AMD Live Webcast From Computex 2016

$200 is a good price, but it's disappointing they don't have anything that competes with at least the 1070, Nvidia's prices suck. If they announced a $300 1070 competitor I'd probably wait and pick up two.

I'm sure they'll sell a bunch of these.

Oh, I don't know if this has already been posted. They were right about the GTX 1080 and GTX 1070, and it matches up with what AMD has said. The top variant (67DF:C7) is supposedly the 480x and the lower (67DF:C4) the 480. Slower than the Fury, which is a little disappointing.

[Image: AMD-Radeon-R9-480-3DMark11-Performance.png]
 

Copy pasting this link for something like 10th time today: Result
 
So C7 is actually slower than a 970 & 390X? Interesting. I'll assume the higher score is overclocked. $250 for an 8GB card with 970ish performance is kinda strange. Might as well get the 4GB for $200, since that is plenty of VRAM for that performance level...
 
There are two GPUs, C4 and C7 both clocked at 1266 MHz.
 
I believe the lower score is simply terrible drivers and the higher one is the relevant one (note that the driver versions reported differ between the two). But yes, otherwise it does come out as slightly faster than the 970 and slightly slower than the 390/390X in this particular canned benchmark. And bear in mind, the 390 had 970ish performance with 8GB of VRAM :p
There are two GPUs, C4 and C7 both clocked at 1266 MHz.
Nah, both are C7 in my link.
 
until you start thinking omg fuck this multi card solution... hoops and jumping through them to make stuff work.

Yes, Crossfire and SLI often have issues with early drivers when new games release. You still can't tell me the possibility of paying $200 less for GTX 1080 performance isn't the least bit intriguing, particularly considering what FreeSync costs compared to G-Sync.
 
More people buy GTX 970s and 960s than 980s and 980 Tis. The Steam hardware survey shows the 970 at over 5% of users, with the 960 at 3.8%; the 980 and 980 Ti sit at about 1% each. It makes no sense for AMD to fight for the 1%. And while the 1070 is meant to replace the 970, I can't see it having the same success: the 970 was a 10% slower 980 at $320 on release, while the 1070 is a 25% slower 1080 with a price tag of $380. The RX 480 will bring 970ish performance for $200. That is literally a game changer.

But now the $300 price range is rather empty. You won't buy a GTX 970 or R9 390 because you now have the RX 480, and you won't buy a 980, 980 Ti, or Fury X when you have the 1070 and 1080. That would only change if Nvidia and AMD priced those products down into the $300ish range, which I doubt they will.
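To put those numbers side by side, here's a quick Python sketch. The relative-performance figures are my rough placements of the post's claims (1070 = 25% slower than a 1080, RX 480 ≈ 970), and the 1080's $599 launch price is my assumption; nothing here is a measured benchmark.

```python
# Rough price/performance math using the figures quoted in the thread.
# Performance is normalized to GTX 1080 = 100. All placements are rough
# assumptions based on the post's claims, not benchmark results.
cards = {
    "GTX 1080": {"perf": 100, "price": 599},  # price is my assumption
    "GTX 1070": {"perf": 75,  "price": 380},  # "25% slower 1080"
    "GTX 970":  {"perf": 55,  "price": 320},  # launch price per the post
    "RX 480":   {"perf": 55,  "price": 200},  # "970ish performance"
}

# Rank by performance points per dollar, best value first.
for name, c in sorted(cards.items(), key=lambda kv: -kv[1]["perf"] / kv[1]["price"]):
    per_100 = c["perf"] / c["price"] * 100
    print(f"{name:8s} {per_100:5.1f} perf points per $100")
```

On these (assumed) numbers the RX 480 comes out well ahead on value, which is the whole argument of the post.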
 
AMD, what are you doing!!???

Holding back information, especially this much of it, in a competitive market where you are late to the game with your current generation is NEVER going to be a good idea.

This does not sit well with me.

Fuck I was always a fan of ATI since their headquarters were by my place... I waited years to upgrade from my OCd 7950, gen after gen, waiting.

980Ti is too expensive in Canada, our dollar sucks, and our retailers are dirty pieces of shit.
Dual 1070s??? Fuck I don't know

You don't let your left hand know what your right hand is doing. Why would they spill the beans to Nvidia in advance so that Nvidia can counter their announcements and fanfare? Just good common sense. ;) Also, that's what these speculation and discussion threads are for!
 
I'd be curious whether Steam would start weighting GPUs by the performance level *required* by a user's library. E.g. the Intel IGPs, which dominate four spots out of the top 10, are perfectly fine if you play retro titles only.

But what happens if you weight gaming libraries differently? E.g. libraries with higher-priced titles (modern FPS games, say) aren't going to run on those Intel IGPs.

My point is that the Steam survey is not properly capturing the discrete card market - another metric needs to be made.
Heck the Steam VR bench is right there; I'm sure they could create a separate category based on:
1) The number of users who ran it (I don't see Intel IGP users being interested haha)
2) The average score
3) Mode - the most common GPU
4) Mean - the average performance GPU

Etc...

Lots to improve on that front, in terms of getting a proper snapshot of the Steam userbase.
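A minimal sketch of what that separate category could look like, assuming the survey recorded one VR-bench score per user who ran it. GPU names and scores below are invented for illustration:

```python
from collections import Counter
from statistics import mean

# Hypothetical sample: one (GPU, VR-bench score) pair per participant.
# IGP users are assumed not to run the bench, so they drop out naturally.
results = [
    ("GTX 970", 6.2), ("GTX 970", 6.4), ("GTX 970", 6.1),
    ("R9 390", 6.5), ("R9 390", 6.3),
    ("GTX 980 Ti", 9.8),
]

counts = Counter(gpu for gpu, _ in results)
mode_gpu = counts.most_common(1)[0][0]           # most common GPU among testers
avg_score = mean(score for _, score in results)  # average bench score

print(f"participants: {len(results)}")
print(f"mode GPU: {mode_gpu}")
print(f"mean score: {avg_score:.2f}")
```

Weighting by users who actually ran the bench would filter out the IGP crowd on its own, since they have little reason to run it.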
 
If it does come in just under the Fury like that, then it'll be a great card for the vast majority of users. I think I'll stay cautious about that until the numbers come in.
 
I had 7970 trifire and FC3 ran like shit and BF4 had serious texture issues.

Look at recent [H] and pcper reviews (frame times) and Crossfire is currently trash. They were on a decent spree there... they made frame times better and added XDMA with the 290X, then it seems like they just stopped trying.

If a 480 suits your needs so be it, looks like it might be great value. I wouldn't suggest crossfire to anyone. Wait for Vega...

I may not be reading the same reviews as you, but last I checked, even comparing Fury X to 980 ti, AMD had better xfire scaling than Nvidia did.

"But what about the direct AMD and NVIDIA comparisons?
Despite what we might have expected going in, the AMD Radeon R9 Fury X actually scaled in CrossFire better than the NVIDIA GeForce GTX 980 Ti.
This comes not only in terms of average frame rate increases, but also in lower frame time variances that result in a smoother gaming experience.
In several cases the extra scalability demonstrated by the Fury X allowed its dual-GPU performance to surpass a pair of GTX 980 Ti cards even though in a single GPU configuration the GeForce card was the winner.
GRID 2 at 4K is one example of this result as is Bioshock Infinite at 4K. And even in a game like Crysis 3 at 4K where we saw NVIDIA's card scale by a fantastic 84%, AMD's Fury X card scaled by 95%!"

AMD Fury X vs. NVIDIA GTX 980 Ti: 2- and 3-Way Multi-GPU Performance | Power Consumption and Closing Thoughts
 

Sure am!

You left out parts of the closing thoughts, like:

"This story that focuses on the performance scaling capability of the AMD Radeon R9 Fury X and the NVIDIA GeForce GTX 980 Ti has revealed some interesting information to us. Let's start with the easiest outcome to decipher: 3-Way SLI and 3-Way CrossFire just do not present a positive experience for gamers of either camp"

And out of all the games he tested, he could name two that scaled well. The reality is that the majority of games scale like shit and are filled with microstutter, in many cases making it feel the same as or worse than a single card.


I dislike both SLI and Crossfire equally. Even if AMD scales a little better, they aren't supported as early or in as many titles.

That's the main thing that bugged me in the 480 presentation... Suggesting crossfire... On a low-mid range card.
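For what microstutter means in numbers: average FPS can look fine while frame pacing is terrible, which is why reviewers plot frame times. A toy comparison with invented frame-time traces:

```python
from statistics import mean, pstdev

# Two hypothetical frame-time traces in milliseconds. Both average
# 20 ms (50 FPS), but the "stuttery" trace alternates short and long
# frames, the way a badly paced multi-GPU setup can.
smooth   = [20.0, 20.1, 19.9, 20.0, 20.0, 20.0]
stuttery = [10.0, 30.0, 10.0, 30.0, 10.0, 30.0]

for name, trace in (("smooth", smooth), ("stuttery", stuttery)):
    print(f"{name}: avg {mean(trace):.1f} ms, "
          f"frame-time stdev {pstdev(trace):.1f} ms")
```

Both traces would report identical average FPS, but the 10/30 ms alternation is exactly the variance the frame-time graphs above are measuring.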
 

I agree that trying to compare a pair of 480's to a 1080 is silly. They're not in the same bracket, never will be.

I misunderstood your previous post to say that SLI was working better than Crossfire, so I posted the conclusion section, but the paper still draws the same conclusion for 2-way Crossfire: it works well a good portion of the time, and two cards is what 90+% of the few people running multi-GPU use. It's a small subset of a subset that runs more than two GPUs.

I guess it comes down to personal preference: how much of an increase in frame times, and how much variance in frame times, is acceptable? Looking at all the graphs for the games shows me that there's definitely a gain, with acceptable trade-offs, in a two-GPU system.

To counter the quote about 3-way xfire/SLI...

"But let's talk about the results of standard old two card SLI and CrossFire. In most of our games and resolutions we actually saw gains that demonstrated the benefits of adding a second card to a system for a power user that demands high frame rates, high resolutions or both. Games like Crysis 3 and Battlefield 4 show us that when done well, and with a GPU-bottlenecked scenario, you can see scaling from 75-95% depending on settings. Other games, like Metro: Last Light or even GRID 2 require 4K resolutions to scale at any reasonable levels."
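For reference, the scaling percentages quoted there are just the gain from the second card relative to a single card. A quick sketch of the formula (the FPS inputs are invented to reproduce the quoted 84% and 95% figures; only the math is the point):

```python
# Multi-GPU scaling: percent improvement from adding a second card.
# FPS numbers below are made up to match the percentages the review
# quotes for Crysis 3 at 4K; they are not the review's actual data.
def scaling(single_fps: float, dual_fps: float) -> float:
    """Percent gain of a dual-GPU setup over a single card."""
    return (dual_fps / single_fps - 1.0) * 100.0

print(f"980 Ti SLI: {scaling(40.0, 73.6):.0f}%")  # ~84%, per the quote
print(f"Fury X CF:  {scaling(40.0, 78.0):.0f}%")  # ~95%, per the quote
```

Perfect scaling would be 100%; anything near 75-95% is what the article calls "done well."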
 
Did you never see a CF setup AFTER AMD fixed the frame time issue?

I had dual 7970s for quite a while. Most games worked just fine even before the fixed drivers.

The releases after they fixed frametime issues made most stuff a lot smoother.

Now I am on a single R9 390 (because I got it for free), and since I game at 1920x1200, I haven't seen any reason to get anything faster.

Oh, and I had dual 6870s before I had the 7970s. Compared to the 5870, the 6870s ran way cooler and used less power; even if they were a tiny bit slower than the 5870, they were very good cards.

I still have them, and will be putting them in service in an XP "retro box" I am building.

The 480 with only one 6-pin plug???? Now that is a huge leap for AMD in terms of power efficiency.

Really tempted to sell my 390 and get a 480 8GB before the prices tank on the 390.

I really want to see real reviews of the 480 though.

You bought the stuff, then had to wait for it to be fixed, while you could just go out and get the competing card and have no fuss, with one card doing the same as the two cards (price/performance). As for power and noise: not issues to me, IMO, unless you put the fan up to like 70-100% for OCing. Power I don't care about; it's a desktop machine and only throttles up when gaming. For the $50 a year in extra cost, I'll eat that, if it was even that much. The amount saved vs buying a new video card doesn't make any sense at all when talking about 5870 to 6970.

Also, I have been running 1440p for the last 3 years and there was no way my 5870 could keep up, so I finally upgraded to the GTX 780, a huge leap, and I'm happy I did. Was thinking of going 1070 for my Oculus, but it seems to work fine with all my games and 3x 27" 1440p monitors.
 