AMD RX 480 - First 3DMark results

Ocellaris said:
AMD and AMD fans spent the last generation explaining that 8GB was needed for consistent performance and a variety of features. Would seem cheesy for that now to turn into "No one needs 8GB!"

Are people getting into VR really going to skimp $30 to get a 4GB card instead of an 8GB card? Or better yet, do you think people dropping $800 on a Vive are looking at $200 GPUs?

It would all be nice if AMD made a cheap headset. If AMD wants mainstream VR, they need to solve the cost of the headset, not the GPU.
I think you're confused. It was Nvidia and Nvidia fans who spent the last generation trying to convince people that 6GB/8GB was needed for consistent performance. Everybody else knew that the benchmarks proved 4GB was plenty for most everything but insane resolutions w/all features on high.

VR is in its infancy right now, of course pricing is high. As time goes on more companies will be producing VR units, the number of headsets sold will increase and they will inevitably get cheaper. It's not up to AMD to provide headsets as well as video cards. If the 480 is powerful enough to run VR then people who already have a 480 in their system will be able to immediately use a VR headset once it becomes affordable to them. It happens all the time. People wait for price drops until they can justify the purchase. Even a lot of people who want a 1080 are waiting for the aftermarket units to hit the market so they don't have to pay extra for the Nvidia "Early Adopter Tax" Founders Edition.
 
Of course it'll be able to access all of it, unless AMD screwed something up like nvidia did with the 970. Probably won't happen though.

If you're trying to ask if 8GB of VRAM will be used in such a way to matter for performance then the answer is: depends on the game and its settings. If you game at 1080p I don't think you'll need more than 4GB for a while yet. If you game at 4K 8GB might not be enough in a few years.

Right now very few games really need over 4GB of VRAM. That will change in the future, but other factors are just as likely, if not more so, to affect performance in the future too (e.g. tessellation, async compute, etc.). You might find that even with an 8GB card you'll be considering upgrading in a couple of years anyway.

I think what he wants to say is: will this GPU be powerful enough to make use of 8GB of memory (i.e., will we hit a GPU bottleneck before we hit the memory bottleneck)? If it performs like the 390, I don't see why not.
 
What I'm most excited about are the supposed optimizations being made to Crossfire. Once VR systems can really utilize two cards well (one per eye), then a 480 CF setup for under $500 could end up being incredible value.

Even without that, I'm really, really hoping CF gets a lot of love this gen. It seems like AMD wants to position 2x 480s as a competitor to the 1080. That would require excellent scaling that can be done as a game-agnostic function.
 
AMD has been consistently talking down CF and SLI for the past several months. Don't expect it to see much attention at all.

When AMD showed two 480s beating a 1080, that wasn't in crossfire-- it was using DX12 explicit multi-adapter.
 
AMD has been consistently talking down CF and SLI for the past several months. Don't expect it to see much attention at all.

When AMD showed two 480s beating a 1080, that wasn't in crossfire-- it was using DX12 explicit multi-adapter.

Well, that's even better! If they plan on ditching CF and using the open DX12 standard, I'm even more excited. That kind of optimization is much better in the long run.
 
I completely agree-- and it applies to heterogeneous configurations, so you can mix/match AMD, Nvidia, and even intel GPUs in the same system and get performance out of each.
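For anyone wondering what "explicit multi-adapter" actually looks like on the code side, here's a rough, hypothetical sketch (not from AMD or Oxide, just the standard DXGI/D3D12 enumeration pattern) of how a DX12 app finds every GPU in the box -- discrete cards and the Intel iGPU alike -- and creates a device on each. From there the engine, not the driver, decides how to split the work, which is the whole difference from CF/SLI.

// Hypothetical sketch only: enumerate every adapter and create a D3D12 device
// on each. Link against dxgi.lib and d3d12.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    // Walk every adapter DXGI reports: AMD, Nvidia and Intel GPUs all show up here.
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"Adapter %u: %s (%llu MB VRAM)\n", i, desc.Description,
                    (unsigned long long)(desc.DedicatedVideoMemory >> 20));
            devices.push_back(device);
        }
    }

    // The application now owns one device per GPU and decides how to split the
    // frame between them; unlike CF/SLI the driver does nothing automatically.
    return 0;
}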
 
AMD has been consistently talking down CF and SLI for the past several months. Don't expect it to see much attention at all.

When AMD showed two 480s beating a 1080, that wasn't in crossfire-- it was using DX12 explicit multi-adapter.
CF and SLI seriously suck. So that's amazing. That removes much of the doubt I had regarding dual GPUs.



I completely agree-- and it applies to heterogeneous configurations, so you can mix/match AMD, Nvidia, and even intel GPUs in the same system and get performance out of each.
That's pretty incredible, actually.
 
Why do you attack people on here? You could make your point without lashing out, so what exactly is the reason for that?
I think the R9 290X was one of the first cards offered with an extra 4 GB, and never, as far as I can remember, were any of us shouting that you needed 8 GB...

AMD explaining that 4 GB was enough came down to the physical limitations of HBM, which makes sense if you want to sell your 4 GB product.

It is not that hard to explain why you will run out of a 4GB frame buffer if you run a high enough resolution with plenty of options enabled. Some games have very poor buffering, though.

I don't see an attack where you see one. I personally saw a lot of AMD reviews that stated how important 8GB was over 4GB. Or hell, how 4GB was SOOOO much better than 3.5 :)
 
SLI and Crossfire both suck ass. That is why with DX12 they have added Multi-GPU. So now it's on the developer to support xfire/SLI in games.

That way we can now blame lazy developers.
 
SLI and Crossfire both suck ass. That is why with DX12 they have added Multi-GPU. So now it's on the developer to support xfire/SLI in games.

That way we can now blame lazy developers.

And there are plenty of those. It remains to be seen how much effort will go into this. I bet we see more games without any dual GPU love at all.
 
And there are plenty of those. It remains to be seen how much effort will go into this. I bet we see more games without any dual GPU love at all.

I got a feeling AMD and NVidia are going to pay Developers to add it in. We already know Deus Ex and BF1 will support it.
 
I completely agree-- and it applies to heterogeneous configurations, so you can mix/match AMD, Nvidia, and even intel GPUs in the same system and get performance out of each.

This is the part that interested me about it. Specifically, Oxide noted how you could program one GPU to handle specific tasks within the game while one GPU handles the lion's share. And just how would this fit in? Imagine a game using your Intel iGPU (something that otherwise goes un-utilized most of the time) to handle the lighting, or the shadows, or something else specific while your discrete GPU continues to handle the main game, but with a 10-15% performance boost. With Intel owning most of the actual GPU market, developers could use DX12, get "closer to the metal," and program specific tasks for Intel's hardware knowing that a large percentage of gamers would have this. Since I built my PC I have had the iGPU disabled (no shared memory, no driver). But I do plan to enable it when games begin to utilize this.
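If you want a concrete idea of how that iGPU offload might be set up, here's a hypothetical sketch (the function name and structure are mine, not Oxide's): find the Intel adapter by vendor ID and give it its own compute queue, so a pass like shadows or post-processing could be recorded against it while the discrete card renders the main frame.

// Hypothetical helper, for illustration only: create a compute-only queue on
// the Intel iGPU, if one is present. Link against dxgi.lib and d3d12.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> CreateIGpuComputeQueue(ComPtr<ID3D12Device>& outDevice)
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.VendorId != 0x8086)      // 0x8086 is Intel's PCI vendor ID
            continue;

        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&outDevice))))
            continue;

        D3D12_COMMAND_QUEUE_DESC queueDesc = {};
        queueDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;   // compute work only

        ComPtr<ID3D12CommandQueue> queue;
        if (SUCCEEDED(outDevice->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue))))
            return queue;
    }
    return nullptr;   // no usable iGPU found
}

The part developers have to manage themselves is moving the results back to the main GPU (D3D12 exposes cross-adapter shared heaps for this), which is why the offloaded pass has to be worth more than the cost of copying its output.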
 
I got a feeling AMD and NVidia are going to pay Developers to add it in. We already know Deus Ex and BF1 will support it.

That's great news about Deus Ex especially. Maybe I was being a bit pessimistic.
 
This is the part that interested me about it. Specifically, Oxide noted how you could program one GPU to handle specific tasks within the game while one GPU handles the lion's share. And just how would this fit in? Imagine a game using your Intel iGPU (something that otherwise goes un-utilized most of the time) to handle the lighting, or the shadows, or something else specific while your discrete GPU continues to handle the main game, but with a 10-15% performance boost. With Intel owning most of the actual GPU market, developers could use DX12, get "closer to the metal," and program specific tasks for Intel's hardware knowing that a large percentage of gamers would have this. Since I built my PC I have had the iGPU disabled (no shared memory, no driver). But I do plan to enable it when games begin to utilize this.

That could certainly be an amazing feature if it is implemented correctly. I had no idea that they could do that. I knew about the mix and matched video cards though.
 
That's great news about Deus Ex especially. Maybe I was being a bit pessimistic.

Yeah, I expect all future Square Enix games to use it. They were one of the first developers showing Multi-GPU technology, even showing what the Intel iGPU could do.

BF1 was the game I did not expect.
 
I scored P14 461 in 3DMark 11 Performance

Pretty damned impressive for a $199 card. This is about equal to or ever so slightly higher than an average 970 or 390.

The results would be comfortably VR capable too.

Going just by this synthetic benchmark, we've got a card that is 40% cheaper, at less than half the wattage of the 390, with the same performance.

Just a correction here: the GPU score for that bench is over 18K. For reference, the 970 sits around 15K, the 980 at 18.5K, and both the 390X and Fury at about 18K as well. Fury throws weird 3D11 scores, so not sure if you can really build a comparison there.

Regardless, it should definitely be above 970/390 level. Synthetics are synthetics though, benches can't come soon enough.

If we can pull 10-20% overclocks on this thing....
 
This is the part that interested me about it. Specifically, Oxide noted how you could program one GPU to handle specific tasks within the game while one GPU handles the lion's share. And just how would this fit in? Imagine a game using your Intel iGPU (something that otherwise goes un-utilized most of the time) to handle the lighting, or the shadows, or something else specific while your discrete GPU continues to handle the main game, but with a 10-15% performance boost. With Intel owning most of the actual GPU market, developers could use DX12, get "closer to the metal," and program specific tasks for Intel's hardware knowing that a large percentage of gamers would have this. Since I built my PC I have had the iGPU disabled (no shared memory, no driver). But I do plan to enable it when games begin to utilize this.
The only question here is having the iGPU enabled. We have seen all the issues present in most games and normal desktop use when it is enabled alongside a dGPU. It's going to be a pain until AMD and Intel fix the issues with having it enabled while the dGPU is primary (maybe I should say everyone: software and OS included).
 
I don't see an attack where you see one. I personally saw a lot of AMD reviews that stated how important 8GB was over 4GB. Or hell, how 4GB was SOOOO much better than 3.5 :)

Wow, you're actually talking about something else now.
Doesn't Nvidia still have that lawsuit coming where they are held liable for the 3.5+.5 fiasco?

AMD and AMD fans spent the last generation explaining that 8GB was needed for consistent performance and a variety of features. Would seem cheesy for that now to turn into "No one needs 8GB!"

That is what I responded to ....
 
Wow, you're actually talking about something else now.
Doesn't Nvidia still have that lawsuit coming where they are held liable for the 3.5+.5 fiasco?



That is what I responded to ....

Personally I think it's a dumb lawsuit. But I have no idea. I've owned 5 970's and not once in 2 years have I been able to force an issue with it. I currently have 3 actively running in gaming machines without an issue.

People complain about the dumbest shit imho.
 
Imagine if they had labeled it 3.5 GB and then someone discovered it actually had an extra 512 MB of slower RAM. Holy crap, an extra half-gig they didn't even tell us about! 970 is the best card ever! Instead the discovery went the other way and the reaction was negative, but in both cases it's the exact same card with the exact same performance at the exact same price. It was a completely manufactured controversy over nothing.
 
Imagine if they had labeled it 3.5 GB and then someone discovered it actually had an extra 512 MB of slower RAM. Holy crap, an extra half-gig they didn't even tell us about! 970 is the best card ever! Instead the discovery went the other way and the reaction was negative, but in both cases it's the exact same card with the exact same performance at the exact same price. It was a completely manufactured controversy over nothing.

The controversy wasn't over the lack of the 512 MB... It was that once the card used the 512 MB, it slowed down dramatically.

No, there wouldn't be a positive spin if the card was labeled 3.5 GB and people later found out that it has an extra 512 MB that would slow the card down if used. No one would be happy over an extra 512 MB that slows the card down, no matter how it was branded...
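For a rough sense of the numbers behind that slowdown (going by the commonly reported 970 memory specs, not anything stated in this thread):

total bandwidth ≈ 224 GB/s (256-bit bus, 7 Gbps GDDR5)
3.5 GB segment ≈ 7/8 × 224 ≈ 196 GB/s
0.5 GB segment ≈ 1/8 × 224 ≈ 28 GB/s

So anything that spills into the last half-gig is being accessed at a small fraction of the speed of the rest of the card's memory.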
 
Imagine if they had labeled it 3.5 GB and then someone discovered it actually had an extra 512 MB of slower RAM. Holy crap, an extra half-gig they didn't even tell us about! 970 is the best card ever! Instead the discovery went the other way and the reaction was negative, but in both cases it's the exact same card with the exact same performance at the exact same price. It was a completely manufactured controversy over nothing.
You're conveniently omitting the fact that Nvidia flat-out LIED about the specs of the card for 4½ months. And the 3.5GB + 0.5GB memory kludge wasn't acknowledged until they were called out on the carpet by enthusiasts who were trying to figure out why their cards weren't performing the way they should. Jen-Hsun didn't post a public apology about the 970 because he was so very proud of it.
 
You're conveniently omitting the fact that Nvidia flat-out LIED about the specs of the card for 4½ months. And the 3.5GB + 0.5GB memory kludge wasn't acknowledged until they were called out on the carpet by enthusiasts who were trying to figure out why their cards weren't performing the way they should. Jen-Hsun didn't post a public apology about the 970 because he was so very proud of it.

7 billion dollars and crazy market share with a card priced above what people would normally shell out. I would be proud also. The 970 made Nvidia a killing. I can see people forcing themselves to be butt hurt over a non-issue, because quite frankly I think the 970 was one of the best cards to ever come out in the history of cards for its price, performance and target. The 980, though, was a bag of shit.
 
Imagine if they had labeled it 3.5 GB and then someone discovered it actually had an extra 512 MB of slower RAM. Holy crap, an extra half-gig they didn't even tell us about! 970 is the best card ever! Instead the discovery went the other way and the reaction was negative, but in both cases it's the exact same card with the exact same performance at the exact same price. It was a completely manufactured controversy over nothing.
Had they done the former then yes, it would have been fine. The latter, however, is borderline false advertising -- hence the controversy.
 
You can't just go around saying 290X vs 390X unless you know which 290X you're comparing, because the last 290X models were majorly improved: 1350MHz Samsung memory and a non-throttling 1020MHz GPU clock, with much better cooling and a 6-phase power redesign. It also thrashes a reference 290X and falls between the 390/390X in performance with just 4GB of RAM.

Now, the link from the OP showing the 3DMark score: that score is 2000+ higher than the improved 290X, and it is faster than the 390 because of the SP count. So that is 390X/980 GTX fast.

This is an improved 290X on PCI Express 2 (6-year-old hardware) at stock speed; I could dig more out of it just by ramping up the CPU more.

I scored P13 164 in 3DMark 11 Performance
 
7 billion dollars and crazy market share with a card priced above what people would normally shell out. I would be proud also. The 970 made Nvidia a killing. I can see people forcing themselves to be butt hurt over a non-issue, because quite frankly I think the 970 was one of the best cards to ever come out in the history of cards for its price, performance and target. The 980, though, was a bag of shit.

Of course you would say that.

trentchau said:
I've owned 5 970's
 
Get the 8GB.

Check the sig: I bought a 4 GB version of the gtx 670. This was when a lot of folks said 2 gb was usable and 4 was a waste. Still able to game with mods and hi res textures at 1080 just fine.

If you buy the 4gb, you may regret it. If you get the 8, you'll know you got all the perf the manufacturer could offer.
 
The controversy wasn't over the lack of the 512 MB... It was that once the card used the 512 MB, it slowed down dramatically.

No, there wouldn't be a positive spin if the card was labeled 3.5 GB and people later found out that it has an extra 512 MB that would slow the card down if used. No one would be happy over an extra 512 MB that slows the card down, no matter how it was branded...
It was only in special cases that the last 512 was used and even more rare for a situation where the GPU could have run those settings at a decent frame rate anyway. For the issue to actually affect the user experience was so rare that it took months for anyone to figure it out. If you wouldn't have noticed if no one had pointed it out, it's not worth getting worked up over.
 
It was only in special cases that the last 512 was used and even more rare for a situation where the GPU could have run those settings at a decent frame rate anyway. For the issue to actually affect the user experience was so rare that it took months for anyone to figure it out. If you wouldn't have noticed if no one had pointed it out, it's not worth getting worked up over.

This. See sig. My 970 was a huge upgrade. I'd be lying if I said that the 3.5 + .5 issue was anything I noticed, either on my own or after being notified about it.
 
Imagine if they had labeled it 3.5 GB and then someone discovered it actually had an extra 512 MB of slower RAM. Holy crap, an extra half-gig they didn't even tell us about! 970 is the best card ever! Instead the discovery went the other way and the reaction was negative, but in both cases it's the exact same card with the exact same performance at the exact same price. It was a completely manufactured controversy over nothing.
You are deflecting the real issue. Nvidia lied about the card's specs. Not just the VRAM either. They used deceptive advertising with inaccurate specifications. Yet some people think this is OK. I can't understand it, personally.
 
You are deflecting the real issue. Nvidia lied about the card's specs. Not just the VRAM either. They used deceptive advertising with inaccurate specifications. Yet some people think this is OK. I can't understand it, personally.

Maybe I'm naive, but I took it as an honest mistake.
 
Maybe I'm naive, but I took it as an honest mistake.

I don't really see it as an honest mistake. Not sure if you are joking or not. That said, both Nvidia and AMD have lied about graphics cards in the past. It's just a sad part of business these days.
 
It was only in special cases that the last 512 was used and even more rare for a situation where the GPU could have run those settings at a decent frame rate anyway. For the issue to actually affect the user experience was so rare that it took months for anyone to figure it out. If you wouldn't have noticed if no one had pointed it out, it's not worth getting worked up over.
Actually you are glossing over the big-picture issue here. It isn't about the 970 in and of itself, it is about a company lying about the specs of their product to consumers. It wasn't just the memory but the entire spec sheet of the card, as a great deal of the original specs were cut down or disabled. It is about a company that lied to its customers, likely to win in the market against a competitor's 4GB card that even now it can't beat performance-wise.
 
Actually you are glossing over the big-picture issue here. It isn't about the 970 in and of itself, it is about a company lying about the specs of their product to consumers. It wasn't just the memory but the entire spec sheet of the card, as a great deal of the original specs were cut down or disabled. It is about a company that lied to its customers, likely to win in the market against a competitor's 4GB card that even now it can't beat performance-wise.
You sure have plenty of OC vs OC benchmarks to back up that last claim, don't you?
 