AMD's chance to catch up

Thread starter: Deleted member 93354 (Guest)
So the Maxwell cards have the same IPC as Pascal (edit: sorry, it was late at night and I was tired). And the 1080 actually starts to lose efficiency because the memory on the board starts to bottleneck performance.

Here's your chance AMD...Don't blow it.
 
Well, the 480 loses 'IPC' (shader efficiency, right?) compared to Hawaii, for example. Just look at how it performs in AotS.

Also, in Warhammer: I just saw PCGH updated their benchmarks, and the RX 480 is not faring well at all.

I'm gonna post the benches in a minute. You guys know how much I love arguing against the 'NV DX11, AMD DX12' view :p It's really game dependent, it just depends on where the bottleneck falls on every card.

Internal benchmark, DX11 vs DX12
[four benchmark charts]

So we have:

Forza Apex and Gears of War running more or less equally well (comparing cards in the same class: 390(X) vs 970/980, Fury X vs 980 Ti, etc.)

Hitman and AotS (and Quantum Bork, if you're so inclined) running better on AMD hardware

Rise of the Tomb Raider and Total War: Warhammer running better on NV

So where is AMD's inherent DX12 advantage? Seems like it just depends on the game to me. Sure, AMD has tons to gain from API draw-call submission being properly multithreaded, and sure, AMD can extract more performance from their hardware in DX12 (multithreading + asynchronous compute), but really, it's the same as always. Some games will favor one IHV over the other.

What upsets (irritates) me personally, and has led to many heated arguments in the past, is the tendency for people (online) to dismiss anything favoring NV as being some kind of conspiracy against AMD. Tomb Raider was lauded for using PureHair, and many "AMD fans" (for lack of a better word) held very positive views of it. DX12 isn't going to shift hardware bottlenecks, it's going to relieve software ones; no API in the world will allow AMD to make up for lackluster geometry performance, and no API in the world will allow NV to make up for lower raw shader throughput across the board.

Edit:

A common misconception I've encountered is the following: some people seem to think that geometry processing is handled by entirely different units, as if the computation isn't performed by the very same ALUs that work on pixels. This is untrue, as many of you here should know. The actual performance of a GPU in games should in theory be proportional to its shader throughput; a 9 TFLOP GPU should be 50% faster than a 6 TFLOP GPU. If it isn't, there's a bottleneck, and shader efficiency is considered to be lowered.
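A minimal sketch of that proportionality argument (the TFLOP figures are the example numbers from the paragraph above; the frame rates are purely hypothetical):

```python
# Compare the measured speedup of one GPU over another to the speedup its
# shader throughput predicts. A ratio below 1.0 means something other than
# the shaders (geometry, bandwidth, CPU) is holding the faster card back.
def scaling_efficiency(fps_fast, fps_slow, tflops_fast, tflops_slow):
    measured = fps_fast / fps_slow
    theoretical = tflops_fast / tflops_slow
    return measured / theoretical

# Hypothetical frame rates for a 9 TFLOP card vs a 6 TFLOP card:
print(scaling_efficiency(150, 100, 9.0, 6.0))  # 1.0 -> perfect shader-limited scaling
print(scaling_efficiency(120, 100, 9.0, 6.0))  # 0.8 -> bottlenecked elsewhere
```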
 
Your points are fair, and I'm not an NV vs AMD conspiracy theorist, nor do I have any particular love or hate for one vs the other.
The problem is that the RX 480 is a $239 card while the 1080 is, well, to be honest with all the Founders Edition vs MSRP vs Real MSRP coupled with limited supply - I'm not even sure what it's supposed to sell for.

I was in the market for an upgrade recently (knowing that the RX 480 launch day was coming). I was very excited by the prospect of a 1070 at ~$380 as well. So I made a pact with myself: I'd buy either a 1070 or a 480, depending on which I could get first at MSRP +/- $20 or so. I ended up bagging an 8GB 480 for $239 on launch day. So for me the choice was based largely on 'availability', but with an eye towards value as well (I kind of wasn't willing to pay hundreds over MSRP).

But that's just me. YMMV. Like I said, I wasn't buying on 'hate' or 'conspiracy' or for 'fanboy' reasons.
 
Should have waited a couple of weeks to see what the 1060 did for you.
 
Mod, while I get my link was deleted because "Reddit is not a source", the person the author quoted, who works for Sapphire, actually is in that thread and validates what was said (Vapor-X is the Sapphire rep). Shouldn't that lend enough validity that the thread in this case should have been an exception and not been deleted?
 
[Guru3D benchmark chart]

Guru3D always paints a night-and-day difference compared to the links leldra posts... gee, I wonder why?


Mainly because they don't retest all the cards, they just add new results to existing data, they're not running the same settings, and they don't do DX11 vs DX12 comparisons :)

Oh, and they're just using the built-in benchmark

The G3D results you're seeing span different driver sets and different game versions
 
Well, I'll leave Reddit out of it and just reference the YouTube video:

For those who don't feel like watching the video, Ed from Sapphire on the GTX 1060: "The 480 freaked Nvidia out... they're scared to death of this, they're rushing the 1060 to market."
 
Words of wisdom from an AMD-exclusive AIB. Under no circumstances would I take anything he says about the matter at less than face value; it's the best independent source you'll ever get.
 
Well, I'll leave Reddit out of it and just reference the YouTube video:

For those who don't feel like watching the video, Ed from Sapphire on the GTX 1060: "The 480 freaked Nvidia out... they're scared to death of this, they're rushing the 1060 to market."

Nvidia using a non-sleeved pigtail to move the power plug on the card does sound very rushed lol
 
Sapphire Ed (I presume): "Was there an issue with the chip, and 8 gig didn't work correctly?"

Yeah. It's called bus width. The issue is bus width. LOL.

Either this guy is being very silly, or he's a marketing dude. I genuinely don't know who he is.
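The arithmetic behind that jab, as a minimal sketch (the chip counts are the standard configurations for these cards, stated here as an assumption rather than taken from the video):

```python
# Each GDDR5 package exposes a 32-bit interface, so total bus width is
# simply 32 bits times the number of memory chips on the board.
GDDR5_CHIP_WIDTH_BITS = 32

def bus_width_bits(num_chips):
    return num_chips * GDDR5_CHIP_WIDTH_BITS

print(bus_width_bits(6))  # 192 -> six chips, the GTX 1060's 192-bit bus
print(bus_width_bits(8))  # 256 -> eight chips, e.g. the RX 480's 256-bit bus
```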
 
Lol, his point still stands; it's obvious the PCB was not designed for the 1060 as it currently exists. Of course it's not a big deal, nor does he claim it is.
 
Well, I'll leave Reddit out of it and just reference the YouTube video:

For those who don't feel like watching the video, Ed from Sapphire on the GTX 1060: "The 480 freaked Nvidia out... they're scared to death of this, they're rushing the 1060 to market."

Of course he is going to say that; they only make AMD GPUs, and his business depends on hyperbole.
 

Look, I just put it out there. I can't force people to believe that sometimes the opposition makes valid points. I myself prefer to keep an open mind and consider all sources from the same starting point, giving them the same benefit of the doubt if what they put forth appears to be true.

I think it's utterly stupid to dismiss something out of hand. Take it with a larger grain of salt, sure, but still, I can't help but wonder, for some of you, whether you would be making the same statements if red and green were switched...
 
No, this would be no different if EVGA came out and made these kinds of claims about AMD; no one should trust a vendor-specific AIB manufacturer to give a clear picture.
 

At the same time, it doesn't mean a vendor-specific AIB is lying. Like I said, while there is a need to take such things with a larger grain of salt, I find it foolish to dismiss them out of hand.
 

Sure, but when a representative from said vendor-exclusive AIB partner makes fun of the fact that there are "only" six VRAM chips on a 192-bit GPU, you really gotta start asking yourself questions.
 
It was probably designed for a GP104, not GP106 (hint: the 1060 Ti).

I doubt we'll see a Ti unless AMD builds something that Nvidia wants to crush. That seems to be the function of Ti cards.

The GTX 1060 PCB does seem to have two unused BGA solder pads above the GPU; maybe GP106 does have a 256-bit bus that's just disabled for the GTX 1060. Who knows? The Founders Edition design does seem a little rushed. Nvidia has already announced that it will have limited availability; most cards will probably be custom designs.
 
You could be correct. However, it appears that Nvidia has decided, for whatever reason, to release a lot of cards across the product stack regardless of competition from AMD. The only thing we know is coming from AMD that is supposedly faster than the RX 480 is Vega, unless you want to postulate that an RX 490 is based on Polaris (as rumored in another thread here at [H]), and that would probably only compete with the 1070 in most scenarios where dual GPUs are supported.
 
I highly doubt we will see Ti variants of any cards this year, except maybe a 1050 Ti.
 
Mainly because they don't retest all the cards, they just add new results to existing data, they're not running the same settings, and they don't do DX11 vs DX12 comparisons :)

Oh, and they're just using the built-in benchmark

The G3D results you're seeing span different driver sets and different game versions

Yeah, their performance reporting methodology is terrible. I understand it's hard to retest everything every time, but TPU and other sites seem to pull it off.
 

I've been reading that website for 10 years, and I gotta be honest, the reviews are not all that great, and I feel like the quality has been going downhill in the last few years. Way too much copy-pasted content, way too little attention to detail.
 
Honestly, leldra... in this forum subsection I don't think many of us care that much whether they were freaked out or not. It's a YouTube video, for goodness' sake, not to mention who made it. The guy was making fun of an Nvidia PCB design, and it's good for a laugh, since you have to take time out from your usual anti-AMD troll threads lol. After all, I'm pretty sure a million PCIe slots for 480 owners went up in flames last week. :)
 
[screenshot of a Reddit post]


This is 100% NOT confirmed to be an actual Nvidia engineer; I am not posting this because of the (claimed) authority of the poster, just because it makes sense and I agree with what is said.

Edit:

This guy's posts are actually great:

overview for Manny_Sundavi

Right. I always said since the release of the 1070 that it was designed to do battle with AMD directly.

RX 480 supports GDDR5X. If the part had 1070-level performance, it would have needed expensive and fairly rare GDDR5X to feed it (due to only matching Maxwell-level DCC).

The cut 1070/GDDR5 would have run laps around AMD's uncut RX 480 GDDR5X part (better availability, higher profit margins), but it didn't have to. So they pushed out their midrange faster than planned.
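To put rough numbers on why the memory choice matters, here is the back-of-the-envelope bandwidth math (the 8 and 10 Gbps data rates are typical GDDR5/GDDR5X speeds of the period, assumed here purely for illustration):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 8))   # 256.0 GB/s -> 256-bit GDDR5 at 8 Gbps
print(bandwidth_gb_s(256, 10))  # 320.0 GB/s -> 256-bit GDDR5X at 10 Gbps
```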
 
The 480 has half the ACEs Fury has.

Does it?

Also half the ROPs and half the memory bandwidth, and promises of improved geometry performance.

There was/is a G5X 480?
 
I can't wait till someone posts a video from adoredtv saying we should listen to that regarding the 1060.
 
So the Maxwell cards have the same IPC as Pascal (edit: sorry, it was late at night and I was tired). And the 1080 actually starts to lose efficiency because the memory on the board starts to bottleneck performance.

Here's your chance AMD...Don't blow it.


I do hope they capitalize on any chance they get.

 
Does it?

Also half the ROPs and half the memory bandwidth, and promises of improved geometry performance.

There was/is a G5X 480?


Nobody knows. The only thing we know for certain is the RX 480 die supports GDDR5X, so it was on the cards if they could clock it into the stratosphere. You can be sure they put GDDR5X on a test board somewhere to validate this, which is probably why it was in tons of rumors.

Just think of the 4870: you would have never heard about the GDDR5 support if they hadn't clocked it that high. But because they did, they launched with both GDDR5 and GDDR3 PCBs.

32 ROPs don't matter so much if you can clock them at 1.5 GHz. It's already matching a monster like the 390 with half the ROPs.
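As a rough sketch of that ROP trade-off (the clocks below are round illustrative numbers, not the actual 480 or 390 specs):

```python
# Pixel fillrate in GPixel/s = ROP count * clock in GHz, so fewer ROPs at a
# higher clock can land in the same place as more ROPs at a lower clock.
def fillrate_gpix_s(rops, clock_ghz):
    return rops * clock_ghz

print(fillrate_gpix_s(32, 1.5))   # 48.0 -> 32 ROPs at 1.5 GHz
print(fillrate_gpix_s(64, 0.75))  # 48.0 -> 64 ROPs at 0.75 GHz
```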
 
Mod, while I get my link was deleted because "Reddit is not a source", the person the author quoted, who works for Sapphire, actually is in that thread and validates what was said (Vapor-X is the Sapphire rep). Shouldn't that lend enough validity that the thread in this case should have been an exception and not been deleted?

Using Sapphire as a source for Nvidia anything? This is like going to a Ford dealership and asking them about Chevy.
 
AMD might not be "winning" for me, but it's sure as hell winning in the mid-range market. I have two brothers-in-law who are both mid-range PC gamers. They both play at 1920x1080 and run their hardware for several years before even considering an upgrade. Right now one is using a GTX 460, and the other a GTX 760. For $200, they get a huge upgrade and are able to play the newest games at ultra/high settings just fine. Most people here don't like that AMD doesn't have the top card, but for people like my brothers-in-law, it's not even close: AMD all day for them.

Again, don't get me wrong. I would love to see AMD on top again (even more so with their CPUs). I think they hit it out of the park, though, with a $200 card.
 