No AMD 300-series review samples?

For some reason they didn't test the 780 ti. Probably because it would look bad.
2560_1440.gif

Keep in mind that the 290 ran between the 780 and the Titan. In TW3 it's 28% faster than the 780 and 14% faster than the Titan. This is a game that is poorly optimized for AMD as well. It's not the only game, and these aren't false claims.

If anything, the 780 is only 3 frames (11%) faster than a 280X (7970), when at release the 780 was around 33% faster than a 7970 GHz Edition. Personally, I don't think Nvidia is crippling Kepler performance; they're just not optimizing it in new games.
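For anyone who wants to sanity-check figures like these, here is a quick sketch (Python, with made-up FPS numbers purely for illustration, not taken from any review) of how the "X% faster" numbers are normally derived:

```python
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, in percent (B is the baseline)."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical example numbers, NOT from any benchmark:
fps_290, fps_780 = 48.0, 37.5
print(f"290 vs 780: {percent_faster(fps_290, fps_780):.0f}% faster")

# Note the baseline matters: "A is 28% faster than B" is not the same
# claim as "B is 28% slower than A".
```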
 
It really boggles the mind that some are trying to defend Nvidia in this. It's either: A) they slowed down their earlier cards to make their newer ones look better, or B) their Kepler architecture wasn't as good as they advertised and as we thought compared to Hawaii. Both make Nvidia look bad, and both have implications for their future cards. What if the same happens with the 980 after Pascal is released?

But unfortunately, thanks to the army of fanboys, this will be forgotten soon.

Where doesn't "Kepler" meet the claimed performance?
You feel the need to post, but don't read the thread and thus fail, because you are clueless about option C:

"Maxwell" was an evolution of Kepler" in all but one field: FP64
Just like "Kepler" was an evolution of "Fermi".

Are you surprised that an older architecture cannot keep up with a newer one?
I bet you cannot even find a single benchmark where GK110 has regressed.

If you don't know what that means, I will help you a little:
You are wrong, because you seem to lack sufficient knowledge about the topic, and my bet is that you, like some others, are just parroting stuff you have heard...but do not understand.

But I will indulge you.

Show me where GK110 has regressed performance-wise, shouldn't be hard...you sound VERY sure...so enlighten us...please :)
 
If anything, the 780 is only 3 frames (11%) faster than a 280X (7970), when at release the 780 was around 33% faster than a 7970 GHz Edition. Personally, I don't think Nvidia is crippling Kepler performance; they're just not optimizing it in new games.

What game has the 780 lost 20% performance in?
Or are you using "data" in the worst possible manner: The wrong way?
 
What game has the 780 lost 20% performance in?
Or are you using "data" in the worst possible manner: The wrong way?

I'm interested in knowing what you think is the issue.

780 has always traded blows with the 290. That "was" its level of performance.
Now it is trading blows at the 280-285 level?

Are you a proponent then of the opposite theory that AMD was holding back 290 performance so that they could slowly increase performance through driver manipulation to make it look like their card had greater longevity?
Or perhaps, AMD is so bad at drivers that they are just now figuring out how to make the card really shine and it was always better than a 780.

All I can tell you is I wish I had bought a 290 when I bought my 780. I would be getting more for my money right now.
Hell, people with early 280/7970 should be ecstatic! They really got their money's worth.
 
Has Kepler lost performance or did GCN gain performance?
God forbid AMD did something right over the last year.
 
I'm interested in knowing what you think is the issue.

780 has always traded blows with the 290. That "was" its level of performance.
Now it is trading blows at the 280-285 level?

Are you a proponent then of the opposite theory that AMD was holding back 290 performance so that they could slowly increase performance through driver manipulation to make it look like their card had greater longevity?
Or perhaps, AMD is so bad at drivers that they are just now figuring out how to make the card really shine and it was always better than a 780.

All I can tell you is I wish I had bought a 290 when I bought my 780. I would be getting more for my money right now.
Hell, people with early 280/7970 should be ecstatic! They really got their money's worth.

Can you add one more false claim? Then it would be really fun.
If you are serious about putting words into my mouth, let me know...I will respond then
 
Has Kepler lost performance or did GCN gain performance?
God forbid AMD did something right over the last year.

Kepler was a terrible design. The only reason that AMD is so low on the Steam Charts is because R9 290, R9 290x, R9 295x2 were completely bought out by Bitcoin miners. This caused video card buyers to get substandard Nvidia equivalents as the AMD cards were completely sold out for a year. I had to buy a used one from a miner because all retailers were sold out. Remember NewEgg when they were selling the $1,300 290x?

I mean hell the Nvidia cards were so gimped that they couldn't even process Bitcoin mining in a timely fashion. Just the truth. By removing all the CUDA cores Nvidia was able to save on heat and develop a false sense of efficiency with their community. Those cards were the equivalent to buying car stereo amps from a swap meet. Feels nice and heavy until you find out later that there is nothing of substance inside.

Kepler; the lost generation.
 
There was an article floating around discussing how Nvidia cards are going to show their age in DX12 compute performance.
It was an interesting read, not sure where it went. Buried by Nvidia shills somewhere, I imagine.
 
So let me get this straight (last post of the day, got stuff to do)

Because in some newer games (not all) Kepler starts to show its age, people claim "planned obsolescence"?

This is just as hilarious as "GameWorks is a black box!!!" or "Look at Crysis 2 tessellation" :D
I need to visit more forums...even if it's kinda scary to read a lot of the posts...it is also very FUNNY!


1. I don't claim AMD held back anything in their drivers. Their current DX11 implementation does have a larger overhead than NVIDIA's, but that is a different topic.
2. I don't claim NVIDIA made "Kepler" with "planned obsolescence".

There are NO facts supporting either, and both claims are just as invalid/stupid.

"Kepler" performs like always, there has be ZERO performance degradation.
ALL reviews show this.
What is happingng is that i a few games (less than 5 that I am aware of) the compute load has increased.
The schedulers in each SM have to do more work now (a hint to those that want real facts about where to "dig").

What does this mean?
It means that a larger portion (percentage-wise) of the CUDA cores in "Kepler" sit inactive.

Again, I point people to charts like this:

nvidia-kepler-vs-maxwell-sm.gif


See the finer graining in "Maxwell" vs. "Kepler"?

This is what is giving "Maxwell" the edge...in games where shaders need to make room for more and more compute (SSAO, PhysX, HairWorks, shader AA, etc.)
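To make the granularity point concrete, here is a rough toy model (Python; the block widths and per-job lane counts are invented for illustration and deliberately oversimplify how warps are actually scheduled). It only shows that, if work is handed out at the granularity of a whole execution block, a coarser block leaves more lanes idle:

```python
# Toy model only: pretend each piece of work occupies one whole execution
# block for a "cycle", and any lanes in that block it cannot fill sit idle.
def utilization(job_lanes, block_width, total_cores):
    blocks = total_cores // block_width
    used = sum(min(lanes, block_width) for lanes in job_lanes[:blocks])
    return used / total_cores

jobs = [48, 20, 64, 10]  # hypothetical mixed graphics + compute work, in lanes

# Coarse-grained: one big 192-wide block ("Kepler-like", heavily simplified)
print(f"coarse: {utilization(jobs, 192, 192):.0%} of cores busy")
# Fine-grained: four independent 32-wide blocks ("Maxwell-like", heavily simplified)
print(f"fine:   {utilization(jobs, 32, 128):.0%} of cores busy")
```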

No evil doing in drivers.
No planned obsolescence (if you bother to look at the cadence from G80 -> G92 -> GT200 -> GF100 -> GK110 -> GM200 you will spot the natural evolution of the architecture...and you would need to claim that NVIDIA could see GPU loads 2½ years into the future *cough*).

It really is that simple (the architectural differences).

But I guess if you don't have a clue (or an agenda)...the other options seem "better".
 
Kepler was a terrible design. The only reason that AMD is so low on the Steam Charts is because R9 290, R9 290x, R9 295x2 were completely bought out by Bitcoin miners. This caused video card buyers to get substandard Nvidia equivalents as the AMD cards were completely sold out for a year. I had to buy a used one from a miner because all retailers were sold out. Remember NewEgg when they were selling the $1,300 290x?

I mean hell the Nvidia cards were so gimped that they couldn't even process Bitcoin mining in a timely fashion. Just the truth. By removing all the CUDA cores Nvidia was able to save on heat and develop a false sense of efficiency with their community. Those cards were the equivalent to buying car stereo amps from a swap meet. Feels nice and heavy until you find out later that there is nothing of substance inside.

Kepler; the lost generation.


If you want a card to mine bitcoins with, then get a Radeon, but even before the mining craze began it was pretty obvious you weren't going to make money doing that stuff. If you want a gaming card, get a gaming card, pretty simple. The R9 290X has the same performance as the 780 Ti, but has 1 more gig of VRAM, which helps in newer games at higher settings; easy to see where the 780 Ti is held back.

The Radeons are very inefficient in power usage; this was seen even back then, where the 780 had a 20% advantage, and with the previous generation of cards as well. This will be seen again with the R9 390 and very likely Fury. To say what you are saying is very naive and totally misses the engineering effort. When chips are designed they look at everything before they begin, and one of the Maxwell 1 and 2 design goals was power usage (performance per watt). AMD started the perf-per-watt race with the 4xxx series; they just haven't been able to keep up.

It wasn't the bitcoin mining that created scarcity for the 290X; they had low supply.

As for the "lost generation": the 7xxx series and the R9 290 have lost so much market share for AMD that they are now at the lowest point in their history. To say Kepler was the lost generation? No, those two gens have hurt AMD more; they hurt themselves. All of the trouble they are in now is because of the inability of AMD products to compete, at the time of release, in the metrics they themselves started. I am very sure that when they saw the 750 Ti they knew they were in trouble for the coming generation.
 
Kepler was a terrible design. The only reason that AMD is so low on the Steam Charts is because R9 290, R9 290x, R9 295x2 were completely bought out by Bitcoin miners. This caused video card buyers to get substandard Nvidia equivalents as the AMD cards were completely sold out for a year. I had to buy a used one from a miner because all retailers were sold out. Remember NewEgg when they were selling the $1,300 290x?

I mean hell the Nvidia cards were so gimped that they couldn't even process Bitcoin mining in a timely fashion. Just the truth. By removing all the CUDA cores Nvidia was able to save on heat and develop a false sense of efficiency with their community. Those cards were the equivalent to buying car stereo amps from a swap meet. Feels nice and heavy until you find out later that there is nothing of substance inside.

Kepler; the lost generation.

It was so terrible that AMD won over the consumers...oh wait, lol

bNqJYgA.png


RDF is not just for Apple consumers, I can see :D
 
I can explain it to you, but I cannot understand it for you.

But I like how the tune is no longer "GameWorks is a black box, AMD can do nothing to optimize!"
Now it has transformed into "GameWorks is a black box, AMD cannot do 100% the same as NVIDIA!".

I wonder how much "thinner" we can make this "argument"...since it is based on PR FUD, not facts ;)

Why don't you start addressing the TECHNICAL aspects of my post, instead of trying to go all personal...start a fight and try to get a mod involved...your post stinks of an attempt at derailing...because it lacks any technical response and is all emotional.

So in the future I will only respond to the technical parts of your posts and ignore:
- Fallacy of shifting the burden of proof (AMD started whining...but they haven't documented anything.)
- Ad hominem (I wish NVIDIA would pay me to game, but alas I am forced to work like most other people)
- Red herrings.

That leaves me with the following to respond to, after cutting down your post:

" "

Have a nice day :)

You have given zero proof, inferred points into others' posts that are untrue, and shown you have zero knowledge of the facts. Prove where I said GameWorks is a black box and AMD can do NOTHING. I never said that, so why do you claim I did? Easy: it is you deflecting your ignorance and inability to debate in a logical manner. You have yet to refute any claim I have made directly. I posted an article that says the same EXACT thing I stated. You have yet to post any proof or make any real arguments. It's OK. Not everyone is capable of a mature and rational discussion.
 
*Fallacy: Shifting the burden of proof*
*Fallacy: Ad hominem*
*Fallacy: Red herring*

Have a nice day (you sound very angry, might want to get that looked into), see you tomorrow...if you have more than fallacies ;)
 
It was so terrible that AMD won over the consumers...oh wait, lol


RDF is not just for Apple consumers, I can see :D

We couldn't buy the Radeons because they were sold out to businesses mining Bitcoin in factories. They were buying hundreds of cards at a time. These factories were paying over $1,000 for a 290x. Obviously you don't know much about Nvidia or AMD. :)
 
You have given zero proof, inferred points into others' posts that are untrue, and shown you have zero knowledge of the facts. Prove where I said GameWorks is a black box and AMD can do NOTHING.

It is blind optimization. No one said they can't; just that any optimizations are trial and error. The Witcher 3 drivers from AMD increased performance, but the issue then is what % of perfect optimization they were released at. It's like being blindfolded and being told to put 10 pages in order. There is a chance you get it right, but it's far more likely you don't; it then comes down to how close you got.


Being blind and working with ASM are two different things; this is why I stated it's harder to optimize without the original source, but it is still doable, and you can get close to what you could do with high-level code. Any good coder is still able to optimize in ASM as a final step if they want as much performance as they can get. But with today's hardware moving so fast, this step is usually not done.

The difference between ASM and high-level code is the time it takes to write ASM; just food for thought, high-level code might be 2 lines but in ASM it might be 20 lines.

It's also easier to make mistakes in ASM, because calling conventions are not checked to see whether they are being used properly.

There are quite a few other differences too.
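Just to put a picture on the "2 lines vs. 20 lines" point (this is Python bytecode rather than the shader/driver assembly being discussed, so take it purely as an analogy for how much a high-level line expands when lowered):

```python
import dis

# Two lines of high-level code...
def dot(xs, ys):
    return sum(x * y for x, y in zip(xs, ys))

# ...turn into a couple of dozen low-level instructions once lowered.
dis.dis(dot)
print(dot([1, 2, 3], [4, 5, 6]))  # 32
```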
 
We couldn't buy the Radeons because they were sold out to businesses mining Bitcoin in factories. They were buying hundreds of cards at a time. These factories were paying over $1,000 for a 290x. Obviously you don't know much about Nvidia or AMD. :)

You know why this doesn't make sense right? Where is the marketshare.........

And AMD had a big stockpile of inventory left over, per the last few conference calls.
 
You know why this doesn't make sense right? Where is the marketshare.........

And AMD had a big stockpile of inventory left over, per the last few conference calls.

That was when the 900 series came out, because Nvidia finally caught up. The market share is in China, India, etc., wherever the Bitcoin factories dumped their old gear after they ran it to death.
 
No evil doing in drivers.
No planned obsolescence (if you bother to look at the cadence from G80 -> G92 -> GT200 -> GF100 -> GK110 -> GM200 you will spot the natural evolution of the architecture...and you would need to claim that NVIDIA could see GPU loads 2½ years into the future *cough*).

It really is that simple (the architectural differences).

But I guess if you don't have a clue (or an agenda)...the other options seem "better".

So what's your explanation for the Witcher 3 Kepler performance driver?

The best-case scenario is that Nvidia's driver team has limited resources and focused on their latest tech, Maxwell, then turned to Kepler afterwards.

The worst-case scenario is that Nvidia focused on Maxwell while ignoring Kepler until Kepler owners started having a fit all over the Internet.
 
That was when the 900 series came out, because Nvidia finally caught up. The market share is in China, India, etc., wherever the Bitcoin factories dumped their old gear after they ran it to death.


No, market share is based on cards sold in a given period, based on JPR and Mercury numbers, and these are worldwide numbers, not what is in use at the time. There really is no way to see what cards people are using outside of Steam surveys and such, and that is not a good metric to show market share.
 
No, market share is based on cards sold in a given period, based on JPR and Mercury numbers, not what is in use at the time. There really is no way to see what cards people are using outside of Steam surveys and such, and that is not a good metric to show market share.

Well, did you think that AMD stopped making video cards for a year? Because you sure couldn't buy one at retail for a long time. Where do you think they went? Stuck in a warehouse for a 2015 financial day?
 
Well, did you think that AMD stopped making video cards for a year? Because you sure couldn't buy one at retail for a long time. Where do you think they went? Stuck in a warehouse for a 2015 financial day?


Look, if all those cards were being sold to factories as you say, the market share numbers would be higher.

JPR and Mercury get their numbers from the whole chain: manufacturing, to point of sale to retailers, to sales to consumers. Factories would be part of that too.
 
I was planning on CF'ing a pair of 290s for the wife, and maybe a pair of 380/390s for me. Should I even bother with the 3xx series when they drop in price near the end of the year?
 
Just wait and see; we should have benchmarks and reviews very soon. Also, do you need to CF? Personally, I never use multiple cards in my home system; there are just too many downfalls to them (both SLI and CF).
 
I wouldn't buy more than one GPU until DX12 is proven to work well with multiple GPU setups. Or you are going to do something like AMD LiquidVR and the Nvidia equivalent.
 
Ditto, multi-card is more trouble than it's worth.

Where does this fallacy come from? Yes Crossfire is a disaster thanks to AMD not giving adequate profile support but SLI works great in almost every single game I play and I have quite an extensive library on Steam + Origin. In over 100+ games, I can only think of ONE game that I own which doesn't support SLI outright and that's Ark Evolved (because it uses UE4). Other than that, SLI tends to work just fine in every game with no stutter and decent utilization.
 
Personal experience: I have had too many issues with SLI and CF for the amount of money I spent. Of course that was a while back, but still, I'm just more comfortable putting something into my computer and having it work.
 
Like I said, Crossfire definitely has issues because of the lack of support but I don't see that with SLI at all. I think some people tend to focus on a few unique cases and then try to apply it broadly. I've used SLI for the past few years now without any issue (580M SLI/680M SLI/680 SLI/Titan SLI/Titan X SLI).
 
It was so terrible that AMD won over the consumers...oh wait, lol

bNqJYgA.png


RDF is not just for Apple consumers, I can see :D

If you look at it right, AMD was holding steady for a while, and actually winning back market share after the 280x-290x, up until the 295x2, GT 730/740. Not enough to turn the tables, obviously, and it fell back quick, but you can't say the 290x was a bust.
 
Personal experience: I have had too many issues with SLI and CF for the amount of money I spent. Of course that was a while back, but still, I'm just more comfortable putting something into my computer and having it work.
Me too. I'm thankful I avoided bumpgate and didn't have cards that were killed by drivers, and wasn't running Nvidia when Vista came out. Of course that was a while back.
 
Where does this fallacy come from? Yes Crossfire is a disaster thanks to AMD not giving adequate profile support but SLI works great in almost every single game I play and I have quite an extensive library on Steam + Origin. In over 100+ games, I can only think of ONE game that I own which doesn't support SLI outright and that's Ark Evolved (because it uses UE4). Other than that, SLI tends to work just fine in every game with no stutter and decent utilization.

No fallacy here; I've used both Crossfire and SLI and had nothing but problems with both.

No, thank you.
 
If you look at it right, AMD was holding steady for a while, and actually winning back market share after the 280x-290x, up until the 295x2, GT 730/740. Not enough to turn the tables, obviously, and it fell back quick, but you can't say the 290x was a bust.

They went up 5% and then steadily dropped. The 290X was a bust. After today's news things are not looking any better.
 
They went up 5% and then steadily dropped. The 290X was a bust. After today's news things are not looking any better.

Yeah, I'm sure OEMs and boutique builders will hate tiny high-performance GPUs.
'Cause you know, no one thought the GTX 970 ITX was pretty awesome for that form factor.
 
Yeah, I'm sure OEMs and boutique builders will hate tiny high-performance GPUs.
'Cause you know, no one thought the GTX 970 ITX was pretty awesome for that form factor.

True. Also small form factor PCs seem to be on the rise.
 