AMD takes yet another market share blow

Listen, it could be worse: at least AMD is not the Cyrix/S3 zombie that lives on in VIA to this very day. Everyone knew this year was going to be rough for AMD, since they had essentially nothing new to give us besides Excavator and meh GPU refreshes (Fury was a surprise). Next year AMD has to be competitive with its 14nm product stack, or it's bankruptcy before 2019. I'd say they have just enough in the tank, with all the talented people AMD has acquired over the past five years, to save their bacon. After all, they have been here before.
 
Listen, it could be worse: at least AMD is not the Cyrix/S3 zombie that lives on in VIA to this very day. Next year has to be a hit with its 14nm product stack, or it's bankruptcy before 2019, which is when the first large bond from the fab sale comes due.

As much as I want to be excited for Zen...

I just can't anymore. I just can't be fed hype. Bulldozer and Fury were too much for my poor hype-sack to handle.
 
Fury did what AMD needed it to do: launch and compete in a sector it was absent from, not to mention secure priority from Hynix for HBM2 dies over Nvidia. Pascal may be a bit late next year because of this.
 
That chart only goes out to Q2 2015, and Fury launched at the very end of Q2 2015. Give it a quarter to see what Fury's impact is.
 
Listen, it could be worse: at least AMD is not the Cyrix/S3 zombie that lives on in VIA to this very day. Everyone knew this year was going to be rough for AMD, since they had essentially nothing new to give us besides Excavator and meh GPU refreshes (Fury was a surprise). Next year AMD has to be competitive with its 14nm product stack, or it's bankruptcy before 2019. I'd say they have just enough in the tank, with all the talented people AMD has acquired over the past five years, to save their bacon. After all, they have been here before.

AMD's problem has never been talent. They've had talented people for a long time. AMD's problem now is the same as it's been for a long time: terrible management. AMD has been floundering since the Socket A days due to piss-poor management. Their purchase of ATI should have led to a very strong, competitive market, but instead it has led to the situation they're in now. I have zero faith in AMD as a company anymore, due entirely to how shit the people running it are at doing their damn jobs. Zen might be good, but AMD could very well repeat the mistakes of the current GPU generation and ruin the launch with outright lies and terrible pricing strategies.
 
If AMD hasn't been competitive, how were they able to release what were almost all rebrands (as Nvidia fans make sure to call them) that are completely competitive with Nvidia's latest? Or are we assuming that everyone buys 980 Tis / Titan Xs?
 
If AMD hasn't been competitive, how were they able to release what were almost all rebrands (as Nvidia fans make sure to call them) that are completely competitive with Nvidia's latest? Or are we assuming that everyone buys 980 Tis / Titan Xs?

That's probably an argument that runs counter to the one you're trying to make.

Yes, AMD is still competitive to some degree, as I illustrated before: Nvidia is still reacting to them, which means they acknowledge AMD as a competitor. But the rebranded lines probably didn't help AMD's market position among enthusiasts.
 
AMD killed the company through panic rather than logical decisions.

When Intel announced the GMA 900 (DX9, with the obvious implication that it would eventually be DX10, fully programmable, and later integrated on-chip), AMD overreacted and bought ATI so they could guarantee they would have an advanced IGP under their control. They pawned the company to get their new toy, even though by 2005 total discrete graphics sales had already been falling for years. They spent six billion dollars on what they knew was a dying marketplace, just to get a tiny piece of it.

They wasted five years of designer and fab time porting ATI's bulk-silicon designs to Silicon on Insulator (no due diligence to work out the effort before making the purchase), then invented the bullshit concept of HSA to try to justify their stupid waste of time.

They spent years developing open architectures and programming models, and nobody uses them except AMD themselves. And we have yet to see a real, everyday-useful example of HSA speeding up a process significantly over pure CPU or pure GPU compute.
 
I feel a lot of people here are posting "AMD should do this" or "AMD is a failure for that" without a true understanding of what is really going on and how to run a business.

Opinions everywhere...some more valuable than others of course.
 
The GPU market is completely bribed by Nvidia; that is the whole purpose of GameWorks.

Nvidia will provide you with support as long as you don't optimize for AMD/Intel hardware. If you do, you have to pay to access the source code, and the contract you sign forbids you from working with AMD/Intel to optimize the code or letting them optimize their drivers.

GameWorks optimization is awful:

Here is massive tessellation in water that is hidden:

[image: nFc1Q0N.jpg]


And here is some very nice tessellation on a barricade:

[image: BxIhVHM.jpg]


[image: Iwrnm7S.jpg]


Let's not forget The Witcher 3:

[image: Q11ApKj.jpg]


http://wccftech.com/fight-nvidias-gameworks-continues-amd-call-program-tragic/

Here are plenty more examples from Crysis 2:

https://youtu.be/IYL07c74Jr4?t=1m46s

But hey, anything to make your old card run badly so they can sell you a new one.

Don't worry though, all the guys with SLI Titan X's will tell you about how that's normal for PC gaming and there are no issues.

STOP!!!

This FUD needs to die before someone takes you seriously. I have addressed this before:

http://hardforum.com/showpost.php?p=1041667964&postcount=253

LOL

This is hilarious, but bear with me.

All the ocean simulation is run on a small square that is then tiled out, so the mesh you are seeing is not a true indication of the game's actual load.
After that, the game engine does a z-buffer writeback for something called a "GPU occlusion query".
The hidden water is not rendered.
It's occluded, ergo it is NOT drawn.

I would almost call that article dishonest.
Why?
Because they:
- Isolate the mesh
- Render the mesh at maximum tessellation
- Disable occlusion culling
- Disable dynamic tessellation
- Disable adaptive tessellation

And then present it as "This is how the game works!"

But this is not what the game does.
The claim smells like another PR FUD job.

It's all documented here:
http://docs.cryengine.com/display/SDKDOC4/Culling+Explained

People should read that... and if they do not understand it, they should stop posting about the topic.
Simple as that.
If you post negatively about stuff you don't know anything about, you become a FUDster.

(Besides, I doubt NVIDIA had anything to do with that "ocean"... to be frank, it seems more like something coded by the devs themselves.)
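The occlusion-query logic described above can be sketched in miniature. This is a toy 1-D model of my own, not CryEngine code: near geometry is written to the z-buffer first, and the query then checks whether the water tile would cover any pixel that isn't already hidden behind something nearer.

```python
# Toy 1-D z-buffer occlusion test. Depth = distance from the camera,
# smaller is nearer. An object is occluded (and its draw call skipped)
# when every pixel it would cover already holds a nearer depth.

def rasterize_occluders(width, occluders):
    """Build a depth buffer from (start, end, depth) occluder spans."""
    zbuf = [float("inf")] * width
    for start, end, depth in occluders:
        for x in range(start, end):
            zbuf[x] = min(zbuf[x], depth)
    return zbuf

def occlusion_query(zbuf, start, end, depth):
    """Count pixels where the object would actually be visible.
    A result of 0 means the heavily tessellated mesh is never drawn."""
    return sum(1 for x in range(start, end) if depth < zbuf[x])

# City geometry at depth 5 covers the whole screen...
zbuf = rasterize_occluders(10, [(0, 10, 5.0)])
# ...so the ocean tile sitting behind it at depth 9 fails the query:
visible = occlusion_query(zbuf, 2, 8, 9.0)  # 0 -> draw call skipped
```

Disabling occlusion culling, as the article's wireframe capture effectively does, amounts to skipping the query and drawing everything.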

So let's see if I can tally the list so far:

"GameWorks is a black box and AMD can do nothing!!!" = FUD and false PR
"Planned obsolescence for "Kepler"" = FUD and false PR
"Crysis 2 is using too much tessellation because of evil NVIDIA" = FUD and false PR.

Besides AMD PR, you know what else is a common factor in all 3?

Ignorance about the topic.
But that doesn't stop people from screaming all over forums and, IMHO, making themselves look like uninformed, rabid PR FUDsters.

But it's quite funny that the arguments used to declare NVIDIA "evil" all seem to come from AMD PR... and spread around forums like gospel.

I think it would be a good idea to have a "Technical GPU" sub-forum, where people with technical insight could debate these things without all the false claims.

IT is complicated, I know. I have been working in IT for 20+ years (coding, hardware, networking)... and there are a LOT of topics I will not enter, because I have insufficient knowledge about them.
But I will read with joy, ask questions, and learn.

I will not run around like a braindead parrot, posting crap I don't understand... but I guess people have different standards...

I will now chalk up the FUD arguments you use:

1. Crysis 2 and tessellation (Argument from Ignorance)
 
I feel a lot of people here are posting "AMD should do this" or "AMD is a failure for that" without a true understanding of what is really going on and how to run a business.

Opinions everywhere...some more valuable than others of course.

I agree. Despite all of the armchair economists' (myself included) lack of credibility and logic, one thing is for certain: AMD's ever-changing board cannot run a business to save themselves.
 
Did you miss The Witcher 3?

Somehow the game had really bad issues on pre-Maxwell hardware that went unnoticed by Nvidia until after all the reviews and benchmarks were out... wonder why.

I have countered this one before, too:

http://hardforum.com/showpost.php?p=1041668262&postcount=277

So let me get this straight (last post of the day, got stuff to do).

Because in some newer games (not all) Kepler starts to show its age, people claim "planned obsolescence"?

This is just as hilarious as "GameWorks is a black box!!!" or "Look at Crysis 2 tessellation" :D
I need to visit more forums... even if it's kinda scary to read a lot of posts... it is also very FUNNY!


1. I don't claim AMD held back anything in their drivers. Their current DX11 implementation does have a larger overhead than NVIDIA's, but that is a different topic.
2. I don't claim NVIDIA made "Kepler" with "planned obsolescence".

There are NO facts supporting either; both claims are equally invalid/stupid.

"Kepler" performs like it always has; there has been ZERO performance degradation.
ALL reviews show this.
What is happening is that in a few games (fewer than 5 that I am aware of) the compute load has increased.
The streaming multiprocessors (SMs) have to do more work now (a hint to those who want real facts about where to "dig").

What does this mean?
It means that a larger portion (percentage-wise) of the CUDA cores in "Kepler" are inactive.

Again, I point people to charts like this:

[diagram: Kepler vs Maxwell SM partitioning]


See the finer graining in "Maxwell" vs "Kepler"?

This is what gives "Maxwell" the edge... in games where shaders need to make room for more and more compute (SSAO, PhysX, HairWorks, shader AA, etc.).
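The granularity point can be illustrated with a toy model. This is my own simplification: the group sizes are only loosely inspired by Kepler's 192-core SMX partitions vs Maxwell's 32-core scheduler blocks, and real hardware scheduling is far more involved.

```python
import math

# Toy model: cores are handed to workloads only in fixed-size groups.
# A job that doesn't fill its last group strands the remainder of that
# group. Coarse groups strand more cores under many small mixed
# graphics + compute jobs than fine ones do.

def active_cores(jobs, group_size, total_cores):
    """Greedily give each job whole groups; return cores doing useful work."""
    groups_free = total_cores // group_size
    useful = 0
    for need in jobs:
        groups = min(math.ceil(need / group_size), groups_free)
        groups_free -= groups
        useful += min(need, groups * group_size)
    return useful

# Made-up concurrent workloads (cores each one wants)
jobs = [40, 70, 25, 90, 33, 50, 60, 20, 45, 80, 35, 55]

coarse = active_cores(jobs, 192, 1152)  # Kepler-style granularity
fine = active_cores(jobs, 32, 1152)     # Maxwell-style granularity
# fine > coarse: the same core count keeps more CUDA cores busy
```

With coarse groups the six big allocations exhaust the chip and the rest of the jobs stall, which is the "inactive CUDA cores" effect described above.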

No evil-doing in drivers.
No planned obsolescence (if you bother to look at the cadence from G80 -> G92 -> GT200 -> GF100 -> GK110 -> GM200, you will spot the natural evolution of the architecture... and you would need to claim that NVIDIA could see GPU loads 2.5 years into the future *cough*).

It's really that simple (the architectural differences).

But I guess if you don't have a clue (or an agenda)... the other options seem "better".




1. Crysis 2 and tessellation (Argument from Ignorance)
2. "Kepler" and planned obsolescence (Argument from Ignorance)
 
STOP!!!

This FUD needs to die before someone takes you seriously. I have addressed this before:

http://hardforum.com/showpost.php?p=1041667964&postcount=253



I will now chalk up the FUD arguments you use:

1. Crysis 2 and tessellation (Argument from Ignorance)

Sorry for the double post, but I just wanted to do two things:

Firstly: Awesome post, thanks for sharing some info, I certainly learned a lot.

but

Secondly: How do you propose Nvidia secures its position while spending as little money as possible? I'm not saying they're evil; I'm saying they are a publicly traded business that NEEDS to show growth in order to retain shareholders' trust. How do they ensure AMD's market share drops while theirs increases, all while earning MORE profit than the last YTD? Eventually AMD will drop out, and then Nvidia will really have no metric besides profit to show growth. What would your business plan be?
 
The GPU market is completely bribed by Nvidia; that is the whole purpose of GameWorks.

Nvidia will provide you with support as long as you don't optimize for AMD/Intel hardware. If you do, you have to pay to access the source code, and the contract you sign forbids you from working with AMD/Intel to optimize the code or letting them optimize their drivers.

I almost missed this part; that makes 3 FUD arguments now:

http://hardforum.com/showpost.php?p=1041664658&postcount=144

Could you do me a favour?
List the gaming companies that give source code to GPU vendors so they can "optimize", and then list the companies that don't.

This "black box" whining is getting out of hand.

(FYI: 99+% of games are optimized using the binaries, running, not via source code... feel free to prove my claim false. I dare you! :D )



1. Crysis 2 and tessellation (Argument from Ignorance)
2. "Kepler" and planned obsolescence (Argument from Ignorance)
3. GameWorks is a "Black Box" (Argument from Ignorance)


All your argumentation is based on ignorance and FUD/fear-mongering... why is that?

And why is it that your "argumentation" consists of the 3 most flawed (but still widely used) fallacies against NVIDIA?

I get that you are very emotional, but that does not justify posting flawed arguments due to your limited understanding of the topic, sorry.
 
Sorry for the double post, but I just wanted to do two things:

Firstly: Awesome post, thanks for sharing some info, I certainly learned a lot.

but

Secondly: How do you propose Nvidia secures its position while spending as little money as possible? I'm not saying they're evil; I'm saying they are a publicly traded business that NEEDS to show growth in order to retain shareholders' trust. How do they ensure AMD's market share drops while theirs increases, all while earning MORE profit than the last YTD? Eventually AMD will drop out, and then Nvidia will really have no metric besides profit to show growth. What would your business plan be?

That plan is to find new markets that can leverage your existing GPU tech. Nvidia has been trying this for years, with limited success. If they can find a foothold in auto, it may be their saving grace.
 
Sorry for the double post, but I just wanted to do two things:

Firstly: Awesome post, thanks for sharing some info, I certainly learned a lot.

but

Secondly: How do you propose Nvidia secures its position while spending as little money as possible? I'm not saying they're evil; I'm saying they are a publicly traded business that NEEDS to show growth in order to retain shareholders' trust. How do they ensure AMD's market share drops while theirs increases, all while earning MORE profit than the last YTD? Eventually AMD will drop out, and then Nvidia will really have no metric besides profit to show growth. What would your business plan be?

I would suggest to you that the answer to that question lies in the R&D budgets ;)
 
It's unlikely that AMD completely ceases to exist in the short or mid term. What is a distinct possibility, and if you go by their statements what they are actually in the process of doing, is a gradual de-emphasis and deprioritization of the PC desktop market (keep in mind they are not the only "traditional" company with this approach). The market is simply moving in this direction.

This is also why I don't see why some people believe a buyout of AMD by a third party would be beneficial for the PC market. Take Samsung, for example, which has been proposed: there seems to be very little incentive for them to invest significantly in AMD just to focus on discrete GPUs. If anything, the likelihood for graphics would be in mobile, hastening a move away from the desktop.

Sorry for the double post, but I just wanted to do two things:

Firstly: Awesome post, thanks for sharing some info, I certainly learned a lot.

but

Secondly: How do you propose Nvidia secures its position while spending as little money as possible? I'm not saying they're evil; I'm saying they are a publicly traded business that NEEDS to show growth in order to retain shareholders' trust. How do they ensure AMD's market share drops while theirs increases, all while earning MORE profit than the last YTD? Eventually AMD will drop out, and then Nvidia will really have no metric besides profit to show growth. What would your business plan be?

What is interesting now in terms of positioning is that Nvidia stands to gain relatively little from actually taking more market share from AMD. I don't know the Q2 data, but apparently discrete GPU sales dropped 13% in Q1 (compared to last year). So if this quarter had a similar drop, this market share swing doesn't even offset the total loss in sales.

Nvidia actually has an incentive to encourage discrete GPUs and PC gaming over the alternatives (or at least slow the decline) and to explore other markets (e.g. auto, Tegra). Simply chipping away at AMD is at the point of diminishing returns.

What is interesting is that Nvidia's largest competition from AMD is likely indirect, in the form of the consoles. This is actually why GameWorks is important to them, even though the controversy has focused on its direct impact in the discrete GPU space.
 
1. Crysis 2 and tessellation (Argument from Ignorance)
2. "Kepler" and planned obsolescence (Argument from Ignorance)
3. GameWorks is a "Black Box" (Argument from Ignorance)

1) If you bothered to look at the images or video posted, it was much more than the water underneath buildings, which, btw, is still processed even though it's removed from the final output. Please read the source article I mentioned for details.

2) I never said planned obsolescence. I said that they are hurting even their old hardware, and that was the case when The Witcher defaulted to 64x tessellation, which was then *fixed* by moving down to a 16x max. This wasn't AMD fans complaining; it was Nvidia ones.

3) GameWorks is a black box. AMD cannot see the code, and developers can't see it unless they pay (and then they are not allowed to work with AMD to fix performance). That is the definition of a black-box system. Other developers send their source code, or can at least collaborate with GPU vendors to optimize their games.

The only one spreading FUD is you; I'm listing FACTS. Learn the difference.
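For a sense of scale on point 2, here is my own back-of-envelope for the simple uniform case (real tessellator output depends on the partitioning mode and per-edge factors): triangle count per patch grows with the square of the tessellation factor, so a 64x default does roughly 16x the geometry work of a 16x cap.

```python
def triangles_per_patch(tess_factor):
    """Uniformly tessellated quad patch: factor N splits each edge into
    N segments, giving an N x N grid of cells with 2 triangles each."""
    return 2 * tess_factor * tess_factor

at_64x = triangles_per_patch(64)  # 8192 triangles per patch
at_16x = triangles_per_patch(16)  # 512 triangles per patch
ratio = at_64x // at_16x          # 16x the geometry work at the 64x default
```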
 
3) GameWorks is a black box. AMD cannot see the code, and developers can't see it unless they pay (and then they are not allowed to work with AMD to fix performance). That is the definition of a black-box system. Other developers send their source code, or can at least collaborate with GPU vendors to optimize their games.

The only one spreading FUD is you; I'm listing FACTS. Learn the difference.

Well, I'm pretty sure developers are free to share development builds and game source code with AMD even if they are licensed GameWorks partners. They just can't share the GameWorks source with non-licensees, which means they can't share GameWorks source with AMD. They are still free to work with AMD and share non-GW code. This is pretty standard; by that definition, pretty much ALL closed-source software is a 'black box'.

I don't like GameWorks, and I think there are some pretty dodgy things going on with it, but locking up source code is just standard practice, not illicit in any regard.
 
1) If you bothered to look at the images or video posted, it was much more than the water underneath buildings, which btw, is still processed even though its removed from the final output. Please read the source article I mentioned for details.

Nothing in that article changes my claims.
Why don't you tell me which part of my answer you do not understand, and then we will take it from there.

2) I never said planned obsolescence. I said that they are hurting even their old hardware, and that was the case when The Witcher defaulted to 64x tessellation, which was then *fixed* by moving down to a 16x max. This wasn't AMD fans complaining; it was Nvidia ones.

But the gist of your post was to invoke "planned obsolescence" or something similar. Again, which part of my post don't you understand? Then we'll take it from there.

3) GameWorks is a black box. AMD cannot see the code, and developers can't see it unless they pay (and then they are not allowed to work with AMD to fix performance). That is the definition of a black-box system. Other developers send their source code, or can at least collaborate with GPU vendors to optimize their games.

Name a game that shipped source code, not binaries, to NVIDIA/AMD for optimization.
Again, you seem never to have looked at shader code running in real time in the various stages of the pipeline, so for the 3rd time:
Tell me which part of my post you do not understand, and then we will take it from there.


The only one spreading FUD is you; I'm listing FACTS. Learn the difference.

Sorry, but repeating false claims will not get you out of this.
You chose to post FUD based on your own ignorance.
And not just one false claim.
Not even just two false claims.
But three false claims in a row.
Don't cry when your bluff is called.

I look forward to your replies, but sadly I think you will continue to use the same fallacies... the path of Argument from Ignorance is hard to change in people.

So I repeat:
1. Crysis 2 and tessellation (Argument from Ignorance)
2. "Kepler" and planned obsolescence (Argument from Ignorance)
3. GameWorks is a "Black Box" (Argument from Ignorance)
 
I already posted this in another thread, but launch benchmarks for The Witcher 3 had Kepler and Maxwell showing the same (within +/- 1%) performance drop when enabling HairWorks, from two separate sources.

Can people who have a Kepler or Maxwell card and The Witcher 3 (patch 1.07) test the performance difference with HairWorks low vs high? From what I understand this affects more than just the tessellation level, but it would be interesting to see whether or not the relative difference changes. If the claim that high tessellation is directed at crippling Kepler were true, then Kepler's performance should drop more than Maxwell's.
 
Nothing in that article changes my claims.
Why don't you tell me which part of my answer you do not understand, then we will take it from there.
I'm going to take the word of the developers over yours, so again, read the article, as it directly refutes your claims.

But the gist of your post was to invoke "planned obsolescence" or something similar. Again, which part of my post don't you understand? Then we'll take it from there.

Again, performance was handicapped on Nvidia hardware at release and then fixed later in drivers, after reviews were out and many Nvidia customers complained... This was due to, once again, way overdone tessellation.

Name a game that shipped source code, not binaries, to NVIDIA/AMD for optimization.
Again, you seem never to have looked at shader code running in real time in the various stages of the pipeline, so for the 3rd time:
Tell me which part of my post you do not understand, and then we will take it from there.

Funny, there is a huge thread on it right now; pretty sure you've posted in it as well...

Let me remind you:
To this end, we have made our source code available to Microsoft, Nvidia, AMD and Intel for over a year. We have received a huge amount of feedback. For example, when Nvidia noticed that a specific shader was taking a particularly long time on their hardware, they offered an optimized shader that made things faster which we integrated into our code.

I can't speak for other developers, but I don't see why they wouldn't. Nvidia works directly with developers who use GameWorks, and there is no reason for developers not to want Nvidia/AMD to optimize their game as much as possible. Their code would be copyright-protected and under NDA.

Sorry, but repeating false claims will not get you out of this.
You chose to post FUD based on your own ignorance.
And not just one false claim.
Not even just two false claims.
But three false claims in a row.
Don't cry when your bluff is called.

I look forward to your replies, but sadly I think you will continue to use the same fallacies... the path of Argument from Ignorance is hard to change in people.

Oh the irony is strong here.
 
Well, I'm pretty sure developers are free to share development builds and game source code with AMD even if they are licensed GameWorks partners. They just can't share the GameWorks source with non-licensees, which means they can't share GameWorks source with AMD. They are still free to work with AMD and share non-GW code. This is pretty standard; by that definition, pretty much ALL closed-source software is a 'black box'.

I don't like GameWorks, and I think there are some pretty dodgy things going on with it, but locking up source code is just standard practice, not illicit in any regard.

I never said they can't share anything, but if they can't fix the GameWorks-specific features, those will not be optimized, and they cannot be optimized by AMD, only by the developers, who have to pay Nvidia to do so.
 
I'm going to take the word of the developers over yours, so again, read the article, as it directly refutes your claims.

I suspect you lack the knowledge to counter my points, so I will ask you directly:
What part of that video do you think counters my claims?
Specifically?!
Is it the part about z-culling that confuses you?

Just pointing to a video and trying to get away with an "Argument from Authority" doesn't work... you need to counter my points, or tell me where the video counters them.
No "free FUD" for you today, sorry.



Again, performance was handicapped on Nvidia hardware at release and then fixed later in drivers, after reviews were out and many Nvidia customers complained... This was due to, once again, way overdone tessellation.

Define "handicapped".
Do you mean on purpose?
Do you claim that "Kepler" lost performance during its lifecycle?
Your "arguments" are so weak, I fail to see their relevance.


Funny, there is a huge thread on it right now; pretty sure you've posted in it as well...

Let me remind you:

That is not a game.
That is a pre-alpha benchmark.
You need to be able to read to not embarrass yourself.


I can't speak for other developers, but I don't see why they wouldn't. Nvidia works directly with developers who use GameWorks, and there is no reason for developers not to want Nvidia/AMD to optimize their game as much as possible. Their code would be copyright-protected and under NDA.



Oh the irony is strong here.

Then be quiet please, no offense.
And please, which developer do you claim to speak for?
(The "other developers" part makes it sound like you speak on behalf of a developer; are you one?)
 
The really sad part about this thread is how some people think:

"NVIDIA is gaining market share because of EVIL tactics."

I'd surmise they are gaining market share because they have the right products with the right features at the right price level for consumers.

Unlike on forums, where a small but very vocal AMD user/fan segment creates a lot of noise about how "unfair" NVIDIA is, making it appear that a major battle is still ongoing... in the consumer space, AMD is fading.

Crying on forums with FUD won't change that one bit.

Products with the right features at the right price will.

Business 101...
 
I really hope they turn it around sooner than later. I'm sure they will but it needs to be quick so that the progression of GPU technology doesn't go stale. I want to see some heavy weight fights for the top spot again.

I don't see how they can turn anything around until they reset their product stack at 16nm from top to bottom.

Their products most probably put them at a cost disadvantage, which hurts profit margins.

$1000 - no AMD product; Nvidia sells the full GM200 (a full ~600 mm^2 chip)
$650 - Fury X, a full ~600 mm^2 chip with water cooling and HBM, vs a cut-down GM200 with classic GDDR5
$550 - cut-down Fury
$450 - full Hawaii with 8 GB of RAM, a 512-bit bus, and a 300 W TDP (which complicates the PCB and raises component requirements) vs the <200 W GTX 980
$330 - the same, but with cut-down chips
$200-240 - AMD's cut-down 360 mm^2 Tonga with a 256-bit bus vs the 230 mm^2 GM206 with a 128-bit bus
<$200 - Pitcairn is 212 mm^2 but 256-bit, vs rumored incoming GM206 cut-downs, so this is probably the only segment where they are close
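As a crude illustration of the margin problem in that list, here is a first-order model of my own: silicon cost taken as roughly proportional to die area, ignoring yield, wafer pricing, HBM/interposer, and board costs.

```python
def relative_die_cost(area_mm2, rival_mm2):
    """First-order silicon cost ratio; real costs rise faster than area
    because yield drops on larger dies, so this understates the gap."""
    return area_mm2 / rival_mm2

# The $200-240 tier from the list above: 360 mm^2 Tonga vs 230 mm^2 GM206.
tonga_vs_gm206 = relative_die_cost(360, 230)  # ~1.57x the silicon at the same price
```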
 
The GPU market is completely bribed by Nvidia; that is the whole purpose of GameWorks.

Nvidia will provide you with support as long as you don't optimize for AMD/Intel hardware. If you do, you have to pay to access the source code, and the contract you sign forbids you from working with AMD/Intel to optimize the code or letting them optimize their drivers.

GameWorks optimization is awful:

Here is massive tessellation in water that is hidden:



And here is some very nice tessellation on a barricade:



Let's not forget The Witcher 3:



Here are plenty more examples from Crysis 2:


But hey, anything to make your old card run badly so they can sell you a new one.

Don't worry though, all the guys with SLI Titan X's will tell you about how that's normal for PC gaming and there are no issues.

You have posted something that appears to have actual value. That barricade is ridiculous. This is why, despite having hardware 8x as powerful as the current consoles, PC games aren't looking 8x better. People, we are putting all our money, all the power of our GPUs, into post-processing effects. I do believe Nvidia definitely bribes the PC market; it's probably too easy to do, and these developers are probably all too willing to take money or development support for a PC version they aren't really thrilled about making. I don't really think it's unethical of Nvidia to do this, however; I just think it's really, really good marketing. It is terrible for the PC gaming market, though.


People are passing up AMD cards because they are sick of (some of) these games running like crap. Now take that unfortunate fact and pitch it to the average consumer, who doesn't really know better, and they'll think the problem is much worse than it is. Now combine that with AMD's real problem: HEAT. I think it's stupid that people complain about how much power the cards consume, because we are only talking about while gaming, and it adds up to literally about 8 bucks a year. But that power consumption is related to HEAT. The higher heat AMD's GPUs put out means a lot of the more budget-minded cards are going to be very noisy compared to Nvidia equivalents. So the average consumer is going to go to a friend's house, hear that 'PowerColor' brand HD 7950 fire up, and write off AMD. I just picked that brand at random; I don't know if it is loud or not, lol.
 
These issues showed up AFTER AMD's market share decided to skydive. When the GPU market was more balanced, developers would never sabotage 45% of their customers. Now that AMD's GPU share is 15%, they are essentially 'niche' and won't be considered for important updates.

You're looking at it wrong. The market share is based on new sales, not on what people are actually using. So if a software dev gimps its game on AMD, it's not angering 15% (which is already enough to generate a shitstorm, IMO) but close to 30%, according to the current Steam survey. And even NVIDIA can't pay enough to make betraying a third of your customers worth it.
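The sales-share vs install-base distinction is easy to model. The numbers below are made up by me; only the ~30% install-base ballpark comes from the Steam-survey figure mentioned above.

```python
# Install base lags sales: cards bought in past quarters are still in use,
# so a dev's customers reflect years of sales history, not the last quarter.

def install_base_share(quarterly_share, quarterly_units):
    """AMD units across all quarters still in use / total units still in use.
    Inputs are parallel lists, oldest quarter first."""
    amd = sum(s * u for s, u in zip(quarterly_share, quarterly_units))
    return amd / sum(quarterly_units)

# Hypothetical slide from 40% to 18% of new sales over 12 quarters:
shares = [0.40, 0.38, 0.37, 0.35, 0.33, 0.31,
          0.30, 0.28, 0.26, 0.24, 0.20, 0.18]
units = [100] * 12                        # assume equal-sized quarters, no retirement
base = install_base_share(shares, units)  # 0.30: far above the latest 18% quarter
```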


The sad part about this is the grin on Nvidia fanboys' faces as they cheer; if only they knew they stand to lose from this as well. Now NVIDIA even has the audacity to abandon their last-generation cards to make you buy a new one every year, when advances in gaming wouldn't necessarily justify it. Just imagine what will happen with absolutely no competition: current cards will never get cheaper, they'll simply disappear from the market when a new product takes over. Which means everyone who is not loaded with cash will suffer greatly; they'll be forever damned to the mediocrity of low-end or midrange cards.
 
What a shocker, AMD is now sub 20%..really..I'm just shocked because I thought with that overclockers dream of a GPU they'd have clawed their way to 40% by now. Maybe they can keep crying some more about GameWorks, I'm sure that will win them even more marketshare. At this point the smart thing for AMD as a company would be to sell off ATi to someone (perhaps NVIDIA) and then setup a licensing agreement with them to share GPU IP so they can integrate it into their CPUs. They lost their focus on the CPU market by acquiring ATi and have no real direction. If current trends continue, and there's no reason to think they won't, I predict AMD folding from the high end dGPU market by 2017. If they don't shuttle ATi by then, they'll likely reshuffle it to work on APU + consoles.

Honestly, if you look at AMD's financial reports, the GPU market is a big loser for them and they keep pouring endless money into this black hole. They can plug the leak by exiting the market sooner rather than later and focusing on specialized areas such as APUs, consoles, and maybe even taking a stab at the auto market that NVIDIA has ventured into. The PC gaming market is done for them; there's no coming back from 18% and falling.

It's almost like you are reveling in the performance of a company in a market (dGPU) that's already a duopoly? Do you think this makes your previous infantile comments somehow justified? It doesn't.

Nvidia's successes aren't your successes, and when AMD does badly it's not a win for you. Being a rabid fanboy doesn't give you the right to be so cringe-inducingly pleased about this. The adults in this thread seem to understand that this is bad for the market, whichever GPU is your personal preference.

Edit: Good to see you're banned; add this site to the list, eh? I'm sure you'll spin this in your head as you being the crusader for truth and everyone else being biased fanboys. Honestly, the problem is you.
 
You're looking at it wrong. Market share here is based on new sales, not what people are actually using. So if a software dev gimps its game on AMD, it's not angering 15% (which is already enough to generate a shitstorm IMO) but closer to 30% according to the current Steam survey. And even NVIDIA can't pay enough to make betraying a third of your customers worth it.


The sad part about this is the grin on nvidia fanboys' faces as they cheer; if only they knew they stand to lose from this as well. NVIDIA now even has the gall to abandon their last-generation cards to make you buy a new one every year, when advances in gaming wouldn't necessarily justify it. Just imagine what will happen with absolutely no competition. Current cards will never get cheaper; they'll simply disappear from the market when a new product takes over. Which means everyone who isn't loaded with cash will suffer greatly, forever damned to the mediocrity of low-end or mid-range cards.

Would you please stop with the "planned obsolescence" FUD as long as you cannot document that claim?
 
Would you please stop with the "planned obsolescence" FUD as long as you cannot document that claim?

What kind of documentation would you find acceptable? Even theoretically, what kind of documentation could you ever expect for this type of claim? Nvidia in a billion years aren't going to admit anything like this (not to imply the claim is even true); no company ever would. You've flooded this thread with these comments, but what do you really expect? Internal emails from Nvidia? Hidden camera footage?
 
AMD's problem has never been talent. They've had talented people for a long time. AMD's problem now is the same as it's been for a long time: terrible management. AMD has been floundering since the Socket A days due to piss-poor management. Their purchase of ATI should have led to a very strong, competitive market, but instead it has led to the situation they're in now. I have zero faith in AMD as a company anymore, due entirely to how shit the people running it are at doing their damn jobs. Zen might be good, but AMD could very well repeat the mistakes of the current GPU generation and ruin the launch with outright lies and terrible pricing strategies.

Let me fix that for you: the first x86-64 CPU was AMD's K8, and it decimated Intel's NetBurst P4 and killed off IA-64 and Rambus RAM. However, AMD was a small player up against Intel's size and sales muscle, and Intel used anti-competitive practices to lock AMD out at the OEMs. Hence the pittance of under $2B Intel paid AMD for essentially screwing them out of market share for a decade, while Intel rakes in record profits year over year from those tactics. NV is doing something similar now, locking AMD out with game devs via a paid middleware software solution.

The huge problem with AMD management was always Hector Ruiz, who has the blood of many semiconductor businesses on his hands. His deeds include killing AMD's flash division, overpaying for ATI in '06, and the complete mismanagement of AMD's fabs, leading to failed product launches. The asshat that followed, Dirk Meyer, did the best he could, but he sold off AMD's key mobile division for a pittance (it thrives today as part of Qualcomm) and completed the sale of the fabs without handing all the fab debt to ATIC/GlobalFoundries. The last two have been attempting to right the ship; we'll just have to see where that goes.
 
Would you please stop with the "planned obsolescence" FUD as long as you cannot document that claim?

So you want written documentation that proves what they're doing internally, under strict NDA. Are you really that dense?

Planned obsolescence has existed for a hundred years. Every manufacturer takes it into account; the only difference is how far they can go with it. Questioning it is like questioning whether the earth is round. I have no doubt in my mind that if AMD had the same share as NVIDIA, they'd do the same thing. It's just business as usual.
 
Nothing in that article changes my claims.
Why don't you tell me which part of my answer you do not understand, then we will take it from there.



But the gist of your post was to invoke "planned obsolescence" or something similar. Again, which part of my post don't you understand? Then we'll take it from there.



Name a game that shipped source code, not binaries, to NVIDIA/AMD for optimizations?
Again, you seem never to have looked at shader code running in realtime through the various stages of the pipeline, so for the 3rd time:
Tell me which part of my post you do not understand, then we will take it from there.




Sorry, but repeating false claims will not get you out of this.
You chose to post FUD based on your own ignorance.
And not just one false claim.
Not even just two false claims.
But three false claims in a row.
Don't cry when your bluff is called.

I look forward to your replies, but sadly I think you will continue to use the same fallacies... the path of Argument from Ignorance is hard to break in people.

So I repeat:
1. Crysis 2 and tessellation (Argument from Ignorance)
2. "Kepler" and planned obsolescence (Argument from Ignorance)
3. GameWorks is a "black box" (Argument from Ignorance)

I repeat, I repeat, I repeat, I repeat...... ad nauseum. :rolleyes: Whether you like it or not, facts are facts and they do not change to fit your own narrative. However, keep pushing that way, someday, you might even convince yourself. ;)
 
You're looking at it wrong. Market share here is based on new sales, not what people are actually using. So if a software dev gimps its game on AMD, it's not angering 15% (which is already enough to generate a shitstorm IMO) but closer to 30% according to the current Steam survey. And even NVIDIA can't pay enough to make betraying a third of your customers worth it.

Thing is, a large chunk of that 30% are APUs.

I'm too lazy to filter out just the discrete cards, but I'm pretty sure the numbers favor nvidia in a big way.
 
The problem is different: if AMD goes under, Nvidia has no more competition, which means no more need to pour money into R&D, which leads to stagnant development, and in the end we the consumers will lose.

I know there were rumors that Samsung wanted to buy AMD; at this point, I wish those rumors were true and AMD would become part of Samsung.
That'd be pretty shitty; it's not like Samsung isn't known for anti-competitive practices themselves.
 
I repeat, I repeat, I repeat, I repeat...... ad nauseum. :rolleyes: Whether you like it or not, facts are facts and they do not change to fit your own narrative. However, keep pushing that way, someday, you might even convince yourself. ;)

I see the paid NV forum crew is awake and out in full force, trying to state its opinions as fact. We know AMD is getting screwed in GameWorks titles; hell, we know NV's older hardware is getting screwed as well. Yet the NV echo chamber still faithfully believes their graphical savior JHH and NV the company can do no wrong.
 
If AMD hasn't been competitive, how were they able to release what are almost all rebrands (as Nvidia fans make sure they're called) that are completely competitive with the latest from Nvidia? Or are we assuming that everyone buys 980 Tis / Titan Xs?

They are not competitive, that's why their market share has dropped so low.

Facts.
 