Did AMD fail to take market share with their latest GPUs?

Even though I have PRIME1 on ignore, I see she still does her thing.
 
Even though I have PRIME1 on ignore, I see she still does her thing.
I wish people would stop giving me reasons to defend AMD.
It makes me feel dirty.

If I buy an Nvidia card someday, does it mean I'm going to automatically start pulling random shit out of my ass on internet forums? Hope not.
 
The truth is the only advantage was price. Factor in heat and poor drivers, and most people pony up for Nvidia if they can afford to.

Yeah, no. Drivers were a non-issue for AMD starting years ago. Now they are just as good as Nvidia's, if not better (I've never seen AMD drivers destroy a video card, but NV had to pull their 196.75 and 320.18 drivers because they were destroying GPUs).

I don't understand how people can continue to claim Nvidia's drivers are better and continually bash AMD's when Nvidia released multiple WHQL drivers that destroyed their own hardware...
 
I don't understand how people can continue to claim Nvidia's drivers are better and continually bash AMD's when Nvidia released multiple WHQL drivers that destroyed their own hardware...


I am also amazed at how people keep touting Nvidia as better.
This is simply not true. AMD has a great product and drivers.

Now, what was the topic? Oh yeah, did they lose market share, with Nvidia dominating?

I do not really know, but I would be a bad one to ask; I bought several 780 Ti cards.

Bought a few R9 290/290X cards too, so I did lots of benchmarks and testing, AMD vs. Nvidia.
My conclusion was AMD had better drivers and ran better than Nvidia with my setup.

I kept R9 290 Crossfire and could not be happier.

Sold all my Nvidia cards and a couple of R9 290X cards.

Yes, Nvidia sells a lot of cards, and so does AMD. The people who try out both sides usually find the best setup for themselves, and for me it's AMD all the way.
 
Yeah, no. Drivers were a non-issue for AMD starting years ago. Now they are just as good as Nvidia's, if not better (I've never seen AMD drivers destroy a video card, but NV had to pull their 196.75 and 320.18 drivers because they were destroying GPUs).

I don't understand how people can continue to claim Nvidia's drivers are better and continually bash AMD's when Nvidia released multiple WHQL drivers that destroyed their own hardware...

It's funny, Nvidia fanboys never acknowledge that those drivers ever existed.
 
Well, AMD's drivers are still shit. Every time a new game comes out, it runs like shit on AMD.
Watch Dogs, Wolfenstein, WildStar. And those are just from the last few weeks.

How many times in your life has a new game come out and you've seen a bunch of "It runs bad on AMD" or "It's not optimized for AMD" or "We have to wait for AMD to release a new driver"?
It feels like every game these days.
 
If an Nvidia-based machine has an issue: must be the machine!

If an AMD-based machine has an issue: must be the AMD products!

With that mentality, of course AMD drivers and products are rubbish!

Scenario 1

Consumer: My PC keeps BSODing!

Tech: What GPU do you have?

Consumer: Nvidia!

Tech: okay, let's work it out and find the problem.

Result: virus; system reinstall. Works fine after the reinstall.

Scenario 2

Consumer: My PC keeps BSODing!

Tech: what GPU do you have?

Consumer: AMD!

Tech: fucking problems with AMD, buy an Nvidia GPU, and reinstall to get rid of AMD shit drivers.

Result: new Nvidia GPU and system reinstall. Works fine after the new GPU and reinstall.

See the issue here?
 
^^^ Case in point for what you are saying.

Well, AMD's drivers are still shit. Every time a new game comes out, it runs like shit on AMD.
Watch Dogs, Wolfenstein, WildStar. And those are just from the last few weeks.

How many times in your life has a new game come out and you've seen a bunch of "It runs bad on AMD" or "It's not optimized for AMD" or "We have to wait for AMD to release a new driver"?
It feels like every game these days.

Watch Dogs runs like crap on Nvidia, and it's "a poorly optimized console port." It runs badly on AMD hardware, and it's "poor AMD drivers." This even though it's a TWIMTBP title with special Nvidia-optimized GameWorks DLLs, etc. That can't have anything to do with the issues on AMD cards. It has to be poor AMD drivers.

Also, needing 3 GB of VRAM to run ultra textures is a failure of the game, not an advantage or a reason to consider Tahiti over GK104. Even now people will still claim that when buying a new card for 1080p, 2 GB of VRAM is plenty. IMO, some of this is ill-informed consumers, but a lot of it is misinformation purposely perpetuated by people with a vested interest.
 
Well, AMD's drivers are still shit. Every time a new game comes out, it runs like shit on AMD.
Watch Dogs, Wolfenstein, WildStar. And those are just from the last few weeks.

How many times in your life has a new game come out and you've seen a bunch of "It runs bad on AMD" or "It's not optimized for AMD" or "We have to wait for AMD to release a new driver"?
It feels like every game these days.

I'm pretty sure I see "runs like crap on (XBRAND)" threads for either GPU manufacturer. Watch Dogs ran noticeably smoother on AMD tech, Tomb Raider ran like butter on AMD at launch, Sleeping Dogs was amazing on either vendor, and these are games that actually matter.

I don't think your comments are substantiated...
 
I've had mixed luck with both sides. My 9500 Pro, 9700 AIW, 4870, and 5870 were all fantastic. My 9600 Pro and 6970 were both pieces of crap. On the green side, I loved my 6800 unlocked to a 6800 GT, 7900 GTX SLI, 8800 GTS 640, GTX 680, and my current 780 Ti. I absolutely HATED my Ti 4200, 7800 GT, 275 GTX, and had issues with my GTX 680 SLI setup.

Hell, if my 780 Ti wasn't $500 then I would have picked up a 290X instead. I got lucky that I got what I did, but I'm more or less brand agnostic. I've had performance issues, driver issues, and quality issues on both sides. It's amazing how many people crap on one side or the other because their brand new game didn't work for a week, or they had a card that was DOA once, or (my favorite) canned benchmarks show better results despite real world differences being nil.

Re-reading that, I realize how many GPUs I've had.....damn, that's a lot of wasted money....
 
I don't know why people complain so much about drivers. Nvidia and AMD will each have a crappy driver every now and then. Looking back on my AMD 4890, I've had a wonderful experience other than having to RMA the MSI card (not AMD's fault). It's been working flawlessly, and I can't remember a time when I've had driver issues (it seemed like I was getting a slight performance increase with each release). I've always run dual monitors with this card (though I game on only my main monitor). I also power my 5.1 sound system through the HDMI on this card. Overall it's been a very good experience, with maybe some minor, minor hiccups every now and then (which goes for Nvidia and AMD both).
 
Yeah, no. Drivers were a non-issue for AMD starting years ago. Now they are just as good as Nvidia's, if not better (I've never seen AMD drivers destroy a video card, but NV had to pull their 196.75 and 320.18 drivers because they were destroying GPUs).

I don't understand how people can continue to claim Nvidia's drivers are better and continually bash AMD's when Nvidia released multiple WHQL drivers that destroyed their own hardware...

Two drivers in a decade with fan-speed issues that just caused people's computers to lock up once they got too hot (which hopefully prompted most people to troubleshoot and find out that they could just roll back a driver version and be fixed) is hardly comparable to the vast majority of AMD drivers having major issues with various games, with anything but a one-monitor, one-GPU setup, and generally being inferior in release speed and updates. I've owned many of both brands over the years, as I said before, so I'm hardly a "fanboy" of one side or the other; I just buy what works nowadays and is the best value for my money (and that's not spending more time tweaking than gaming with an AMD card, let me tell you! I've loved my GTX 780, though ;)).
 
You mean offering more performance for less money doesn't equal sales?!
Hardware community is weird.

There's nothing more AMD can do to boost their sales aside from, I don't know, running TV adverts.

-More performance? Nope, or very rarely when it has happened. Of course, I would argue the actual usability of said performance has always mitigated any advantages in raw speed AMD may have had at certain points in time.
-Less money? Sure, in some cases/price brackets.
-The hardware community realizes that software drivers and Nvidia's features like PhysX, TXAA, ShadowPlay, and reliable driver updates are all part of the overall value proposition.
-The hardware community is not weird and understands the above, so they buy Nvidia at a 2:1 ratio for discrete PC video cards, per market statistics.
 
Two drivers in a decade with fan-speed issues that just caused people's computers to lock up once they got too hot (which hopefully prompted most people to troubleshoot and find out that they could just roll back a driver version and be fixed) is hardly comparable to the vast majority of AMD drivers having major issues with various games, with anything but a one-monitor, one-GPU setup, and generally being inferior in release speed and updates. I've owned many of both brands over the years, as I said before, so I'm hardly a "fanboy" of one side or the other; I just buy what works nowadays and is the best value for my money (and that's not spending more time tweaking than gaming with an AMD card, let me tell you! I've loved my GTX 780, though ;)).
Actually, judging by your adamant defence of Nvidia at every turn, I'd say you are quite the fanboy of Nvidia.

I've had the exact same experience as you, though. Except swap the word Nvidia with AMD and vice-versa. So, when anybody says that AMD drivers give them trouble, require tweaking and whatnot, I've had the same experience with Nvidia.

You know what that means? A ton of people are blaming their general system instability on their scapegoat of choice. Like I wrote above.
 
Two drivers in a decade with fan-speed issues that just caused people's computers to lock up once they got too hot (which hopefully prompted most people to troubleshoot and find out that they could just roll back a driver version and be fixed) is hardly comparable to the vast majority of AMD drivers having major issues with various games, with anything but a one-monitor, one-GPU setup, and generally being inferior in release speed and updates. I've owned many of both brands over the years, as I said before, so I'm hardly a "fanboy" of one side or the other; I just buy what works nowadays and is the best value for my money (and that's not spending more time tweaking than gaming with an AMD card, let me tell you! I've loved my GTX 780, though ;)).
Actually, judging by your adamant defence of Nvidia at every turn, I'd say you are quite the fanboy of Nvidia. You, Unknown-One and PRIME1 should be getting paid by Nvidia, because you're working pretty hard at clearing their name and promoting team green. Good on you, too; everyone needs their religion. Like those people who live and die by Apple products. It's largely placebo, but hey, whatever floats your boat.

I've had the exact same experience as you, though. Except swap the word Nvidia with AMD and vice-versa. So, when anybody says that AMD drivers give them trouble, require tweaking and whatnot, I've had the same experience with Nvidia.

You know what that means? A ton of people are blaming their general system instability on their scapegoat of choice. Like I wrote above.
 
Uh-huh... Anyway.



She!?

For some of us non-native speakers, hardware parts are not referred to as "it" in our native language but carry a grammatical gender. For example, in Polish the CPU is "he", the GPU is "she", and cooling would be referred to as "it". So when writing in English, I sometimes carry my native language's rules over into yours.

Might be the case here, and not that the one who refers to the GPU as "she" lives in a close relationship with the card :D
 
Nvidia is more like the woman. Well built, costs a lot, fun to play with.

AMD is more like the man. Ugly, strong, uses way more resources than necessary.
 
Actually, judging by your adamant defence of Nvidia at every turn, I'd say you are quite the fanboy of Nvidia. You, Unknown-One and PRIME1 should be getting paid by Nvidia, because you're working pretty hard at clearing their name and promoting team green. Good on you, too; everyone needs their religion. Like those people who live and die by Apple products. It's largely placebo, but hey, whatever floats your boat.

I've had the exact same experience as you, though. Except swap the word Nvidia with AMD and vice-versa. So, when anybody says that AMD drivers give them trouble, require tweaking and whatnot, I've had the same experience with Nvidia.

You know what that means? A ton of people are blaming their general system instability on their scapegoat of choice. Like I wrote above.

Nah, I just say what I think is correct, like most rational people do when someone spews forth misinformation ;). You may claim it's placebo, but the marketplace doesn't agree, unlike with Apple, where they have a tiny market share. Nvidia meanwhile has a 2/3 share of discrete GPUs, outselling AMD 2:1, because, I'm hazarding a guess, most people share my experience and not yours ;). Not the same thing whatsoever as an Apple product. As you said, I guess everyone needs a religion/forum warrior cause, eh... you demonstrate that perfectly, Kazeo!

When someone has to resort to personal insults like you are, they've already lost their argument. :)
 
To stir the pot a bit :p here is my experience with drivers.

Previously I had 2x 5770s in CrossFire.
Only one game ran well in CrossFire, and that was DiRT 2.
CrossFire rarely worked.
A single card worked OK when the drivers let you use all the features.
I ended up being forced to use a modded driver from some enterprising chap on Guru3D that fixed most of the problems ATI couldn't!


Then I got a GTX 580 with an AC Xtreme II cooler, and I've never had such a flawless experience, plus an over-20% overclock to boot!
It just worked, and worked really well.
I was never scared to try drivers; my fun had returned.


Now I have a 290 with a 290X BIOS, fully unlocked, with an AC Xtreme III cooler.
Temps are brilliant and it's a really, really quick card, but it does some very annoying things in either 290 or 290X mode.

My previous Windows install slowly self-destructed after putting the 290 in.
Some applications would get a black screen when run, like Unigine Heaven: in full-screen mode, instant black screen needing a reboot, but run in a window it works perfectly.

I would get occasional black screens on boot that became more frequent until they became permanent.
I could get into Windows in Safe Mode, disable the graphics card, and then get back into Windows normally to change the driver.
I tried 3 different drivers, 2 of them WHQL; all did the same.
Tried using DDU to remove all traces, no luck.

So I reinstalled Windows, soon after tested Unigine Heaven in full-screen mode, and it worked flawlessly.
Then I tried out the latest beta driver and, bang, Heaven does the old black-screen trick in full-screen mode again, and no matter what driver I use now, it's back for good.

Then I found the 14.6 beta driver only let me use the monitor plugged into the HDMI out.
The other 2 displays disappeared and were no longer available to use, so I couldn't use my projector or HDMI surround sound processor.
I later found that the displays became available in CCC when they were turned off, lol; some idiot reversed the detection code.
But this presented another problem when I then tried to use my projector while it was turned off, in the vain hope it would appear for use when switched on.
Then I couldn't even use my HDMI display unless I left the projector on, because as soon as the projector was switched off it became the default display, and NOTHING I did could stop it, apart from completely removing the display driver again.
Luckily the 14.6 WHQL driver sorted this out, but now it doesn't let me use my projector as a 3D display.
GRRR.

There have been other black-screen problems as well.
I hate changing drivers with this thing; I don't trust them to work one bit.
Takes a lot of the fun out of my PC experience.
Grrr.
 
Nenu, that's been my experience as well: they work OK for bare-bones, basic one-screen, one-GPU gaming, but once you start piling on any add-in features, or anything beyond the basic experience for your PC in general, the issues start to pile up. :( That's why I keep trying AMD products each gen, hoping they've gotten better since their pricing is lower, but so far I've almost always ended up back with Nvidia.

______________________________

EDIT: On a side note, people don't buy cards that are notably more expensive because they like to waste money; they buy them because they enjoy them more, as we've seen.
 
Nah, I just say what I think is correct, like most rational people do when someone spews forth misinformation ;). You may claim it's placebo, but the marketplace doesn't agree, unlike with Apple, where they have a tiny market share. Nvidia meanwhile has a 2/3 share of discrete GPUs, outselling AMD 2:1, because, I'm hazarding a guess, most people share my experience and not yours ;). Not the same thing whatsoever as an Apple product. As you said, I guess everyone needs a religion/forum warrior cause, eh... you demonstrate that perfectly, Kazeo!

When someone has to resort to personal insults like you are, they've already lost their argument. :)

I don't mean to insult, and I try not to show bias: I've sold far more Nvidia cards in systems than AMD, largely by leveraging customers' existing bias. Apple has a small market share? iOS is one of the dominant platforms for app development, and I'm pretty sure it's damn near 50% of total phone market share, and probably an embarrassing majority of the tablet market.

But honestly, back to my point. Less than six months ago I built, repaired, serviced, and designed gaming computers for a retail chain in Sydney. In my experience, Nvidia's dominance is reminiscent of Coca-Cola. In a blind "Pepsi challenge" style situation, people actually prefer Pepsi, but when given the choice, they choose Coke. Why? Brand loyalty. There is a TON of research into this phenomenon, and it mostly boils down to the parts of our brains that are responsible for religion.

In actuality, when electrodes are connected to the heads of subjects during "Pepsi challenge" style scenarios, fans of product A are given product B and told that it is in fact product A. When asked how they enjoyed it, the response is usually positive, and the pleasure centres of their brains are active. When given the exact same specimen, only under the advisory that it is product B, the subjects react negatively and indicate that they are not enjoying the product; their pleasure centres are largely dormant at this stage. When asked about the original sample, falsely labelled as product A, their pleasure centres light up once more as they describe their experiences. But when told that the original sample was in fact product B, their pleasure centres immediately flatline and they no longer express any positive association with the original sample, indicating that they "felt something was off" or "only said that they enjoyed it, but didn't really."

This factored HUGELY into how I designed my gaming machines. Early in my career, I had a customer who preferred Nvidia. They were of the notion that because AMD was the smaller company, Nvidia must be the better product. I designed a machine with an AMD 5770 card (awesome value for the era) and recommended that they try it. They weren't initially receptive, but I, being much younger and more concerned with building an awesome system for their budget than with listening to their cues and wants, insisted that the 5770 would be the best bang-for-buck card in the budget. The customer took my suggestion, and the next day they picked up their freshly built PC.

About a week later, the customer brought back the machine with BSOD issues. They were angry at me personally for recommending the AMD card, and according to them the card was causing all the issues. We were happy to look into the machine ASAP as per warranty. Pretty easy repair, and the problem could easily be replicated: turned out the HDD was a bad apple and needed to be replaced. Bad luck. We contacted the customer the same day and told him the news. We replaced the HDD and ran a full test just to be sure. He was not happy; he insisted that the AMD card was the issue and demanded that we remove it and replace it with an Nvidia equivalent. There was no reasoning with this customer; he was threatening to call up Australian fair trading and demand a refund on the system unless we replaced the AMD card with an Nvidia one. We obliged, reinstalled the OS, set up the system, tested it, and returned it to the customer.

Months later, the customer came in to buy some DVDs or some other consumable, and I asked (as we salespeople do), "How are you enjoying your system?" His response?

"It hasn't had a problem since you took out that damned AMD rubbish!"

That, my friend, is why the market perceives Nvidia as better. Because Nvidia says that they are better.

I saw similar situations many more times in my career, and on the sales floor it's a battle not worth fighting. This is why I try to promote the FACT that both vendors are similar in quality and stability. Maybe stories like this will help people realise that brand loyalty is only a hindrance.

I'm not really a fanboy who blindly defends AMD GPUs; my favourite card EVER was my 7800 GS. I just don't like to see Nvidia being defended when really, as you expressed, they don't need the help. A 2:1 market domination is not a good thing.
 
Two drivers in a decade with fan-speed issues that just caused people's computers to lock up once they got too hot (which hopefully prompted most people to troubleshoot and find out that they could just roll back a driver version and be fixed) is hardly comparable to the vast majority of AMD drivers having major issues with various games, with anything but a one-monitor, one-GPU setup, and generally being inferior in release speed and updates. I've owned many of both brands over the years, as I said before, so I'm hardly a "fanboy" of one side or the other; I just buy what works nowadays and is the best value for my money (and that's not spending more time tweaking than gaming with an AMD card, let me tell you! I've loved my GTX 780, though ;)).

Yeah, right! :D

Actually, judging by your adamant defence of Nvidia at every turn, I'd say you are quite the fanboy of Nvidia.

I've had the exact same experience as you, though. Except swap the word Nvidia with AMD and vice-versa. So, when anybody says that AMD drivers give them trouble, require tweaking and whatnot, I've had the same experience with Nvidia.

You know what that means? A ton of people are blaming their general system instability on their scapegoat of choice. Like I wrote above.

Agreed. If the drivers had all of these problems in general, everyone would be using consoles. There are a lot of different hardware/software combinations, and a lot of users don't understand how many interactions are going on.

I'll give you one example that comes to mind. Had a guy using FRAPS, CCC, Afterburner, Radeon Pro, and Trixx (IIRC) all at the same time to adjust/monitor different aspects of his graphics. He then updated drivers and had issues with crashing when he started any game. He immediately ranted about how bad the new drivers were (as if any company would release drivers that simply crashed in every game). He never stopped to think that it could be the porridge of applications interacting with each other and the drivers. Never found out which app or combination caused the issue, but uninstalling everything, cleaning the registry, deleting some old game files, and reinstalling everything fixed it.




Back on topic. It's pretty obvious that AMD sold every card they had. I can understand them going conservative with wafer allocations. They've still got to be stinging from having to pay GloFo to get out from under over-allocating a couple of years ago. Add in all the other components needed to manufacture cards, and I can understand them keeping things close. There's also the number of chips they needed for the consoles to consider. Nobody saw the Litecoin craze coming, and contrary to what people seem to think, it's impossible to simply make a phone call/get an appointment and double/triple (or whatever additional quantity you need) your orders. You make your plan, you get your allocations, and that's what you have to work with. Then there's the myriad of other agreements for marketing the final product. You can't simply throw those out the window and start charging your distribution channel more.

The company I work for buys finished product from China, so it's a much simpler process than what AMD/Nvidia deal with. Even in our case it takes a month to allocate production, a month to manufacture, and a month to deliver. That's three months minimum. Often it's four months if all the components don't come together perfectly. And as I said, that's often. And that's not a high-tech product that can only be sourced from one supplier in the world, either.

When I lived in Dallas, a company I worked for was under new ownership (a very large company) that was trying to make the bottom line look better for investors after the acquisition. They decided to cut orders by 50% (just temporarily) so they wouldn't be showing such a large deficit (when you order something, you actually owe the money for it on paper). The supplier was in Dallas, and I knew their rep well enough that she'd talk to me when she was in. Orders were placed 12 months in advance. Her concern, once she found out that the dramatic drop in order size wasn't business lost to a competitor, was that if we later upped the orders to where they should be, they wouldn't be able to supply. They wouldn't even have access to the steel (which was sourced in the States) if it wasn't ordered then, and the electronic components would be even less likely to be acquirable (coming from overseas).

It's just not as simple as, "We're selling more, so let's buy more." If it was, don't you think AMD would have done that? It doesn't take multiple degrees and decades of experience to send an email, if that's all it took. They had what they had allocated for, and that was more or less all they had to work with. I'm sure they could juggle allocations a bit, but that would be about all they could do.

Sorry for the wall of text. :eek:
 
When someone has to resort to personal insults like you are, they've already lost their argument. :)

Well, your previous post seems to indicate knowledgeable/sensible people buy Nvidia. At least that's the gist I got; I may be reading it wrong.
 
A very true and smart post, Kazeo. Re market share: Android is the vast majority at this point, as is Windows, but I don't want to sidetrack there. You are right that many people just get so into brand X that they disregard anything else.
 
For some of us non-native speakers, hardware parts are not referred to as "it" in our native language but carry a grammatical gender. For example, in Polish the CPU is "he", the GPU is "she", and cooling would be referred to as "it". So when writing in English, I sometimes carry my native language's rules over into yours.

Might be the case here, and not that the one who refers to the GPU as "she" lives in a close relationship with the card :D

"she" was referring to a user. PRIME1 specificilly, not hardware.
 
Well, I just built a new computer in January. I was going for a 290X, but I could never get one at a reasonable price. I then decided to get a Gigabyte 780 GHz Edition. I've been very happy with it and am glad now that I did not get a 290X.
 
This discussion will go back ON TOPIC or it will be closed.
 
What didn't help is that the 290 came out just before the holiday period, when production would be winding down, and then there was the mining craze.
 
What didn't help is that the 290 came out just before the holiday period, when production would be winding down, and then there was the mining craze.

I think AMD dropped the ball this gen. I personally would go for a 780 if I were forced to upgrade now, seeing as Nvidia has (mostly) fixed up their multi-monitor issues, from what I've heard. This gen was a weird one, though: the current arch in use is ~3 years old at this point, and it will be a fully mature 3 years old by the time the new arch comes out... on the same node. I'm thinking the GPU battle is going to be a weird one from now on. Gone are the days when we got 30-50% improvements per watt every year. I think we will see more modest improvements over a longer timespan. This means people (the average Joe User) won't really be given a reason to upgrade the GPU in their desktop for a long time between cards.

Either way, I hope the market evens out, as this sort of thing helps both parties compete more effectively.
 