Surprise: AMD cancels implicit Primitive Shader driver

Says the guy whose forum signature is an AMD marketing wet dream




The only thing that has been FAST, LONGTERM and STABLE has been AMD's decline in the graphics market.

Savage.

LOL! So much decline that Nvidia has to respond to it in an anti-competitive attempt to hold on to their market share, got it. :D
 
A better question is why don't YOU care that the hardware you're purchasing is shit compared to the alternative?


Because the company that repeatedly lies to its loyal fans every chance it gets is morally better than the company that sometimes lies.

Both companies will screw you as hard as they think they can get away with but people get attached for some reason beyond me.

This thread is hilarious... after hundreds of pages of drivel in the Vega Rumor thread. Everyone saw this coming except for one or two people. :)
 
Without driver-level integration and enabling, these features are going to go unused 99% of the time.

This has been the case for AMD graphics since before it was AMD graphics- they'd throw random crap into their hardware and hope that someone used it.

Maybe one AAA title per generation implements said feature(s), usually far less effectively than our current Far Cry 5 example (which is actually pretty cool to see working so well), but the features get ignored and go unused by the industry as a whole. What's worse is that most of these features are just AMD jumping the gun with a half-assed version of something that actually will get standardized later on, and of course that standard won't support the now-obsolete hardware.

Yeah, I owned the very first Radeon too.
 
A better question is why don't YOU care that the hardware you're purchasing is shit compared to the alternative?
Because I can do basic math and understand that, going by price point, AMD gives me the best bang for my money.
 
Because I can do basic math and understand that, going by price point, AMD gives me the best bang for my money.

You must be high. AMD finally caught up to an overclocked 980 Ti from three years ago for roughly the same price. That isn't the best bang for your dollars, my friend.
 
You must be high. AMD finally caught up to an overclocked 980 Ti from three years ago for roughly the same price. That isn't the best bang for your dollars, my friend.
Maxwell doesn't do 10-bit outside of DirectX windows; they are not truly comparable in all aspects for all users. I wouldn't run Maxwell or earlier NV GPUs; they would be a downgrade for productivity IQ.
 
Maxwell doesn't do 10-bit outside of DirectX windows; they are not truly comparable in all aspects for all users. I wouldn't run Maxwell or earlier NV GPUs; they would be a downgrade for productivity IQ.

Sorry, you got me rolling here.

10-bit is so rarely used by desktop users; you either know that you need it and buy the appropriate hardware, or you don't know what you're doing.
 
Maxwell doesn't do 10-bit outside of DirectX windows; they are not truly comparable in all aspects for all users. I wouldn't run Maxwell or earlier NV GPUs; they would be a downgrade for productivity IQ.

But we aren't here talking about the 1%, are we? If even that many. This is an overall card comparison when used in games and referring to FPS. The Vega cards may have newer tech, but when you look at the game performance, who cares?
 
Sorry, you got me rolling here.

10-bit is so rarely used by desktop users; you either know that you need it and buy the appropriate hardware, or you don't know what you're doing.

Support is more commonplace now than it was when the 980 was king; funny that it took them another generation to stop gimping shit. There is finally media to take advantage of it too, not to mention many screens finally have 10-bit LUTs.
Once you see banding, you never want to deal with that shit again.
 
But we aren't here talking about the 1%, are we? If even that many. This is an overall card comparison when used in games and referring to FPS. The Vega cards may have newer tech, but when you look at the game performance, who cares?
Fair call. Just playing devil's advocate on a hardware enthusiasts forum. We are not the masses..
 
Support is more commonplace now than it was when the 980 was king; funny that it took them another generation to stop gimping shit. There is finally media to take advantage of it too, not to mention many screens finally have 10-bit LUTs.
Once you see banding, you never want to deal with that shit again.

As an aside, banding can be caused by a lot of things, but output precision limitations can absolutely cause it with very fine gradients too.

But generally speaking, 10-bit support has only really arrived at the consumer level due to HDR requiring it and HDR being the new marketing buzzword to move product.

Actual displays with consistent output, including the needed 10-bit (or better) LUTs to match up that output to an established workflow, will remain quite expensive relative to displays that are 10-bit for marketing purposes.
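To put some rough numbers behind the banding point (my own back-of-envelope, not from anyone's post above): an 8-bit channel gives 256 levels while 10-bit gives 1024, so a full black-to-white gradient stretched across a 3840-pixel-wide screen produces 15-pixel-wide bands at 8-bit versus under 4 pixels at 10-bit, which is why fine gradients are where the extra depth shows.

```python
# Back-of-envelope banding math (illustrative numbers only).
def gradient_steps(bit_depth: int) -> int:
    """Distinct levels per color channel at a given bit depth."""
    return 2 ** bit_depth

def band_width_px(screen_width_px: int, bit_depth: int) -> float:
    """Approximate width of each band when a full black-to-white
    gradient spans the whole screen width."""
    return screen_width_px / gradient_steps(bit_depth)

print(gradient_steps(8))        # 256 levels
print(gradient_steps(10))       # 1024 levels
print(band_width_px(3840, 8))   # 15.0 px per band on a 4K-wide gradient
print(band_width_px(3840, 10))  # 3.75 px per band: much harder to see
```

Real-world banding also depends on dithering and the content itself, so treat this as the worst case for an undithered gradient.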
 
You should probably return to elementary school if you believe so. ;)

You must be high. AMD finally caught up to an overclocked 980 Ti from three years ago for roughly the same price. That isn't the best bang for your dollars, my friend.

Who says I am looking at high-end cards? Mid-range fits my needs perfectly, and before miners fucked pricing up, AMD was/is the best value.

But hey, haters gonna be ignorant and hate.
 
Who says I am looking at high-end cards? Mid-range fits my needs perfectly, and before miners fucked pricing up, AMD was/is the best value.

But hey, haters gonna be ignorant and hate.

There's no hate in the fact that base Vega 56 cards are going for the same price today as I paid for my 1080Ti with AIO last year.

You can complain about mining, but you cannot ignore the effect on pricing.
 
There's no hate in the fact that base Vega 56 cards are going for the same price today as I paid for my 1080Ti with AIO last year.

You can complain about mining, but you cannot ignore the effect on pricing.

But video cards are not the same as toilet paper; if they're too expensive, just don't buy them?

The only people that have been complaining about pricing in this forum consistently were Nvidia customers.
 
The fucking hell is an Nvidia customer?

You mean you never noticed people continuing to post about Nvidia even when it was clear AMD was better? Look at this guy; he posts a lot on this forum, and it seems the idea of price/performance is not lost on him:
https://hardforum.com/threads/amd-radeon-r9-290x-video-card-review-h.1787400/page-21#post-1040320581

Even when Nvidia is better, they come over here and cry about AMD not being competitive, to drive the cost down on their Nvidia card (which is rather odd; either it is worth the money or it isn't)...
 
You mean you never noticed people continuing to post about Nvidia even when it was clear AMD was better? Look at this guy; he posts a lot on this forum, and it seems the idea of price/performance is not lost on him:
https://hardforum.com/threads/amd-radeon-r9-290x-video-card-review-h.1787400/page-21#post-1040320581

Even when Nvidia is better, they come over here and cry about AMD not being competitive, to drive the cost down on their Nvidia card (which is rather odd; either it is worth the money or it isn't)...

So you are projecting your own small-minded perspective of seeing everyone as exclusive fanbois on to others?

Glad we understand each other.

I buy what works, thanks.
 
So you are projecting your own small-minded perspective of seeing everyone as exclusive fanbois on to others?

Glad we understand each other.

I buy what works, thanks.

Actually, you are proving my point for me. Most people can offer some validation for their opinions without resorting to insults, or pretending to explain what other people wrote while quoting someone who said a totally different thing.
If you have trouble understanding things, just ask...
 
Vega 56... $470. Okay. $700 is ~49% more money. According to the performance data posted above, the 1080 Ti is ~33% faster for ~49% more money.

Similarly, the LC Vega 64 edition is $700, and 13% faster than Vega 56.

VALUE.

Buying a 980 Ti for $650 2+ years ago and having it perform within 10% of a 1080 (aka Vega 64)? Fucking Nvidia buyers and their stupid purchasing decisions, eh?
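A quick sanity check of that math (prices and the ~33% performance delta are taken from the post, not from independent benchmarks):

```python
# Sanity check of the price/perf claims above.
# Inputs come from the forum post, not from benchmarks.
vega56_price = 470.0
ti_price = 700.0
perf_delta = 0.33  # claimed: 1080 Ti ~33% faster than Vega 56

price_premium = ti_price / vega56_price - 1
print(f"1080 Ti price premium: {price_premium:.1%}")  # 48.9%

# Performance per dollar, with Vega 56 normalized to 1.0
vega_perf_per_dollar = 1.0 / vega56_price
ti_perf_per_dollar = (1.0 + perf_delta) / ti_price
ratio = ti_perf_per_dollar / vega_perf_per_dollar
print(f"1080 Ti perf/$ relative to Vega 56: {ratio:.2f}x")  # 0.89x
```

So by these numbers the 1080 Ti delivers about 89% of the Vega 56's performance per dollar; whether the absolute performance lead is worth the premium is the actual argument here.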
Jesus Christ, we get it... you think Nvidia is teh leet sauce and that competition isn't good.
 
Jesus Christ, we get it... you think Nvidia is teh leet sauce and that competition isn't good.

On the contrary, if you want competition, you shouldn't accept Vega as AMD's best attempt. Letting AMD get away with Vega is the path to no competition.

AMD can do better. AMD has done better. You shouldn't hold their hand and give them the blue ribbon for essentially hyping up a die shrink, promising "Over 100 new features!!1" on said die shrink, and releasing it LITERALLY YEARS AFTER their competition had released faster products that use less power, overclock further, and for all intents and purposes cost less.

I'm not going to let AMD get away with Vega.
 
I'm not going to say that you made a BAD decision. My best friend bought TWO, count 'em, TWO Vega 56 cards for his build. You know how I felt when Far Cry 5 benchmarks came out? I was HAPPY. I was HAPPY that his investment is maybe paying off, because in the end I want competition in the market, and I want my friend to enjoy his games.

Some parents spoil their child and refuse to see their child do any wrong; others discipline their child when they do wrong, because they know the child is capable of more.

I trash Vega because I KNOW AMD is capable of more. I know AMD has made truly generation-defining products in the GPU space in the past. Vega is not good work. Vega is a poor excuse for innovation. If my child presented me with Vega when I know they've accomplished the 7970 or 9700 before, I'd feel they didn't even try.

Vega is not acceptable as a product from AMD.

Its value as a product for sale is all dependent on price and your situation, and I don't scorn ANYONE who's bought a Vega-based card.

I've got 5 TitanXP Jedi editions... one is used for gaming, and I highly regret not selling it off now; it's a micro-stuttering POS in Vermintide 2 and MWO. How the hell is that possible?! If I ever wanted to play mind-numbing, repetitive games like GTA V, then it's the card to use, I guess.

I've got one sitting in the box for when mining takes off again, because the 580 does a better job with non-mainstream games; absolutely no difference between a 1070 and a 580 has been visible in Dark Souls 3, as another example of "WTF did I pay for?"
 
I've got 5 TitanXP Jedi editions... one is used for gaming, and I highly regret not selling it off now; it's a micro-stuttering POS in Vermintide 2 and MWO. How the hell is that possible?! If I ever wanted to play mind-numbing, repetitive games like GTA V, then it's the card to use, I guess.

I've got one sitting in the box for when mining takes off again, because the 580 does a better job with non-mainstream games; absolutely no difference between a 1070 and a 580 has been visible in Dark Souls 3, as another example of "WTF did I pay for?"

A Titan is faster than a 1070 and WAY faster than a 580. A 1070 runs Vermintide 2 perfectly and far better than an RX 580; it will pull 60 fps where a 580 can only pull 40. Also, not sure how you don't see a difference in Dark Souls 3 unless you're running at 720p or blind; from what I've seen at 1440p, the 580 runs 40-50 fps whereas even a 1070 will never drop below the 60 fps cap.
 