AMD Chief Confirms New High-End GPU: "We Will Be Competitive in High-End Graphics"

Nvidia doesn't do anything you wouldn't do in their position; what's wrong is you taking the ass ramming on price, then turning around afterward and saying "thank you, best birthday ever." The Nvidia fan club seems to enjoy the punishment and defend their champion afterward.

We'll see what the new year brings

Honestly, it sounds more like jealousy from you over what other people are spending their money on. No one is forcing those people to pay for a new product. If they buy it, it's their choice. This isn't exactly $750 EpiPens, you know.


But oh wait. You gotta pretend other people look the fool and make an analogy to a prison ass pounding to make yourself feel better. I understand.
 
Yeah, I don't understand why Nvidia didn't just rebrand their 10 series as 11 and charge a higher MSRP, like AMD did with the 470s and 480s. I completely don't understand why Nvidia can justify charging more for products that actually do perform measurably faster and are built on a different process. That's extremely shifty of them.

So you're OK with the 2080 Ti costing $500 more than the card it replaces? The 2080 costing $300 more and the 2070 costing $200 more than the cards they replace? By that logic, when the 3080 Ti comes out, it should cost $2000, amirite.

Really think they'd have done that if AMD had competitive cards? I don't. It doesn't make them bad people; it just means we have to pay through the nose for cards that should cost a third or more LESS than they do.

Think AMD wouldn't be charging the same for GPUs if they were in Nvidia's position? Think Nvidia should be charging less than what the market is willing and eager to pay? I'd suggest googling 'supply and demand'.

It really helps to remember that corporations are running businesses - not charities, not fanboy high school.

Never said AMD wouldn't. I'm saying Intel didn't price gouge when AMD had nothing to compete with them for several years, but Nvidia did. I'm not putting AMD up for pope; they very well may have done the same thing. I'm not brand-bashing anybody, I'm just saying we desperately need some competition in the GPU market so we're not still having to pay MSRP and higher for 2+ year old GTX cards, and nearly twice as much for current-gen cards.
 
So you're OK with the 2080 Ti costing $500 more than the card it replaces? The 2080 costing $300 more and the 2070 costing $200 more than the cards they replace? By that logic, when the 3080 Ti comes out, it should cost $2000, amirite.

Really think they'd have done that if AMD had competitive cards? I don't. It doesn't make them bad people; it just means we have to pay through the nose for cards that should cost a third or more LESS than they do.



Absolutely, 100% fine with it. It's a luxury. If I buy a 2020 4Runner and want a sunroof/moonroof and a hood scoop, I would pay $1,500 more without any hesitation. It's extras, a luxury. It's not like the 2080 Ti and the 1080 Ti are the exact same thing. If I truly felt it was worth it, I would spend the money. Cutting edge is expensive and dulls quickly. You pay the premium. It's been that way since people first traded anything of value with each other.

I'm actually happy with my 1070ti now because I play games that came out 5 years ago. But more power to the people who choose to spend their own money on what they want. Who am I to judge their wants compared to my own and mock them for purchasing things that make them happy?
 
AMD will say anything to give stockholders some sort of hope after the 40% loss the stock took over the last 5 days.
 
I don't disagree with you at all. I absolutely love competition. But what would you do if you were asked to be part of a marathon you've been training for most of your life, and you found out your rival and competitor broke their leg? Do you shoot yourself in the foot to give them a fighting chance?

Given corporate practices, it's more likely that you broke your competitor's leg and only got a slap on the wrist for it.
 
AMD has been showing off a four-stack HBM2 Vega product for a while now. I was expecting an announcement of this part at SIGGRAPH a few months ago. At the same time, however, AMD has made clear that that part is not meant for consumers due to the costs involved. Similarly, there is a Vega die with one stack of HBM that found its way into Intel's Kaby Lake-G. The presumption was that that part would appear in other products, but so far it has been married to Intel's design. Presumably Polaris-based designs are cheaper, but for mobile the smaller footprint would have worked out well for designs that needed a GPU (say, the 10 nm Cannon Lake chips that don't have integrated graphics).

As for Navi, I do think it is getting a bit overhyped, as most AMD GPUs tend to be. I do expect a massive overhaul of the internal design, as Vega wasn't as extensive a change from Polaris as marketing would lead you to believe: both evolved from GCN roots. xGMI support has been leaked in Linux drivers, but I don't think it means what most people are reading into it. My hope is that AMD outflanks nVidia by going with a chiplet design, with xGMI as just the on-package interconnect for it; I have a different idea of how the off-package IO will work. Still, if AMD goes with a chiplet-style design for high-end Navi, they'll certainly be able to scale up and defeat nVidia at the high end through sheer brute force, throwing significantly more silicon into the design until they win. Case in point: nVidia's GV100 is the largest single die ever mass produced at 818 mm², but four 350 mm² dies could combine for a total of 1400 mm² of silicon toward compute. The TU102 die on the RTX 2080 Ti is a 'mere' 754 mm² and would face the same problem from a quad-chiplet Navi. nVidia's architecture is efficient, but it can't overcome such a deficit in raw resources if that is the path AMD is going down. A dual-chiplet design could compete well against the RTX 2080 and RTX 2070, and a single chiplet would compete with the GTX 1080 and below. (And for the record, nVidia is going the chiplet route too, but their efforts appear to be behind AMD's.) AMD's chiplet plans extend to the CPU side as well, but they'll hit server parts first before trickling down to consumers as packaging prices decrease. Or Navi could just be another big monolithic die that attempts to compete, but unless AMD has found a significant means of increasing compute efficiency, they can't match the insane amount of resources nVidia has been throwing into their high-end chip designs. I'm hoping for the former but realistically expecting the latter.
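The economics behind the chiplet argument can be sketched with the classic Poisson yield model (yield ≈ e^(−D·A), for defect density D and die area A); the defect density below is an assumed, illustrative number, not a real fab figure:

```python
import math

def die_yield(area_mm2, defects_per_mm2):
    """Poisson defect model: expected fraction of good dies at a given die area."""
    return math.exp(-defects_per_mm2 * area_mm2)

D = 0.001  # assumed defect density in defects/mm^2 -- illustrative only

mono = die_yield(818, D)     # one GV100-sized monolithic die
chiplet = die_yield(350, D)  # one 350 mm^2 chiplet

print(f"818 mm^2 monolithic die yield: {mono:.1%}")
print(f"350 mm^2 chiplet die yield:    {chiplet:.1%}")
# With known-good-die testing, defective chiplets are discarded individually
# before packaging, so a 1400 mm^2 quad-chiplet package is assembled only from
# chiplets that already passed, rather than eating the much lower yield of one
# huge die.
```

The advantage compounds because a defect on a chiplet scraps only 350 mm² of silicon, while a defect on the monolithic part scraps the whole 818 mm².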

There are also a couple of features in both AMD and nVidia designs that I'm waiting to see enabled on the consumer side. Both parties have done extensive work to build support for a unified memory space between the CPU and GPU, but actual use of it hasn't come to market. I also have a nagging suspicion that both nVidia and AMD have higher-bandwidth IO protocols in their designs that haven't been enabled. Case in point: I'm surprised that Turing doesn't have PCIe 4.0 support. While PCIe 4.0 hasn't arrived in the vast x86 world, IBM has been shipping PCIe 4.0-enabled POWER designs for nearly a year now. Depending on how it plays out, especially on the Quadro side for nVidia, we may end up seeing an NVSwitch bridge released if they figure out how to do 3-way and 4-way GPU scaling again. Ray tracing is supposed to scale up nearly linearly with additional processing power. Speaking of scaling up linearly, the Quadro side of things does permit memory sharing, so two 24 GB cards can now operate as if there were a single 48 GB pool of memory. This feature has yet to appear on the consumer side, though strictly speaking it'd come with a performance hit. There was talk of a hybrid mode where only part of the memory capacity would be shared across cards while key assets would be mirrored on every card (pretty much how SLI traditionally worked). AMD isn't to be left out of the high-bandwidth IO party either: they are a member of the OpenCAPI consortium. It was rumored early on that both the Zeppelin and Vega 10 dies had OpenCAPI capabilities, or at least the ability to route their own Infinity Fabric over the pins used by PCIe. AMD did divulge that Vega has Infinity Fabric on-die but never really delved into why on that product. Perhaps a Zeppelin + Vega in a single package was explored at some point? Still, moving forward I would expect AMD to leverage OpenCAPI to connect their GPUs to their EPYC server chips.
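For scale on the PCIe point: the theoretical per-direction bandwidth of a x16 link roughly doubles from 3.0 to 4.0, which falls straight out of the per-lane transfer rates and the 128b/130b line encoding both generations use:

```python
def pcie_x16_bandwidth_gbs(transfer_rate_gts, lanes=16):
    """Theoretical per-direction bandwidth in GB/s for a PCIe 3.0/4.0 link
    (128b/130b encoding, 8 bits per byte)."""
    return transfer_rate_gts * (128 / 130) * lanes / 8

print(f"PCIe 3.0 x16: {pcie_x16_bandwidth_gbs(8):.2f} GB/s")   # 8 GT/s per lane
print(f"PCIe 4.0 x16: {pcie_x16_bandwidth_gbs(16):.2f} GB/s")  # 16 GT/s per lane
```

These are link maxima, not achievable application throughput, but they show why a bandwidth-hungry GPU interconnect would want the newer generation.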

The other thing worth pointing out about where Navi can place in the market is that nVidia still has room with Turing to push performance further: none of their Turing lineup is fully enabled. This is most noticeable on the high-end RTX 2080 Ti, where they can add an extra 1 GB of GDDR6, though I suspect that is coming down the line regardless of what AMD does, for a hypothetical 'Titan T' card. An RTX 2070 Ti/RTX 2075 with more enabled ALUs and higher-clocked GDDR6 is very probable in my eyes if AMD can stage a comeback. The same could be done to the RTX 2080, but nVidia shot themselves in the foot with their naming convention by already releasing an RTX 2080 Ti. Ideally they should have named the TU102 part the RTX 2090 to give their naming convention room to expand.
 
Absolutely, 100% fine with it. It's a luxury. If I buy a 2020 4Runner and want a sunroof/moonroof and a hood scoop, I would pay $1,500 more without any hesitation. It's extras, a luxury. It's not like the 2080 Ti and the 1080 Ti are the exact same thing. If I truly felt it was worth it, I would spend the money. Cutting edge is expensive and dulls quickly. You pay the premium. It's been that way since people first traded anything of value with each other.

I get what you're saying, but to continue with your analogy: Toyota makes the 2015 4Runner and sells it for $30,000. They don't make one in 2016 or 2017, and when the 2018 model finally comes out, it costs $60,000 but only has a few minor upgrades. Can't afford that? No problem, because Toyota is still selling their 2015 models, but now they're $35,000.

I totally get paying more for a faster card, and the 2080 should be more expensive than the 2070, but the 2080 should NOT be $300 more than a 1080, a 2+ year old card that they're phasing out but that somehow still costs as much as, if not more than, it did when it was released those 2+ years ago.
 
Given corporate practices, it's more likely that you broke your competitor's leg and only got a slap on the wrist for it.

Hedge your bets on one horse in your stable and it dies because of someone else's fault? It's still your fault for not having the foresight to diversify. No one likes a guy who cries, points fingers, and focuses on slights over the guy who takes the punches and fights. AMD was amazing, and they have a chance to be amazing again. I'm rooting for them if they're going to be fighters. But their fans, I've found, have been very vocal whiners. Not all, but many.
 
I get what you're saying, but to continue with your analogy: Toyota makes the 2015 4Runner and sells it for $30,000. They don't make one in 2016 or 2017, and when the 2018 model finally comes out, it costs $60,000 but only has a few minor upgrades. Can't afford that? No problem, because Toyota is still selling their 2015 models, but now they're $35,000.

I totally get paying more for a faster card, and the 2080 should be more expensive than the 2070, but the 2080 should NOT be $300 more than a 1080, a 2+ year old card that they're phasing out but that somehow still costs as much as, if not more than, it did when it was released those 2+ years ago.

The extra is for unproven tech, which the gambler in me appreciates: buy early into the possibility that the promised features might be revolutionary. But a 2015 Toyota wouldn't be a good analogy, because remember, the 2080 also comes with a completely new warranty and support, back-end stuff that costs money. A 1080 bought new does too, while a 2015 Toyota would already have depreciation; it wouldn't be the same. Now, if Toyota took their old sitting stock from 2015, put a new chassis on it, and sold it as 'new', that would be a little fucked up.
 
Yeah, I don't understand why Nvidia didn't just rebrand their 10 series as 11 and charge a higher MSRP, like AMD did with the 470s and 480s. I completely don't understand why Nvidia can justify charging more for products that actually do perform measurably faster and are built on a different process. That's extremely shifty of them.
Actually they did.
The 2080 = a rebranded 1080 Ti + ray tracing + $200
The 2070 = a rebranded 1080 + ray tracing + $100
 
Actually they did.
The 2080 = a rebranded 1080 Ti + ray tracing + $200
The 2070 = a rebranded 1080 + ray tracing + $100

Oh, I didn't realize they used the exact same process and hardware and just upped the clock speed, like the 480 to the 580.

Let me do a quick BIOS update on my 1070 Ti so I can have some RTX cores. BRB.
 
This is really interesting, normally I would take these sort of statements as marketing hype, but Lisa Su is definitely not known for overpromising. All the way from financials to actual products, this version of AMD is quite conservative. "We will be..." is not something to take lightly.
 
Actually they did.
The 2080 = a rebranded 1080 Ti + ray tracing + $200
The 2070 = a rebranded 1080 + ray tracing + $100

I can't help but notice that you actually did give Nvidia a huge pat on the back. They didn't make a lateral move; they followed the path of making each next-gen card the same speed as one tier up from the generation before. Price was never locked, it was pretty fluid. So while some people may balk at paying more, I've noticed that performance has improved even despite a lack of competition. I find that admirable.



I'm not loyal to Nvidia, by the way. They are just the better choice right now. I'm perfectly content going to AMD if they release something nice. Hell, I only have 1 G-Sync monitor and 5 FreeSync ones; an AMD option would be wonderful.
 
I can't help but notice that you actually did give Nvidia a huge pat on the back. They didn't make a lateral move; they followed the path of making each next-gen card the same speed as one tier up from the generation before. Price was never locked, it was pretty fluid. So while some people may balk at paying more, I've noticed that performance has improved even despite a lack of competition. I find that admirable.



I'm not loyal to Nvidia, by the way. They are just the better choice right now. I'm perfectly content going to AMD if they release something nice. Hell, I only have 1 G-Sync monitor and 5 FreeSync ones; an AMD option would be wonderful.
I don't recall AMD ever releasing a card with the same performance (±10%) for a higher MSRP. Perhaps I'm wrong, though; please remind me which cards those were.
BTW, agreed that Nvidia is the best GPU to buy right now, it's obvious. But that doesn't justify the milking prices.
I bought a used 1070 Ti for $300; it was a no-brainer at that price, so I'm clearly not an AMD-exclusive fan.
 
I have a feeling Nvidia fans are going to shit themselves when RTX finally makes its way into gaming titles; the performance loss is going to really piss people off. Which is good for AMD :)

So they can be three generations behind at rasterization and ray tracing?
 
So they can be three generations behind at rasterization and ray tracing?

I'm going to gamble that full implementation of ray tracing without massive performance issues is 5 years away. In the meantime, they need to focus on raw performance while working behind the scenes on a card with ray tracing that doesn't take the performance hit we will shortly see out of the RTX line.

Sometimes it's good to play second place; you can always learn from the mistakes of the person or company ahead of you.
 
I'm going to gamble that full implementation of ray tracing without massive performance issues is 5 years away. In the meantime, they need to focus on raw performance while working behind the scenes on a card with ray tracing that doesn't take the performance hit we will shortly see out of the RTX line.

Sometimes it's good to play second place; you can always learn from the mistakes of the person or company ahead of you.

I am going to say there is a good chance you won't see RTX enabled this year.
 
I am so tired of this stupid and ultimately anti-capitalist argument. You think it's pro-capitalism, but it's not; competition is, and that is all he and most others are asking for.

Like you have competition among ISPs? How is that one working out, exactly?

I can tell you how it worked out for AMD: they had the R9 290X, that card was way faster, and the competition did nothing for them.
In the end you decide what you buy, and what you pay is what you think it's worth. No one is standing behind you while you purchase, making you buy anything; people who claim otherwise are just fooling themselves.

Every time you hand over money to Nvidia, nothing will stop them from asking for more, because they can, and they will keep doing it regardless of competition.

I'm going to gamble that full implementation of ray tracing without massive performance issues is 5 years away. In the meantime, they need to focus on raw performance while working behind the scenes on a card with ray tracing that doesn't take the performance hit we will shortly see out of the RTX line.

Sometimes it's good to play second place; you can always learn from the mistakes of the person or company ahead of you.
But then you can't play all of the ray-tracing games that will be flooding the market, because Nvidia will invest their money straight into game development and make sure every game has ray tracing.
 
I am going to say there is a good chance you won't see RTX enabled this year.

Oh, absolutely not. Which is why it's questionable to buy an RTX card: you're basically paying for an advertisement and a promise. Never pay for promises.
 
But then you can't play all of the ray-tracing games that will be flooding the market, because Nvidia will invest their money straight into game development and make sure every game has ray tracing.

I'm sure they will. It will be like when they tried to pump PhysX into everything. What did we get, 2-3 games that were actually worth using it in? Borderlands and Batman. I think I'll keep living in my rasterized world.
 
I can tell you how it worked out for AMD: they had the R9 290X, that card was way faster, and the competition did nothing for them.

Well, nearly all of that was on AMD, which is not atypical of the 'competition' they bring. Crap drivers were a lot of it, but the card was also loud and hot, and by the time they had that sorted, the competition had superseded them.
 
Going to be awesome when AMD is no longer around in the GPU segment.
With no competition we can look forward to the Intel decade of quad core releases with +3% performance improvement each year and founders edition pricing.
 
I'm going to gamble that full implementation of ray tracing without massive performance issues is 5 years away.
I am going to say there is a good chance you won't see RTX enabled this year.
It will be like when they tried to pump PhysX into everything.

:ROFLMAO:

Because it's not already working in all of the engines?

And five years? At worst, it will be next year, and that will be due to typical developer release slippage; ray tracing is the easy part. And I love the half-assed PhysX comparisons, no better way to say "I have no idea what's going on" lol!
 
Well, nearly all of that was on AMD, which is not atypical of the 'competition' they bring. Crap drivers were a lot of it, but the card was also loud and hot, and by the time they had that sorted, the competition had superseded them.

Yeah, I think that's the biggest hump they need to get over: delivering a card that doesn't run hot or use so much power. Once they can do that, the sky is the limit. I hope Navi won't follow that trend, but you know what they say about history.
 
Going to be awesome when AMD is no longer around in the GPU segment.
With no competition we can look forward to the Intel decade of quad core releases with +3% performance improvement each year and founders edition pricing.
Consumerism: you are your own victim ;)
 
Going to be awesome when AMD is no longer around in the GPU segment.
With no competition we can look forward to the Intel decade of quad core releases with +3% performance improvement each year and founders edition pricing.

Given how easy ray tracing is to implement, Intel will likely be the competition that AMD never could manage to be. Of course, that would take an epic failure on AMD's part, and as much as I believe they're capable of staying competitive, they just keep working so hard to prove my faith misguided.
 
:ROFLMAO:

Because it's not already working in all of the engines?

And five years? At worst, it will be next year, and that will be due to typical developer release slippage; ray tracing is the easy part. And I love the half-assed PhysX comparisons, no better way to say "I have no idea what's going on" lol!

I compare it to PhysX because that was a big selling point for Nvidia, just like RTX.
 
Yeah, I think that's the biggest hump they need to get over: delivering a card that doesn't run hot or use so much power. Once they can do that, the sky is the limit. I hope Navi won't follow that trend, but you know what they say about history.

Let's say they leveraged some of their current revenue glut toward RTG: they could very well produce a more specialized product line, akin to how Nvidia addresses the market, and perhaps tamp down on the power draw and the resulting noise and heat.
 
:ROFLMAO:

Because it's not already working in all of the engines?

And five years? At worst, it will be next year, and that will be due to typical developer release slippage; ray tracing is the easy part. And I love the half-assed PhysX comparisons, no better way to say "I have no idea what's going on" lol!

Yawn with your ranting. Wake me when RTX is finally enabled for even Shadow of the Tomb Raider, a game everyone is pretty much already done with and that they advertised the feature for, and zzzzzz.
 
Let's say they leveraged some of their current revenue glut toward RTG: they could very well produce a more specialized product line, akin to how Nvidia addresses the market, and perhaps tamp down on the power draw and the resulting noise and heat.

I think that is what they are trying to do. They knew they couldn't compete with Nvidia to make a good enough profit for research, but they could with Intel, and they have succeeded. I'm sure the plan was to funnel some of the money made from Ryzen into funding research on their GPUs.
 
I compare it to PhysX because that was a big selling point for Nvidia, just like RTX.

From a marketing standpoint, sure, but it's incomparable from a technology standpoint. PhysX made sense from the perspective of limited threading on CPUs, the same way that hardware audio made sense; now that's all software too.

Ray tracing is an entirely different level of processing.
 
Yawn with your ranting. Wake me when RTX is finally enabled for even Shadow of the Tomb Raider, a game everyone is pretty much already done with and that they advertised the feature for, and zzzzzz.

'Yawn' right back at you, with your insistence on criticizing a technology that's been proven and absolutely is the future of real-time graphics, just because you can't wait a month.
 
1. NVidia's actions are shitty and consumer-hostile.
2. Without competition, NVidia has every right (and a corporate imperative) to maximize prices to meet demand.
3. Prices are out of line with delivered performance (using history as a guide), and therefore consumers should be strongly encouraged to balk.
4. If consumers balk, demand will drop and so will prices (and we may be seeing some of this, as $999 2080 Tis are starting to show up on websites, down from launch prices ranging from $1,199 to $2,499).

There is no reason to accept that just because something can happen, it also must happen. The correct response to NVidia in the short term is to take demand away from them by buying out remaining stock of the 10xx series and ignoring the 20xx series until the middle of next year, when actual RTX-supporting software just begins to become available. Noting the exceptionally poor price/performance ratio is acceptable from a consumer perspective, and hopefully sales of the 2080 Ti are negligible (although, given the pricing, I believe NVidia is hoping for that too, as yields are likely exceptionally poor for that particular die).

The most important takeaway is that pricing is not one-sided; if you believe the value proposition is poor, either look for alternatives or don't buy it. If enough people actually did that, NVidia would respond to rebalance supply / demand pricing levels to remain profitable. But it does mean that in the short-term, some of us would actually have to "do without". =)
 
The most important takeaway is that pricing is not one-sided; if you believe the value proposition is poor, either look for alternatives or don't buy it. If enough people actually did that, NVidia would respond to rebalance supply / demand pricing levels to remain profitable. But it does mean that in the short-term, some of us would actually have to "do without". =)

And doing without is not something today's generation is willing to do, nor do they have the patience for. We've long since entered the "now generation".
 
Agreed. Framerates always win.

That, "sir", is exactly why Ngreedia "wins": it is way too easy to fudge things and get higher FPS numbers even if the end result is anything but a "smooth" and "good-looking" experience. It is far, far too easy to screw with software to make the numbers appear larger than they actually are; Intel has been doing this for decades, and Nv is on the same bandwagon as well (both open their pocketbooks to third-party devs to "tweak").

Objects in mirror are not always as they appear.

As for some of the other folks saying AMD has never had a faster GPU than Nv, or for that matter faster for less:

That depends on your thinking.
The HD 4000, HD 5000, HD 6000, and HD 7000 series, and "some" of the R5/R7/R9 cards (some of course being pure rebrands), absolutely were competitive with Nv, if not trashing them, if one takes every side into question: features, performance, build quality, sound profile, cost in power, heat, and temperatures, game performance quality (eye candy as "real" as possible, not BS), density per mm², performance per watt, and so forth.

Yes, Intel was mostly the "better" choice for CPUs up until the 3000/4000 series if one chases overall performance for the price, power use, and build quality, but GPU-wise Nv is not as clear-cut a winner in my opinion, mainly because since the 500 series they have been cutting things away just to get raw FPS up more than anything else.

-------------------------------------------
-----------------

I guess some folks really do like their trucks (so to speak) to be as loud and fast-sounding as possible, instead of having "sleepers" that are all grunt without resorting to BS tricks to make you think they are faster than they actually are.

Anyways... yes, the "most modern ones" from Nv, being the 900/1000 and now 2000 series, are "fast" and do not use all that much power to do it, BUT there are other shady BS things that Nv has done all along, more so over the last few years than in prior generations, to "achieve" this.

Meh, I know where my $ goes, and it's certainly not into Nv's pocket, that is for damn sure. The difference between AMD and Nv (in my books) is that the former admits to mistakes either ahead of time or when they're "discovered" (and tries to avoid repeating them in future product releases), while the latter never admits to any wrongdoing even when given the proof of it.

Folks keep throwing $$$ at them (Nv) like they're a shriveled-up hooker on a pole waving saggy tits around for bus fare, and then demanding more because she's the best you can get (while blinding you with acid).
 