So buy Ampere or wait for AMD's response?

There's a huge unknown here. You can support HDMI 2.1 without supporting HDMI 2.1 VRR.

Nvidia has a conflict of interest and motivation to protect their G-Sync module business. AMD doesn't.

If AMD supports HDMI 2.1 VRR and Nvidia doesn't, that's a pretty massive advantage IMO.
 
My thoughts on possible RDNA2 cards...!

Radeon RX 6700 / Navi 23 / 36CU / 8GB GDDR6 / $299 / equal to RTX 3060
Radeon RX 6700 XT / Navi 23 / 40CU / 8GB GDDR6 / $399 / equal to RTX 3060 Super

Radeon RX 6800 / Navi 22 / 56CU / 12GB GDDR6 / $499 / equal to RTX 3070
Radeon RX 6800 XT / Navi 22 / 64CU / 12GB GDDR6 / $599 / equal to RTX 3070 Super

Radeon RX 6900 / Navi 21 / 72CU / 16GB GDDR6 / $699 / equal to RTX 3080
Radeon RX 6900 XT / Navi 21 / 80CU / 16GB HBM2e / $899 / equal to RTX 3080 Super

Radeon RX 6950 XT / Navi 21 / dual 80CU / 32GB HBM2e total / $1799 / surpasses the RTX 3090

Raytracing across the board...!!

Winning...!!! ;^p
 
There's a huge unknown here. You can support HDMI 2.1 without supporting HDMI 2.1 VRR.

Nvidia has a conflict of interest and motivation to protect their G-Sync module business. AMD doesn't.

If AMD supports HDMI 2.1 VRR and Nvidia doesn't, that's a pretty massive advantage IMO.

Turing already supports HDMI VRR. Why would they remove a feature they already have?
 
RT is so limited that it should have next to no impact on anyone's decision making. The same is true for DLSS. Don't get me wrong, they're both great technologies, but unfortunately the majority of devs will build games around console hardware, which means limited RT usage and nothing that will heavily impact performance. DLSS is still not a ubiquitous technology, and with it being proprietary it might as well not exist.

I’ll borrow this from FrgMstr for your viewing pleasure:
View attachment 275494

It was quite hilarious when Jensen led the event with "We have tons of ray tracing games coming". Sure you do, buddy. At Turing's release, a lot of us knew it was going to take generations for RT to gain any real traction - and a lot of us were right.

In any case, traditional rasterization performance sounds like it's going to be excellent. I'm stoked, but will probably wait to see what Big Navi does. I'd concur with the leanings that Big Navi will likely be 20-30% faster than the 2080 Ti. The simple math, without any core efficiency improvements, is 5700 XT + 45-50% (40 CUs vs 72 CUs).
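Quick back-of-envelope on that last bit. The CU counts are the ones from the post; the scaling-efficiency factor is purely an assumption to illustrate why 80% more CUs might land around +50% performance, not leaked data.

```python
# Rough sketch of the "5700 XT + 45-50%" estimate above.
NAVI10_CUS = 40           # RX 5700 XT
BIG_NAVI_CUS = 72         # rumored cut-down Navi 21 from the post
SCALING_EFFICIENCY = 0.6  # assumed: fraction of extra CUs that turns into real FPS

cu_uplift = BIG_NAVI_CUS / NAVI10_CUS - 1      # +80% raw CUs
perf_uplift = cu_uplift * SCALING_EFFICIENCY   # ~+48% estimated performance

print(f"Raw CU uplift: +{cu_uplift:.0%}")
print(f"Naive perf estimate vs 5700 XT: +{perf_uplift:.0%}")
```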
 
Just depends on how big of a chip they want. Everyone assumes 80 CU is their limit but we don't know anything for sure right now.

I don't think 80 CU is the limit with RDNA. It's just that everyone is assuming that's what they will do for this gen.
 
RT is so limited that it should have next to no impact on anyone's decision making. The same is true for DLSS. Don't get me wrong, they're both great technologies, but unfortunately the majority of devs will build games around console hardware, which means limited RT usage and nothing that will heavily impact performance. DLSS is still not a ubiquitous technology, and with it being proprietary it might as well not exist.

I’ll borrow this from FrgMstr for your viewing pleasure:
View attachment 275494
You didn't play any games with RTX or DLSS this year? :(

Personally, I played 5. Probably will be a few more before the year is over.
I would not invest in a GPU without RT or DLSS, and luckily I won't have to :)
 
The 3090 has only 20% higher raw specs than the 3080. If AMD can come for the 3080 they can challenge the 3090 too.

On price I absolutely agree. For the total raw performance crown? Probably not but I hope so.
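For what it's worth, a quick sanity check of that "only 20% higher raw specs" figure, going by the announced launch specs (CUDA cores and memory bandwidth); treat the exact numbers as my own reading of the spec sheets, not anything from the post.

```python
# Compare the two cards' announced raw specs.
specs = {
    "RTX 3080": {"cuda_cores": 8704,  "mem_bw_gb_s": 760},
    "RTX 3090": {"cuda_cores": 10496, "mem_bw_gb_s": 936},
}

core_ratio = specs["RTX 3090"]["cuda_cores"] / specs["RTX 3080"]["cuda_cores"]
bw_ratio = specs["RTX 3090"]["mem_bw_gb_s"] / specs["RTX 3080"]["mem_bw_gb_s"]

print(f"Core count advantage: +{core_ratio - 1:.0%}")     # roughly +21%
print(f"Memory bandwidth advantage: +{bw_ratio - 1:.0%}")  # roughly +23%
```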
 
My thoughts on possible RDNA2 cards...!

Radeon RX 6700 / Navi 23 / 36CU / 8GB GDDR6 / $299 / equal to RTX 3060
Radeon RX 6700 XT / Navi 23 / 40CU / 8GB GDDR6 / $399 / equal to RTX 3060 Super

Radeon RX 6800 / Navi 22 / 56CU / 12GB GDDR6 / $499 / equal to RTX 3070
Radeon RX 6800 XT / Navi 22 / 64CU / 12GB GDDR6 / $599 / equal to RTX 3070 Super

Radeon RX 6900 / Navi 21 / 72CU / 16GB GDDR6 / $699 / equal to RTX 3080
Radeon RX 6900 XT / Navi 21 / 80CU / 16GB HBM2e / $899 / equal to RTX 3080 Super

Radeon RX 6950 XT / Navi 21 / dual 80CU / 32GB HBM2e total / $1799 / surpasses the RTX 3090

Raytracing across the board...!!

Winning...!!! ;^p

I like your summation up to "6950 XT", probably not too far off. I think it's 50/50 on whether there will be an 80CU model, and if one exists, I really don't see it beating 3090. An 80 CU model will likely be somewhere between 3080 and 3090, and thus I doubt it will cost $1799.
 
You didn't play any games with RTX or DLSS this year? :(

Personally, I played 5. Probably will be a few more before the year is over.
I would not invest in a GPU without RT or DLSS, and luckily I won't have to :)

Just Control, and I ran Quake 2 RTX to see how it looked, but that's about it. So I definitely don't think very highly of RT right now. The problem I see is that consoles are the baseline for RT, so I don't see it being used heavily on PC, or if it is, it will probably be with minimal use that barely impacts performance.
 
Just Control, and I ran Quake 2 RTX to see how it looked, but that's about it. So I definitely don't think very highly of RT right now. The problem I see is that consoles are the baseline for RT, so I don't see it being used heavily on PC, or if it is, it will probably be with minimal use that barely impacts performance.

Consoles will certainly be too slow to do any heavy RT, but just the mere fact that they support it at all is huge. That means developers will add support to their engines even if they only use it lightly. Since RT is a very scalable feature, this will pave the way for easier adoption of scaled-up effects on PC. This will be similar to the situation today where PC gets beefed-up textures, shaders and resolution. Just add "more RT" to that list.
 
Consoles will certainly be too slow to do any heavy RT, but just the mere fact that they support it at all is huge. That means developers will add support to their engines even if they only use it lightly. Since RT is a very scalable feature, this will pave the way for easier adoption of scaled-up effects on PC. This will be similar to the situation today where PC gets beefed-up textures, shaders and resolution. Just add "more RT" to that list.

Global Illumination, which is my favorite RT effect, only has around a 10% hit IIRC, with RTX at least. It's reflections that cost a ton (40%ish), which I care about the least...

They could implement RT on certain aspects that don't hit as hard.
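To put those percentages in perspective, here's what the hits look like at an arbitrary 100 fps baseline; the 10% and 40% figures are just the rough ones quoted above.

```python
# Frame-rate impact of the rough per-effect costs mentioned above.
base_fps = 100.0
effect_cost = {"RT global illumination": 0.10, "RT reflections": 0.40}

for effect, cost in effect_cost.items():
    print(f"{effect}: {base_fps * (1 - cost):.0f} fps (from {base_fps:.0f} fps)")
# RT global illumination: 90 fps
# RT reflections: 60 fps
```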
 
But the 3080 requires a 750 watt power supply and the 3070 requires a 650 watt power supply. So I guess I would get a 3070.
I don't see why my Corsair AX760 couldn't handle a 3080... The card doesn't pull that much more than a GTX 1080 Ti. Even if it pulls 300W, the rest of my system pulls maybe 200W. A decent 750W PSU is enough. If you have a $50 PSU with 750W you should think about replacing that.
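Rough headroom math behind that, using the same assumed numbers (300W for the card, ~200W for the rest of the system):

```python
# Simple PSU headroom estimate; the draw figures are the assumptions from the post.
psu_watts = 760              # Corsair AX760
gpu_watts = 300              # assumed worst case for a 3080 here
rest_of_system_watts = 200   # rough estimate for CPU, drives, fans, etc.

total_load = gpu_watts + rest_of_system_watts
headroom = psu_watts - total_load

print(f"Estimated load: {total_load}W, headroom: {headroom}W "
      f"({headroom / psu_watts:.0%} of the PSU rating)")
# Estimated load: 500W, headroom: 260W (34% of the PSU rating)
```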
 
Just Control, and I ran Quake 2 RTX to see how it looked, but that's about it. So I definitely don't think very highly of RT right now. The problem I see is that consoles are the baseline for RT, so I don't see it being used heavily on PC, or if it is, it will probably be with minimal use that barely impacts performance.

That's the main thing that has me worried about more widespread DLSS support. There's still lots of value in the Turing cards if DLSS were widely used.
 
There have been no leaks about AMD cards and the RTX 3080 goes on sale on the 17th.
If they have anything competitive, it would be sensible to leak something to make people wait for their cards. If nothing comes out, I will assume they have nothing to compete with the RTX 3080.
 
My guess is they will drop these cards, wait for AMD, then drop the 3080 Ti version. I will just wait for the Ti version. I have always had good luck with those cards, the EVGA 1080 Ti being the latest.
 
My guess is they will drop these cards, wait for AMD, then drop the 3080 Ti version. I will just wait for the Ti version. I have always had good luck with those cards, the EVGA 1080 Ti being the latest.

I don't think we would see a 3080 Super/Ti for at least 6 months. The only variation likely before then is 20GB cards.
 
If you're satisfied with the 3080 or the 3090 performance/price wise, I'd say go for it. So far, we have no reason to believe AMD will be able to counter those options - I do hope they can, for the sake of competition, but there's been no indication of this so far. The actually interesting conversation is the 3070 and the 3060. AMD will surely have something to counter those, and if the pricing is right + AMD's usually more generous nature with RAM... that could be a compelling option and reason to wait.
 
That's the main thing that has me worried about more widespread DLSS support. There's still lots of value in the Turing cards if DLSS were widely used.
If history is any indication, it probably won't be. Maybe select titles that Nvidia promotes. But once the cards are sold on the hype, I don't think they care too much about following through. It was the same with DLSS: so much hype, but hit and miss with follow-through. I just don't think developers can be counted on for something that requires special effort and is not part of universal code.
 
If history is any indication, it probably won't be. Maybe select titles that Nvidia promotes. But once the cards are sold on the hype, I don't think they care too much about following through. It was the same with DLSS: so much hype, but hit and miss with follow-through. I just don't think developers can be counted on for something that requires special effort and is not part of universal code.

I think the ratio of developer effort vs FPS gains for DLSS 2.0 is too favorable to be ignored.

It's a similar amount of work to implementing TAA, which nearly every game does now. This isn't an image feature like ray tracing, which has to be coded into every scene in the game. It's simply something hooked into the pipeline once and ignored after.

AMD needs its own system to counter this. If the hooks into the game engine can be decently aligned, then developers will just write a piece of code to abstract the difference away, and most games will eventually get options to choose between AMD and Nvidia advanced scaler-AA systems.
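A minimal sketch of the kind of abstraction I mean; every class and function name here is hypothetical (real integrations go through each vendor's native SDK inside the engine, not Python), it's just to show that the engine only has to pick a backend once.

```python
from abc import ABC, abstractmethod


class Upscaler(ABC):
    """Engine-facing interface, hooked into the render pipeline once."""

    @abstractmethod
    def upscale(self, low_res_frame, motion_vectors, target_resolution):
        ...


class DLSSUpscaler(Upscaler):
    def upscale(self, low_res_frame, motion_vectors, target_resolution):
        # Hypothetical stand-in for a call into Nvidia's DLSS SDK.
        return f"DLSS output at {target_resolution}"


class AMDUpscaler(Upscaler):
    def upscale(self, low_res_frame, motion_vectors, target_resolution):
        # Hypothetical stand-in for whatever AMD's equivalent ends up being.
        return f"AMD-upscaled output at {target_resolution}"


def pick_upscaler(gpu_vendor: str) -> Upscaler:
    # Chosen once at startup; the rest of the pipeline just calls upscale()
    # every frame and never cares which vendor is behind it.
    return DLSSUpscaler() if gpu_vendor == "nvidia" else AMDUpscaler()


if __name__ == "__main__":
    upscaler = pick_upscaler("nvidia")
    print(upscaler.upscale(low_res_frame=None, motion_vectors=None,
                           target_resolution="3840x2160"))
```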
 
I want a custom PCB 3080 card and a waterblock; which means I won't be able to buy on release day anyway. Waiting a few weeks to see if AMD is going to be competitive isn't going to cost me anything.
 
I think the ratio of developer effort vs FPS gains for DLSS 2.0 is too favorable to be ignored.

It's a similar amount of work to implementing TAA, which nearly every game does now. This isn't an image feature like ray tracing, which has to be coded into every scene in the game. It's simply something hooked into the pipeline once and ignored after.

AMD needs its own system to counter this. If the hooks into the game engine can be decently aligned, then developers will just write a piece of code to abstract the difference away, and most games will eventually get options to choose between AMD and Nvidia advanced scaler-AA systems.

Well, that's what I am saying. Even if AMD comes out with something exclusive, you won't have developers code for it. So it would have to be something that does it automatically without too much developer effort. If history is any indication, any game with DLSS won't take advantage of anything AMD does if they do something closed like DLSS. Nvidia will just put an exclusive tag on the game, so it's a lose-lose situation for AMD. Unless they somehow do it on their end or as a part of DX. Because Nvidia will never be about open standards.
 
I want a custom PCB 3080 card and a waterblock; which means I won't be able to buy on release day anyway. Waiting a few weeks to see if AMD is going to be competitive isn't going to cost me anything.
I thought AIBs are using the reference design for the 3080, so maybe EKWB will have something for the FE and the reference AIB cards; of course, custom PCBs will take some time to be supported. But I see no value in buying an overpriced custom PCB if you want to watercool.
 
Turing already supports HDMI VRR. Why would they remove a feature they already have?

Doesn't that only work specifically with the LG 9-series OLEDs? There are other LG TVs with the same chipset where it's not enabled, and some other manufacturers' TVs with HDMI 2.1 where it's not enabled either.
 
No, there's a difference between what's going on with the LG OLEDs and the actual HDMI 2.1 VRR spec. The LG OLED implementation was proprietary which is why it doesn't work on other TVs.
Would you provide documentation for that?

Based on what I've read in the thread on the 48" CX, LG's implementation is HDMI VRR, whereas implementations such as Samsung's have been the unstandardized 'HDMI FreeSync' that AMD invented.
 
It looks like Nvidia is really controlling the benches right now, like with Doom and a few other games, so it's probably showing you the best-case scenario. I think on overall average the 3080 will probably end up being 30-40% faster than the 2080 Ti, give or take, and 60-70% or so vs the 2080 when it's tested across a variety of games. Still great, but not as great as the limited number of games they are showing.
 
I thought AIBs are using the reference design for the 3080, so maybe EKWB will have something for the FE and the reference AIB cards; of course, custom PCBs will take some time to be supported. But I see no value in buying an overpriced custom PCB if you want to watercool.

What I want is the better power delivery and higher power limits from a custom design. Water cooling is to get all the extra heat away without going deaf.
 
Doesn't that only work specifically with the LG 9-series OLEDs? There are other LG TVs with the same chipset where it's not enabled, and some other manufacturers' TVs with HDMI 2.1 where it's not enabled either.

Do you have an example of an HDMI 2.1 VRR TV that doesn't work with Turing?
 
Normally I'd say wait for benchmarks, but that DF video left me believing much of the Ampere hype. Sadly, Big Navi is still a mystery. AMD still seems pretty hyped about their upcoming products, but I've lived through Bulldozer hype, Polaris hype, Vega hype, and Navi hype. Ryzen was probably the only time since the Athlon days that AMD really needed a product to shine and seriously nailed it. RDNA2 will probably be competitive at some tiers, but only time will tell if they can hang with the 3080.
 
Normally I'd say wait for benchmarks, but that DF video left me believing much of the Ampere hype. Sadly, Big Navi is still a mystery. AMD still seems pretty hyped about their upcoming products, but I've lived through Bulldozer hype, Polaris hype, Vega hype, and Navi hype. Ryzen was probably the only time since the Athlon days that AMD really needed a product to shine and seriously nailed it. RDNA2 will probably be competitive at some tiers, but only time will tell if they can hang with the 3080.

I think some people are reading between the lines: the tweet saying GA104 won't be able to compete, the leaked benchmark a few months ago showing an RDNA card ~30% faster than the 2080 Ti, plus the internal rumors saying AMD doesn't think their card will beat the 3080, maybe imply a card around 10-15% behind the 3080. If so, they probably can't launch it for more than $550-599 at most.
 