Where Gaming Begins: Ep. 2 | AMD Radeon™ RX 6000 Series Graphics Cards - 11am CDT 10/28/20

More like 5 years ago, and yes, 8k is pointless for the next 5+ years.
I don't think flat 8K panels will ever be worth it for the average PC gamer using a desktop monitor; couch/TV gamers, maybe. I can, however, easily see 8K or even higher-resolution VR being HUGE.
 
No need to lose any more money here.
Yep. And while AMD is guaranteed to be enjoying decent margins on these, the same can't be said for Nvidia, or for both Nvidia and Samsung. There's little doubt that their margins are slim to none right now. I've heard Samsung is essentially eating all their costs. They can afford to, no problem; they're huge, and it's better to have a good partner, especially one that is presumably working *very* hard to improve their manufacturing. But ultimately:

Nvidia probably can't afford to price their cards lower. We already know there's a secret $50 price bump baked into at least the 3080 dies.

AMD could probably lower their prices, but like everyone's saying, why? It would be friendly, but dumb.

We're going to need a fundamental or generational shift before there's really room for a price war.
 
I don't think flat 8K panels will ever be worth it for the average PC gamer using a desktop monitor; couch/TV gamers, maybe. I can, however, easily see 8K or even higher-resolution VR being HUGE.

We are still 5+ years away from any serious adoption, likely longer for hardware that can push it with any serious IQ settings and not using trickery like DLSS.
 
I wouldn't say significantly, outside the 3090, and they don't have anything up there to counter the CUDA environment; despite the 8K gaming bullshit, the 3090 is a workstation card. But those were serious shots fired. I look forward to seeing Nvidia's response, and not their press release; I want to see their boots-on-the-ground war faces.
When I say significantly (price): a 3090 at $1,500 MSRP vs. a 6900 XT at $1k...

$500 - that's pretty significant, I'd say. Plus, try finding a card at MSRP. eBay is the only place to get a 3090, and they're going for $2,300+.
This BS will end in a month.
 
More like 5 years ago, and yes, 8k is pointless for the next 5+ years.

Meh, who is counting; I'd rather not think about how much I have aged. Anyway, my original point was that since pushing rendering resolution up with brute-force performance is getting harder, techniques like DLSS (and whatever AMD's equivalent turns out to be) are the future, I firmly believe. Hopefully there will eventually be hardware-vendor-agnostic open solutions, and not just DLSS under Nvidia's stranglehold.
 
Meh, who is counting; I'd rather not think about how much I have aged. Anyway, my original point was that since pushing rendering resolution up with brute-force performance is getting harder, techniques like DLSS (and whatever AMD's equivalent turns out to be) are the future, I firmly believe. Hopefully there will eventually be hardware-vendor-agnostic open solutions, and not just DLSS under Nvidia's stranglehold.

Don't buy tech today for the future, buy the future tech in the future.

Today 8k is pointless.
 
I didn't watch, but I'm guessing there's no date for mid-range cards yet?
 
When I say significantly (price): a 3090 at $1,500 MSRP vs. a 6900 XT at $1k...

$500 - that's pretty significant, I'd say. Plus, try finding a card at MSRP. eBay is the only place to get a 3090, and they're going for $2,300+.
This BS will end in a month.
That’s why I said “outside the 3090”; the pricing for the rest of the stack is pretty close. I’m not going to count scalper prices for anything, because until AMD’s cards hit the shelves there is nothing saying their cards will fare any better.
 
Not sure if this is the place to discuss this, but I'm hearing a lot of negative feedback about AMD's drivers. Are they that bad?
 
Meh, who is counting; I'd rather not think about how much I have aged. Anyway, my original point was that since pushing rendering resolution up with brute-force performance is getting harder, techniques like DLSS (and whatever AMD's equivalent turns out to be) are the future, I firmly believe. Hopefully there will eventually be hardware-vendor-agnostic open solutions, and not just DLSS under Nvidia's stranglehold.
I wouldn’t call it a stranglehold. Nvidia has poured large resources into CUDA and is reaping the benefits. It’s good, and there isn’t anything quite like it out there that does what it does better or more easily.
 
AMD drivers aren't bad. This is an old trope; they haven't been bad since the '90s.

Yep, not any worse than Nvidia's, which always have issues as well; they're listed right in the release notes, and there are discussions about them all over. Nvidia doesn't ship hotfixes every other day for nothin'.
 
AMD drivers aren't bad. This is an old trope; they haven't been bad since the '90s.
Eh...from my experience they haven’t been bad, but they have been “less polished” than Nvidia’s, even in the 2000s and early 2010s. I’ve had an ATI 9800 and HD 5870, and an AMD HD 7970 and an R9 390. I definitely had more bugs with them than with my Nvidia cards, but nothing unmanageable. I haven’t had an AMD card since 2016, so I don’t know their quality over the past 4 years.
 
Don't want to derail this thread, but AMD drivers are nice when they work. Lots of options and you can also adjust settings in real-time while playing games.

They've been decent for years, there were some problems recently with black screens and such, but AFAIK those have been resolved. I wouldn't let that hold you back from getting an AMD card.
 
Not at all, not even close. (video by forum member chameleoneel)



Way too many people are buying into the "DLSS is magic" BS.

What a load of crap. I'm pretty sure my eyeballs work just fine, and if I run Death Stranding at native 1440p it looks like s*** compared to running DLSS in quality mode. Native resolution has way more aliasing and crawling than DLSS quality mode.
 
Definitely. It was kind of weird that AMD didn’t talk about existing raytracing titles like Control or Cyberpunk. It would be terrible if they don’t run on the 6000 series.

AMD didn’t talk raytracing performance at all.
No, they did say that all their cards support raytracing in all existing and upcoming games, and the numbers they showed were supposed to be at max settings.
 
What a load of crap. I'm pretty sure my eyeballs work just fine, and if I run Death Stranding at native 1440p it looks like s*** compared to running DLSS in quality mode. Native resolution has way more aliasing and crawling than DLSS quality mode.

I can attest personally: DLSS does not magically increase image quality in any title to date. It does give you extra FPS. Glad you can't tell the difference, but I have noticed it in every title.

The idea that you somehow get something for nothing is a marketing position, not reality. It's an upscaler and has ALL the issues of upscaling, just reduced in degree by the algorithm.

In my experience, the video is 100% accurate with regard to Death Stranding.
 
I find it curious that raytracing performance was not talked about at all. It must not be a strong point of this generation, which is fine, as wide-scale adoption isn't there yet. Even if AMD's raytracing is 80% as good as Nvidia's RTX, that's a good start.
 
What a load of crap. I'm pretty sure my eyeballs work just fine, and if I run Death Stranding at native 1440p it looks like s*** compared to running DLSS in quality mode. Native resolution has way more aliasing and crawling than DLSS quality mode.
The anti-aliasing from DLSS in Death Stranding is not in question. I mention in the video that it's really good, with basically no aliasing shimmer even during high motion.

No crap here. That footage was captured with Nvidia's own Shadowplay feature, at 100,000 kbps. And even after YouTube's compression, the visual shortcomings of DLSS in Death Stranding are very easy to see.
 
I find it curious that raytracing performance was not talked about at all. It must not be a strong point of this generation, which is fine, as wide-scale adoption isn't there yet. Even if AMD's raytracing is 80% as good as Nvidia's RTX, that's a good start.
Each architecture does raytracing differently. You are going to need games to be optimized. It's not a matter of just flicking on a switch and getting good performance out of it.
 
I can attest personally: DLSS does not magically increase image quality in any title to date. It does give you extra FPS. Glad you can't tell the difference, but I have noticed it in every title.

The idea that you somehow get something for nothing is a marketing position, not reality. It's an upscaler and has ALL the issues of upscaling, just reduced in degree by the algorithm.

In my experience, the video is 100% accurate with regard to Death Stranding.
Let me talk facts instead of opinions. It is 100% fact that DLSS 2.0 in quality mode looks better overall than native resolution while also running better.
 
Not sure if this is the place to discuss this, but I'm hearing a lot of negative feedback about AMD's drivers. Are they that bad?
They are not. Most of AMD's supposed driver issues have been caused by other things, the faulty RAM chips being one. They also use a simpler power-delivery system, so PSUs operating outside of spec can cause a real headache for them, where an Nvidia card will still operate within reason. There have been a few notable driver problems, but most of them were Windows-related and from years back, and they were addressed in a moderately reasonable timeframe.
 
Zero mention of any improvements to AMD's encoders, and zero showing of any DXR-equipped game released this year or last, speaks volumes.
The RTX titles used DXR 1.0; the DXR 1.1 spec is the one rolled into DX12 Ultimate, and it maintains full backward compatibility. Nvidia didn't deviate from the spec, so unless the developers did some funky implementations, the AMD cards should work out of the gate or with a minor patch.
 
Each architecture does raytracing differently. You are going to need games to be optimized. It's not a matter of just flicking on a switch and getting good performance out of it.
Wrong. They already said it's all based on DX12U DXR, even Nvidia's RTX, and they said the cards will do raytracing in all existing and future titles.
 