> Probably will keep the price as it is, as there are folks out there that still need 24GB of RAM, and it probably offers better ray tracing.

Oof. The 6900XT trading blows with the 3090 for $500 less. It's going to be interesting to see how Nvidia reacts.
Why are we talking about proprietary DLSS? A resolution scaler that can take a 1440p image and make it 4K to increase frames? Why would we make such a comparison with an AMD card rendering at native resolution? And where does it say DLSS is enabled in those limited titles? AMD would definitely show that they are beating a 3080 with DLSS enabled if that were the case. They wouldn't waste an opportunity like that.

Also, how does it make sense that AMD would beat Nvidia in games where DLSS is supported, but only trade blows in games without DLSS support?
You are spreading completely made-up, unsupported FUD.
Well, there are tens of thousands of PC games that do not support DLSS. The question should maybe be how many future titles will, and whether other similar technology will be brought out. The industry is not standing still on one proprietary technology: TAA is being improved, and AMD has non-proprietary tech that enhances IQ and works with Nvidia hardware as well; how nice of them. I wonder if they expanded Radeon Boost game support, which would let you run a game at a much higher resolution and better quality, yet still get great frame rates when in action. As for RT, we'll probably only really know the performance in newer titles that use DXR 1.1 or higher and are optimized for AMD hardware. Current RT games are, for the most part, optimized for Nvidia RTX.
Now, for those who care less about 4K resolution and want higher frame rates on 240Hz+ monitors, it looks like AMD extends its lead over Nvidia even further. That will probably be talked about a lot.
> Why are we talking about proprietary DLSS? A resolution scaler that can take a 1440p image and make it 4K to increase frames? Why would we make such a comparison with an AMD card rendering at native resolution?

It was being discussed because tangoseal made up unsupported claims that AMD beats Ampere cards in games even when DLSS is enabled. That's the only reason it was brought up.
Ok, Nvidia doesn't have that AMD 5000 boost which beats the 3080 in everything. Should we now ask Nvidia to release their own CPUs to do the same thing?
> Right now, if you ask me, DLSS and RT are useless features. I say in 2-3 generations they will be worth it.
>
> But as of right now, in my eyes they are useless. Just like the Nvidia latency BS they have, all marketing bullpoop.

I'm interested in ray tracing for Cyberpunk 2077 and the Witcher 3 patch that will add it, but aside from that I think we're still a generation away, maybe two, before it's ubiquitous. At the same price I'd probably opt for a better RT card, but the 6900XT is $500 cheaper than the 3090 and looks to offer better performance than the 3080, so it's not so easy to compare.
> Correct. There's not enough games, performance, and overall visual impact for it to matter. Once we can get it to run beyond 120Hz, it will catch on big time.

My best guess is we'll see RT catch on in SP games first, where 60fps is just fine and people can appreciate the visual goodies. Multiplayer games will take longer, especially the competitive ones, as that's where 120Hz+ really shines.
The 6900XT had both "Rage Mode" and "SAM" (requires a 5000-series CPU) enabled, so it will likely be slightly slower than a 3090 without those on, and require some OCing to catch up.
> At 5 to 700 bones less, too! You can keep the 3090, Jensen. Drop the price and then you have people's attention again. Productivity, though, you can't shake that sweet 24GiB of VRAM. That's a good amount for doing real work.

I agree that for development work the 3090 still has a place. That amount of memory will matter. But for gaming? At that price? Naw.
> It was being discussed because tangoseal made up unsupported claims that AMD beats Ampere cards in games even when DLSS is enabled. That's the only reason it was brought up.

Ok, I see now.
> The 6900XT had both "Rage Mode" and "SAM" (requires a 5000-series CPU) enabled, so it will likely be slightly slower than a 3090 without those on, and require some OCing to catch up.

That was at 4K; at 1440p, ultrawide resolutions, and so on it will be even faster. We still don't know how much further past Rage settings these cards will go. In any case, it is beating more games than not from this AMD-picked selection. We just need reviews to find out how it performs in RT, as well as how much headroom there is. Since RT performance numbers were not presented, we may have to wait for reviews if that is important to one's buying decision.
> That was at 4K; at 1440p, ultrawide resolutions, and so on it will be even faster. We still don't know how much further past Rage settings these cards will go. In any case, it is beating more games than not from this AMD-picked selection. We just need reviews to find out how it performs in RT, as well as how much headroom there is. Since RT performance numbers were not presented, we may have to wait for reviews if that is important to one's buying decision.

It is always best practice to wait for reviews. That being said, I am just glad no one has to wait long for a 3080/3090 competitor, unlike the previous release.
> I agree that for development work the 3090 still has a place. That amount of memory will matter. But for gaming? At that price? Naw.

Nvidia could extend support to the RTX 3090 with Titan-class drivers, which would open up features for professional applications. I expect AMD to slowly dish out more card info as time goes on, as well as to see how Nvidia responds. Like a Radeon RX 6800 8GB version, 18% faster than a 2080 Ti for $499, if Nvidia reduces the price of the RTX 3070.
> So I am strongly considering a 6900XT, but I need to know the following before making the jump:
> 1) How does it perform without Rage and Smart Access Memory enabled?
> 2) What is the RT performance like?
> AMD was clever making an exclusive feature for Ryzen 5000 CPU/X570 owners. I wonder what its overall performance gains are, as that is enticing for me as someone who will soon purchase a 5900X.

Didn't they reveal that in the presentation? Rage offers 3-9% or something like that. A little more than the Nvidia cards seem to be able to overclock.
> Didn't they reveal that in the presentation? Rage offers 3-9% or something like that. A little more than the Nvidia cards seem to be able to overclock.

I believe that was Rage + SAM.
> AMD was clever making an exclusive feature for Ryzen 5000 CPU/X570 owners. I wonder what its overall performance gains are, as that is enticing for me as someone who will soon purchase a 5900X.

It definitely adds an interesting twist. It means I'll be more inclined to upgrade my CPU if I go the AMD route. I think I'm still leaning 3080, but if the reviews for the 6800 XT and/or 6900 XT are compelling, I could be swayed.
> It's odd they didn't show ANY ray tracing performance numbers... I wonder if this is a situation like back with the 5000 series, where with first-gen tessellation hardware AMD couldn't compete. I still want the 6800 XT, but it would have been nice to see some RT performance numbers.

Rumor had it that RT performance was around the 2080 Ti level (not sure which card, possibly the 3800XT), but waiting for reviews appears to be the only way to find out.
> Rumor had it that RT performance was around the 2080 Ti level (not sure which card, possibly the 3800XT), but waiting for reviews appears to be the only way to find out.

Maybe it was the implementation, but the RT game demos they showed didn't look very impressive. I think it will be interesting to compare Nvidia and AMD on both RT performance and quality.
You just know that if RT performance were at the same or near the level, relative to the 3080/3090, that rasterisation is, they'd have included some sort of numbers in the launch. There's no way they wouldn't have.
> Ok, I see now.

Don't blame me; if I didn't mention it, 50 other people would have.
> Maybe it was the implementation, but the RT game demos they showed didn't look very impressive. I think it will be interesting to compare Nvidia and AMD on both RT performance and quality.

You are not going to be able to discern fidelity on a YouTube stream.
I am simply blown away. The performance they showed is a breath of fresh air. I have had ATi/AMD cards for a long time; only this last year did I switch to nVidia, after my aging RX 480 started pushing daisies and the new AMD offerings were not worth the price. I currently have an nVidia RTX 2070 Super and it runs peachy-keen. AMD is now showing more competitive cards, and it brings back the nostalgia of past ATi/AMD releases, before nVidia started to wipe the floor with them for a while there. Before my 2070 Super, my last nVidia card was the GeForce MX 200.

Looks like I may go back to AMD. According to what I have seen/read, it needs a 5000-series CPU for best performance? Is this Dragon all over again, but with real performance uplifts? I might pick up a 6800 XT or even save up for the 6900 XT.

Another thing to note is the power usage required to achieve that performance. This is a big one for me. I am still using my faithful OCZ Fatal1ty 750 Watt semi-modular PSU, currently powering its 4th PC. I had seen the power usage of the 3080/3090 and my power socket shit its pants. AMD's cards show much better power usage, which means there could be headroom for OC. I may get a new PSU just so I can run a 6800 XT/6900 XT and my 2070 Super in tandem for mixed-GPU rendering in Ashes.
> Super Resolution is their answer to DLSS, and they do it via shaders being loaded dynamically, as I understand from what data is out there. (Not AI-based.)

It was actually called "Best API" if my memory serves me well (and it often doesn't). I gathered that they were talking about DX12 vs Vulkan.
I am also wondering whether "Full API" means that the benchmarks actually did include ray tracing on, for both nVidia and AMD.
Imagine that.
> It was actually called "Best API" if my memory serves me well (and it often doesn't). I gathered that they were talking about DX12 vs Vulkan.

Oh shoot, I totally misinterpreted in both regards then. I appreciate it.
> Isn't a 6800XT 300 watts and a 3080 320 watts?

Yep, and the 6900XT is also 300 watts. The 3090 spikes to 450 watts. Huge difference! Of course, we'll have to wait and see what the 6900XT spikes to with Rage enabled, but the PPW for AMD looks way better than nVidia's this gen.
> You are not going to be able to discern fidelity on a YouTube stream.

Maybe. Maybe not. We always have screenshots and reviewer impressions. If it's close enough in quality that you can't discern the difference on a YouTube video, that's a big win for AMD IMO.
> So I am strongly considering a 6900XT, but I need to know the following before making the jump:
> 1) How does it perform without Rage and Smart Access Memory enabled?
> 2) What is the RT performance like?
> AMD was clever making an exclusive feature for Ryzen 5000 CPU/X570 owners. I wonder what its overall performance gains are, as that is enticing for me as someone who will soon purchase a 5900X.

The way it was said, I felt like the B550 would also have it; from memory it was the 500-series motherboards. I could be wrong there, but I would imagine it's included, otherwise that would make the B550 a really bad product.