Join us on November 3rd as we unveil AMD RDNA™ 3 to the world!

The issue they will have is simply zero experience with such designs.

That's what I keep coming back to. AMD threaded the needle and found technology to make -- let's be honest -- outdated, older nodes perform damn near as well as the cutting edge, all while using less silicon* (*edit: silicone is boobs) and less power. Sure, at the added cost of construction. But that's something Intel and Nvidia will have to learn to do, too. Mixing and matching is the way forward.

And they have years of experience implementing it that Intel and Nvidia just can't even.

It's AMD 3D chess at this point. To further mix metaphors, they've turned a sow's ear into a silk semiconductor.
 
That's what I keep coming back to. AMD threaded the needle and found technology to make -- let's be honest -- outdated, older nodes perform damn near as well as the cutting edge, all while using less silicon and less power. Sure, at the added cost of construction. But that's something Intel and Nvidia will have to learn to do, too. Mixing and matching is the way forward.

And they have years of experience implementing it that Intel and Nvidia just can't even.

It's AMD 3D chess at this point. To further mix metaphors, they've turned a sow's ear into a silk semiconductor.
Next few gens of CPUs and GPUs are going to be fun. Both Intel and Nvidia will be going that way as well. AMD will either reap the benefit of a lead... or perhaps lose it completely. I hope they make hay while they can over the next 1-3 years in both CPU and GPU.
There is also always the possibility that NV and/or Intel fall on their face and have a generation that stinks. Nvidia sort of had that back with Volta... they added in tensors, but the raster uplift wasn't there; it was so poor they didn't even bother releasing a consumer version. AMD allowed that, of course, by not really competing. I'm not sure Nvidia could get away with that again if their next gen doesn't best Lovelace. Same goes for Intel... they are obviously competing very well with AMD now, and they don't want another generation of stagnation. AMD beat them up just short of into the ground, but despite 2-3 solid AMD-win generations, it only really takes one good Intel generation for people to start saying "I knew Intel would bounce back." lol
 
There is also always the possibility that NV and/or Intel fall on their face and have a generation that stinks.

Either this, or they will reverse-engineer the tech, dump money into development, and bury AMD, patents with or without standing. TSMC is a mercenary company and will serve its bidders. Especially right now, political-climate-wise (not going into this -- if anyone reading this wants to talk politics, sub to GenMay, it's tits).

AMD is the corporate equivalent of Colin McRae. When in doubt, flat out. I think they're flat out on chiplets, whether that's a GCD plus MCDs, along with stacking, as they've done this generation, or flat out on all chiplet options, which I believe they are, based on the dual-GCD stuff they either teased or "teased."

Jesus, I bet even the most die-hard Nvidia fanbois are pissed at, and proud of, AMD this round, because they want to see AMD beaten even harder. Instead, they got cards that are expensive, suck too much power, and sometimes catch fire. That's not the fanboi comfort zone.
 
DLSS 3.0 looks like trash with its interpolation. FSR's not bad; it basically does the same thing. And they announced FSR 3.0 will be dropping in 2023, which they're saying can give up to 2x FPS at 4K, and then they've got this HYPR-RX thing coming too.

Just Nvidia thinking $1600 is a great price to charge for graphics cards is enough to "woo" me to AMD.
Yeah, but their FSR 3 gets that 2x FPS increase thanks to their new "Fluid Motion Frames," aka their AI frame generator, just like Nvidia's Optical Flow. I can't imagine AMD's AI-generated frames will look any better, let alone the latency issues it's bound to have as well.

The 7900 XT and the card formerly known as the 4080 12 GB are the cards that interest me, but I'm still eagerly awaiting reviews.
 
I'm out of the loop. How bad is the ray tracing on RDNA 3? Can it at least match a 3090? And what about some of the more RT-heavy games like Control or Cyberpunk 2077?
 
Yeah, but their FSR 3 gets that 2x FPS increase thanks to their new "Fluid Motion Frames," aka their AI frame generator, just like Nvidia's Optical Flow. I can't imagine AMD's AI-generated frames will look any better, let alone the latency issues it's bound to have as well.

The 7900 XT and the card formerly known as the 4080 12 GB are the cards that interest me, but I'm still eagerly awaiting reviews.
Is AMD only doing a version of DLSS 3.0, or are they going to do the same style as DLSS 2.0 also?
 
I'm out of the loop. How bad is the ray tracing on RDNA 3? Can it at least match a 3090? And what about some of the more RT-heavy games like Control or Cyberpunk 2077?
Honestly, if you are at 1440p then the 3090 is more than good enough at ray tracing, but for all their talk of 4K and 8K, I don't see how these are going to stack up there. But 4K is maybe a 3% slice of the market, and that is handily going to Nvidia with the 4090 for now, so AMD can talk a big game there. And I think their "8K" is what most others I know would call 6K, which these will also do fine.

Honestly, I love ray tracing; it is the new thing, but it's still not totally here yet and won't be until the next major console refresh. Until the consoles can do ray tracing, the industry must be raster first, ray-traced second.

The 7900 cards are what AMD needs: they are "cheap" to both produce and sell, they force Nvidia to respond, and they generate a lot of positive stuff for AMD on both the consumer and investor side of things. Assuming AMD can get the chips out in good volumes, they utterly ruin Nvidia's ability to sell the 3000-series overstock, and that's a big deal for AMD.
 
When it comes to Nvidia, it's how it's usually been: if you want the best, you have to pay for the best. It's the reason I got a 3090 on release day. Same goes for the 4090 RTX. It is the fastest card and will still be once the 7900 XTX is released, imo.
If you have the $$ for the best, you pay for the best. The rest of the world does some sort of performance/price equation that makes sense in reality for them (wife/kids/bills, etc... don't want to bore ya with reality for the 99%, ya).
[H] used to be THE place for the most from the least. Not the most from the most with effort ZERO!
Enjoy your card, bruh
 
I'm out of the loop. How bad is the ray tracing on RDNA 3? Can it at least match a 3090? And what about some of the more RT-heavy games like Control or Cyberpunk 2077?
Depends what we mean by that, but if you mean whether a card as strong as the 7900 XTX should beat a 3090 with RT enabled in some games: yes, though it looks like a possible mixed result in the more RT-heavy titles:

Ratio over the 6950 XT at 4K with RT on, according to AMD (https://www.dsogaming.com/news/amd-...on-rx-7900xtx-prices-dates-gaming-benchmarks/):
Resident Evil: 1.5x
Metro Exodus: 1.5x
Doom Eternal: 1.6x

According to:
https://www.techpowerup.com/review/amd-radeon-rx-6950-xt-reference-design/32.html

the 3090 was over the 6950 XT by:
Resident Evil: 1.06x
Metro Exodus: 1.44x
Doom Eternal: 1.41x

So maybe not by much, if at all, in something like Control or Cyberpunk, where the 3090 was 60-65% higher than a 6950 XT.
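To make that arithmetic explicit, here's a quick back-of-the-envelope sketch (Python; the ratios are AMD's claims and TechPowerUp's measurements quoted above, so treat the output as a rough estimate, not a benchmark):

```python
# Chain the quoted ratios through their common 6950 XT baseline.
# AMD's claimed 4K RT uplift of the 7900 XTX over the 6950 XT:
xtx_over_6950 = {"Resident Evil": 1.5, "Metro Exodus": 1.5, "Doom Eternal": 1.6}
# TechPowerUp's measured 3090 lead over the 6950 XT in the same titles:
gpu3090_over_6950 = {"Resident Evil": 1.06, "Metro Exodus": 1.44, "Doom Eternal": 1.41}

for game in xtx_over_6950:
    # Implied 7900 XTX vs 3090 = (XTX / 6950 XT) / (3090 / 6950 XT)
    implied = xtx_over_6950[game] / gpu3090_over_6950[game]
    print(f"{game}: 7900 XTX ~{implied:.2f}x a 3090 with RT on")

# In a title where the 3090 leads the 6950 XT by 60-65% (Control, Cyberpunk),
# a 1.5x uplift would leave the XTX slightly behind:
for lead in (1.60, 1.65):
    print(f"3090 at +{lead - 1:.0%}: XTX ~{1.5 / lead:.2f}x a 3090")
```

Which matches the "mixed result" read: comfortably ahead of a 3090 in the lighter RT titles, roughly even to slightly behind in the heaviest ones.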
 
Honestly, if you are at 1440p then the 3090 is more than good enough at ray tracing, but for all their talk of 4K and 8K, I don't see how these are going to stack up there. But 4K is maybe a 3% slice of the market, and that is handily going to Nvidia with the 4090 for now, so AMD can talk a big game there. And I think their "8K" is what most others I know would call 6K, which these will also do fine.

Honestly, I love ray tracing; it is the new thing, but it's still not totally here yet and won't be until the next major console refresh. Until the consoles can do ray tracing, the industry must be raster first, ray-traced second.

The 7900 cards are what AMD needs: they are "cheap" to both produce and sell, they force Nvidia to respond, and they generate a lot of positive stuff for AMD on both the consumer and investor side of things. Assuming AMD can get the chips out in good volumes, they utterly ruin Nvidia's ability to sell the 3000-series overstock, and that's a big deal for AMD.
I'll likely be getting the 7900 XTX either way, but I'm on a 4K OLED (LG CX) for my PC display, and after this I'm never going back to non-emissive displays. But I am going to continue to need more performance to drive this screen. It looks like the 4090 would be the better card for my display, but I'm not going that route.
 
No one with any sense. Not until we see reviews.
With how hard it seems to be to get these cards, and with how easy return policies tend to be (or how easy it is to resell them used), I'm not sure it's at all a bad strategy to pre-order one before reviews, if that's an option for someone.
 
With how hard it seems to be to get these cards, and with how easy return policies tend to be (or how easy it is to resell them used), I'm not sure it's at all a bad strategy to pre-order one before reviews, if that's an option for someone.
You can always play the return song and dance, but it's generally not worth it. At the end of the day, whether you're looking to flip the card or use it, when you pre-order you're just gambling, hoping it's worth your time and cash. Because ultimately you do not know what it is you're even buying.
I'd rather play politics and not reward companies for inferior products. I'm on team: preorder nothing, ever, for any reason, all companies are here to serve me and not the other way around, and they all could just as soon die in a fire - forever.
If no one ever preordered, companies would have to deliver the goods all the time. And we'd have a lot less bullshit. I can't control what anyone else does, but I'm out, and naturally I think wisdom is on my side. Agree or not, that's up to you.
 
So this means the 7900 XT would've been around 50% faster than the 4080 12 GB for the same price.
 
So this means the 7900 XT would've been around 50% faster than the 4080 12 GB for the same price.
I still don't see how they are going to keep the 4080 16GB at $1199 compared to the 7900 XTX. And I have no doubt, with how shitty Nvidia is with slower CPUs, that the 7900 XTX will match or beat the 4090 with some of the CPUs people are actually using. I think at 1440p the 7900 XTX is going to be faster than the 4090 in many if not most games, even with a top-end CPU. On TechPowerUp, the 4090 is only 30% faster than the 6950 XT at 1440p.

Everything changes with ray tracing, though, as AMD is WAY behind. Nvidia should be over 50% faster in most cases and even 75% faster in some games.
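As a hedged sanity check on that 1440p claim (the 1.5-1.7x uplift figures are AMD's own 4K marketing numbers, and 1440p scaling will be worse, so treat this as an optimistic upper bound):

```python
# Hypothetical 1440p comparison via the shared 6950 XT baseline.
# TechPowerUp: the 4090 is only ~1.30x a 6950 XT at 1440p (CPU limits cap its lead).
r4090_1440p = 1.30

# AMD's claimed raster uplift for the 7900 XTX is roughly 1.5-1.7x at 4K;
# assume, optimistically, that some of it survives at 1440p.
for xtx_uplift in (1.3, 1.5, 1.7):
    print(f"If the XTX is {xtx_uplift:.1f}x a 6950 XT at 1440p, "
          f"it lands at ~{xtx_uplift / r4090_1440p:.2f}x a 4090")
```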
 
I still don't see how they are going to keep the 4080 16GB at $1199 compared to the 7900 XTX. And I have no doubt, with how shitty Nvidia is with slower CPUs, that the 7900 XTX will match or beat the 4090 with some of the CPUs people are actually using. I think at 1440p the 7900 XTX is going to be faster than the 4090 in many if not most games, even with a top-end CPU. On TechPowerUp, the 4090 is only 30% faster than the 6950 XT.

Everything changes with ray tracing, though, as AMD is WAY behind.
At 1440p, either the 7900 XTX or the 4090 will leave you CPU-bottlenecked with even the 13900K. So, no difference between them at 1440p. They are 4K cards through and through.
 
At 1440p, either the 7900 XTX or the 4090 will leave you CPU-bottlenecked with even the 13900K. So, no difference between them at 1440p. They are 4K cards through and through.
Not anywhere near as much on AMD, if you actually look at how AMD has been scaling at lower resolutions compared to Nvidia. The last two generations of top-end Nvidia cards do poorly at 1440p relative to AMD because Nvidia's piss-poor scheduling leans on the CPU. There are cases where an AMD card that is not even half as fast as an Nvidia card at 4K can match or beat it at lower resolutions without a top-end CPU. Of course, it can vary wildly depending on the game.
 
If you have the $$ for the best, you pay for the best. The rest of the world does some sort of performance/price equation that makes sense in reality for them (wife/kids/bills, etc... don't want to bore ya with reality for the 99%, ya).
[H] used to be THE place for the most from the least. Not the most from the most with effort ZERO!
Enjoy your card, bruh
I like how you're telling Brackle how the forums "used to be" and then calling him "bruh".

Would read again. 10/10.
 
Depending on how reviews pan out, I might replace my 6700 XT, which I bought at MSRP since I couldn't get a better card at non-scalper prices.
 
Maybe a dumb question, but what's the HDMI audio-out like on AMD cards these days? Do they support 7.1, Atmos, DTS:X, etc.? What about instant-on audio? My last experience with an AMD card was 5 years ago, and all of those things were issues. It didn't support all of the standard formats, and there was a roughly 1-2 second delay with all new audio sources.
 
That moment when you find out that Ubisoft is FINALLY going to release a new Splinter Cell game... via an AMD event...
 
Maybe a dumb question, but what's the HDMI audio-out like on AMD cards these days? Do they support 7.1, Atmos, DTS:X, etc.? What about instant-on audio? My last experience with an AMD card was 5 years ago, and all of those things were issues. It didn't support all of the standard formats, and there was a roughly 1-2 second delay with all new audio sources.
I'm happily using audio via HDMI on my Samsung QN90B from my RX 6800 XT. I also occasionally plug in a generic "ECCO" brand TV to watch streaming via PC in bed.

HTH.
 
In a universe just a few months different, these seem like they would have been one hell of a set of mining cards.
 
*deer in headlights look*
I don't even know what you mean. It games great and streams movies great. Audio great.

It also Words and Excels and Visual Studios great. ;-)
There was an older installer-package issue where it wouldn't install things if they weren't detected at the time of installation. So if you didn't have your equipment all plugged in when you installed the drivers, it would leave things out, and you would have to do a full uninstall and reinstall of the driver package on site once things were all hooked up.
It made batching deployments a serious PITA. AMD has since corrected the issue, but it went on for a full year or so before they got around to fixing it.
 
Rumored AIB models with higher voltage options should be about 15-20 percent faster:

The architecture was designed to run at 3 GHz.
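For rough scale, and assuming performance scales near-linearly with clock (it won't, fully): if the reference 7900 XTX boosts around 2.5 GHz per AMD's spec sheet and the silicon was really designed for 3 GHz, the implied headroom lines right up with that 15-20 percent figure.

```python
# Back-of-the-envelope clock headroom; assumes ~linear scaling with clock speed.
reference_boost_ghz = 2.5  # 7900 XTX reference boost clock per AMD's spec sheet
design_target_ghz = 3.0    # the claimed architectural design target

headroom = design_target_ghz / reference_boost_ghz - 1
print(f"Implied headroom: ~{headroom:.0%}")  # ~20%, in line with the 15-20% rumor
```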

Well, AMD has to leave some room for the PowerColors and the rest of their AIBs, with their Spooge Red Devil edition cards.
 
Not my part of the playground, so I'll take your word for it.
Yeah, it was an annoyance at worst, but once we knew it was a thing, we just made sure to have dummy equipment in place so the installer packages could "detect" the components and would install drivers accordingly. AMD has been making good progress on their installers in general for the past few years, but for a solid 18 months or so they weren't great. It was a 2018-2020 issue.
 