Join us on November 3rd as we unveil AMD RDNA™ 3 to the world!

There was an older installer-package issue where components wouldn't be installed if the corresponding hardware wasn't detected at install time. If your equipment wasn't all plugged in when you installed the drivers, pieces would be left out, and you'd have to do a full uninstall and reinstall of the driver package on site once everything was hooked up.
It made batching deployments a serious PITA. AMD has since corrected the issue, but it went on for a full year or so before they got around to fixing it.

Good to know that it's fixed. You don't happen to know about bitstreaming output, do you? Any issues with Atmos/DTS:X or delays with audio streams? As dumb as it sounds, the GPU audio component is a deal breaker for me.
 
Well, AMD has to leave some room for the PowerColors and the rest of their AIBs, Spooge Red Devil edition cards.
The problem is the price potential. The question is which AIB cards, if any, are worth the increased prices: those meeting or exceeding FE/reference designs in QA/build quality, overclocking headroom, VRMs, and features. At least in the last generation, on the NV side it was pretty much only a few EVGA and Asus models that were even potentially worth it, and for AMD it was Sapphire, PowerColor Red Devil, and Asus.

When I picked up my Asus ROG Strix 3090, I only did so because it was before the big shortage/tariff/pandemic/mining combo, so I paid around $1650, which for a $1500 card seemed worth it for the top-of-the-line AIB (and I've had good experiences with ROG Strix/Matrix cards in the past). It has worked flawlessly ever since, and its additional features, like dual BIOS, good cooling, and smooth overclocking, have been worthwhile, I think. However, it wasn't long before the MSRP was $2200-2400 (depending on the standard black/gunmetal or white color!) and scalpers charged even more, so there was no way in hell it was worth that over a $1500 FE.

On the AMD side, they seemed to have even less control over insane prices among AIBs. Putting aside the fact that the 6800 XT and 6900 XT did NOT get an air-cooled Asus ROG Strix version, I saw a lot of highly priced AIB cards even during the era when AMD cards were less desirable for mining. Add to that the horrid, easily botted AMD.com direct-sales platform as the only way to buy reference cards, and lots of people who would have paid for a $650 6800 XT never got the chance. I don't want to see the same thing happen again.

While a lot of things are different now, I hope AMD has better control over both AIB pricing for partners and reference availability, through their own website and any third parties. Especially given the delay of more than a month before these cards show up for sale (which I consider inadvisable, since NV already has its top-end cards available, lower-end ones will debut soon, and NV will adjust prices and ramp up its insane marketing machine), AMD risks underwhelming again when they really could have done much better on the product alone.

While RDNA3 looks potentially great, we'll need to see real comparisons vs. the 4080 16GB AND the 4090. Hardcore old-school nerds like us will pick over minutiae, but I have to wonder how many others will. There's a lot I like about AMD's GPUs and policies: their friendliness to open source and open specs is something I'd like to support. Unfortunately, this is buffered by a bunch of hard-to-suss-out limits and qualifications.

It's relatively rare that AMD even competes with Nvidia on the top-end cards. Last generation is a noteworthy outlier, and should have been a circumstance where AMD was far, far ahead in many tiers. However, a combination of factors, from availability, to mining, to Nvidia's ability to seemingly change the conversation to "Nothing matters but the stuff we're good at: ray tracing and DLSS!", undermined a time when AMD had cards that not only competed but often won across the spectrum for regular rasterized gameplay AND price/performance (at standard prices).

I want to see more info on RDNA3 and want it to thrive, along with next-gen FSR, alternatives to CUDA, and other open tech... but when it appears they're not really interested in going up against the NV 4090, that's going to cost them in the very online, video-heavy review space. Now, it's possible that "we can do ALMOST as well as the 4090, but for $600 less" will be enough to get a lot of people on board. A price/performance gap that large, wide enough across content, will encourage a lot of potential buyers, even if NV will again dance around with its "we're the top of the line" winnings.

The question is how much things like ray tracing and other factors will be in play, lest NV somehow manage to focus everything on a very narrow sub-group of its performance (which, frankly, at least last generation was of questionable benefit, only really viable for the highest-tier cards and near useless for lesser ones; despite that, NV sold a lot of 3060 and 3070 cards when AMD's 6600/6700 series should have been thriving). I guess we'll see, but it does seem that the only way AMD gets a break is if they completely crush their bigger-bank-account competitors (be it NV on the GPU side or Intel on the CPU side) while charging a fraction of the price, which doesn't exactly seem fair. Sometimes I wondered why more of the video reviewers last gen didn't say "You're getting taken in by NV marketing on ray tracing; let's be honest for a bit" as much as they should have.

Anyway, until we see useful benchmarks for RDNA3 against both their previous cards and the competitor's current ones, we won't know the full story. However, I do hope that those who are cautiously optimistic prove to be right, though I have a lot of concerns that being right isn't necessarily enough, sadly.
 
I'm a bit concerned by this slide.

[Image: RDNA3 gaming slide.jpg]


Doom Eternal Ray Tracing at 4K max settings.

I just tried that on my 4090, and I don't know where they got 135 FPS from, but the framerate on my 4090 is much MUCH higher than that, even without DLSS. I'd really like to know what their benchmark was for that number.
 
I'm a bit concerned by this slide.

[Image: RDNA3 gaming slide.jpg]

Doom Eternal Ray Tracing at 4K max settings.

I just tried that on my 4090, and I don't know where they got 135 FPS from, but the framerate on my 4090 is much MUCH higher than that, even without DLSS. I'd really like to know what their benchmark was for that number.
For what it's worth, this is what the footnote (1) says:
  1. Testing done by AMD performance labs November 2022 on RX 7900 XTX, on 22.40.00.24 driver, AMD Ryzen 9 7900X processor, 32GB DDR5-6000MT, AM5 motherboard, Win11 Pro with AMD Smart Access Memory enabled. Tested at 4K in the following games: Call of Duty: Modern Warfare, God of War, Red Dead Redemption 2, Assassin’s Creed Valhalla, Resident Evil Village, Doom Eternal. Performance may vary. RX-842
 
For what it's worth, this is what the footnote (1) says:
  1. Testing done by AMD performance labs November 2022 on RX 7900 XTX, on 22.40.00.24 driver, AMD Ryzen 9 7900X processor, 32GB DDR5-6000MT, AM5 motherboard, Win11 Pro with AMD Smart Access Memory enabled. Tested at 4K in the following games: Call of Duty: Modern Warfare, God of War, Red Dead Redemption 2, Assassin’s Creed Valhalla, Resident Evil Village, Doom Eternal. Performance may vary. RX-842
Yep.

Also, just tested God of War at 4K max settings, no DLSS. 120-150 FPS average in the first level on my RTX 4090. This game does not use Ray Tracing.

Unless I'm missing something here, I just don't see the 7900 XTX as the GPU savior people are starting to hype it up to be.
 
Unless I'm missing something here, I just don't see the 7900 XTX as the GPU savior people are starting to hype it up to be.
What savior? It sounds like a great GPU for sub-4K gaming, and it's a lot cheaper than the 4090. That'll be good enough for a lot of people.
 
What savior? It sounds like a great GPU for sub-4K gaming, and it's a lot cheaper than the 4090. That'll be good enough for a lot of people.
There are lots of graphs making the rounds showing AMD being within 5-10% of the 4090 performance wise. My own testing of generalized scenarios using the 4090 shows that this is most likely false, even in purely rasterized games.
 
There are lots of graphs making the rounds showing AMD being within 5-10% of the 4090 performance wise. My own testing of generalized scenarios using the 4090 shows that this is most likely false, even in purely rasterized games.
I'm prepared to believe you about the performance, but I'm not spending $1600 on a GPU, and I game at 1440, so the 7900s seem to make a lot more sense to me. As always, YMMV.
 
There are lots of graphs making the rounds showing AMD being within 5-10% of the 4090 performance wise. My own testing of generalized scenarios using the 4090 shows that this is most likely false, even in purely rasterized games.
FYI, it says "4080" in your signature.
 
I'm prepared to believe you about the performance, but I'm not spending $1600 on a GPU, and I game at 1440, so the 7900s seem to make a lot more sense to me. As always, YMMV.
IMHO, there are pros and cons to each. There's even an opportunity (gulp) for AMD to be perceived as "better" even with much lower fps at 4K... we'll see. To be honest, the high-end 4K gamer (purists, not low-res upscalers) is a small group today, due to current limitations. Some of those limitations are lifted with these new AMD cards, but we need the monitors, etc., to support it. So this could be a tale of December results and a "redo" once more advanced displays become available.

Personally, I think people will likely be happy with either at 1440p, and many will opt to save the money, space, and power this time around, IMHO. But we'll know more after we see them in action.
 
IMHO, there are pros and cons to each. There's even an opportunity (gulp) for AMD to be perceived as "better" even with much lower fps at 4K... we'll see. To be honest, the high-end 4K gamer (purists, not low-res upscalers) is a small group today, due to current limitations. Some of those limitations are lifted with these new AMD cards, but we need the monitors, etc., to support it. So this could be a tale of December results and a "redo" once more advanced displays become available.

Personally, I think people will likely be happy with either at 1440p, and many will opt to save the money, space, and power this time around, IMHO. But we'll know more after we see them in action.
I've seen some speculation that the 7900 clocks will go as high as 3GHz, and that via that method, the card will be great at 4K/8K. We all know how speculation on the Internet goes, of course.
 
I've seen some speculation that the 7900 clocks will go as high as 3GHz, and that via that method, the card will be great at 4K/8K. We all know how speculation on the Internet goes, of course.
Let’s not kid ourselves. 4K maybe, but not 8K. I mean, believing that is not even weed territory; that’s straight-up crack-pipe religion.

:)
 
Good to know that it's fixed. You don't happen to know about bitstreaming output, do you? Any issues with Atmos/DTS:X or delays with audio streams? As dumb as it sounds, the GPU audio component is a deal breaker for me.
To the best of my knowledge, it’s all working as intended. But looking at it, I can see why you’re still concerned. All I can say is that with my equipment it works, but the more I read, the more I’m convinced I may be the exception and not the rule.
 
...While a lot of things are different now, I hope AMD has better control over both AIB pricing for partners and reference availability, through their own website and any third parties. Especially given the delay of more than a month before these cards show up for sale (which I consider inadvisable, since NV already has its top-end cards available, lower-end ones will debut soon, and NV will adjust prices and ramp up its insane marketing machine), AMD risks underwhelming again when they really could have done much better on the product alone...
I've already said this in this thread, but it's pages back: FrgMstr said that nVidia dumped all the 4090s they could upfront. There is very limited allocation for any more in North America for the next two quarters. If you don't have a 4090 by now, you're gonna have a tough time getting one until spring 2023. If AMD can deliver this year and in Q1 2023, and performance exceeds the 4080 16 GB, they will be beating what nVidia has available at the time, most likely at a lower price (for rasterization, at least).

I'm betting that nVidia is adjusting and trying to squeeze every bit of performance they can from the 4080 before release, now that AMD has thrown down the sub-4090 gauntlet. nVidia may have to adjust their launch price. Then again, they may not have to, given their mindshare. There are probably lots of people willing to pay the $200 nVidia tax over the 7900 XTX, and they can justify it by saying its ray tracing is superior (it will be).
 
Unless I'm missing something here, I just don't see the 7900 XTX as the GPU savior people are starting to hype it up to be.

AMD fans and Nvidia fans are both missing the point, here.

AMD was essentially required to put out a card this time of year and so was Nvidia. AMD challenged Nvidia to a game of slam-your-dick-in-a-car-door chicken, and Nvidia did three rails of coke, slammed their dick in the car door, and then hit the power lock. Then AMD said, OMG, you actually slammed your dick in a car door? Fucking outstanding.

Nvidia's like, why aren't you slamming your dick in the door? And AMD says that's retarded, why would anyone slam their dick in a car door?

AMD put out two cards that beat their current line-up, of which there is still quite a lot, and that's it. They have a full stack, and it runs all the way up to, but not over, $1K. And it's all profit at the top for them. This makes Nvidia look desperate and greedy to most people.

They're still fighting off the "value" brand burden, on CPU and GPU fronts. I think that changes when AMD doesn't have to compete with existing AMD and Nvidia new-old stock.
 
AMD fans and Nvidia fans are both missing the point, here.

AMD was essentially required to put out a card this time of year and so was Nvidia. AMD challenged Nvidia to a game of slam-your-dick-in-a-car-door chicken, and Nvidia did three rails of coke, slammed their dick in the car door, and then hit the power lock. Then AMD said, OMG, you actually slammed your dick in a car door? Fucking outstanding.

Nvidia's like, why aren't you slamming your dick in the door? And AMD says that's retarded, why would anyone slam their dick in a car door?

AMD put out two cards that beat their current line-up, of which there is still quite a lot, and that's it. They have a full stack, and it runs all the way up to, but not over, $1K. And it's all profit at the top for them. This makes Nvidia look desperate and greedy to most people.

They're still fighting off the "value" brand burden, on CPU and GPU fronts. I think that changes when AMD doesn't have to compete with existing AMD and Nvidia new-old stock.
nVidia is like Dodge when they released the Viper truck.

 
Confirmed. 7900 XTX is a 4080 competitor.


This is basically what I assumed this entire time. The VERY interesting part is just how much headroom the XTX will have in AIB partners' hands. Or, to get to the point: just how much it will close any and all gaps. Moving from 2.3 to 3 GHz, I assume no more than MAYBE a 15% uplift (more likely 10-12%). But that is a massive swing against any and all 4080s that will inevitably come down the pike, especially at this price point.

Let's assume a 3GHz Sapphire Toxic costs $1200; I don't think there is any 4080 that nVidia can launch that will look favorable without a massive price drop and performance bump.
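The rough arithmetic above can be sketched quickly. This is a back-of-envelope estimate, not a measurement: the scaling fractions below are assumed values, reflecting only that game performance rarely scales 1:1 with core clock, since memory bandwidth and other bottlenecks don't speed up with it.

```python
# Back-of-envelope: how a ~30% clock bump might translate into performance,
# assuming only a fraction of the clock gain is realized in games.
base_ghz, target_ghz = 2.3, 3.0
clock_gain = target_ghz / base_ghz - 1.0   # ~0.30, i.e. ~30% more clock

# Assumed (hypothetical) scaling efficiencies, not measurements:
for efficiency in (0.35, 0.50):
    perf_gain = clock_gain * efficiency
    print(f"{efficiency:.0%} scaling -> ~{perf_gain:.0%} perf uplift")

# With 35-50% scaling, a 2.3 -> 3.0 GHz jump lands in roughly the
# 10-15% range guessed at above.
```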
 
This is basically what I assumed this entire time. The VERY interesting part is just how much headroom the XTX will have in AIB partners' hands. Or, to get to the point: just how much it will close any and all gaps. Moving from 2.3 to 3 GHz, I assume no more than MAYBE a 15% uplift (more likely 10-12%). But that is a massive swing against any and all 4080s that will inevitably come down the pike, especially at this price point.

Let's assume a 3GHz Sapphire Toxic costs $1200; I don't think there is any 4080 that nVidia can launch that will look favorable without a massive price drop and performance bump.
There is ZERO chance of an out-of-the-box 3GHz game clock on the 7900 XTX. Not only would a 30% game-clock increase be the stuff of fantasy, it has already been said that 3GHz looks like the highest the chip can properly handle even for max boost.
 
This is basically what I assumed this entire time. The VERY interesting part is just how much headroom the XTX will have in AIB partners' hands. Or, to get to the point: just how much it will close any and all gaps. Moving from 2.3 to 3 GHz, I assume no more than MAYBE a 15% uplift (more likely 10-12%). But that is a massive swing against any and all 4080s that will inevitably come down the pike, especially at this price point.

Let's assume a 3GHz Sapphire Toxic costs $1200; I don't think there is any 4080 that nVidia can launch that will look favorable without a massive price drop and performance bump.
Wonder how far they’ll throw the power limit out the window with the AIB stuff.
 
[Radeon RX 7900 XTX] is designed to go against 4080 and we don’t have benchmark numbers on 4080. That’s the primary reason why you didn’t see any NVIDIA compares. […] $999 card is not a 4090 competitor, which costs 60% more, this is a 4080 competitor.
— Frank Azor to PCWorld
 
...Moving from 2.3 to 3 GHz...
That's a 1.3x increase. Depending on how hard the chips are pushed to get them to just 2.3GHz, I would expect that to increase power consumption by at least 1.3³ ≈ 2.2x. Not all of the 355W TBP is core power, so it wouldn't end up at 355W × 2.2, but it would still be ridiculously hot.
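A minimal sketch of the cube-law estimate above, assuming dynamic power scales with C·V²·f and the required voltage rises roughly linearly with frequency, so core power goes as f³. The 70% core-power share of TBP is a hypothetical figure for illustration, not an AMD spec:

```python
# Cube-law estimate: P_dynamic ~ C * V^2 * f, and if the voltage needed
# scales roughly with frequency, core power goes as f^3.
base_ghz, target_ghz = 2.3, 3.0
tbp_watts = 355              # RX 7900 XTX total board power (TBP)
core_share = 0.70            # ASSUMED fraction of TBP that is core power

ratio = target_ghz / base_ghz            # ~1.30x clock
power_mult = ratio ** 3                  # ~2.2x core power

core_w = tbp_watts * core_share
rest_w = tbp_watts - core_w              # memory, VRM losses, fans, ...
est_tbp = core_w * power_mult + rest_w

print(f"clock: {ratio:.2f}x  core power: {power_mult:.2f}x")
print(f"estimated TBP at 3.0 GHz: ~{est_tbp:.0f} W")
```

Under these assumptions, the estimate lands in the 650-660 W range, which is why "ridiculously hot" is putting it mildly.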
 
I see you have mentioned this before as well. Sure, you can source ALL the other components for a computer for $600, but it will fall victim to the same thing the 4090 did, and that's CPU bottlenecks. You aren't building a system that can push either card (if we are to believe the 1.7x) for $600.

Edit: I get it. You guys are FROTHING at the mouth for an AMD card and just hoping and praying it's as fast as, or faster than, Nvidia. You want it so bad! Unfortunately, I don't see it happening, and AMD intentionally not showing its hand to reviewers isn't a "smooth move, because who cares about FPS anyway"; it's a silly move that stinks of low performance numbers.

So, what spices would you like on that hat if you end up eating those words? :)
 
AMD fans and Nvidia fans are both missing the point, here.

AMD was essentially required to put out a card this time of year and so was Nvidia. AMD challenged Nvidia to a game of slam-your-dick-in-a-car-door chicken, and Nvidia did three rails of coke, slammed their dick in the car door, and then hit the power lock. Then AMD said, OMG, you actually slammed your dick in a car door? Fucking outstanding.

Nvidia's like, why aren't you slamming your dick in the door? And AMD says that's retarded, why would anyone slam their dick in a car door?

AMD put out two cards that beat their current line-up, of which there is still quite a lot, and that's it. They have a full stack, and it runs all the way up to, but not over, $1K. And it's all profit at the top for them. This makes Nvidia look desperate and greedy to most people.

They're still fighting off the "value" brand burden, on CPU and GPU fronts. I think that changes when AMD doesn't have to compete with existing AMD and Nvidia new-old stock.

This is the most absurd pile of nonsense that I have read in this thread so far.
 
AMD was essentially required to put out a card this time of year and so was Nvidia. AMD challenged Nvidia to a game of slam-your-dick-in-a-car-door chicken, and Nvidia did three rails of coke, slammed their dick in the car door, and then hit the power lock. Then AMD said, OMG, you actually slammed your dick in a car door? Fucking outstanding.

Nvidia's like, why aren't you slamming your dick in the door? And AMD says that's retarded, why would anyone slam their dick in a car door?

AMD put out two cards that beat their current line-up, of which there is still quite a lot, and that's it. They have a full stack, and it runs all the way up to, but not over, $1K. And it's all profit at the top for them. This makes Nvidia look desperate and greedy to most people.
This might just be the best thing I've read so far here, lol!!!
 
This is the most absurd pile of nonsense that I have read in this thread so far.

4090 performs at, what, 95 percent at 70 percent power? They're running it at 140 percent power for a 5 percent boost in performance? They had to make a cooler for it. They decided to use a weird plug for it.

Nvidia got so scared shitless by AMD's rumor machine, they made a card that now has a rep for catching fire.

They slammed their dick in a car door.
 