AMD's Radeon RX 7900-series Highlights

Video cards in a box are new and groundbreaking. Until now I've been getting mine out of the bottom of Lucky Charms and Froot Loops bags.

^^ Hey look, a 10- and a 14-minute video where absolutely jack shit of value was said! Amazing content to consume these days. :p
Got excited this morning and was about to click the videos, but first I read the comments, realized it was an unboxing, and lost all interest. Next week.
 
Maybe a stupid question, but does anyone care about unboxing articles/videos? In this case, it tells you very little about the product that we didn't already know.
Yeah, it gets people excited: OMG, there's a real product that was shipped to a shil... er, reviewer, and all they have to do is follow our song and dance and not release data on the card, and we'll absolutely let people do a build video where it gets put into an actual computer! Well, except Linus, he'll just do a size comparison.

But yeah, I'm not getting that excited. The only two things that get me excited are 1) seeing real-world results of said product and then 2) what it does to push down the prices of older products that I may be more interested in, because I'm not one of these monkeys that thinks $1000 for a graphics card is sane unless I'm making money off it in some way (crypto, a business, or being a streamer/gamer).

p.s. no offense to my [H]monkeys out there, I'm not using the term in a derogatory way :)
 
Maybe a stupid question, but does anyone care about unboxing articles/videos? In this case, it tells you very little about the product that we didn't already know.
If we look at the view count, I imagine yes; a large % are trapped people who simply saw RX 7900 and a video and clicked, but there are a lot of people with side glass who care about how the product looks. High-end consumer computer parts became a bit of art pieces in the 2000s.
 
If we look at the view count, I imagine yes; a large % are trapped people who simply saw RX 7900 and a video and clicked, but there are a lot of people with side glass who care about how the product looks. High-end consumer computer parts became a bit of art pieces in the 2000s.
Wasn't there a trend where people made keychains out of dead CPUs? Shit, now I kind of want to Dremel my dead R 1600x.
 
Wasn't there a trend where people made keychains out of dead CPUs? Shit, now I kind of want to Dremel my dead R 1600x.
I just cleaned up some of my old storage the other day... I pulled an old Athlon Thunderbird 1.4 and a P4 3.0GHz out of some scrap. Keychain... perhaps a wall display with a few other additions. lol
 
While I have Nvidia, I really feel there are so many shills on Twitter. It almost seems like they will even find a way to spin a positive into a negative, and they seem to follow each other in pumping the narrative. Like, I get it, N31 might have come up short on clocks and power target. Maybe they designed it to go above 3GHz but this revision didn't get there. Now they are all hammering it as a hardware bug (I usually consider a bug something that makes you crash, lmao) that it didn't hit its target and that it needs more power to do better in apps that put on more load. I mean, it's a shitty way of saying hey, if there is more load you may not get better boost clocks, and Navi 31 right now takes more power to get to higher clocks (which AMD avoided). It's almost like they are spinning it as somehow bad that it doesn't use more power, lmao.

Sure, I get the point that it may not have hit its full clock target with this revision. I will give them that, but to somehow start spreading "hardware bug" almost makes it sound like the card is bad and you are getting a bad chip, lmao.
 
While I have Nvidia, I really feel there are so many shills on Twitter. It almost seems like they will even find a way to spin a positive into a negative, and they seem to follow each other in pumping the narrative. Like, I get it, N31 might have come up short on clocks and power target. Maybe they designed it to go above 3GHz but this revision didn't get there. Now they are all hammering it as a hardware bug (I usually consider a bug something that makes you crash, lmao) that it didn't hit its target and that it needs more power to do better in apps that put on more load. I mean, it's a shitty way of saying hey, if there is more load you may not get better boost clocks, and Navi 31 right now takes more power to get to higher clocks (which AMD avoided). It's almost like they are spinning it as somehow bad that it doesn't use more power, lmao.

Sure, I get the point that it may not have hit its full clock target with this revision. I will give them that, but to somehow start spreading "hardware bug" almost makes it sound like the card is bad and you are getting a bad chip, lmao.
Idk. A lot of people talk more like Nvidia shareholders than consumers. Gets tiring.
 
They are both more consistent in their performance than I thought they would be; the rumored price for the 4070 Ti coming in at $899 makes sense, I guess.


I need to clarify that it making sense is not good; the prices are not good and I don't like them, but at least there is some consistency to them, which I can tolerate.
 
Stolen from the WCCFTech comments section.
[attached image: 1670606824052.png]
 
For the videocardz direct link:
https://videocardz.com/newz/first-a...0-xt-3dmark-timespy-firestrikes-scores-are-in

If those numbers are "legit", would the 4K Time Spy numbers be a sign of something gone wrong (something bottlenecking the XTX, maybe the chiplet design not scaling up correctly or something)?

7900 XTX: 13729
7900 XT: 13687

One would expect, with how much difference there is spec-wise, a bigger gap, and the XTX to go over a 4080.

RDNA 2 was scaling:
6950 XT: 10644
6900 XT: 9952
6800 XT: 9203

And those tests are ones where we would expect them to look good (however, neither Time Spy nor Fire Strike uses modern graphics technologies like ray tracing): the 6800 XT was beating the 3090 Ti in 3DMark Fire Strike at 1440p, the 6950 XT was ahead of the 3090 Ti in 3DMark Time Spy, and the 6900 XT was ahead of the 3090.

Would not be surprised if those values change over time as drivers mature.
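Just to put rough numbers on that gap, here's a quick back-of-the-envelope calculation using only the leaked Time Spy scores quoted above (nothing new, just the percentages):

```python
# Leaked 3DMark Time Spy scores quoted above (treat them as unverified).
rdna3 = {"7900 XTX": 13729, "7900 XT": 13687}
rdna2 = {"6950 XT": 10644, "6900 XT": 9952, "6800 XT": 9203}

def gap(higher, lower):
    """Percentage lead of the higher-tier card over the lower-tier one."""
    return (higher - lower) / lower * 100

print(f"7900 XTX over 7900 XT: {gap(rdna3['7900 XTX'], rdna3['7900 XT']):.1f}%")  # ~0.3%
print(f"6950 XT over 6900 XT:  {gap(rdna2['6950 XT'], rdna2['6900 XT']):.1f}%")   # ~7.0%
print(f"6900 XT over 6800 XT:  {gap(rdna2['6900 XT'], rdna2['6800 XT']):.1f}%")   # ~8.1%
```

So in that leak the XTX leads the XT by well under 1%, whereas the RDNA 2 tiers were roughly 7-8% apart; that's the scaling gap I'm talking about.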
 
For the videocardz direct link:
https://videocardz.com/newz/first-a...0-xt-3dmark-timespy-firestrikes-scores-are-in

If those numbers are "legit", would the 4K Time Spy numbers be a sign of something gone wrong (something bottlenecking the XTX)?

7900 XTX: 13729
7900 XT: 13687

One would expect, with how much difference there is spec-wise, a bigger gap, and the XTX to go over a 4080.

RDNA 2 was scaling:
6950 XT: 10644
6900 XT: 9952
6800 XT: 9203

And those tests are ones where we would expect them to look good (however, neither Time Spy nor Fire Strike uses modern graphics technologies like ray tracing): the 6800 XT was beating the 3090 Ti in 3DMark Fire Strike at 1440p, the 6950 XT was ahead of the 3090 Ti in 3DMark Time Spy, and the 6900 XT was ahead of the 3090.

Would not be surprised if those values change over time as drivers mature.
I wouldn't be surprised, but I am not counting on it.
 
I wouldn't be surprised, but I am not counting on it.
By bottleneck I do not mean other parts of the computer that was used for the benchmark, but the chiplet substrate/drivers, i.e. something in the card itself not letting it use all its added CUs, RAM bandwidth, and other advantages over the XT.
 
By bottleneck I do not mean other parts of the computer that was used for the benchmark, but the chiplet substrate/drivers, i.e. something in the card itself not letting it use all its added CUs, RAM bandwidth, and other advantages over the XT.
I was referring to the drivers improving performance in any significant way over time as they mature...
 
For the videocardz direct link:
https://videocardz.com/newz/first-a...0-xt-3dmark-timespy-firestrikes-scores-are-in

If those numbers are "legit", would the 4K Time Spy numbers be a sign of something gone wrong (something bottlenecking the XTX)?

7900 XTX: 13729
7900 XT: 13687

One would expect, with how much difference there is spec-wise, a bigger gap, and the XTX to go over a 4080.

RDNA 2 was scaling:
6950 XT: 10644
6900 XT: 9952
6800 XT: 9203

And those tests are ones where we would expect them to look good (however, neither Time Spy nor Fire Strike uses modern graphics technologies like ray tracing): the 6800 XT was beating the 3090 Ti in 3DMark Fire Strike at 1440p, the 6950 XT was ahead of the 3090 Ti in 3DMark Time Spy, and the 6900 XT was ahead of the 3090.

Would not be surprised if those values change over time as drivers mature.
I would wait until we have games. 3DMark usually doesn't mean much for games.
 
Honestly that does look pretty sexy for a big card, lmao. At least it got 3 brackets. I hate when Gigabyte goes with 2 brackets on a big-ass card.
A lot better than the STRIX 4090 and 4080, IMHO, too.
 
Maybe I'm just not hip to what is "in" with graphics cards... but why are the RGB(??) colors on the side of the card that faces the motherboard? Shouldn't that kind of stuff be on the other side? Or is it meant for a GPU extender doohickey?

Some people will mount their card vertically with an adapter.
 
Maybe I'm just not hip to what is "in" with graphics cards... but why are the RGB(??) colors on the side of the card that faces the motherboard? Shouldn't that kind of stuff be on the other side? Or is it meant for a GPU extender doohickey?
They are on both sides, top and bottom.

[attached image: 1670625990926.png]
 
I would wait until we have games. 3DMark usually doesn't mean much for games.
And when we have game benchmarks?

"Wait until we have 7900 X3D. Just wait."

J/k, but the goalpost shifting for AMD is always a little strange. And TBH the 3DMark numbers aren't even bad - they're pretty much where these new GPUs were expected to land.
 
Would not be surprised if those values change over time as drivers mature.
Yass. "Fine wine". Buy not for the performance you get today, but for the hope of what might be tomorrow.

AMD does have a record of improving performance in drivers, but then it's not something unique to them. My issue is with the overall pattern - I made one tech-buying rule: stop buying motherboards, VR headsets, whatever based on future/roadmap promises. Instead buy based on what you actually get today.
 
And when we have game benchmarks?

"Wait until we have 7900 X3D. Just wait."

J/k, but the goalpost shifting for AMD is always a little strange. And TBH the 3DMark numbers aren't even bad - they're pretty much where these new GPUs were expected to land.

LMAO, I have a 4090. I don't even care, but I'm not sure what the goalpost shifting was there in my post. Just saying 3DMark doesn't really tell the whole story when it comes to performance in cards. Obviously there is going to be a difference when there is a large difference in a card's specs, but it can be close if there isn't, and games can still differ in that scenario. That's all.
 
LMAO, I have a 4090. I don't even care, but I'm not sure what the goalpost shifting was there in my post. Just saying 3DMark doesn't really tell the whole story when it comes to performance in cards. Obviously there is going to be a difference when there is a large difference in a card's specs, but it can be close if there isn't, and games can still differ in that scenario. That's all.
Not sure how your GPU relates, but all good, as the comment wasn't directed at you specifically but at the overall pattern I've noticed every GPU cycle. Each new benchmark leak and there's hand-wringing about it not being a real benchmark, wait for fine-wine drivers, etc. And when there are game benches it's either "those games all favor Nvidia anyway" or "just wait for next gen", etc.

In fairness, Nvidia dorks are known to complain about "AMD-sponsored" games when trying to disregard some benches. But just an observation.
 
Instead buy based on what you actually get today.
Then when people say to wait for actual game results, which is a smart move for ANY brand of hardware, you say it's just moving goalposts. We were saying the same thing with RTX 4000. We're always saying it for everything: wait for real results. Even for RTX 3000/RX 6000 cards the games you play can make a big difference because yes, some games run vastly better on one or the other, which isn't reflected well even in gaming averages or 3D benchmarks.
 
Then when people say to wait for actual game results, which is a smart move for ANY brand of hardware, you say it's just moving goalposts. We were saying the same thing with RTX 4000. We're always saying it for everything: wait for real results. Even for RTX 3000/RX 6000 cards the games you play can make a big difference because yes, some games run vastly better on one or the other, which isn't reflected well even in gaming averages or 3D benchmarks.
I don't disagree. It was mostly the idea of just disregarding 3DMark for "it's not a game" that stood out, because you're running out of road at that point: 3DMark does roughly correlate to the performance deltas you'll see across aggregates of game benchmarks. IMO it's at least a significant enough data point to extrapolate from: unless there's some drastically different day-one driver, it becomes fairly predictable how the next chapter will go at release.

Obviously what's most important still comes down to the games or tasks most important to the buyer - meaning if brand X gives you X% better performance-per-dollar in Your-Favorite-Game(s), then who cares about aggregate scores.
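Just to spell out the performance-per-dollar idea, here's a minimal sketch with completely made-up prices and frame rates (the card names and numbers are placeholders, not real benchmark data):

```python
# Hypothetical numbers purely for illustration; swap in real prices and the
# average FPS from the games you actually play.
cards = {
    "Brand X": {"price_usd": 999, "avg_fps": 120},
    "Brand Y": {"price_usd": 1199, "avg_fps": 130},
}

for name, c in cards.items():
    fps_per_dollar = c["avg_fps"] / c["price_usd"]
    print(f"{name}: {fps_per_dollar:.3f} FPS per dollar")
# Brand X: 0.120 FPS per dollar, Brand Y: 0.108 -> X wins on value here
# even though Y is faster in absolute terms.
```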
 
I don't disagree, but it was the idea of just disregarding 3DMark because "it's not a game"; you're running out of road at that point, as 3DMark does roughly correlate to the performance deltas you'll see across aggregates of game benchmarks. Or it's at least a significant enough data point to extrapolate from: unless there's some drastically improved day-one driver, it becomes fairly predictable how the next chapter will go at release.

Obviously what's most important still comes down to the games or tasks most important to the buyer - meaning if brand X gives you X% better performance-per-dollar in Your-Favorite-Game(s), then everything else becomes secondary.
Your favorite game can also change in a few months; knowing how well it does on what you are playing now is important, but you also need to have an idea of how it may play something new 6 months down the road.
 
This thread really is moot because scalpers will bot all the OEM cards for the first 6 months or until they feel they'll be stuck selling them at cost/loss.
 
This thread really is moot because scalpers will bot all the OEM cards for the first 6 months or until they feel they'll be stuck selling them at cost/loss.

I think scalpers are gonna be disappointed with these cards, because margins are going to be shit between this, the 4080, and the 4090. If they buy these and try to sell them for $1300, people will just try to get a 4090, lmao. That's what you are seeing with the 4080. They are already paying for that, and that's why you can get 4080s anytime you want.
 
Your favorite game can also change in a few months; knowing how well it does on what you are playing now is important, but you also need to have an idea of how it may play something new 6 months down the road.
That's a very important point, especially seeing as I've witnessed shifts in GPU technology in the past suddenly shake up who the market leader was, often with the previous leaders going out of business entirely.

3dfx Glide used to be the only way to get 3D acceleration in PC games, but then people stopped caring once the GeForce 256 brought hardware T&L and thus massive performance increases in games that supported it, alongside performance in existing games that slapped the Voodoo5 5500 silly.

Then nVidia had their own misstep with GeForce FX, which does great at DX8/SM 1.4 and prior and is now respected as a good choice for retrogaming builds (last nVidia architecture to support 8-bit paletted textures and other early Direct3D weirdness), but was absolute crap at DX9/SM 2.0 and was also late to market against Radeon 9700/9800. The FX 5900/5950 could hold its own in Doom 3 (which sidestepped the D3D issue by using OpenGL instead), but Half-Life 2 defaulted to DX8.1 mode for very good reason and both Far Cry and F.E.A.R. would have those top-end GPUs perform about as well as a lowly Radeon 9600 because of their DX9 renderers, while the 9800 XT could be up to twice as fast.

I can imagine that designing a GPU architecture that is not only competent at existing titles, but also isn't handicapped in some respect when using newer graphics features is a tricky balance to strike.

Nowadays, that's probably raytracing performance - needs to be done with mostly fixed-function hardware at the moment, but that takes up die space that could go to rasterization pipelines instead, improving performance in more games overall. Raytracing hasn't had its GLQuake moment yet, though - there just isn't a killer app that simply requires it to experience the game in some form, as most RT-required games are simply enhanced versions of games that didn't originally have it (Quake II, Portal, Metro Exodus), and likely won't be if new consoles like the PS5 still have to cut framerate down to 30 FPS just to utilize raytracing to any significant extent in games that normally run at 60 FPS without it.
 
Heh, a Rockstar game having driver issues? Say it ain't so. I give the benefit of the doubt to either GPU maker when Rockstar games are involved, like RDR2.
 
Your favorite game can also change in a few months; knowing how well it does on what you are playing now is important, but you also need to have an idea of how it may play something new 6 months down the road.
I would just focus on the major engines... past that everything is a gamble. Make sure your favorites today run decently, then check that it does well with Unreal and CryEngine; they are the only two engines that are really going to push anyway (and AMD and Nvidia both optimize hard for those engines... and Intel will be as well at some point). Other developers that use their own engines are a crapshoot anyway; they could go better green or red or blue at this point. The custom-engine developers are also the most likely to take a few bucks to optimize.

You can always try and guess. I mean, if you love a game that is an AMD-featured game or a "The Way It's Meant to Be Played" title... if you know that developer has a game coming in a year that you will want, and they took money once, they will probably do the same thing again. To be fair though... "The Way It's Meant to Be Played" titles all seem to perform just fine on AMD hardware, and the games AMD put their logo on and advertised before launch, like Godfall/Callisto, don't have any issue on Nvidia either. IMO the industry seems to have progressed quite a bit; I can't remember any game released in the last 3 or 4 years that ran well on one and was utter garbage on the other. The differences in any game seem to be slight single-digit differences from the mean. I mean, who cares if a card that is 12% faster or slower on average ranges from 10-15% depending on title. It's mostly not worth worrying about.
 