AMD 7900 GPU series reviews are up.

I bet Jensen is wishing he hadn't given us a 102 die in the 4090; he could have easily launched the 4080 as the top dog and had no problem.

It's fine, at least he reserved the 102 die for the 90-class GPU this time and didn't put it in a $700 80-class GPU like the 3080 lol
 
It's fine, at least he reserved the 102 die for the 90-class GPU this time and didn't put it in a $700 80-class GPU like the 3080 lol
The only reason GA102 was in the 3080 is because the Samsung node sucked and they needed a GPU with some legs.

As we see, the 4N TSMC node is way better and Nvidia is getting the performance they want.
 
From reading a few reviews, I would say it beats out the 4080 in most things outside of RTX + DLSS. https://www.thefpsreview.com/2022/12/12/amd-radeon-rx-7900-xtx-video-card-review/

and it's $200 less expensive to boot.
Depends on the game suite used for the reviews. But based on all the reviews I've seen the 7900XTX is somewhere between 0-5% better overall than the 4080 in pure rasterization depending on the resolution, but then gets demolished by the 4080 in RT by 15-20%+. And this is while utilizing more power than the 4080.

So if you only care about pure rasterization then it is very slightly better value (lol) at $999 than the 4080 at $1199. But if you care about RT then that extra $200 might be actually worth spending on the 4080. If Nvidia moves forward with dropping the price of the 4080 to at least $1099 then there is little reason to get a 7900XTX over it.
 
From reading a few reviews, I would say it beats out the 4080 in most things outside of RTX + DLSS. https://www.thefpsreview.com/2022/12/12/amd-radeon-rx-7900-xtx-video-card-review/

and it's $200 less expensive to boot.
But who wants to cut their performance in half to add ray tracing anyway? Despite the 4080/4090 having faster ray tracing, I still don't think it's fast enough to justify using in the first place. In any of the games I've played that use it (BF2042, Warzone), it's much better to just leave it off and enjoy the higher FPS. To me, at least for the next few generations, rasterization is far more important. AMD hasn't released their next-gen FSR 3 or HYPR-RX yet, so we have that to look forward to in 2023. The 7900 XTX is clearly the faster card, and for the games where it lags behind, I think it's just down to immature drivers. I recall there was a pretty substantial performance improvement after several months of driver updates with the 6000 series.
 
But who wants to cut their performance in half to add ray tracing anyway? Despite the 4080/4090 having faster ray tracing, I still don't think it's fast enough to justify using in the first place. In any of the games I've played that use it (BF2042, Warzone), it's much better to just leave it off and enjoy the higher FPS. To me, at least for the next few generations, rasterization is far more important. AMD hasn't released their next-gen FSR 3 or HYPR-RX yet, so we have that to look forward to in 2023. The 7900 XTX is clearly the faster card, and for the games where it lags behind, I think it's just down to immature drivers. I recall there was a pretty substantial performance improvement after several months of driver updates with the 6000 series.
Every game I've tested can do max ray tracing at 4K and performs very well on my 4090. There are 2 exceptions to this.

  • Cyberpunk 2077 - Needs DLSS Quality
  • Portal RTX - Needs DLSS Performance + Frame Generation

The 4080 is not there yet, nor is the 7900 XT/X. The 4090 is definitely there, though. The 4090 is literally next generation levels of RT performance.
 
But who wants to cut their performance in half to add ray tracing anyway? Despite the 4080/4090 having faster ray tracing, I still don't think it's fast enough to justify using in the first place. In any of the games I've played that use it (BF2042, Warzone), it's much better to just leave it off and enjoy the higher FPS. To me, at least for the next few generations, rasterization is far more important. AMD hasn't released their next-gen FSR 3 or HYPR-RX yet, so we have that to look forward to in 2023. The 7900 XTX is clearly the faster card, and for the games where it lags behind, I think it's just down to immature drivers. I recall there was a pretty substantial performance improvement after several months of driver updates with the 6000 series.
Yup. Only one GPU out right now seems to be geared for these 4K RT benchmarks. The 4080 might as well be the same as the 7900 XTX.

That 15% difference is going to matter a lot less than a few are making it out to be.
 
Yup. Only one GPU out right now seems to be geared for these 4K RT benchmarks. The 4080 might as well be the same as the 7900 XTX.

That 15% difference is going to matter a lot less than a few are making it out to be.

Exactly. If the 7900 XTX is "so bad" at RT, then what makes anyone think that an extra 15% is going to be game-changing and suddenly make RT worth using? It's like going from 40 fps to 46 fps, while the 4090 would push that over 60.
 
Exactly. If the 7900 XTX is "so bad" at RT, then what makes anyone think that an extra 15% is going to be game-changing and suddenly make RT worth using? It's like going from 40 fps to 46 fps, while the 4090 would push that over 60.
60 fps is still really poor when you could be over 100 fps with minimal difference in visual quality. 60 fps is bad when you have a 144 Hz monitor. I wouldn't use ray tracing in multiplayer games like Warzone, for example, unless I could be guaranteed at least 120 FPS, and no card will do that right now.
 
I don't think anyone looking at a 7900 XTX is seriously considering an AAA RT experience. It's a worthwhile upgrade for anyone who skipped last generation and has $1k to spend, but that $1k could be convinced to go Nvidia if any price cuts happen. Both the 4080 and the 7900 XTX are really just middling entries, to be honest. The 4090 gets bragging rights and the performance to back it up, but anyone shopping in the tier down will have a tough call to make. If we exclude RT, raster performance of the previous gen is already more than all but the most demanding competitive FPS players would ask for, making a 6900/6950 a pretty damn good deal.
 
60 fps is still really poor when you could be over 100 fps with minimal difference in visual quality. 60 fps is bad when you have a 144 Hz monitor. I wouldn't use ray tracing in multiplayer games like Warzone, for example, unless I could be guaranteed at least 120 FPS, and no card will do that right now.

Well, yes, but that wasn't my point. My point was that neither the 7900 XTX nor the RTX 4080 is going to offer a great RT experience regardless; only the 4090 really does. And yes, it can do 100+ fps with RT on at 4K provided you also use DLSS, except for some titles like CP2077 and Dying Light 2. Even Control was running at 100+ fps with RT maxed out using DLSS Quality.
 
This right here is the problem for AMD. Hoping for the mythical and debunked "fine wine."

They need to put more effort into their launch drivers because first impressions have been killing them. The XTX is a good card held back by AMD software.
Driver improvements over time are not "mythical".
Cards with more compute power and memory tend to age better with driver improvements.
Not to mention this is the first "chiplet" GPU; I'm sure there are plenty of improvements to be made over time.

The same thing happens each and every console generation with games (launch games don't look as good as games 5+ years into the lifecycle).

I don't think the idle power consumption will come down to 6900 XT levels, though...
 
But who wants to cut their performance in half to add ray tracing anyway? Despite the 4080/4090 having faster ray tracing, I still don't think it's fast enough to justify using in the first place. In any of the games I've played that use it (BF2042, Warzone), it's much better to just leave it off and enjoy the higher FPS. To me, at least for the next few generations, rasterization is far more important. AMD hasn't released their next-gen FSR 3 or HYPR-RX yet, so we have that to look forward to in 2023. The 7900 XTX is clearly the faster card, and for the games where it lags behind, I think it's just down to immature drivers. I recall there was a pretty substantial performance improvement after several months of driver updates with the 6000 series.

I love ray-tracing. Since I only play single-player games I don't really need 100+ FPS in everything (even though it is quite nice to have).

Even with taking RT into account, if we're looking purely at MSRP and reference performance the XTX does interest me more than 4080. However, I have no interest in an AMD reference card, so that completely alters the discussion.
 
60 fps is still really poor when you could be over 100 fps with minimal difference in visual quality. 60 fps is bad when you have a 144 Hz monitor. I wouldn't use ray tracing in multiplayer games like Warzone, for example, unless I could be guaranteed at least 120 FPS, and no card will do that right now.
There are quite a few videos of 13900K/7950 setups with a 4090 doing 160+ fps with RT on and DLSS off at 4K in Warzone, unless it's a different Warzone than I'm thinking of.
 
Only if you dial back the settings. My goal is to run the game at max settings, with ray tracing enabled, and have my FPS pegged at 144fps 100% of the time. WoW has brought major changes to the game over the years and new more GPU-intensive content with every new expansion. The 9th expansion just launched 2 weeks ago... My 2080 is pegged at 100% GPU usage when I play. I don't think it's unreasonable to want to have as much info as possible when choosing my next card.
I'm sure all of us here appreciate having hardware that can run games at 4K and 144+ fps. But for WoW? Really??? What benefit is there to running WoW capped at 144 fps other than OCD lol?
 
Digital Foundry's review included the XT and the XTX. They arrived at the same conclusion all of us did; the 7900 XT is piss-poor value. It needs to be an $800 card to make any sense at all.
Yeah, what is crazy about this generation from Nvidia and AMD is that it is their second-tier cards, the 4080 and 7900 XT, that present terrible value. Not the top cards.

Normally you would pay 40% more for 10% more performance or something like that on the top cards. But not this time: they want us to see value in the most expensive cards, and we take the bait. Well, I did at least, ordering an RTX 4090.
 
Yeah, what is crazy about this generation from Nvidia and AMD is that it is their second-tier cards, the 4080 and 7900 XT, that present terrible value. Not the top cards.

Normally you would pay 40% more for 10% more performance or something like that on the top cards. But not this time: they want us to see value in the most expensive cards, and we take the bait. Well, I did at least, ordering an RTX 4090.

The 4090 is still a terrible value, but at least that's expected for halo-tier cards. Really, both the 4080 and the XTX should be under $1000, but I guess this is where we're at now and there's no going back.
 
The 4090 is still a terrible value, but at least that's expected for halo-tier cards. Really, both the 4080 and the XTX should be under $1000, but I guess this is where we're at now and there's no going back.
What can people expect? People have been giving the green light to these prices since, what, Turing?
 
The 4090 is still a terrible value, but at least that's expected for halo-tier cards. Really, both the 4080 and the XTX should be under $1000, but I guess this is where we're at now and there's no going back.
That's the problem with this generation... so far, the only GPUs that seem to provide ANY value at all are the 7900 XTX and the RTX 4090, a $1000 and $1600 GPU. Remember the last time we said that? Oh that's right... never. Weird times we live in.
 
The 4090 is still a terrible value,

If it could actually be found at $1,600, at roughly $10 per frame it would sadly not be that bad in the current market, similar to a used 3090 but with a warranty (and usually the higher you go, the fewer frames per dollar you expect). It had lines around the block at launch, after all, and is currently selling for around $2,400 on average, which shows how much value some people would find in a $1,600 one.

[Chart: dollars per frame, 3840x2160]
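If you want to sanity-check a chart like that, dollars per frame is just price divided by average frame rate. Here is a minimal sketch; the prices are the MSRPs discussed in this thread, but the fps figures are made-up placeholders, not numbers from any review.

```python
# Dollars-per-frame sanity check.
# Prices are launch MSRPs mentioned in this thread; the average-fps values
# below are hypothetical placeholders, NOT figures from any review.
cards = {
    "RX 7900 XTX": {"price": 999,  "avg_fps_4k": 100.0},
    "RTX 4080":    {"price": 1199, "avg_fps_4k": 100.0},
    "RTX 4090":    {"price": 1599, "avg_fps_4k": 125.0},
}

for name, card in cards.items():
    print(f"{name}: ${card['price'] / card['avg_fps_4k']:.2f} per frame at 4K")
```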
 
I'm sure all of us here appreciate having hardware that can run games at 4K and 144+ fps. But for WoW? Really???

Just because I might play different games than you do doesn't make my desire for higher performance any less valid than anyone else's. WoW is the only game that I play as part of an organized team that meets and plays on a regular schedule. I want the best performance that I can possibly get, because it is a fast-paced game especially in group content.

What benefit is there to running WoW capped at 144 fps other than OCD lol?

What benefit is there to having a 144 Hz monitor if you can't pump out 144 fps most of the time? The reason I would want FPS that can match my monitor's refresh rate would seem obvious.
 
Driver improvements over time are not "mythical".
Cards with more compute power and memory tend to age better with driver improvements.
Not to mention this is the first "chiplet" GPU; I'm sure there are plenty of improvements to be made over time.

The same thing happens each and every console generation with games (launch games don't look as good as games 5+ years into the lifecycle).

I don't think the idle power consumption will come down to 6900 XT levels, though...
No one is talking about driver improvements when fine wine is discussed. HardOCP debunked the fine wine argument. I suggest you look it up.
 
Pricing aside (greed-flation), it seems to me that the 7900 XTX is a fantastic card. I have no idea why it's getting hate in here.

It is faster than or equal to a 4080 in raster performance. It is faster than a 3090 Ti in ray-tracing performance. It costs less than a 4080 (and yes, $200 is significant, and it's not even possible to buy a 4080 at MSRP) and more than a 3090 Ti (if you can find stock). It seems to me that it is competitively priced (relative to the way Nvidia has placed cards in the market) for the way that it performs. If anything, it offers better dollar-to-performance than current Nvidia offerings.

Calling the 7900 XTX bad because the ray tracing doesn't perform like a 4080's is a bit silly considering that the 3090 Ti also doesn't meet that performance level. So is the 3090 Ti bad now because it doesn't meet 4080 levels of RT performance? It's an absurd statement, obviously. The 7900 XTX outperforms the 3090 Ti in RT; it's far from being "poor".

The 7900 XT, though, is basically AMD's 4080 12GB (that is, the 4080 which Nvidia quickly pulled and will turn into a 4070 Ti or whatever). It should cost $700, not $900. There is WAY too big of a performance drop for that $100 price difference.
 
Pricing aside (greed-flation), it seems to me that the 7900 XTX is a fantastic card. I have no idea why it's getting hate in here.

It is faster than or equal to a 4080 in raster performance. It is faster than a 3090 Ti in ray-tracing performance. It costs less than a 4080 (and yes, $200 is significant, and it's not even possible to buy a 4080 at MSRP) and more than a 3090 Ti (if you can find stock). It seems to me that it is competitively priced (relative to the way Nvidia has placed cards in the market) for the way that it performs. If anything, it offers better dollar-to-performance than current Nvidia offerings.

Calling the 7900 XTX bad because the ray tracing doesn't perform like a 4080's is a bit silly considering that the 3090 Ti also doesn't meet that performance level. So is the 3090 Ti bad now because it doesn't meet 4080 levels of RT performance? It's an absurd statement, obviously. The 7900 XTX outperforms the 3090 Ti in RT; it's far from being "poor".

The 7900 XT, though, is basically AMD's 4080 12GB (that is, the 4080 which Nvidia quickly pulled and will turn into a 4070 Ti or whatever). It should cost $700, not $900. There is WAY too big of a performance drop for that $100 price difference.

It's not a bad card but I think everyone's expectations were higher since AMD themselves claimed up to 1.7x a 6950XT. So even conservative estimates put it at 1.5x a 6950XT on average. The actual uplift? More like 1.3x.
 
I have no idea why it's getting hate in here.

It's funny that 3090 Ti-level ray tracing is now considered shit, though. Games are still pretty much the same. I think some are just spoiled by the 4090 already.

I'll say that yes, still seeing $1k for a GPU sucks, and I won't pay that for the foreseeable future. But, well, people let Nvidia normalize it and AMD is just following suit.
 
Pricing aside (greed-flation), it seems to me that the 7900 XTX is a fantastic card. I have no idea why it's getting hate in here.

It is faster than or equal to a 4080 in raster performance. It is faster than a 3090 Ti in ray-tracing performance. It costs less than a 4080 (and yes, $200 is significant, and it's not even possible to buy a 4080 at MSRP) and more than a 3090 Ti (if you can find stock). It seems to me that it is competitively priced (relative to the way Nvidia has placed cards in the market) for the way that it performs. If anything, it offers better dollar-to-performance than current Nvidia offerings.

Calling the 7900 XTX bad because the ray tracing doesn't perform like a 4080's is a bit silly considering that the 3090 Ti also doesn't meet that performance level. So is the 3090 Ti bad now because it doesn't meet 4080 levels of RT performance? It's an absurd statement, obviously. The 7900 XTX outperforms the 3090 Ti in RT; it's far from being "poor".

The 7900 XT, though, is basically AMD's 4080 12GB (that is, the 4080 which Nvidia quickly pulled and will turn into a 4070 Ti or whatever). It should cost $700, not $900. There is WAY too big of a performance drop for that $100 price difference.
It's a great card, but it's not really on par with a 3090 Ti from a ray-tracing standpoint. It's closer to a 3080 Ti. Still not bad, and about what people expected. It's basically a full generation behind in ray-tracing performance.
 
It's not a bad card but I think everyone's expectations were higher since AMD themselves claimed up to 1.7x a 6950XT. So even conservative estimates put it at 1.5x a 6950XT on average. The actual uplift? More like 1.3x.
I guess? But when has anything ever lived up to the hype machine? I'm an old curmudgeon by forum standards, though, and maybe my memory is longer regarding the consistency of ALL manufacturers overpromising and underdelivering. And also the nature of "ignore all hype and wait for benchmarks." I feel like at this point it's kind of on you if you have any other mentality.
It's funny that 3090 Ti-level ray tracing is now considered shit, though. Games are still pretty much the same. I think some are just spoiled by the 4090 already.

I'll say that yes, still seeing $1k for a GPU sucks, and I won't pay that for the foreseeable future. But, well, people let Nvidia normalize it and AMD is just following suit.
Absolutely. And I think this is bad for all of us. However, I do also think the prices will normalize themselves. There just aren't that many people willing to pay an ultra premium for graphics cards. Most "gamers" are console gamers or low-end PC gamers. Most GPUs cost between $200 and $500. Even the top end of that spectrum, $500, is an outlier. $1000 is just absurd, even for hobbyists.

I think the prices will eventually equalize because of those factors. Once the silicon shortage fully alleviates and there is greater competition, the ROI will return to "selling more cards" > "selling fewer, higher-priced cards". There will ALWAYS be a race to the bottom.
It's a great card, but it's not really on par with a 3090 Ti from a ray-tracing standpoint. It's closer to a 3080 Ti. Still not bad, and about what people expected. It's basically a full generation behind in ray-tracing performance.
I'm basing my statements off of the DF video where he talks about RT in combination with reconstruction techniques:

[embedded Digital Foundry video]

Brent Justice, formerly of HardOCP, also reaches the same conclusions as stated here: https://www.thefpsreview.com/2022/12/12/amd-radeon-rx-7900-xtx-video-card-review/8/
In Ray Tracing performance the AMD Radeon RX 7900 XTX is on par with the GeForce RTX 3090 to GeForce RTX 3090 Ti performance, depending on the game.

Overall, I'd say it's a mixed bag, but I don't think it's unfair to say the 7900 XTX is roughly 3090 Ti level in RT. It's title-dependent.
 
I guess? But when has anything ever lived up to the hype machine? I'm an old curmudgeon by forum standards, though, and maybe my memory is longer regarding the consistency of ALL manufacturers overpromising and underdelivering. And also the nature of "ignore all hype and wait for benchmarks." I feel like at this point it's kind of on you if you have any other mentality.

Absolutely. And I think this is bad for all of us. However, I do also think the prices will normalize themselves. There just aren't that many people willing to pay an ultra premium for graphics cards. Most "gamers" are console gamers or low-end PC gamers. Most GPUs cost between $200 and $500. Even the top end of that spectrum, $500, is an outlier. $1000 is just absurd, even for hobbyists.

I think the prices will eventually equalize because of those factors. Once the silicon shortage fully alleviates and there is greater competition, the ROI will return to "selling more cards" > "selling fewer, higher-priced cards". There will ALWAYS be a race to the bottom.

I'm basing my statements off of the DF video where he talks about RT in combination with reconstruction techniques:

[embedded Digital Foundry video]

Overall, I'd say it's a mixed bag, but I don't think it's unfair to say the 7900 XTX is roughly 3090 Ti level in RT.


IDK, probably because AMD has never really made claims that were that far off the mark before. Usually what they claim is what we get, or at least close to it. All the slides they've been showing seemed to point to a 50% uplift on average, and people just believed it as usual. Nvidia has always given us BS numbers, so nobody ever believes them at this point. Yes, AMD obviously cherry-picked six games and only showed the best-case scenarios, but that is the point I am trying to make here. They typically don't do that. Rather than cherry-picking and only showing the best-case scenarios, they have tended to show the real average we can expect across the board.
 

[Attachment: AMD Radeon RX 7900 XTX / 7900 XT launch slide (Navi 31)]
These cards should've been the 7900 XT and 7800 XT at the same MSRPs, or slightly lower in the 900-series card's case. Both cards would've been a revelation at the right price: the 6800 XT's launch price of $650 for a 7800 XT (the current 7900 XT), and $850 for the 7900 XT (the current 7900 XTX). These things are going to be sitting on shelves if they actually made 200k of them; they're garbage at their current prices.
 
These cards should've been the 7900 XT and 7800 XT at the same MSRPs, or slightly lower in the 900-series card's case. Both cards would've been a revelation at the right price: the 6800 XT's launch price of $650 for a 7800 XT (the current 7900 XT), and $850 for the 7900 XT (the current 7900 XTX). These things are going to be sitting on shelves if they actually made 200k of them; they're garbage at their current prices.
I think we all agree that the 7900 XT is poorly priced, but I'm more or less certain that the 7900 XTX will sell out within the first week.

I also think all the commentary about the name doesn't really change much, if anything at all. They could've named them the Radeon "1" and "2" for all that it changes. Names are marketing. If it affects you, you wouldn't know. And if it doesn't, then it has no meaning whatsoever. It makes just as much difference as the naming between the Ferrari F355 to F360 to F430 to F458 to F488. Why those names/numbers? Who cares?

Just look at the price and the performance and either buy or don't buy based on those criteria. For everything else, the market will decide for itself.
 
Seems like testing done with an AMD CPU makes it look better. This is a much more optimistic view of the 7900 XTX.

 
It is faster than a 3090 Ti in ray-tracing performance.
Close enough that it could be either, but is it?

In Quake 2 RTX it seems to be behind a 3090 with the denoiser on:
https://www.phoronix.com/review/rx7900xt-rx7900xtx-linux/8

In 3DMark Port Royal in path-tracing mode, it is significantly below a 3090 Ti, more like a 3080 (12GB):
https://www.guru3d.com/articles_pages/amd_radeon_rx_7900_xtx_review,24.html

Same in Indigo rendering:
https://www.guru3d.com/articles_pages/amd_radeon_rx_7900_xtx_review,26.html
Between 3080 and 3080 (12GB) levels of performance.

It is strong enough at everything else to get around the same or better performance in some games with hybrid rendering, but I am not sure it is actually faster at doing RT.

TechPowerUp's game sample shows the 7900 XTX in an exact tie with a 3090 Ti in 4K RT games, but the more RT-heavy titles (Control, Cyberpunk, Metro Exodus, etc.) saw -65% to -67% from turning RT on with the 7900 XTX, versus -50%-type numbers on a 3090 Ti, which is typically ahead there; the big win in Far Cry 6, with its very light RT effects, skews things a bit.
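To illustrate why the size of the RT hit matters even when the averages tie, here is a rough sketch. Only the 65% and 50% penalties come from the numbers above; the 100 fps raster baseline is a hypothetical placeholder, not a benchmark result.

```python
# Two cards that tie in rasterization can diverge badly once RT is enabled,
# because the percentage hit differs. The 100 fps baseline is hypothetical;
# the 65%/50% penalties are the rough figures quoted above for heavy-RT titles.
def fps_with_rt(raster_fps: float, rt_penalty: float) -> float:
    """Frame rate after applying a fractional RT performance penalty."""
    return raster_fps * (1.0 - rt_penalty)

baseline = 100.0
print(f"7900 XTX: {fps_with_rt(baseline, 0.65):.0f} fps")  # ~35 fps
print(f"3090 Ti:  {fps_with_rt(baseline, 0.50):.0f} fps")  # ~50 fps
```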
 
Close enough that it could be either, but is it?

In Quake 2 RTX it seems to be behind a 3090 with the denoiser on:
https://www.phoronix.com/review/rx7900xt-rx7900xtx-linux/8

In 3DMark Port Royal in path-tracing mode, it is significantly below a 3090 Ti, more like a 3080 (12GB):
https://www.guru3d.com/articles_pages/amd_radeon_rx_7900_xtx_review,24.html

Same in Indigo rendering:
https://www.guru3d.com/articles_pages/amd_radeon_rx_7900_xtx_review,26.html
Between 3080 and 3080 (12GB) levels of performance.

It is strong enough at everything else to get around the same or better performance in some games with hybrid rendering, but I am not sure it is actually faster at doing RT.

TechPowerUp's game sample shows the 7900 XTX in an exact tie with a 3090 Ti in 4K RT games, but the more RT-heavy titles (Control, Cyberpunk, Metro Exodus, etc.) saw -65% to -67% from turning RT on with the 7900 XTX, versus -50%-type numbers on a 3090 Ti, which is typically ahead there; the big win in Far Cry 6, with its very light RT effects, skews things a bit.
I'm all for arguing the finer points, so let's just say everything you're saying is true. Ultimately, people just want to know: can it perform the way I want for the price I want to pay? Considering the pricing on the 3090/3090 Ti/4080/4090, I would say that the 7900 XTX is competitive versus those options. Right for you or anyone else in all scenarios? Can't say, but I can say that it's very reasonable for what it is.
 
I'm all for arguing the finer points, so let's just say everything you're saying is true. Ultimately, people just want to know: can it perform the way I want for the price I want to pay?
We will have to see the actual price, but keeping the face-value price tag of the 6900 XT is certainly good value relative to the 2017-2022 marketplace, if they are available at that price point; in China they have apparently already sold out, with only the higher-priced models available for order.

That comment takes for granted that the energy issue will be fixed; otherwise, the 7900 XTX could end up costing significantly more than a 4080 for similar performance, which was considered a terrible deal.
 
I think we all agree that the 7900 XT is poorly priced, but I'm more or less certain that the 7900 XTX will sell out within the first week.

I also think all the commentary about the name doesn't really change much, if anything at all. They could've named them the Radeon "1" and "2" for all that it changes. Names are marketing. If it affects you, you wouldn't know. And if it doesn't, then it has no meaning whatsoever. It makes just as much difference as the naming between the Ferrari F355 to F360 to F430 to F458 to F488. Why those names/numbers? Who cares?

Just look at the price and the performance and either buy or don't buy based on those criteria. For everything else, the market will decide for itself.
I think there is MUCH more in a name than a lot of people here think. Nvidia just launched a 3060 8GB, and it is nowhere near the same performance as the 3060 12GB. If you live in a world where "it's just a name, bro," then this is fine; no issue at all.

If you live in a world that appreciates a transparent market and corporate honesty, this is a goddamn slap across the face. How well do you think the 3060 8GB will sell, versus an alternative universe where they named it the 3050 Ti? If one exercises one's deductive reasoning and inferencing skills, one would perhaps come to the conclusion that naming this product similarly to a higher-performing product will influence SOME number of uninformed customers to purchase it in situations where they wouldn't have, had it been named differently.

Regardless of whether this is a single customer or a significant portion of this product's purchases, it's a positive number of purchases. This is a hypothesis that the naming of a product has real-world monetary and market effects. One could then conclude that the naming of a product, while having very little to do with the actual hardware aside from the paint/sticker/laser etching on the shroud, has a non-zero effect on the product's value and perception. And this influence on the market has a direct effect on the products offered and the prices at which they're offered to everyone, not just uninformed consumers.

I work with people in computer stores who don't know the difference between a 5800X and a 5800X3D. These are people who, in my absence, are responsible for making decisions for uninformed consumers about upgrades, new PCs, and general advice. These people are considered 'computer guys'. They don't hang out on these forums; they don't watch reviews. If they see a 7900 XT and a 7900 XTX with only a 10% difference in price, what do you hypothesize they might think?
 
I think there is MUCH more in a name than a lot of people here think. Nvidia just launched a 3060 8GB, and it is nowhere near the same performance as the 3060 12GB. If you live in a world where "it's just a name, bro," then this is fine; no issue at all.

If you live in a world that appreciates a transparent market and corporate honesty, this is a goddamn slap across the face. How well do you think the 3060 8GB will sell, versus an alternative universe where they named it the 3050 Ti? If one exercises one's deductive reasoning and inferencing skills, one would perhaps come to the conclusion that naming this product similarly to a higher-performing product will influence SOME number of uninformed customers to purchase it in situations where they wouldn't have, had it been named differently.

Regardless of whether this is a single customer or a significant portion of this product's purchases, it's a positive number of purchases. This is a hypothesis that the naming of a product has real-world monetary and market effects. One could then conclude that the naming of a product, while having very little to do with the actual hardware aside from the paint/sticker/laser etching on the shroud, has a non-zero effect on the product's value and perception. And this influence on the market has a direct effect on the products offered and the prices at which they're offered to everyone, not just uninformed consumers.

I work with people in computer stores who don't know the difference between a 5800X and a 5800X3D. These are people who, in my absence, are responsible for making decisions for uninformed consumers about upgrades, new PCs, and general advice. These people are considered 'computer guys'. They don't hang out on these forums; they don't watch reviews. If they see a 7900 XT and a 7900 XTX with only a 10% difference in price, what do you hypothesize they might think?
Not sure what the fix is for some of this. Unfortunately, the bell curve is slipping left and there is only so much you can do for ignorant/lazy people. So, hold everyone's hands or put reasonable laws in place that let people make their own choices even when they are stupid? I say let people be stupid.
 