AMD Radeon RX 9070 XT tipped to launch alongside FSR4 and Ryzen 9000X3D in late January

You mean the AMD Ryzen AI MAX + X PLUS 380HQXT+ ?
If only. :)
This gen they have the Ryzen AI MAX Ultra Cool X Plus Extreme 395. (Joking, but seriously, what stupid naming.) The 395 is 16-core Zen 5 with 80 MB of cache and a 5.1 GHz boost, plus 40 CUs of RDNA 3.5.
They also announced Fire Range; the top chip is the 9955HX3D, 16-core Zen 5 with 144 MB of cache and a 5.4 GHz boost, with no iGPU... so it's designed for laptops with an onboard GPU.

I think the 395 mini PCs may make a solid Steam Machine-type Linux gaming box. Who knows, though; with Zen 6 maybe they smash the two together. A 16-core X3D + 40 CU GPU SoC might make a pretty killer gaming box.
 
The issue is that the SoC is such a massive chip that it will cost more than a 16-core CPU + 40 CU GPU combined.

Sure it's low power, but I expect devices with this chip (and integrated memory) to START at over $1.2K barebones.
 

Very possible.
ASUS's new Flow Z13 tablet is a Ryzen 395 with 32 GB of LPDDR5X, a 1 TB Gen4 SSD, and a 13" 4K display, for $2,199.

$1,200 probably sounds about right for a barebones cube-type machine.
 
Holding out hope...RTX 5070 / RTX 5070 Ti level performance PLUS great FSR / AI-based upscaling...all for a great price...would be huge.
 

This is a title that AMD does very well in, I think; it could match a 4080 Super without being stronger than a 7900 XT, according to this:
https://www.techpowerup.com/review/call-of-duty-black-ops-6-fps-performance-benchmark/5.html

which is pretty much where speculation and AMD's marketing material always put it. AMD kind of told us where it would land in a fairly precise way, at least for the upper best-case scenario:

[attached image: amdnaming.png]
 
"AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way
by GGforever
Although it has only been a few days since the RDNA 4-based GPUs from Team Red hit the scene, it appears that we have already been granted a first look at the 3D Mark performance of the highest-end Radeon RX 9070 XT GPU, and to be perfectly honest, the scores seemingly live up to our expectations - although with disappointing ray tracing performance. Unsurprisingly, the thread has been erased over at Chiphell, but folks have managed to take screenshots in the nick of time.

The specifics reveal that the Radeon RX 9070 XT will arrive with a massive TBP in the range of 330 watts, as revealed by a FurMark snap, which is substantially higher than the previously estimated numbers. With 16 GB of GDDR6 memory, along with base and boost clocks of 2520 and 3060 MHz, the Radeon RX 9070 XT managed to rake in an impressive 14,591 points in Time Spy Extreme, and around 6,345 points in Speed Way. Needless to say, the drivers are likely far from mature, so it is not outlandish to expect a few more points to get squeezed out of the RDNA 4 GPU."
 
Needless to say, the drivers are likely far from mature, so it is not outlandish to expect a few more points to get squeezed out of the RDNA 4 GPU."
Are those the scores AMD said were far from reality because of drivers? That said, if it's about XTX level in RT and a bit less in raster, that would be quite good.
 
To this day, my HD 7970 (OC'd to 1.2 GHz) was one of the best cards I ever owned.
I enjoyed running the two HD 7970 cards I had a few years ago with insane FP64 performance. Not even the RTX 3090 beats the HD 7970 in FP64 compute performance, at least on paper, and I know for certain that my previous GTX 1080 Ti couldn't beat this card when I had all of them. OK, I'm using a very niche application (in the area of scientific computation) requiring high-precision floating-point calculation in distributed computing. ;)
Nowadays, newer cards don't have good FP64, as it isn't needed for gaming.
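The paper numbers back that up. Here's a rough peak-FP64 sketch; the core counts and FP64:FP32 ratios are from the public spec sheets as I recall them, and the clocks are approximate, so treat it as ballpark only:

```python
# Rough peak FP64: shaders * 2 FLOPs/cycle (FMA) * clock * FP64:FP32 ratio.
# Core counts and rate ratios from public specs; clocks approximate -- ballpark only.
def fp64_tflops(shaders: int, clock_ghz: float, fp64_ratio: float) -> float:
    return shaders * 2 * clock_ghz * fp64_ratio / 1000.0

cards = {
    "HD 7970 (stock ~925 MHz)": (2048, 0.925, 1 / 4),    # GCN Tahiti: 1/4-rate FP64
    "HD 7970 (OC ~1.2 GHz)":    (2048, 1.200, 1 / 4),
    "GTX 1080 Ti (~1.58 GHz)":  (3584, 1.582, 1 / 32),   # Pascal consumer: 1/32
    "RTX 3090 (~1.7 GHz)":      (10496, 1.695, 1 / 64),  # Ampere consumer: 1/64
}
for name, (shaders, clock, ratio) in cards.items():
    print(f"{name}: ~{fp64_tflops(shaders, clock, ratio):.2f} TFLOPS FP64")
```

That puts the HD 7970 at roughly 0.9-1.2 TFLOPS of FP64 versus ~0.55 TFLOPS for the RTX 3090 and ~0.35 TFLOPS for the GTX 1080 Ti.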
 
Now that we know the core count and clock speeds, here are a couple of back-of-the-envelope estimates. It’s actually looking pretty promising.

7800 XT
2420 MHz
3840 cores
2420 * 3840 = 9,292,800

9070 XT
3000 MHz
4096 cores
3000 * 4096 = 12,288,000

No architectural uplift
12,288,000 / 9,292,800 = 1.322
+32.2%

10% architectural uplift
12,288,000 * 1.1 = 13,516,800
13,516,800 / 9,292,800 = 1.455
+45.5%

TechPowerUp relative performance vs. 7800 XT
7900 XT = +30%
7900 XTX = +51%
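Here's the same scaling math as a quick sketch, in case anyone wants to plug in different clocks or a different assumed per-clock uplift (the 10% figure is purely an assumption):

```python
# Relative throughput estimate = cores * clock, optionally scaled by an assumed
# architectural (per-clock) uplift. Back-of-the-envelope only; real games rarely
# scale linearly with either factor.
def relative_uplift(cores_new, clock_new_mhz, cores_old, clock_old_mhz, arch_uplift=1.0):
    return (cores_new * clock_new_mhz * arch_uplift) / (cores_old * clock_old_mhz)

base   = relative_uplift(4096, 3000, 3840, 2420)        # ~1.32 -> +32.2%
plus10 = relative_uplift(4096, 3000, 3840, 2420, 1.10)  # ~1.45 -> +45.5%
print(f"No architectural uplift:  +{(base - 1) * 100:.1f}%")
print(f"10% architectural uplift: +{(plus10 - 1) * 100:.1f}%")
```

That lands a bit above the 7900 XT (+30%) at the low end and approaches the 7900 XTX (+51%) with the assumed uplift.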
 
Are those the scores AMD said were far from reality because of drivers? That said, if it's about XTX level in RT and a bit less in raster, that would be quite good.
It would. If AMD could price the 9070 XT low enough, it could lock Nvidia out of the 60-series range. They would have to price it considerably lower. AMD should have the performance to spare to do it. Nvidia could retaliate, but it would screw over any early adopters, which is something I'm not sure they would want to do.
 
Nvidia is either going to just release a 60 and say whatever, it is what it is, we have 4x fake frames. Or they may just decide to skip the 60-class cards completely; they might just say $550 is the floor to enter the NV Shangri-La. Sadly, I don't think they actually have to compete at that price range to move product. They could likely release a 5060 that is just complete and utter shit... and it will sell. I mean, that is exactly what has been happening. It's not like the 4060 or 3060 were all that competitive if people are being honest. You can't advocate for those cards without bringing up upscaling and/or some promise of RT that really means nothing in that performance class.
 

I tried FSR 4 on a new AMD Radeon RX 9070 XT graphics card, and it looks great

We tested AMD's latest AI resolution upscaling tech on its new gaming GPU with Ratchet & Clank, and it looks massively better than FSR 3.


The difference in quality is substantial, and you can see it everywhere in the scene when it’s moving, with much less in the way of shimmering and noise around edges, and a much sharper image. Areas where the benefits of FSR 4 are particularly noticeable are where Ratchet’s fur meets the rest of the scene. As you can see in the image above, the edges around Ratchet’s tail look much sharper when it swishes, as do the edges around the ears. Even the shadows look significantly better on FSR 4 (right) than FSR 3 (left), with clear edges that aren’t surrounded by noise.

https://www.pcgamesn.com/amd/fsr-4-...UWOGFH&state=b22833fe6668455fb1bed15f613457b9
 

AMD's new AI-based FSR 4 upscaler impresses in hands-on testing at CES 2025 on RX 9070-series hardware

This "research project" fixes long-standing FSR issues.


In side-by-side comparisons between two PCs running Rift Apart in 4K performance mode, one with FSR 3.1 and one with FSR 4, the new ML-based technique looks to solve many of the previous versions' biggest issues.

Most notably, image quality seemed noticeably improved on the "research project" PC across a variety of scenes. The fine texture of the red carpet in the game's opening level is a hard one to reproduce, for example, with FSR 3.1 compressing a fair amount of detail and producing a moiré pattern, but FSR 4 managing to better preserve the individual fibres.

Similarly, the use of SSAO produces some artefacting on FSR 3.1, with occlusion appearing and disappearing from frame to frame, whereas this appeared to have been fixed on FSR 4. The image also looked fairly sharp, without the soft look that characterises some upscalers running in a similarly challenging 1080p to 4K mode.

The fast-moving confetti particles that accompany Ratchet and Clank's first-level parade is another traditional trouble spot, but again FSR 4 exhibited little of the ghosting or trails we've seen in various versions of FSR. An even more obvious improvement comes with the movement of the bystanders in the stadium seating, with their claps and cheers causing disocclusion fizzle in FSR 3.1 - that looks to be fixed in FSR 4.

https://www.eurogamer.net/digitalfoundry-2025-amd-new-ml-based-fsr-4-impresses-in-hands-on-testing
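For reference on the "challenging 1080p to 4K mode" mentioned in both write-ups: these are the per-axis scale factors AMD has published for FSR 2/3 quality modes (assuming FSR 4 keeps the same presets), so 4K Performance mode is reconstructing from a 1080p internal render:

```python
# Per-axis scale factors published for FSR 2/3 quality modes
# (assumption: FSR 4 keeps the same presets).
FSR_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution the upscaler starts from for a given output size."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080) -- the 1080p-to-4K case
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```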
 
Nvidia is either going to just release a 60 and say whatever, it is what it is, we have 4x fake frames. Or they may just decide to skip the 60-class cards completely; they might just say $550 is the floor to enter the NV Shangri-La. Sadly, I don't think they actually have to compete at that price range to move product. They could likely release a 5060 that is just complete and utter shit... and it will sell. I mean, that is exactly what has been happening. It's not like the 4060 or 3060 were all that competitive if people are being honest. You can't advocate for those cards without bringing up upscaling and/or some promise of RT that really means nothing in that performance class.
You're looking at it from an enthusiast point of view. I had an RTX 4060-powered laptop and that thing ripped. It's a good card. Is it great? No. Is it usable? Absolutely. Look no further than the Steam hardware survey to see that there are far less performant cards in use in the majority of PCs.

I get that everyone is butthurt that the 3060 to 4060 was just process node improvements and AI hardware... but that's essentially what we are looking at with this new gen. That is because raster is the past and AI-powered upscaling is the future. There will not be another big, hot-rod card like the 7900 XTX that is set up primarily for raster dominance at any price (power utilization, etc. be damned). It will be a balance of raster performance increases and AI-powered upscaling increases, with raster getting less and less as we go on. I doubt the 3090-to-4090 jump will be seen again.
 
The idea that Nvidia will skip the xx60 class of GPU completely is a little bit wild to me; what do they do with the binned laptop chips?

That sounds a bit like all the "they will skip the 5080/5090 this generation" talk...

The last xx60 card fit in a 159 mm² die. With good yields (considering how cut down the mobile 4050 is and the size of the die, they must be great) you can get 340-350 GPUs per 12-inch wafer. The average TSMC 12-inch wafer cost around $5,000-6,000 during the 4060's life; being on a more advanced node, say Nvidia paid $14k per wafer, that's about $40 per GPU chip for each extra marginal one they make.

Not sure how much they can sell them to AIBs for, but they end up in $300, relatively simple cards (in terms of power and memory) or in very expensive laptops. Nvidia selling them at an average of $65+ with over 60% gross margin would not surprise me; it is an insane hardware business, with a giant share of the costs (drivers, marketing, OEM relations, etc.) already almost 100% sunk anyway, even if you stopped at the 5070 level. And the 4060 laptop / 4060 desktop cards were their best sellers, as in all their previous generations... according to the Steam survey, 13-14% of the user base runs a 4060-family product, 15-20 million people, which is probably similar to the whole Xbox market (which sold more than that, but over 4 years vs. 2).
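A quick sanity check of that wafer math, using the usual gross-dies-per-wafer approximation; the die size is from the post above, while the wafer price and the good-die fraction are just assumptions:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard approximation: wafer area / die area, minus edge losses."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

die_area   = 159.0     # mm^2, AD107-class die per the post
wafer_cost = 14_000.0  # assumed "advanced node" wafer price from the post
good_frac  = 0.88      # assumption: share of gross dies that are sellable

gross = gross_dies_per_wafer(die_area)
good  = int(gross * good_frac)
print(f"gross dies per wafer: {gross}, good dies: {good}")       # ~391 gross, ~344 good
print(f"marginal cost per good die: ~${wafer_cost / good:.0f}")  # ~$41
```

Which lands in the same 340-350 usable dies and roughly $40-per-chip range as the post.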
 

I could see Nvidia treating the 5060 this gen much the way they have treated 50-type things in the past. They exist for OEMs, and the naming convention exists for laptop parts. I believe it's possible this gen we don't see OEMs like ASUS releasing 5060s. I mean, a 60 launch is way off; if AMD faceplants and the best Intel can do is a B580, then sure, of course we'll get terrible 5060 parts. In the event that AMD's card really is priced stupid aggressively, though... the OEMs aren't going to want to get stuck with product. Nvidia may not want that either and just skip a big launch. It hinges on AMD... so I mean, what can you say, AMD is good at screwing Radeon things up. As much as I want to say the bits leaking about the 9070 sound really good, AMD being super cagey about it is not a great sign.
 
Yeah it is interesting to me that we have a 5070, 5070 Ti, 5080, and 5090 for laptops - no sign of the 5060. I owned a 4060, 4070, 4080, and 4090 laptop at one point or another last gen (lots of 1-2 day trials and returns - some extended ownership) - my end thesis is the 4060 is the one to get. I ended up with a Razer Blade 14 4070 only (although I sold it last month ahead of this refresh, now I just have a PlayStation Portal to my PS5 Pro) because I got a great open box deal from Micro Center. But it makes ZERO SENSE (for my use case) to get a bigger GPU since I just want to play Fortnite (and other competitive shooters) in my hotel room when I am traveling on business. I get that some people want a laptop as their main gaming box - that is where it makes more sense to splurge. But in my case I build PCs so it's not anything that interests me.
 
You're looking at it from an enthusiast point of view. I had an RTX 4060-powered laptop and that thing ripped. It's a good card. Is it great? No. Is it usable? Absolutely. Look no further than the Steam hardware survey to see that there are far less performant cards in use in the majority of PCs.

I get that everyone is butthurt that the 3060 to 4060 was just process node improvements and AI hardware... but that's essentially what we are looking at with this new gen. That is because raster is the past and AI-powered upscaling is the future. There will not be another big, hot-rod card like the 7900 XTX that is set up primarily for raster dominance at any price (power utilization, etc. be damned). It will be a balance of raster performance increases and AI-powered upscaling increases, with raster getting less and less as we go on. I doubt the 3090-to-4090 jump will be seen again.
I guess what I should say is they may skip enthusiast-type dGPU versions of the 5060. Sure, laptops are a completely different thing. Often the laptop parts don't even directly relate in any way to the hardware on the dGPU side of things. Well, obviously it's never the same hardware; I just mean that in general the 70- and 80-class laptop parts are not on par with the desktop stack (often 70-class laptop parts are really 60-class desktop parts in terms of cores and such). Laptop expectations are very different.

The thing with the Nvidia smoke-and-mirrors RTX stuff is that it has mostly been smoke and mirrors. It's not that they haven't come up with some cool hardware. It's just that for 4 generations of cards now, Nvidia has been selling gamers on the RT future. The RT future did not exist when people bought 2060s, or 3060s, or really even 4060s. Games that are now finally launching that really are the games of the future that require RT are still very few... there are like TWO. Both of them run like absolute ass on 2060/3060/4060 and very likely on 5060-class cards, should they come to be, as well. I mean, load up Indiana Jones on one of those old 8 GB 60-class cards... it's not going to go well. Nvidia has been pre-selling a future... and somehow a lot of gamers have bought it. Telling someone a 3060 card was all set for future RT games was disingenuous. Nvidia cut raster performance 3 generations early. In fact, I would argue they have probably cut it more than they should have with the 5000s as well, but it might be unfair to say that before we see benchmarks of the actual raster on the 5000s. The bottom line is that 95% of the games being played still benefit more from good raster performance today, and that probably isn't going to change as much as Nvidia would like over the life of the 5000s, maybe even the 6000s.
 
I agree. Although in the 4060's case it was the rare laptop part that was almost identical to the full desktop part, which is why it was a big-time sleeper.

I think the PS5 Pro is a test bed for the PS6 - which means AI-powered upscaling will be a big thing. That will trickle down to PC. Raster will always be important; I just don't see it improving like in the past. As an online competitive shooter player it will always be a big deal to me (not that I am any good - but upscaling doesn't necessarily "fix" slow compute). However, I will say my son didn't know the difference in Fortnite from an RTX 4090 to an RTX 4070 SUPER because I enabled DLSS (before I enabled DLSS he complained about "lag"). This is at 3440x1440. That is powerful. That saved Dad a lot of money last year. :)
 
Yeah it is interesting to me that we have a 5070, 5070 Ti, 5080, and 5090 for laptops - no sign of the 5060. I owned a 4060, 4070, 4080, and 4090 laptop at one point or another last gen (lots of 1-2 day trials and returns - some extended ownership) - my end thesis is the 4060 is the one to get. I ended up with a Razer Blade 14 4070 only (although I sold it last month ahead of this refresh, now I just have a PlayStation Portal to my PS5 Pro) because I got a great open box deal from Micro Center. But it makes ZERO SENSE (for my use case) to get a bigger GPU since I just want to play Fortnite (and other competitive shooters) in my hotel room when I am traveling on business. I get that some people want a laptop as their main gaming box - that is where it makes more sense to splurge. But in my case I build PCs so it's not anything that interests me.

Makes sense. Laptops are such a different situation. I mean, even with the highest-end laptop cards you're almost always dialing settings down a bit. On desktop, people with higher-end 80 or 90 cards are going to get upset if they are forced to drop a setting down from ultra. Any game they can't play at ultra... is going to get the "shitty devs, optimize your game" treatment. It is never the fault of their flagship or flagship-minus-one parts... it has to be that the game is not "optimized". On laptops people are fine with playing at medium or even low settings. So combine that with a 90-class laptop part really being around a 70-class desktop part... sure, that's why the big laptop cards are probably not at all worth the price premium.
 
Yeah it is interesting to me that we have a 5070, 5070 Ti, 5080, and 5090 for laptops - no sign of the 5060.
What's novel here is the 5070/5070 Ti getting announced at launch, not the lack of a 5060.

I think the reason could simply be GDDR6X procurement issues that pushed Nvidia to update the lineup that used it first (they made some last-minute regular GDDR6 SKUs of them, but they probably tried to limit the volume of those).

The 4060 was announced May 18, 2023; the 4090 was announced September 20, 2022, more than 6 months before.

When was the last time a 60-series card was announced as part of the generation's launch announcement?

Turing announcement had no 2060 in it:
https://www.tomshardware.com/news/nvidia-rtx-2080-ti-2070-price-specs-release,37647.html

August 20, 2018 vs January 6, 2019 for the 2060.

Pascal:
https://wccftech.com/nvidia-introduce-gtx-1080-gtx-1070-battlefield-5/
no 1060.

If anything, I would expect the fastest xx60 launch in a long time, both to keep the yearly-cadence release talk alive and because this time around I would expect Nvidia did not overproduce 4060 cards that need to be sold off over a very long time. The reserve of 3060s will be a different-tier product, not their xx60 but their xx50 in the lineup, and could even be trickling down by now.
 
Good point. This was my first foray into the xx60 arena with laptops so I really didn't pay attention to the timing of it. Gives me a reason to wait a bit if I can. The PlayStation Portal with case is absurdly huge. It's like I am carrying a band instrument. Haha.
 
Nvidia isn't likely to skip the 60 series, but releasing it while they still have 4070s sitting on the shelves isn't an awesome idea, because they would be directly competing with themselves at that point.
Better to issue a series of price drops on the 4070 series to keep them competitive with anything AMD and Intel are about to launch and clear them out before putting new parts in their performance class on the shelves.
 
Speaking of mobile parts - have we seen anything about mobile gaming GPUs from AMD (i.e. non-CPU embedded)? I know they basically skipped last gen.
 
Yeah, but that is still some 3 years out and still on the drawing board. "Buy our GPU today because in 3 years we might have something cool that is exclusive to our cards" is a tough sales pitch.
Why not? It worked well for Nvidia with the "RTX" 2000 series partial rebrand. Not like those cards could ray trace or upscale all that well. And although reviewers and consumers balked at Nvidia's blatant price-raising schemes, people still bought in some volume eventually, justifying the purchase because of the then largely unimplemented new tech features.
 
Yeah, but they landed with some game titles using it out of the gate, enough to at least keep people interested and talking about it.
Had they released it into a vacuum, with nothing using the features and a story of "we will see developers launching titles with this in 3-5 years," that would be another thing entirely.
 
This is a title that AMD does very well in, I think; it could match a 4080 Super without being stronger than a 7900 XT, according to this:
https://www.techpowerup.com/review/call-of-duty-black-ops-6-fps-performance-benchmark/5.html

which is pretty much where speculation and AMD's marketing material always put it. AMD kind of told us where it would land in a fairly precise way, at least for the upper best-case scenario:

[attached image: amdnaming.png]

God, this sounds like going back to the old "PR" rating system they used to use on CPUs vs. Intel.
 
Speaking of mobile parts - have we seen anything about mobile gaming GPUs from AMD (i.e. non-CPU embedded)? I know they basically skipped last gen.

They have Strix and it looks great.
 
I think that about sums it up.
They got sick of OEMs sticking any full AMD CPU + GPU combination in the value lineup, then selling the high-end SKUs as AMD or Intel + Nvidia. I'm not saying it doesn't/didn't make sense for OEMs who can sell on Nvidia marketing as well. It must be frustrating for AMD, though, to have your laptop parts always stuck in the low-quality lines.

AMD took the choice away from OEMs. Strix is Strix, and it makes zero sense to build parts with Nvidia graphics + Strix. It cuts off a competitor and keeps their OEMs from shitting up their name.
 
The 7900 XTX came after the 6900 XT and 6950 XT. It has zero to do with being faster or slower than a 4090, and to think it does is a really odd way of thinking; it was just the logical naming scheme after the 6000 series. RDNA 4 naming is a different kettle of fish because of possible confusion with the Ryzen AI series, so they skipped the 8000 numbering.

Having "AI" in the product name is the most retarded thing they've done yet from a branding perspective.
 
Having "ai" in the product name is the most retarded thing they've done yet from a branding perspective.

I think it’s to differentiate those parts from Kracken, which doesn’t support it.
 