AMD Radeon RX 9070 XT tipped to launch alongside FSR4 and Ryzen 9000X3D in late January

Leaks indicate around 7900 XT raster performance

RT / PT / FSR 4 remain open questions
 
It's getting close. I'd prefer it as an APU, not as a mobile device.
Meh, there's a lot of crossover. I play heavily modded Minecraft on a 7735HS and get well over 100 fps. All those Chinese companies are putting HS and U chips on mini PC motherboards.

And AMD's getting closer every generation they actually update the iGPU. That 7735HS can *almost* do 1080p Diablo IV at playable frame rates. I mean, 20-25 fps is choppy, but the 680M is a couple of generations old now, and I suspect the 16+ CU iGPUs probably do 30+.
 
Is there a picture of all the engineers and software devs assembled that you're looking at?
Mostly their management of the Radeon brand.

I miss the days of the 7970, when you could get a top-end red team card that traded blows with and even surpassed the top-end Nvidia cards, and had great features and technology that was ahead of Nvidia...
 
And today's non-announcement didn't help... but we got 150+ mentions of AI
Pathetic showing from AMD in regard to RDNA 4. Was hoping they might actually learn something and take charge, but nope. Like the chicken shits they are, they waited for Nvidia to announce and dictate pricing for the next gen so that, presumably, AMD can go "cool, Nvidia's price minus $50" again. Yep... that'll gain them market share... :ROFLMAO:
 
Mostly their management of the Radeon brand.

I miss the days of the 7970, when you could get a top-end red team card that traded blows with and even surpassed the top-end Nvidia cards, and had great features and technology that was ahead of Nvidia...

That wasn't all that long ago; the 6000 series, in raster at least, traded blows with and surpassed the Nvidia top end. They were just behind in RT. Though overall, RDNA was a misstep for them, hence the swing back to a unified setup.
 
That wasn't all that long ago; the 6000 series, in raster at least, traded blows with and surpassed the Nvidia top end. They were just behind in RT. Though overall, RDNA was a misstep for them, hence the swing back to a unified setup.
That's really their big problem, isn't it? They have several bad gens, then one good gen, followed by several bad gens again, and then it's "back to the drawing board". They need like three good gens in a row for anyone to take them seriously. RDNA 2 is really the only great gen they've had in recent memory. Polaris I guess sold well? Lol. RDNA 1 was OK, but didn't get a lot of traction, especially with the bogus price they announced it with. They lost market share with RDNA 3. If I recall, Vega didn't do all that great in the market either, which is when they ditched GCN for RDNA.
 
Mostly their management of the Radeon brand.

I miss the days of the 7970, when you could get a top-end red team card that traded blows with and even surpassed the top-end Nvidia cards, and had great features and technology that was ahead of Nvidia...
And they still lost market share. I keep saying that the market is bonkers because it is. AMD can't take risks anymore on the GPU side because they are just going to lose money.
 
And they still lost market share. I keep saying that the market is bonkers because it is. AMD can't take risks anymore on the GPU side because they are just going to lose money.
And now they are in a situation where they are perpetually on the back foot.
Even if AMD were to develop some cool new tech, their market share is too small for developers to focus on it, whereas if Nvidia does, their lead is so commanding that developers would be fools not to.
Furthermore, Nvidia seems to have put some serious forethought into their plans, maintaining backwards compatibility with their exclusive/proprietary solutions.
 
Even if AMD were to develop some cool new tech, their market share is too small for developers to focus on it,
Depends where we are in the console cycle, I imagine; if the PS6 has said cool new tech, developers would focus on it.
 
And they still lost market share. I keep saying that the market is bonkers because it is. AMD can't take risks anymore on the GPU side because they are just going to lose money.
Simply put, Nvidia was spending TONS of money putting teams of people in the trenches with developers, designing tools, doing graphics research, single-handedly funding entire R&D and development teams.

I know in 2010 AMD didn't have a lot of money, but AMD is a larger company now than Nvidia was then. They have no excuse not to be putting an AMD rep in every game dev studio, designing cutting-edge tools and workflows, creating hardware based on developer needs...
 
Depends where we are in the console cycle, I imagine; if the PS6 has said cool new tech, developers would focus on it.
Yeah, but that is still some three years out and still on the drawing board. "Buy our GPU today because in three years we might have something cool that is exclusive to our cards" is a tough sales pitch.
 
And now they are in a situation where they are perpetually on the back foot.
Even if AMD were to develop some cool new tech, their market share is too small for developers to focus on it, whereas if Nvidia does, their lead is so commanding that developers would be fools not to.
Furthermore, Nvidia seems to have put some serious forethought into their plans, maintaining backwards compatibility with their exclusive/proprietary solutions.

Not true. Developers have to focus on their tech because the consoles are AMD-powered. Unless that changes, developers will have to care about AMD tech on the GPU side.
 
Depends where we are in the console cycle, I imagine; if the PS6 has said cool new tech, developers would focus on it.
The PS5 Pro literally has this tech. I get that we, as enthusiasts, are disappointed - it is not the "moAr PoWer" that we demand - but AMD is improving in meaningful ways. They've learned raster brute forcing is not the future and are pivoting.

https://www.techradar.com/gaming/ps5/what-is-pssr-explained

I know this is an extension of what has been done on the PC side, but this is novel in the console space, which is actually very meaningful for the larger industry.

Mark Cerny (architect of the PlayStation) essentially thinks raster is the past - https://www.tomshardware.com/video-...n-debunked-by-ps5-system-architect-mark-cerny (long video linked).
 
And they still lost market share. I keep saying that the market is bonkers because it is. AMD can't take risks anymore on the GPU side because they are just going to lose money.
The 7970 was slower, hotter, and cost $50 more than the GTX 680 ($500 vs. $550 for the 7970, and both were easy to get at MSRP soon after launch). It also had no frame metering like the Nvidia GTX 680, so CrossFire was a mess of frametimes looking like ketchup on a graph compared to the smooth Nvidia SLI frametimes. SLI was popular back then, and I had two in SLI.
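
For anyone who never saw the "ketchup graph" era: frame metering just means the driver holds a finished frame briefly so presents land at even intervals instead of whenever each GPU in an AFR pair happens to finish. A toy sketch of the idea, purely illustrative (made-up timings, not Nvidia's actual driver code):

```cpp
#include <chrono>
#include <iostream>
#include <thread>

// Toy model of frame metering: hold each finished frame until the target
// interval has elapsed, so presents are evenly spaced even when the two
// GPUs in an AFR pair finish at wildly different times.
int main() {
    using clock = std::chrono::steady_clock;
    const auto target = std::chrono::milliseconds(16); // ~60 fps pacing
    auto last = clock::now();
    for (int frame = 0; frame < 6; ++frame) {
        // Simulated alternating fast/slow GPU completion (micro-stutter).
        std::this_thread::sleep_for(std::chrono::milliseconds(frame % 2 ? 4 : 12));
        // Metering: wait out the remainder of the interval before presenting.
        std::this_thread::sleep_until(last + target);
        auto now = clock::now();
        std::cout << "present " << frame << " after "
                  << std::chrono::duration_cast<std::chrono::milliseconds>(now - last).count()
                  << " ms\n";
        last = now;
    }
}
```

Without the metering step, the presents would land 12 ms and 4 ms apart in alternation; with it, every present lands about 16 ms apart, at the cost of a little latency.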
 
Not true. Developers have to focus on their tech because the consoles are AMD-powered. Unless that changes, developers will have to care about AMD tech on the GPU side.
And what changes has AMD brought to the table so far that developers have had to focus on?
 
The PS5 Pro literally has this tech
We'll have to see how plug-and-play PSSR support is with FSR 4 (and similar tech). Regardless, my guess is FSR 4 will work out of the box with all FSR 3 titles, as it would need the same inputs to work, a bit like how the new transformer-based DLSS seems to work on all the previous titles because it uses the same inputs as the previous CNN-type solution.

I think Lakados' point was more about cool tech that requires new work from the game engine; that's where market share really matters.
 
We'll have to see how plug-and-play PSSR support is with FSR 4 (and similar tech). Regardless, my guess is FSR 4 will work out of the box with all FSR 3 titles.

I think Lakados' point was more about cool tech that requires new work from the game engine.
Ahhh.

The interesting part is PSSR and other tech via the PS5 Pro applying to PS4 games and "just working"; this is what gets me excited for the future of this technology: improving older titles with very little dev interaction.

I am such a nerd I bought a PS5 Pro early last month because I was bored with the current gen GPUs. It's fun. Really cool console (haven't had a PlayStation since 4).
 
And what changes has AMD brought to the table so far that developers have had to focus on?

They put ray tracing in the consoles; without that, developers would not have launched the Indiana Jones game requiring the use of it. However, the developers have to be mindful of what it excels at and what it does not. It's not so much that AMD is pushing new tech; it's what tech they decide to put into the consoles. Obviously this is an area that will improve in the next generation of consoles. Consoles will always set the bar that developers have to make sure can run the game. So neither what Nvidia decides to do nor the fact that they dominate the discrete GPU market in PCs really changes that. Nvidia may dominate the PC world, but they can't dominate the game developers, since they do not control the console market.
 
The 7970 was slower, hotter, and cost $50 more than the GTX 680. It also had no frame metering like the Nvidia GTX 680, so CrossFire was a mess of frametimes looking like ketchup on a graph compared to the smooth Nvidia SLI frametimes. SLI was popular back then, and I had two in SLI.
Thinking of the 690, perhaps, in regards to performance? The 7970 was more or less on par with the 680, and then the GHz edition of the 7970 remedied the price situation.
https://www.techpowerup.com/review/amd-hd-7970-ghz-edition/
https://www.techpowerup.com/review/palit-geforce-gtx-680-jetstream/28.html
 
Thinking of the 690, perhaps, in regards to performance? The 7970 was more or less on par with the 680, and then the GHz edition of the 7970 remedied the price situation.
https://www.techpowerup.com/review/amd-hd-7970-ghz-edition/
https://www.techpowerup.com/review/palit-geforce-gtx-680-jetstream/28.html
No. The 680 and original 7970. The 7970 was reliably 5% or so slower across the board.

https://www.techpowerup.com/review/nvidia-geforce-gtx-680/

The 7970 GHz came several months after the 7970, matched Nvidia's MSRP, and beat it by several percent. By then, everyone had bought GTX cards or the original 7970 for the generation.
 
That's really their big problem, isn't it? They have several bad gens, then one good gen, followed by several bad gens again, and then it's "back to the drawing board". They need like three good gens in a row for anyone to take them seriously. RDNA 2 is really the only great gen they've had in recent memory. Polaris I guess sold well? Lol. RDNA 1 was OK, but didn't get a lot of traction, especially with the bogus price they announced it with. They lost market share with RDNA 3. If I recall, Vega didn't do all that great in the market either, which is when they ditched GCN for RDNA.

Vega was just an oddity; I think I read somewhere they lost money on every card they sold, and gaming was an afterthought for it, though it was promoted by Raja and co. as being the holy grail. Even their promo video for it comes off as really weird, like it was blasting the doors off anything Nvidia had, when in reality it was OK at best, nothing special, despite an 11-minute promo video making it out to be something really remarkable.


https://www.youtube.com/watch?v=uxVzDQtHzqo&ab_channel=AMD

RDNA 1 were decent cards with stupid pricing.

RDNA 2 were very good cards, but hard to come by for the most part due to COVID.

RDNA 3, the first desktop gaming chiplet GPU, obviously didn't work out how they planned. I have to wonder: if they had stayed monolithic, would they have been able to put out just as good a GPU as RDNA 2? RDNA 3 was interesting for being the first gaming chiplet GPU, but it fell short of expectations. I think the hype around the chiplet setup gave people unrealistic expectations of what it could do, though there's no doubt that even to AMD internally it was likely not what they thought it would be.
 
Vega was just an oddity; I think I read somewhere they lost money on every card they sold, and gaming was an afterthought for it, though it was promoted by Raja and co. as being the holy grail. Even their promo video for it comes off as really weird, like it was blasting the doors off anything Nvidia had, when in reality it was OK at best, nothing special, despite an 11-minute promo video making it out to be something really remarkable.


https://www.youtube.com/watch?v=uxVzDQtHzqo&ab_channel=AMD

RDNA 1 were decent cards with stupid pricing.

RDNA 2 were very good cards, but hard to come by for the most part due to COVID.

RDNA 3, the first desktop gaming chiplet GPU, obviously didn't work out how they planned. I have to wonder: if they had stayed monolithic, would they have been able to put out just as good a GPU as RDNA 2? RDNA 3 was interesting for being the first gaming chiplet GPU, but it fell short of expectations. I think the hype around the chiplet setup gave people unrealistic expectations of what it could do, though there's no doubt that even to AMD internally it was likely not what they thought it would be.

RDNA 3's chiplet design needed a better TSMC packaging process that never appeared; they were working on it internally but ultimately scrapped it because of costs and failure rates.
TSMC redesigned it and ultimately launched it, but Nvidia's Blackwell is the first production product using it, and they are having one hell of a time with it.
 
We'll have to see how plug-and-play PSSR support is with FSR 4 (and similar tech). Regardless, my guess is FSR 4 will work out of the box with all FSR 3 titles, as it would need the same inputs to work, a bit like how the new transformer-based DLSS seems to work on all the previous titles because it uses the same inputs as the previous CNN-type solution.
The new hardware will work with FSR 4 on titles that currently have FSR 3.1 support; otherwise it will fall back to using the version packaged with the game.
And because of how FSR's update process works, any updates there require the developer to issue a game patch.
Fortunately, there is a growing list of FSR 3.1-supported titles, so there is that at the very least.
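
A minimal sketch of that fallback decision as described, with hypothetical names (this is not AMD's actual driver API):

```cpp
#include <iostream>
#include <string>

// Hypothetical model of the upgrade path described above: the driver can
// only swap in FSR 4 when the title ships the FSR 3.1 DLL interface;
// anything older keeps whatever upscaler version the game bundles, and
// updating that still takes a developer-issued game patch.
enum class Upscaler { Fsr4Override, PackagedFsr };

Upscaler pickUpscaler(const std::string& bundledFsrVersion, bool rdna4Gpu) {
    if (rdna4Gpu && bundledFsrVersion == "3.1")
        return Upscaler::Fsr4Override;
    return Upscaler::PackagedFsr;
}

int main() {
    std::cout << (pickUpscaler("3.1", true) == Upscaler::Fsr4Override
                      ? "FSR 3.1 title on RDNA 4: FSR 4 override\n"
                      : "packaged FSR\n");
    std::cout << (pickUpscaler("3.0", true) == Upscaler::Fsr4Override
                      ? "FSR 4 override\n"
                      : "FSR 3.0 title: falls back to the packaged version\n");
}
```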
 
The 7970 was slower, hotter, and cost $50 more than the GTX 680 ($500 vs. $550 for the 7970, and both were easy to get at MSRP soon after launch). It also had no frame metering like the Nvidia GTX 680, so CrossFire was a mess of frametimes looking like ketchup on a graph compared to the smooth Nvidia SLI frametimes. SLI was popular back then, and I had two in SLI.
It actually went up against the 580, not the 680; it predated the 680 by four months, so for four months AMD had the highest-performing single-GPU part. It beat the 580 by 15% and was just 19% slower than the 590, which was a dual-GPU card, and its (GTX 590) performance was anything but exceptional. Back then, CrossFire/SLI was hit or miss and not consistent at all. AMD's problem is execution and marketing, not tech.
 
It actually went up against the 580, not the 680. It beat the 580 by 15% and was just 19% slower than the 590, which was a dual-GPU card, and its (GTX 590) performance was anything but exceptional. Back then, CrossFire/SLI was hit or miss and not consistent at all. AMD's problem is execution and marketing, not tech.
The GTX 680 released a couple of months after the 7970; that was its generational competitor. GTX 680 SLI had frame metering, and SLI support was nearly ubiquitous among demanding games. I had it. The 680 was faster and cheaper than the 7970, as well as running cooler and quieter.
 
The GTX 680 released a couple of months after the 7970; that was its generational competitor. GTX 680 SLI had frame metering, and SLI support was nearly ubiquitous among demanding games. I had it. The 680 was faster and cheaper than the 7970, as well as running cooler and quieter.
Not two, but four months after. It should have gained market share in that time. It didn't. This notion that AMD has never beaten Nvidia in performance, and especially in performance per dollar, isn't true. The performance difference you're trying to portray as great just wasn't; 5% is virtually meaningless considering the 680 launched four months later, and if I recall, that 680 launch was horrible. It took Nvidia quite some time before cards actually hit the shelves. (Looked it up, and the 680 didn't really become available until many months later.)
 
The GTX 680 released a couple of months after the 7970; that was its generational competitor. GTX 680 SLI had frame metering, and SLI support was nearly ubiquitous among demanding games. I had it. The 680 was faster and cheaper than the 7970, as well as running cooler and quieter.

https://youtu.be/R0PVs4kYTiE?si=L8ngssKjE5pBPpm2

As per the link above, the 7970 aged better in the long run than the 680. In reality, the 680 and 7970 were very close in performance even at launch; part of the reason was that the 7970 simply could overclock better. That's also why AMD had about 40% market share back then. But one thing people don't want to mention is that both companies sold way more cards back in the day, until they started charging over 600 bucks for a video card; then profits went up, but the number of sales cratered. The price now is simply too high and will continue to crater the PC gaming market; most people are unwilling to spend more than 500 bucks or so on a video card and have hung on to their very old cards. Unless that changes, the bar for PC gaming can only go so high when most of the market is running such old hardware. I mean, the Steam hardware survey tells the story: almost four times as many 1650 cards as 4090s (4.50% / 1.18% ≈ 3.8×). So yeah, the 4090 sold well for a halo card, but it's still just a small drop in the bucket.

NVIDIA GeForce RTX 4090 1.18%
NVIDIA GeForce GTX 1650 4.50%
 

https://youtu.be/R0PVs4kYTiE?si=L8ngssKjE5pBPpm2

As per the link above, the 7970 aged better in the long run than the 680. In reality, the 680 and 7970 were very close in performance even at launch; part of the reason was that the 7970 simply could overclock better. That's also why AMD had about 40% market share back then. But one thing people don't want to mention is that both companies sold way more cards back in the day, until they started charging over 600 bucks for a video card; then profits went up, but the number of sales cratered. The price now is simply too high and will continue to crater the PC gaming market; most people are unwilling to spend more than 500 bucks or so on a video card and have hung on to their very old cards. Unless that changes, the bar for PC gaming can only go so high when most of the market is running such old hardware. I mean, the Steam hardware survey tells the story: almost four times as many 1650 cards as 4090s (4.50% / 1.18% ≈ 3.8×). So yeah, the 4090 sold well for a halo card, but it's still just a small drop in the bucket.

NVIDIA GeForce RTX 4090 1.18%
NVIDIA GeForce GTX 1650 4.50%


AMD, believe it or not, is in a good position right now to gain market share... if two things are true. First, that the 9070 is a good card, meaning both the XT and non-XT match or hopefully slightly best the 5070s in raster, and the RT is fast enough to be usable. Second, that AMD is actually willing to fight on pricing to gain market share for real.

If the 9070 is at the upper range of the rumors, then AMD needs to make a hardcore case for Radeon going forward. They can't screw around with $699/$499 pricing; Nvidia minus 50 bucks bleeds another half of the market share they have left. On the other hand, if AMD says screw it, let's roll the dice, let's lose 50 bucks a card this gen: $550-600 for the XT (based on where they really are vs. the 5070), $400 for the non-XT. Go hard. Make a 16GB 9070 non-XT $399.99; make the idea of Intel releasing a higher-end Battlemage stupid. Make Nvidia spec up or discount the hell out of a 5060, if they are even planning one.

The next gen is the unification of RDNA and CDNA; it's happening anyway, and all that gen's R&D can be applied to data center / iGPU / console products anyway. dGPU can live or die on this gen. Either the market share goes up a reasonable amount after selling at or below cost on two SKUs, or you walk away. Say screw it, PC gaming with dGPUs is dead; have fun, Nvidia is about to make you all hurt real bad. Buy our console-like Strix follow-up next gen... and load SteamOS on it. At that point at least AMD would be making money, in a market they are no doubt going to dominate. They also will want to triple down on the R&D in that area before Nvidia comes in with their own consumer ARM/NV play.
 
AMD, believe it or not, is in a good position right now to gain market share... if two things are true. First, that the 9070 is a good card, meaning both the XT and non-XT match or hopefully slightly best the 5070s in raster, and the RT is fast enough to be usable. Second, that AMD is actually willing to fight on pricing to gain market share for real.

If the 9070 is at the upper range of the rumors, then AMD needs to make a hardcore case for Radeon going forward. They can't screw around with $699/$499 pricing; Nvidia minus 50 bucks bleeds another half of the market share they have left. On the other hand, if AMD says screw it, let's roll the dice, let's lose 50 bucks a card this gen: $550-600 for the XT (based on where they really are vs. the 5070), $400 for the non-XT. Go hard. Make a 16GB 9070 non-XT $399.99; make the idea of Intel releasing a higher-end Battlemage stupid. Make Nvidia spec up or discount the hell out of a 5060, if they are even planning one.

The next gen is the unification of RDNA and CDNA; it's happening anyway, and all that gen's R&D can be applied to data center / iGPU / console products anyway. dGPU can live or die on this gen. Either the market share goes up a reasonable amount after selling at or below cost on two SKUs, or you walk away. Say screw it, PC gaming with dGPUs is dead; have fun, Nvidia is about to make you all hurt real bad. Buy our console-like Strix follow-up next gen... and load SteamOS on it. At that point at least AMD would be making money, in a market they are no doubt going to dominate. They also will want to triple down on the R&D in that area before Nvidia comes in with their own consumer ARM/NV play.
I want AMD to do so well.

I want it in my BONES.

To this day, my HD 7970 (OC'd to 1.2 GHz) was one of the best cards I ever owned. Lasted WAY longer than it ever should have.

Only when I was given a Titan X (Maxwell) did I replace it.

But AMD's recent efforts since then have been a whole lot of "Me Too" and "Hey wait for me!" and "but but but I can do THIS"

I thought Raja Koduri leaving would mean they would finally do good things. I thought they'd find a way to make chiplet methodology work to REDUCE the price of high-end cards and have their "Ryzen" moment, blowing the doors off the performance ceilings imposed by die yield limits.

But no, it's just pathetic farts of products with no complete stack, confusing lineups, last-generation performance, a lack of ANY confidence, and marketing chest-puffing (remember when they said they COULD have made a card to compete with the 4090, they just didn't want to? Yeah, fucking SURE). And now they sit in their cuck chair watching Nvidia fuck us all, politely asking Nvidia how much the 5070 will cost so they can make sure to price their product JUST low enough that they make as much profit as possible without making daddy Nvidia mad. They even copied Nvidia's naming because they KNOW they're just a fucking bargain-bin "second choice" in ANY comparison.


Makes me want to puke. Radeon is such a cuck brand now.
 
I want AMD to do so well.

I want it in my BONES.

To this day, my HD 7970 (OC'd to 1.2 GHz) was one of the best cards I ever owned. Lasted WAY longer than it ever should have.

Only when I was given a Titan X (Maxwell) did I replace it.

But AMD's recent efforts since then have been a whole lot of "Me Too" and "Hey wait for me!" and "but but but I can do THIS"

I thought Raja Koduri leaving would mean they would finally do good things. I thought they'd find a way to make chiplet methodology work to REDUCE the price of high-end cards and have their "Ryzen" moment, blowing the doors off the performance ceilings imposed by die yield limits.

But no, it's just pathetic farts of products with no complete stack, confusing lineups, last-generation performance, a lack of ANY confidence, and marketing chest-puffing (remember when they said they COULD have made a card to compete with the 4090, they just didn't want to? Yeah, fucking SURE). And now they sit in their cuck chair watching Nvidia fuck us all, politely asking Nvidia how much the 5070 will cost so they can make sure to price their product JUST low enough that they make as much profit as possible without making daddy Nvidia mad. They even copied Nvidia's naming because they KNOW they're just a fucking bargain-bin "second choice" in ANY comparison.


Makes me want to puke. Radeon is such a cuck brand now.

Agreed completely. AMD runs so damn smooth on Linux these days I can't imagine going Nvidia. All the "Nvidia minus 50 bucks" stuff, though, is starting to get annoying as an AMD fan. I hope the announcement delay is them arguing over how much money they are willing to lose this gen to fight back. I am fully prepared, though, for the letdown announcement in a week or two of $699 and $499 and somewhat-sort-of-almost-as-good cards. lol
 
I bought my two GTX 680 cards in Feb 2012. The Radeon 7970 launched in Dec 2011.

EDIT: two months later, as I said. Yes, cards can be hard to source at launch; always have been: https://www.techpowerup.com/review/nvidia-geforce-gtx-680/
I believe the GTX 680 launched in March 2012.

I easily purchased my EVGA GTX 670 FTW on June 18, 2012 (found the order invoice).

I wasn't watching these especially hard after they launched in May. June was just the point where I needed a new card for my new build; that's what I decided on, and it was easily and readily available at the time, about a month after launch.
 
I believe the GTX 680 launched in March 2012.

I easily purchased my EVGA GTX 670 FTW on June 18, 2012 (found the order invoice).

I wasn't watching these especially hard after they launched in May. June was just the point where I needed a new card for my new build; that's what I decided on, and it was easily and readily available at the time, about a month after launch.
Hm, seems I was mistaken. You're right.

EDIT: from my email...

Sales Order Date: 3/21/2012
 
Agreed completely. AMD runs so damn smooth on Linux these days I can't imagine going Nvidia. All the "Nvidia minus 50 bucks" stuff, though, is starting to get annoying as an AMD fan. I hope the announcement delay is them arguing over how much money they are willing to lose this gen to fight back. I am fully prepared, though, for the letdown announcement in a week or two of $699 and $499 and somewhat-sort-of-almost-as-good cards. lol
The problem is they make more money from the cuck chair than they do actually fighting for market share.

They're the little birds picking food from the crocodile's teeth, and that's all they'll ever be now.

Those that accept submissive, subservient, self-deprecating roles because it's easier than fighting in battle were called cowards back in the day.
 
The problem is they make more money from the cuck chair than they do actually fighting for market share.

They're the little birds picking food from the crocodile's teeth, and that's all they'll ever be now.

Those that accept submissive, subservient, self-deprecating roles because it's easier than fighting in battle were called cowards back in the day.

We'll see; the ball is in their court to launch their two cards at prices that aren't just barely under the cost of Nvidia's cards.
 
The problem is they make more money from the cuck chair than they do actually fighting for market share.

They're the little birds picking food from the crocodile's teeth, and that's all they'll ever be now.

Those that accept submissive, subservient, self-deprecating roles because it's easier than fighting in battle were called cowards back in the day.
At some point they aren't selling enough to even be in the room... they are going to have to get out of that chair eventually and get into party mode, or exit the room. LOL

At this point, I'm not going to lie, Fire Range made me way more excited than anything else. I'm looking forward to Zen 6; I'm sure the next step will be to smash Strix Halo and Fire Range into one super-chip part. At the rate PC gaming is going... I think I will be happy with a mini PC with a Zen 6 Halo X3D 16-core + a 40-or-so-CU GPU SoC.
 
We'll see; the ball is in their court to launch their two cards at prices that aren't just barely under the cost of Nvidia's cards.
I wish I had your optimism.

But the ball, court, crowd, both teams, entire stadium, parking lot, city, state, and country belong to Nvidia. If AMD tried to compete now, it would piss in the cornflakes of the sweet gig they have: farting out mediocre products and earning a shit-ton from them simply because Nvidia prices their products so high.
 
At some point they aren't selling enough to even be in the room... they are going to have to get out of that chair eventually and get into party mode, or exit the room. LOL

At this point, I'm not going to lie, Fire Range made me way more excited than anything else. I'm looking forward to Zen 6; I'm sure the next step will be to smash Strix Halo and Fire Range into one super-chip part. At the rate PC gaming is going... I think I will be happy with a mini PC with a Zen 6 Halo X3D 16-core + a 40-or-so-CU GPU SoC.
You mean the AMD Ryzen AI MAX + X PLUS 380HQXT+ ?
 