AMD Radeon RX 9070 XT tipped to launch alongside FSR4 and Ryzen 9000X3D in late January

The PS5 Pro literally has this tech. I get that we, as enthusiasts, are disappointed - it is not the "moAr PoWer" that we demand - but AMD is improving in meaningful ways. They've learned raster brute forcing is not the future and are pivoting.

https://www.techradar.com/gaming/ps5/what-is-pssr-explained

I know this is an extension of what has been done on the PC side - but this is novel in the console space which is actually very meaningful for the larger industry.

Mark Cerny (architect of the PlayStation) essentially thinks raster is the past - https://www.tomshardware.com/video-...n-debunked-by-ps5-system-architect-mark-cerny (long video linked).
The PS5 Pro is using PSSR, which was developed internally by Sony. It's unclear whether FSR4 can run as-is on the PS5 Pro, since the ML hardware was spec'd by Sony. They have been pretty clear that AMD had no involvement in PSSR and only supplied hardware based on Sony's needs.
 
RDNA's chiplet design needed a better TSMC packaging process that never appeared; they were working on it internally but ultimately scrapped it because of cost and failure rate.
TSMC redesigned it and ultimately launched it, but Nvidia Blackwell is the first production product using it, and they are having one hell of a time with it.
Blackwell is monolithic, unless the rumors/leaks were incorrect?
 
The PS5 Pro is using PSSR, which was developed internally by Sony. It's unclear whether FSR4 can run as-is on the PS5 Pro, since the ML hardware was spec'd by Sony. They have been pretty clear that AMD had no involvement in PSSR and only supplied hardware based on Sony's needs.
There is no ML hardware in PS5 pro.

They just repurposed the registers
 
There is no ML hardware in PS5 pro.

They just repurposed the registers
OK, well, whatever it is about the PS5 Pro that allows it to run algorithms derived from ML - which is really what this is all about. These upscalers are all ML-derived algorithms.
 
Having "ai" in the product name is the most retarded thing they've done yet from a branding perspective.

Jumping on a buzzword - Intel is going on about AI PCs, etc. Jensen the other night used the term somewhere near half a million times on stage. Won't be long before Intel has it in the name of a CPU.

I think it's stupid as well, but they obviously don't.
 
From my very limited understanding, PSSR is quite different from FSR 4 or DLSS, maybe motivated by the need to run on already-released games (and taking advantage of the fact that a PlayStation game wasn't outputting 4K already). It seems to take the rendered image and upscale it, without being fed the motion vectors, depth map, and other info that FSR/DLSS use?
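
Roughly the difference I'm picturing, as a toy Python sketch (all names are made up for illustration, not Sony's or AMD's actual APIs, and I'm only guessing at how PSSR works):

import numpy as np

# frame: HxWxC float array, motion_vectors: HxWx2 pixel offsets,
# history: HxWxC previous output, upscaler: any callable model.

def spatial_upscale(frame, upscaler):
    # Image-only upscaling: the model sees just the rendered frame.
    return upscaler(frame)

def temporal_upscale(frame, motion_vectors, history, upscaler):
    # DLSS/FSR-style temporal upscaling: the previous output is reprojected
    # with per-pixel motion vectors so the model can accumulate detail across
    # frames. (A real implementation would also use the depth buffer to reject
    # disoccluded pixels; that part is omitted in this toy version.)
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    reprojected = history[src_y, src_x]
    return upscaler(np.concatenate([frame, reprojected], axis=-1))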
 
The PS5 Pro is using PSSR, which was developed internally by Sony. It's unclear whether FSR4 can run as-is on the PS5 Pro, since the ML hardware was spec'd by Sony. They have been pretty clear that AMD had no involvement in PSSR and only supplied hardware based on Sony's needs.
Never said it has FSR. The point is it uses AI-based upscaling heavily - which is the future. RT performance is significantly enhanced.
 
Having "ai" in the product name is the most retarded thing they've done yet from a branding perspective.
Guarantee you OEMs / Microsoft want to see it. I don't think there's a single product releasing in 2025 that doesn't mention AI somewhere.
 
Everything has to have AI tacked on so the stocks can pump. Kind of like everything was .com back in late 1999 / early 2000 even if the company had no real online component or value-add. Then all those stocks kept performing really well and the economy did great. I might be mis-remembering some of that...
 
I think a lot of people haven't read up much on PSSR - which is understandable...it's a console tech. The reason I got excited about the PS5 Pro is it uses some of the next gen AMD GPU hardware coming out this month.

https://www.ign.com/articles/mark-cerny-on-ps5-pro-flopflation-and-playstations-partnership-with-amd

Sony and AMD are collaborating and learning from the PS5 Pro - it's not about Sony proprietary tech and AMD is "just" the hardware...
Tell me more about the Amethyst partnership. What kind of information is being shared between the companies? And how does this partnership differ from the existing relationship between SIE and AMD with regards to how you've built PS5 on their hardware and previous consoles, what's new?

So, first, I should give the nature of the collaboration. There are two targets we are working on with AMD. One is: better hardware architectures for machine learning. And that is not about creating proprietary technology for PlayStation – the goal is to create something that can be used broadly across PC and console and cloud. The other collaboration is: with regards to these lightweight CNNs for game graphics. So you know, the sorts of things that are used in PSSR and perhaps the sorts of things that would be used in future FSR.

Does that mean that we can expect the findings from that collaboration to be reflected in future AMD hardware that isn't necessarily PlayStation hardware?

Absolutely. This is not about creating proprietary technology or hardware for PlayStation.

AI Upscaler details:

As for the AI upscaler that you're using for PSSR – is that a discrete piece of hardware or is it built into the GPU itself?

We needed hardware that had this very high performance for machine learning. And so we went in and modified the shader core to make that happen. Specifically, as far as what you touch on the software side, there are 44 new machine learning instructions that take a freer approach to register RAM access. Effectively, you're using the register RAM as RAM. And also implement the math needed for the CNNs.

To put that differently, we enhanced the GPU. But we didn't add a tensor unit or something to it.

Yes, Sony created ML capabilities - PSSR is super resolution based on their hardware:
You did also mention the prospect of building it yourself versus buying or outsourcing the technology. Could you elaborate on the thought process there?

One very simple way to look at this is: are we taking the next roadmap AMD technology, or are we, in fact, going in and trying to design the circuitry ourselves – and we chose the latter. And we chose the latter because we really wanted to start working in the space ourselves. It was clear that the future was very ML-driven. And by that, you know, the world's talking about LLMs and generative AI, but I'm really mostly just looking at game graphics and the boost for game graphics we can get. So, based on that, we wanted to be working in that space.

If you're a nerd check out the recent Digital Foundry videos with Cerny and the PS5 Pro - very interesting stuff. Whether you dig consoles or not - just graphics stuff from a guy on the frontlines who has to think a bit differently (frozen hardware for years).
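
To give a rough idea of what a "lightweight CNN for game graphics" even looks like, here's a toy FSRCNN-style 2x super-resolution network in PyTorch. This is purely my own illustration - PSSR's actual architecture isn't public, and Cerny's 44 custom instructions are about how the shader core executes this kind of math, not any particular model:

import torch
import torch.nn as nn

class TinySuperRes(nn.Module):
    # Toy 2x upscaler: a few small convolutions plus a pixel-shuffle layer.
    def __init__(self, channels=3, features=32, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, features, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, features, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, channels * scale * scale, kernel_size=3, padding=1),
        )
        self.upscale = nn.PixelShuffle(scale)  # rearranges channels into a 2x larger image

    def forward(self, low_res):
        return self.upscale(self.body(low_res))

frame = torch.rand(1, 3, 1080, 1920)   # a 1080p frame (batch of 1, RGB)
print(TinySuperRes()(frame).shape)      # torch.Size([1, 3, 2160, 3840]) -> 4K

The point is that the per-frame work is a pile of small convolutions - the kind of dense matrix math Cerny says those new instructions and the register-RAM approach are there to accelerate.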
 
Don't know if this was posted here but just in case.

AMD Radeon RX 9070 expected to start at $479, available as soon as late January

By Zo Ahmed January 10, 2025 at 11:08 AM
"Let's dive into the juiciest rumor from Chiphell: pricing for the RX 9070 XT. According to a post on the forum, AMD might price the reference 9070 XT at $479, with custom third-party models starting around $549. If true, that's a compelling price point for what's expected to be a solid 1440p GPU."
 
Don't know if this was posted here but just in case.

AMD Radeon RX 9070 expected to start at $479, available as soon as late January

By Zo Ahmed January 10, 2025 at 11:08 AM
"Let's dive into the juiciest rumor from Chiphell: pricing for the RX 9070 XT. According to a post on the forum, AMD might price the reference 9070 XT at $479, with custom third-party models starting around $549. If true, that's a compelling price point for what's expected to be a solid 1440p GPU."
I will wait until AMD announces. They could literally have settled on a different price since that rumor, assuming the rumor was even correct at one point in time.
 
I don't see any reason why AMD would price 9070xt cheaper than nvidia if RT is better than 5070

If that is the case, we would be looking at

AIB 9070xt = $600
MBA 9070xt = $550
MBA 9070 = $450 - $480
 
I don't see any reason why AMD would price 9070xt cheaper than nvidia if RT is better than 5070

If that is the case, we would be looking at

AIB 9070xt = $600
MBA 9070xt = $550
MBA 9070 = $450 - $480
Better than a 4070 I would believe, but a 5070? I mean, I would have to see reviews. AMD is usually one generation behind for RT, but who knows - Sony could have lit a fire under them for the PS5 Pro and the RT could be much stronger than most realize. At least on the Pro it's definitely noticeable. Will need to wait for reviews, but I can wait until the end of the month, I think. Overall though, raster has been the most important thing for me, with RT being an afterthought in most cases. I think what id/Bethesda did with Indiana Jones was remarkable because it's the first time there's RT and it doesn't kneecap performance.
 
I don't see any reason why AMD would price 9070xt cheaper than nvidia if RT is better than 5070

If that is the case, we would be looking at

AIB 9070xt = $600
MBA 9070xt = $550
MBA 9070 = $450 - $480
It's not going to be lol, it'll be better in raster, worse in RT. If people want multi-frame gen and more RT, go nvidia, if people want more traditional raster, less RT, go AMD. Hopefully FSR 4 is good enough people won't care about traditional DLSS 4. We'll see though. It'd be nice to have some actual competition for the first time in a while.
 
It's not going to be lol, it'll be better in raster, worse in RT. If people want multi-frame gen and more RT, go nvidia, if people want more traditional raster, less RT, go AMD. Hopefully FSR 4 is good enough people won't care about traditional DLSS 4. We'll see though. It'd be nice to have some actual competition for the first time in a while.
Hopefully FSR4 upscaling is good. Early first impressions from the show sounded promising.
 
Hopefully FSR4 upscaling is good. Early first impressions from the show sounded promising.
I am guessing that since FSR 4 is new there will be some glitches, but it should improve with time. This could be the reason AMD is taking its time with the launch.
 
Having "ai" in the product name is the most retarded thing they've done yet from a branding perspective.
If it's stupid and it works, it ain't stupid.

"Loathing" may be too small a word for me to describe how I feel about this sort of branding. However, at the moment the "ai" label is the easiest way to recognize amd's current top mobile processors.
 
If it's stupid and it works, it ain't stupid.

"Loathing" may be too small a word for me to describe how I feel about this sort of branding. However, at the moment the "ai" label is the easiest way to recognize amd's current top mobile processors.
There are articles about how many people are turned off by "ai" in the product name. That's why it sounds stupid to have used it for their cpus.
 
Blackwell is monolithic, unless the rumors/leaks were incorrect?
The B100 is multi chip.
 
starting at $550 for AIB would be very similar to 5070
I would not be shocked in the slightest if the 9070 is a better value than the 5070. The 5070 12GB starts at $550. If this rumor is true, AMD will be selling the 16GB 9070 XT for $80 less than NV, and they will presumably be selling a 9070 non-XT, also with 16GB, for less than that. According to the AMD guys, the performance rumors floating around are all low.

As always, wait for benchmarks. I am pretty sure the 5070 is going to be a massive disappointment... and AMD might be trying on purpose, as stupid as it might sound, to derail the hype as hard as they possibly can. I hope that means they want a big impact when they over-deliver. We all get to speculate for the next few weeks until reviews actually hit, no matter what.
 
I would not be shocked in the slightest if the 9070 is a better value than the 5070. The 5070 12GB starts at $550. If this rumor is true, AMD will be selling the 16GB 9070 XT for $80 less than NV, and they will presumably be selling a 9070 non-XT, also with 16GB, for less than that. According to the AMD guys, the performance rumors floating around are all low.

As always, wait for benchmarks. I am pretty sure the 5070 is going to be a massive disappointment... and AMD might be trying on purpose, as stupid as it might sound, to derail the hype as hard as they possibly can. I hope that means they want a big impact when they over-deliver. We all get to speculate for the next few weeks until reviews actually hit, no matter what.
Booooooooring. I am bored down through the bone-marrow by amd's lineup (only slightly less by nvidia's, by the way).

Amd needs something to show that their video chips aren't playing third or fourth fiddle to their cpu lineups. Which would be pretty hard, since they obviously are.
 
Booooooooring. I am bored down through the bone-marrow by amd's lineup (only slightly less by nvidia's, by the way).

Amd needs something to show that their video chips aren't playing third or fourth fiddle to their cpu lineups. Which would be pretty hard, since they obviously are.

We are in a recession. Boring is ok. If the 9070 XT really launches under $500, sounds good to me. I am going to assume it sounds good to a lot of other people as well. If they have a 90%-as-good 9070 non-XT for around the $400 mark, maybe better yet.

Games aren't RTing. Half the ones that are are ugly noise fests with zero replay value anyway. I'll be happy to see a good solid raster card that can push 1440p monitors to the top of their refresh range. Not that boring imo. Four fake frames for every real one... ahh, that isn't boring, but it does sound annoying to me. I'll wait till people who aren't in marketing see it and report whether NV has really discovered the promised land... I somehow doubt it. 75% of your game being AI six-finger frames sounds pretty stupid to me. Maybe the reality distortion field over at NV has finally run out of AI.
 
Perhaps this is not the most fitting thread, but I'm pondering whether AI-powered features will be included in games, and whether they'll be accelerated by hardware fitted for that purpose, like the latest GPUs (and, by extension, some Ryzen APUs) are equipped with. I realize that texture compression and decompression can be aided by AI, and perhaps AI can help with managing caches in components, but naturally you'd think performance would vary more from case to case with AI, since an AI largely predicts, though the end result might still be a net positive compared to traditional solutions. Regarding games, though, I'm talking about functional features, like what the team behind PUBG introduced alongside the introduction of Blackwell: this article contains a presentation video as well.

I'm not sure how many players will ever like features that may require speaking out loud to your setup, but with a game-enhancing AI companion I think I personally would be up for it - not totally sure without trying. PUBG is an online competitive game, so it remains to be seen how useful the AI actually is and whether it's genuinely intelligent, since it shouldn't have any in-game view of the map that lets it know beforehand where and what all the item drops are, etc. PUBG aside, games like Mass Effect, a party RPG with a strong focus on commanding your followers, or any similar game with even more units to command, could make great use of AI - even RTSes might benefit.

The question that follows regarding hardware and performance is: how often will such game-integrated AI be accelerated by dedicated AI hardware, like the solutions on GPUs, and how often will the CPU be used instead? I see no mention that the companion in PUBG would require or even benefit from anything in Blackwell. Yet there is the possible online component, from which the AI gets its data, which could create steeper networking requirements, although the AI's data could be stored locally or be a mix of online and local data. Some games may blow up in install size, though, if the AI model is even partly downloaded to local storage.

Amusing questions, to me at least. I'm not sure how ready the RX 7000 series is for the AI future in general, I'd imagine not especially well, while Lovelace is better and Blackwell and RDNA4 are potent (my own prediction). But another question is how soon gaming-related AI features will actually be incorporated to make use of the hardware?
 
We are in a recession. Boring is ok. If the 9070 XT really launches under $500, sounds good to me. I am going to assume it sounds good to a lot of other people as well. If they have a 90%-as-good 9070 non-XT for around the $400 mark, maybe better yet.

Games aren't RTing. Half the ones that are are ugly noise fests with zero replay value anyway. I'll be happy to see a good solid raster card that can push 1440p monitors to the top of their refresh range. Not that boring imo. Four fake frames for every real one... ahh, that isn't boring, but it does sound annoying to me. I'll wait till people who aren't in marketing see it and report whether NV has really discovered the promised land... I somehow doubt it. 75% of your game being AI six-finger frames sounds pretty stupid to me. Maybe the reality distortion field over at NV has finally run out of AI.
Agreed. As for the distortion field, not likely :p. DLSS has been "better than native" and there have been obvious flaws. Their compression looks alright for the ratio but is obviously self-serving. DLAA looks alright from what I've seen, but that's not exciting like "free" performance. Nvidia has been given an inch and they're going to stretch it to more than a mile if they have the chance.
Amusing questions, to me at least. I'm not sure how ready the RX 7000 series is for the AI future in general, I'd imagine not especially well, while Lovelace is better and Blackwell and RDNA4 are potent (my own prediction). But another question is how soon gaming-related AI features will actually be incorporated to make use of the hardware?
If AMD, Intel, or Nvidia did the heavy lifting then maybe the current generation or upcoming might make use of it. All the HW is probably going to be outdated by the time it's in any games in an appreciable sense if it ever is. Seems like another "AI solution" looking for a problem.
 
We are in a recession. Boring is ok
https://tradingeconomics.com/united-states/consumer-spending
https://www.reuters.com/markets/us/...vember-monthly-inflation-subsides-2024-12-20/
https://www.bls.gov/news.release/pdf/empsit.pdf
https://fred.stlouisfed.org/series/A939RX0Q048SBEA

?

The question that follows regarding hardware and performance is: how often will such game-integrated AI be accelerated by dedicated AI hardware, like the solutions on GPUs, and how often will the CPU be used instead?
I feel this could be dictated quite a bit by where the input data for the model comes from, where its output will be used, and the workload itself.

Something like ML materials and shaders would tend to stay on the GPU, the GPU being the end user; something relatively easy to run whose output the CPU consumes, like text-to-speech, could run on the CPU (current CPUs can already do this in real time).

But another question is how soon gaming-related AI features will actually be incorporated to make use of the hardware?
Nvidia-sponsored games that include it a bit like a tech demo - 2025, we can imagine. Stuff like text-to-voice followed by audio-to-face sounds so much cheaper than the alternative. For an example:

View: https://store.steampowered.com/app/2628740/Dead_Meat/
Which shouldn't need any special hardware, given 2025 voice-recognition and text-to-speech models.

Or: https://www.tigames.com/zoopunk, which uses Stable Diffusion to let you customize your ship and some model to talk with NPCs.

Auto-generated facial animation and speech could become the norm on small-budget titles quite fast; neural rendering and neural shaders are more likely something we'll see in an Nvidia-sponsored heavyweight like the next Witcher, which could be the next Cyberpunk in that regard.

As for AI that isn't technical rendering (DLSS, neural shaders) in a massive game - something that doesn't feel like a demo because it was a massive endeavour rather than cheaply made - that could take until the PS6 comes out, which could easily have 2,000 "TOPS" at 4-bit. By then things should be mature enough, with AA-type titles already released to showcase it, Sony wanting to push a launch title that uses it, and so on.
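
To make that concrete, the kind of pipeline being described is roughly the sketch below. It's purely hypothetical - every function is a stand-in stub, not any real game, Nvidia, or middleware API:

from dataclasses import dataclass

@dataclass
class NpcReply:
    text: str
    audio: bytes
    visemes: list  # per-timestep mouth shapes that would drive facial animation

def speech_to_text(mic_audio):
    return "where is the next item drop?"   # stub: a real ASR model, cheap enough for CPU

def generate_reply(prompt):
    return "Head north, past the ridge."    # stub: a small local LLM or scripted logic

def text_to_speech(text):
    return text.encode()                    # stub: real-time TTS already runs fine on CPUs

def audio_to_face(audio):
    return ["AA", "OO", "MM"]               # stub: audio-driven viseme/blendshape model

def npc_companion_turn(mic_audio):
    # voice in -> response text -> voice out -> face animation
    text = generate_reply(speech_to_text(mic_audio))
    audio = text_to_speech(text)
    return NpcReply(text=text, audio=audio, visemes=audio_to_face(audio))

print(npc_companion_turn(b"...").text)

Where each stage actually runs (CPU, GPU, NPU) is exactly the open question above.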
 
If that pricing is accurate, it should keep AMD in the same place as, or a bit worse than, the position they're currently in (yawn).
 
https://tradingeconomics.com/united-states/consumer-spending
https://www.reuters.com/markets/us/...vember-monthly-inflation-subsides-2024-12-20/
https://www.bls.gov/news.release/pdf/empsit.pdf
https://fred.stlouisfed.org/series/A939RX0Q048SBEA

?

If the price of groceries double, are people buying more groceries or did consumer spending increase? I would be more interested in units sold in relation.
 
If the price of groceries double, are people buying more groceries or did consumer spending increase? I would be more interested in units sold in relation.
Not that it's possible to be perfect, but those are in constant 2017 prices.
 
If that pricing is accurate, it should keep AMD in the same place as, or a bit worse than, the position they're currently in (yawn).
I think people are underestimating just what a disappointment Blackwell is going to be. It’s basically a sidegrade based on NV’s own slides. I’m betting that most DLSS 4 games will be just as good on Lovelace for the first couple of years until MFG and similar tech matures, and by then Blackwell will be obsolete. Take the 5070, for instance:

5070, clock speed * cores:
2510 * 6144 = 15,421,440

4070 Super:
2475 * 7168 = 17,740,800

17,740,800 / 15,421,440 = 1.15
4070 Super 15% faster

NV all but confirmed that there's no architectural uplift with the 4090 vs 5090 slides, where the 5090's performance in non-MFG games is almost exactly in line with its increase in core count. NV will try to spin it by comparing the 5070 to the 4070 (not the Super, which is the actual predecessor) and insisting on MFG vs non-MFG comparisons. But there's no hiding that it's going to be noticeably slower than its predecessor unless there's some compensating factor I haven't heard about. When was the last time NV replaced a card with a slower successor? The FX 5800 Ultra is the only example I can think of, and that was only slower than the GeForce 4 Ti 4600 in certain limited circumstances. I guess the 3060 was basically a sidegrade to the 2060 Super, but at least it had more VRAM and better efficiency.
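
Same back-of-envelope math as a quick Python check - just clocks x cores as a crude throughput proxy, ignoring IPC, memory bandwidth, and everything else:

def throughput_proxy(boost_clock_mhz, shader_cores):
    # crude proxy: ignores architecture/IPC, memory bandwidth, cache, etc.
    return boost_clock_mhz * shader_cores

rtx_5070 = throughput_proxy(2510, 6144)        # 15,421,440
rtx_4070_super = throughput_proxy(2475, 7168)  # 17,740,800
print(rtx_4070_super / rtx_5070)               # ~1.15, i.e. the 4070 Super ~15% ahead by this metric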

Meanwhile, AMD’s card is going to be at least 30% faster than the 7800 XT based on cores and clocks, maybe close to 50%, which would put it near the 7900 XTX with better RT and much better power efficiency for a lot less money. That also puts it equal or ahead of the 4080, which would make it significantly faster than the $750 5070 Ti. I’m almost going to say that NV should have scrubbed the whole gen and released only the 5090 under the name 4090 Ti, because that’s really what it is. The most ironic thing is that AMD cancelled big RDNA 4 because they were so scared of big bad NVIDIA, and now NV is about to show up empty-handed.
 
Meanwhile, AMD’s card is going to be at least 30% faster than the 7800 XT based on cores and clocks, maybe close to 50%, which would put it near the 7900 XTX with better RT and much better power efficiency for a lot less money. That also puts it equal or ahead of the 4080, which would make it significantly faster than the $750 5070 Ti. I’m almost going to say that NV should have scrubbed the whole gen and released only the 5090 under the name 4090 Ti, because that’s really what it is. The most ironic thing is that AMD cancelled big RDNA 4 because they were so scared of big bad NVIDIA, and now NV is about to show up empty-handed
You mean, AMD is going to do something different from anything they've ever done historically? I guess you never know. Can I hold you to that 30% jump claim, which has to be "without tricks" so as to match your criticism of Nvidia?
 
NV can do whatever they want. If AMD ends up being good enough, they will just push the Super launch earlier, to within 6 months, to equal AMD and roll them on brand recognition for the rest - assuming that will even be necessary.
 
FSR 4 = DLSS 4, right?? Let's make the names the same as our competitor's! Ride their coat-tails and trick stupid consumers!

******

Nvidia's prices on 5080, 5070Ti, and 5070 are not going to give AMD any room for profit margin... AMD took one look at those prices and thought "we're fucked!".
 