cjcox:
> So are our product names.

AMD: We moved to 14 angstrom so we could put the name of the chip actually on the chip. Better than spreading the name across the core and CCDs.
> The PS5 Pro literally has this tech. I get that we, as enthusiasts, are disappointed - it is not the "moAr PoWer" that we demand - but AMD is improving in meaningful ways. They've learned raster brute forcing is not the future and are pivoting.
> https://www.techradar.com/gaming/ps5/what-is-pssr-explained
> I know this is an extension of what has been done on the PC side - but this is novel in the console space, which is actually very meaningful for the larger industry.
> Mark Cerny (architect of the PlayStation) essentially thinks raster is the past - https://www.tomshardware.com/video-...n-debunked-by-ps5-system-architect-mark-cerny (long video linked).

The PS5 Pro is using PSSR, which was developed internally by Sony. It's unclear whether FSR4 can run as-is on the PS5 Pro, as the ML hardware was spec'ed by Sony. They have been pretty clear in saying that AMD had no involvement in PSSR, only supplying hardware based on Sony's needs.
> RDNA's chiplet design needed a better TSMC packaging process that never appeared; they were working on it internally but ultimately scrapped it because of cost and failure rate. TSMC redesigned it and ultimately launched it, but Nvidia Blackwell is the first production product using it, and they are having one hell of a time with it.

Blackwell is monolithic, unless the rumors/leaks were incorrect?
> The PS5 Pro is using PSSR, which was developed internally by Sony. It's unclear whether FSR4 can run as-is on the PS5 Pro, as the ML hardware was spec'ed by Sony. They have been pretty clear in saying that AMD had no involvement in PSSR, only supplying hardware based on Sony's needs.

There is no ML hardware in the PS5 Pro.
> There is no ML hardware in the PS5 Pro.

OK, well, whatever it is about the PS5 Pro that allows it to run algorithms derived from ML. Which is really what this is all about: these upscalers are all ML-derived algorithms.
They just repurposed the registers
Having "ai" in the product name is the most retarded thing they've done yet from a branding perspective.
> The PS5 Pro is using PSSR, which was developed internally by Sony. It's unclear whether FSR4 can run as-is on the PS5 Pro, as the ML hardware was spec'ed by Sony. They have been pretty clear in saying that AMD had no involvement in PSSR, only supplying hardware based on Sony's needs.

Never said it has FSR. The point is it uses AI-based upscaling heavily, which is the future. RT performance is significantly enhanced.
> Having "ai" in the product name is the most retarded thing they've done yet from a branding perspective.

Guarantee you OEMs / Microsoft want to see it. I don't think there's a single product releasing in 2025 that doesn't mention AI somewhere.
Q: Tell me more about the Amethyst partnership. What kind of information is being shared between the companies? And how does this partnership differ from the existing relationship between SIE and AMD, given that you've built the PS5 and previous consoles on their hardware; what's new?

A: So, first, I should give the nature of the collaboration. There are two targets we are working on with AMD. One is better hardware architectures for machine learning. And that is not about creating proprietary technology for PlayStation; the goal is to create something that can be used broadly across PC and console and cloud. The other collaboration is with regards to these lightweight CNNs for game graphics. So, you know, the sorts of things that are used in PSSR, and perhaps the sorts of things that would be used in a future FSR.

Q: Does that mean that we can expect the findings from that collaboration to be reflected in future AMD hardware that isn't necessarily PlayStation hardware?

A: Absolutely. This is not about creating proprietary technology or hardware for PlayStation.

Q: As for the AI upscaler that you're using for PSSR: is that a discrete piece of hardware, or is it built into the GPU itself?

A: We needed hardware that had this very high performance for machine learning, and so we went in and modified the shader core to make that happen. Specifically, as far as what you touch on the software side, there are 44 new machine learning instructions that take a freer approach to register RAM access. Effectively, you're using the register RAM as RAM. And they also implement the math needed for the CNNs. To put that differently: we enhanced the GPU, but we didn't add a tensor unit or something to it.

Q: You did also mention the prospect of building it yourself versus buying or outsourcing the technology. Could you elaborate on the thought process there?

A: One very simple way to look at this is: are we taking the next roadmap AMD technology, or are we, in fact, going in and trying to design the circuitry ourselves? And we chose the latter, because we really wanted to start working in the space ourselves. It was clear that the future was very ML-driven. And by that, you know, the world's talking about LLMs and generative AI, but I'm really mostly just looking at game graphics and the boost for game graphics we can get. So, based on that, we wanted to be working in that space.
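To make "lightweight CNNs for game graphics" concrete, here is a minimal sketch in PyTorch of what a small convolutional upscaler looks like. This is a toy with made-up layer sizes, not the actual PSSR or FSR4 network; the point is only that its inner loops are stacks of per-pixel multiply-accumulates, which is exactly the math those new shader-core instructions exist to speed up.

```python
# Toy sketch of a "lightweight CNN" upscaler, with made-up layer sizes.
# Not PSSR or FSR4 -- just the general shape: a few small conv layers
# plus a pixel-shuffle step that turns a low-res frame into a 2x one.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    def __init__(self, channels: int = 32, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            # Produce scale^2 * 3 channels, then rearrange them into pixels.
            nn.Conv2d(channels, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # (B, 3*s^2, H, W) -> (B, 3, s*H, s*W)
        )

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        return self.body(low_res)

frame = torch.rand(1, 3, 540, 960)   # a 960x540 "rendered" frame
upscaled = TinyUpscaler()(frame)
print(upscaled.shape)                # torch.Size([1, 3, 1080, 1920])
```

Every convolution above is just a long run of multiply-accumulates over the neighborhood of each pixel, which is why "44 instructions that treat register RAM as RAM and implement the CNN math" is enough to accelerate this class of network without a dedicated tensor unit.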
> Don't know if this was posted here but just in case.
> AMD Radeon RX 9070 expected to start at $479, available as soon as late January
> By Zo Ahmed, January 10, 2025 at 11:08 AM
> "Let's dive into the juiciest rumor from Chiphell: pricing for the RX 9070 XT. According to a post on the forum, AMD might price the reference 9070 XT at $479, with custom third-party models starting around $549. If true, that's a compelling price point for what's expected to be a solid 1440p GPU."

I will wait until AMD announces. They could literally be going with a different price today than when that rumor appeared, assuming it was even correct at one point in time.
> starting at $550 for AIB would be very similar to 5070

I would take a reference at $479 for sure. I'm actually due for at least two GPU upgrades, and that would fit the bill.
> I don't see any reason why AMD would price the 9070 XT cheaper than Nvidia if its RT is better than the 5070's.
> If that is the case, we would be looking at:
> AIB 9070 XT = $600
> MBA 9070 XT = $550
> MBA 9070 = $450 - $480

Better than a 4070 I would believe, but a 5070? I would have to see reviews. AMD is usually a generation behind on RT, but who knows; Sony could have lit a fire under them for the PS5 Pro, and the RT could be much stronger than most realize. At least on the Pro it's definitely noticeable. I'll need to wait for reviews, but I can wait until the end of the month, I think. Overall, raster has been the most important thing for me, with RT being an afterthought in most cases. I think what id/Bethesda did with Indiana Jones was remarkable, because it's the first time there's RT that doesn't kneecap performance.
> I don't see any reason why AMD would price the 9070 XT cheaper than Nvidia if its RT is better than the 5070's.
> If that is the case, we would be looking at:
> AIB 9070 XT = $600
> MBA 9070 XT = $550
> MBA 9070 = $450 - $480

It's not going to be, lol; it'll be better in raster, worse in RT. If people want multi-frame gen and more RT, go Nvidia; if people want more traditional raster and less RT, go AMD. Hopefully FSR 4 is good enough that people won't care about DLSS 4. We'll see, though. It'd be nice to have some actual competition for the first time in a while.
> It's not going to be, lol; it'll be better in raster, worse in RT. If people want multi-frame gen and more RT, go Nvidia; if people want more traditional raster and less RT, go AMD. Hopefully FSR 4 is good enough that people won't care about DLSS 4. We'll see, though. It'd be nice to have some actual competition for the first time in a while.

Hopefully FSR4 upscaling is good. Early first impressions from the show sounded promising.
> Hopefully FSR4 upscaling is good. Early first impressions from the show sounded promising.

I am guessing that since FSR 4 is new there will be some glitches, but it should improve with time. This could be the reason AMD is taking its time with the launch.
> Having "ai" in the product name is the most retarded thing they've done yet from a branding perspective.

If it's stupid and it works, it ain't stupid.
> If it's stupid and it works, it ain't stupid.

There are articles about how many people are turned off by "ai" in a product name. That's why it was stupid to use it for their CPUs.
"Loathing" may be too small a word for me to describe how I feel about this sort of branding. However, at the moment the "ai" label is the easiest way to recognize amd's current top mobile processors.
> Blackwell is monolithic, unless the rumors/leaks were incorrect?

The B100 is multi-chip.
> The B100 is multi-chip.

Ah, that's for data center. GB202 (used in the RTX 5090) is a single chip.
> starting at $550 for AIB would be very similar to 5070

I would not be shocked in the slightest if the 9070 is a better value than the 5070. The 5070 12GB starts at $550; if this rumor is true, AMD will be selling the 16GB 9070 XT for roughly $70 less than NV. They will also presumably be selling a 9070 non-XT, also with 16 GB, for less than that. According to the AMD guys, the performance rumors floating around are all low.
> I would not be shocked in the slightest if the 9070 is a better value than the 5070. The 5070 12GB starts at $550; if this rumor is true, AMD will be selling the 16GB 9070 XT for roughly $70 less than NV. They will also presumably be selling a 9070 non-XT, also with 16 GB, for less than that. According to the AMD guys, the performance rumors floating around are all low.

Booooooooring. I am bored down through the bone marrow by AMD's lineup (only slightly less by Nvidia's, by the way). AMD needs something to show that their video chips aren't playing third or fourth fiddle to their CPU lineups. Which would be pretty hard, since they obviously are.
As always, wait for benchmarks. I am pretty sure the 5070 is going to be a massive disappointment... and, as stupid as it might sound, AMD might be trying on purpose to derail the hype as hard as they possibly can. I hope that means they want a big impact when they over-deliver. We all get to speculate for the next few weeks till reviews actually hit, no matter what.
> We are in a recession. Boring is ok. If the 9070 XT really launches under $500, sounds good to me. I am going to assume it sounds good to a lot of other people as well. If they have a 90%-as-good 9070 non-XT around the $400 mark, maybe better yet. Games aren't RTing. Half the ones that are are ugly noise fests with zero replay anyway. I'll be happy to see a good solid raster card that can push 1440p monitors to the top of their refresh range. Not that boring, imo. Four fake frames for every real one... ahhh, that isn't boring, but it does sound annoying to me. I'll wait till people who aren't in marketing see and report whether NV has really discovered the promised land... I somehow doubt it. 75% of your game being AI six-fingered sounds pretty stupid to me. Maybe the reality distortion field over at NV has finally run out of AI.

Agreed. As for the distortion field, not likely. DLSS has been "better than native" and there have been obvious flaws. Their compression looks alright for the ratio but is obviously self-serving. DLAA looks alright from what I've seen, but that's not exciting like "free" performance. Nvidia has been given an inch and they're going to stretch it to more than a mile if they have the chance.
> Amusing questions, to me at least. I'm not sure how ready the RX 7000 series is for the AI future in general (I'd imagine not especially well), while Lovelace is better and Blackwell and RDNA4 are potent (my own prediction). But another question is how soon gaming-related AI features will be incorporated to make use of the hardware.

If AMD, Intel, or Nvidia did the heavy lifting, then maybe the current or upcoming generation might make use of it. All the hardware is probably going to be outdated by the time it's in any games in an appreciable sense, if it ever is. Seems like another "AI solution" looking for a problem.
> We are in a recession. Boring is ok.
https://tradingeconomics.com/united-states/consumer-spending
https://www.reuters.com/markets/us/...vember-monthly-inflation-subsides-2024-12-20/
https://www.bls.gov/news.release/pdf/empsit.pdf
https://fred.stlouisfed.org/series/A939RX0Q048SBEA
?
> The question follows regarding hardware and performance: how often will such game-integrated AI be accelerated by dedicated AI hardware, like the solutions on GPUs, and how often is the CPU going to be used?

I feel this could be dictated quite a bit by where the input data for the model comes from, where its output will be used, and the workload as well.
Something like ML materials and shaders would tend to live on the GPU, since that's where they're consumed, while something relatively lightweight that the CPU uses, like text-to-speech, could run on the CPU (current CPUs can already do this in real time).
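On that placement question, here is a toy sketch, assuming PyTorch, of how you would time the same small network on each device. The model and sizes are made up; the point is only that placement is a per-workload choice, and something this small is cheap enough to stay on the CPU.

```python
# Toy sketch of the CPU-vs-GPU placement tradeoff. Model and sizes are
# made up; the same network runs on whichever device suits the workload.
import time
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 64))

def time_inference(device: str, iters: int = 100) -> float:
    m = model.to(device)
    x = torch.rand(32, 256, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # finish pending GPU work before timing
    start = time.perf_counter()
    with torch.no_grad():
        for _ in range(iters):
            m(x)
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the queued kernels to finish
    return (time.perf_counter() - start) / iters

print(f"cpu: {time_inference('cpu') * 1e3:.3f} ms/iter")
if torch.cuda.is_available():
    print(f"gpu: {time_inference('cuda') * 1e3:.3f} ms/iter")
```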
> But another question is how soon gaming-related AI features will be incorporated to make use of the hardware.

An Nvidia-sponsored game that has it a bit like a tech demo we can imagine in 2025; stuff like text-to-voice followed by Audio2Face sounds so much cheaper than the alternative. For example:
https://store.steampowered.com/app/2628740/Dead_Meat/
Which shouldn't need any special hardware with 2025-era voice recognition and text-to-speech models.
Or https://www.tigames.com/zoopunk, which uses Stable Diffusion to let you customize your ship and some model to talk with the NPCs.
Automatic facial animation and speech could become the norm in small-budget titles quite fast; neural rendering and shaders could be something we see in an Nvidia-sponsored heavyweight like the next Witcher, which could be the next Cyberpunk in that regard.
For AI that is not technical rendering (DLSS, neural shaders) in a massive game, and that doesn't feel like a demo (because that would be a massive endeavour, not something cheaply made), it could be by the time the PS6 comes out. The PS6 could easily have 2,000 "TOPS" at 4-bit, things will be mature enough that AA titles showcasing the tech will have shipped, Sony will want a launch title that uses it, and so on.
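As a sanity check on that 2,000-TOPS guess (illustrative numbers, not leaked specs): accelerator TOPS ratings typically double each time numeric precision halves, so a part rated around 1,000 INT8 TOPS would advertise roughly 2,000 at 4-bit.

```python
# Illustrative back-of-envelope, not leaked specs: many accelerators run
# 4-bit math at twice their 8-bit rate, so an INT8 rating of ~1000 TOPS
# would be advertised as ~2000 TOPS at 4-bit precision.
int8_tops = 1000
int4_tops = 2 * int8_tops  # double-rate 4-bit, if the hardware supports it
print(f"{int4_tops} TOPS @ 4-bit")  # -> 2000 TOPS @ 4-bit
```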
> If the price of groceries doubles, are people buying more groceries, or did consumer spending just increase? I would be more interested in units sold in relation.

Not that it's possible to be perfect, but those figures are in constant 2017 prices.
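For anyone unsure what "constant 2017 prices" buys you, here is the arithmetic with made-up numbers: deflating nominal spending by a price index separates "bought more stuff" from "paid more for the same stuff."

```python
# Toy illustration of the nominal-vs-real distinction, with made-up
# numbers. "Constant 2017 prices" means nominal spending is deflated by
# a price index, so pure price increases don't show up as growth.
nominal_2023 = 100.0        # spending in year 1, in current dollars
nominal_2024 = 108.0        # spending in year 2, up 8% in current dollars
price_index_growth = 1.06   # prices rose 6% over the same period

real_growth = nominal_2024 / (nominal_2023 * price_index_growth) - 1
print(f"nominal growth: 8.0%, real growth: {real_growth:.1%}")
# -> real growth ~1.9%: people bought slightly more, not 8% more
```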
> If that pricing is accurate, it should keep AMD in the same place, or a bit worse, than the position they're currently in (yawn).

I think people are underestimating just what a disappointment Blackwell is going to be. It's basically a sidegrade based on NV's own slides. I'm betting that most DLSS 4 games will be just as good on Lovelace for the first couple of years until MFG and similar tech mature, and by then Blackwell will be obsolete. Take the 5070, for instance:
> Meanwhile, AMD's card is going to be at least 30% faster than the 7800 XT based on cores and clocks, maybe close to 50%, which would put it near the 7900 XTX with better RT and much better power efficiency for a lot less money. That also puts it equal to or ahead of the 4080, which would make it significantly faster than the $750 5070 Ti. I'm almost going to say that NV should have scrapped the whole gen and released only the 5090, under the name 4090 Ti, because that's really what it is. The most ironic thing is that AMD cancelled big RDNA 4 because they were so scared of big bad NVIDIA, and now NV is about to show up empty-handed.

You mean AMD is going to do something different from anything they've ever done historically? I guess you never know. Can I hold you to that 30% jump claim, which has to be "without tricks" so as to match your criticism of Nvidia?