If this is when AMD moves to chiplets in GPUs, I could actually see a larger-than-normal rise in the performance ceiling at the high end, but I wouldn't believe for a moment that performance in the midrange, or perf/$ in general, will go up by anything remotely close to 3x. If the top-end RX 7000 or whatever is somehow 2.5-3x as fast as the 6900 XT by virtue of chiplet scaling on top of IPC and clock gains, it'll also be an even higher performance tier (7990?) and cost more accordingly. Feel free to quote the entirety of this post next year to see how I did lol.
I'm thinking along those lines too.
Going by current MSRP:
6900 XT > $999
7990 XT > $2499
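A quick back-of-the-envelope check of that pricing speculation (a minimal sketch; every number is a guess pulled from the posts above, not a leak): even if the flagship really were ~3x faster, a ~2.5x price hike leaves perf/$ almost flat.

```python
# Back-of-the-envelope perf-per-dollar check.
# All figures are speculation quoted from this thread, not leaked specs.

baseline = {"name": "6900 XT", "relative_perf": 1.0, "msrp": 999}
rumored  = {"name": "7990 XT (guess)", "relative_perf": 3.0, "msrp": 2499}

def perf_per_dollar(card: dict) -> float:
    return card["relative_perf"] / card["msrp"]

gain = perf_per_dollar(rumored) / perf_per_dollar(baseline)
print(f"Raw performance: {rumored['relative_perf']:.1f}x the {baseline['name']}")
print(f"Perf/$ change:   {gain:.2f}x")   # ~1.2x -- nowhere near 3x
```

So even if the raw 3x pans out, the value proposition at the top end would barely move.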
Last paper I read from NVidia from like 2 years ago was they were working on how to lower the latency between the IO die and the chiplets to improve frame time issues. I have to assume that they all had similar issues and have been working on it all this time. 3 years seems like a reasonable timeframe to tackle this issue.
Eh, they have a different solution going. Basically the system only sees one GPU, which is the I/O die. The I/O die tells the chiplets what to do, takes the info they handle, and presents it to the system. Essentially, it's just one big GPU even if it's hosted on multiple physical dies.
I don't know if they work like one big die or if they split up roles, like SFR back in the day. But however it'll work, it's a black box as far as the system is concerned.
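A rough mental model of that arrangement, purely for illustration (the class names and the least-loaded dispatch policy are invented here, not anything AMD has described):

```python
# Illustrative-only model of a chiplet GPU that the host sees as one device.
# Names and the work-splitting policy are made up for the example.

from dataclasses import dataclass, field

@dataclass
class ComputeChiplet:
    cid: int
    queue: list = field(default_factory=list)

    def submit(self, work_item: str) -> None:
        self.queue.append(work_item)

@dataclass
class IODie:
    """Front-end the host talks to; it owns the chiplets behind it."""
    chiplets: list

    def submit(self, work_item: str) -> None:
        # Pick the least-loaded chiplet; the host never sees this happen.
        target = min(self.chiplets, key=lambda c: len(c.queue))
        target.submit(work_item)

# The host enumerates exactly one GPU: the I/O die.
gpu = IODie(chiplets=[ComputeChiplet(0), ComputeChiplet(1)])
for draw in ("draw_0", "draw_1", "draw_2", "draw_3"):
    gpu.submit(draw)

print([len(c.queue) for c in gpu.chiplets])  # work spread across both dies, invisibly
```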
Can't wait for Ethereum to ditch GPU mining.
The news we all want is "3x WORSE in crypto mining"!
Last paper I read from NVidia from like 2 years ago was they were working on how to lower the latency between the IO die and the chiplets to improve frame time issues.
geez I hope not... NV and AMD spent the entire 2010s insisting that "~this time~ AFR frame pacing, scaling, and compatibility is fixed!" and it never really was. I'm hoping for more like the arrangement on Ryzen 3000/5000 where the IO die holds the last-level cache and scheduler and treats the compute dies as a single pool of threads.
Makes it sound like AFR is back on the table...
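To show why AFR specifically gets that reaction, here's a toy comparison with made-up frame costs (nothing vendor-specific): with alternate-frame rendering the two dies finish their own frames on their own schedules, so presents bunch up, while a single scheduler splitting each frame across both dies keeps presents evenly spaced.

```python
# Toy model of AFR frame pacing vs. a single shared scheduler.
# Frame costs and scheduling policies are invented for illustration.

frame_cost_ms = [10.0] * 8   # eight identical frames

def afr_present_times(costs):
    gpu_done = [0.0, 0.0]
    presents, last = [], 0.0
    for i, cost in enumerate(costs):
        g = i % 2                        # frame i rendered wholly on GPU i % 2
        gpu_done[g] += cost
        last = max(gpu_done[g], last)    # frames must be presented in order
        presents.append(last)
    return presents

def pooled_present_times(costs, dies=2):
    presents, t = [], 0.0
    for cost in costs:
        t += cost / dies                 # each frame's work split across both dies
        presents.append(t)
    return presents

def gaps(times):
    return [round(b - a, 1) for a, b in zip([0.0] + times[:-1], times)]

print("AFR present gaps   :", gaps(afr_present_times(frame_cost_ms)))   # 10, 0, 10, 0, ...
print("Pooled present gaps:", gaps(pooled_present_times(frame_cost_ms)))  # 5, 5, 5, ...
```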
I'm hoping for more like the arrangement on Ryzen 3000/5000 where the IO die holds the last-level cache and scheduler and treats the compute dies as a single pool of threads.
And you can't buy one
https://twitter.com/greymon55/status/1471693761579130888?s=20
Some information and a few guesses:
Navi 31:
~60 WGP
~16 GB 256-bit GDDR6
~256 MB 3D IFC (Infinity Cache)
~2.5 GHz

Navi 32:
~40 WGP
~12 GB 192-bit GDDR6
~192 MB 3D IFC
~2.6-2.8 GHz

Navi 33:
~16 WGP
~8 GB 128-bit GDDR6
~64 MB IFC
~2.8-3.0 GHz
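For context on what those bus widths imply, the usual bandwidth arithmetic is below; the 16 and 18 Gbps memory speeds are assumptions for illustration, not part of the leak.

```python
# Rough peak-bandwidth math for the rumored bus widths above.
# The per-pin data rates are assumed, not leaked.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (per-pin data rate)."""
    return (bus_width_bits / 8) * data_rate_gbps

for name, bus in (("Navi 31", 256), ("Navi 32", 192), ("Navi 33", 128)):
    print(name,
          f"{peak_bandwidth_gbs(bus, 16.0):.0f} GB/s @ 16 Gbps,",
          f"{peak_bandwidth_gbs(bus, 18.0):.0f} GB/s @ 18 Gbps")
```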
My 6600 XT is 10 MH/s slower than my 5600 XT was.
The news we all want is "3x WORSE in crypto mining"!
Yeah but the new flagships "leaked" from both AMD and NVidia show them to be MCM designs with almost 2x the silicon by surface area and almost 2x the power draw over the existing lineup.
A little disappointed that Nvidia is staying monolithic, but yeah both AMD and NVidia are supposed to be 600W+ flagships. That is gonna get spicy.
Only AMD is going with chiplets in the next round. Nvidia's will be monolithic. Power envelope either way is expected to be huge, though.
Link?
Only AMD is going with chiplets in the next round. Nvidia's will be monolithic. Power envelope either way is expected to be huge, though.
So they'll still be one generation behind NVIDIA with ray tracing performance.
I'm thinking along those lines too:
7800 XT = 80 CU (single chiplet), ~20% faster than 6900 XT in raster, ~50% faster in RT, ~$799
7900 XT = 120 CU (dual chiplet), ~70% faster than 6900 XT in raster, ~150% faster in RT, ~$1199-1499
7990 XT = 160 CU (dual chiplet), ~100% faster than 6900 XT in raster, ~200% faster in RT, ~$1999-2499
disclaimer: all performance numbers are pulled out my arse, please no-one post these on WCCF claiming confirmed performance leak thanx
So they'll still be one generation behind NVIDIA with ray tracing performance.
Quite a few do already and it's only becoming more common. Same with DLSS (140+ recent titles).
...for the handful of games that actually take advantage of it 3 years after launch...
So they'll still be one generation behind NVIDIA with ray tracing performance.
If Raytracing wasn't here to stay, AMD wouldn't be spending money trying to improve theirs.
...for the handful of games that actually take advantage of it 3 years after launch...
New development middleware is leaning heavily into Ray Tracing as a means of decreasing development time and complexity. So much of game "optimization" (I hate that term) is tweaking texture maps, shadows, and assets to maximize visual impact while minimizing the performance hit for using them; the new tools instead take raw high-definition assets with their movie-quality textures, use software and AI to apply the necessary optimizations for the desired platform, then lean on the RTX libraries for shadows, reflections, and other visual goodies. Yeah, it requires RTX and all the Ray Tracing goodies, but it replaces thousands of hours of manual labor with a button push and gives very consistent results. Using RTX saves developers very real money on a project while giving better visuals, which is really what PC gaming is all about.
If Raytracing wasn't here to stay, AMD wouldn't be spending money trying to improve theirs.
And it's the intergenerational improvement that's commendable, not where they stand in relation to Nvidia. Raytracing isn't zero-sum; everyone benefits when there's competition to do it better and faster. RT has always been the graphical holy grail.
Raja Koduri of Intel was recently spotted watching raytracing videos on YouTube, and then commenting "whoa, that's so cool, how does it do that?" So Intel is probably thinking about RT now as well.
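To make the workflow contrast described above concrete, here's a purely hypothetical before/after sketch; the Scene and Light classes are invented for this example and are not any real engine's API.

```python
# Illustrative contrast between a hand-tuned raster lighting setup and an
# RT-driven one, as described in the posts above. Invented classes only.

from dataclasses import dataclass, field

@dataclass
class Light:
    kind: str          # "bulb", "fill", "baked_shadow_map", ...
    hand_tuned: bool   # did an artist have to place and tweak it?

@dataclass
class Scene:
    lights: list = field(default_factory=list)

def author_raster_scene() -> Scene:
    """Traditional workflow: artists add fake fill lights and bake shadow maps."""
    s = Scene()
    s.lights.append(Light("bulb", hand_tuned=True))
    s.lights.append(Light("fill", hand_tuned=True))              # fake bounce light
    s.lights.append(Light("baked_shadow_map", hand_tuned=True))  # re-baked after every tweak
    return s

def author_rt_scene() -> Scene:
    """RT workflow: tag the real emitters, let the renderer handle bounce and shadows."""
    s = Scene()
    s.lights.append(Light("bulb", hand_tuned=False))  # "a light bulb is a light bulb"
    return s

for name, scene in (("raster", author_raster_scene()), ("rt", author_rt_scene())):
    manual = sum(light.hand_tuned for light in scene.lights)
    print(f"{name}: {manual} hand-tuned lighting elements")
```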
A little disappointed that Nvidia is staying monolithic, but yeah both AMD and NVidia are supposed to be 600W+ flagships. That is gonna get spicy.
I don't know, but at 600W, if I could put it under my desk opposite my tower it would be a great replacement for the Honeywell ceramic heater I use in there to keep my toes warm.
Yeah. How long before external flagship GPUs (with their own PSU) become standard?
3dfx v6000
Yeah. How long before external flagship GPUs (with their own PSU) become standard?
Yeah, I remember that being specifically mentioned in the Metro Exodus Enhanced Edition deep dive vid; there was an especially poignant part where a dev demonstrates the rasterized lighting workflow and spends a significant amount of time tweaking light and shadow maps, adding invisible light sources for fill lighting, tweaking it more, iterating. Then for the RTX workflow on the same scene it was just "I tell the game engine that light bulbs are light bulbs and it does the rest".
New development middleware is leaning heavily into Ray Tracing as a means of decreasing development time and complexity. So much of game "optimization" (I hate that term) is tweaking texture maps, shadows, and assets to maximize visual impact while minimizing the performance hit for using them; the new tools instead take raw high-definition assets with their movie-quality textures, use software and AI to apply the necessary optimizations for the desired platform, then lean on the RTX libraries for shadows, reflections, and other visual goodies. Yeah, it requires RTX and all the Ray Tracing goodies, but it replaces thousands of hours of manual labor with a button push and gives very consistent results. Using RTX saves developers very real money on a project while giving better visuals, which is really what PC gaming is all about.
Might as well be 1000x faster, I'm keeping my 2070 at this point. Screw both AMD and nVidia for treating their customers the way they've been doing it for the past year!
Yep. Considering you can only reliably sort-of kind-of get a 2060-level-performance card... anything newer/better, I just don't care about; they're insanely difficult/too expensive to acquire.
I can't get excited anymore for GPUs, it's ruined for me; hope one day they'll sort it out with actually being able to buy at (reasonable) MSRP.
You would hope, but if they use 2x the juice while being 3x faster, then technically it's a decrease in electricity per unit of work and a profit increase.
Maybe the power draw will be enough to put a dent in mining profitability.