RTX 5xxx / RX 8xxx speculation

Somewhat surprised that there have been no decent deals on the 4xxx of late; I should have picked up that $799 4080 Super used from BB back in June to keep or flip. My $400 4070s will need to battle on for a while...
Just picked up a "broken" 4090 for ~$700. Hopefully I did not get scammed lol
 
I mean, how could that even turn into a scam? If it works? "I was told this would be broken, but I was lied to!"
Card could arrive with the GPU, memory, or PCB gutted. It was listed as unopened and whatnot. Maybe a box full of rocks to get the weight right. :LOL:
 
If the card doesn't work, missing components or otherwise, how would it be better than a box of rocks?
A card with no physical damage listed as not working on eBay tends to just be user error (in many cases someone trying to test a GPU with a PSU that is way too weak, or some other driver issue). And they are usually still under warranty.
When it has been gutted, the warranty is obviously void and the card is not worth buying... it's only worth more than a box of rocks if a few parties are interested in buying spare PCBs, coolers, and the like.


tl;dr the card is probably not broken, just user error.
 
Anyone have any rumors as far as the uplift in AI performance for the 5090? I was really hoping it would have more than just 28GB of RAM, which kind of sucks.
 
Anyone have any rumors as far as the uplift in AI performance for the 5090? I was really hoping it would have more than just 28GB of RAM, which kind of sucks.
For inference, we can imagine they will have the new FP4 tensor acceleration and see a jump in 4-bit performance that looks a bit like the one Lovelace had for 8-bit over Ampere:

[chart: Ada performance per dollar (GPUs_Ada_performance_per_dollar6.png)]


While only a good-but-nothing-special boost in 16/8-bit is expected (like Lovelace's 16-bit over Ampere), if you want to run 4-bit models, a near doubling of performance over Lovelace would not be unexpected.
Also, 4-bit models take half the memory of 8-bit ones (and 25% of 16-bit) and seem to be becoming common by now; 24GB could run a 30-48B model at 4-bit.
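To make the memory math concrete, here's a rough back-of-envelope sketch. The numbers are illustrative only; the ~15% overhead factor for KV cache and activations is an assumption, not a measured figure:

```python
def vram_gb(params_billion: float, bits: int, overhead: float = 1.15) -> float:
    """Rough VRAM (GB) needed to hold a model's weights at a given precision.

    overhead is an assumed fudge factor for KV cache / activations.
    """
    bytes_per_param = bits / 8
    # params_billion * bytes_per_param gives billions of bytes, i.e. ~GB
    return params_billion * bytes_per_param * overhead

# 4-bit needs half the memory of 8-bit and a quarter of 16-bit:
for bits in (16, 8, 4):
    print(f"30B model @ {bits}-bit: ~{vram_gb(30, bits):.1f} GB")
# A 30B model at 4-bit lands around 17 GB, comfortably inside 24 GB,
# while the same model at 16-bit (~69 GB) would not fit at all.
```

Same rule of thumb says a 48B model at 4-bit is ~24 GB of weights alone, which is why that's about the upper end of what a 24GB card can squeeze in.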
 
For inference, we can imagine they will have the new FP4 tensor acceleration and see a jump in 4-bit performance that looks a bit like the one Lovelace had for 8-bit over Ampere:

While only a good-but-nothing-special boost in 16/8-bit is expected (like Lovelace's 16-bit over Ampere), if you want to run 4-bit models, a near doubling of performance over Lovelace would not be unexpected.
Also, 4-bit models take half the memory of 8-bit ones (and 25% of 16-bit) and seem to be becoming common by now; 24GB could run a 30-48B model at 4-bit.
I am afraid I don't really understand any of that, but I'm guessing Stable Diffusion and local chat models will run better.
 
stable diffusion
That's a 16-bit model; the 4080 was not much faster than a 3090 there, at least. It could be similar here (a 5080 not really beating the 4090, a 3080 beating the 4070).

[chart: Stable Diffusion benchmarks]


4-bit LLMs would be a different category in terms of performance leap.
 
It's now September and not a single announcement from Nvidia. At this point it's probably safe to say the RTX 50 series will not launch this year.
 
I think that while Blackwell may be delayed, Nvidia is ramping down or ceasing 4xxx production entirely; that would explain the lack of meaningful stock clearance sales two years into this gen. That, and AMD isn't really a competitor...
 
I think that while Blackwell may be delayed, Nvidia is ramping down or ceasing 4xxx production entirely; that would explain the lack of meaningful stock clearance sales two years into this gen. That, and AMD isn't really a competitor...
They probably don't want a repeat of late summer 2022, when people were getting 3090s for $900 and 3090 Tis for $1100 before RTX 4000 was even announced. $1000 clearance 4090s would be a nightmare situation for NV in the current market lol
 
I think that while Blackwell may be delayed, Nvidia is ramping down or ceasing 4xxx production entirely; that would explain the lack of meaningful stock clearance sales two years into this gen. That, and AMD isn't really a competitor...

I have to disagree; AMD competes with Nvidia everywhere except against the 4090. It's just a matter of what you want at the lower tiers. Most people are not playing at 4K resolution, but I will admit the 4090 sold better than I expected it to.
 
Yeah, they compete "on paper" and that's it. People still buy Nvidia anyway and then love to complain about things like VRAM and price despite AMD literally offering a solution to that.
NVIDIA has captured the mindshare of the market, just like Intel on the CPU side of things despite their many blunders over the years. Normies will say "Intel is better" and will never be able to explain why.
 
No one says Intel is better these days... and sadly my INTC long positions agree. But I also chuckle at the CNBC Karens with all their doom and gloom; have they ever considered where AMD was not that long ago lol.
 
Fwiw https://www.pcgamer.com/hardware/gr...ptember-and-well-believe-it-when-we-see-them/ , getting a 5080 soon would be nice, but it's probably a useless clickbait article.

Nvidia RTX 5080 and RTX 5090 are reportedly 'scheduled to officially launch in September'

GPT and Google Translate give different results (I think his translator gave him a false one)

Gpt:
The NVIDIA GeForce RTX 50 series based on the Blackwell GPU architecture is expected to debut at CES 2025.
If nothing changes, the GeForce RTX 5090 / D and GeForce RTX 5080 / D, based on the Blackwell GPU architecture, are expected to officially begin development in September.

google:
If there are no surprises, the Blackwell GPU architecture GeForce RTX 5090/D and GeForce RTX 5080/D are scheduled to be officially launched in September

TPU:
https://www.techpowerup.com/326292/...month-chinese-d-variant-arrives-for-both-skus

The TPU translation goes with GPT, saying that NVIDIA is targeting September for the official design-specification finalization of both models.
 
No one says Intel is better these days... and sadly my INTC long positions agree. But I also chuckle at the CNBC Karens with all their doom and gloom; have they ever considered where AMD was not that long ago lol.
I somewhat agree with you, but look at what had to happen for this change. AMD needed to beat or at least match Intel, which they did. Zen 5 notwithstanding, they brought significant improvement gen over gen, and their platforms lasted a lot longer. At the same time, Intel has been royally fucking up: not only little to no improvement gen over gen, but failing CPUs the last two gens.

Contrast that with the GPU market. AMD has not been able to match nVidia in recent years. Even back when they had the performance crown, it was short-lived. nVidia does not stumble the way Intel has been stumbling, and when they do, it's short-lived. Their last real stumbling block was the FX series.
 
I somewhat agree with you, but look at what had to happen for this change. AMD needed to beat or at least match Intel, which they did. Zen 5 notwithstanding, they brought significant improvement gen over gen, and their platforms lasted a lot longer. At the same time, Intel has been royally fucking up: not only little to no improvement gen over gen, but failing CPUs the last two gens.

Contrast that with the GPU market. AMD has not been able to match nVidia in recent years. Even back when they had the performance crown, it was short-lived. nVidia does not stumble the way Intel has been stumbling, and when they do, it's short-lived. Their last real stumbling block was the FX series.
Intel needs a generational shift in leadership and thinking, and frankly the layoffs are a good thing; I know of some Intel 'fellows' raking in 500k a year for binging Nflx for the better part of the past decade. But Intel failing entirely would be a very, very bad thing for a host of reasons. And Intel is nowhere near where AMD was at. Back in 2013-14, I was working in consulting at a Big 4 firm, and AMD wasn't always able to pay for our services on time... which is when I decided to sell my 1500 shares of AMD for a small tax loss. Bad mistake in hindsight, but I acted on the information available at the time.
 
Nvidia RTX 5080 and RTX 5090 are reportedly 'scheduled to officially launch in September'

GPT and Google Translate give different results (I think its translator gave him a false one)

Gpt:
The NVIDIA GeForce RTX 50 series based on the Blackwell GPU architecture is expected to debut at CES 2025.
If nothing changes, the GeForce RTX 5090 / D and GeForce RTX 5080 / D, based on the Blackwell GPU architecture, are expected to officially begin development in September.

google:
If there are no surprises, the Blackwell GPU architecture GeForce RTX 5090/D and GeForce RTX 5080/D are scheduled to be officially launched in September

TPU:
https://www.techpowerup.com/326292/...month-chinese-d-variant-arrives-for-both-skus

The TPU translation goes with GPT, saying that NVIDIA is targeting September for the official design-specification finalization of both models.
Let's confirm with someone who speaks Chinese instead of relying on machine translation or a chatbot.

[screenshot: translation from a Chinese speaker]


I'll go with the take from MEGAsizeGPU that the design will be finalized this month, not shipping this month.
There ain't no way RTX 5000 is launching in the next few weeks; the rumor mill would be lit up like a Christmas tree, and that's not the case.

Edit to add: to GPT's credit, it did come to the correct conclusion, but machine translation is fraught in general.
 
The mGPU speculation here for RDNA4 has me (vaguely) intrigued. If for $1200 (2x $600) you could get near-4090 performance, and if they could reduce some of the issues inherent to mGPU, it may be an interesting proposition.

It would be another high-end option, in any case. I loved my 3870 X2.
 
The mGPU speculation here for RDNA4 has me (vaguely) intrigued. If for $1200 (2x $600) you could get near-4090 performance, and if they could reduce some of the issues inherent to mGPU, it may be an interesting proposition.

It would be another high-end option, in any case. I loved my 3870 X2.

This is the first time I've heard that speculation. I thought even an I/O chiplet was not happening for RDNA4 (let alone an mGPU design); a return to a simple monolithic design was the rumor consensus for a long time. Where does the speculation come from?
 
Seems to come from someone named Harmattan ;)
I'm by far the first here to speculate on mGPU making a comeback with RDNA 4, if you've been following this thread and other sites.

Even if it's completely software enabled (which it almost certainly would be), and only supported in some games, it may be a viable way for AMD to stay in the enthusiast market, at least in name.
 
Even if it's completely software enabled (which it almost certainly would be), and only supported in some games, it may be a viable way for AMD to stay in the enthusiast market, at least in name.
If you only get near-4090 performance from two $600 GPUs, that seems a hard sell over a $1000 4080 Super or a $900 7900 XTX, and even more so against a $1000-1200 faster-than-4090 RTX 5080 when it launches.

As for a return of AMD CrossFire this year, is there any indication from the new motherboards and chipsets that there is a plan for this? The 2x16 PCI Express slot layout for dual-GPU setups does not seem to be making a return at all.

The popularity of temporal effects (which make alternate-frame rendering not that easy) and full-screen effects (which make interlaced or split-frame rendering not that easy), which already made using two GPUs really hard, did not go away at all; if anything it has only become more prevalent. And now we have even added full-scene effects; would all of Nanite and Lumen run on both GPUs anyway? DX12 and Vulkan are all about manually and carefully maximizing a single GPU's performance, and as frame rates go up, latency becomes more and more of a factor.

The small market of motherboards ready to receive two GPUs, the sheer size of modern GPUs, and a market that already offers a 4090-level GPU instead of making you buy two or three cards to do the same thing all make it really hard commercially.

It is for full-on enthusiasts: faster than an XTX but slower than a 4090, saving a little bit of money.
The "I will buy a card now and buy a used copy of the exact same model a bit later" play is not that attractive in today's world; GPUs take an eternity to ever get cheap.

And with how niche it would be, game engines and game developers would not put resources into it, creating a vicious circle of never becoming good or popular.

As long as Nvidia lets people buy all the GPU you can realistically power in a North American machine in one nice package, this seems like a small landing strip.
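The latency point can be shown with toy numbers (all figures below are assumptions for illustration): alternate-frame rendering (AFR) on two GPUs doubles frame rate, but each frame still takes one GPU's full render time.

```python
# Toy model of alternate-frame rendering (AFR): two GPUs render
# even/odd frames in parallel. Numbers are illustrative assumptions.
render_ms = 20.0  # assume one GPU needs 20 ms to render a frame

# Single GPU: a frame completes every 20 ms.
single_fps = 1000.0 / render_ms            # 50 fps
single_frame_latency_ms = render_ms        # each frame is 20 ms old when shown

# AFR with two GPUs: completed frames arrive every 10 ms (interleaved),
# but each individual frame still took a full 20 ms to render.
afr_fps = 2 * single_fps                   # 100 fps
afr_frame_latency_ms = render_ms           # still 20 ms per frame

print(single_fps, afr_fps)                            # 50.0 100.0
print(single_frame_latency_ms, afr_frame_latency_ms)  # 20.0 20.0
```

So the frame counter doubles while the render latency per frame stays where a single card put it, which is one reason high-refresh players soured on SLI/CrossFire.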
 
If you only get near-4090 performance from two $600 GPUs, that seems a hard sell over a $1000 4080 Super or a $900 7900 XTX, and even more so against a $1000-1200 faster-than-4090 RTX 5080 when it launches.

As for a return of AMD CrossFire this year, is there any indication from the new motherboards and chipsets that there is a plan for this? The 2x16 PCI Express slot layout for dual-GPU setups does not seem to be making a return at all.

The popularity of temporal effects (which make alternate-frame rendering not that easy) and full-screen effects (which make interlaced or split-frame rendering not that easy), which already made using two GPUs really hard, did not go away at all; if anything it has only become more prevalent. And now we have even added full-scene effects; would all of Nanite and Lumen run on both GPUs anyway? DX12 and Vulkan are all about manually and carefully maximizing a single GPU's performance, and as frame rates go up, latency becomes more and more of a factor.

The small market of motherboards ready to receive two GPUs, the sheer size of modern GPUs, and a market that already offers a 4090-level GPU instead of making you buy two or three cards to do the same thing all make it really hard commercially.

It is for full-on enthusiasts: faster than an XTX but slower than a 4090, saving a little bit of money.
The "I will buy a card now and buy a used copy of the exact same model a bit later" play is not that attractive in today's world; GPUs take an eternity to ever get cheap.

And with how niche it would be, game engines and game developers would not put resources into it, creating a vicious circle of never becoming good or popular.

As long as Nvidia lets people buy all the GPU you can realistically power in a North American machine in one nice package, this seems like a small landing strip.
Stop ruining my dream for mgpu to make a comeback.
 
Stop ruining my dream for mgpu to make a comeback.
I would be so down for mGPU again. But they have got to fix the latency issues AND not have to write drivers for every game that comes out (and sometimes you'd wait weeks for a profile to even arrive to make it work). Just make it work without having to tweak shit to get it right.
 
I would be so down for mGPU again. But they have got to fix the latency issues AND not have to write drivers for every game that comes out (and sometimes you'd wait weeks for a profile to even arrive to make it work). Just make it work without having to tweak shit to get it right.
Are you saying that big RDNA 5 will also be scrapped 🤔
 