RTX 4xxx / RX 7xxx speculation

They're gonna do it. No one is playing at 8K, and it's been shown that a heap of cheaper cache can (somewhat) make up for a narrower memory bus. Any way to cut costs.
After everyone made such a big fuss about memory bus width this generation, there's a certain irony to it.
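
To put rough numbers on that argument (all figures below are made up for illustration, not from any leak): effective bandwidth is roughly a blend of cache and DRAM bandwidth weighted by the hit rate, so a big on-die cache can paper over a narrower bus.

```python
# Rough illustration with made-up numbers: how a large on-die cache can keep
# effective bandwidth up even when the external memory bus gets narrower.

def dram_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak DRAM bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

def effective_bandwidth(dram_bw: float, cache_bw: float, hit_rate: float) -> float:
    """Blend cache and DRAM bandwidth by an assumed cache hit rate."""
    return hit_rate * cache_bw + (1 - hit_rate) * dram_bw

wide   = dram_bandwidth_gbs(384, 19.5)   # e.g. a 384-bit card: ~936 GB/s
narrow = dram_bandwidth_gbs(192, 21.0)   # e.g. a 192-bit card: ~504 GB/s

# Assume (purely hypothetically) a big L2 / Infinity-Cache-style pool serving
# hits at ~2 TB/s with a 50% hit rate.
print(f"wide bus, small cache : {wide:7.1f} GB/s")
print(f"narrow bus, big cache : {effective_bandwidth(narrow, 2000, 0.5):7.1f} GB/s")
```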
 
Gotta wonder where the rest of the lineup falls below that. Their mid- and lower-end chips will have very limited VRAM if there are no 32 Gb (4 GB) GDDR6 chips (rough math below).
4070 Ti - 192-bit, 12 GB, launched later
4070 - 128-bit, maybe 8 GB?
4060 Ti - same as the 4070
4060 - 96-bit with 6 GB VRAM????
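
The VRAM guesses above mostly fall out of the bus width: each GDDR6 chip hangs off a 32-bit channel, and the densest common chips today are 16 Gb (2 GB). A quick sketch using the rumored bus widths (speculation, not confirmed specs):

```python
# Max VRAM implied by bus width, assuming one 32-bit channel per GDDR6 chip and
# 16 Gb (2 GB) chips in a normal (non-clamshell) layout.
# The bus widths below are rumored/speculative, not confirmed.

CHANNEL_BITS = 32
CHIP_CAPACITY_GB = 2   # 16 Gb = 2 GB, the densest common GDDR6 part today

rumored_bus_widths = {
    "4070 Ti": 192,
    "4070":    128,
    "4060 Ti": 128,
    "4060":     96,
}

for sku, bus in rumored_bus_widths.items():
    chips = bus // CHANNEL_BITS
    print(f"{sku}: {bus}-bit -> {chips} chips -> {chips * CHIP_CAPACITY_GB} GB max")
```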
 
Some good news: it seems Nvidia have actually dropped prices on their UK store, by as much as £300. They were previously offering a £280 rebate on the 3090 Ti only, but now it's an across-the-board upfront discount:

https://store.nvidia.com/en-gb/geforce/store/?page=1&limit=9&locale=en-gb

I've been following the prices there regularly ever since they first started selling FE cards in the UK to help me gauge how they will react in future generations, and they've never dropped prices like this before. (y)
 
Nvidia gimping the base 4080 model to keep the facade of a $700 MSRP while the "real" 4080 costs up to $1,200?
Or... it's just an internal test sample. Or the leakers got the models mixed up again, i.e. the 4080 12 GB is actually a 4070 Ti or something.

The leaker also posted a 3090 Super and a 3080 20 GB in the last few days, so...
I would suspect that NVIDIA is refactoring the product lineup to keep retail prices in check. 8 Gb GDDR6 was $10-16 per chip in bulk, depending on speed. GDDR6X pricing has never been made public, so there's no telling how much those "high speed" 16 Gb chips cost. But then it raises the question of whether they're really saving money on the bill of materials, and whether it's worth the overall reduction in bandwidth.
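
For what it's worth, a very rough memory-only BOM sketch. Only the $10-16 figure for an 8 Gb GDDR6 chip comes from the paragraph above; the 16 Gb GDDR6 and GDDR6X prices below are pure guesses, since those numbers have never been published:

```python
# Very rough memory-subsystem BOM sketch. Only the $10-16 per 8 Gb GDDR6 figure
# comes from the discussion above; the other prices are placeholder guesses.

def memory_bom_usd(bus_width_bits: int, price_per_chip_usd: float) -> float:
    chips = bus_width_bits // 32          # one 32-bit channel per chip
    return chips * price_per_chip_usd

# Hypothetical per-chip prices: 16 Gb GDDR6 ~$20, 16 Gb GDDR6X ~$30.
print("256-bit GDDR6X:", memory_bom_usd(256, 30), "USD")   # 8 chips
print("192-bit GDDR6X:", memory_bom_usd(192, 30), "USD")   # 6 chips
print("192-bit GDDR6 :", memory_bom_usd(192, 20), "USD")   # 6 chips
```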
 
Maybe the “Supers” will have higher bandwidth…?

Seems like some segmentation or maybe it is cost shaving to hit price points or keep margins.
 
Maybe the “Supers” will have higher bandwidth…?

Seems like some segmentation or maybe it is cost shaving to hit price points or keep margins.
I think NVIDIA did away with the SUPER branding with the Ampere generation. It was just a thing with Turing due to yield issues before the process was improved. They could introduce variants with more memory in the future, like the 6GB GTX 780 and 12GB RTX 3080.
 
First they would need to come out with 32 Gb (4 GB) modules, which I doubt will happen at this stage of GDDR6.

Everything below the 4080 will be 192-bit or less, meaning 12 GB of VRAM AT MOST.
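
Spelling out that ceiling (same per-chip assumptions as the earlier sketch, with the 32 Gb density being purely hypothetical):

```python
# A 192-bit bus is six 32-bit channels, so capacity is capped by chip density.
chips = 192 // 32
print("16 Gb (2 GB) chips:", chips * 2, "GB")   # 12 GB, today's ceiling
print("32 Gb (4 GB) chips:", chips * 4, "GB")   # 24 GB, only if such modules ever ship
```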
 
It looks like someone with access to GTC info has leaked a few things.
Two new FE images, which show the same heatsink, frame indentations, and enlarged fan as previous leaks.

https://twitter.com/QbitLeaks/status/1567380930389544960/
https://twitter.com/QbitLeaks/status/1567342466109423618

He also posted some frame-rate charts.

https://twitter.com/QbitLeaks/status/1567378137100460037

You might notice these graphs look similar to Nvidia's past fps charts (green and gray bars) like this. These are easy to fake, though.
He also notes this is NOT the flagship GPU.

(Attached performance charts: 40vs30.png, 40vs30_2.png)
 
I have a 3080 and am interested mainly for the increased VRAM...but I really don't want to have to buy a new power supply
 
4090 Chinese TimeSpy leak: 3 GHz, 20k in Time Spy Extreme.

https://videocardz.com/newz/alleged...-3-0-ghz-scores-20k-points-in-timespy-extreme

"The leaker claims that the graphics card is air-cooled and it is huge. It looks to be designed for 600 to 800W but the TDP is 450W. This suggests this might indeed be GeForce RTX 4090 GPU which is rumored to feature such board power."

To clarify, if you look closely at the pic, the GPU is actually running at 65°C.
The VCZ article says 55°C.
 
Guess I was lucky after all, buying my 3090 Ti Ultra at $1,199.99 "in stock"...instead of waiting to buy at $1,099.99 "out of stock" then. ;)
 
4 slot cooler? Fuck. I guess that won't fit in my Falcon Northwest case.
 
Nvidia may have bought a bunch back.

Would they actually do that? I kind of doubt it, but I have no idea. Unless they're dumping former AIB cards into prebuilt PCs with bottom-of-the-barrel PSUs and motherboards, which companies can sell at an artificially high price by pairing a good GPU with otherwise low-end parts.
 
So what's the 'killer app' gonna be for this gen then? Seems it's gonna be about the most jaded generation yet.

1080 - 4K Gaming!
2080 - Ray Tracing!
3080 - Mining Playable Ray Tracing + 8K!
4080 - ???

o_O
 
So what's the 'killer app' gonna be for this gen then? Seems it's gonna be about the most jaded generation yet.

1080 - 4K Gaming!
2080 - Ray Tracing!
3080 - Mining Playable Ray Tracing + 8K!
4080 - ???

o_O
Yeah, I mentioned before that these GPUs are too fast to be useful and got butchered for it.
I guess 4K 144 Hz? Or higher raytracing settings without DLSS? VR? People will find a way.

GPUs seem to be advancing faster than graphics and monitor technology.
 