RTX 4xxx / RX 7xxx speculation

I thought you were gonna say to do what LTT did and use your pool as a radiator! I don't know if they actually did that, but someone on here mentioned it. You'd have to pre-filter out the leaves and stuff.
I do not think you can use the pool water itself as a reservoir (modern water blocks with ultra-thin fin systems need pretty near perfect water, and some would worry about the many pool chemicals used); you would just run tubing through it as a heat exchanger.
 
Yeah, it's laid under the cement of the pool bottom and it radiates heat into the pool, like heated floors.
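For scale, a quick sketch of why a pool makes such a forgiving heat sink; the pool volume, heat load, and run time below are made-up illustration numbers, not anything from LTT's build:

```python
# Rough estimate of pool temperature rise from dumping PC heat into it.
# All numbers (pool size, heat load, run time) are assumed for illustration.

SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

def pool_temp_rise(heat_watts: float, hours: float, pool_litres: float) -> float:
    """Temperature rise (deg C) of a pool absorbing a constant heat load,
    ignoring losses to air/ground (so this overestimates the rise)."""
    energy_joules = heat_watts * hours * 3600.0
    mass_kg = pool_litres  # 1 litre of water is ~1 kg
    return energy_joules / (mass_kg * SPECIFIC_HEAT_WATER)

# e.g. a 1 kW loop running 8 hours into a 50,000 L pool
rise = pool_temp_rise(1000.0, 8.0, 50_000.0)
print(f"{rise:.2f} C")  # well under a degree
```

Even ignoring all losses, a whole gaming session barely moves the water temperature, which is why only the chemistry (and the leaves) is the real problem.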
 
If they're planning on charging $700 for the 4080 12 GB, this generation is already a massive disappointment. We'll see, but I'm not hopeful. If this is their plan, it'll allow AMD to do basically the same thing.
It is possible; $700 in 2022 dollars is around $590 in 2018 dollars, when Turing released. The 2070 FE was $600 at launch and the 2080 FE was $800 (about $950 now). The cheapest 3080 MSRP was $800 in 2022 dollars and it was impossible to buy at that price, so I am not sure why the 16 GB 4080 would not be at least that much or a bit more (i.e. $800+ USD).

When you think about it, if it is actually $700 when you buy it (and not just the listed price of an entry model that never really ships), and you are actually able to buy it, and it launches before Black Friday, that could be a major win versus the average month of the last two years.
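The inflation arithmetic above can be sketched out; the ~1.19 cumulative 2018-to-2022 CPI factor is an assumption inferred from the post's own $800-to-roughly-$950 conversion, not an official figure:

```python
# Back-of-envelope inflation adjustment used in the post.
# CPI_2018_TO_2022 is an assumed cumulative factor (~19%), chosen to be
# consistent with the post's own $800 (2018) -> ~$950 (2022) conversion.
CPI_2018_TO_2022 = 1.19

def to_2018_dollars(price_2022: float) -> float:
    """Deflate a 2022 price back to 2018 dollars."""
    return price_2022 / CPI_2018_TO_2022

def to_2022_dollars(price_2018: float) -> float:
    """Inflate a 2018 launch price into 2022 dollars."""
    return price_2018 * CPI_2018_TO_2022

print(round(to_2018_dollars(700)))  # ~588, close to the post's "$590"
print(round(to_2022_dollars(800)))  # 952, close to the post's "~$950"
```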
 
It's the cut-down AD103 chip; my guess is it'll be marginally faster than the 3080 12 GB we already have. These chips were typically reserved for 70-series parts in the past. At the very least, they should name it a 4070 Ti to avoid confusion.
 
The rumored 4070 specs were very similar but with a 160-bit bus instead of 192-bit, and with the 160-bit bus the rumours put it a bit above a 3090 Ti in games, so it is a bit strange.

I would either not be too optimistic about the announced pricing (at least not forgetting that inflation alone adds $100+ at the higher end if prices stay exactly the same, not to mention China tariffs being added back if the exclusion that expires Dec. 31, 2022 is not renewed), or about the actual street price if they do announce low prices.

As for being buyable before Black Friday, hmm, maybe?:
https://www.tweaktown.com/news/8844...waiting-in-warehouses-since-august/index.html

If they were building stock for an August-September release and have a couple more months of production in stock than usual, if some of the planning (like reserving capacity at TSMC) was made when demand was sky-high, and if demand stays low... maybe an October launch will be less than the usual terribleness.
 
Lord have mercy.

https://videocardz.com/newz/galax-new-serious-gaming-graphics-card-with-four-fans-has-been-leaked


[Attached image: GALAX-GEFORCE-SG-GPU3-768x513.jpg]
 
That looks ridiculously huge and awesome, like sticking an RGB edition of War and Peace in your PC.
 
The phone number displayed in the Project Beyond teaser goes to voicemail that says "Tell us, how fast would you like to go?" followed by the beep.
Seriously? 🤣

Judging by the coolers and fans in recent leaks it looks like they are getting ready for liftoff 🤣🤣🤣
 
For me that looks ridiculously ugly and stupid, but more importantly unusable for those who actually like to use the other slots in their machine. Basically: 2-slot hybrid first choice, water-cooled block second choice.
 
If the rumored TGP is true, then the 4090 will dump up to 660W into the cabinet. I think there will be plenty of watercooling options for those cards then, so don't worry. :)

With the new power levels and design, I think many of us will have to rethink our setups, even get new PSUs that better handle transient spikes and that have a PCIe 5.0 power connector supporting that much power to the GPU. Perhaps some USB adapters/hubs for whatever you use the other slots for.

I am looking at the 4080 16 GB variant or an AMD equivalent depending on noise reviews at launch (hoping for a big-ass cooler on that one too). Should I decide to go all in with the 4090, I might have to reverse the fans in my cabinet (SL600M, 2x 200mm top + 2x 200mm bottom) so I can get the hot air from the GPU out as fast as possible without heating everything else up first.
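On getting that heat out of the case, a standard sensible-heat estimate gives a feel for the airflow involved; the 660 W load and 10 °C exhaust rise below are assumed example numbers:

```python
# How much airflow is needed to remove a given heat load at a chosen
# exhaust-air temperature rise. Standard sensible-heat estimate; the
# 660 W load and 10 C rise in the example are assumed numbers.
AIR_DENSITY = 1.2           # kg/m^3 (room temperature, sea level)
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg*K)
M3S_TO_CFM = 2118.88        # cubic metres/second -> cubic feet/minute

def airflow_cfm(heat_watts: float, delta_t_c: float) -> float:
    """Volumetric airflow (CFM) needed so exhaust runs delta_t_c above intake."""
    m3_per_s = heat_watts / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_c)
    return m3_per_s * M3S_TO_CFM

# 660 W with exhaust air 10 C warmer than intake:
print(f"{airflow_cfm(660.0, 10.0):.0f} CFM")  # roughly 116 CFM
```

That is well within what a pair of 200mm fans can move, but only if the airflow actually sweeps past the GPU instead of recirculating, which is the point of reversing them.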

I hope we get to know more from the GTC broadcast tomorrow.
 
It's going to be an interesting launch this time: Nvidia with a monolithic die on TSMC 4N and AMD with a chiplet design on N5.
Isn't the chiplet design only separating the memory controller and I/O from the main chip? A single monolithic main compute die still seems to be the latest rumor. We will see how significant it is to have a memory die and a graphics die. Maybe it lets them put only one on 5nm and the other on a cheaper node; most of the gain this generation could be on the cost side (not only maximizing node cost, but also less waste of the expensive graphics die if you no longer have to throw away chips whose memory-controller section is defective).
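The yield argument in that last parenthesis can be sketched with the classic Poisson yield model; the defect density and die areas below are made-up numbers, not RDNA 3 figures:

```python
import math

# Why splitting memory/IO off the compute die can cut waste: under a simple
# Poisson defect model, the fraction of good dies falls exponentially with
# die area. Defect density and die areas are assumed illustration numbers.

def poisson_yield(area_cm2: float, defects_per_cm2: float) -> float:
    """Fraction of defect-free dies under the classic Poisson yield model."""
    return math.exp(-area_cm2 * defects_per_cm2)

D0 = 0.1  # defects/cm^2 (assumed)
mono = poisson_yield(5.0, D0)       # one 500 mm^2 monolithic die
compute = poisson_yield(3.5, D0)    # 350 mm^2 compute die alone
io = poisson_yield(1.5, D0)         # 150 mm^2 memory/IO die alone

# The combined yield per *set* is the same (compute * io == mono here), but
# a defect in the cheap IO die no longer scraps the expensive compute die,
# and the IO die can sit on a cheaper node entirely.
print(f"monolithic: {mono:.2f}, compute die alone: {compute:.2f}, IO die: {io:.2f}")
```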
 
Maybe it lets them put only one on 5nm and the other on a cheaper node; most of the gain this generation could be on the cost side.

Cost and power. From what I've seen, the memory controllers are on 6nm and might be something the drivers can turn off when not in use.

Also, there are still rumors about a multi-GPU setup for workstations. Obviously, not exciting stuff for gamers, but given that AMD is expected to beat Nvidia on efficiency by pretty much everyone, this could be the generation that lets them take a good chunk of compute stuff.
 
There have been rumors of more than a single GPU chiplet plus an I/O chiplet (7950 XT). Always take it with a huge bucket of salt though. In the Zen space, you can have 2 (consumer) to 8 (Pro/Threadripper) CPU dies combined with I/O and Infinity Fabric cache. A bit harder with GPUs and parallel tasking, but who knows... :) The chiplet design might allow for more voltage and higher clocks too. It's a bit exciting nevertheless. :)
 
Official AMD communication is promoting power efficiency but not higher clocks 🤔

https://community.amd.com/t5/gaming...gAfCsJiyJfp00InSABRvAJsXcMXTtc6BDEN6PPSU2vBD4
It provides the thing we actually care about: AMD RDNA™ 3 is on track to deliver an estimated >50 percent better performance per watt than the AMD RDNA™ 2 architecture.

Frequency info is nice for very similar architectures, but if the memory-vs-graphics split is a significant change, I am not sure how much it tells us (e.g. do all chiplets run synced, and are we talking about 4 GHz for the main graphics die?), though obviously, everything else being equal, higher is always better.

RDNA 2 was 54% better per watt than RDNA according to the link (which sounds right), so that would be two ~50% improvements in a row. Even with two years between launches that is quite impressive, and I imagine in line with the ambitious timeline projections AMD shows us from time to time.
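Compounding the two claims is simple enough to sketch; the 1.50 figure takes AMD's ">50%" at its floor:

```python
# Compounding the two generational perf-per-watt claims from AMD's post:
# RDNA -> RDNA 2 was +54%, and RDNA 3 is claimed at >50% over RDNA 2.
rdna1_to_2 = 1.54
rdna2_to_3 = 1.50  # the ">50%" claim, taken at its floor

total = rdna1_to_2 * rdna2_to_3
print(f"RDNA -> RDNA 3: {total:.2f}x perf/watt")  # 2.31x over two generations
```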
 
Seems pretty standard in the picture, will be interesting to see if the rumours around the power requirements are indeed true.
 
The claim is that with DLSS 3 the 4090 is 2-4x faster than the 3090 Ti with DLSS 2... But will DLSS 3 be exclusive to RTX 4xxx? No mention there :)
 
Imagine if DLSS 3 only works on the new cards…. Yikes. Especially if it’s artificially limited to them via software.
 
Of course it will. I doubt there is even a technical limitation on the 30-series cards that would prevent them from taking advantage of 3.0.
I think they were showing an extra unit that the RTX 40 cards contain which sequences the ray tracing for better performance, aka DLSS 3.
 
The 4080 16GB is overpriced if it only offers 4GB more VRAM. I'll likely be a sucker and buy it though. 4090 is overkill for my uses.
 
Jumping on the 4090. Still rocking 2070 Super which has served me well for 3+ years now. $1599 is a good price, I was afraid they were going to go crazy and ask for $1999+ due to crypto (thankfully that has crashed)
 
Part of me knows it won't work as well as advertised, but damn if I don't want some of that "Omniverse" / AI-enhanced (upscaling) modding tool.

That Morrowind scene looked awesome.
 