RTX 4xxx / RX 7xxx speculation

I just hope mining activity stays relatively low and the supply chain stabilizes so that the insane demand doesn't evaporate stock
And that Intel helps somewhat; not sure if it can, but consider:
Intel Alchemist is on TSMC 6nm
AMD is on TSMC 5nm
And Lovelace (at least at the high end) is on TSMC 4nm, while Ampere continues on Samsung 8nm

And this launch won't be at the same time as a console launch; while the consoles will still be in giant demand for the holidays, they are still on TSMC 7nm.

It seems like it should at least be easier to ramp up, if the substrates and everything else needed can keep up. Or maybe I am naive and 6nm, 5nm, etc. is not mostly new capacity but mostly existing processes transferred onto the old supply chain.

If crypto stays extremely high like now but doesn't grow significantly, maybe things could get back to "normal" less than 6-7 months after launch this time. By normal I mean very high prices, but cards you can actually just buy in a simple way, like around 2017?
 
I am assuming that most consumers (that was an important part of the message) would not be interested in doing electrical work just so their computer can boot; using their current electrical plug is a must.

A lot of consumers use Wi-Fi even for devices that never move (or use a laptop inside their own house); imagine asking them to run a 220V line.

Maybe the question was about the typical buyer of 3090-class cards, who might do that, and you could certainly have a point that they are far from the typical PC consumer. In the next decade, with electric cars charging at home, 400-amp service and higher-voltage outlets in more places could become more common.
Oh I know, it wasn't a serious suggestion. I'd consider it a fun DIY project and then explain to people that the big round plug is for a pc :D But that's just me.
 
Sounds like you have it all figured out. Buy what you like.

Well, you can debate it. My comment wasn't aimed at anyone buying anything. Every report is saying the 4090 is going to be pushed into the 600W region to push it to its limit. So yeah, I will definitely buy what I want; what I am stating is just what is leaking.
 
Well, you can debate it. My comment wasn't aimed at anyone buying anything. Every report is saying the 4090 is going to be pushed into the 600W region to push it to its limit. So yeah, I will definitely buy what I want; what I am stating is just what is leaking.
How do you even cool a 600w card... look at the FTW3 coolers. They're massive. Will all 4090 cards be hybrids?
 
You know, most households have an outlet for a clothes dryer: 220v at 30 amps. That should be enough for another generation or 2 of graphics cards. You just have to move your pc to the laundry room.

A lot of electric stoves are 220v 40A. Just unplug the stove. ;)

I personally prefer the option of a 600W card assuming it backs it up with performance. Not sure why everyone gets their panties in a twist. Most of us ran SLI or crossfire using a lot more. It should be most people’s dream on [H] to have this option and not have to deal with sli/crossfire IMO given this is an overclocking website, at least in the old days.

Now if AMD offers the same performance at 2/3 the wattage that’s a different story but we don’t know that yet.

All said and done, I don’t expect to need over 3080 performance for a long long time. All my games are maxed out and the only thing I am excited for is The Witcher 3 with ray tracing…
 
Most circuit breakers in North America top out at around 1,500 watts. Some newer ones will handle 1,800.
The two most common breakers in the US are 15 and 20 amp.

Multiply that by 120 volts and you get the max watts, though it is recommended you not exceed 80% load.

120*15=1800 watts and then *.8 = 1440 watts.
120*20=2400 watts and then *.8 = 1920 watts.
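For anyone wanting to poke at the numbers, here is a tiny Python sketch of exactly that arithmetic (120V circuits with the 80% continuous-load rule, as described above):

```python
# Max wattage and recommended continuous load for the common US breaker sizes.
VOLTS = 120

for amps in (15, 20):
    full_rating = VOLTS * amps        # nominal circuit capacity in watts
    continuous = full_rating * 0.8    # 80% rule of thumb for continuous loads
    print(f"{amps} A circuit: {full_rating} W max, {continuous:.0f} W continuous")

# 15 A circuit: 1800 W max, 1440 W continuous
# 20 A circuit: 2400 W max, 1920 W continuous
```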
 
You know, most households have an outlet for a clothes dryer: 220v at 30 amps. That should be enough for another generation or 2 of graphics cards. You just have to move your pc to the laundry room.
It will fit right in there, considering you also need to upgrade cooling solutions to the same noise levels as the dryer.
 
How do you even cool a 600w card... look at the FTW3 coolers. They're massive. Will all 4090 cards be hybrids?

Wouldn't be a surprise, actually. I believe AMD had a generation where all of their top cards were AIO cooled, I think...? Question is how big will the radiator need to be...

Also I just realized that GamerNexus mentioned PCIE5 in their video... will these cards require PCIE5? That pretty much 100% cuts them off for me as I just got done upgrading to a 5950x and this is only PCIE4...
 
Wouldn't be a surprise, actually. I believe AMD had a generation where all of their top cards were AIO cooled, I think...? Question is how big will the radiator need to be...

Also I just realized that GamerNexus mentioned PCIE5 in their video... will these cards require PCIE5? That pretty much 100% cuts them off for me as I just got done upgrading to a 5950x and this is only PCIE4...
Yeah, I think it was the R9 Fury X?

I'm guessing they'll have to be triple rads; I mean, the 3090 Ti Kingpin is triple. It'll be interesting to see for sure.
 
I'd definitely welcome the next generation of cards offering us similar performance with lower power draw/temps. I need an AC window unit in my office that has 2 3080 equipped PCs, lol.
 
I'd definitely welcome the next generation of cards offering us similar performance with lower power draw/temps. I need an AC window unit in my office that has 2 3080 equipped PCs, lol.
Same here! Especially so that technology trickles down into laptop performance
 
For what it's worth, the people that will buy these 600-800W parts already have electricians:

 
The two most common breakers in the US are 15 and 20 amp.

Multiply that by 120 volts and you get the max watts, though it is recommended you not exceed 80% load.

120*15=1800 watts and then *.8 = 1440 watts.
120*20=2400 watts and then *.8 = 1920 watts.
Oh good, I'm glad we still got headroom for the 5090 :ROFLMAO:
:ROFLMAO::ROFLMAO:
 
The two most common breakers in the US are 15 and 20 amp.

Multiply that by 120 volts and you get the max watts, though it is recommended you not exceed 80% load.

120*15=1800 watts and then *.8 = 1440 watts.
120*20=2400 watts and then *.8 = 1920 watts.
Glad I am in Europe: 13 A × 230 V = 2990 W, or 2392 W at 80% load, on the circuit breaker the computer sits on. Should just be able to squeeze in a portable air conditioner along with a 600W GPU and 250W CPU...
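For what it's worth, a rough budget check under the same 80% rule (the GPU and CPU figures are from the post; the rest-of-system and air-conditioner wattages are just my guesses for illustration):

```python
# Rough headroom check for a 13 A / 230 V circuit with the 80% rule applied.
headroom = 13 * 230 * 0.8              # ~2392 W continuous

loads_watts = {
    "GPU": 600,
    "CPU": 250,
    "rest of system + monitor": 150,   # assumed figure, not from the post
    "portable air conditioner": 1000,  # assumed figure, typical small unit
}

total = sum(loads_watts.values())
print(f"total ~{total} W vs {headroom:.0f} W headroom -> {headroom - total:.0f} W to spare")
# total ~2000 W vs 2392 W headroom -> 392 W to spare
```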
 
OMG go VR already!

VR or not, you need to have the setup for the physical simulator, and those aren't like controllers, you need to plan out the installs.

But for racing or flight sims, you're probably better off with the multiple screens, because you need to look at your controls. She's a real rally racer, that's her setup for practice off the track.
 
VR or not, you need to have the setup for the physical simulator, and those aren't like controllers, you need to plan out the installs.

But for racing or flight sims, you're probably better off with the multiple screens, because you need to look at your controls. She's a real rally racer, that's her setup for practice off the track.
I've seen people model actual Spitfires and other aircraft to match real life, but in a rally car I don't imagine there is much your hands wouldn't already know how to find.
I am not a big fan of separate screens, as they do not move with the motion simulator. For her, she is just doing repeat practice and memorizing the track, so it all really doesn't matter.
As far as a gamer goes? VR kicks the shit out of that for true immersion.
 
I've tried dirt rally and AC in VR, it made me feel like I was going to puke my brains out, I would way rather have a triple screen like her.

Most pro simracers use triple screens, not VR.
 
A bunch of info from kopite today.

https://twitter.com/kopite7kimi/status/1519164336035745792
https://twitter.com/kopite7kimi/status/1519182862699823106

RTX 4080 will use AD103 chips, built with 16GB of GDDR6X, and have a similar TGP to GA102. RTX 4070 will use AD104 chips, built with 12GB of GDDR6, 300W. Neither has started testing yet, but they will soon.

In fact, there is another full-fat AD102 SKU with a 900W TGP, 48GB of 24Gbps GDDR6X, 2×16-pin connectors, and higher frequency. But no one knows whether it will become an actual product. Since the AD102 test board has more than two 16-pin connectors, everything is possible.
 
A bunch of info from kopite today.

https://twitter.com/kopite7kimi/status/1519164336035745792
https://twitter.com/kopite7kimi/status/1519182862699823106

RTX 4080 will use AD103 chips, built with 16GB of GDDR6X, and have a similar TGP to GA102. RTX 4070 will use AD104 chips, built with 12GB of GDDR6, 300W. Neither has started testing yet, but they will soon.

In fact, there is another full-fat AD102 SKU with a 900W TGP, 48GB of 24Gbps GDDR6X, 2×16-pin connectors, and higher frequency. But no one knows whether it will become an actual product. Since the AD102 test board has more than two 16-pin connectors, everything is possible.
900W😲
 
They've officially lost their fucking minds if this is true. Hopefully AMD has much improved ray-tracing performance. Stuff like this makes me think the 4070 (card i'd likely buy) will be 400-450 watts.
 
They've officially lost their fucking minds if this is true. Hopefully AMD has much improved ray-tracing performance. Stuff like this makes me think the 4070 (card i'd likely buy) will be 400-450 watts.
The same source behind that halo professional product (which seems to be a 48GB card, not for gaming) is saying that the 4080 and 4070 will be more similar to Ampere, topping out around 300W.
 
I guess they're just following the trend of datacenter GPUs. Repackage a high-end datacenter GPU with less vram and call it the 4090, for example. Their latest Hopper GPUs are pulling up to 600W apiece so that would be the upper end for the 4090 type model I'm guessing.
 
They have obviously hit a wall, and the power consumption of cards and CPUs will have to keep increasing for sure... power supplies above 1000W will become normal if you want performance.
 
The same source behind that halo professional product (which seems to be a 48GB card, not for gaming) is saying that the 4080 and 4070 will be more similar to Ampere, topping out around 300W.

I think the target for those parts is still in the 500 watt range, 600 watts for 4090.
 
I think the target for those parts is still in the 500 watt range, 600 watts for 4090.
500 watts for the xx70 product seems like a lot to me. Would that exclude selling them in pre-builts in places like California?
 
500 watts for the xx70 product seems like a lot to me. Would that exclude selling them in pre-builts in places like California?

I thought they only had restrictions for OEMs/pre-builts. Can't Californians just buy the parts separately like everyone else?
 
They've officially lost their fucking minds if this is true. Hopefully AMD has much improved ray-tracing performance. Stuff like this makes me think the 4070 (card i'd likely buy) will be 400-450 watts.
People will still buy them. But at what point does having to consider the electrical wiring of your house affect your purchase behavior? A 1000w computer just seems bonkers for the home consumer. These seem like they'd be for professional use, I'd assume.
 
I thought they only had restrictions for OEMs/pre-builts. Can't Californians just buy the parts separately like everyone else?
That was my thought as well, hence the (badly worded) question about whether that would exclude selling them in pre-builts in places like California.

Anyway, we will see, but it seems like a lot, a ~125% jump for an upper-mid-tier card, especially if they still have the peak power spikes that Ampere has. For a 500W card I imagine you want to budget about 550W of room on your PSU.
 
Wouldn't these higher power cards be a bad purchase for miners? I would assume the performance/watt would be around parity from the previous generation, if not worse. I guess it's all speculation for now
 
That was my thought as well, hence the (badly worded) question about whether that would exclude selling them in pre-builts in places like California.

Yeah, probably. OEMs will ship with placeholder cards and big PSUs knowing that consumers will swap them out.

Wouldn't these higher power cards be a bad purchase for miners?

No, they'll undervolt and underclock them as long as they have at least 8 gigs. But miners really aren't the problem, there's so much other stuff going on that even if they all quit, there'd still be supply issues. Because this is the first time any real progress has happened on all of the PC fronts at the same time, from NVMe to core counts, and everyone's trying to upgrade at the same time.
 
No, they'll undervolt and underclock them as long as they have at least 8 gigs. But miners really aren't the problem, there's so much other stuff going on that even if they all quit, there'd still be supply issues. Because this is the first time any real progress has happened on all of the PC fronts at the same time, from NVMe to core counts, and everyone's trying to upgrade at the same time.
Sheesh, is that really true? I thought it was primarily from COVID-related TSMC manufacturing issues and miners driving up prices. I've been really extending the life of my 1070 by not gaming as much lately.
 
Wouldn't these higher power cards be a bad purchase for miners? I would assume the performance/watt would be around parity from the previous generation, if not worse. I guess it's all speculation for now
Apparently the latest Hopper can get around 375 MH/s at 500-600 watts.

A 3060 Ti is around 60 MH/s at 120 watts when fully optimized, according to this:
https://miningchamber.com/gpu-mining/rtx-3060-ti-mining-settings/

If a data-centre card like Hopper is particularly good at that type of work it could be a wash, with stacking Ampere being quite the better deal depending on price points (which is a really important variable here, not just power; we can expect MSRPs high enough to make the product borderline for miners).
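Back-of-envelope, with placeholder prices purely to show how the price variable can flip the comparison (the MH/s and wattage figures are the ones above; the dollar amounts are not real quotes):

```python
# Hash-per-watt vs hash-per-dollar for the figures quoted above.
# Prices are hypothetical placeholders, only to illustrate the price-point argument.
cards = {
    # name: (MH/s, watts, assumed price in USD)
    "Hopper-class card":   (375, 550, 5000),
    "RTX 3060 Ti (tuned)": (60, 120, 700),
}

for name, (mhs, watts, price) in cards.items():
    print(f"{name}: {mhs / watts:.2f} MH/s per watt, {mhs / price * 1000:.0f} kH/s per dollar")

# Hopper-class card: 0.68 MH/s per watt, 75 kH/s per dollar
# RTX 3060 Ti (tuned): 0.50 MH/s per watt, 86 kH/s per dollar
```

So even if the big card is clearly better per watt, the cheaper cards can still win per dollar, which is the point about price being the important variable.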
 
Wouldn't these higher power cards be a bad purchase for miners? I would assume the performance/watt would be around parity from the previous generation, if not worse. I guess it's all speculation for now
Depends on efficiency and cost. The cost will probably make them a bad buy, but they probably will be really good in terms of efficiency. Currently the RTX A2000 and CMP 170HX (a cut-down A100) are the most efficient cards. The CMP 170HX wins by a fair margin, but even though it's maybe half the price of an A100, it's still much more expensive per hash than something like the RTX A2000, so it can make sense to buy the less efficient cards and get twice the horsepower out of them.

I imagine Nvidia will do something similar next gen and release a high-end mining version of these GPUs with less VRAM. Those probably won't be cards gamers want anyway, and it should give miners a more affordable option.
 