RTX 4xxx / RX 7xxx speculation

Speaking of that source... HardOCP should make a return. Not sure if there's money to be made in articles these days, but I do miss the reviews. On topic, we're closing in, and hopefully we have some exciting news. A lot of the speculation is 2x raster performance; that would probably be enough for me to switch if the price is right. I'm assuming that's only at the 7900xt level vs the 6900xt.
 
I am waiting to see what the 4080 and AMD 7000 series have to offer. Corsair says the 4x 8-pin adapter is not compatible with my AX1200 (I think they are wrong, and I'm still thinking about their cable), and now with the adapter connector issues I think I am going to wait it out.
 
Lol, I got shit on for posting a video about the power connector a while ago around launch time. Sorry it's happening, but I do feel vindicated. Did they just not want to have four 8-pins on the board? I guess that would get kind of absurd; this, on the other hand, is far worse.
That's my understanding for why nvidia pushed for the smaller connector. But the weird thing is, they have plenty of room to increase the size of their PCBs now that coolers are twice as large. Sure, it'd add extra materials cost, but it would work.

I'm sure they already went through all this, but TE and Molex have some pretty simple connectors that are capable of >1300W and don't take up much space. Maybe something in that form factor would work better since you wouldn't have loose pins to worry about.
[Attached image: te connector.jpg]


The 12VHPWR connector just reminds me too much of these hateful things.
[Attached image: 5391-00.jpg]
 
That's my understanding for why nvidia pushed for the smaller connector. But the weird thing is, they have plenty of room to increase the size of their PCBs now that coolers are twice as large. Sure, it'd add extra materials cost, but it would work.

I'm sure they already went through all this, but TE and Molex have some pretty simple connectors that are capable of >1300W and don't take up much space. Maybe something in that form factor would work better since you wouldn't have loose pins to worry about.
[Attached image: te connector.jpg]

The 12VHPWR connector just reminds me too much of these hateful things.
[Attached image: 5391-00.jpg]
Just for some scale:
Those TE connectors in your picture are using 14AWG on the left, 12AWG on the right, and 10AWG in the center. The Molex is 18AWG.
 
Just for some scale:
Those TE connectors in your picture are using 14AWG on the left, 12AWG on the right, and 10AWG in the center. The Molex is 18AWG.
Right. I think they're rated for something like 18kW :eek: I just mean the design seems compact enough, and you wouldn't have to worry about pins becoming misaligned. Probably more expensive than the Molex design, but who cares when it's on a $1600 GPU.
 
Right. I think they're rated for something like 18kW

It depends on how those watts are achieved. Watts are basically Volts x Amps. Current (amps) is what generates heat and requires larger (thicker) cables and connectors. It's possible to get a lot of watts from a thin cable and thin connector as long as you up the voltage but keep the amps low. For example, this is how you can get 240w over a USB-C cable, by keeping it at 5-amps but upping the voltage to 48v! But when you are fixed at 12v, the only way to up the wattage is to increase the amps. This is why jumper cables for car batteries are so thick; you're only dealing with 12-14v, but a TON of amps. The situation with GPU power is the same. Computer power supplies don't supply greater than 12v, so extra wattage has to come from more amps. This is a recipe for disaster when using small cables and/or small connectors.

It's the same logic behind why high-voltage power lines (that transport electricity over long distances) use such high voltages, often between 100,000 and 500,000 volts, so that they can transmit a LOT of watts while keeping the amps lower, which allows the use of "thin" cables. To transmit that many watts at a lower voltage would require cables that were multiple feet thick, which would obviously not be practical.

That kind of makes you wonder.... They went through all this effort designing the 12VHPWR connector for the ATX 3.0 spec - maybe they should have included a rail with more than 12V instead?
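To put rough numbers on the watts = volts x amps point, here's a minimal sketch (the rail voltages are hypothetical, not from any spec):

```python
# Watts = volts x amps: delivering the same 600 W at different
# (hypothetical) rail voltages shows why a 12 V-only supply forces high current.

def amps(watts: float, volts: float) -> float:
    return watts / volts

for volts in (12, 24, 48):
    print(f"600 W at {volts:>2} V -> {amps(600, volts):5.1f} A")

# The USB-C PD example from above: 240 W delivered as 48 V x 5 A
print(f"USB-C: 240 W at 48 V -> {amps(240, 48):.0f} A")
```

At 12v you need 50A for 600W; at 48v it drops to 12.5A, which is why the higher-voltage rail question comes up at all.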
 
It depends on how those watts are achieved. Watts are basically Volts x Amps. Current (amps) is what generates heat and requires larger (thicker) cables and connectors. It's possible to get a lot of watts from a thin cable and thin connector as long as you up the voltage but keep the amps low. For example, this is how you can get 240w over a USB-C cable, by keeping it at 5-amps but upping the voltage to 48v! But when you are fixed at 12v, the only way to up the wattage is to increase the amps. This is why jumper cables for car batteries are so thick; you're only dealing with 12-14v, but a TON of amps. The situation with GPU power is the same. Computer power supplies don't supply greater than 12v, so extra wattage has to come from more amps. This is a recipe for disaster when using small cables and/or small connectors.

It's the same logic behind why high-voltage power lines (that transport electricity over long distances) use such high voltages, often between 100,000 and 500,000 volts, so that they can transmit a LOT of watts while keeping the amps lower, which allows the use of "thin" cables. To transmit that many watts at a lower voltage would require cables that were multiple feet thick, which would obviously not be practical.

That kind of makes you wonder.... They went through all this effort designing the 12VHPWR connector for the ATX 3.0 spec - maybe they should have included a rail with more than 12V instead?

More than 12 volts DC is rare, as it comes with dangers and issues for people. Some time ago they talked about upping the voltage in cars, and ultimately they abandoned it due to issues with connectors and users being harmed. So I doubt you will see higher DC voltages being used in computers.
 
More than 12 volts DC is rare, as it comes with dangers and issues for people. Some time ago they talked about upping the voltage in cars, and ultimately they abandoned it due to issues with connectors and users being harmed. So I doubt you will see higher DC voltages being used in computers.

I'm not saying you are wrong, but I haven't heard anyone call 48v over a USB-C cable unreasonably dangerous, and people have MUCH more interaction with an external USB-C cable that is often handled while plugged in than with a power cable inside an enclosed computer that is generally only handled while the power supply is turned off or unplugged.
 
More than 12 volts DC is rare, as it comes with dangers and issues for people. Some time ago they talked about upping the voltage in cars, and ultimately they abandoned it due to issues with connectors and users being harmed. So I doubt you will see higher DC voltages being used in computers.
3D printers have mostly moved to 24v, and that's a hobby filled with Reddit/FB-tier Handy Andys. I'm sure PC builders would have zero issues with it.
 
I'm not saying you are wrong, but I haven't heard anyone call 48v over a USB-C cable unreasonably dangerous, and people have MUCH more interaction with an external USB-C cable that is often handled while plugged in than with a power cable inside an enclosed computer that is generally only handled while the power supply is turned off or unplugged.

Arcing is the biggest concern at 48 volts; it will eat the terminals quickly, so a very tight fit is critical, and that was the biggest issue.
 
More than 12 volts DC is rare, as it comes with dangers and issues for people. Some time ago they talked about upping the voltage in cars, and ultimately they abandoned it due to issues with connectors and users being harmed. So I doubt you will see higher DC voltages being used in computers.

Cars see loads in the dozens or hundreds of amps, in inclement weather, while covered with a not-so-fine spray of road salt. Very different situation from your average gamer battlestation. Also, electric cars run battery packs in the hundreds of volts.

Meanwhile, broad swathes of the HVAC industry run on 24v, 48v is pretty ubiquitous across the telecom industry, and all manner of data centers are going 48v.

48v isn't terribly dangerous, and it'd be even less so if people would tell Molex exactly where they could shove their awful connectors.
 
That's my understanding for why nvidia pushed for the smaller connector. But the weird thing is, they have plenty of room to increase the size of their PCBs now that coolers are twice as large. Sure, it'd add extra materials cost, but it would work.

I'm sure they already went through all this, but TE and Molex have some pretty simple connectors that are capable of >1300W and don't take up much space. Maybe something in that form factor would work better since you wouldn't have loose pins to worry about.
[Attached image: te connector.jpg]

The 12VHPWR connector just reminds me too much of these hateful things.
[Attached image: 5391-00.jpg]
I use Molex connectors on an industrial project we do at work; I had to put them together by hand. There are still some applications for them in engineering (my boss is an electrical engineer and a computer science guy). I'm not a real fan of making them, and their reliability isn't the best. I'm with you though; I would've preferred they use an existing solution that's stable. New standards always have issues.
 
On an ATX 3.0 PSU you would probably end up with the connector on the end of the modular 12VHPWR cable that plugs into your PSU burning up also.
Apparently, the issue has yet to be replicated on an ATX 3.0 PSU direct connection:
https://www.igorslab.de/en/adapter-...hot-12vhpwr-adapter-with-built-in-breakpoint/
However, "safe" only applies if, e.g., the supply lines from a power supply with a "native" 12VHPWR connector are of good quality with 16AWG wires, or at least if the 12VHPWR to 4x 6+2-pin adapter used delivers what it promises.

The overheating issue is apparently due to all the pins being connected together in the Nvidia adapter:

The overall build quality of the included adapter for the GeForce RTX 4090, which is distributed by NVIDIA itself, is extremely poor and the internal construction should never have been approved like this. NVIDIA has to take its own supplier to task here, and replacing the adapters in circulation would actually be the least they could do.
 
I bet EVGA is chuckling softly to themselves.

Maybe. They could have simply shipped their cards with a better adapter/connector and become an instant hero among consumers.

Just talked to some folks, and the socket itself is not yet being ruled out as the problem. If the socket is the issue and not the adapter, this is going to be an epic clusterfuck.

The connector itself frankly doesn't make any sense. It's only supplying one voltage - 12v, and all of the pins are connected together... So why do we even need a 12-pin (not counting the sense pins) connector? There should only need to be two large robust connectors, one positive and one negative. That's how basically every other 12v device in the world works. Something like this:

[Attached image: 12awg_banana.jpg]
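For scale, here's a rough sketch of the per-pin math. The six current-carrying 12v pins and the ~9.5A per-contact rating are my assumptions about the connector, not figures from this thread, and the even current split is an idealization:

```python
# Per-pin current on a 12VHPWR-style connector at 12 V.
# Assumed: 6 current-carrying 12 V pins, ~9.5 A per-contact rating,
# and current splitting evenly across whatever pins are actually seated.

RATING_A = 9.5  # assumed per-contact rating

def per_pin_amps(watts: float, volts: float = 12.0, pins: int = 6) -> float:
    return watts / volts / pins

for seated in (6, 5, 4):  # fewer pins seated as contacts loosen
    a = per_pin_amps(600, pins=seated)
    status = "OK" if a <= RATING_A else "over rating"
    print(f"600 W across {seated} pins: {a:.1f} A/pin ({status})")
```

Under those assumptions the margin is thin even with all pins seated, and losing one or two contacts pushes the rest over the rating, which is exactly the loose-pin worry.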
 

Now we know why Nvidia cranked up the power so high; their cards already have a rep for catching fire. (I know they're not completely related.)

They're worried. They need to hold onto as much mindshare as they can, and they're reaching for the brass ring by running these cards at the ragged edge.
 
Now we know why Nvidia cranked up the power so high; their cards already have a rep for catching fire. (I know they're not completely related.)

They're worried. They need to hold onto as much mindshare as they can, and they're reaching for the brass ring by running these cards at the ragged edge.
Uh, actually they were originally going for 600W and decided on 450W instead. There are many reasons Nvidia had better mindshare, though. AMD will probably sell every card they produce anyway, due to short supply and the devoted Linux fan base.
 
Uh, actually they were originally going for 600W and decided on 450W instead.

I think they were both playing rumor-mill chicken, knowing that they might actually run cards over 400 watts. For AMD's dual-CCD card (which supposedly never went into any real production), they said it would be 700 watts. Then, when they launch with "reasonable" power models, people are relieved.

I mean, before they started catching on fucking fire. I know, not directly related, but if Nvidia had stayed under 375W with regular connectors this wouldn't have been a problem in the first place. But then AMD would have probably topped Nvidia's flagship.
 
Looks like the 4080 is starting to land in people's hands in China:

https://www.chiphell.com/thread-2458296-1-1.html

A little bit of FPS data, which can be hard to compare directly (settings, CPU, RAM, etc.), but one interesting aspect if true:

https://static.chiphell.com/forum/202211/08/234632pzr84sgmmsg588y2.jpg

GPU clock: 2.9 GHz
Temp: 64 C

With low fan speed (42%):
320W full board draw, 265W for the GPU while under quite a good overclock, which I think would be a good step down from the 3080 FE.
 
Using a gen-for-gen comparison (3090 vs 4090, 3080 vs 4080) it's worth about $650.
I am not sure of the technique used, but the 3090 had ridiculously poor performance per dollar in games, so if it is some 3090-performance/4090-performance x something, it would end up with that kind of strangeness.

Considering how much people are paying for a 3080 Ti right now and the announced 7900xt price, I am not sure how 20% more than a 3090 Ti could be worth $650 in less than 2 weeks.
 
I am not sure of the technique used, but the 3090 had ridiculously poor performance per dollar in games, so if it is some 3090-performance/4090-performance x something, it would end up with that kind of strangeness.

Considering how much people are paying for a 3080 Ti right now and the announced 7900xt price, I am not sure how 20% more than a 3090 Ti could be worth $650 in less than 2 weeks.
The 3090 had poor performance value because the 3080 was so amazing. If we apply the same logic to the 4090, we end up with an amazing value 4080... That's the point.

4090 is 64% faster than the 3090 for 6.6% more money. (TPU summary 61% -> 100%)
4080 is 47% faster than the 3080. (TPU summary 81% -> 119%)

Use a calculator to normalize the price increase and you get 4.8% for the 4080, or $734. Higher-end cards have diminishing returns on cost, so the 4080 should be somewhere around $650-$700. Even if Nvidia wanted to be greedy, it would be $750-$800. Can't see it any higher.
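As a sanity check, here's a minimal sketch of that normalization, assuming launch MSRPs of $1499 (3090), $1599 (4090), and $699 (3080); depending on rounding it lands within a dollar or two of the $734 figure:

```python
# Scale the 3080's launch MSRP by the price-per-performance premium
# implied by the 3090 -> 4090 jump. MSRPs are assumed launch prices.

speedup_4090 = 0.64             # 4090 vs 3090 (TPU summary above)
premium_4090 = 1599 / 1499 - 1  # ~6.7% more money

speedup_4080 = 0.47             # 4080 vs 3080 (TPU summary above)
premium_4080 = premium_4090 * speedup_4080 / speedup_4090

print(f"implied 4080 premium: {premium_4080:.1%}")               # ~4.9%
print(f"implied 4080 price:   ${699 * (1 + premium_4080):.0f}")  # ~$733
```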
 
The 3090 had poor performance value because the 3080 was so amazing. If we apply the same logic to the 4090, we end up with an amazing value 4080... That's the point.
I am not sure that's particularly true. I think the 3090 had poor gaming value largely by design (a card with 24 gigs of VRAM was not a gaming card), more than the 3080 being such great value at $700 versus, say, the 1080.

And if the 3080 was so amazing at $700, why would the 4080 need to go cheaper?

Using the $1500 3090's in-game value as a benchmark would make anything look like it has incredible value, in a very distorted way.

Redo the same with the actual average sale prices of new 3080 and 3080 Ti cards over the last year.

Under that logic:
The 4090 is 64% faster than the 3090 for 6.6% more money.
The 7900xtx is, say, 65% faster than the 6800xt, so it should be a $693 card to be good value in the current market, and the just-35%-faster 7900xt a $673 card? I feel like I must be entering bad numbers or doing something wrong, because that all seems wrong.
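For what it's worth, plugging those numbers into the same normalization (assuming a $649 6800xt launch MSRP, my number) does reproduce the $693/$673 figures to within a dollar, so the arithmetic at least is consistent, even if the conclusion feels off:

```python
# The same normalization applied to AMD's side, using the
# 64%-faster-for-6.6%-more 4090 baseline from above.

def implied_price(base_msrp: float, speedup: float,
                  base_speedup: float = 0.64, base_premium: float = 0.066) -> float:
    return base_msrp * (1 + base_premium * speedup / base_speedup)

print(f"7900xtx, 65% faster: ${implied_price(649, 0.65):.0f}")  # ~$693
print(f"7900xt,  35% faster: ${implied_price(649, 0.35):.0f}")  # ~$672
```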
 
Using a gen-for-gen comparison (3090 vs 4090, 3080 vs 4080) it's worth about $650. Just highlights how amazing the 4090 is.

Except you're not supposed to use the outgoing generation's launch price to determine MSRP; that's how you get these insane price jumps. You're supposed to use the outgoing generation's current used and new-old-stock pricing.
 
https://videocardz.com/newz/nvidia-...r-than-rtx-3090-ti-in-the-first-gaming-review

Seems like either 280-300w or 450w+ could be the sweat spot on the 4080 models:

[Attached images: 526692, 526693]

If those 5 games and that setup hold up:
https://cdn.videocardz.com/1/2022/11/RTX4080-GAMES-768x355.jpg

Average 4k FPS
3090Ti: 86.73
4080, 320w: 103.422
4090: 134.93

Perfectly placed in between the 3090Ti and 4090, one $950 price tag away from being an excellent card.
The "sweat" spot? That is what I have in my armpits at the end of the day.

I notice this typo a lot on the internet these days.
 
First 7900xt numbers? Direct from AMD, and still suspiciously limited, maybe a "fix and optimize the drivers before showing clearer numbers" affair:

[Attached image: rx-7900-xtx-raster-perf.jpg]


[Attached image: rx-7900-xtx-ray-tracing-perf.jpg]


https://www.digitaltrends.com/computing/amd-rx-7900-xtx-benchmark-70-percent-boost/

FPS per card, then ratios relative to the 6950xt, then relative to the 7900xtx:

Game         6950xt   7900xt   7900xtx | 7900xt    7900xtx   | 6950xt    7900xt
             (FPS)    (FPS)    (FPS)   | /6950xt   /6950xt   | /7900xtx  /7900xtx
Raster
RE           124      157      190     | 1.27      1.53      | 0.65      0.83
COD          92       117      139     | 1.27      1.51      | 0.66      0.84
Cy2077       43       60       72      | 1.40      1.67      | 0.60      0.83
WD Legion    68       85       100     | 1.25      1.47      | 0.68      0.85
Ray tracing
RE           94       115      135     | 1.22      1.44      | 0.70      0.85
DL2          12       21       24      | 1.75      2.00      | 0.50      0.88
Cy2077       13       18       21      | 1.38      1.62      | 0.62      0.86
Hitman 3     23       34       38      | 1.48      1.65      | 0.61      0.89

Without much of a surprise, the 7900xt seems to be about 5/6 of a 7900xtx for 9/10 of the price.

From an extremely small number of cherry-picked titles:
The 7900xt seems to be 1.3x a 6950xt in raster, 1.38x in RT titles.
The 7900xtx seems to be 1.55x a 6950xt in raster, 1.61x in RT titles.
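A quick check of that 5/6-for-9/10 claim against the FPS numbers above, assuming the announced $899/$999 MSRPs:

```python
# Average 7900xt / 7900xtx ratio from the AMD-supplied FPS table above.

raster = {"RE": (157, 190), "COD": (117, 139),
          "Cy2077": (60, 72), "WD Legion": (85, 100)}
rt = {"RE": (115, 135), "DL2": (21, 24),
      "Cy2077": (18, 21), "Hitman 3": (34, 38)}

def avg_ratio(pairs: dict) -> float:
    return sum(xt / xtx for xt, xtx in pairs.values()) / len(pairs)

print(f"raster, xt/xtx: {avg_ratio(raster):.2f}")  # ~0.84 (5/6 = 0.83)
print(f"RT, xt/xtx:     {avg_ratio(rt):.2f}")      # ~0.87
print(f"price, xt/xtx:  {899 / 999:.2f}")          # 0.90
```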
 