RTX 3xxx performance speculation

Kinda off topic, but this guy was great, and my son was always excited to show me vids of some of the cool projects he would do. It's too bad he died unexpectedly last year.
I would argue that once you start throwing liquid nitrogen in your own face, you lose the right to use the word "unexpectedly" in relation to your own death.
 
Ehh, it's no worse than throwing ice water in your face, as long as the exposure isn't prolonged; it boils off before it can cause damage.
 
I don't know about naming, but yeah, it's pretty much guaranteed this will be $1000+.
Ugh. You're killing me dude! The eternal optimist in me hopes you're wrong (i.e. that the part that is slightly better than a 2080 Ti is not still $1000).
 
Samsung's "8nm" process is really a slightly tweaked 10nm with only minor shrinks, from what I last read about it. Isn't it a lot more than 25% larger than TSMC's N7?
 
8LPP is 32.5% less dense than N7, and 18.15% more dense than 10LPP. Samsung was able to make 8LPP denser by adding another fin to the cell and reducing the MMP (minimum metal pitch), and the CPP (contacted poly pitch) by a smaller amount, compared to 10LPP.
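As a rough sanity check on that 32.5% figure, using ballpark logic-density estimates (assume ~61 MTr/mm² for 8LPP and ~91 MTr/mm² for N7; these are my assumed numbers, not official specs):

\[
\frac{\rho_{\text{8LPP}}}{\rho_{\text{N7}}} \approx \frac{61}{91} \approx 0.67 \quad\Rightarrow\quad \text{8LPP is roughly } 33\% \text{ less dense than N7}
\]

which lines up with the quoted figure.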
 
I think a huge grain of salt is needed on the Samsung 8nm rumors, especially given that Pro Ampere is using TSMC 7nm.
 
Without a doubt over $1000; I'm guessing closer to $1200. It will be a cold day in holy heck before nV pulls back on pricing. Even if Big Navi runs within 10% of the 3080 Ti's performance and is $800, nV will be sticking to their guns because of "features" (y)
 
I feel the same way; I don't think they take AMD seriously as a competitor. I haven't purchased a high-end AMD card in a long time, but if it works out like that, I would have to buy red.
 
I am sure this one will cost more than the 2080 Ti's launch price. Nvidia is no dummy; it will either be the same price or more. And the fact is, the card will be in short supply at launch.
 
https://www.hardwaretimes.com/nvidi...prices-expected-to-increase-in-coming-months/

Prices expected to increase on RTX 20-series GPUs? I guess that's a way to grab money from people who can't wait or just don't know better. I'm glad to see more news that helps confirm my suspicions that the 30-series will be released this fall, just in time to play Cyberpunk 2077.
Supply and demand? Supply goes down, demand stays flat (or falls more slowly than supply), price goes up. If prices go up even further, it'll make the 30-series look well priced again just because people will be used to seeing the highly inflated 20-series prices, lol. If the 2080 Ti goes up to $1500... a 3080 Ti for $1200 would feel like a deal :p.
 
It's not NVidia raising the prices; it's because they have stopped production of everything except the 16xx cards, so there will be fewer cards available.
This won't reflect on base 30xx prices.
 
Pricing is hard to say. Nvidia did say a while back that they were going to price the 3080 Ti lower than the 2080 Ti was at release, and the 2080 Ti was $1200 at release. So they could price the 3080 Ti at $1000, say they accomplished just that, and still make loads of money. $1000 is still expensive in my book, but we're talking Nvidia: no real competition, plus demand, capitalism, and greed thrown in. So yeah, they could charge $1200 again and maybe more. It can go either way, but it's safe to assume $1000-$1200 will be the likely range for the 3080 Ti (or whatever they name it). That's my prediction anyway, like it or not.
 

LOL, Nvidia Fermi 2. It's hard to believe a September launch without more leaks occurring, unless it is Nvidia-exclusive for the first 3 months so they can keep a tight lid on it. AMD has also been better than ever at staying tight-lipped; maybe the cards will be AMD-branded initially before the OEMs take a stab. I do wonder how much AMD and Nvidia know about each other's upcoming products, how accurate their info is, pricing, etc. If Nvidia is stuck on Samsung 8nm for now, that may not be too good; AMD has had great experience with several designs at 7nm now, which is a big advantage. If the report is somewhat true, Nvidia is pushing to get 40% over a 2080 Ti at 4K by going up to 400 W, with properly performing RT. AMD? 40%-65% over a 2080 Ti with unknown RT performance, though it appears to be good on the PS5 so far. Still all rumors at this time. Hmmm, just a couple or so months to go; it will go quick.
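Just to put rough numbers on the Fermi 2 quip: taking the 2080 Ti's board power as roughly 250 W (my assumed baseline) and the rumored +40% at 400 W at face value, perf/W actually goes backwards:

\[
\frac{1.40}{400/250} = \frac{1.40}{1.60} \approx 0.875
\]

i.e. about 12% worse performance per watt than Turing, node change and all. Rumor math only, of course.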
 
I see SillySeason has started...
Yes indeed! I would not take any of these so-called leaks as remotely solid. Hopefully we don't get another drummer boy AMD video. A Ruby one, hmmm, that would be cool.
 
I saw that video; not sure what to make of it, other than that at least 3 different engineering samples are floating around and it's too early to say which one represents the possible top end for Nvidia. I think it's interesting that Big Navi is rumored to be around 40%-50% faster than a 2080 Ti yet is only expected to match a vanilla 3080.

It will be a fun few months for sure. I've been AMD the last few rounds, mainly for FreeSync, but since that is now supported by both vendors, I am open to going with whatever is fastest this time out. Having choices will be good.
 
I can't say I've been particularly impressed with the RT performance shown on the PS5 so far. Gran Turismo is running at 60 fps, but with RT reflections only rendered at 1080p. Ratchet & Clank is running at 30 fps and still has various objects and particles excluded from the RT reflections. Otherwise we've only really seen a few trailers. So based on what I've seen and the information available, I would expect Ampere's ray tracing to be significantly better than RDNA2's.

My read of the current leaks is the 3080 Ti being a minimum of 30% faster than the 2080 Ti at 300 W, though it'll probably be faster than that. Big Navi is expected to be 40-50% faster than a 2080 Ti, also at 300 W. In the end, I think the difference in rasterization performance will be pretty small, but probably with the win going to AMD, at least until nVidia can get 7nm production going.
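Running rough numbers on those leaks, against an assumed ~250 W 2080 Ti baseline (my assumption, and all of this is leak-dependent):

\[
\text{3080 Ti: } \frac{1.30}{300/250} \approx 1.08\times \text{ perf/W}, \qquad \text{Big Navi: } \frac{1.45}{300/250} \approx 1.21\times \text{ perf/W}
\]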
 
One of the more nonsensical rumors is that it will have a new 12-pin connector for power. It looks just like two 6-pin connectors, but it will be incompatible with the current 6-pin connectors, meaning every PSU in existence will require an adapter to run it. It would no doubt be a support nightmare as people try to force two 6-pin connectors into the holes.
 
There's actually quite a bit of evidence behind the rumor, and it makes a lot of sense if you read more about it. The current 8- and 6-pin connectors are fairly limited, and any card above about 200 watts requires multiple power connectors; some overclocked cards even need 3. The new 12-pin connector would feature thicker wiring and more robust connections, and be able to handle as much power as 4x 8-pins, so one cable would be enough for any GPU. This also works around the problem of power supply manufacturers daisy-chaining multiple PCI-E connectors on one cable, which exceeds the design limits and can cause stability problems (this has happened quite a bit with the 2080 Ti).

Currently, it's expected that Founders Edition cards will use the new connector and of course come with an adapter, while nVidia works to get the new connector standardized for future power supplies and GPUs. Current 6-pin connectors would not fit, and the adapter should make it pretty obvious how it's supposed to be connected, so I don't think people trying to jam 6-pins in will be an issue.
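For context, the back-of-envelope power math (the per-pin current rating here is my assumption, based on typical Micro-Fit-style terminals, not a published spec): the PCIe spec allows 75 W from the slot, 75 W per 6-pin, and 150 W per 8-pin, so a 320 W card already needs two 8-pins plus the slot. Six 12 V supply pins at roughly 8.5 A each would give

\[
6 \times 12\,\text{V} \times 8.5\,\text{A} \approx 612\,\text{W}
\]

which is in the same ballpark as four 8-pins (4 x 150 W = 600 W) from a single cable.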
 
I’m going to hazard a guess that you don’t spend a lot of time supporting users.
 
Heh I know what you mean. I guess it would make more sense if they used a completely different connector type with different physical dimensions.
 
But it won't actually do anything beneficial for most people, because it would require adapters for the standard cables that everyone has. In fact, it would just become one more layer of potential issues.

I'm not against a better power connector, but it should be a standard that is announced well ahead of time so everyone can work toward getting ready for it.
 
Why would that matter? People would have to deal with adapters anyway, unless we somehow got all the power supply manufacturers to produce power supplies with the new connector for a good 10 years before any GPU used it. And even then we'd still have people using older power supplies and needing adapters. It's not like we haven't been through this before with the transition from VGA to DVI.
 
It makes NO sense when looking at the TDP of the A100 PCIe...
 
The stupid power connector is looking more likely. It was just confirmed by Steve at Gamers Nexus, and he has pretty good contacts. He said he doesn't expect much impact on the DIY market; this will mostly be in the OEM/SI market. Like, you get an Alienware PC and it has the 12-pin.
 


Anyone thinking GTX 480 from those new reports? Not a bad card in performance, but hot at stock, barn burner when overclocked. I don't know, there's a bunch of info slinging going on now (some real, some BS, some without a doubt planted by nV and AMD)...
 
Personally I don't mind a card that is built to handle 400 W+ if there's a tangible benefit, and it has good performance at reasonable power levels as well.

As an example, the Vega 64 LC could push 400 W+, and the cooler was excellent and kept it cool, except that the extra power didn't buy much of anything other than excess heat. Almost pointless.

So if Nvidia supports extreme OCing, that to me would be another reason to buy, if that 400 W blows everything else out of the water. If it mostly just produces heat -> no.
 
Sure, it will have an adapter to plug two regular PCIe 8-pins onto the new connection, but who knows; or maybe those clever power bridges EVGA came out with. I just want a 3080 Ti to replace my 1080 Ti. I wasn't impressed with the 2080 Ti, so I skipped it. The 3080 Ti seems better, 30 percent better than the 2080 Ti. I was hoping for 16 GB, but it seems it will be 12 GB.
 
Pretty much how I felt about the 2080 Ti. I would think EVGA would have models without the 12-pin so one would not have to use an adapter. If I buy an Ampere card, it will most likely be EVGA.
 
Probably EVGA for me as well this time around, hopefully a hybrid that can fit in a Fractal Define Nano S.
 
I definitely want another integrated loop setup. Silent at stock, unobjectionable when pushed, temperatures always under control.
 