RTX 3xxx performance speculation

XoR_

[H]ard|Gawd
Joined
Jan 18, 2016
Messages
1,027
Just want HDMI 2.1. Seriously. I'll be happy for 30% performance increase, but I REALLY want HDMI 2.1.
I will be fine with a 20% performance increase in rasterization and 2x in RT, and will replace my 2070 with a 3070 asap.

But without HDMI 2.1: GTFO
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
10,658
I will be fine with a 20% performance increase in rasterization and 2x in RT, and will replace my 2070 with a 3070 asap.
If we are issuing wishes, I'll take a 20% increase in RT and 2x in rasterization. That would be much better at everything.
 

madpistol

Weaksauce
Joined
May 17, 2020
Messages
96
I will be fine with a 20% performance increase in rasterization and 2x in RT, and will replace my 2070 with a 3070 asap.

But without HDMI 2.1: GTFO
If we are issuing wishes, I'll take a 20% increase in RT and 2x in rasterization. That would be much better at everything.
Well, if we're all wishing, I want a 200% increase in RT and rasterization.
I just want a 200% increase in HDMI 2.1!!!
 

n370zed

Limp Gawd
Joined
May 21, 2012
Messages
387
If we’re all wishing I’ll chime in also! I remember spending $709 on an MSI 780 Ti back in 2014, so can all Ti-class cards top out at maybe $749 from here on out!? Pretty please 🙏.
 
Joined
May 20, 2016
Messages
703
If we’re all wishing I’ll chime in also! I remember spending $709 on an MSI 780 Ti back in 2014, so can all Ti-class cards top out at maybe $749 from here on out!? Pretty please 🙏.
Well, the only reason you could buy the Ti at that price was that AMD had launched the R9 290X. So really you should be praying AMD's Navi 2X isn't a dud.
 

vegeta535

Supreme [H]ardness
Joined
Jul 19, 2013
Messages
4,159
Pray and wish all you want. I expect a price increase before a decrease from Nvidia. They could release the 3080 Ti for $1,500 and sell out every card at release and beyond.
 

Astral Abyss

2[H]4U
Joined
Jun 15, 2004
Messages
2,742
Pray and wish all you want. I expect a price increase before a decrease from Nvidia. They could release the 3080 Ti for $1,500 and sell out every card at release and beyond.
Unfortunately, I agree with you. Legitimate competition from AMD is the only hope for pricing sanity.
 

Astral Abyss

2[H]4U
Joined
Jun 15, 2004
Messages
2,742
Only if you ignore the rising transistor prices as nodes get smaller... and that would indeed be a stupid move to ignore real-world facts...
So would you say then, historically, Nvidia prices their top models based on GPU transistor cost?

Also, transistor costs aren't going up; they are hitting the bottom of the curve and becoming a flat cost. Nvidia will still see a cost reduction per transistor going from the 16/12nm node to 7nm.
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
10,658
Also, transistor costs aren't going up; they are hitting the bottom of the curve and becoming a flat cost.
Even transistors becoming a flat cost is a big problem. The big gains in GPU performance over time were largely driven by massive declines in transistor cost.

IOW, getting 50% more transistors for free each new node. Over time you get compounding increases in transistor count for the same money.

But when transistors change to a flat cost, 50% more transistors cost 50% more, so you can't just keep freely throwing more and more transistors at the problem.
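The compounding effect described above can be sketched with a quick back-of-the-envelope calculation (hypothetical normalized numbers, assuming exactly 50% more transistors per node shrink):

```python
# Transistor budget you can afford at a fixed card price over four
# node shrinks, under two regimes (hypothetical normalized numbers).
budget = 1.0  # normalized transistor count at node 0

# Old regime: each node gives 50% more transistors for the same money,
# so the affordable count compounds at 1.5x per shrink.
free_scaling = [budget * 1.5**n for n in range(5)]

# Flat-cost regime: 50% more transistors cost 50% more, so a fixed
# budget buys the same transistor count at every node.
flat_cost = [budget for _ in range(5)]

print(free_scaling)  # [1.0, 1.5, 2.25, 3.375, 5.0625]
print(flat_cost)     # [1.0, 1.0, 1.0, 1.0, 1.0]
```

After four shrinks the old regime buys roughly 5x the transistors for the same money; the flat-cost regime buys none extra, which is the point being made.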
 

Astral Abyss

2[H]4U
Joined
Jun 15, 2004
Messages
2,742
Even transistors becoming a flat cost is a big problem. The big gains in GPU performance over time were largely driven by massive declines in transistor cost.

IOW, getting 50% more transistors for free each new node. Over time you get compounding increases in transistor count for the same money.

But when transistors change to a flat cost, 50% more transistors cost 50% more, so you can't just keep freely throwing more and more transistors at the problem.
I don't dispute anything you've said here. I was disputing the claim that I am somehow confused in thinking that competition keeps pricing in check.
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
5,140
Just because you go to a smaller node does not mean you have to increase the card cost. AMD's RDNA-based 5700 XT on 7nm outperforms the big 14nm Vega and yet is cheaper. Nvidia did not even go to a smaller node, just a refined one, and hiked the price to the stratosphere, mainly due to the huge size of their chips. In other words, other factors come into play in determining price.
 

Nebell

[H]ard|Gawd
Joined
Jul 20, 2015
Messages
1,757
We're issuing wishes here? :D
Ok, a 3080 Ti with a 50-70% speed increase over the 2080 Ti and 3x RT, for $899.
Not impossible, but about a 0.02% chance.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
859
We're issuing wishes here? :D
Ok, a 3080 Ti with a 50-70% speed increase over the 2080 Ti and 3x RT, for $899.
Not impossible, but about a 0.02% chance.
That seems reasonable except for the price. The Ti will surely be north of $1000.
 

Armenius

Fully [H]
Joined
Jan 28, 2014
Messages
22,662
Just because you go to a smaller node does not mean you have to increase the card cost. AMD's RDNA-based 5700 XT on 7nm outperforms the big 14nm Vega and yet is cheaper. Nvidia did not even go to a smaller node, just a refined one, and hiked the price to the stratosphere, mainly due to the huge size of their chips. In other words, other factors come into play in determining price.
Vega 64 launch price = $500
Vega 64 transistor count = 12.5 billion
Price per transistor = $0.000000040

5700 XT launch price = $400
5700 XT transistor count = 10.3 billion
Price per transistor = $0.000000039

If retail price is related to transistor cost, the per transistor price fell only 3% going from 14nm to 7nm for AMD.

1080 Ti launch price = $700
1080 Ti transistor count = 11.8 billion
Price per transistor = $0.0000000593

2080 Ti launch price = $1200
2080 Ti transistor count = 18.6 billion
Price per transistor = $0.0000000645

For comparison, an 8.76% increase between the NVIDIA cards going from 16nm to "12nm" (no increase in transistor density).
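For anyone who wants to check the arithmetic, a quick Python sketch using the launch prices and transistor counts quoted above reproduces the per-transistor figures and the relative changes:

```python
# Launch price per transistor for the four cards quoted above.
cards = {
    "Vega 64": (500, 12.5e9),
    "5700 XT": (400, 10.3e9),
    "1080 Ti": (700, 11.8e9),
    "2080 Ti": (1200, 18.6e9),
}
per_t = {name: price / count for name, (price, count) in cards.items()}
for name, cost in per_t.items():
    print(f"{name}: ${cost:.10f} per transistor")

# Generational changes in retail price per transistor.
amd = per_t["5700 XT"] / per_t["Vega 64"] - 1   # roughly -2.9% (the ~3% fall cited)
nv = per_t["2080 Ti"] / per_t["1080 Ti"] - 1    # roughly +8.8% (the 8.76% cited)
print(f"AMD 14nm -> 7nm: {amd:+.1%}, NVIDIA 16nm -> 12nm: {nv:+.1%}")
```

The caveat raised later in the thread still applies: these are retail prices, not what AMD or Nvidia actually pay TSMC per wafer.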
 

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
10,658
2080 Ti launch price = $1200
2080 Ti transistor count = 18.6 billion
Price per transistor = $0.0000000645

For comparison, an 8.76% increase between the NVIDIA cards going from 16nm to "12nm" (no increase in transistor density).

The 2080 Ti is a special case: it's up near the reticle limit of the process, which is going to clobber yield. Yield decreases with die size, which means increasing cost per transistor with size.

Try the same comparison on the rest of the 1000 vs 2000 series.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
859
Vega 64 launch price = $500
Vega 64 transistor count = 12.5 billion
Price per transistor = $0.000000040

5700 XT launch price = $400
5700 XT transistor count = 10.3 billion
Price per transistor = $0.000000039

If retail price is related to transistor cost, the per transistor price fell only 3% going from 14nm to 7nm for AMD.

1080 Ti launch price = $700
1080 Ti transistor count = 11.8 billion
Price per transistor = $0.0000000593

2080 Ti launch price = $1200
2080 Ti transistor count = 18.6 billion
Price per transistor = $0.0000000645

For comparison, an 8.76% increase between the NVIDIA cards going from 16nm to "12nm" (no increase in transistor density).
You can’t really infer anything about transistor cost from retail prices. You would need to know what AMD and Nvidia are paying TSMC, which of course we don’t.
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
5,140
Vega 64 launch price = $500
Vega 64 transistor count = 12.5 billion
Price per transistor = $0.000000040

5700 XT launch price = $400
5700 XT transistor count = 10.3 billion
Price per transistor = $0.000000039

If retail price is related to transistor cost, the per transistor price fell only 3% going from 14nm to 7nm for AMD.

1080 Ti launch price = $700
1080 Ti transistor count = 11.8 billion
Price per transistor = $0.0000000593

2080 Ti launch price = $1200
2080 Ti transistor count = 18.6 billion
Price per transistor = $0.0000000645

For comparison, an 8.76% increase between the NVIDIA cards going from 16nm to "12nm" (no increase in transistor density).
Plus AMD was using HBM2 with Vega, which added further cost; you can't look at the GPU alone as the total cost. Navi 10 per transistor would be more expensive than Vega, but die-area efficiency looks like it cut costs with the smaller node while increasing performance and decreasing power, even while using cheaper (though higher-power) GDDR6 memory. Also, RDNA 2 will be another +50% in perf/W over RDNA (indicated by AMD), another huge jump incoming. With the lower power consumption of Navi 10, the card itself requires less beefy power circuitry, cutting costs further. Then again, AMD was meeting its goal of a 45% profit margin with Navi (really hard to know other than from overall financials), while with Vega it was probably much less. In the end I just don't think we can be that accurate figuring out the cost per transistor for Vega and Navi 10, other than that the newer card was cheaper to buy, used less power, and performed better. Which was not the case with Nvidia.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
859
OLEDs have real 120 Hz now?

That would make conventional monitors obsolete. No?
I’m hankering for HDMI 2.1 myself to take full advantage of my LG C9. It’s beautiful, but it won’t replace conventional monitors, as TVs are just too big for typical desktop usage.

I do most of my PC gaming in the living room now, though, with the desktop relegated to FPS duty. Can’t beat rocking back on the couch in front of that sweet OLED G-Sync IQ.
 

Riptide_NVN

[H]ard|Gawd
Joined
Mar 1, 2005
Messages
1,829
I'm actually planning on moving my C9 back to the bedroom and retiring it from gaming due to concerns over burn-in. After putting in 300+ hours of division 2, I'm a bit concerned that continuing is going to result in issues with the TV. There are some HUD elements that cannot be repositioned in this game, and it is what I'm playing currently.

For gaming on the big screen, I'm looking at a gen-2 TCL 8-series whenever they hit stores, as long as the reviews are good.
 

oldmanbal

2[H]4U
Joined
Aug 27, 2010
Messages
2,193
I'd like to see 60+ fps at 4K make its debut in the mainstream market, i.e. the x60 class, which prooooobably won't quite happen. It would be nice to upgrade my son's rig from a GTX 970 to something Nvidia, but let's not kid ourselves: AMD is going to be the value choice again for children and wives everywhere.

I'm not sure how much more the Ti segment is going to cost this year, but $1,500 sounds about right. AMD didn't release $4K Threadrippers to start giving us a deal on their 'enthusiast class' GPU either. If big Navi comes in around $1,000 and the Ti is anything more than 20% faster, then $1,500 it will be. The price/performance ratio goes exponential at the high end.
 

Auer

[H]ard|Gawd
Joined
Nov 2, 2018
Messages
1,760
The 3070 should run 4K 60fps in most modern titles. I'd be very surprised if it doesn't.

$500 would be nice; that's what I paid for my 2070, and I considered it a good deal then and still do. Money well spent.
If more DLSS 2.0 implementations show up I might even hang on to it for another year. Most likely I'll cave though and nab a 3070/3080 hybrid....
 

Gideon

2[H]4U
Joined
Apr 13, 2006
Messages
2,668
The 3070 should run 4K 60fps in most modern titles. I'd be very surprised if it doesn't.

$500 would be nice; that's what I paid for my 2070, and I considered it a good deal then and still do. Money well spent.
If more DLSS 2.0 implementations show up I might even hang on to it for another year. Most likely I'll cave though and nab a 3070/3080 hybrid....
I would expect at least the 3080 needed for that kind of performance.
 

MangoSeed

Gawd
Joined
Oct 15, 2014
Messages
859
I'd like to see 60+ fps at 4K make its debut in the mainstream market, i.e. the x60 class, which prooooobably won't quite happen.
Depends on the games and settings. For games that are a few years old, 4K/60 should be doable on medium settings on x60-class cards.
 

SixFootDuo

Supreme [H]ardness
Joined
Oct 5, 2004
Messages
5,640
I prepaid for my 3080 Ti at Microcenter about 2 weeks ago.

Will pick it up as soon as it's released.

How? I bought a new 2080 Ti and decided to get the $140 extended 2-year warranty. So the total investment was around $1,500 with tax, if not a bit more. Under the warranty I can bring it back for fan noise, ticks, artifacting, coil whine, overheating, etc., for a full refund or store credit.

The card already runs hotter than the last card I had; idle is in the low 80s. They told me to return it whenever. I'll return it when the 3080 Ti comes out.

Hopefully sometime in Sept or Oct. Super excited.
 
Joined
May 20, 2016
Messages
703
I would expect at least the 3080 needed for that kind of performance.
And with the launch of a new console generation at the same time, I would expect the 3080 Ti to be more the baseline for 4K PC gaming in newer games. The 3080 looks like it's shaping up to be about 2080 Ti + 10% or so with a minimal RT hit, whereas the 3080 Ti is rumored to be 35-40% faster than the 2080 Ti.

The 3080 Ti is only going to be first-gen 7nm with a mid-sized die, which is where the 5376 CUDA core rumors are coming from. It's the 4080 Ti where we would expect a 7nm variant of a monster die à la the 2080 Ti on 12nm.
 

Conman

Limp Gawd
Joined
Jun 8, 2004
Messages
459
...The 3080 looks like it's shaping up to be about 2080 Ti + 10% or so with a minimal RT hit, whereas the 3080 Ti is rumored to be 35-40% faster than the 2080 Ti...
Looks like it's shaping up from where? 25 pages of this BS thread and the only real information we have is Jensen pulling crap out of his oven. Somebody wake me up when there's something real to talk about.
 

drutman

Limp Gawd
Joined
Jan 4, 2016
Messages
183
So much RT talk here, but I'm just waiting for a locked 120fps at 4K....
Sounds like a winner if it can do 120 fps at 4K at a $400 price point.
Some speculators believe the Ampere 3060 will outperform the 2080 Ti in RT; seeing is believing on that claim.
I refuse to pay any more. Not that I can't; it's just that the law of diminishing returns applies here.
 

amenx

Limp Gawd
Joined
Dec 17, 2005
Messages
347
There is some speculation that there will not be a 3080 Ti (initially at least), and that the 3080 will be the top card, based on a GA102 chip. Sounds logical to me, since Nvidia has always handled the Ti that way since Kepler (release it several months after an xx80). Only Turing bucked the trend.
 