NVIDIA GeForce RTX 3070 and RTX 3070 Ti Rumored Specifications Appear

erek

Wonder how the 3080 Ti will be.

"The regular RTX 3070 is supposed to have 2944 CUDA cores on GA104-400 GPU die, while its bigger brother RTX 3070 Ti is designed with 3072 CUDA cores on GA104-300 die. Paired with new technologies that Ampere architecture brings, with a new GDDR6X memory, the GPUs are set to be very good performers. It is estimated that both of the cards would reach a memory bandwidth of 512 GB/s. So far that is all we have. NVIDIA is reportedly in Design Validation Test (DVT) phase with these cards and is preparing for mass production in August. Following those events is the official launch which should happen before the end of this year, with some speculations indicating that it is in September. "

https://www.techpowerup.com/269511/...and-rtx-3070-ti-rumored-specifications-appear
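
If you want to sanity-check that 512 GB/s figure, GDDR bandwidth is just data rate times bus width. A minimal sketch, assuming a 256-bit bus and 16 Gbps GDDR6-class memory (both are rumor-level assumptions, not confirmed specs):

Code:
# Rough check: bandwidth (GB/s) = data rate (Gbps) * bus width (bits) / 8
# Assumed values: 256-bit bus, 16 Gbps data rate -- neither is a confirmed spec.
def gddr_bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(gddr_bandwidth_gb_s(16, 256))  # 512.0, matching the rumored 512 GB/s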
 
Skimming WCCFTech's bloated article about this, I came across this gem: "the GA104 GPU will feature a more optimized design for GPUs in the sub $500 US segment." First time I've heard "optimized" used as a synonym for "cut down".
 
Optimized for what?

A) Marketing bling bling (y)
B) A disguised mini-me version :eek:
C) Suckers/gotta-have-latest/greatest folks who either don't care or won't bother reading the fine print/specs/details :D

Sub-$500 ain't gonna get you much of a GPU these days, as we have clearly seen with the last several generations of these cards, with the more recent ones (2080 Ti, FE, etc.) launching at ~$1300-1500...
 
If $500 can deliver solid 1440p with RTX and all the blingy goodness at solid frame rates, then it is a big win for everybody. The existing 2000 series cards are not cheap to produce, and most of the card goes unused by the masses, so "optimizing" the cards to bring costs down and performance up is going to help competition. I am looking forward to the AMD/Nvidia slugfest for those precious holiday sales numbers.

Both teams would be in error if they ignored the current global economy: luxury items are a tough sell at the moment, and they are going to have to do whatever they can to keep costs down. The engineer in me is a little excited to see the creative solutions they come up with to shave costs while keeping performance and reliability where they need to be.
 
Without competition, video card performance will not get cheaper. Faster cards will come out, and more efficient cards will launch, but price-per-performance will stand still.

1440p 144FPS costs the same today as it did 4 years ago.
 
Without competition, video card performance will not get cheaper. Faster cards will come out, and more efficient cards will launch, but price-per-performance will stand still.
Nvidia has been competing mostly with themselves since AMD bought ATi; yet performance has continued to get cheaper.

I'd put the blame for the recent stagnation in price/performance growth more on a confluence of limitations, largely the cadence of foundry availability and the need to introduce RT hardware.

Well, RT has been introduced, and fab capacity for the latest processes has grown at more than one foundry, so unless Nvidia offers something really attractive, they're not going to get the unit sales. The people who were willing to pay up for Turing have paid up.
 
Nvidia has been competing mostly with themselves since AMD bought ATi; yet performance has continued to get cheaper.

I'd put the blame for the recent stagnation in price/performance growth more on a confluence of limitations, largely the cadence of foundry availability and the need to introduce RT hardware.

Well, RT has been introduced, and fab capacity for the latest processes has grown at more than one foundry, so unless Nvidia offers something really attractive, they're not going to get the unit sales. The people who were willing to pay up for Turing have paid up.

We saw Nvidia and AMD battle for the top end multiple times while AMD owned the Radeon brand: 5870 vs 480, 6970 vs 580, 7970 vs 680, 7970 GHz vs 780, 290X vs 780 Ti, 390X vs 980. But around the time of the 980, AMD dropped out of the high end, and a card that competed against the top end was never seen from them again. That's when prices started to creep up.

This recent stagnation of price-performance is entirely due to Nvidia having nobody to compete against.
 
AMD released dual-GPU cards after their R9 290 debacle in order to 'compete' at the top end. Nvidia did this too and was faster still.

Thing is, we've seen prices creep up due to a number of factors; the last one was the mining craze. In a situation like that Nvidia was smart to raise MSRPs. It meant that there was actually some stock left to buy...

This last generation, the need to introduce new features and the costs associated on an outgoing node meant that MSRP had to go up.

But those are the same reasons that price per performance could swing in the other direction, like it did with the GTX 600 series, or even the 9000 series, which was really just a straight shrink of the 8000 series.
 
Optimized for what?

A) Marketing bling bling (y)
B) A disguised mini-me version :eek:
C) Suckers/gotta-have-latest/greatest folks who either don't care or won't bother reading the fine print/specs/details :D

Sub-$500 ain't gonna get you much of a GPU these days, as we have clearly seen with the last several generations of these cards, with the more recent ones (2080 Ti, FE, etc.) launching at ~$1300-1500...

Sub-$500 actually lets you game at 1440P with good detail. Sub-$500 doesn’t give you e-peen, but it can certainly give you a great gaming experience.
 
We saw Nvidia and AMD battle for the top end multiple times while AMD owned the Radeon brand: 5870 vs 480, 6970 vs 580, 7970 vs 680, 7970 GHz vs 780, 290X vs 780 Ti, 390X vs 980. But around the time of the 980, AMD dropped out of the high end, and a card that competed against the top end was never seen from them again. That's when prices started to creep up.

This recent stagnation of price-performance is entirely due to Nvidia having nobody to compete against.
At the top end, perhaps, but AMD has continued to provide a number of solutions in the low and mid ranges, and both sides have consistently kept in step with pricing, so this does not track at all. The creep in price is directly linked to the increase in costs from TSMC for using their processes and to the increased board complexity needed to feed the more complex chips. If there were any evidence of gouging, it would be extremely clear in the reported financials, and there is no sign of it there; unless they are lying on their financial reports to investors, which is a crime, in which case I would suggest people start lawyering up, because there is big money to be had.
 
Both teams would be in error if they ignored the current global economy: luxury items are a tough sell at the moment, and they are going to have to do whatever they can to keep costs down. The engineer in me is a little excited to see the creative solutions they come up with to shave costs while keeping performance and reliability where they need to be.

As much as we all want sweet hardware deals, if both teams are paying attention to the current economic climate, don't expect super low prices; just ask your local computer shop how fast their inventory is moving. Government stimulus cheques are doing their job, I'll just say that. So, no, I don't think they need to do everything they can to shave costs, just like I don't think Nvidia *had* to lower the price of the 2080 Ti, for the simple reason that, despite how much everyone was complaining, the cards sold well.
 
I love how they just make up a technical term like GDDR6X just to describe GDDR6 with higher clock speeds :rolleyes:

GDDR6X does not exist, period. I mean, even GDDR5X validation articles were available online for months before the GTX 1080 launched, so this shit doesn't just happen!

https://www.extremetech.com/computi...-high-speeds-on-gddr5x-but-will-anyone-use-it


Also, the idea that they're releasing a 3070 Ti on day one just tells you how much Adderall these guys are popping!
 
Both teams would be in error if they ignored the current global economy: luxury items are a tough sell at the moment, and they are going to have to do whatever they can to keep costs down.
You're assuming that's how things should be, but the reality is the opposite.
 
As much as we all want sweet hardware deals, if both teams are paying attention to the current economic climate, don't expect super low prices; just ask your local computer shop how fast their inventory is moving. Government stimulus cheques are doing their job, I'll just say that. So, no, I don't think they need to do everything they can to shave costs, just like I don't think Nvidia *had* to lower the price of the 2080 Ti, for the simple reason that, despite how much everyone was complaining, the cards sold well.
Yeah, I am not expecting a lot of price changes per se. I think the price segments are going to remain roughly where they are, give or take $100, but I am thinking we will see a large shift in what we get at those price points. Even now, if you want 1080p with full RTX goodness, you are looking at a minimum of a 2060 Super, which is what, $450? I am thinking we are going to see that level of performance in the sub-$300 range, bringing it to the "mainstream" segment. The $500-800 range is where you are going to see the 1440p options, and $800+ is for bragging rights to 4K glory.
 
Yeah, I am not expecting a lot of price changes per se. I think the price segments are going to remain roughly where they are, give or take $100, but I am thinking we will see a large shift in what we get at those price points. Even now, if you want 1080p with full RTX goodness, you are looking at a minimum of a 2060 Super, which is what, $450? I am thinking we are going to see that level of performance in the sub-$300 range, bringing it to the "mainstream" segment. The $500-800 range is where you are going to see the 1440p options, and $800+ is for bragging rights to 4K glory.

I certainly hope so, because I'm sick of having to wait for a worthwhile upgrade and I'm not paying $1000+ for the privilege.
 
It would be interesting if they did both a 70 and a 70 Ti at release. That would just mean they're trying a clever tactic to push the price up for x70-level performance. And again, it shows that AMD just isn't competing much.
 
So you guys have got to remember that with the new consoles coming out with the specs they have, if Nvidia wants to sell ANY cards coming up, $1300 for just a GPU ain't gonna cut it...
 
Those core counts and memory clocks look plausible enough (albeit obvious), but a 3070 Ti at or close to launch doesn't make sense to me, especially with such a small performance difference. The 1070 Ti only happened as a response to Vega.
 
I love how they just make up a technical term like GDDR6X just to describe GDDR6 with higher clock speeds :rolleyes:

GDDR6X does not exist, period. I mean, even GDDR5X validation articles were available online for months before the GTX 1080 launched, so this shit doesn't just happen!

https://www.extremetech.com/computi...-high-speeds-on-gddr5x-but-will-anyone-use-it


Also, the idea that they're releasing a 3070 Ti on day one just tells you how much Adderall these guys are popping!

Are we sure they didn't double the prefetch for the RAM, even if there isn't an official spec for it yet? Normally the X means they double the read/write width per clock (prefetch) from X bytes to 2X bytes. Otherwise, they may have just stepped on JEDEC's naming convention, because if JEDEC later uses the term GDDR6X and Nvidia had it in their marketing first with different specs... I don't even want to think about it. Going from GDDR5, which had an 8n prefetch for 32 bytes per memory access, to GDDR5X, they increased it to 16n for 64 bytes per memory access. My guess is the same as yours: it's just a marketing term because they're using slightly faster memory than before, which is really annoying. They maybe shouldn't have used a term that had a previous meaning different from what they're using it to mean.
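
To put numbers on the prefetch math above, bytes per access is just prefetch depth times the per-chip interface width. A quick sketch using the figures in this post; the GDDR6X line is pure guesswork, since no public spec exists:

Code:
# Bytes per access = prefetch depth (n) * per-chip interface width (bits) / 8
# GDDR5 = 8n, GDDR5X = 16n, both on a 32-bit chip interface (figures from the post above).
def bytes_per_access(prefetch_n, interface_bits=32):
    return prefetch_n * interface_bits // 8

print("GDDR5: ", bytes_per_access(8))   # 32 bytes
print("GDDR5X:", bytes_per_access(16))  # 64 bytes
print("GDDR6X?", bytes_per_access(32))  # 128 bytes -- hypothetical, if the X meant another doubling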
 
Price is very important to me. Yeah, I know: not what I should post at [H]. I used to like being one tier below cutting edge. Now? Well, I cannot see spending $1,000 on a video card. Guess I'm going...
 
Price is very important to me. Yeah, I know: not what I should post at [H]. I used to like being one tier below cutting edge. Now? Well, I cannot see spending $1,000 on a video card. Guess I'm going...

Hell, it used to be, during the 1080 Ti days, that you could spend $600-700 and get the absolute top-of-the-line chip (a little more if you wanted a pre-affixed custom water block). $1000+ GPUs were wacky "Titan"-level edge cases.
 
Sub-$500 ain't gonna get you much of a GPU these days, as we have clearly seen with the last several generations of these cards, with the more recent ones (2080 Ti, FE, etc.) launching at ~$1300-1500...

If the specs on these cards are even remotely true, the 3070 essentially replaces the 2080 Super at $300 less with (hopefully) better RTX performance. You're looking at 2080 Supers in the used market for $350, which is pretty good bang for the buck at 1440p.

As much as we all want sweet hardware deals, if both teams are paying attention to the current economic climate, don't expect super low prices; just ask your local computer shop how fast their inventory is moving. Government stimulus cheques are doing their job, I'll just say that. So, no, I don't think they need to do everything they can to shave costs, just like I don't think Nvidia *had* to lower the price of the 2080 Ti, for the simple reason that, despite how much everyone was complaining, the cards sold well.

Did they, though? A dedicated hardware forum is hardly the place to gauge that. Another interesting indicator is how many "Alienware"-type systems don't include the 2080 Ti, instead opting for the 2080/Super. In the non-DIY space, there's a limit in people's minds to how much they will pay for a gaming computer, and the 2080 Ti's pricing put it out of contention for the vast majority of consumers.

Completely hypothetical numbers forthcoming, warning! Do you make more money with (a) 1,000 units sold at $1,200, or (b) 3,000 units sold at $900? I think Nvidia opted for (a) with the 2080 Ti when (b) "could" have made them more money.
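
Playing along with those hypothetical numbers, the revenue comparison is straightforward:

Code:
# Purely hypothetical figures from the post above -- not actual sales data.
rev_a = 1_000 * 1_200   # (a) 1,000 units at $1,200
rev_b = 3_000 * 900     # (b) 3,000 units at $900
print(f"(a) ${rev_a:,}  (b) ${rev_b:,}")  # (a) $1,200,000  (b) $2,700,000
# Whether (b) actually nets more depends on per-unit cost and on whether the lower
# price really would have tripled unit volume.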
 
Are we sure they didn't double the prefetch for the RAM, even if there isn't an official spec for it yet? Normally the X means they double the read/write width per clock (prefetch) from X bytes to 2X bytes. Otherwise, they may have just stepped on JEDEC's naming convention, because if JEDEC later uses the term GDDR6X and Nvidia had it in their marketing first with different specs... I don't even want to think about it. Going from GDDR5, which had an 8n prefetch for 32 bytes per memory access, to GDDR5X, they increased it to 16n for 64 bytes per memory access. My guess is the same as yours: it's just a marketing term because they're using slightly faster memory than before, which is really annoying. They maybe shouldn't have used a term that had a previous meaning different from what they're using it to mean.


They could do that again (just like they did with GDDR5X), but once again, you don't just make a new memory chip design under a rock.

https://phys.org/news/2007-12-samsung-fastest-gddr5-memory-gbs.html


Even early GDDR5 samples were announced six months before the 4870 started using it. There is no such thing as a phantom memory spec launch (even for something corner-case, like GDDR5X was).

This design stinks of the finest grade-A horseshit. They either use 18-20 Gbps GDDR6, or they switch to HBM2e; there is no other memory option in development.

Hey, check out this rumor of 20Gbps GDDR6, from two years ago!


https://www.tweaktown.com/news/62109/micron-teases-gddr6-20gbps-blowing-hbm2-water/index.html

If they could already push the interface to 20 in the year of launch, guess how much more likely that is to get made than imaginary GDDR6X?
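
For scale, here's what plain GDDR6 at those speeds would deliver; the bus widths are just common configurations, not leaked Ampere specs:

Code:
# Bandwidth (GB/s) = data rate (Gbps) * bus width (bits) / 8
# Bus widths are assumptions for illustration, not confirmed specs.
for gbps in (14, 16, 18, 20):
    for bus_bits in (256, 320):
        print(f"{gbps} Gbps x {bus_bits}-bit = {gbps * bus_bits / 8:.0f} GB/s")
# 16 Gbps on a 256-bit bus already hits the rumored 512 GB/s; 18-20 Gbps clears it easily.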
 
They could do that again (just like they did with GDDR5X), but once again, you don't just make a new memory chip design under a rock.

https://phys.org/news/2007-12-samsung-fastest-gddr5-memory-gbs.html


Even early GDDR5 samples were announced six months before the 4870 started using it. There is no such thing as a phantom memory spec launch (even for something corner-case, like GDDR5X was).

This design stinks of the finest grade-A horseshit. They either use 18-20 Gbps GDDR6, or they switch to HBM2e; there is no other memory option in development.

Hey, check out this rumor of 20Gbps GDDR6, from two years ago!


https://www.tweaktown.com/news/62109/micron-teases-gddr6-20gbps-blowing-hbm2-water/index.html

If they could already push the interface to 20 in the year of launch, guess how much more likely that is to get made than imaginary GDDR6X?
I agree; that's why I said my guess is the same as yours: they are just using it as a marketing gimmick. It seems stupid to use X, since that actually had meaning in the prior generation. But that's marketing, trying to make something sound better whether it's real or fake.
 
I won't be buying one of these out of the gate, but I am genuinely interested (as we all are) in how much GPU we are going to get for the price. They burned customer goodwill by charging $400 more for a Ti part that delivered no tangible new benefits in real-world applications (sorry, 20-year-old games don't count), so really it was $400 more for the same 25-32% increase over the 1080 Ti, ish...

But honestly, I'm less concerned about the hardware at this point than I am about ray tracing in general. The implementations of this technology are not nearly mature enough to be fed to gamers without criticism. BF5 gave us what, reflections, including ones that were wholly inappropriate (I harp on it, but in a game like Battlefield V, in the middle of a warzone, no MARBLE IS POLISHED AND REFLECTIVE AFTER THE FIRST BULLETS PUT DUST AND GUNPOWDER IN THE AIR) :D Puddles reflect perfectly, but they don't deform or splash? This, to me, is the wrong direction.

Now, if we were talking real-time shadows where the world was deformable and light would bounce differently, or a game where taking out lights would give one side an advantage (or perhaps having to avoid taking out lights so you can't spam grenades, or whatever), that would be something. But let's be honest, no multiplayer game is that way anymore; every FPS you've ever played is just laser tag with WW2- or modern-shaped weapons on your HUD. Same game, different cosplay.

You look at a game like The Division 2 and go "why would I need real-time lighting or shadowing in this game?" and I'd agree: the baked-in lighting is more than good enough. Modern Warfare, I think, even showed there was almost no tangible difference between the baked-in lighting and dynamic lighting. That was painful to watch, because like everyone else I really want RT to blow my socks off and make me go "yes, I WANT to spend a grand to have those visuals in my game!" But, ah... so far... well.
 
I won't be buying one of these out of the gate, but I am genuinely interested (as we all are) in how much GPU we are going to get for the price. They burned customer goodwill by charging $400 more for a Ti part that delivered no tangible new benefits in real-world applications (sorry, 20-year-old games don't count), so really it was $400 more for the same 25-32% increase over the 1080 Ti, ish...

First off, Pascal was a performance ANOMALY.
Basing anything on that launch is either ignorant or dishonest.
It is like people forget how much of an anomaly Pascal was and decided that "This is the NEW norm, data be damned!!!"

Secondly this does not look like they burned anything:

Code:
Steam Survey June 2020:

NVIDIA GeForce GTX 1650                  3.10% +0.41%
NVIDIA GeForce GTX 1660                  1.70% +0.14%
NVIDIA GeForce GTX 1660 SUPER            0.97% +0.24%
NVIDIA GeForce GTX 1660 Ti               2.81% +0.22%
NVIDIA GeForce RTX 2060                  2.63% +0.16%
NVIDIA GeForce RTX 2060 SUPER            1.00% +0.16%
NVIDIA GeForce RTX 2070                  1.92% +0.05%
NVIDIA GeForce RTX 2070 SUPER            1.66% +0.20%
NVIDIA GeForce RTX 2080                  1.02% -0.01%
NVIDIA GeForce RTX 2080 SUPER            0.66% +0.06%
NVIDIA GeForce RTX 2080 Ti               0.92% +0.06%

Total                                   18.39% +1.69%

Despite Pascal being a performance anomaly, Turing now sits close to 20% of the market... the data does not support your "conclusion", FYI.
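
The quoted totals do check out, for what it's worth:

Code:
# Steam Hardware Survey (June 2020) Turing/GTX 16-series shares quoted above.
shares = [3.10, 1.70, 0.97, 2.81, 2.63, 1.00, 1.92, 1.66, 1.02, 0.66, 0.92]
deltas = [0.41, 0.14, 0.24, 0.22, 0.16, 0.16, 0.05, 0.20, -0.01, 0.06, 0.06]
print(f"total share:  {sum(shares):.2f}%")   # 18.39%
print(f"month change: {sum(deltas):+.2f}%")  # +1.69%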

Thirdly, cost per transistor is going UP now; gone are the days of Moore's Law... get used to it.

3 strikes and you are out, right?
 
First off, Pascal was a performance ANOMALY.
Basing anything on that launch is either ignorant or dishonest.
It is like people forget how much of an anomaly Pascal was and decided that "This is the NEW norm, data be damned!!!"

Secondly this does not look like they burned anything:

Code:
Steam Survey June 2020:

NVIDIA GeForce GTX 1650                  3.10% +0.41%
NVIDIA GeForce GTX 1660                  1.70% +0.14%
NVIDIA GeForce GTX 1660 SUPER            0.97% +0.24%
NVIDIA GeForce GTX 1660 Ti               2.81% +0.22%
NVIDIA GeForce RTX 2060                  2.63% +0.16%
NVIDIA GeForce RTX 2060 SUPER            1.00% +0.16%
NVIDIA GeForce RTX 2070                  1.92% +0.05%
NVIDIA GeForce RTX 2070 SUPER            1.66% +0.20%
NVIDIA GeForce RTX 2080                  1.02% -0.01%
NVIDIA GeForce RTX 2080 SUPER            0.66% +0.06%
NVIDIA GeForce RTX 2080 Ti               0.92% +0.06%

Total                                   18.39% +1.69%

Despite Pascal being a performance anomaly, Turing now sits close to 20% of the market... the data does not support your "conclusion", FYI.
If it was a performance anomaly, then Turing was a pricing anomaly. The replacement for the 1080 Ti was 70%+ more expensive; I don't remember the last time that happened. Turing will probably go down as a "meh" generation, since other than the 2080 Ti there really wasn't anything impressive. RT performance is shit, and DLSS was half-baked until recently.
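
For reference on that 70%+ figure, a quick check using launch prices from memory ($699 for the GTX 1080 Ti, $1,199 for the RTX 2080 Ti Founders Edition; treat these as approximate):

Code:
# Launch prices from memory -- GTX 1080 Ti: $699, RTX 2080 Ti Founders Edition: $1,199.
old_price, new_price = 699, 1199
print(f"increase: {(new_price - old_price) / old_price:.0%}")  # ~72%, consistent with "70%+"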
 
If it was a performance anomaly, then Turing was a pricing anomaly. The replacement for the 1080 Ti was 70%+ more expensive; I don't remember the last time that happened. Turing will probably go down as a "meh" generation, since other than the 2080 Ti there really wasn't anything impressive. RT performance is shit, and DLSS was half-baked until recently.

Oh, you muppets...
[Attached chart: Moore.jpg]


Then add this:
8800 GTX -> 9800 GTX (G80 -> G92) = -2% (die shrink)
9800 GTX -> GTX 285 (G92 -> Tesla) = +37% (die shrink)
GTX 285 -> GTX 480 (Tesla -> Fermi) = +39% (new arch)
GTX 480 -> GTX 580 (Fermi -> Fermi 2) = +11% (same-node refresh)
GTX 580 -> GTX 680 (Fermi 2 -> Kepler) = +19% (new arch)
GTX 680 -> GTX 780 Ti (Kepler -> Kepler 2) = +39% (optimization)
GTX 780 Ti -> GTX 980 Ti (Kepler -> Maxwell) = +43% (new arch)
GTX 980 Ti -> GTX 1080 Ti (Maxwell -> Pascal) = +85% (new arch)
GTX 1080 Ti -> RTX 2080 Ti (Pascal -> Turing) = +39% (new arch)

This is why we say Pascal was an outlier and you shouldn't set your expectations based on it.
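
Running quick stats on that list backs it up (a sketch using the percentages quoted above):

Code:
from statistics import mean, median

# Generation-over-generation gains (%) from the list above.
gains = [-2, 37, 39, 11, 19, 39, 43, 85, 39]
print(f"mean: {mean(gains):.0f}%  median: {median(gains):.0f}%")  # mean ~34%, median 39%
# Pascal's +85% is more than double the median jump -- hence "outlier".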

Is reality really that hard for you?
 
You’re just going to casually ignore the fact that prices went up massively with Turing while performance increases weren’t anything special?

Turing focused on RT even though there were zero games with it for months and it still performed like complete garbage when RT was turned on. Nvidia shoved a half assed feature down everyone’s throats, maybe it’ll actually be useful with the next generation.

It would be like putting carbon-ceramic brakes on a minivan: sure, it increases the cost, but it adds nothing useful.
 
You’re just going to casually ignore the fact that prices went up massively with Turing while performance increases weren’t anything special?

Turing focused on RT even though there were zero games with it for months and it still performed like complete garbage when RT was turned on. Nvidia shoved a half assed feature down everyone’s throats, maybe it’ll actually be useful with the next generation.

It would be like putting carbon-ceramic brakes on a minivan: sure, it increases the cost, but it adds nothing useful.

Turing's performance increase was normal.
But manufacturing costs are rising.
Are you being thick on purpose? You have all the data you need.
RT + Tensor cores are ~10% of the die, and RT is not an Nvidia thing; AMD and Intel are going DXR too.
Entitlement is a bad foundation...
 
You’re just going to casually ignore the fact that prices went up massively with Turing while performance increases weren’t anything special?

Turing focused on RT even though there were zero games with it for months and it still performed like complete garbage when RT was turned on. Nvidia shoved a half assed feature down everyone’s throats, maybe it’ll actually be useful with the next generation.

It would be like putting carbon-ceramic brakes on a minivan: sure, it increases the cost, but it adds nothing useful.

They built a one-size-fits-all chip with Turing, so they were forced to deal with a giant die, which likely drove up costs, and they passed that cost on to consumers. As for RT/DLSS, well, it had to start somewhere, so it started with Turing. I agree, though, that at $1200 the 2080 Ti was/is a bad buy. We'll see how the 3080 Ti is priced, especially now with miners seemingly getting back into the scene.
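
On the "giant chip" point, the public die sizes (figures from memory, so treat as approximate) make it concrete:

Code:
# Approximate die areas from memory: GP102 (1080 Ti) ~471 mm^2, TU102 (2080 Ti) ~754 mm^2.
gp102_mm2, tu102_mm2 = 471, 754
print(f"TU102 is ~{(tu102_mm2 / gp102_mm2 - 1):.0%} larger than GP102")  # ~60% larger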
 
They built a one-size-fits-all chip with Turing, so they were forced to deal with a giant die, which likely drove up costs, and they passed that cost on to consumers. As for RT/DLSS, well, it had to start somewhere, so it started with Turing. I agree, though, that at $1200 the 2080 Ti was/is a bad buy. We'll see how the 3080 Ti is priced, especially now with miners seemingly getting back into the scene.

How was Turing a bad buy when you compare it to the launches from Fermi through Turing?
The performance delta data does not agree with your stance.
The die data does not agree with your stance.
The transistor count/price does not agree with your stance.

If people think that Ampere will be like Pascal, they are setting themselves up for a major disappointment...
 
How was Turing a bad buy when you compare it to the launches from Fermi through Turing?
The performance delta data does not agree with your stance.
The die data does not agree with your stance.
The transistor count/price does not agree with your stance.

If people think that Ampere will be like Pascal, they are setting themselves up for a major disappointment...

The price jump for the 2080 and up was too high for the performance increase. DLSS 1.0 was useless, so the tensor cores were, and still largely are, useless, as have been the RT cores. But they all added to die size and cost.
 
Legit ray-tracing GPUs are still another generation or two away... Ampere/Big Navi will be better than Turing, but it's still not there yet... games and GPUs aren't going to form that perfect symmetry for another year at least.
 
Oh, you muppets...
[Attached chart: Moore.jpg]

Then add this:


Is reality really that hard for you?
Really again? LOL
Everyone except you, it appears, knows that Turing is not 7nm. Go figure, once again. The 12nm process is basically a massaged 16nm process. So showing that chart yet again just shows that Nvidia bent willing participants over for a good one: a 70% price increase for a 20-35% increase in performance (depending on game and resolution), plus RTX in a few games that most will not use, basically on the same node. If you showed a chart for die size and cost, then you might have something. The chart was meant to show that as node size goes down, bigger dies become very problematic. Nvidia did not go to a smaller node with Turing. Now they will with Ampere, so are we supposed to be happy and bend over even more when they actually move to a smaller node?

AMD is so far ahead with process tech and making good designs. I hope Nvidia shows their stuff with Ampere for gamers without us having to lose a kidney over it.
 