NVIDIA GeForce RTX 3070 and RTX 3070 Ti Rumored Specifications Appear

The price jump for the 2080 and up was too high for the performance increase. DLSS 1.0 was useless, so the tensor cores were, and still largely are, useless, as have been the RT cores. But they all added to the die size and cost.

You act like performance/price is a locked metric.
That is a false assumption.
Node/transistors determine price.
You can try to ignore the rising costs as the node gets smaller...but that is a departure from reality.

Again, entitlement in a luxury hobby is a bad foundation.
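
For a rough sense of what "rising costs the smaller the node gets" means in practice, here is a back-of-the-envelope sketch. The wafer prices and defect densities below are made-up placeholders (nothing from this thread), and the dies-per-wafer and Poisson yield formulas are just standard textbook approximations; the only point is that a big die on an expensive node gets disproportionately costly per good chip.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough gross dies-per-wafer estimate with an edge-loss correction term."""
    r = wafer_diameter_mm / 2
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-(die_area_mm2 / 100) * defects_per_cm2)

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defects_per_cm2):
    good = dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2, defects_per_cm2)
    return wafer_cost_usd / good

# Hypothetical numbers purely for illustration: a mature, cheaper node vs. a
# leading-edge node with a pricier wafer, same defect density for simplicity.
for label, wafer_cost in [("mature 12/16nm-class wafer ($4k)", 4000),
                          ("leading-edge 7nm-class wafer ($9k)", 9000)]:
    for area in (300, 500, 750):  # mid-size, large, near-reticle-limit dies
        cost = cost_per_good_die(area, wafer_cost, defects_per_cm2=0.1)
        print(f"{label}: {area} mm^2 die -> ~${cost:,.0f} per good die")
```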
 
Really again? LOL
Everyone except you, it appears, knows that Turing is not 7nm. Go figure once again. The 12nm it used is basically a massaged 16nm process. So showing that chart, yet again, shows clearly that Nvidia just bent willing participants over for a good one: a 70% price increase for a 20%-35% performance increase depending on the game and resolution, plus RTX in the few games that have it, which most people will not use, basically on the same node. If you showed a chart of die size versus cost, then you might have something. The chart was meant to show that when node size goes down, bigger dies become very problematic. Nvidia did not go to a smaller node with Turing. Now they will with Ampere. So are we supposed to be really happy and bend over even more when they actually move to a smaller node?

AMD is so far ahead with process tech and making good designs. I hope Nvidia shows their stuff on Ampere for gamers, and that we don't have to lose a kidney over it.

That was an AMD slide (Hint: AMD is telling you that they also will be upping prices). You whine about the price of dies that lean on the very edge of the reticle limit (physics is king)...that millennial entitlement in a luxury hobby is a bad foundation.
 
That was an AMD slide (Hint: AMD is telling you that they also will be upping prices). You whine about the price of dies that lean on the very edge of the reticle limit (physics is king)...that millennial entitlement in a luxury hobby is a bad foundation.
I am sure you're right about AMD and pricing; investors are restless with AMD's profit margins while Ngreedia makes about 20% more. That could give AMD a price advantage if they take it. The chart has nothing to do with Nvidia's Turing pricing.
 
You act like performance/price is a locked metric.
That is a false assumption.
Node/transistors determine price.
You can try to ignore the rising costs as the node gets smaller...but that is a departure from reality.

Again, entitlement in a luxury hobby is a bad foundation.

It's not like Nvidia was on cutting-edge 7nm that drove their costs up. It's because they built a jack-of-all-trades large GPU and saddled customers with it instead of a pure performance successor to Pascal.
 
Price is very important to me. Yeah, I know: not what I should post at [H]. I used to like being one tier below cutting edge. Now? Well, I cannot see spending $1,000 on a video card. Guess I'm going .

Nah.... [H], in my view, has always been about getting the most performance for your money, not "I don't care what my Ferrari costs". No one should need to spend $1000 on a card to have top-tier performance. I hate nVidia for dirtying what's been a fun hobby for the past 25+ years by ram-rodding "luxury" pricing on us.

I've decided price is going to be a matter of principle for me in my future card purchases (or abstentions from purchasing). If nV chooses to push prices up further with Ampere (thereby lowering the historical price/performance/time curve), I'm out. The Turing price hike was too much to begin with.
 
Nah.... [H], in my view, has always been about getting the most performance for your money, not "I don't care what my Ferrari costs". No one should need to spend $1000 on a card to have top-tier performance. I hate nVidia for dirtying what's been a fun hobby for the past 25+ years by ram-rodding "luxury" pricing on us.

I've decided price is going to be a matter of principle for me in my future card purchases (or abstentions from purchasing). If nV chooses to push prices up further with Ampere (thereby lowering the historical price/performance/time curve), I'm out. The Turing price hike was too much to begin with.

You're not entitled to top-tier performance though, especially since new tiers have been added over the years. There are decent cards at every price level. Just pick one that fits your budget.

The 2080 Ti has terrible price/performance, but the cheaper cards aren't bad compared to Pascal. And that's without even counting RT or DLSS.
 
You're not entitled to top-tier performance though, especially since new tiers have been added over the years. There are decent cards at every price level. Just pick one that fits your budget.

Right, and in that "dis-entitlement" lies the issue. The paradigm that nVidia has formulaically introduced with new tiers (e.g., Ti, Titan, FE) over the past 5 years exists with the main purpose of raising prices. The result of this tiering (model segmentation) is that a near-full-fat chip now costs $1200+. Five years ago, a card with a near-full-fat chip was $650.
 
Right, and in that "dis-entitlement" lies the issue. The paradigm that nVidia has formulaically introduced with new tiers (e.g., Ti, Titan, FE) over the past 5 years exists with the main purpose of raising prices. The result of this tiering (model segmentation) is that a near-full-fat chip now costs $1200+. Five years ago, a card with a near-full-fat chip was $650.

$650 today buys you a much faster card than $650 did 5 years ago. Aside from ego, there's no reason to care about the fatness of the chip in your card. The only thing that matters is what you're getting for the dollars you spend. People have this misguided notion that they're entitled to the "best" for some reason. Just stick to your budget.
 
$650 today buys you a much faster card than $650 did 5 years ago. Aside from ego, there's no reason to care about the fatness of the chip in your card. The only thing that matters is what you're getting for the dollars you spend. People have this misguided notion that they're entitled to the "best" for some reason. Just stick to your budget.

In absolute terms, I'd certainly hope $650 would get you a faster card today than five years ago. In relative terms vis-à-vis prior gens, however, it's pretty bleak.

No one here feels entitled to having the best card. I do, however, feel entitled to make informed decisions on purchases in this field, due to a) historical price/pricing knowledge and b) knowledge of nVidia's aggressive multi-tiered strategy to hike card prices (capitalizing on prior, ephemeral mining demand, the lack of competition at the high end, etc.). And when I'm getting 30% less gen-to-gen performance gain than prior gens for 40% more cost, that's where I draw the line.

If I were shopping for a mid-range card, it would be a somewhat similar sad story, albeit muted because there is actually some competition in that segment from AMD.
 
In absolute terms, I'd certainly hope $650 would get you a faster card today than five years ago. In relative terms vis-à-vis prior gens, however, it's pretty bleak.

No one here feels entitled to having the best card. I do, however, feel entitled to make informed decisions on purchases in this field, due to a) historical price/pricing knowledge and b) knowledge of nVidia's aggressive multi-tiered strategy to hike card prices (capitalizing on prior, ephemeral mining demand, the lack of competition at the high end, etc.). And when I'm getting 30% less gen-to-gen performance gain than prior gens for 40% more cost, that's where I draw the line.

If I were shopping for a mid-range card, it would be a somewhat similar sad story, albeit muted because there is actually some competition in that segment from AMD.

Then why do you ignore the rising cost per transistor?
That does not seem "informed".
 
Then why do you ignore the rising cost per transistor?
That does not seem "informed".
You are really stuck in a loop over the cost per transistor. I don't even remember anymore which thread I saw it in first. I hate that "irregardless" is now recognized as a word; I think I'm going to put that in as many threads as I can now. On topic: I'm looking forward to all the cards above 2080 performance, and hopefully the 3070 will be close to 2080 Ti performance.
 
And when I'm getting 30% less gen-to-gen performance gain than prior gens for 40% more cost, that's where I draw the line.

That sounds like hyperbole.

At launch (founders editions):

RTX 2070 42% faster than GTX 1070 for $150 (33%) higher MSRP.
RTX 2080 45% faster than GTX 1080 for $100 (14%) higher MSRP.
RTX 2080 9% faster than GTX 1080 Ti for $100 (14%) higher MSRP.
RTX 2080 Ti 40% faster than GTX 1080 Ti for $500 (71%) higher MSRP.

Bang for the buck the 2080 Ti was off the charts stupid. The 2080 didn't look great against the 1080 Ti (assuming you already owned one). However, Turing as a whole wasn't the complete $/fps disaster that so many claim.
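
For anyone who wants to check that arithmetic, here is a small sketch that reproduces those percentages. The launch Founders Edition prices are my assumption of what the deltas above are based on (1070 at $449, 1080 and 1080 Ti at $699, 2070 at $599, 2080 at $799, 2080 Ti at $1,199); the performance figures are simply the ones quoted in the post.

```python
# Assumed launch Founders Edition prices in USD (not quoted in the post itself).
fe_price = {"GTX 1070": 449, "GTX 1080": 699, "GTX 1080 Ti": 699,
            "RTX 2070": 599, "RTX 2080": 799, "RTX 2080 Ti": 1199}

# (new card, old card, relative performance of new vs. old) per the post above.
matchups = [("RTX 2070", "GTX 1070", 1.42),
            ("RTX 2080", "GTX 1080", 1.45),
            ("RTX 2080", "GTX 1080 Ti", 1.09),
            ("RTX 2080 Ti", "GTX 1080 Ti", 1.40)]

for new, old, rel_perf in matchups:
    delta = fe_price[new] - fe_price[old]
    pct = delta / fe_price[old] * 100
    # Performance per dollar of the new card relative to the old one.
    value_ratio = rel_perf / (fe_price[new] / fe_price[old])
    print(f"{new} vs {old}: +${delta} ({pct:.1f}%) MSRP, "
          f"{(rel_perf - 1) * 100:.0f}% faster, perf/$ x{value_ratio:.2f}")
```

Run as-is, the perf-per-dollar ratio lands above 1.0 for the 2070 and 2080 against their direct predecessors, slightly below 1.0 for the 2080 against a 1080 Ti, and well below for the 2080 Ti, which is the same conclusion as above.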
 
You are really stuck in a loop over the cost per transistor. I don't even remember anymore which thread I saw it in first. I hate that "irregardless" is now recognized as a word; I think I'm going to put that in as many threads as I can now. On topic: I'm looking forward to all the cards above 2080 performance, and hopefully the 3070 will be close to 2080 Ti performance.

The cost per transistor going up is going to affect prices whether you like it or not.
It is a fact, and it trumps your feelings.
Complain to AMD/Nvidia/Intel about it and see how they respond.

This will not help:
 

I am sure you're right about AMD and pricing; investors are restless with AMD's profit margins while Ngreedia makes about 20% more. That could give AMD a price advantage if they take it. The chart has nothing to do with Nvidia's Turing pricing.

Nvidia offers more performance/features; why do you sound surprised they have bigger profits?
 
The cost per transistor going up is going to affect prices whether you like it or not.
It is a fact, and it trumps your feelings.
Complain to AMD/Nvidia/Intel about it and see how they respond.

This will not help:
Once again, said on repeat; this time, in reply to what I said, it made even less sense. In a comment about how you keep saying this, you just say it again?
 
Nvidia offers more performance/features; why do you sound surprised they have bigger profits?
Not surprised, but I see them as digging a hole. Great if they can maintain the chasm, but they may just fall into it as well. True competition, which I would expect from Lisa Su, may make some things much more difficult for Jensen. Seeing 60fps titles slated for the PS5 using RT is very encouraging, and it should also bring the price point down rather than forever rising per Jensen's dreams. It doesn't matter; one should just evaluate what is available, as always, not some fanboy wet dream (not saying you are) built on non-objective wishful thinking. They're just GPUs, which after 5 or so years many will just toss in the trash. In the scheme of things, not really that important.
 
I have to build a TR system and I am waiting to see what cards are available that I can pair with it. I am hoping to see something nice from AMD, but I am not sure I will be able to take them: the system will be running Server 2019, and last time I checked, trying to install their consumer GPU drivers on server OSes was a non-starter, usually met with an immediate warning message and error. So if I can't install the driver, then I won't be buying their GPUs this time around. nVidia may not "support" their consumer GPUs on server OSes, but at least they don't kill the installer when it encounters that scenario.
 
That was an AMD slide (Hint: AMD is telling you that they also will be upping prices). You whine about the price of dies that lean on the very edge of the reticle limit (physics is king)...that millennial entitlement in a luxury hobby is a bad foundation.
Well, considering the 3700X was cheaper than the 1700 while having almost double the number of transistors, I have a hard time thinking their prices are going to jump, especially when their current line is already on 7nm. I do agree, though: you are always going to pay for new technology, and it's a luxury product that will be priced at whatever the market is willing to pay, (mostly) regardless of costs.
 
Gotta be more than just 4.3% more CUDA cores to differentiate 3070 vs 3070ti

No way both cards are coming to the shelves at the same time.
On the off chance it happens, I would definitely buy an AMD card. The wallet rape from Nvidia will be at a new level.
 
That sounds like hyperbole.

At launch (founders editions):

RTX 2070 42% faster than GTX 1070 for $150 (33%) higher MSRP.
RTX 2080 45% faster than GTX 1080 for $100 (14%) higher MSRP.
RTX 2080 9% faster than GTX 1080 Ti for $100 (14%) higher MSRP.
RTX 2080 Ti 40% faster than GTX 1080 Ti for $500 (71%) higher MSRP.

Bang for the buck the 2080 Ti was off the charts stupid. The 2080 didn't look great against the 1080 Ti (assuming you already owned one). However, Turing as a whole wasn't the complete $/fps disaster that so many claim.

The performance differentiation was in ray tracing, which Nvidia sold too early. They wanted to be first and charged us more to do it.
Now that games will finally use it more often, we will see good performance gaps between generations.
 
I just want something that is as fast as my 1080 Ti but produces at least 30% less heat.
 
...by spreading them out across three dies 😉
So, having two dies with almost the same number of transistors as the single die, adding a 2 billion transistor IOD (I/O die) @ 12nm, and having to connect them all together, while coming in at a lower price, makes what I said more correct? The 2 billion transistor IOD is almost half the 1700X (4.8 billion vs 2.09)... so if the cost of 12nm went up a tiny bit, this alone would be half the cost of the entire 1700X. The other 3.9+3.9 (7.8) billion transistors @ 7nm come in at... less than half the original cost of the 4.8 billion? I'm still not sure why splitting them up would make much of a difference, unless you're trying to say binning makes that big a difference due to die size. Even then it doesn't come close to adding up unless AMD's margins dropped tremendously to absorb the difference, but looking at their financials, that doesn't seem to be the case.

Edit: I do believe prices aren't dropping like they used to, but I think people are overestimating just how much difference it makes. I don't imagine prices will be going down, but I don't think Nvidia and AMD are reducing their profit margins because of increasing transistor costs.
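
To make that comparison concrete, here is a tiny sketch of the silicon-cost arithmetic using the transistor counts mentioned in this exchange. The per-node cost figures are invented relative placeholders purely for illustration, not real foundry pricing, and the single-CCD line reflects the 3700X's actual one-CCD-plus-I/O-die layout.

```python
# Transistor counts (billions) quoted above; cost-per-billion-transistor values
# are assumed relative units for illustration only, not real foundry pricing.
COST_PER_BTRANS = {"14nm": 1.0, "12nm": 0.9, "7nm": 1.6}

def silicon_cost(parts):
    """parts: list of (billions of transistors, node) making up one package."""
    return sum(btrans * COST_PER_BTRANS[node] for btrans, node in parts)

zen1_monolithic = [(4.8, "14nm")]                               # 1700/1700X single die
zen2_one_ccd    = [(3.9, "7nm"), (2.09, "12nm")]                # 3700X: one CCD + IOD
zen2_two_ccds   = [(3.9, "7nm"), (3.9, "7nm"), (2.09, "12nm")]  # 3900X-style: two CCDs + IOD

for label, parts in [("Zen 1 monolithic", zen1_monolithic),
                     ("Zen 2, one CCD  ", zen2_one_ccd),
                     ("Zen 2, two CCDs ", zen2_two_ccds)]:
    print(f"{label}: relative silicon cost ~{silicon_cost(parts):.1f}")
```

Whatever cost-per-transistor ratios you plug in, this only bounds the raw silicon cost; it says nothing about packaging, binning recovery, or the margins AMD chose to take, which is really where the pricing argument lives.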
 
In the end I think we are all expecting 24-35% performance increases and possibly a doubling or more of ray tracing (aka Puddles and Mirrors Mode) performance. Meanwhile, those cardboard 2D trees still won't appear any closer on the horizon in any game you play today...and you'll still notice shadow and texture LOD fade-in from 10 feet :)

Hardware is great and all but...you know...fuck super-accurate reflective puddles, we have bigger problems to solve for.
 
In the end I think we are all expecting 24-35% performance increases and possibly a doubling or more of ray tracing (aka Puddles and Mirrors Mode) performance. Meanwhile, those cardboard 2D trees still won't appear any closer on the horizon in any game you play today...and you'll still notice shadow and texture LOD fade-in from 10 feet :)

Hardware is great and all but...you know...fuck super-accurate reflective puddles, we have bigger problems to solve for.
Replacing all the fuzzy math we use to fake lighting with accurate global illumination is a big problem, and it's one that ray tracing solves. You're ill-informed if you think ray tracing is just about creating reflections.
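
To illustrate the difference being described (a toy sketch, not anyone's actual engine code): the "fuzzy math" approach hard-codes an ambient constant, while a ray/path tracer estimates the same diffuse lighting integral by actually sampling incoming light over the hemisphere. The incoming_radiance function below is a made-up stand-in for tracing rays into a scene.

```python
import math, random

def sample_cosine_hemisphere():
    """Cosine-weighted direction about the surface normal (z-up); pdf = cos(theta) / pi."""
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

def incoming_radiance(direction):
    """Stand-in for tracing a ray into the scene: bright 'sky' above, dim 'ground' bounce."""
    return 1.0 if direction[2] > 0.2 else 0.2

def diffuse_gi(albedo, samples=4096):
    """Monte Carlo estimate of the diffuse rendering-equation integral.
    With cosine-weighted sampling the cos/pdf terms cancel, leaving albedo * mean(L_i)."""
    total = sum(incoming_radiance(sample_cosine_hemisphere()) for _ in range(samples))
    return albedo * total / samples

ALBEDO = 0.7
FAKED_AMBIENT = 0.5  # the hand-tuned constant a traditional rasterizer might bake in
print("faked ambient term :", ALBEDO * FAKED_AMBIENT)
print("sampled GI estimate:", round(diffuse_gi(ALBEDO), 3))
```

The sampled estimate responds to whatever the surroundings actually look like, while the baked constant has to be hand-tuned per scene; that is the gap ray tracing closes, over and above reflections.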
 
That sounds like hyperbole.

At launch (founders editions):

RTX 2070 42% faster than GTX 1070 for $150 (33%) higher MSRP.
RTX 2080 45% faster than GTX 1080 for $100 (14%) higher MSRP.
RTX 2080 9% faster than GTX 1080 Ti for $100 (14%) higher MSRP.
RTX 2080 Ti 40% faster than GTX 1080 Ti for $500 (71%) higher MSRP.

Bang for the buck the 2080 Ti was off the charts stupid. The 2080 didn't look great against the 1080 Ti (assuming you already owned one). However, Turing as a whole wasn't the complete $/fps disaster that so many claim.

Really, if the 1080 Ti wasn't SO good, I don't think it'd be an issue. That being said, the 2080 Ti's pricing was crazy even if the 1080 Ti hadn't existed.
 
the highest tier cards are really only necessary for 4K...for 1440p a 2070 Super or even 2060 Super are very good choices
I would agree with this. Using my RX 580 @ 1440p and dropping everything from ultra to various settings, I can play close to a locked 60 fps in The Division 2 and Destiny 2.
 
Replacing all the fuzzy math we use to fake lighting with accurate global illumination is a big problem, and it's one that ray tracing solves. You're ill-informed if you think ray tracing is just about creating reflections.

I am aware of the power of ray tracing for creating photo-realistic renders and delivering the age-old dream of photo-realistic gaming. I understand its potential for improving lighting, shadowing, and reflections...all of those components being updated in real time as a scene changes frame to frame.

Unfortunately, I'm also not enough of a potato to think we're even peering into the looking glass of that reality with a 2nd-gen gaming card. Mostly because history teaches us that things like this generally take 5-10 years to hit their stride and start delivering on promises...the ones we have in our heads that WE put there, not the ones the companies' marketing departments are always careful to only *allude to*.

Ray tracing in 2020 is delivering overly-reflective puddles, hyper-waxed mirror-finish automobiles, and overly-bright versions of Quake 2. I'd rather have a baked-in lighting solution and the removal of all LODs from the worlds I game in before I need to make "those shadows even more shadowy". Yesterday I was playing Fallout 4 (again) on my Xbox....when I moved in front of a brightly-lit light source nothing happened, but when my NPC moved in, it cast a reasonable approximation of the NPC's shape on the ground next to it. When a bucket moved, the shadow moved with it. Nothing else in the scene could be moved by my rockets or grenades. It was convincing enough. So to me, getting ray-traced lighting in that scene (so I would self-shadow, for example) is not the problem I really wanted solved (other games have pulled that off convincingly for years, just not FO)......I'd prefer we solve for the 5 different versions of objects or textures or shadows that mysteriously appear, or the grass that grows just at the edge of the horizon. To me those are bigger issues than, again, hyper-reflective surfaces updating in real time.
 
I fully anticipate $600 used 2080 Tis. And honestly, that's about all they're going to be worth.
And I fully anticipate vendors like EVGA selling "B-stock" 2080 Tis for no more than a 15% discount off brand-new prices, as those prices sit right now.
 
I fully anticipate $600 used 2080 Tis. And honestly, that's about all they're going to be worth.

Based on what in the past gives you that idea?
And if you think that price is "right" for the performance/features...then AMD is waaay overpriced.

You did not think before you posted...nice move 🤣
 
Not surprised, but I see them as digging a hole. Great if they can maintain the chasm, but they may just fall into it as well. True competition, which I would expect from Lisa Su, may make some things much more difficult for Jensen. Seeing 60fps titles slated for the PS5 using RT is very encouraging, and it should also bring the price point down rather than forever rising per Jensen's dreams. It doesn't matter; one should just evaluate what is available, as always, not some fanboy wet dream (not saying you are) built on non-objective wishful thinking. They're just GPUs, which after 5 or so years many will just toss in the trash. In the scheme of things, not really that important.

Thanks for reminding me that you should not be taken seriously...I just got a flashback of your "serial raytracing" drivel.
 
The thing today is, if you remove RTX, budget cards can play games exceptionally well, something budget cards sucked at 10+ years ago. Even something 'old' like a 1060 can breeze through newer games at 1080p with medium+ graphics settings.

Long ago, you bought the best or your experience blew.
 
The thing today is, if you remove RTX, budget cards can play games exceptionally well, something budget cards sucked at 10+ years ago. Even something 'old' like a 1060 can breeze through newer games at 1080p with medium+ graphics settings.

Long ago, you bought the best or your experience blew.
GTX 970 on my second / backup / living room computer... it'll still game. The last one it struggled with was The Outer Worlds at 1440p with the settings buried, out of the newer AAA games I've tried (not many).
 