Nvidia SUPER Refresh: Faster 2060, 2070, 2080, Faster GDDR6 and $100 Off

I can't defend price / performance, nor the utility of RTX in specific use cases.

What I've said is that I understand how they got to that pricing. The RTX GPUs are quite large; they offer increased performance over their predecessors and they have significant die space reserved for hardware RT. Further, I don't really care what Nvidia is charging: as has been obvious, by pricing RTX cards the way they have, their unit sales have dropped.

That's on them.


Yep, they need to balance unit sales against unit profit and unit cost. Once they have that balanced, it will be fine. If they charge enough more to make up for selling fewer units, their profit margins on the cards will equalize. Then they can produce fewer cards and claim more profit, which improves their financial efficiency.
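To put rough numbers on that balancing act (these are purely made-up figures, not Nvidia's actual prices, costs, or volumes):

```python
# Purely illustrative numbers -- not Nvidia's actual costs, prices, or volumes.
def profit(units_sold, price, unit_cost):
    return units_sold * (price - unit_cost)

# Hypothetical "last gen": lower price, higher volume.
last_gen = profit(units_sold=1_000_000, price=700, unit_cost=450)

# Hypothetical "this gen": higher price, lower volume.
this_gen = profit(units_sold=625_000, price=1_200, unit_cost=800)

print(last_gen, this_gen)  # 250000000 250000000 -- same profit on far fewer units
```

Whether the real margins work out that neatly is another question, but that's the trade-off being described.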

Yes, the cards are expensive for video cards. WAY back when I bought a $450 BFG card, I thought that was expensive too. And people complained about the increased cost of cards then as well.

So that BFG card was EASILY 10 years ago... let's see...

So the $450 I spent on the BFG card then would equate to $537.77 today.

I paid $800 for the card (just about a buck over after taxes, actually), but I sold my old card for $150, so I spent $650 net on my new card.

So by that math, the financial impact was just over $100 more than what I spent on my card back then. My income is vastly different than it was then, though; back then I made roughly half of what I make today.

Really, considering that BFG went defunct in 2010, I might have bought that card in 2007. That means the $450 would be about $560 today... so that improves things a bit more.
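For anyone who wants to redo that adjustment for their own purchase year, it's just a CPI ratio; the index values in this little sketch are placeholders, not official figures:

```python
# Inflation adjustment is just old_price * (CPI_now / CPI_then).
# The CPI values below are placeholders, not official figures -- plug in
# real index values for the years you actually care about.
def adjust_for_inflation(old_price, cpi_then, cpi_now):
    return old_price * (cpi_now / cpi_then)

# Example: $450 spent when the index sat at 100, assuming prices have
# risen roughly 24% since then (illustrative only).
print(round(adjust_for_inflation(450, cpi_then=100.0, cpi_now=124.4), 2))  # ~559.8
```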

So if we account for the fact that PART of the cost increase people are so incensed about comes down to how much the value of a dollar has changed... perhaps that will help alleviate the concern?

Of course... going way back to 1990, when I first started spending my own money on computer parts... $450 then was BARELY enough for 4 MB of RAM.
 
PCIe 3.0 isn't an issue now...

It's really just a raw horsepower problem.

Well, really, I disagree. With everything wanting processor control and access, those PCIe lanes are becoming more and more of an issue... (especially on Intel CPUs).

A single PCIe 4.0 lane has double the bandwidth and frequency of a PCIe 3.0 lane. This means effectively you can have 4 (probably 3) PCIe 3.0 lanes for every 1 PCIe 4.0 lane.

So a CPU with... 40 PCIe 4.0 lanes has 120 equivalent PCIe 3.0 lanes? That's BONKERS. And it will prove beneficial if PCIe 5.0 doesn't hit the market on the Intel side.
 
Oh here we go..... RTX 2080 Super Dooper Mega Awesome Leather Jacket edition...
 
So a CPU with... 40 PCIe 4.0 lanes has 120 equivalent PCIe 3.0 lanes? That's BONKERS. And it will prove beneficial if PCIe 5.0 doesn't hit the market on the Intel side.

The issue here isn't the total bandwidth, but that you don't get any more lanes. If moving from PCIe 3.0 to PCIe 4.0 meant that the lanes could easily be split up among more devices, then there'd be far more utility; as it stands, you get the same number of lanes whether they run at PCIe 3.0 or PCIe 4.0 speeds.

More bandwidth on 16 lanes for a single GPU or four lanes for an SSD isn't really going to help consumer workloads, unfortunately.
 
I paid $1,400 each for two Maxwell-based Titan Xs for SLI, so the $1,249.99 price of my current RTX 2080 Ti card didn't bother me.

At the time there wasn't a halo Titan RTX above your Maxwell Titans and the 980 Ti was $650.
 
Well, really, I disagree. With everything wanting processor control and access, those PCIe lanes are becoming more and more of an issue... (especially on Intel CPUs).

A single PCIe 4.0 lane has double the bandwidth and frequency of a PCIe 3.0 lane. This means effectively you can have 4 (probably 3) PCIe 3.0 lanes for every 1 PCIe 4.0 lane.

So a CPU with... 40 PCIe 4.0 lanes has 120 equivalent PCIe 3.0 lanes? That's BONKERS. And it will prove beneficial if PCIe 5.0 doesn't hit the market on the Intel side.
Like Burticus said, it doesn't quite work like that. PCIe 4.0 has the same number of traces as 3.0, just switching twice as fast. You could use fewer PCIe lanes per device, but only if those devices support the faster PCIe Gen 4 spec. Gen 3 devices would be stuck with fewer, slower Gen 3 lanes unless you had a PCIe Gen 3 to Gen 4 mux and wired all the slots for 16 lanes, which would be very complex and expensive.
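For what it's worth, here's a quick back-of-the-envelope check on the bandwidth side; the per-lane rates are the published PCIe 3.0/4.0 figures after 128b/130b encoding, and the 40-lane CPU is just the hypothetical from the quoted post:

```python
# Usable per-lane throughput, one direction, after 128b/130b encoding overhead.
GB_PER_S_PER_LANE = {
    "gen3": 8 * (128 / 130) / 8,   # 8 GT/s  -> ~0.985 GB/s per lane
    "gen4": 16 * (128 / 130) / 8,  # 16 GT/s -> ~1.969 GB/s per lane
}

lanes = 40  # the hypothetical CPU from the quoted post

gen4_total = lanes * GB_PER_S_PER_LANE["gen4"]
gen3_equivalent = gen4_total / GB_PER_S_PER_LANE["gen3"]

print(f"{gen4_total:.1f} GB/s aggregate")            # ~78.8 GB/s
print(f"~{gen3_equivalent:.0f} gen3 lanes' worth")   # 80, not 120
# ...but they are still only 40 physical lanes; a gen3 device on them
# negotiates gen3 speed and can't borrow the unused headroom.
```

So the aggregate bandwidth roughly doubles, but the lane count, and what you can physically plug in, does not.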
 
But see, that's what I don't get... The same "luxury class, flagship part" jumped $500 in price from one generation to the next. If it were $800 with new fancy features, so be it; I'd buy one. At $1,200, an instant buy became a no-buy. I can't believe people defend the pricing for RTX "features."

I don't defend it. I buy the fastest card I can when a new generation comes out. That was the 2080 Ti. I didn't see anything on the market besides that card to get top performance. Did you? We all have our price limitations, you reached yours last gen. My limitation is around $1400 for top-end.
 
I don't defend it. I buy the fastest card I can when a new generation comes out. That was the 2080 Ti. I didn't see anything on the market besides that card to get top performance. Did you? We all have our price limitations, you reached yours last gen. My limitation is around $1400 for top-end.

By buying into their nonsense, you're only setting yourself up for one more generation before you're in my boat too: hitting your $1,400 limit and buying a $1,000 upper-midrange card. I'm thinking a $1,499 3080 Ti... At that point, I'll be a console peasant.
 
Can't wait for AMD to give us cheap RT next.

Their most recent behavior in the GPU market has me highly skeptical- however, unlike rasterization, ray tracing is damn near trivial to implement in hardware and drivers.

I would expect that to be a small hurdle for AMD whenever they choose to tackle it.
 
Their most recent behavior in the GPU market has me highly skeptical- however, unlike rasterization, ray tracing is damn near trivial to implement in hardware and drivers.

I would expect that to be a small hurdle for AMD whenever they choose to tackle it.

I find it strange that AMD is the one being risk averse about putting new tech into their GPUs. For the longest time, AMD/ATI was the company you could count on to do that first. Now it's the other way around, and everyone says it's worthless. I suppose the argument was the same back then.
 
I find it strange that AMD is the one being risk averse about putting new tech into their GPUs.

They seem to be taking on only the market segments they can compete in. Frustrating for enthusiasts, but it makes sense from a business standpoint, and it demonstrates reason for hope for their upcoming products.
 
I'm still not interested in upgrading from my GTX 1070... I'm content to wait for 2nd-gen ray-tracing cards... if the 2070 was $299 instead of $399, I might have been tempted.
 
Meh, don’t really care unless my 1080ti goes to silicon heaven. If that happens AMD currently has nothing and the next logical step is a 2080ti.
 
I find it strange that AMD is the one being risk averse about putting new tech into their GPUs. For the longest time, AMD/ATI was the company you could count on to do that first. Now it's the other way around, and everyone says it's worthless. I suppose the argument was the same back then.

AMD and ATI being so gung-ho with new tech bit them in the ass on more than one occasion. RTX isn’t much different than AMD talking up tessellation a few years ago, just that Nvidia has both the money and resources to push devs to support it.
 
AMD and ATI being so gung-ho with new tech bit them in the ass on more than one occasion. RTX isn’t much different than AMD talking up tessellation a few years ago, just that Nvidia has both the money and resources to push devs to support it.
I've seen interviews where devs talk about how much easier it is to use RT and light shafts or whatever the hell they call it, rather than having to program in static reflections.

Imagine open-world games with modern buildings where you see ACTUAL reflections of the surrounding environment in the windows instead of static images. The horsepower needed for that level of integration is probably mind-boggling, but in 3-5 generations we will have it so fully immersive that games without it will seem archaic.
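Reduced to a toy sketch (this isn't any real engine's API, just an illustration of the static-lookup-vs-live-scene difference the devs are getting at):

```python
# Toy illustration only -- not a real engine or graphics API.
# A baked reflection looks up data captured once at build time;
# a ray-traced reflection queries whatever is in the scene this frame.

def baked_reflection(direction, baked_cubemap):
    # Static: whatever the artist captured at build time, forever.
    return baked_cubemap.get(direction, "sky")

def ray_traced_reflection(direction, live_scene):
    # Dynamic: whatever the reflection ray would hit right now.
    return live_scene.get(direction, "sky")

cubemap_at_build_time = {"east": "empty street"}    # captured once, never updates
scene_this_frame = {"east": "red car driving by"}   # changes every frame

print(baked_reflection("east", cubemap_at_build_time))   # empty street (stale)
print(ray_traced_reflection("east", scene_this_frame))   # red car driving by
```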
 
https://www.guru3d.com/news-story/r...force-rtx-2060-(249)2070-(399)2080-(599).html

The 2070S and 2060S seem redundant, as do the 2080 and 2080S. A cut-down 2080 Ti (2080 Ti SE?) would have done a better job of filling a gap. I just don't know if that makes sense with the architecture.
I'd like to sell a bridge to anyone thinking that the refreshed cards with faster memory will be priced like the following: "RTX 2060, 2070 and 2080 graphics cards, and that they will cost $249, $399 and $599 respectively."
 
Don't tell me you bought a card for a game that does not even have a release date.:whistle:

I'm sure many people here have done upgrades specifically because of one or two games that weren't out when they were buying parts. "I'm upgrading for [game]" is a pretty common thing to see with big games.
 
I'm sure many people here have done upgrades specifically because of one or two games that weren't out when they were buying parts. "I'm upgrading for [game]" is a pretty common thing to see with big games.
I get your point; that's why I'm upgrading for Star Citizen.
I'm more of a wait-and-see type: final specs and maybe even a review or two before taking the plunge. But that's just me.
 
https://www.guru3d.com/news-story/r...force-rtx-2060-(249)2070-(399)2080-(599).html

The 2070S and 2060S seem redundant, as do the 2080 and 2080S. A cut-down 2080 Ti (2080 Ti SE?) would have done a better job of filling a gap. I just don't know if that makes sense with the architecture.
It seems more likely that the release of these cards drops the old cards in price, and that the new cards release at the current RTX MSRPs. But we shall see. I agree that Nvidia giving us all that plus a price drop does not sound very reasonable.
 
By buying into their nonsense, you're only setting yourself up for one more generation before you're in my boat too: hitting your $1,400 limit and buying a $1,000 upper-midrange card. I'm thinking a $1,499 3080 Ti... At that point, I'll be a console peasant.

You could also just not buy the top-end card; nothing at all wrong with that. Most companies in most industries really rip people off on high-end products. A high-end Lincoln is just a ton more money than a Ford and doesn't offer that much for it; you can buy a decked-out Fusion that will feel almost as good. You gain some performance, but you can still do a ton with a lower-end GPU. The thing is, companies know there will always be someone who wants the highest-end product, if for nothing else than to help them stand out. Even consoles have higher-end models.
 
Other than maybe the 2060S to battle the 5700 XT, there is not much point to these, as they really do not fill any gaps.

Still haven't heard anything on a 1650 Ti, which would be the most useful addition for Nvidia.
 
Their most recent behavior in the GPU market has me highly skeptical- however, unlike rasterization, ray tracing is damn near trivial to implement in hardware and drivers.

I would expect that to be a small hurdle for AMD whenever they choose to tackle it.

And still no DXR driver from AMD... NVIDIA did it with Pascal; something triggers my spider-sense here...
 
AMD's driver guy is busy with Navi?

:D

lol

Well, something is off.
NVIDIA has two generations (Pascal + Turing) supporting DXR now... AMD is talking "next gen"... I mean, looking at AMD's info, GCN should be a compute powerhouse capable of running stuff "asynced" (or in tandem)... but still nothing.

They have to be careful that they don't wait so long that their fanbase will BSOD when they hear RT is not dying but coming to AMD too ;)
 
lol

Well, something is off.
NVIDIA has two generations (Pascal + Turing) supporting DXR now... AMD is talking "next gen"... I mean, looking at AMD's info, GCN should be a compute powerhouse capable of running stuff "asynced" (or in tandem)... but still nothing.

They have to be careful that they don't wait so long that their fanbase will BSOD when they hear RT is not dying but coming to AMD too ;)

I have Pascal and couldn't care less about the whole 3 games you can use DXR in; it's literally worthless for now. AMD will support it when they feel the need to.
 
lol

Well, something is off.
NVIDIA has two generations (Pascal + Turing) supporting DXR now... AMD is talking "next gen"... I mean, looking at AMD's info, GCN should be a compute powerhouse capable of running stuff "asynced" (or in tandem)... but still nothing.

They have to be careful that they don't wait so long that their fanbase will BSOD when they hear RT is not dying but coming to AMD too ;)

With both the PS5 and Project Scarlett having RT support of some sort, maybe AMD will talk about PC RT at their event tomorrow.
 