Anyone seriously considering the 4070 TI

Sure, if you ignore reality. So wanting lower prices makes people entitled? :wacky:

I'd love to have access to Nvidia's internal costs for these.

Right now they are claiming that this is the new reality, because GPUs cost more to make now than in 2001, when the fastest GPU money could buy cost $349, or in 2013, when the original 6GB Titan cost $999, or in 2016, when the Pascal Titan X cost $1,199.

To a certain extent I believe them. There has been inflation, the slowing of Moore's law makes both R&D and chip manufacturing more expensive, and higher power draw makes cooling solutions more expensive.

That said, I am highly skeptical that current pricing actually reflects growing costs compared to years past. A small year-over-year price increase would not be unexpected, but this one has been massive and seems wholly unjustified.
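As a rough sanity check on the inflation argument, here's the napkin math using the launch prices above. The CPI multipliers to roughly-2023 dollars are ballpark assumptions on my part, not official figures:

```python
# Napkin math: historical flagship prices adjusted for inflation.
# The CPI multipliers to ~2023 dollars are rough assumptions.
CPI_TO_2023 = {2001: 1.7, 2013: 1.3, 2016: 1.26}

launches = {
    2001: 349,   # fastest GPU money could buy
    2013: 999,   # original 6GB Titan
    2016: 1199,  # Pascal Titan X
}

adjusted = {year: round(price * CPI_TO_2023[year]) for year, price in launches.items()}
for year, today in adjusted.items():
    print(f"{year}: ${launches[year]} then is roughly ${today} in today's dollars")
```

Even in today's dollars, none of those flagships comes close to a $1,600 4090, so inflation alone doesn't explain the gap.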
 

It doesn't have to be reflective of costs. Granted, TSMC raised prices, so their costs will be higher, but they just spent the last two years watching people snap up video cards at ridiculous prices, and they want to see how much of that will continue. They exist to maximize profit, that's it. This is the same thing that happened when Turing launched. Prices came back down to reasonable levels when Ampere launched, for about two weeks, before the world declared that a crypto coin with a picture of a dog on it was the future of money, and Nvidia apparently hasn't accepted that that's over and we've gone back to reality.

As much as I hate it, Nvidia is right to try this strategy. After all, they launched a card for $1,600 USD and saw massive lineups, in a post-crypto-boom world, from gamers who wanted a few extra frames, so why not see if they'll pay $800 for a 4070 Ti? If they do, great. If not, you can always lower the price.
 

Of course. Pricing is not set at cost-plus; it is set at what people are willing to pay.

But Nvidia is trying to say it is the new normal and blaming their costs. I'm just calling bullshit on that.

With the halo-level 4090 it is due to Nvidia artificially limiting supply (because it is easier to sell fewer parts at higher margin than more parts at lower margin). They are trying to do the same thing with the 4080 and 4070 Ti, but are quickly finding out that people just aren't willing to pay those prices for mid-to-high-end parts, resulting in the very slow sales of the 4080.

Time will tell if it will have the same effect on the 4070 Ti, whether people will hold on to older GPUs, or whether they will reluctantly spend the $849 real-world price for these models (because very few AIB models will actually release at Nvidia's MSRP of $799).
 
Sure, if you ignore reality. So wanting lower prices makes people entitled? :wacky:

That seems to be one of the arguments going around now with the 4070 Ti. My favorite, though, is that it uses half the power and beats the 3090 Ti in RT at 4K with DLSS 3, and in raster at 1440p/1080p, which somehow sells the card such that it should fly off the shelves.
 
Beating one of the worst values in history is hardly impressive.
 
Well, the party of the last decade is basically over. You're already seeing the shift in the European consumer market, since they've been hit harder first. The US consumer market is almost there too, and I'd suggest there is a high probability that a card with enterprise-grade performance like the 4090/Ti is the last of its kind for many years. Consumer prices will fall back, but we'll be lucky to get any serious performance gains, as the focus will shift to making cheaper, more power-efficient products, since consumers at large are running out of cheap debt and going broke. My larger concern is that this current high-end consumer market is basically going to go away due to those economic factors. If you want actual high-end performance at home, we'll be paying $20k for enterprise-grade GPUs.

Since the 4090 has no problem selling I don't expect that to happen. But if production costs and inflation really are responsible for what's going on, we may see cards like the 4080 disappear entirely leaving a monumental performance gap between a $2000 4090Ti "Titan" and the 4070Ti and lower. Without the crypto mining boom there is no market above $1000 except for that of the Titan crowd.

The most Nvidia can get away with is reverting to a $1,200 2080 Ti / 3080 Ti, imo. I expect them to drop in a 4080 Ti, a slightly cut-down 4090 with half the VRAM, after the $1,200 4080 experiment is deemed a failure.
 
Half the VRAM would be 12GB; they may as well not even make a card at that point. I could see it being 20GB instead, or something like that. I'm not sure what the napkin math is for the configurations they can do. They may never officially cut the 4080's price and admit a mistake (discounts will just show up in six months or so). I think it's more likely there's a Ti card that slots in at $1,200 and a $1,000 4080 Super that's basically an overclocked version.
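For what it's worth, the napkin math is constrained by the memory chips themselves: each GDDR6X chip has a 32-bit interface and ships in 1GB or 2GB densities, so capacity options follow directly from the bus width. A quick sketch, assuming standard (non-clamshell) configurations:

```python
# Possible VRAM capacities for a given memory bus width, assuming
# one 32-bit GDDR6X chip per channel in 1GB or 2GB densities.
# (Clamshell mounting, two chips per channel, would double these.)
def vram_options(bus_width_bits, chip_densities_gb=(1, 2)):
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return [chips * d for d in chip_densities_gb]

print(vram_options(384))  # 4090's full bus: [12, 24] GB
print(vram_options(320))  # a cut-down bus:  [10, 20] GB
print(vram_options(192))  # 4070 Ti's bus:   [6, 12] GB
```

So a 20GB card would imply a 320-bit bus with 2GB chips, while 12GB could come from 1GB chips on the full 384-bit bus (as on the 3080 Ti).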
 
I wouldn't say entitled. I think there are many people who just don't see the reality of our current economic situation, coupled with how complex these GPUs are getting. They aren't getting cheaper.

Yes, those are the people setting the prices of the GPUs. Grocery costs have gone up around 30% here. Even accounting for inflation, GPU prices have increased further. People just don't have as much to spend, and the higher GPU prices go, the more niche those cards become.
 

Also keep in mind that the reality of this economic situation is that fewer and fewer people can afford an $850 GPU.

And not that long ago, an $850 GPU was very high end; now it's midrange.

So not only are you getting less for your money, but more and more people can't even afford to spend what they used to.
 
How long would it take prices to come down if they even do?

I doubt we can figure that out unless we had substantial sales figures, manufacturing costs, and some internal dialogue with stores/AIBs/Nvidia.
 

Short of more competition or some sort of price war, I am pessimistic that it will happen.

During the mining craze and shortage, the FOMO kiddies showed Nvidia that they are willing to spend almost anything to get their hands on a GPU, and companies set pricing at the highest price they think people are willing to pay.

ADHD gamer kiddies showed Nvidia that there is almost no end to how much they are willing to pay, and I fear that has permanently changed the market.
 
And not that long ago, an $850 GPU was very high end; now it's midrange.

I think NVIDIA considers the 4070 Ti to be high end now. Didn't they say that "mainstream" cards (which I take to mean mid-range) are now those from the 30-series?

“40-series cards aren’t for the poors.“ — NVIDIA probably
 
A friend of mine just bought a 4070 Ti. I don't want to say it's good value, but if someone wants to drive a 4K monitor without spending $1,000+, there really aren't a lot of options. Just a couple months ago I thought my 6900 XT was going to have some legs, and now I look at benchmarks and see how far it lags behind the 4000 series and the 7900 XTX... Even with my 1440p monitor I'm only getting 110-120 fps in Apex, which might be partially blamed on my 3900X.
 
That's true of almost every generation - Maxwell to Pascal and Pascal to Turing aside - generally we've seen major jumps every year or every other year. The GeForce 3 was followed by the GeForce 4 a year later to the day - the Ti came out 8 months after the initial GeForce 3 release. The FX was a year later, and then (thank god) the 6000 series was there a year after that. And so on - lately it's taken longer, but I'm still conditioned to expect that every generation of graphics card will be a significant jump in performance - and potentially cost.
 
How long would it take prices to come down if they even do?
Pretty sure the ultra high end is going nowhere. Consumers have voted with their wallets, and $1,600-2,000 appears to be no problem; top-end cards like the 4090 sell out instantly. There is hope for the other tiers, but I don't think we'll see truly reasonable pricing again. Better prices, sure, eventually, after they sit on stock for six months to see if consumers break. Even if the 4080 dropped to $999.99 and the 4070 Ti to $699 they would still be overpriced, but I bet at those prices they would start selling out, because it would seem like a bargain compared to where they were, and Nvidia will laugh all the way to the bank. But I have a feeling it will be AMD that caves first; once XTX sales slow in a few months and they are sick of sitting on XT stock, we will see some tangible downward movement.
 
The Steam Hardware Survey is not indicative of most gamers

I think you are wrong there.

Even with some new stores having popped up in the last few years, Steam is still BY FAR the largest source of PC games, and their statistical sampling of PC hardware is the best tallying of the market there is. If someone has a PC they use for games, chances are they have a Steam account. Even if they get many of their games elsewhere (like Free EGS shit or Game Pass) almost everyone has a Steam account, and the Steam client, and if they do, they are randomly sampled as part of the hardware survey.

It can be surprising to some on enthusiast hardware sites like ours to see just how much low end hardware there is out there, but we have to remember that we aren't necessarily representative of the market as a whole. We are the crazy few who make this our primary hobby, and spend way more than your average person on hardware.

Looking at the hardware survey, it tracks perfectly with many of the conversations I see in tech communities. For instance, a majority of people are still holding on to Win 10 for now. 16GB of RAM just recently became the norm for those who enjoy games, and in the survey it just recently crept over the 50% mark. It appears to track community hardware trends pretty accurately.
 
All of that is true - but there's a statistical issue with who hits "yes" on the survey; I believe I saw a report that it tends to be both the low end and the high end, and that the middle is underrepresented (either "meh, I don't want people to know," or "I'm embarrassed," either of which could skew the numbers). If it wasn't voluntary we could trust it for sure, but as is, there's a potential for mis-sampling there (hell, I hit "no" 90% of the time because I just can't be arsed - or I don't need it reporting that I game on an M1 Pro MacBook, when I was just firing up Steam to use it as a link to another system for an actual desktop app).

Plus then you get into folks who have multiple systems - am I a 3090? 3060 Ti? 6800 XT? 2080 Ti? ... RX 480? I have all of those; all of those systems have Steam. Which one actually gets used the most? (The 6800 XT.)

We'd need an actual research group to dig into the numbers to determine if it's a valid statistical sampling or not - although it's certainly useful as-is, we can't be 100% sure of HOW useful yet.
 

That is true. I forgot you had to opt in.

Those who are proud of their hardware are more likely to opt in than those who aren't. That's classic self-selection bias.
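That sampling error is easy to quantify with a toy model: if mid-range owners opt in at a lower rate, the reported shares skew even when everyone answers honestly. The shares and opt-in rates below are made-up numbers, purely for illustration:

```python
# Toy model of self-selection bias: reported share = true share x opt-in
# rate, renormalized. All of these numbers are illustrative assumptions.
true_share = {"low": 0.3, "mid": 0.5, "high": 0.2}   # actual installed base
opt_in     = {"low": 0.5, "mid": 0.2, "high": 0.5}   # chance of hitting "yes"

weighted = {tier: true_share[tier] * opt_in[tier] for tier in true_share}
total = sum(weighted.values())
reported = {tier: round(w / total, 3) for tier, w in weighted.items()}

print(reported)  # mid-range falls from 50% of owners to under 30% of responses
```

Nobody lied, yet the mid-range share drops from half the installed base to under a third of responses.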
 
Founders packaging:

NVIDIA GeForce RTX 4070 Founders Edition packaging leaks out

 
All of that is true - but there's a statistical issue with who hits "yes" on the survey; I believe I saw a report that it tends to be both the low end and the high end, and that the middle is underrepresented (either "meh, I don't want people to know," or "I'm embarrassed," either of which could skew the numbers). If it wasn't voluntary we could trust it for sure, but as is, there's a potential for mis-sampling there (hell, I hit "no" 90% of the time because I just can't be arsed - or I don't need it reporting that I game on an M1 Pro MacBook, when I was just firing up Steam to use it as a link to another system for an actual desktop app).

Plus then you get into folks who have multiple systems - am I a 3090? 3060 Ti? 6800 XT? 2080 Ti? ... RX 480? I have all of those; all of those systems have Steam. Which one actually gets used the most? (The 6800 XT.)

We'd need an actual research group to dig into the numbers to determine if it's a valid statistical sampling or not - although it's certainly useful as-is, we can't be 100% sure of HOW useful yet.
There certainly will be some error but I'm not sure how you can be embarrassed in a completely anonymous survey.
 
A 2 slot 4070Ti FE would be quite the hit for the SFF market. But I guess the SFF crowd isn't large enough for Nvidia to care.
Agreed. I've got a NUC 12 Extreme with a 3060TI in it - which tends to be the biggest you can put in there.
 
Also keep in mind that the reality of this economic situation is that fewer and fewer people can afford an $850 GPU.

And not that long ago, an $850 GPU was very high end; now it's midrange.

So not only are you getting less for your money, but more and more people can't even afford to spend what they used to.
The 4070 Ti is midrange in the lineup. But for real-world performance, midrange is 1440p, and you can do that fine on a 6600 XT, and really well on a 6700 XT, RTX 3060 Ti, or RTX 3070. There are only a couple of games where that is not true.
New-gen video cards right now are basically: "how much do you want to pay for 4K??!!"
 
The only reason I think someone might go for a 4070 Ti as opposed to a used 3080 Ti is the AV1 hardware encoding.
 

Bingo. If you want 4K high, pay up, or play at 1440p high or 4K medium settings.
 
The 4070 Ti isn't good for 4K: limited bus size and memory. I have a 6800 XT and I just played Spider-Man maxed out, without RT and with FSR set to quality. $520 is paying up for me. Two years from now, next-generation 4K 60 should be mid to low end.
 
The 4070 Ti is fine for 4K. Architecture is more important than bus size. In most games it's comparable to a 3090 Ti. There are a few where the 3090 Ti is notably better; those games may be more sensitive to the bus size. But the 4070 Ti still gives good 4K performance, even in those games.
 
The bus size isn't the main variable at play here. It's the fact that the 4070 Ti only has 12GB of VRAM, half that of the 3090 Ti. 4K uses a lot of VRAM, and the performance of the 4070 Ti drops quickly once you run out.
 