NVIDIA GeForce RTX 4070 Priced at $600

So the 4090 has drastically increased your willingness to play PC games?
Absolutely! I was burned out on PC gaming for the past few years until this purchase. The 4090 has transformed my gaming experience to a level I didn't think possible. Games that were just OK at 1440p on a 27in monitor are now much more enjoyable to me due to the added immersion of NV Surround and the high refresh rates afforded by the 4090. The best games are made even greater because everything is amplified to the 10th degree at 4320x2560@165Hz. It's gaming bliss and I wish I'd done it sooner...
 
Absolutely! I was burned out on PC gaming for the past few years until this purchase. The 4090 has transformed my gaming experience to a level I didn't think possible. Games that were just OK at 1440p on a 27in monitor are now much more enjoyable to me due to the added immersion of NV Surround and the high refresh rates afforded by the 4090. The best games are made even greater because everything is amplified to the 10th degree at 4320x2560@165Hz. It's gaming bliss and I wish I'd done it sooner...
Huh, fair enough. I guess I've just always "played", even on low-end hardware :) Whatever works, though.
 
Yeah, 12GB is low if you ask me. It should be at least 16GB for a 4070.

12GB should be what is on the 4060/4060ti.

To be fair, the number of games that have run out of VRAM on my RTX 3070 is slim to none. The few that did were unoptimized messes, and also had ray tracing enabled, which produced unsatisfactory frame rates anyway. DLSS can also help lower VRAM requirements. I doubt 12GB will be too little.

If you are running at 4K then yes, you want more than 12GB without a doubt. But the card's 4K performance will be too poor in the first place, so that is kind of a moot point.
 
I'll just keep buying last gen then. Regular folks like Cesário offload clean, high-end cards at fair prices to guys like me who are okay with turning down settings and playing at peasant resolutions.
 
Well, the leather jacket himself said a GPU upgrade should be like buying a new console. So yeah, take from that what you will.

Ironically, a 4070 Ti is less capable of running 4K than a PS5 at the same price because of its memory limitations. It's just fucking stupid now; it's way more cost-effective to buy a console, which can provide more or less the same visual experience at 4K without all the additional hardware costs.
 
Ironically, a 4070 Ti is less capable of running 4K than a PS5 at the same price because of its memory limitations. It's just fucking stupid now; it's way more cost-effective to buy a console, which can provide more or less the same visual experience at 4K without all the additional hardware costs.
It must be a rare game that runs at 4K on a PS5. Is that down to optimization or to the hardware decompressor? A PC with a 4070 Ti will have more combined RAM and VRAM than a 16GB PS5.
 
I was expecting $699, but even $599 is too damn high.
For real. The OP says $370 for the GTX 770, but I bought an ASUS DCU2 model (a good one, not Founders) for $299 when they first came out! And the x80 class was only ~$150 higher?! The difference, too, is that back then MSRP was a high estimate; nobody was paying MSRP. Nowadays it's a lowball number.
 
Errrr most games can run at 4K (native or dynamic) on PS5.

https://www.psu.com/news/guide-ps5-...ry-ps4-ps5-game-with-enhanced-graphics-modes/

As to how it achieves that: optimization, fewer configurations to deal with, and a much simpler operating system to code for.
Nah, if you look at comparisons of consoles to PC, it's like running games at low settings, and most of the time they also run at lower resolutions and/or frame rates. And in this day and age I'm sure they're mixing upscaling technology in as well.
 
Nah, if you look at comparisons of consoles to PC, it's like running games at low settings, and most of the time they also run at lower resolutions and/or frame rates. And in this day and age I'm sure they're mixing upscaling technology in as well.

It's not like the old days of the PS2 and original Xbox, where there were noticeable night-and-day differences in graphical fidelity because the PC was capable of higher texture quality and resolution. The advantage a PC has now is marginal, especially when you take the cost differential into account. I don't deny that you can see differences at the same viewing distances, but the reality is most people will not notice them when sitting 2 metres away from a large 4K screen. E.g.:

[embedded YouTube comparison videos]

GoW in particular looks almost indistinguishable between the versions.
 
It's not like the old days of the PS2 and original Xbox, where there were noticeable night-and-day differences in graphical fidelity because the PC was capable of higher texture quality and resolution. The advantage a PC has now is marginal, especially when you take the cost differential into account. I don't deny that you can see differences at the same viewing distances, but the reality is most people will not notice them when sitting 2 metres away from a large 4K screen. E.g.:

[embedded YouTube comparison videos]

GoW in particular looks almost indistinguishable between the versions.

The advantage is now marginal because these developers only pump out absolute shit ports instead of building them for PC from the ground up.
 
Yes, upscaled (which is what I meant by not running at 4K); when it is not dynamic 4K it is often at 30 fps.
With lower-quality textures, short draw distances, hard shadows, static lighting, and a lot of shortcut “optimizations” to get there.
They “optimize” a PS5 game for 4K by taking things out of a scene that you won't notice from the other side of your living room but would certainly see 16” from a 4K monitor.
 
After the 6GB was released. And the 12GB was aimed at crypto miners, not gamers.

What?

First, the 3060 was a 12GB card from day one.

Second, the 3060 had the first attempt at a hash rate limiter which was somewhat effective at first.

Whether or not the 3060 was fast enough to use 12GB effectively is another story, but the point is it had 12GB last gen.
 
Is Nvidia actually selling 4000 series cards at these crazy prices? 4090 $1700, 4080 $1200, 4070 Ti $800...

Some are moving, but the plentiful stock early in the launch cycle suggests they're not selling as many as they did in previous launch cycles.

Nvidia is doing what a lot of companies are right now, and that’s testing what the market will bear. Crypto mining inflated prices because those people were using it to make money and were willing to pay a higher price. Some gamers tapped out and went along with it. Nvidia is hoping the price anchoring worked and gamers now think $1000+ GPUs “make sense”. The same thing happened with Turing.
 
Yepp, $600 for a vanilla 70-class card.

https://www.techpowerup.com/306631/nvidia-geforce-rtx-4070-priced-at-usd-600

For those of you wondering about MSRPs of past cards, the 2023 inflation-adjusted figures are in parentheses. With inflation, it's in line with the 2070 and 3070, and quite a bit more expensive than the 970 and 1070.

770: $370 ($515.27), 2013
970: $329 ($418.09), 2014
1070: $379 ($475.06), 2016
2070: $499 ($597.83), 2018
3070: $499 ($580.03), 2020
4070: $600
This is what I have been telling people, but they seem not to understand inflation, or they think it can't possibly be that much of an increase... I cannot buy a top-trim half-ton pickup for $30,000 like I could in 1999 either; they now cost twice that or more.
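For anyone who wants to check the arithmetic, here's a minimal Python sketch; the inflation factors are simply backed out of the figures quoted above (e.g. 515.27 / 370 for 2013 to 2023) rather than pulled from an official CPI table.

```python
# Minimal sketch of the inflation adjustment above. The cumulative
# factors are backed out of the quoted figures themselves, not taken
# from an official CPI table.
launch_msrps = {
    "GTX 770 (2013)":  (370, 515.27 / 370),
    "GTX 970 (2014)":  (329, 418.09 / 329),
    "GTX 1070 (2016)": (379, 475.06 / 379),
    "RTX 2070 (2018)": (499, 597.83 / 499),
    "RTX 3070 (2020)": (499, 580.03 / 499),
}

for card, (msrp, factor) in launch_msrps.items():
    # 2023 dollars = launch MSRP x cumulative inflation since launch
    print(f"{card}: ${msrp} -> ${msrp * factor:.2f} in 2023 dollars")
```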
 
The evidence has been provided multiple times since the original 4080 topic began; Nvidia propaganda is working wonders on you guys. You cannot compare a 4000 series card to a previous "equivalent" 3000 or earlier series card. In multiple ways Nvidia has been dropping almost all, if not all, of the cards down one or two "levels" while keeping the name the same and the MSRP the same or greater. It isn't even close to just "inflation."
 
The evidence has been provided multiple times since the original 4080 topic began; Nvidia propaganda is working wonders on you guys. You cannot compare a 4000 series card to a previous "equivalent" 3000 or earlier series card. In multiple ways Nvidia has been dropping almost all, if not all, of the cards down one or two "levels" while keeping the name the same and the MSRP the same or greater. It isn't even close to just "inflation."

Case in point: a $799, 192-bit 4060 masquerading as a 4070 Ti.
 
Now that you can just go find one in a store, I keep thinking about how maybe I should finally get a 4090, but then I think about what games I'd play with it, and I come up empty.

Other than DCS World, the only game I feel anything other than inescapably "meh" about is the expansion for Horizon Forbidden West, and that's apparently PS5 only. It's been about 18 months since my last PC upgrade, but... I can't imagine what I'd do with a faster one.
 
It's sad when the price of the Titan-class card makes more sense than the rest of the lineup.
 
Yeah, 12GB is low if you ask me. It should be at least 16GB for a 4070.

12GB should be what is on the 4060/4060ti.
One of my complaints with my 3080 Ti. When you factor in that the 1080 Ti and 2080 Ti preceding it were both 11GB, it's another case of VRAM stagnation that will just make me need to upgrade sooner because of 4K.

The evidence has been provided multiple times since the original 4080 topic began; Nvidia propaganda is working wonders on you guys. You cannot compare a 4000 series card to a previous "equivalent" 3000 or earlier series card. In multiple ways Nvidia has been dropping almost all, if not all, of the cards down one or two "levels" while keeping the name the same and the MSRP the same or greater. It isn't even close to just "inflation."
Just look at the silicon. They kind of mucked it up with the 103 for the 4080, but it's still 70-class cards on third-tier silicon; they just called it 104 instead of 106.
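For what it's worth, here's a rough sketch of that die-tier argument as a lookup table; the die names are from memory, so worth double-checking against launch reviews.

```python
# Rough map of x70-class cards to their GPU dies (from memory; verify
# against launch reviews). Within a generation, a lower-numbered die
# is a bigger, higher-tier chip.
x70_dies = {
    "GTX 770":     "GK104",  # second-tier Kepler
    "GTX 970":     "GM204",  # second-tier Maxwell
    "GTX 1070":    "GP104",  # second-tier Pascal
    "RTX 2070":    "TU106",  # third-tier Turing
    "RTX 3070":    "GA104",  # second-tier Ampere
    "RTX 4070 Ti": "AD104",  # third-tier Ada (the 4090 is AD102)
}

for card, die in x70_dies.items():
    print(f"{card}: {die}")
```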
 
This is what I have been telling people, but they seem not to understand inflation, or they think it can't possibly be that much of an increase... I cannot buy a top-trim half-ton pickup for $30,000 like I could in 1999 either; they now cost twice that or more.
Buddy, the price increase is far more than just inflation. Add in the same or lower VRAM and it's a mess.
 
It's not like the old days of the PS2 and original Xbox, where there were noticeable night-and-day differences in graphical fidelity because the PC was capable of higher texture quality and resolution. The advantage a PC has now is marginal, especially when you take the cost differential into account. I don't deny that you can see differences at the same viewing distances, but the reality is most people will not notice them when sitting 2 metres away from a large 4K screen. E.g.:

[embedded YouTube comparison videos]

GoW in particular looks almost indistinguishable between the versions.

Super-compressed videos are a terrible way to show graphics comparisons. 4K UHDs can't even do lossless video, and you think YouTube can... :D
 
You can still max or nearly max any game released, and most that have been announced, at 1080p on a 1660 Super and maintain frame rates in the 60s.

We’re paying for feature bloat, that’s it.

These are the kinds of cards that let you turn the settings to max and have some decent ray tracing going on at 60+ fps at 1440p.

The reality is once you step above 1080p you pay.
 
The reality is once you step above 1080p you pay.
Okay? This has always been the case. It doesn't change the fact that 1440p hardware is now being sold at $800. Also, some games are challenging the notion of a 1660 being enough for 1080p/60fps at max settings. Hogwarts Legacy, for instance, will make it fall far short of that. Sure, "crappy console port", but it's the hottest game of the year so far. My friend's 1080 Ti was chugging here and there on his 1080p setup.
 
Okay? This has always been the case. It doesn't change the fact that 1440p hardware is now being sold at $800. Also, some games are challenging the notion of a 1660 being enough for 1080p/60fps at max settings. Hogwarts Legacy, for instance, will make it fall far short of that. Sure, "crappy console port", but it's the hottest game of the year so far. My friend's 1080 Ti was chugging here and there on his 1080p setup.
Yeah, with that exception, as it will average in the 30s. But that is what most people are playing with; at Medium + FSR it's back into the 80s and looking good.

A good 1440p monitor is still going to cost more than most people pay for their GPUs; there is a reason 1080p still makes up more than 60% of Steam machines on a daily basis. It costs more than most are willing to put into the hobby, 1080p has always been the baseline, and that isn't about to change, so once you step outside of 1080p you are paying a premium. It sucks, but all the bitching in the world won't change that, and AMD actively wants to keep it that way to protect their console numbers, which make up a significant portion of their annual revenue. With no competition from AMD to lower the bracket, Nvidia is going to go for the throat, leaving Intel as our savior; but the second they are established, they are going to be right there alongside them.

The reality of the market is that anything of GTX 1070 Ti performance class or better is still a perfectly viable gaming card at 1080p thanks to FSR and DLSS, and until gaming projects step up their low end that isn't changing any time soon. There is still a crapload of RX 6000, RTX 2000, and RTX 3000 hardware flooding the market, and AIBs are still pushing them out. Those have already been paid for, so Nvidia and AMD see nothing from their sales as they trickle out, and the new stuff is priced to make up that shortfall.

Nvidia's recent announcement that it is going after "unauthorized" refurbishers and the like isn't as altruistic as it sounds. They want any reason they can find to gut the supply on the second-hand market and bring sales back in line, because as long as those cards exist, new sales are hampered by the fact that Nvidia is competing against its own products in bulk, at prices that are too low to bother with.
 
Case in point: a $799, 192-bit 4060 masquerading as a 4070 Ti.

I think it would have been okay as a 4070 at $600. A bit pricey, but it does have decent performance. The regular 4070's rumored specs and price tag look underwhelming; more like a 4060 Ti at best.

I do wonder what a regular 4060 will look like. Probably $400, with $450 actual retail price.
 
This is what I have been telling people, but they seem not to understand inflation, or they think it can't possibly be that much of an increase... I cannot buy a top-trim half-ton pickup for $30,000 like I could in 1999 either; they now cost twice that or more.
That's a terrible comparison, because anyone watching the car market also knows there was a massive shortage there too, due again to the silicon shortage. It was simply a function of being able to produce fewer cars, not of any of the components of a car going up in price.

In fact, Tesla has been lowering prices because the price of lithium has decreased (paywalled, sorry, but the point stands), something that no one thought would ever happen.

If you're paying close attention, there is an expectation that the car market is actually going to implode. The silicon shortage is tapering off, so production has gone up, while demand has dropped like a rock. The ability of car manufacturers to command high prices for vehicles is ending. For evidence of this, look at what is happening with Carvana: they bought like crazy because they bet that skyrocketing prices would never end. Now demand has dropped in the used market at these inflated prices, and they and everyone else will have to lower prices to get sales again, for Carvana likely at an extreme loss.

This isn't inflation. It is a function of supply and demand. The supply side crashed due to the shortage, so prices increased. Now supply has come back, but there isn't increased demand, so prices have to come down again to reach equilibrium. This is also exactly what is happening in the GPU market. The majority of this is absolutely not inflation (some of it is, but not 30+%). This is the supply side trying to hang on to higher prices and failing hard, because the demand isn't there at these prices. They have to reach equilibrium again for people outside of whales/enthusiasts to actually buy more (this is economics 101: supply and demand curves). We know with 100% certainty that we are not at economic equilibrium because video cards are sitting on shelves; demand is not equal to supply. In fact, every publication has been stating that this is the lowest quarter for CPU and GPU sales in over a decade.

If AMD/Nvidia want GPU sales, then they're going to have to re-balance their curve and figure out how many GPUs they need to sell to make a profit similar to previous years. And they're going to have to take a margin cut, or find cost-cutting measures, to bring pricing in line with what people actually want to spend.
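To put that economics-101 point in concrete terms, here's a toy sketch with made-up linear supply and demand curves; every number in it is illustrative, not real GPU market data.

```python
# Toy illustration of the supply-and-demand point above. The curves
# and every number here are made up; this is not GPU market data.

def demand(price: float) -> float:
    """Hypothetical units buyers want at a given price."""
    return max(0.0, 1000 - price)

def supply(price: float) -> float:
    """Hypothetical units vendors will sell at a given price."""
    return max(0.0, 2 * price - 200)

# Walk the price down from a crypto-era level until the market clears.
price = 900.0
while supply(price) > demand(price) and price > 0:
    price -= 1.0

print(f"Toy equilibrium price: ${price:.0f}")   # -> $400
print(f"Units cleared: {demand(price):.0f}")    # -> 600
# Above the equilibrium price there is excess supply: cards sit on
# shelves, which is exactly the situation described above.
```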
 
I think it would have been okay as a 4070 at $600. A bit pricey, but it does have decent performance. The regular 4070's rumored specs and price tag look underwhelming; more like a 4060 Ti at best.

I do wonder what a regular 4060 will look like. Probably $400, with $450 actual retail price.
Not until the supply of 3070s and 3060 Tis dries up, they won't. Nvidia's AIB partners would shit a brick if Nvidia invalidated a crapload of their overstock by releasing something better-performing and cheaper. Nvidia might be harsh with them, but it won't openly antagonize them like that. Nvidia's release schedule at this point seems very much in line with the stock of 3000 series parts: only once a particular performance class has dried up do they launch its 4000 series successor. I mean, what would a 4060 look like, and how would it compare to, say, a 3070 Ti in terms of pricing and availability? Would there be a place for it while new 3070 Tis are actively available for purchase?
 
Not until the supply of 3070s and 3060 Tis dries up, they won't. Nvidia's AIB partners would shit a brick if Nvidia invalidated a crapload of their overstock by releasing something better-performing and cheaper. Nvidia might be harsh with them, but it won't openly antagonize them like that. Nvidia's release schedule at this point seems very much in line with the stock of 3000 series parts: only once a particular performance class has dried up do they launch its 4000 series successor. I mean, what would a 4060 look like, and how would it compare to, say, a 3070 Ti in terms of pricing and availability? Would there be a place for it while new 3070 Tis are actively available for purchase?
Actually, that's exactly what should happen. Old tech should see a massive price decrease, and incoming parts should be faster and cheaper. "Cheaper" is relative: for the same cost as the previous generation, buyers should get a higher-performing card, meaning that at a lower cost users should be able to buy something at least equal to a higher-tier product from the previous generation.

An i5 today destroys a Xeon from 2010. Prices go down, performance goes up.

What Nvidia should be doing is giving their board partners rebates and releasing the new product stack at pricing similar to the old product stack. That's what every other market does, and what Nvidia used to do. But as we know, they have become bigger dicks toward their board partners (eating all the margin without any recourse), and they are holding onto crypto pricing for as long as possible.
 