NVIDIA GeForce RTX 4070 Reviews

Canada Computers is already giving away a RAM kit with purchase, and I have seen some models on sale locally here, for a card that just launched.

I have also now come across one local seller giving away a Samsung 970 Evo Plus 1TB M.2 SSD with any Gigabyte RTX 40 GPU... the desperation is becoming palpable.
 
I'm a bit annoyed that in the reviews I've seen they ALWAYS compare the 4070 to the 10GB 3080 and not the 12GB variant. It loses to the lowly 10GB at 4K by a fair margin, and I suspect that gap would widen against the 12GB, as it was better hardware and had 2GB more RAM.
Anyone have 3080 12GB vs. 4070 4K/VR graphs to share to shame the 4070 more?
Everything that I have seen regarding the 12GB 3080 is that you just look at the performance of a 3080 Ti but imagine that it's pulling all the electricity instead of most of it.

Is it annoying to see a missing data point, especially if you happen to own that point? Yes.

Unless you can find the 12GB model for less than $250 new or used, it's simply not worth owning or comparing.
 
I have also now come across one local seller giving away a Samsung 970 Evo Plus 1TB M.2 SSD with any Gigabyte RTX 40 GPU... the desperation is becoming palpable.

That sounds like a good deal. It probably has more to do with local sellers trying to get people in the door, though. If Newegg, Best Buy, or Amazon start doing the same, I think it will tell us more about sales.
 
If you buy a $700 card with 12GB on it... you're stupid.
As Hardware Unboxed showed the other day... 8GB 3000-series Nvidia cards are getting handily beaten in newer titles by older AMD cards with 16GB of RAM in RAY TRACING performance. Even without RT enabled, some of the old 8GB cards are essentially unplayable at 1080p ultra and in some cases HIGH settings.
That was one game that was poorly ported, and the game dev for it has put out a patch to fix it:

source: https://feedback.naughtydog.com/hc/...Last-of-Us-Part-I-v1-0-3-0-Patch-Notes-for-PC
More fixes are still coming since the 3070 issues aren't fully resolved just yet.

It's worth pointing out that the port was outsourced to another company... you get what you pay for.
Buying 12GB right now is setting yourself up for tears.
I agree more RAM is better, but at the same time those cards are still going to work just fine in 5 years, even at 1440p. The 1660 Ti has only 6GB of RAM and it still plays just fine.
If you want to see a video card where the VRAM is a problem, that would be a GT 1030 with 2GB of VRAM; some games can't run on that. 8GB cards have a lot of life left and can probably be used until they fail. The 12GB cards, same. It's not a bottom-tier xx30 model...
Game developers have hit a point where they have all said fuck it... 16GB is the target now. The consoles can handle it, and we are not spending a year of development trying to squeeze decent-looking texture packs into 8GB, or finding invisible load points to swap textures in and out.
Oh? Sounds like a developer who is making games I don't want to buy... more likely you are just talking out of your ass on that one.
In the case of Last of Us, that port was outsourced to a company that will either get its shit together or fold.
OK, let me help you out then, since maybe you didn't notice it... One of the negative comments was about problems related to lack of VRAM. You downplayed this by blaming shitty ports. While it may indeed be true that this situation is in some cases the fault of a bad port, the fact of the matter is that bad ports happen and will continue to happen.
Shitty ports likely will continue to happen, but the issues they have vary. More VRAM is not a cure-all for shitty console ports... not even close.
 
I'm a bit annoyed that in the reviews I've seen they ALWAYS compare the 4070 to the 10GB 3080 and not the 12GB variant. It loses to the lowly 10GB at 4K by a fair margin, and I suspect that gap would widen against the 12GB, as it was better hardware and had 2GB more RAM.
Anyone have 3080 12GB vs. 4070 4K/VR graphs to share to shame the 4070 more?
Yeah, I was actually curious about this too. I believe the 3080 12GB also had, what, like 3% more shaders than the 10GB version?
 
Everything that I have seen regarding the 12GB 3080 is that you just look at the performance of a 3080 Ti but imagine that it's pulling all the electricity instead of most of it.

Is it annoying to see a missing data point, especially if you happen to own that point? Yes.

Unless you can find the 12GB model for less than $250 new or used, it's simply not worth owning or comparing.
As far as I can recall, the 3080 12GB was far more power efficient than the 3080 Ti. No one is finding any 3080 new or used at $250, especially the 12GB version. A new 4070 is $600 minimum and has poorer performance at 4K/VR than both 3080s, so I don't understand why you think they are not worth owning or comparing.
 
That sounds like a good deal. It probably has more to do with local sellers trying to get people in the door, though. If Newegg, Best Buy, or Amazon start doing the same, I think it will tell us more about sales.
Seems to have received only a very lukewarm response on local bargain forums; the mindset of many consumers here seems to be that the entire RTX 40 series is mispriced and needs to be recalibrated significantly lower. Economic headwinds caused by sky-high inflation and interest rates at levels not seen since 2008 (on debt levels that have gone parabolic) are probably exacerbating this attitude.

For what it's worth, VideoCardz is claiming the same thing for the US market.

https://videocardz.com/newz/amd-rx-...on-7000-series-now-available-at-or-below-msrp
 
Seems to have received only a very lukewarm response on local bargain forums; the mindset of many consumers here seems to be that the entire RTX 40 series is mispriced and needs to be recalibrated significantly lower. Economic headwinds caused by sky-high inflation and interest rates at levels not seen since 2008 (on debt levels that have gone parabolic) are probably exacerbating this attitude.

For what it's worth, VideoCardz is claiming the same thing for the US market.

https://videocardz.com/newz/amd-rx-...on-7000-series-now-available-at-or-below-msrp

I agree with everything that you said. But I don't believe sales are so bad that big stores will start dropping them by $50 or giving away $50-100 items for free. Very small stores can't afford to have products sit, so I can see why local stores would do such a thing if demand is a good bit lower than expected. But I doubt sales are so poor that Newegg/BB/Amazon will start doing the same.

When we see Newegg throwing in $80 SSDs for free, then it would be safe to say sales are very poor.
 
It could be a bit of a stupid way to look at it, but an MSRP card (not a cent more) with the Asus brand tag not being sold out in launch week feels weak.

Stupid maybe, but it could just be large volume for once: the chip is a small 295 mm² die that has been stacking up since December at the latest, with all the rest probably quite easy to produce by now (the 40 series is getting older and everything is an easier version than usual).

The 4060 Ti's price/perf when announced could be a clue: if it is a steep $450 for roughly 3070 Ti performance, it would make it look like they are OK to continue with that pricing (20% more than the previous gen, with muted gains) and the sales that come with it.
 
I agree with everything that you said. But I don't believe sales are so bad that big stores will start dropping them by $50 or giving away $50-100 items for free. Very small stores can't afford to have products sit, so I can see why local stores would do such a thing if demand is a good bit lower than expected. But I doubt sales are so poor that Newegg/BB/Amazon will start doing the same.

When we see Newegg throwing in $80 SSDs for free, then it would be safe to say sales are very poor.

I agree with you as well.
 
Seems to have received only a very lukewarm response on local bargain forums; the mindset of many consumers here seems to be that the entire RTX 40 series is mispriced and needs to be recalibrated significantly lower. Economic headwinds caused by sky-high inflation and interest rates at levels not seen since 2008 (on debt levels that have gone parabolic) are probably exacerbating this attitude.

The crazy thing is that the 4070 at $600 looks like a bargain compared to the rest of the 40-series lineup... as a 3080 owner I see zero reason to 'upgrade' to a 4070, as DLSS 3 and 2GB more VRAM are not enough of a selling point... if the 4070 had 16GB of VRAM I might have been tempted.
 
Yes, it is agreed that sales are slow. The fact that we don't have scalpers at the moment is telling and welcomed. AMD has been sitting in the wings, as currently Nvidia dictates AMD's market strategy and positioning, it seems. The joy of being THE consummate second place with no pressure from anyone (yet).
So AMD will launch its own 4070-class cards soon, now that Nvidia has set the price/performance bar, and the marketing geniuses (/s) at AMD will counter, leaving Nvidia's lackluster sales slightly more so.
As a consumer I can only pray that it dominoes into a bit of a collapse, we get proper pricing soon, and the market stabilizes.
I feel that there is still too much of the crypto/COVID-money buy-at-all-cost boom lingering in folks' minds. Deprived for so long, peeps are still dropping the $. I feel it is coming to an end quickly, and the smart will be patient.
 
The crazy thing is that the 4070 at $600 looks like a bargain compared to the rest of the 40-series lineup... as a 3080 owner I see zero reason to 'upgrade' to a 4070, as DLSS 3 and 2GB more VRAM are not enough of a selling point... if the 4070 had 16GB of VRAM I might have been tempted.
Well, it's 10% cheaper per frame than a 4070 Ti (8% at 4K), and frame/$ should get cheaper and cheaper down the stack; is that enough out of line to feel like a bargain?

It feels priced perfectly in line with the market: between the 7900 XT and XTX in price per frame at 4K, about the same as a $500 3070 Ti or a $630 6900 XT. Not a bargain that will force prices down much.
https://www.techpowerup.com/review/nvidia-geforce-rtx-4070-founders-edition/33.html

AV1, the newest architecture with the longest remaining driver life, and a roughly 200 W, easier-to-manage card can still make it the clear best choice at the same price/perf, but not because it is a bargain; it's just a tiebreaker winner.
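For anyone wanting to sanity-check that kind of price-per-frame comparison, here's a minimal sketch; the prices and average 4K FPS figures are made-up placeholders for illustration, not TechPowerUp's actual data.

```python
# Sketch: comparing cards by dollars per frame (lower is better).
# Prices and average 4K FPS below are illustrative placeholders only.
cards = {
    "RTX 4070":    {"price": 600, "avg_fps": 60.0},
    "RTX 4070 Ti": {"price": 800, "avg_fps": 74.0},
}

def dollars_per_frame(price: float, avg_fps: float) -> float:
    """Cost per average frame per second of performance."""
    return price / avg_fps

for name, c in cards.items():
    print(f"{name}: ${dollars_per_frame(c['price'], c['avg_fps']):.2f} per frame")
```

With these placeholder numbers the 4070 comes out cheaper per frame despite being the slower card, which is the shape of the argument above.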
 
Yes, it is agreed that sales are slow. The fact that we don't have scalpers at the moment is telling and welcomed. AMD has been sitting in the wings, as currently Nvidia dictates AMD's market strategy and positioning, it seems. The joy of being THE consummate second place with no pressure from anyone (yet).
So AMD will launch its own 4070-class cards soon, now that Nvidia has set the price/performance bar, and the marketing geniuses (/s) at AMD will counter, leaving Nvidia's lackluster sales slightly more so.
As a consumer I can only pray that it dominoes into a bit of a collapse, we get proper pricing soon, and the market stabilizes.
I feel that there is still too much of the crypto/COVID-money buy-at-all-cost boom lingering in folks' minds. Deprived for so long, peeps are still dropping the $. I feel it is coming to an end quickly, and the smart will be patient.
What’s your opinion of the slow sales?
 
That was one game that was poorly ported, and the game dev for it has put out a patch to fix it:
source: https://feedback.naughtydog.com/hc/...Last-of-Us-Part-I-v1-0-3-0-Patch-Notes-for-PC
More fixes are still coming since the 3070 issues aren't fully resolved just yet.

It's worth pointing out that the port was outsourced to another company... you get what you pay for.

I agree more RAM is better, but at the same time those cards are still going to work just fine in 5 years, even at 1440p. The 1660 Ti has only 6GB of RAM and it still plays just fine.
If you want to see a video card where the VRAM is a problem, that would be a GT 1030 with 2GB of VRAM; some games can't run on that. 8GB cards have a lot of life left and can probably be used until they fail. The 12GB cards, same. It's not a bottom-tier xx30 model...

Oh? Sounds like a developer who is making games I don't want to buy... more likely you are just talking out of your ass on that one.
In the case of Last of Us, that port was outsourced to a company that will either get its shit together or fold.

Shitty ports likely will continue to happen, but the issues they have vary. More VRAM is not a cure-all for shitty console ports... not even close.

It's not just crashing on the 3070 8GB; it's also textures popping in and out in multiple games. It's not just one game.
Yes, of course you can just reduce the quality settings and use less VRAM. The point is... Nvidia is selling cards with just enough RAM to get them through 1-2 years of use at quality settings people would expect to be able to use. Yes, a 3070 isn't a 90-class card... but Nvidia is selling these "mid" range cards for what we used to pay for the halo products. Cards people are going to have to drop to medium settings in some games within a year of a GPU's launch. I don't know, I don't ever want to pay that much for a GPU that can't handle at least high settings a year out. But everyone will have to make their own value judgements on that one.

As for the CRAP PORT... shitty developer excuses. That is NOT the case. If you haven't been paying attention... games have taken a jump up in texture quality in the last year or so. Most Nvidia buyers, IMO, will expect the 4070s they paid a lot of money for to handle at least high settings for a year or so. When they can't, and they stutter, texture-pop, or crash... sure, people will yell OPTIMIZE this and complain about bad ports, instead of getting annoyed with Nvidia, who is selling GPUs into the higher mid range that they know full well are going to run into those problems. Nvidia works very closely with developers; they are fully aware 12GB is not enough anymore. 12GB belongs on 60-class cards, not 70. These should have had a minimum of 16GB. When a PS5 can access 15GB of texture RAM, why should game developers jump through a ton of hoops to try and squeeze their high-quality texture packs into 12GB? At the least, I hope more developers start making it very clear to gamers what to expect. They should be putting in their minimum system requirements that high-quality textures require a 12GB minimum and ultra quality requires a 16GB minimum. (Not that many gamers actually look at requirement lists.)
 
It's not just crashing on the 3070 8GB; it's also textures popping in and out in multiple games. It's not just one game.
Yes, of course you can just reduce the quality settings and use less VRAM. The point is... Nvidia is selling cards with just enough RAM to get them through 1-2 years of use at quality settings people would expect to be able to use. Yes, a 3070 isn't a 90-class card... but Nvidia is selling these "mid" range cards for what we used to pay for the halo products. Cards people are going to have to drop to medium settings in some games within a year of a GPU's launch. I don't know, I don't ever want to pay that much for a GPU that can't handle at least high settings a year out. But everyone will have to make their own value judgements on that one.

As for the CRAP PORT... shitty developer excuses. That is NOT the case. If you haven't been paying attention... games have taken a jump up in texture quality in the last year or so. Most Nvidia buyers, IMO, will expect the 4070s they paid a lot of money for to handle at least high settings for a year or so. When they can't, and they stutter, texture-pop, or crash... sure, people will yell OPTIMIZE this and complain about bad ports, instead of getting annoyed with Nvidia, who is selling GPUs into the higher mid range that they know full well are going to run into those problems. Nvidia works very closely with developers; they are fully aware 12GB is not enough anymore. 12GB belongs on 60-class cards, not 70. These should have had a minimum of 16GB. When a PS5 can access 15GB of texture RAM, why should game developers jump through a ton of hoops to try and squeeze their high-quality texture packs into 12GB? At the least, I hope more developers start making it very clear to gamers what to expect. They should be putting in their minimum system requirements that high-quality textures require a 12GB minimum and ultra quality requires a 16GB minimum. (Not that many gamers actually look at requirement lists.)

 
To be fair, the vast majority of games that are having these VRAM issues are AMD-sponsored titles (RE4 Remake, LoU Part 1, etc.)... you don't see these issues much with Nvidia-sponsored games (which are the vast majority) or games that aren't sponsored by either.
 
To be fair, the vast majority of games that are having these VRAM issues are AMD-sponsored titles (RE4 Remake, LoU Part 1, etc.)... you don't see these issues much with Nvidia-sponsored games (which are the vast majority) or games that aren't sponsored by either.
Going forward, I expect PS5 exclusives (not released on PS4 Pro or Xbox Series X) to have "porting" issues. So if you plan on playing those games in 4K, then either invest in a 4080+ or play on the PS5 itself.
 
As far as I can recall, the 3080 12GB was far more power efficient than the 3080 Ti. No one is finding any 3080 new or used at $250, especially the 12GB version. A new 4070 is $600 minimum and has poorer performance at 4K/VR than both 3080s, so I don't understand why you think they are not worth owning or comparing.
It's called buyer remorse.
 
To be fair, the vast majority of games that are having these VRAM issues are AMD-sponsored titles (RE4 Remake, LoU Part 1, etc.)... you don't see these issues much with Nvidia-sponsored games (which are the vast majority) or games that aren't sponsored by either.
Not true. Hogwarts Legacy and A Plague Tale also gave the 3070 issues.
 
Both the 2070 FE and 4070 FE were (over)priced at $599 at market launch. I'm hoping the AIB cards will fall back to $499 as they did for the 2070 AIBs. Probably ain't gonna happen anytime soon, but I can wait ;)
 
These are relative-performance TPU numbers:
https://www.techpowerup.com/gpu-specs/geforce-gtx-980.c2621

Using the x70 -> x70 jump:

770 -> 970: 43% faster
970 -> 1070: 47% faster
1070 -> 2070: 37% faster
2070 -> 3070: 50% faster
3070 -> 4070: 31% faster
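Those per-generation percentages fall straight out of relative-performance indices. Here's a minimal sketch of the arithmetic; the index values are illustrative placeholders chosen to reproduce the listed jumps, not TPU's exact figures.

```python
# Sketch: turning relative-performance indices into "% faster" jumps.
# The index values below are illustrative placeholders, not TPU's exact data.
rel_perf = {
    "GTX 770": 100.0,
    "GTX 970": 143.0,   # ~43% faster than the 770
    "GTX 1070": 210.2,  # ~47% faster than the 970
    "RTX 2070": 288.0,  # ~37% faster than the 1070
    "RTX 3070": 432.0,  # ~50% faster than the 2070
    "RTX 4070": 565.9,  # ~31% faster than the 3070
}

def pct_faster(new: float, old: float) -> float:
    """Percent speedup of `new` over `old` from relative-performance indices."""
    return (new / old - 1.0) * 100.0

cards = list(rel_perf)
for old, new in zip(cards, cards[1:]):
    print(f"{old} -> {new}: {pct_faster(rel_perf[new], rel_perf[old]):.0f}% faster")
```

Note that "% faster" compounds multiplicatively across generations, which is why single-gen percentages can't just be added when comparing two-gen jumps.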
No matter how you spin it, the 4000 series is not exactly compelling if you own a 3000 series. If you own a 2000 series and you legitimately can't play something you want to because of it, then sure, and anybody coming from a 1000 series can see some benefits, but pricing is bad across the board.
The 4000 series isn't terrible for a two-gen jump; you are looking at the 80% range there instead of 90%, so not a dramatic difference, but the 3070 was and still is solid for 1440p and below.
Lots of the people I know are making the jump to 1440p, and what they are all doing is going G-Sync/FreeSync: putting a little extra into the actual display and cutting back a little on the GPU, because an extra $100 on the display seems to make a lot more of a difference than $100 on the GPU right now, since the prices there are just bad.
 
Don't believe big red's lies; the cards are based on the same die.
That is AMD's core strategy with chiplets since day 1: they make one chip, and depending on how it fares during validation, they either change the clock speeds or block off defective sections of silicon and move it down the line. It is very cost-effective because one part now fills the entire stack, and it greatly cuts down on wasted silicon, because there needs to be a lot wrong with a die for them not to be able to salvage it.

For the entire Ryzen lineup they make one CCD and try for a full 8 cores; if one of them is defective, well, shut that block down and now it is a 6-core part (or, god forbid, a 4-core part). Do some slight binning based on stable clock speeds and blam, you now have a full product stack based on a single chip, and you have gone from an 80% yield rate to a 98% yield rate. When yields improve and 4-core CCDs aren't that common, and you don't want to spend money blocking off good silicon, just adjust the price of the 6-core to account for the better yields and make the 4-core a bad value so it gets avoided by everyone with options. You still have a product stack for everybody, and less old silicon sitting on the shelves come year end when your new lineup arrives.

It has worked very well for AMD on the CPU side, and it is their intent to do the same for GPUs. AMD doesn't want to make 4 or 5 separate GPU dies anymore, because then they have to figure out demand and how much of one die to make instead of another, and blah blah blah; it's much easier when it is just one die and they can modify it as needed to match demand on the fly.
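As a toy illustration of the salvage-binning math described above, here is a minimal sketch; the per-core defect rate and the bin rules are assumptions made up for the example, not AMD's actual figures.

```python
from math import comb

# Toy model of die harvesting: an 8-core CCD where each core has an
# independent chance of being defective. Assumed bin rules: 0 bad cores
# sells as an 8-core part, 1-2 bad cores as a 6-core part (disable a
# block), 3-4 bad cores as a 4-core part; worse than that is scrap.
# The 3% per-core defect rate is a made-up illustrative figure.
CORES = 8
DEFECT_P = 0.03

def p_exactly(bad: int) -> float:
    """Binomial probability that exactly `bad` of the 8 cores are defective."""
    return comb(CORES, bad) * DEFECT_P**bad * (1 - DEFECT_P)**(CORES - bad)

p8 = p_exactly(0)                 # sells as a full 8-core part
p6 = p_exactly(1) + p_exactly(2)  # salvaged as a 6-core part
p4 = p_exactly(3) + p_exactly(4)  # salvaged as a 4-core part
sellable = p8 + p6 + p4

print(f"8-core: {p8:.1%}, 6-core: {p6:.1%}, 4-core: {p4:.1%}")
print(f"sellable dies: {sellable:.1%}")
```

Even with these made-up numbers, only ~78% of dies are perfect, yet nearly every die is sellable as something, which is the "80% yield to 98% yield" effect the post describes.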
 
Both the 2070 FE and 4070 FE were (over)priced at $599 at market launch. I'm hoping the AIB cards will fall back to $499 as they did for the 2070 AIBs. Probably ain't gonna happen anytime soon, but I can wait ;)
The only reason it dropped in price was that the 10 series was still selling quite well, so Nvidia took it one step further by buying back all the available 1080/Ti cards from retailers. Nvidia put all their eggs in the RT basket, and they had to do anything they possibly could to make sure it didn't fail, because if it did, they would probably have been screwed for a few generations.
 
Well, according to this post, Nvidia is mulling a $50 price rebate on the 4070. Only time will tell.

But still overpriced.

It would be great if true. But the article mentions there are "discounts" on the MSI Ventus OC and ASUS Dual OC for $30 and $10. On day 1 of the launch, both of those were $600 (the same price as the non-OC) at both Newegg and Best Buy. So those "discounts" they are mentioning aren't really new; they were day-1 discounts. My assumption is those OC models are limited and will be phased out.
 