NVIDIA GeForce RTX 4070 Reviews

I've yet to experience it (DLSS 3).

I'm sensitive to latency, which is why I liked GameStream so much. It really was the best, and the only one with no perceptible latency to me - everything else, from Moonlight (with Sunshine) to Steam In-Home Streaming, has some. If you've used GameStream and the others, you know exactly what I'm talking about.

So I'm assuming I'll stick with DLSS 2 myself. But if I can't notice the latency with DLSS 3, or it's low enough in practice to be tolerable (or if they improve it with the Reflex part of DLSS 3), I've got a 170Hz monitor I'd be happy to use it with.
 
Idk. It just doesn't feel good. The 3070 made a big splash by being "2080 Ti performance, but for $500!"

Now you get not quite 3080 performance for... $600. Oh, but you get DLSS 3. I guess if you're going to make a ton of use out of that?
Yeah my thought too.

Honestly it feels like a 3070 on a die shrink with more cache, a cut-down memory bus but more VRAM, and AI upscalers. The underlying silicon doesn't feel like much of an upgrade - the 3070 and 4070 literally have the same number of CUDA cores. Yeah, I get that Lovelace is stronger and all, but it feels like a shit showing for $600 to me.
 
Wait, before people start coming in claiming Hardware Unboxed is biased toward AMD:
Funny you mention bias. After I watched LTT's review of the 4070 this morning, I got the impression Linus and his writers wrote the script to purposely cast Nvidia in a favorable light. I was expecting them to bash the product like everyone else, but to my surprise they didn't. Then I recalled them bashing the 4090 launch, so I went back and rewatched that review. My memory must be flawed, because lo and behold they cast that one in a favorable light too.

The whole bias thing seems to be meaningless because people have flawed memory anyway.
 
What about the folks getting so mad that others are just buying an Nvidia 4070? 🤔

To them, I say:

(image attachment)

You're right, if anyone wants a 12GB 4070 they should go ahead and grab one. I think they're going to be disappointed in under a year when some new game they're excited about and buy day one can't manage anything but medium settings. Still, it's their money.

The more people support Nvidia and their silly 10/12GB memory setups... the more Nvidia will find ways to build obsolescence into their designs.

I remember a couple of years ago people going on about AMD's 16GB cards being nothing but stat padding etc.... and here we are a couple of years later, and last-gen AMD cards are running decently well in the latest (apparently all buggy) titles. They are also all of a sudden outperforming Nvidia in RT on last-gen hardware in newer titles.

This launch feels like more of the same from Nvidia to me... great GPU hampered by having JUST enough VRAM for now.
 
DLSS3 is not a good selling point. It looks awful and it adds latency. It is inferior to DLSS2, which doesn't generate frames.

DLSS3 is a joke.
When you add Reflex in there the latency issue is not bad. Pair it with a G-Sync-compatible display and it feels even better from there. Updates to DLSS 3 since launch have also significantly improved visuals and greatly reduced artifacts. At launch there were a lot of rushed implementations that gave weird results, but updates have greatly improved it since then.
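
Just to put rough numbers on that - a back-of-the-envelope sketch, assuming frame generation holds back one rendered frame so it can interpolate between two real ones, and using a made-up figure for how much Reflex trims out of the render queue:

Code:
# Rough latency arithmetic (assumption: frame gen buffers one real frame to interpolate).
base_fps = 60.0                      # frames the GPU actually renders per second
frame_time_ms = 1000.0 / base_fps    # ~16.7 ms per real frame

fg_added_ms = frame_time_ms          # roughly one held-back frame of extra latency
reflex_saved_ms = 10.0               # hypothetical figure for what Reflex removes from the queue

print(f"Frame gen adds ~{fg_added_ms:.1f} ms, Reflex claws back ~{reflex_saved_ms:.1f} ms (assumed), "
      f"net ~{fg_added_ms - reflex_saved_ms:+.1f} ms vs. no frame gen")

The real numbers obviously depend on the game, the base frame rate, and the settings, so treat it as a ballpark only.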
 
When you add Reflex in there the latency issue is not bad. Pair it with a G-Sync-compatible display and it feels even better from there. Updates to DLSS 3 since launch have also significantly improved visuals and greatly reduced artifacts. At launch there were a lot of rushed implementations that gave weird results, but updates have greatly improved it since then.

Latency is so subjective that no one should take anyone's word for it as gospel, though - if a lot of people give it a thumbs up, check it out - but it's always subjective, and the more sensitive you are, the more you should expect the worst.

That said, are you speaking from hands-on experience, or from things you read online? 🤔
 
Latency is so subjective that no one should take anyone's word for it as gospel, though - if a lot of people give it a thumbs up, check it out - but it's always subjective, and the more sensitive you are, the more you should expect the worst.

That said, are you speaking from hands-on experience, or from things you read online? 🤔
I needed a new work machine, and the Avigilon software requires an Nvidia GPU if I want facial or license plate recognition, so given the cost of a 4080 compared to an equivalent RTX A4000, it was a no-brainer, right... So yeah, my work desktop is a better gaming rig than my actual gaming rig, and I'm sorta mad about that, so I had to give it a try, and yeah, it was pretty decent. My monitor being G-Sync compatible was a pure fluke and wasn't something I had paid attention to when I ordered it, because Dell was just giving me a bulk deal on a series of monitors they were looking to offload.
I would certainly never play a few rounds of Gotham Knights with the friends from here while I do server updates after hours.... No...
 
I needed a new work machine, and the Avigilon software requires an Nvidia GPU if I want facial or license plate recognition, so given the cost of a 4080 compared to an equivalent RTX A4000, it was a no-brainer, right... So yeah, my work desktop is a better gaming rig than my actual gaming rig, and I'm sorta mad about that, so I had to give it a try, and yeah, it was pretty decent. My monitor being G-Sync compatible was a pure fluke and wasn't something I had paid attention to when I ordered it, because Dell was just giving me a bulk deal on a series of monitors they were looking to offload.

Nice 👌
 
When you add Reflex in there the latency issue is not bad. Pair it with a G-Sync-compatible display and it feels even better from there. Updates to DLSS 3 since launch have also significantly improved visuals and greatly reduced artifacts. At launch there were a lot of rushed implementations that gave weird results, but updates have greatly improved it since then.
DLSS3 is, as of now, the most useless framerate improvement tech yet, making sense only when there already is ample underlying performance.
 
DLSS3 is, as of now, the most useless framerate improvement tech yet, making sense only when there already is ample underlying performance.
Yeah, you do need at least 60 fps for it or FSR 3 to be viable. I suspect it exists to pair better with 4K 120Hz displays (I don't have one so I can't test it), but from what I have read, if you use DLSS 3 and frame-lock yourself to 120 fps, you can turn on more eye candy and not deal with tearing or VSync issues on 4K TVs if you're using one as your display. Supposedly the nicer newer displays like the LG C2 don't have this issue, since they have some sort of G-Sync-compatible VRR, but for the more abundant TCL panels that are all over the place and don't, it's a big help.
I see it as a play for the console people: lots of people who bought a new Xbox or PS5 also have a matching 4K TV, and this lets them use that TV with their PCs and not feel like things are worse.
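
For what it's worth, the math I'm picturing there is something like this (a sketch assuming frame generation roughly doubles presented frames and the TV is a fixed 120Hz panel without VRR):

Code:
# Frame-cap sketch: match presented frames to a fixed-refresh 4K TV.
display_hz = 120                     # typical 4K 120Hz TV
rendered_fps = 60                    # what the GPU actually renders
presented_fps = rendered_fps * 2     # frame gen interleaves one generated frame per real frame

cap_fps = min(presented_fps, display_hz)   # frame-lock so you never overshoot the panel
print(f"Render {rendered_fps} fps, present ~{presented_fps} fps, cap at {cap_fps} fps "
      f"to stay in step with the {display_hz} Hz panel")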
 
Latency is so subjective that no one should take anyone's word for it as gospel, though - if a lot of people give it a thumbs up, check it out - but it's always subjective, and the more sensitive you are, the more you should expect the worst.

That said, are you speaking from hands-on experience, or from things you read online? 🤔
That's why I said it feels better - to me it's not bad, but I'm not a twitch gamer and most FPS titles make me want to puke, so I get it. I've only used DLSS 3 in Darktide, so I can say that in that one instance, any latency the frame generation did induce wasn't really distinguishable from the latency generated by my connection to their servers.
 
lol I don't live in the US. :) But fair. It's still a terrible idea to buy a 12GB card in 2023.
I tend to agree on "future-proofing," but I can see a 12GB card making sense. If you're a peasant like I am, maybe all you really want is 1080p gaming, and 12GB does that with plenty to spare. Even at 1440p it's not so bad, but I'd be concerned about it a year from now.

As for the 4070 itself... is it overpriced? Oh hell yes. It wouldn't have cost much more to equip it with 16GB either.
 
Well tbf, Nvidia kind of went out of their way to make the 4090 the only card that really stands out this generation.

Everything else has been kind of...meh to not good.

If the 4080 was like a 900 dollar card (and actually sold for 900), I'd buy one.
The technical issues don't exist on cards with a proper amount of VRAM.

For what both companies are asking for mid-range cards these days... I want to know it's going to actually be able to play a game released NOW. I don't care how well they play 4-year-old games like AC Odyssey; my 5700 XT can handle that game. I want to know how a new GPU handles games my 3-year-old cards are just now starting to stumble on. It's very easy to find multiple newer games pushing 15GB in use at 1080p. I want to see those games reviewed before I'd consider pulling the trigger on a 12GB card (even if they have to add a bunch of asterisks and list version numbers with the results).

I also don't get why they still benchmark games that get like 200 fps.
 
If the 4080 was like a 900 dollar card (and actually sold for 900), I'd buy one.


I also don't get why they still benchmark games that get like 200 fps.
Easy answer... esports gamers with high-refresh-rate monitors want the highest frame rates possible at 1080p. That calls for the fastest CPU and GPU.
 
I tend to agree on "future-proofing," but I can see a 12GB card making sense. If you're a peasant like I am, maybe all you really want is 1080p gaming, and 12GB does that with plenty to spare. Even at 1440p it's not so bad, but I'd be concerned about it a year from now.

As for the 4070 itself... is it overpriced? Oh hell yes. It wouldn't have cost much more to equip it with 16GB either.

That is sort of the problem though. 12GB is just barely enough for 1080p high settings in newer games, and 8GB for sure isn't good enough anymore... I posted the Hardware Unboxed video from the other day detailing how 8GB 3070s are shitting the bed in multiple newer titles. In those side-by-side comparisons he ran, it was obvious the AMD 6800 with 16GB of VRAM was using 15GB of it at 1080p with RT enabled (and not only outperforming the 3070s it sold for the same as or cheaper than... but also not crashing).

12GB may be just barely enough right now for 1080p... and really isn't enough for 1440p today. At the price of these mid-range cards, I don't know... if I'm running into tons of games in less than 2 years that I have to run at medium settings because of VRAM issues, I'd be a bit annoyed personally. 12GB seems like a huge gamble right now with the number of developers basically saying they refuse to twist themselves to squeeze things into less than 16GB.
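
If anyone wants to sanity-check their own card instead of taking a reviewer's word for it, here's a minimal logging sketch using the pynvml bindings (Nvidia-only, and note it reports total VRAM allocated across all processes, not just the game; GPU index 0 and the 2-second poll interval are my assumptions):

Code:
# Minimal VRAM logger via NVML; run it in a terminal while the game is running.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)         # assumes the game is on GPU 0
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)  # allocation across all processes
        print(f"VRAM: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(2)                              # arbitrary polling interval
except KeyboardInterrupt:
    pynvml.nvmlShutdown()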
 
I also don't get why they still benchmark games that get like 200 fps.

I suspect... and maybe I'm just an old cynic... that a lot of these reviewers get a list of approved-for-review games. Being kind, perhaps they get fed lines like... X developer hasn't optimized that title yet, or some other BS. Right now, though, I am finding it very interesting that every single big-outlet review of the 4070 doesn't touch games like Hogwarts, Callisto, Resident Evil, Plague Tale.... I swear there was a time when reviewers were [H] and actually attempted to run the latest, greatest, choke-your-hardware software when doing reviews (even for the mid-range stuff).
 
That is sort of the problem though. 12GB is just barely enough for 1080p high settings in newer games, and 8GB for sure isn't good enough anymore... I posted the Hardware Unboxed video from the other day detailing how 8GB 3070s are shitting the bed in multiple newer titles.

I haven't had any issues on mine, except for a handful of games with ray tracing. Not that frame rates were high enough to play without DLSS in the first place anyway, and DLSS lowers the required VRAM. And I am running 2560x1440.

I know a few games gave me a VRAM warning (I think the RE3 remake?) but they ran absolutely fine.
 
Another garbage-tier release by Nvidia. It's fascinating that all of their releases are good cards; the pricing and naming conventions just suck. This would've been great as the 4060 Ti at 450 dollars. Both companies seem hell-bent on maximizing pricing; I'm expecting the 7800 XT to slot in right between this and the 4070 Ti and be priced to match.

Honestly, I bet if they had named it the 4060 and priced it at 600 dollars, with the 4070 Ti being the 4070 at its current price, both cards would have sold like hotcakes, because people would have justified it by the performance increase over the previous generation's card. There's enough of a gap between all the cards that they could have easily slotted in Ti variants, with or without an increased CU count, but with 16GB of RAM and an increased TDP.
 
For people that have the case and PSU for a 6950 XT (in that regard they are almost two different tiers of card, so for many the direct comparison is not that useful, but for some it does matter, considering the really close prices at the moment):



In a sense it is impressive how closely a ~295mm², 192-bit, ~190W card can keep up with a ~520mm², 256-bit, ~350W card released less than a year ago at $1,100 MSRP. It would just take an AMD 7700 offering to force Nvidia to pass more of those impressive savings on to us.

Looking at what a 7800X3D can do, and what GPUs can do at around 200 watts now, the next console generation will be able to do a lot with that 160-200 watt budget, with dynamic upscaling and other tricks getting better and better.
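
Just putting those numbers side by side (same rough figures as above; AD104 and Navi 21 are the chips in the 4070 and 6950 XT respectively):

Code:
# Quick ratios from the figures quoted above (die area mm^2, bus width bits, board power W).
ad104  = {"area_mm2": 295, "bus_bits": 192, "power_w": 190}   # 4070-class card
navi21 = {"area_mm2": 520, "bus_bits": 256, "power_w": 350}   # 6950 XT-class card

for key in ad104:
    print(f"{key}: {ad104[key]} vs {navi21[key]} -> {ad104[key] / navi21[key]:.0%} of the bigger card")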
 
For people that have the case and PSU for a 6950 XT (in that regard they are almost two different tiers of card, so for many the direct comparison is not that useful, but for some it does matter, considering the really close prices at the moment):



In a sense it is impressive how closely a ~295mm², 192-bit, ~190W card can keep up with a ~520mm², 256-bit, ~350W card released less than a year ago at $1,100 MSRP. It would just take an AMD 7700 offering to force Nvidia to pass more of those impressive savings on to us.

Was this already pasted?

 
 
I suspect... and maybe I'm just an old cynic... that a lot of these reviewers get a list of approved-for-review games. Being kind, perhaps they get fed lines like... X developer hasn't optimized that title yet, or some other BS. Right now, though, I am finding it very interesting that every single big-outlet review of the 4070 doesn't touch games like Hogwarts, Callisto, Resident Evil, Plague Tale.... I swear there was a time when reviewers were [H] and actually attempted to run the latest, greatest, choke-your-hardware software when doing reviews (even for the mid-range stuff).
That was very obvious to me as well. When I saw Control I was like, "Really?"
 
That was very obvious to me as well. When I saw Control I was like, "Really?"

What, you don't wanna know how a brand-new GPU handles a 4-year-old single-player title that anyone who enjoyed it has already gone through twice... one that has been a free giveaway multiple times now from multiple sources? Control is great and all, but yeah, I played that game out 4 years ago.
 
If it were $200 less, slam dunk!! $100 less, pretty good choice. But at this MSRP - and nowadays (anyone remember the good ole days?) that means AIB cards will be priced even higher - I think you have to weigh the choices carefully.

Personally, I'm just happy to see a card designed to "fit" your existing case.
 
Well, they didn't sell out. Maybe things will improve price-wise over the next year. Nvidia won't cut prices unless AMD does; they don't need the gaming revenue.
 
I am finding it very interesting that every single big-outlet review of the 4070 doesn't touch games like Hogwarts, Callisto, Resident Evil, Plague Tale....
Sometimes TechPowerUp can feel like the main reference, and they did test some of those:
https://www.techpowerup.com/review/nvidia-geforce-rtx-4070-founders-edition/27.html
[TechPowerUp charts: The Callisto Protocol and A Plague Tale: Requiem at 2560x1440]


Pretty sure HUB will have their 50-something-game benchmark out soon. Didn't some of those big outlets include those games in their recent 7800X3D reviews, just not now?
 
Hogwarts Legacy, The Callisto Protocol, The Last of Us Part I, the new Resident Evil (which Hardware Unboxed just showed an 8GB 3070 crashing on at 1080p), the new Plague Tale game. No doubt there are more... and many more on the way. Every one of those will use well over 12GB of VRAM at 1080p ultra and 1440p high.

I would not go so far as to call shenanigans on all the major reviewers today for not including games like Hogwarts, which is one of the most popular games going at the moment... but I do have to wonder why no one has covered any of this year's big eye-candy titles. I seem to remember a time when reviewers didn't ONLY look at 2+ year old games in a review of a brand-new GPU. Sure, perhaps they'd mention with an asterisk, hey, we used patch X.X... but ignoring all games released in the last two years seems odd to me.
Uhhh... what? I'm actively playing Hogwarts Legacy right now, and I've gone through each graphical setting to see how much VRAM is used (i.e. Native, DLSS, DLSS + FG, etc.). It never goes above 11GB at full Ultra with RT enabled.
 
Uhhh... what? I'm actively playing Hogwarts Legacy right now, and I've gone through each graphical setting to see how much VRAM is used (i.e. Native, DLSS, DLSS + FG, etc.). It never goes above 11GB at full Ultra with RT enabled.

How much VRAM do you have available? I also didn't say you would run out of VRAM with 12GB... I said 8GB was already shown to run out of VRAM in that game. You just said yourself you're pushing 11GB. That means all those 8GB "ray tracing for the future-proofing" buys from the last product cycle WOULD run out of VRAM, right? How many people grabbed 3060s... and 3070s based on the "RT is the future" crap (sure, AMD is faster in raster, but Nvidia has RT)? The 8GB 3070s that people bought because they would do RT better in the future vs the 16GB AMD options are crashing and having texture-popping issues in games like Potter, where the AMD competition, which may have lesser RT hardware but can actually feed it since it doesn't run out of VRAM, is reporting much better, even playable, RT in such games.
 

That's a six-month-old video; there have been a few DLSS updates released since then that have fixed that issue. Frame Generation is a very good feature. It's just sad that it gets so much flak since it's a 40-series-only feature, and what makes it even worse is people talking about it who have probably never used it, just regurgitating what reviewers said six months ago. As someone who actively uses it, I've seen little to no change in input latency with either KB/M or controller.

I bet once FSR 3 drops, all the nay-sayers are going to call it the best thing since sliced bread, ignoring all the shit they gave FG, as well as Nvidia being the first, and more than likely having the best implementation. The AMD fanboyism is strong in this thread.
 
How much VRAM do you have available? I also didn't say you would run out of VRAM with 12GB... I said 8GB was already shown to run out of VRAM in that game. You just said yourself you're pushing 11GB. That means all those 8GB "ray tracing for the future-proofing" buys from the last product cycle WOULD run out of VRAM, right? How many people grabbed 3060s... and 3070s based on the "RT is the future" crap (sure, AMD is faster in raster, but Nvidia has RT)? The 8GB 3070s that people bought because they would do RT better in the future vs the 16GB AMD options are crashing and having texture-popping issues in games like Potter, where the AMD competition, which may have lesser RT hardware but can actually feed it since it doesn't run out of VRAM, is reporting much better, even playable, RT in such games.

Hogwarts is poorly optimized. Yes, you can brute-force it, but that doesn't mean it shouldn't be better. The stuttering issues were present on all GPUs last I checked. I ran it fine with some ray tracing and DLSS. Frame rates were good, but some areas were poor, mainly some particular spots on the map. The patches fixed a lot of that. I have not kept up on the progress the patches have made since I finished the game.

Outside of a few ray tracing settings I turned off, I ran the game just fine on an RTX 3070. Frame rates would dip to 45-50 in a certain hall, but otherwise were 70-90 elsewhere. I assume they have fixed more of the performance issues since.

Now if you're running at 4K it will be out of VRAM, but you won't be getting good frame rates in the first place.
 
This is actually wrong. 4K on a 6800 XT, no FSR. Runs great at 48-50+ fps, zero stutters.

On the release build? Stuttering wasn't a problem I experienced either, on 8GB of VRAM, but I played with some patches applied. The only issues were very particular areas with low frame rates.
 