Anyone seriously considering the 4070 Ti?

I just got an Asus TUF OC 4070 Ti and it's performing beyond my expectations in 4K. Everything is set to max/ultra on Call of Duty MW3, Forza Motorsport, and a few other games with very playable frame rates.
 
I have the 4070 Ti and it was worth it. I could have gotten an AMD card for a little less, with a little more raster power, but the benefits of DLSS over FSR made it worth it to me. In many titles either card would need upscaling to reach 4K 60 fps at ultra. Once you factor in what DLSS is going to be doing with ray tracing and the competitive price drops, the green card looks better every day. I wouldn't call anyone stupid for getting the other card, though.

People flipping out about this card not being worth it are spending too much time looking at the specs and not enough time looking at the FPS. 16GB is better than 12GB, sure, but by that logic a terabyte of video memory is also better than 20GB. Hoping your currently unused video RAM 'future-proofs' your system is silly.
 
That's true. There's no point in having all that extra VRAM if you don't have the GPU muscle to use it.
 
I've been hitting the 12GB VRAM limit of my 3080 12GB a *lot* lately. It's absolutely holding me back.

If I only had a single monitor and a lower resolution it probably wouldn't matter, but I'm routinely over 10GB in games and programs I use daily, and don't even get me started on how close to the edge I run my AI art generation to stay *just* under my 12GB limit.
 
That's true. There's no point in having all that extra VRAM if you don't have the GPU muscle to use it.
We're probably not far from 4K requiring 16GB to run optimally, while 12GB will most likely be enough for 1440p for a couple of years. Currently the ideal VRAM would be 12GB for 1440p and 16GB for 4K, but it might move towards 16GB for 1440p in 1-2 years with the level of optimization they're doing nowadays. My 3-year-old 3080 10GB is barely holding up at 1440p now due to lack of VRAM, while the GPU itself is generally fast enough. I've had high-end GPUs multiple times, either the best single-chip card available or within 10-15% of it, that were obsoleted by lack of VRAM. Basically, there's no point having all the GPU muscle if you don't have the VRAM to use it.
 
I've been hitting the 12GB VRAM limit of my 3080 12GB a *lot* lately. It's absolutely holding me back.

If I only had a single monitor and a lower resolution it probably wouldn't matter, but I'm routinely over 10GB in games and programs I use daily, and don't even get me started on how close to the edge I run my AI art generation to stay *just* under my 12GB limit.
What resolution do you use? I'm wondering what's needed for 4K. I'm interested in compute and video editing, so even though the 4070 Ti would be good enough in general, I'd be concerned about the 12GB limit. I'm interested in the AI stuff too, but VRAM is 'valuable' for the other things as well. I sold my 3080 10GB in anticipation of getting a GPU with more VRAM, so it's difficult to justify 'switching' to a 4070 Ti, right?

The 4080 is just so expensive here. :-/ But is 16GB enough for all that? I was wondering about the 4080 20GB card that's supposedly going to be released into the wild, but its price isn't going to help me at all.
 
If I only had a single monitor and a lower resolution it probably wouldn't matter, but I'm routinely over 10GB in games and programs I use daily, and don't even get me started on how close to the edge I run my AI art generation to stay *just* under my 12GB limit.

It might be worth using a second video card and moving your secondary monitors over to it. I use two video cards mainly because I run 6 monitors and each card can only support a max of 4. But I only run 2 from my 4080 and the other 4 from my second card, just to keep as many resources on the 4080 free for gaming as possible. You can get used 1080 Ti cards with 11GB of VRAM for ~$175 now, which should be more than enough to handle secondary monitors even if you were doing important VRAM-intensive stuff on them.
 
We're probably not far from 4K requiring 16GB to run optimally, while 12GB will most likely be enough for 1440p for a couple of years. Currently the ideal VRAM would be 12GB for 1440p and 16GB for 4K, but it might move towards 16GB for 1440p in 1-2 years with the level of optimization they're doing nowadays. My 3-year-old 3080 10GB is barely holding up at 1440p now due to lack of VRAM, while the GPU itself is generally fast enough. I've had high-end GPUs multiple times, either the best single-chip card available or within 10-15% of it, that were obsoleted by lack of VRAM. Basically, there's no point having all the GPU muscle if you don't have the VRAM to use it.
Then you are in the same boat as I. I just got a 3080 FE not too long ago. Also at 1440p (well, 1600, but I run everything letterboxed in 16:9 because almost every goddamn game is HOR+)
 
Then you are in the same boat as I. I just got a 3080 FE not too long ago. Also at 1440p (well, 1600, but I run everything letterboxed in 16:9 because almost every goddamn game is HOR+)
With regards to performance we're in the same boat, but the difference is I got mine 2 months after launch, and I generally expect a 2-year lifespan from a GPU, with anything more as a bonus. I mostly have to fine-tune settings a bit in the newest titles and it works fine. I knew 10GB might not be enough for more than 2 years at the time of purchase, but I took a gamble. I've had 3 years with it, so I've gotten my value out of it. Shame the 40xx series has such a bad price/performance ratio outside of the 4090.
 
What resolution do you use? I'm wondering what's needed for 4K. I'm interested in compute and video editing, so even though the 4070 Ti would be good enough in general, I'd be concerned about the 12GB limit. I'm interested in the AI stuff too, but VRAM is 'valuable' for the other things as well. I sold my 3080 10GB in anticipation of getting a GPU with more VRAM, so it's difficult to justify 'switching' to a 4070 Ti, right?

The 4080 is just so expensive here. :-/ But is 16GB enough for all that? I was wondering about the 4080 20GB card that's supposedly going to be released into the wild, but its price isn't going to help me at all.
I have dual 4k60hz monitors and 1x 1440p165hz monitor.
I usually game on the 1440p monitor these days and use the 4k monitor for videos and such. Still, in certain games 4k60hz makes more sense to me.
 
With regards to performance we're in the same boat, but the difference is I got mine 2 months after launch, and I generally expect a 2-year lifespan from a GPU, with anything more as a bonus. I mostly have to fine-tune settings a bit in the newest titles and it works fine. I knew 10GB might not be enough for more than 2 years at the time of purchase, but I took a gamble. I've had 3 years with it, so I've gotten my value out of it. Shame the 40xx series has such a bad price/performance ratio outside of the 4090.
You are right. I am just getting started with this GPU. I got the best that I could afford at the time. Pricing on this generation is just far too rich for my blood...

I guess I'll find out how long 10GB will do fine. I do have the option to trade for a 3080 Ti if I pay the difference between my GPU and my friend's, but that is for another time. I've spent too much on this PC as it is.
 
You are right. I am just getting started with this GPU. I got the best that I could afford at the time. Pricing on this generation is just far too rich for my blood...

I guess I'll find out how long 10GB will do fine. I do have the option to trade for a 3080 Ti if I pay the difference between my GPU and my friend's, but that is for another time. I've spent too much on this PC as it is.
Most of the time it's about turning down textures, since they take up most of the memory, and tuning some memory-hungry effects. A shame, since textures are also often the 'cheaper' way to improve graphics. The scary part is actually the lack of optimization in newer games: most people are on 6GB or 8GB GPUs, and upgrading to anything with 12GB is too expensive for them with the 3060 pretty much gone. The 4070 Ti would normally be the 4070, priced around $550-600, but Nvidia went crazy with pricing this gen and AMD followed suit, so PC gaming is going to be very expensive unless things change next gen. CPUs are very cheap nowadays, but GPUs are ridiculously expensive relative to what's needed to run some of the newest titles.
 
Most of the time it's about turning down textures, since they take up most of the memory, and tuning some memory-hungry effects. A shame, since textures are also often the 'cheaper' way to improve graphics. The scary part is actually the lack of optimization in newer games: most people are on 6GB or 8GB GPUs, and upgrading to anything with 12GB is too expensive for them with the 3060 pretty much gone. The 4070 Ti would normally be the 4070, priced around $550-600, but Nvidia went crazy with pricing this gen and AMD followed suit, so PC gaming is going to be very expensive unless things change next gen. CPUs are very cheap nowadays, but GPUs are ridiculously expensive relative to what's needed to run some of the newest titles.
You're not wrong. It's stupid how expensive they've become. My wallet won't be able to keep up.
 
According to the system requirements, Alan Wake 2 uses 12GB at 1080p "High" (1080p internal res, upscaled to 4k with DLSS), no ray tracing. With the full set of RT effects, it becomes 16GB.

12GB at 720p "medium", with medium RT.
 
You're not wrong. It's stupid how expensive they've become. My wallet won't be able to keep up.
I agree as well. I can afford to buy the higher end hardware (and have), but not anymore. I'm happy with my current PC and it should last me until maybe prices come back to where they should be if that ever happens.
 
I agree as well. I can afford to buy the higher end hardware (and have), but not anymore. I'm happy with my current PC and it should last me until maybe prices come back to where they should be if that ever happens.
Hopefully... Fingers crossed, but my hopes of this are not high, sadly. As I've said before, Jensen is committed to that fourth-yacht money after the crypto boom showed him what his GPUs will actually go for...

MSRP has taken a sharp incline since then. To bastardize Meijer's slogan: Why pay more? Because we fuckin' said so!
 
I have dual 4k60hz monitors and 1x 1440p165hz monitor.
I usually game on the 1440p monitor these days and use the 4k monitor for videos and such. Still, in certain games 4k60hz makes more sense to me.
I only have the one 4K (TV) display and it's 60 Hz, so that's my only choice right now. :) Therefore, I want a GPU that can at least do 4K 60 Hz. I don't need more, but it's a shame if it struggles at 4K high/ultra, so I'd prefer one that can manage that. I acknowledge that the 3090/3090 Ti might 'struggle', or have some drawbacks, with all the eye candy on. But if I had the money for a 4080 or 4090 right now, that wouldn't be a concern.
 
GPU performance logging software can only see how much GPU memory is being allocated. That doesn't mean it's being used in the current scene, or being used at all. Game engines have become pretty good at moving data around, thanks to previous console generations being so limited in video memory. Now, if your VRAM hits max at the same time you can see hitching on a graph, you have a good case to make. Just feeling bad because you see all 12GB allocated is probably a waste of emotion.

My wife games on a 3050 with only 3GB of VRAM. She plays Remnant 2 with me without issue. The laptop only has DDR4 for system memory, but the system memory fills right up, presumably with graphics textures ready to be swapped into video memory.
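If anyone wants to test the allocation-vs-usage point on their own card, a low-effort way is to poll nvidia-smi while playing and line the numbers up with any hitching. A minimal sketch: the `--query-gpu` flags are standard nvidia-smi options, the polling loop is left as a comment since it needs NVIDIA hardware, and the little parsing helper runs anywhere:

```python
def parse_mem_csv(line):
    """Parse one line of `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader` output, e.g. '11432 MiB, 12288 MiB'.
    Returns (used_mib, total_mib, fraction_used)."""
    used, total = (int(field.strip().split()[0]) for field in line.split(","))
    return used, total, used / total

# To actually poll the card once per second (needs the NVIDIA driver
# installed and nvidia-smi on the PATH):
# import subprocess, time
# while True:
#     out = subprocess.check_output(
#         ["nvidia-smi", "--query-gpu=memory.used,memory.total",
#          "--format=csv,noheader"], text=True)
#     print(parse_mem_csv(out.splitlines()[0]))
#     time.sleep(1.0)

print(parse_mem_csv("11432 MiB, 12288 MiB"))
```

If the fraction pins at ~1.0 exactly when a frame-time graph spikes, that's the "good case" described above; a high number on its own proves nothing.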
 
GPU performance logging software can only see how much GPU memory is being allocated. That doesn't mean it's being used in the current scene, or being used at all. Game engines have become pretty good at moving data around, thanks to previous console generations being so limited in video memory. Now, if your VRAM hits max at the same time you can see hitching on a graph, you have a good case to make. Just feeling bad because you see all 12GB allocated is probably a waste of emotion.
When I hit 12GB in certain games I get massive hitching (Diablo IV being a huge one; I had to lower settings just to get the game to run), and when I hit 12GB in my AI image generation software the program crashes with error messages about how much VRAM it wasn't able to allocate. My current settings let me use about 11GB of VRAM, which leaves room for my background desktop VRAM usage, but if something else is using extra VRAM (Steam, for example, sometimes allocates a lot, and general Explorer VRAM bloat is a thing I've noticed) I get crashes.
 
When I hit 12GB in certain games I get massive hitching (Diablo IV being a huge one; I had to lower settings just to get the game to run), and when I hit 12GB in my AI image generation software the program crashes with error messages about how much VRAM it wasn't able to allocate. My current settings let me use about 11GB of VRAM, which leaves room for my background desktop VRAM usage, but if something else is using extra VRAM (Steam, for example, sometimes allocates a lot, and general Explorer VRAM bloat is a thing I've noticed) I get crashes.

Assuming you're using Stable Diffusion and Auto1111, just to make sure, you are using this line in your webui-user.bat, right?
Code:
set COMMANDLINE_ARGS=--xformers --reinstall-xformers
xformers greatly reduces the amount of VRAM you're using. What are you generating at (and upscaling to) anyway? Also make sure to just close the GUI if you're doing a large batch job, otherwise it'll slow things down a bit. Although SDXL, if you're using an XL model, might be a bit more hungry regardless.

On my 4090 I can play Starfield while generating at 512x512 (then upscaling to 912x912) while having every program under the sun open and I'm not hitting any issues, but I'm still on 1.41 due to my issues with SDXL builds. I haven't revisited that in quite a while. It's using about 11GB VRAM while I'm doing this.
 
Assuming you're using Stable Diffusion and Auto1111, just to make sure, you are using this line in your webui-user.bat, right?
Code:
set COMMANDLINE_ARGS=--xformers --reinstall-xformers
xformers greatly reduces the amount of VRAM you're using. What are you generating at (and upscaling to) anyway? Also make sure to just close the GUI if you're doing a large batch job, otherwise it'll slow things down a bit. Although SDXL, if you're using an XL model, might be a bit more hungry regardless.

On my 4090 I can play Starfield while generating at 512x512 (then upscaling to 912x912) while having every program under the sun open and I'm not hitting any issues, but I'm still on 1.41 due to my issues with SDXL builds. I haven't revisited that in quite a while. It's using about 11GB VRAM while I'm doing this.
I generate at 1280x768 and upscale to roughly 4k and then add more detail via grid rendering.
An example of a recent workflow. I have it set up to use Controlnet if I want, but I had it disabled for this particular picture.
(attached: ComfyUI workflow screenshot)
 
I generate at 1280x768 and upscale to roughly 4k and then add more detail via grid rendering.
An example of a recent workflow. I have it set up to use Controlnet if I want, but I had it disabled for this particular picture.
(attached: ComfyUI workflow screenshot)
I don't have much experience with ComfyUI (which is what I believe that is). I was just wondering if you made sure you were using xformers, but I suppose you probably have to be, if you're upscaling to 4k. Granted ComfyUI (I believe) does use a bit less VRAM than Auto1111, and it looks like you have the upscaler tweaked quite a bit (unless the number of steps and whatnot were simply the default settings).
 
Yeah, I have a LOT of the settings tweaked quite a bit. I've been slowly learning and adding new nodes as I go. I have it set up so that it uses roughly 11GB of my 12,276MB of total VRAM.
 
The 4080 is just so expensive here. :-/ But is 16GB enough for all that? I was wondering about the 4080 20GB card that's supposedly going to be released into the wild, but its price isn't going to help me at all.
I believe you're talking about the 4080 Super. Of course it will be more expensive, because Nvidia is greedy AF.

The 4070 Ti has a decent performance-for-the-money ratio. However, the 12GB of VRAM isn't really ideal for 4K. If you're playing at, say, ultra 1440p, it should do just fine.
 
I believe you're talking about the 4080 Super. Of course it will be more expensive, because Nvidia is greedy AF.

The 4070 Ti has a decent performance-for-the-money ratio. However, the 12GB of VRAM isn't really ideal for 4K. If you're playing at, say, ultra 1440p, it should do just fine.
I don't think the 4070 Ti is decent for the money at all. It's 10-25 frames better than a 4070, for $200 more. Performance is good. Price is not.

The best Nvidia card for the money is one of the lowest-cost 4070s. Around $530, plus Alan Wake II. That's a decent deal for right now.
 
The 4070 Ti seems more my style, plus I game at 1080p at 144 Hz. I also like the fact that I can use my 750 W PSU with the card with no problems, so I won't have to upgrade to something like a 1000 W PSU like the other 4000-series cards basically require. I have two backup 750 W PSUs and would hate to think they're obsolete because of the lower wattage. I ran a 450 W for like a decade; the increased power consumption is just silly. It's supposed to be going down, not up.
I bought an ASUS 4070 OC for $599, moving up from a 1070 OC. Switched monitors too from 1920x1200 to 2160x1440 (2K). Initially DP wouldn't boot...
 
I don't think the 4070 Ti is decent for the money at all. It's 10-25 frames better than a 4070, for $200 more. Performance is good. Price is not.

The best Nvidia card for the money is one of the lowest-cost 4070s. Around $530, plus Alan Wake II. That's a decent deal for right now.
Everything is relative in a market where GPU prices have gone crazy. Compared to the current crop of GPUs, the 4070 Ti is middle of the pack to decent in terms of performance per dollar, cost per frame, etc. Unsurprisingly, the 3080, 4080, 3090, 3090 Ti, and 4090 are the worst cards by that metric. Check out one of the recent reviews.

(attached: TechPowerUp performance-per-dollar chart)


MSI 4070Ti review: https://www.techpowerup.com/review/msi-geforce-rtx-4070-ti-gaming-x/33.html

PS: Yes, I notice that the AMD RX 6900 XT, 6800 XT, and 6800 all beat the 4070 Ti on price/performance, but that's like saying the sky is blue. Everyone knows this, but certain people will still pay the premium to get an Nvidia card instead, and out of the Nvidia lineup the 4070 Ti is decent compared to the other SKUs.
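Cost-per-frame is easy to compute yourself rather than arguing over it. A tiny sketch; the prices and average-FPS figures below are made-up placeholders, so plug in MSRPs and numbers from a review like the one linked above:

```python
# Illustrative cost-per-frame comparison. All figures are placeholders,
# not benchmark data -- substitute real prices and review-averaged FPS.
cards = {
    "RTX 4070":    {"price": 549, "avg_fps": 100},
    "RTX 4070 Ti": {"price": 799, "avg_fps": 120},
}

for name, c in cards.items():
    # lower dollars-per-frame = better value
    print(f"{name}: ${c['price'] / c['avg_fps']:.2f} per frame")
```

With these placeholder numbers the plain 4070 wins on dollars per frame even though the Ti has the higher raw FPS, which is exactly the argument being made in this subthread.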
 
Outside of the 4090, the 7900 XTX and 7900 XT are the best overall GPUs... problem is they still suck at ray tracing, and if you care about that feature then you need to stick with Nvidia. Nvidia doesn't currently have a good mid-range card in their 4000 lineup.

Maybe the 4070 Ti Super will fix this.
 
Everything is relative in a market where GPU prices have gone crazy. Compared to the current crop of GPUs, the 4070 Ti is middle of the pack to decent in terms of performance per dollar, cost per frame, etc. Unsurprisingly, the 3080, 4080, 3090, 3090 Ti, and 4090 are the worst cards by that metric. Check out one of the recent reviews.

(attached: TechPowerUp performance-per-dollar chart)

MSI 4070Ti review: https://www.techpowerup.com/review/msi-geforce-rtx-4070-ti-gaming-x/33.html

PS: Yes, I notice that the AMD RX 6900 XT, 6800 XT, and 6800 all beat the 4070 Ti on price/performance, but that's like saying the sky is blue. Everyone knows this, but certain people will still pay the premium to get an Nvidia card instead, and out of the Nvidia lineup the 4070 Ti is decent compared to the other SKUs.
What if you want a GPU for 4K? Also, I consider the 30 series for used cards; a 3090 is $800 here (minimum, if you're lucky) used. The 4070 Ti is current gen; I've seen a few sellers selling second-hand cards for about $900-$950 at the lowest end.
So how would you evaluate the deals? One is newer but only 12GB, and the other is previous gen with much higher power, but you get 24GB of VRAM.

Outside of the 4090, the 7900 XTX and 7900 XT are the best overall GPUs... problem is they still suck at ray tracing, and if you care about that feature then you need to stick with Nvidia. Nvidia doesn't currently have a good mid-range card in their 4000 lineup.

Maybe the 4070 Ti Super will fix this.
I don't care about ray tracing too much; everything I read says the performance hit is too considerable. I do, however, care about productivity software, and that makes me hesitant to go with an AMD RDNA 3 card. I only care about the 7900 series; the 7800 series and below have subpar performance in productivity tasks. Also, I think they're 2K cards? The 6000 series' productivity performance is meh, too.

I have a strange collection that I compare: 3090, 4070 Ti, and 7900 XT (the XTX is probably $100 more, and the most expensive). The 4070 Ti Super might be 16GB, which could be interesting, but if it's more expensive than the current 4070 Ti, that would price it as high as the 7900 XT (new) here.
 
I was debating between the RTX 4070, RTX 4070 Ti, and Radeon 7800 XT. I game at 1440p with a Ryzen 3700X. I just picked up an RTX 4070 because the lower power consumption meant I didn't have to upgrade my current power supply. This card will be fine for 1440p gaming for a while yet.

I tried RT in F1 23 and went from 170 to 110 FPS on max settings. No thanks, considering I don't notice much difference while racing.
 
Outside of the 4090, the 7900 XTX and 7900 XT are the best overall GPUs... problem is they still suck at ray tracing, and if you care about that feature then you need to stick with Nvidia. Nvidia doesn't currently have a good mid-range card in their 4000 lineup.

Maybe the 4070 Ti Super will fix this.

Poly is the only person on Earth who cares about Ray Tracing. Everyone else knows it's nothing to write Hard Forum about
 