Should I just get a 4070Ti?

Nebell

Edit:
Ended up buying 7900 XTX.

I'm a casual gamer, but I game at 4K/120 Hz. I found out my PS5 gets more gaming use than my PC, but there are games that I prefer to play on PC, like the upcoming Diablo 4 and many MMORPGs. And it just happens that those games are usually not very demanding.

I could probably try to find a 3090 for about the same price, but I want a white card, and that automatically makes it a lot rarer second hand (I'm from the EU).
I'm not anti-DLSS (3), and I think ray tracing performance is important, which is why I'm considering a 4070 Ti instead of a 3090/Ti or AMD. I think AMD will just fall behind in performance as more and more games start implementing RT. AMD really messed up this gen.
 
Last edited:
The 4070 Ti is a fine card. The only potential downside is the relatively low amount of VRAM. 4K combined with things like AA settings cranked way up and view distance set to max in an MMORPG will really raise VRAM requirements. If you run out of VRAM, performance will drop like a rock. Cards like the 3090 and 3090 Ti have twice the VRAM of the 4070 Ti.
 
The 4070 Ti suffers from a castrated memory bus.
I would try to find a 3090 Ti on the cheap if memory bandwidth and capacity are important. Otherwise get a 4080 for 4K/120. The 3090 Ti isn't a 4K/120 card. The only real 4K/120 card is the 4090.
 
Considering how much electricity can cost in the EU, a 4070 Ti over a 3090/3090 Ti/AMD could be interesting (especially if you use multiple monitors), provided the not-too-demanding games don't run into issues at 4K with the 4070 Ti's memory bus and memory quantity (which I imagine is very possible).

But I am not sure how common "ray traced at 4K native and not very demanding" will really be.
 
The 4070 Ti suffers from a castrated memory bus.
I would try to find a 3090 Ti on the cheap if memory bandwidth and capacity are important. Otherwise get a 4080 for 4K/120. The 3090 Ti isn't a 4K/120 card. The only real 4K/120 card is the 4090.
I would take the small hit to performance at 4K over the insane power consumption difference between the 4070 Ti and 3090 Ti.

power-gaming.png
 
I was considering the 4080, but in Sweden it's some €500 more expensive than the 4070 Ti. That's just not the amount of money I'm willing to spend. If it were about €1300 then sure, easy buy.
It's a damn shame, as the 4080 would be a fantastic card for the next 3-4 years.
The 4070 Ti is also expensive, but it's not so bad compared to how much the 3090 Ti cost a year ago.
Not much of an AAA gamer here. I won't be playing Cyberpunk at 4K with RT enabled.
 
Considering how much electricity can cost in the EU, a 4070 Ti over a 3090/3090 Ti/AMD could be interesting (especially if you use multiple monitors), provided the not-too-demanding games don't run into issues at 4K with the 4070 Ti's memory bus and memory quantity (which I imagine is very possible).

But I am not sure how common "ray traced at 4K native and not very demanding" will really be.

Unless you're gaming like it's a full-time job, the power difference is going to be negligible over the course of a year.
 
Which ones are still worth playing? Any recommendations?

Honestly, none of them are worth playing. They are a massive time sink better spent elsewhere. And you can't really get a great experience outside the well-known MMORPGs like WoW, FFXIV, Black Desert, New World, etc. I found those small indie MMORPGs to be trash.
 
Honestly, none of them are worth playing. They are a massive time sink better spent elsewhere. And you can't really get a great experience outside the well-known MMORPGs like WoW, FFXIV, Black Desert, New World, etc. I found those small indie MMORPGs to be trash.
Josh Strife Hayes on YouTube plays a lot of shitty MMOs so you don't have to.
 
The 4070 Ti is a fine card. It caught negative press because of Nvidia's BS naming attempt and other generally crappy business practices, but there is nothing wrong with the hardware. Roughly the same horsepower as the 3090 Ti with half the power consumption, in a smaller form factor that supports modern feature sets. Also fairly easy to find at MSRP.
 
The 4070 Ti is a fine card. It caught negative press because of Nvidia's BS naming attempt and other generally crappy business practices, but there is nothing wrong with the hardware. Roughly the same horsepower as the 3090 Ti with half the power consumption, in a smaller form factor that supports modern feature sets. Also fairly easy to find at MSRP.
Yeah, just trying to find a white one has been a real pain in the ass. Also, the AMD 7900 XT/XTX don't come in white.
 
Unless you're gaming like it's a full-time job, the power difference is going to be negligible over the course of a year.
In Germany/England, if you work from home:

power-multi-monitor.png



Over 3 years it can easily cost you around $140 as a start between an XTX and a 4070 Ti; gaming 10 hours a week, the difference between a 3090 Ti and a 4070 Ti can be around $145.

It is not that big obviously (and it depends on the heating/AC situation), but it's more than negligible, in the sense that it could be enough that, if you add it to the price tag of the 3090 Ti or 6950 XT, the 4070 Ti could look more interesting to some.
 
With the cost of electricity here and the amount that I game, it makes no meaningful cost difference. The issue is the extra 200 W heating up the freaking room, especially in the summertime.
 
the amount that I game
I am a bit curious which games we are talking about here where ray tracing matters but which are still relatively easy to run at 4K native?

Are those not two different subsets of games you have in mind, where upscaling would be relevant?
 
Just to add: using ray tracing will use more video memory. And since you are running at 4K like me, you will be seriously gimping yourself.

I would try to snag a 3090 or 3090 Ti. Otherwise your 4070 Ti will age very poorly at 4K.
 
Yup. No way is 10-12 GB going to last for 4K.
Yes, we're going to have more and more games running into VRAM issues this year with only 10 to 12 gigs. And something a lot of people haven't noticed is that monitoring VRAM usage is not the same as it was before. Previously, pretty much every game would allocate more VRAM than it needed, and people learned that if a game was using 9 or 10 gigs, it was probably fine with 8. What we have now are games that may only be using 9 gigs of VRAM on a 10 gig card but are already using system RAM. In other words, some games limit the amount of VRAM they use well before they actually hit the limit and go ahead and spill over into system RAM. Bottom line: you can't look at a game and say "it's not using all of my VRAM, so it's not a problem," because it clearly can be a problem. Make sure to look at your system RAM usage and see if it's going up when you are getting hitching, stuttering, or FPS drops.
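If you want to sanity-check this yourself, here's a rough sketch of the kind of thing I mean (Python; assumes an NVIDIA card with nvidia-smi on the PATH and the psutil package installed; not something from this thread, just an illustration):

```python
# Quick-and-dirty monitor: log VRAM and system RAM together while a game runs,
# so spill-over into system RAM shows up even when VRAM looks "not full".
import subprocess
import time

import psutil


def vram_used_mib(gpu_index: int = 0) -> int:
    """Return current VRAM usage of the given GPU in MiB, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip())


if __name__ == "__main__":
    while True:
        ram_gib = psutil.virtual_memory().used / 1024**3
        print(f"VRAM: {vram_used_mib()} MiB | system RAM: {ram_gib:.1f} GiB")
        time.sleep(5)  # watch system RAM creep up when hitching/stuttering starts
```

If system RAM climbs while VRAM sits just under its ceiling during the stutters, that's the spill-over pattern described above.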
 
No way would I take a 3090 over a 4070 Ti. Take the 4070 Ti for sure. If heat matters at all, the 3090 Ti is miserable; I would get rid of a 3090 Ti, let alone acquire one.
 
No way would I take a 3090 over a 4070 Ti. Take the 4070 Ti for sure. If heat matters at all, the 3090 Ti is miserable; I would get rid of a 3090 Ti, let alone acquire one.
For 4K you are incorrect. 12 GB of memory is going to gimp 4K gaming with a 4070 Ti. Get a 4080 then.
 
Not sure if it is the bandwidth or the quantity, but yes:
relative-performance_2560-1440.png
relative-performance_3840-2160.png


The 3090 Ti goes from 101% to 110% going from 1440p to 2160p.

The 16 GB 6900 XT and 10 GB 3080 results, and maybe even more so the 8 GB 3070 Ti, seem to point to memory bandwidth probably being the issue here, not quantity:

6900 XT / 4070 Ti: ~500 GB/s
3070 Ti: ~600 GB/s
3080: ~760 GB/s

Having ~500 GB/s of memory bandwidth, even with a good cache, seems to cause issues at those resolutions.

Chances are raw power/bandwidth will be an issue for keeping 4K native close to that 120 fps before the 12 GB is.
 
I would also like to add what TPU found when testing VRAM with the new Hogwarts game (which is now the benchmark for a lot of future games):

1677107108442.png


This is directly taken from the review:

The ray tracing performance hit can be reduced by lowering the ray tracing quality setting from the "Ultra" default, which helps a lot with performance, especially on AMD. RX 7900 XTX can now reach 60 FPS with ray tracing "low," at 1080p. Not impressive, but still a huge improvement over 28 FPS. In terms of VRAM usage, Hogwarts Legacy set a new record, we measured 15 GB VRAM allocated at 4K with RT, 12 GB at 4K with RT disabled. While these numbers seem shockingly high, you have to put them in perspective. Due to the high performance requirements you'll definitely not be gaming with a sub-16 GB card at 4K. 8 GB+ at 1080p is certainly a lot, but here, too, owners of weaker GPUs will have to dial down their settings anyway. What's always an option is to use the various upscaling methods like DLSS, FSR and XeSS, which lower VRAM usage, too, by running at a lower internal rendering resolution.
 
Unless you're gaming like it's a full-time job, the power difference is going to be negligible over the course of a year.
It really comes down to your electricity rate and your average usage. Here's a quick chart of how much the difference would cost you in a year. Multiply it by 2 or 3, assuming you keep the card for a few years before upgrading.

This also isn't factoring in the extra heat dumped into the room (during extended gaming sessions). Every extra watt saved means that much less your A/C has to work in the summer. It's surprising how much of a difference a few hundred watts can make in a small room/office.

Note: $.06 and $.30 are on the extremes but do exist in some areas. The current overall US average is ~$.17/kWh.

power.png
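If you'd rather plug in your own numbers than read a chart, here's a quick back-of-the-envelope sketch (Python; the ~165 W gaming-draw gap between a 3090 Ti and a 4070 Ti is my own rough assumption, not a figure from this thread):

```python
# Back-of-the-envelope yearly cost of the extra power draw of a bigger card.
def yearly_cost(extra_watts: float, hours_per_week: float, rate_per_kwh: float) -> float:
    """Extra electricity cost per year for a given wattage difference."""
    kwh_per_year = extra_watts / 1000 * hours_per_week * 52
    return kwh_per_year * rate_per_kwh


if __name__ == "__main__":
    for rate in (0.06, 0.17, 0.30):  # $/kWh: cheap, ~US average, expensive
        print(f"${rate:.2f}/kWh, 10 h/week: ${yearly_cost(165, 10, rate):.2f}/yr")
```

At 10 hours a week that works out to roughly $5-$26 per year depending on the rate, and noticeably more at German/UK prices, which is roughly how you get to the ~$140-over-three-years figure quoted earlier in the thread.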
 
12 GB of VRAM at that performance level sounds like a bad time. The 192-bit bus sux, but I'm less concerned about that: bottlenecking due to lack of GDDR bandwidth is definitely an L, but it's a consistent FPS decrease that will reduce performance without necessarily making the experience unbearable (see Navi21 vs GA102). Bottlenecking due to VRAM capacity, though, can be a disaster, as it can cause not only a huge performance hit but also inconsistent frametimes and pacing.

FWIW, a 3090 Ti undervolted to 925 mV runs at like 300-330 W in-game and can still get close to 2 GHz with good cooling. IMHO the 3090/Ti are still the cards to get for people who care more about resolution, RT, and textures than framerate and whose budget is under $1K USD. For example, Cyberpunk at 3K with Psycho RT and many, many texture mods and tweaks goes over 20 GB of usage with my current config. Other modded games can have similar behaviour.

However, if you care more about framerate and will play at 1440p with lowered or no RT and no mods, a 4070 Ti would make sense, I think. The narrower, high-clocked core seems suited to high-FPS gaming at more modest resolutions and effects settings.
 
The 4070 Ti in my opinion is a stupid card for the price, limited in the short term. The 7900 XT, which is kind of stupid too, is still a better overall card in my opinion for virtually the same price, especially with the AIB models with better cooling than the reference going for around $850 in the States; I don't know about elsewhere. RT is still in a place where one has to choose whether the performance hit is worth the IQ advantage, if any, of using it. It still comes down to subjective choices in the end. I don't think Nvidia automatically sweeps performance with RT: it also depends on a system that can support Nvidia's more CPU-heavy requirements, and even a 6950 XT can beat a 4090 in some RT games at 1080p (if DLSS is upscaling from 1080p, it will have the same issues) on more constrained systems. Now, if you have a top-notch system, meaning Intel 12th/13th gen, AMD Zen 4, fast DDR5 memory, Nvidia's RT performance starts to really excel, and I expect even more so with the Zen 4 X3D processors coming out (have to wait and see).
 
Are there any benchmarks that show the 3090 Ti versus the 4070 Ti at 4K?
1677114611858.png

The 3090 Ti is 10% faster and has double the memory and a wider memory bus.

It's pretty much a no-brainer to go with a 3090 Ti at 4K. The reason they went up in price is that Nvidia knew they were releasing the 4070 Ti at a terrible price and it would not be able to beat a 3090 Ti.

The 4070 Ti is not a bad card (just a bad price). It just isn't a good 4K card, which is what the TS said he plays at.
 
Last edited:
I'd pick up a 3090 because of the VRAM. Who cares if it's white? When you're playing a video game, are you staring into your case, or are you staring at the glorious game with the eye candy turned up to Ultra?
 
I just bought a 3090 Zotac Trinity video card off of Facebook Marketplace for $525 with overnight shipping included. You can find amazing deals online if you look.
 
I'm a casual gamer, but I game at 4K/120 Hz. I found out my PS5 gets more gaming use than my PC, but there are games that I prefer to play on PC, like the upcoming Diablo 4 and many MMORPGs. And it just happens that those games are usually not very demanding.

I could probably try to find a 3090 for about the same price, but I want a white card, and that automatically makes it a lot rarer second hand (I'm from the EU).
I'm not anti-DLSS (3), and I think ray tracing performance is important, which is why I'm considering a 4070 Ti instead of a 3090/Ti or AMD. I think AMD will just fall behind in performance as more and more games start implementing RT. AMD really messed up this gen.
You can't plug a keyboard and mouse into your PS5? How did AMD mess up, because they didn't push harder into the same useless features the Nvidia marketing machine pushes? Enjoy your expensive upscaling machine to play Diablo with fuzzy graphics 😄
 
You can't plug a keyboard and mouse into your PS5? How did AMD mess up, because they didn't push harder into the same useless features the Nvidia marketing machine pushes? Enjoy your expensive upscaling machine to play Diablo with fuzzy graphics 😄
AMD = the off-brand for a reason... Nice fallacies and flamebait, though.
 
View attachment 551213
The 3090 Ti is 10% faster and has double the memory and a wider memory bus.

It's pretty much a no-brainer to go with a 3090 Ti at 4K. The reason they went up in price is that Nvidia knew they were releasing the 4070 Ti at a terrible price and it would not be able to beat a 3090 Ti.

The 4070 Ti is not a bad card (just a bad price). It just isn't a good 4K card, which is what the TS said he plays at.
The 10% isn't really worth it. Consuming 40% or more power turns into heat dumped into your computer room; that's terrible. I'll take 90% of the performance with 40-60% better power efficiency for sure. If you want more power, go with the 4080 or 4090; the 3090 Ti is a turn-off now with how hot it runs, the memory on the backside of the PCB, and the heat it produces. I actually hate it in comparison to the 4000 series cards.
 
The 10% isn't really worth it. Consuming 40% or more power turns into heat dumped into your computer room; that's terrible. I'll take 90% of the performance with 40-60% better power efficiency for sure. If you want more power, go with the 4080 or 4090; the 3090 Ti is a turn-off now with how hot it runs, the memory on the backside of the PCB, and the heat it produces. I actually hate it in comparison to the 4000 series cards.
I live in MA where it is cold as hell. I couldn't care less about the heat dumped; I actually want it. I'll take the 10%.
 
I live in MA where it is cold as hell. I couldn't care less about the heat dumped; I actually want it. I'll take the 10%.
That's like saying you'd rather take a gas-guzzling 330 hp 6.3L V8 that wastes 50% more gasoline over a 300 hp 2.2L i4 that has only 10% less torque yet is twice as efficient. 10% is one thing, but that 10% being twice as inefficient while generating twice as much heat? Seems crazy to decide on a 3090 Ti purchase right now, tbh. If you want to keep warmer in the MA cold, flip your back and front fans around so they are reversed and pushing hot case air forward instead of backward. You'll be sweating within 20 minutes of some good gaming even with an efficient GPU; you can OC your CPU and produce tons of hot air. Here in California hot air is my worst enemy. Nothing more miserable than sweating while wearing headphones. I wish I lived in a colder state lol.
 
I don't think most users care about the electricity cost or about the heat dumped; it's barely noticeable. Especially when you're talking about high-end gamers, I really don't think they give three s**** about that. In my apartment it's always chilly. I literally couldn't care less about heat or electricity cost. I'm not scraping by or just surviving. Then again, who is when they're buying a $1,000 video card? It doesn't make any sense, in my opinion, but to each his own. I've been custom building PCs since 1990, when I was 15 years old, and not once, literally never once, did electricity cost or heat produced ever influence my decision to build a high-end rig. It has literally never even crossed my mind, not even once.
 
Last edited:
I don't think most users care about the electricity cost or about the heat dumped; it's barely noticeable. Especially when you're talking about high-end gamers, I really don't think they give three s**** about that. In my apartment it's always chilly. I literally couldn't care less about heat or electricity cost. I'm not scraping by or just surviving. Then again, who is when they're buying a $1,000 video card? It doesn't make any sense, in my opinion, but to each his own. I've been custom building PCs since 1990, when I was 15 years old, and not once, literally never once, did electricity cost or heat produced ever influence my decision to build a high-end rig. It has literally never even crossed my mind, not even once.
It only bothers fanboys. When one brand is more efficient than the other, it becomes VERY important. With CPUs/GPUs it's always the same.
 