Some 1660 TI Models Could Have 3GB of VRAM

AlphaAtlas

[H]ard|Gawd
Staff member
Joined
Mar 3, 2018
Messages
1,713
As the 1660 Ti launch nears, more and more designs from various manufacturers are starting to pop up around the web. Videocardz posted designs from MSI, Asus, Galax, Palit, and EVGA, and they even managed to get a die shot of what they claim to be the TU116 GPU, as well as a shot of the MSI GeForce 1660 Ti Ventus XT's PCB. On top of the pictures, Videocardz spotted yet another potential leak, this time on the Eurasian Economic Commission's website. This one in particular suggests that Asus could launch several 1660 Ti SKUs with 3GB of memory instead of 6GB.

The 6GB variant may be exclusive to the ROG STRIX series, though, as there is no such product with 3GB of memory listed. ASUS will launch DUAL, Expedition, Phoenix, TUF (the new series), Turbo, and ROG STRIX series of GeForce GTX 1660 Ti graphics cards. Models are, as always, divided into Advanced, OC, and non-OC variants (which basically feature different clock speeds). The GTX 1660 Ti graphics cards will be announced on February 22nd. We expect a large list of custom models to be available at launch, as the GTX 1660 Ti basically replaces the GTX 1060, the most popular SKU in NVIDIA's lineup.
 
Great, more muddying of the waters to confuse the hell out of consumers... Is 3GB even viable anymore? Low-res gaming and probably 1080p... I knew I was on the edge of usability with the 2GB 1050 card in my laptop, and I would see a performance hit for it.
 
Great, more muddying of the waters to confuse the hell out of consumers... Is 3GB even viable anymore? Low-res gaming and probably 1080p... I knew I was on the edge of usability with the 2GB 1050 card in my laptop, and I would see a performance hit for it.

The 3.5/4 in my 970 still runs strong. It's only been in the last 6 months or so that I've really had to turn down anything more than shadows in a game to get a solid 60.

That said, it is getting pretty long in the tooth, and 3GB is worse than my 3.5/4. The sheer amount of RAM isn't the only factor, though. Depending on what type of RAM it uses, it could still be a decent card. I'd still prefer they just put 6GB in all of the 1660 Tis to eliminate confusion.
 
Adding a chart which explains the above a bit more clearly: there definitely seems to be no "could" about it, and others are also preparing 3GB models.


 
It could also turn into the same deal they pulled to screw people on the 1060 3GB variants, reducing the CUDA core count as well without telling anyone until they got caught.

It would be nice to go back to having only 3-5 cards to choose from, to eliminate general consumer confusion.
 
It could also turn into the same deal they pulled to screw people on the 1060 3GB variants, reducing the CUDA core count as well without telling anyone until they got caught.

It would be nice to go back to having only 3-5 cards to choose from, to eliminate general consumer confusion.

It apparently works in the manufacturers' favour. I recall watching an interview years ago about this, where a card maker rep said that if they have, for example, a shelf full of cards at various prices, people will tend to go for the flashier, more expensive one, believing the extra cost will result in relative performance gains, even if the actual gain in reality is very, very minor.
 
It apparently works in the manufacturers' favour. I recall watching an interview years ago about this, where a card maker rep said that if they have, for example, a shelf full of cards at various prices, people will tend to go for the flashier, more expensive one, believing the extra cost will result in relative performance gains, even if the actual gain in reality is very, very minor.

Yeah, I was going to include something along the lines of consumer extortion based on exactly that, but I think extortion might be too strong a word for it; they definitely do it to take advantage of the generic and uninformed consumer, and it seems to just keep getting worse.
 
As the 1660 Ti launch nears, more and more designs from various manufacturers are starting to pop up around the web. Videocardz posted designs from MSI, Asus, Galax, Palit, and EVGA, and they even managed to get a die shot of what they claim to be the TU116 GPU, as well as a shot of the MSI GeForce 1660 Ti Ventus XT's PCB. On top of the pictures, Videocardz spotted yet another potential leak, this time on the Eurasian Economic Commission's website. This one in particular suggests that Asus could launch several 1660 Ti SKUs with 3GB of memory instead of 6GB.

The 6GB variant may be exclusive to the ROG STRIX series, though, as there is no such product with 3GB of memory listed. ASUS will launch DUAL, Expedition, Phoenix, TUF (the new series), Turbo, and ROG STRIX series of GeForce GTX 1660 Ti graphics cards. Models are, as always, divided into Advanced, OC, and non-OC variants (which basically feature different clock speeds). The GTX 1660 Ti graphics cards will be announced on February 22nd. We expect a large list of custom models to be available at launch, as the GTX 1660 Ti basically replaces the GTX 1060, the most popular SKU in NVIDIA's lineup.

If this is really true and this thing starts at $250, then it's going to be DOA. NV sure has the balls to even come up with a 3GB card in the mid-range segment in 2019. Talk about insulting to gamers and their customers!
 
The 6GB in the RTX 2060 is on the low side, but would only be an issue in some games using ultra settings.

Putting 3GB on a card that should be on par with a 980 Ti/Fury X is an absolute joke.

If you plan on playing at medium settings, such as 4K TV users, it might be OK. For actual PC gamers, it will be trash.
 
Navi needs to get here ASAP... I need a video card for my son. I could really go for GTX 1080 performance for RX 590 money...
 
Really a shame that they gimp VRAM this hard.

A 1660 Ti with 8GB of GDDR6 for under $300 would have been cool.
 
I can see a 3GB variant being viable. I imagine it will be priced/positioned for esports-type games; LoL, Fortnite, Overwatch, CS:GO, etc. at 1080p will run quite happily with these specs and only 3GB of VRAM.
 
I can see a 3GB variant being viable. I imagine it will be priced/positioned for esports-type games; LoL, Fortnite, Overwatch, CS:GO, etc. at 1080p will run quite happily with these specs and only 3GB of VRAM.

Even then, we are seeing signs that even 16GB of system RAM could soon hold back cards with only 3GB of VRAM, as we can already see happening with 8GB of system RAM:

https://www.google.com/amp/s/www.techspot.com/amp/article/1770-how-much-ram-pc-gaming/

Having to install 32GB of memory really blows the value aspect of a system using this card.

For absolute penny pinchers, you would most likely get a smoother gameplay experience from an 8GB RX 580 with 8GB of system RAM than from a 3GB GTX 1660 Ti with the same amount of system RAM.
 
Even then, we are seeing signs that even 16GB of system RAM could soon hold back cards with only 3GB of VRAM, as we can already see happening with 8GB of system RAM:

https://www.google.com/amp/s/www.techspot.com/amp/article/1770-how-much-ram-pc-gaming/

Having to install 32GB of memory really blows the value aspect of a system using this card.

Sure, I agree it would be problematic for anything more taxing than CS:GO, Fortnite, Overwatch, or Rocket League. The article you linked doesn't cover any of the titles I mentioned, but its conclusion was that 8-16GB of DDR would be enough for those older and lightweight games; I presume that is this card's niche, and it will probably be priced/positioned accordingly.
 
I can see a 3GB variant being viable. I imagine it will be priced/positioned for esports-type games; LoL, Fortnite, Overwatch, CS:GO, etc. at 1080p will run quite happily with these specs and only 3GB of VRAM.

I see this card being used at internet cafés in the Asian markets.
 
Great, more muddying of the waters to confuse the hell out of consumers... Is 3GB even viable anymore? Low-res gaming and probably 1080p... I knew I was on the edge of usability with the 2GB 1050 card in my laptop, and I would see a performance hit for it.


3GB is probably fine for 1080p, and a lot of people still game at 1080p.

It depends though.

If a game is poorly designed and uses the same resolution textures for all render resolutions (rather than reserving higher-resolution textures for higher screen resolutions), you could wind up with a lot of VRAM wasted on larger textures than necessary...
 
3GB is probably fine for 1080p, and a lot of people still game at 1080p.

It depends though.

If a game is poorly designed and uses the same resolution textures for all render resolutions (rather than reserving higher-resolution textures for higher screen resolutions), you could wind up with a lot of VRAM wasted on larger textures than necessary...

Still very viable for 1080p gaming. Using a 1060 3GB laptop card, the first game where I have had to drop from "ultra" textures to high was BFV. Doing this seemed to give a bump in fps; generally, though, max settings at 1080p work fine with the card.
 
Still very viable for 1080p gaming. Using a 1060 3GB laptop card, the first game where I have had to drop from "ultra" textures to high was BFV. Doing this seemed to give a bump in fps; generally, though, max settings at 1080p work fine with the card.


That's good.

Yeah, I generally feel people exaggerate the need for more and more VRAM.

I have a 12GB Pascal Titan X, and I tend to play at 4K ultra settings, and I don't think I've ever seen VRAM use go above 6GB. Heck, it usually stays just above 4GB in most reasonably new titles.
 
3GB on a mid-range card in 2019?

Nvidia just can't stop punching itself in the face. ¯\_(ツ)_/¯
 
I still don't understand why this card exists. Nvidia must have made a warehouse full of these chips and "forgotten" about them. They're trying as hard as they can to kill off the 10xx series in favor of the 20xx series, and all of a sudden this thing pops up.
 
I still don't understand why this card exists. Nvidia must have made a warehouse full of these chips and "forgotten" about them. They're trying as hard as they can to kill off the 10xx series in favor of the 20xx series, and all of a sudden this thing pops up.
Lots of folks are still walking down the aisles of Best Buy, Frys, Walmart, and Microcenter. These cards sell hugely in retail.
 
They should have called it the 2050. I understand why they didn't, but come on, what's next, the 1666?
 
We have been here before. I think these are just placeholders until more model names are solidified:
https://www.techpowerup.com/250924/...in-six-variants-based-on-memory-size-and-type

The 6GB GDDR6 turned out to be the 1660 Ti. The 6GB GDDR5 could be a 1660. The 3GB GDDR6 could be a 96-bit 1650 Ti. The 3GB GDDR5 could be a 96-bit 1630. The 4GB GDDR6 model could be a 128-bit 1650, and maybe a slower version again as a 1650 with GDDR5.

In short, I think these are just placeholders.
 


This instantly ran through my mind when I heard this. 3GB? I see the market for it, but it depends on how it does in real-world applications. There's something to be said for "just good enough", à la VHS vs. Beta.
 
I think this is a very fair point. Some people don't need 4K resolution or even 1440p. 1080p is plenty for them, and if it supplies a decent framerate, why not?


My stepson plays at 1080p.

He is still using my old 2013 6GB Kepler Titan. If I didn't have that kicking around for him, I might consider something like this, but in this performance range I think I'd rather go AMD...
 
Update 2/15/2019: HardOCP's sources have confirmed that the Nvidia GeForce GTX 1660 Ti will launch on February 22nd for $279. Additionally, the GTX 1660 is currently set to launch on March 14th for $229, but as we mentioned before, precise launch dates and prices aren't necessarily set in stone this far from release. We have also heard that the 1660 could end up being a rebrand of the Pascal-based Nvidia GeForce GTX 1060, but we can't confirm that rumor yet.
 
I don't foresee Nvidia going with a 3GB Ti version.

This card is supposed to be at the GTX 1070 performance level, but it has roughly 10% more memory bandwidth than the 1070 Ti. In other words, it's fucking overkill.

I expect the cut-down cards (1660) to drop to a 160-bit bus with 5GB, or 128-bit GDDR6 with 4GB. That would still give you enough VRAM, with just barely enough bandwidth to power a GTX 1060 level of performance.

Nvidia has the power to play these types of games with memory bus widths if they so choose. Just look at the GTX 1050 3GB and the GTX 1060 5GB.
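A quick back-of-the-envelope check of the bandwidth comparison above. This is a sketch assuming the rumored 192-bit bus with 12 Gbps GDDR6 for the 1660 Ti, against the 1070 Ti's 256-bit bus with 8 Gbps GDDR5:

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # peak bandwidth = (bus width in bytes) * (per-pin data rate in Gbps)
    return bus_width_bits / 8 * data_rate_gbps

gtx_1660_ti = bandwidth_gbs(192, 12)  # 192-bit GDDR6 @ 12 Gbps -> 288.0 GB/s
gtx_1070_ti = bandwidth_gbs(256, 8)   # 256-bit GDDR5 @ 8 Gbps  -> 256.0 GB/s
print(gtx_1660_ti, gtx_1070_ti, gtx_1660_ti / gtx_1070_ti)
```

That works out to about 12.5% more peak bandwidth, roughly in line with the ~10% claim, assuming those rumored specs hold.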
 
At 1080p or below, VRAM capacity is much less of an issue. There are exceptions to the rule, but Ghost Recon Wildlands' VRAM usage is far from average, and you also have the option of lowering texture quality/bump mapping, which drastically reduces VRAM usage. I don't know the specs exactly, but it should be faster in other areas, which could negate its need for more VRAM to a point. Just look at the number of games where 4GB cards beat 8GB cards at higher resolutions. Wait until reviews to see where it resides in terms of relative value.
 
I think this is a very fair point. Some people don't need 4K resolution or even 1440p. 1080p is plenty for them, and if it supplies a decent framerate, why not?

VRAM usage scales much faster with settings than with actual resolution; i.e., if a game chews through 6GB at 4K ultra, it might still use 5GB at 1080p ultra. However, dialing things back to high may use only 3GB at 4K and 2.5GB at 1080p.

PC users tend to be ultra-setting whores, even when they are using midrange hardware. Digital Foundry had a good video on why dialing the settings back not only greatly improves fps with little hit to visuals, but can even make things look objectively better, as in the case of overly sharp shadows.



Making these adjustments helps performance in fps, in CPU requirements (as with draw distance), and, notably here, in huge VRAM savings (as with textures).

So really, you could have something like a 3GB 1660 Ti running 4K visuals at One X-like settings and still crush that console in fps by pairing it with a sub-$200 CPU.
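The settings-vs-resolution point above can be sketched with some rough arithmetic. This is purely an illustration with assumed numbers (real engines allocate many more intermediate render targets than the three buffers counted here): even triple-buffered 32-bit render targets at 4K total under 100 MB, which is why texture quality, not resolution, dominates VRAM usage.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    # rough size of triple-buffered 32-bit color render targets, in MiB
    return width * height * bytes_per_pixel * buffers / 2**20

print(framebuffer_mb(3840, 2160))  # 4K:    ~95 MB
print(framebuffer_mb(1920, 1080))  # 1080p: ~24 MB
```

Going from 1080p to 4K adds only ~70 MB here, while dropping a texture pool from ultra to high can free up gigabytes, which matches the 6GB-to-3GB swings described above.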
 
Am I the only one who thinks "1660" is an absolutely terrible name for a GPU? It has a really odd flow to it, and only 3GB is really not enough for 1080p in 2019; they should keep that on the 1050.
 
VRAM usage scales much faster with settings than with actual resolution; i.e., if a game chews through 6GB at 4K ultra, it might still use 5GB at 1080p ultra. However, dialing things back to high may use only 3GB at 4K and 2.5GB at 1080p.

PC users tend to be ultra-setting whores, even when they are using midrange hardware. Digital Foundry had a good video on why dialing the settings back not only greatly improves fps with little hit to visuals, but can even make things look objectively better, as in the case of overly sharp shadows.



Making these adjustments helps performance in fps, in CPU requirements (as with draw distance), and, notably here, in huge VRAM savings (as with textures).

So really, you could have something like a 3GB 1660 Ti running 4K visuals at One X-like settings and still crush that console in fps by pairing it with a sub-$200 CPU.


That's why you are on PC: so you can tweak that stuff. Of course everyone wants to crank it up; otherwise they would be on a console. As for myself, fuck yeah I want it on ultra. That's why I try to build a higher-end rig, to get the most out of the games I play.

As for the 1660 Ti and 3GB, it's way too low for its price/performance segment. IMO, it's deliberate gimpage. On higher-end models, prices have soared so high already that nobody would really bitch more than they already are if they had to pay $50-60 for more VRAM. Given the investment, gamers in that segment want some assurance of reasonable longevity, and VRAM is a big deal. I think pretty much all the games I play at 4K use well over 6GB.

For visuals, it also depends on the game and implementation, as well as how the settings are tweaked. In many cases the visual difference is very apparent. A mid-range card in 2019 should absolutely be able to run everything maxed out at 1080p@60. From what I've seen, a lot of games use over 3GB even at 1080p. It doesn't matter whether it is bad coding or not; it is what it is, so cards that target that segment should accommodate it. Even the 1060 3GB was like $200, and before the mining boom you could find one for less. I personally feel it's a shame that NV didn't increase VRAM on this new generation (it being all this new and hot stuff).
 
That's why you are on PC: so you can tweak that stuff. Of course everyone wants to crank it up; otherwise they would be on a console.

This mentality keeps newer and more casual gamers on console. Stay elitist my friend.

When a less-informed gamer gets a 4K TV, he sees that the One X plays games on his shiny new 4K TV. I realize that is absurd to the user base here, but as he checks reviews of a game and browses the forums, it becomes 'clear' that he will need a $700 GPU to play that game at 4K.

What a silly PCMR thing to say. The biggest advantage of PC over console for low- and midrange gamers is frame rate. How about pushing that point instead of focusing on razor-sharp shadows 6 miles away?
 
I've been happy with the 1060 6GB in my laptop. That being said, I went with a 1050 Ti 4GB rather than a 1060 3GB for my streaming system, due to fear of running out of VRAM while streaming and gaming.
 
This mentality keeps newer and more casual gamers on console. Stay elitist my friend.

When a less-informed gamer gets a 4K TV, he sees that the One X plays games on his shiny new 4K TV. I realize that is absurd to the user base here, but as he checks reviews of a game and browses the forums, it becomes 'clear' that he will need a $700 GPU to play that game at 4K.

What a silly PCMR thing to say. The biggest advantage of PC over console for low- and midrange gamers is frame rate. How about pushing that point instead of focusing on razor-sharp shadows 6 miles away?

Nothing elitist about this: those who don't want to bother tweaking settings, or for whom games look good enough, simply get a console and are happy. Others who want to take it up a notch get a PC. There are simply more options, as well as the ability to stretch the cost by upgrading an existing PC. Sure, the hardware costs more, but it also does a lot more than just gaming.

Why would you be gaming on a PC and not want to crank up the settings (regardless of the hardware you own, just in general)? Seriously, that's basically one of the key reasons to be a PC gamer. There are many advantages to a PC, including mods and generally cheaper games (or games always going on sale), so over time, I suppose it can be a lot cheaper to own than a console. To each their own, buddy, but lately, as it's gotten so much easier to build and maintain a PC, a lot of casuals have also switched over (and I wouldn't be surprised if that was for the higher IQ and framerate).
 
This mentality keeps newer and more casual gamers on console. Stay elitist my friend.

When a less-informed gamer gets a 4K TV, he sees that the One X plays games on his shiny new 4K TV. I realize that is absurd to the user base here, but as he checks reviews of a game and browses the forums, it becomes 'clear' that he will need a $700 GPU to play that game at 4K.

What a silly PCMR thing to say. The biggest advantage of PC over console for low- and midrange gamers is frame rate. How about pushing that point instead of focusing on razor-sharp shadows 6 miles away?
The biggest advantage is more overhead for higher quality and/or frame rates.

Nothing elitist about this: those who don't want to bother tweaking settings, or for whom games look good enough, simply get a console and are happy. Others who want to take it up a notch get a PC. There are simply more options, as well as the ability to stretch the cost by upgrading an existing PC. Sure, the hardware costs more, but it also does a lot more than just gaming. Why would you be gaming on a PC and not want to crank up the settings (regardless of the hardware you own, just in general)? Seriously, that's basically one of the key reasons to be a PC gamer. There are many advantages to a PC, including mods and generally cheaper games (or games always going on sale), so over time, I suppose it can be a lot cheaper to own than a console. To each their own, buddy, but lately, as it's gotten so much easier to build and maintain a PC, a lot of casuals have also switched over (and I wouldn't be surprised if that was for the higher IQ and framerate).
It's elitist to consider any settings below ultra equivalent to console-level graphics. Yes, striving for better IQ/frame rates is good, but let's not pretend that with some modest settings adjustments you can't get close on IQ/performance for a lot less cost in many cases.

The 3GB version of the 1660 Ti isn't aimed at 4K users, and not really at 1440p either. It could selectively handle a bit of both, but it's clearly designed to be more cost-efficient than the 6GB version and is primarily aimed at 1080p, where the 6GB version has more headroom for higher resolutions and the extreme IQ settings that are VRAM vampires, offering a tiny bit more IQ in exchange for very noticeably reduced performance. Provided you don't hit the VRAM limitation, this card should be on par with the 6GB version for less money in a lot of cases at its intended resolution target.

There is a point where, at too low a resolution, too-high texture quality is simply a waste of resources from a practical standpoint. You might notice it with your face up close to a poster on a wall, but if 99.9% of the time you aren't doing that, and even then it's barely distinguishable, who cares if it costs you a kidney to afford it?

The issue is how you went about comparing it to a console and associating anything below max IQ with it; that's where you come across as a bit of an elitist snob, so to speak. There are huge differences between even lower-midrange GPUs and console-level graphics and performance, so it's way off base in that sense. It's just smart to reduce a few settings that are way too performance-taxing, depending on the game. There are cases where I'd rather suffer a bit of frame rate and input lag for higher graphic quality, but there are cases where performance matters far more.

I doubt the 3GB card will be any slower on average than the 1060 6GB at 1080p, and that's where it fits in. In fact, I think on average it'll come out ahead until you pile on the IQ settings and bloat the VRAM. On the plus side, it should be quicker at swapping around the VRAM capacity it has and rendering frames where the other GPU would lag more, due to core speed, core count, etc., so there is a balance.
 
What will the price gap be between the 3GB and 6GB versions? $50?

I wouldn't have a problem seeing a 3GB 1550 Ti or something. For a card as fast as a 1070, no way is 3GB enough.
 