Biggest advantage is more headroom for higher quality and/or frame rates.
It's elitist to consider any settings below ultra equivalent to console-level graphics. Yes, striving for better IQ/frame rates is good, but let's not pretend that with some modest settings adjustments you can't get close on IQ/performance for a lot less cost in many cases.

The 3GB version of the 1060 isn't aimed at 4K users, and not really at 1440p either. It can selectively handle a bit of both, but it's clearly designed to be more cost efficient than the 6GB version and is primarily aimed at 1080p, whereas the 6GB version has more headroom for higher resolutions and those extreme IQ settings, the VRAM vampires that offer a tiny bit more IQ in exchange for a very noticeable performance hit. Provided you don't run into the VRAM limitation, this card should be on par with the 6GB version for less money in a lot of cases at its intended resolution target.

There's a point where, at too low a resolution, too high a texture quality is simply a waste of resources from a practical standpoint. You might notice it with your face right up against a poster on a wall, but if 99.9% of the time you aren't doing that, and even then it's barely distinguishable, who goddamn cares if it costs you a kidney to afford it?

The issue is how you went about comparing it to consoles and associating anything below max IQ with them; that's where you come across as a bit of an elitist snob, so to speak. There are huge differences between even lower mid-range GPUs and console-level graphics and performance, so it's way off base in that sense. It's just smart to reduce a few settings that are way too performance-taxing, depending on the game. There are cases where I'd rather suffer a bit of frame rate and input lag for higher graphics quality, but there are cases where performance matters far more. I doubt the 3GB card will be any slower on average than the 1060 6GB card at 1080p, and that's where it fits in.
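Rough back-of-the-envelope on the "wasted texture detail" point, using made-up but plausible numbers (the 5% screen coverage and the 4K texture size are assumptions, purely for illustration):

```python
# Illustrative only: how many on-screen pixels a wall poster can actually occupy
# at 1080p, versus the detail a high-res texture carries for it.
screen_w, screen_h = 1920, 1080
poster_fraction = 0.05  # assume the poster fills ~5% of the frame at normal play distance

onscreen_pixels = screen_w * screen_h * poster_fraction
texture_pixels = 4096 * 4096  # a hypothetical 4K texture dedicated to that one poster

print(f"visible on screen: {onscreen_pixels:,.0f} px")
print(f"stored in texture: {texture_pixels:,} px")
print(f"oversampled roughly {texture_pixels / onscreen_pixels:.0f}x")
```

Unless you walk up and fill the whole screen with the poster, most of that texture detail (and the VRAM it eats) never reaches your eyes.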
In fact, I think on average it'll come out ahead until you pile on the IQ settings enough to bloat the VRAM. On the plus side, it should be quicker at swapping around the VRAM capacity it has and at rendering frames where the other GPU would lag more, given the core speed, count, etc., so there is a balance.
Please reread what I stated in my post. I simply said that folks who don't want to get involved with a PC and want a simple setup should get a console. I didn't say that for 4K you need a PC or you will have a shitty experience. Many games look fine on a console, but there's no arguing that you can have a far superior setup on a PC, as well as more flexibility in settings and performance. Regarding ultra and 4K, I expressed my own expectations, and I don't run mid-range hardware. Nowhere did I mention anyone intending to run 4K with that card.

The new card costs more than the previous gen in the same price segment and has less VRAM, despite being several years newer, which is a reason many people, myself included, are dissatisfied with NV's decision (this kind of applies to other cards like the 2070 and 2080). Whenever virtual VRAM is used there is always a huge performance degradation, and you see sudden fps drops as you end up loading resources from much slower RAM and SSD/HDD.
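To put rough numbers on that last point (all bandwidth figures are approximate assumptions, ballpark for a 1060-class card, PCIe 3.0 x16, and a SATA SSD):

```python
# Rough illustration of why spilling out of VRAM causes fps drops.
# All bandwidth figures are approximate, assumed values.
bandwidth_gb_per_s = {
    "GDDR5 VRAM (1060-class)": 192.0,    # ~192 GB/s on a 192-bit bus
    "PCIe 3.0 x16 to system RAM": 16.0,  # ~16 GB/s theoretical peak
    "SATA SSD": 0.55,                    # ~550 MB/s
}

spilled_gb = 1.0  # hypothetical 1 GB of assets that didn't fit in VRAM

for name, bw in bandwidth_gb_per_s.items():
    ms = spilled_gb / bw * 1000
    print(f"{name}: ~{ms:.1f} ms to move {spilled_gb:.0f} GB")
```

A 60 fps frame budget is about 16.7 ms, so even touching a fraction of that spilled gigabyte over PCIe mid-frame blows the budget, which is exactly the sudden stutter you see when VRAM runs out.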