1080 Ti for +50%+. I just want something with more VRAM than my 1070 that actually looks worth spending money on.
The wait continues...
IF the RTX 2080 Super were bandwidth starved, increasing the bandwidth would help, duh!!! Nvidia is not as bandwidth limited as AMD since it started using all its compression technologies.
Only in mining did bandwidth make a huge difference; gaming-wise it doesn't affect much, if at all.
This is a rather arrogant claim. I did not see any reviews where memory overclocking was isolated for performance gains. Even then, performance can degrade as clocks are pushed to the limits. The only way to validate this 'claim' is to reduce memory clocks and see if performance degrades.
You can try to defend this card all you want, but $700 for an 8gb card is pathetic in 2019, even if you want to ignore the bandwidth issues.
We will see if nVidia's compression pixie dust will keep min frames high in upcoming titles such as CP2077 or Doom Eternal.
Yep, $700 for 8GB of 256-bit memory. Meanwhile, the $380 One X I bought 6 months ago was blessed with 12GB and 384 bits. GDDR5, but you get the point. Hell, we were given 8GB 512-bit cards for $350 years ago, but we can't get 12GB today because... GDDR6!
... Nvidia is not as bandwidth limited as AMD since it started using all its compression technologies.
I find it odd that the RTX 2080 Super is second only to the 2080 Ti and Radeon VII in memory bandwidth, yet you claim it's bandwidth starved...
Curious thing. The Radeon VII has double the bandwidth, yet it's slower than the 2080 Super. And now the 5700 XT has less than half the Radeon VII's bandwidth and still manages to beat it in some games.
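For concreteness, here's the arithmetic behind the numbers being thrown around in this thread. A rough sketch using spec-sheet figures: peak bandwidth = effective data rate per pin × bus width ÷ 8.

```python
# Peak memory bandwidth from spec-sheet numbers:
# bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8
cards = {
    "RTX 2080 Ti":    (14.0, 352),   # GDDR6
    "RTX 2080 Super": (15.5, 256),   # GDDR6
    "Radeon VII":     (2.0, 4096),   # HBM2
    "RX 5700 XT":     (14.0, 256),   # GDDR6
    "Xbox One X":     (6.8, 384),    # GDDR5
}

for name, (gbps, bus_bits) in cards.items():
    print(f"{name:15s} {gbps * bus_bits / 8:7.1f} GB/s")

# RTX 2080 Ti       616.0 GB/s
# RTX 2080 Super    496.0 GB/s
# Radeon VII       1024.0 GB/s
# RX 5700 XT        448.0 GB/s
# Xbox One X        326.4 GB/s
```

So the "double the bandwidth" claim checks out (1024 vs 496 GB/s), the 5700 XT really does sit at well under half the Radeon VII, and the One X's 384-bit bus still ends up with less bandwidth than the 2080 Super's 256-bit one because GDDR6 runs so much faster per pin.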
Without having looked at the specs recently... the various cards you are comparing on memory bandwidth (1080 Ti, 2080, 2080 Super) have different numbers of CUDA cores, running at different speeds, and (probably) different counts of shader cores, SMs, whatever. It isn't any single factor.
I tend to agree with the point that memory bandwidth isn't as big of an issue as you might think, because of the points about HBM2 and lots of bandwidth doing nothing appreciable that anyone can tell for game performance. Nvidia had a 512-bit memory bus on the GTX 280, iirc. Newer cards have a smaller bus. Why? Faster memory doesn't need as wide a bus.
Someone take a 5700 XT and a 2080 Super, downclock the memory by 10%, and compare performance before and after. Good way to make the point one way or the other. And at any rate, at some point memory bandwidth is meaningless if it sits unused.
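That experiment is easy to score, assuming you can cut the memory clock by a known fraction (MSI Afterburner's memory offset, or nvidia-smi's lock-memory-clocks option on newer drivers) and log average FPS before and after. A sketch, with made-up FPS numbers purely for illustration:

```python
def bandwidth_sensitivity(fps_stock: float, fps_cut: float, clock_cut: float) -> float:
    """Fraction of a memory downclock that shows up as lost FPS.

    clock_cut is the fractional reduction, e.g. 0.10 for a 10% downclock.
    A result near 1.0 means fully bandwidth-bound; near 0.0 means the
    extra bandwidth was sitting unused.
    """
    fps_loss = 1.0 - fps_cut / fps_stock
    return fps_loss / clock_cut

# Hypothetical readings after a 10% memory downclock:
print(bandwidth_sensitivity(100.0, 98.0, 0.10))  # ~0.2: barely bandwidth-bound
print(bandwidth_sensitivity(100.0, 91.0, 0.10))  # ~0.9: heavily bandwidth-bound
```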
The 1660 is the 1650 Ti at this point, for $230 on average. The 1650 has GDDR5 too, vs GDDR6 on the two 1660 cards.
I am really tired of hearing this nonsense from some people. The 1660 is a 120-watt card and a MASSIVE performance increase over the previous 1050 Ti. In no way, shape, or form would the 1660 have been a 50-level card.
Of course it would, if the 2060 wasn't $50-100 more expensive than the 1060 was. It's currently selling at GTX 960 prices from 2015.
Are you truly that out of the loop about video cards? AGAIN, the 1660 is a 120-watt card, so that alone should give you a hint that it would NOT be a 1050/1050 Ti replacement. And also AGAIN, the 1660 is way too big of a performance jump over the 1050/1050 Ti for them to give you that at the same price point.
And it's not my fault that you can't comprehend reality. Anyone with a clue knows that the 50-level cards are not going to go from 75-watt to 120-watt cards. And if you live in the real world then you would know damn well they were not going to DOUBLE performance at the same price point on a 50-level card, especially only going to 12nm from 16nm.
That's their problem, not the customer's. Nvidia is the one who jacked up prices across the board. The 1650 is apparently the new 1050 upgrade and is a pointless SKU (doesn't even have the new encoder) except to put Nvidia in that performance level instead of ceding it to 3-year-old RX 570/580 cards. Sorry, maybe I am out of the loop in that I don't think paying $220+ for a 1660 is a good value, nor is a 1660 Ti. I don't give a good gosh darn about your power requirements; I'm going by price points, because at the end of the day, that is what matters most: what performance you get for the money you pay.
Same thing. If used had worse bang/buck no one would buy it. Used prices always have to fall to offer better value than new, so arguing that used has better value is pointless.
Yeah, let's just agree to disagree then...
At this point you are just too clueless about this subject to realize how foolish it is to think you were going to get TWICE the performance for the same price and have the power consumption go to 120 watts. That is NOT the way it works in the real world, especially on lower-end 50-series cards.
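For what it's worth, the efficiency arithmetic behind that argument, using the thread's own rough numbers (the ~2x performance figure is ballpark, not measured):

```python
# Rough numbers from this thread: 1050 Ti = 75 W, 1660 = 120 W,
# and the 1660 lands at roughly 2x the 1050 Ti's performance.
perf_ratio = 2.0
power_ratio = 120 / 75           # 1.6x the power budget

print(f"perf/W gain delivered: {perf_ratio / power_ratio:.2f}x")  # 1.25x

# Doubling performance inside the same 75 W envelope would instead
# require a flat 2x perf/W jump -- far more than a 16 nm -> 12 nm
# refresh of the same basic architecture plausibly delivers.
print(f"perf/W gain needed at 75 W: {perf_ratio:.2f}x")           # 2.00x
```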
You really do not know when to stop. No one with the least bit of logic and sense would expect the 50-class card to DOUBLE in performance, go from a 75-watt TDP to 120 watts, and keep a similar price point. The 50-class cards are lower power, so 120 watts would not have been suitable. And it is beyond ridiculous to think they would give you twice the performance over last gen on a card like that. At this point you will never understand how foolish you look by thinking the 1660 should have been a 50-class card.
So it's OK for Nvidia to shift their model numbers up a level but not down a level? The 2060 is at xx70 prices now, but the xx50 cards should still be $150-200 despite 3-year-old competition from AMD, and their own old cards beating them on price/perf? The only reason for the 1650 to exist is for the oh-so-vaunted 75W-only gamers. Who don't even get the new encoder chip.
Now this is something we can agree on.
No problem, I see lots of people disagreeing with reality.
Nvidia overcharges its customers.
I paid $800 for my RTX 2080. I doubt I'll ever buy an Nvidia product again, mostly because of the lock-in structure of G-Sync.
So far, the Sapphire card is massively improved over reference for thermals, which was our first and biggest complaint with the reference card. A few people out there tried to argue that the reference blower wasn’t actually loud or hot, but this is precisely why that’s wrong. It was both objectively hot and objectively loud, particularly in comparison against partner models. There’s no point in defending a bad cooler design, and Sapphire gives us the numbers to definitively show that there is a better way to make a card. At about $10 more than the reference blower, it’s not even much more expensive.
You pretty much always have to wait until round 2 with nVidia. I bought the 1080 and then a few months later the 1080 Ti came out at the same price. Was not real happy about that.
I mean, how many generations now have we seen Ti type cards? Mainline cards are released, then Titan type cards, and then Ti type cards are last. It's a fairly predictable release schedule.
Unless there's going to be a price-drop to go along with it, the 2080S looks like it's an uneventful refresh. 5% performance gains, same price.
Hardware Unboxed points out that the 2060S costs 18% more than a 2060 while giving only an 8% performance increase, costing 9% more per FPS.
Comparing the 2080S to the 1080Ti, HU points out that we're getting a 9% performance increase at the same price-point as the 1080Ti 2.5 years later. Is that the worst performance-over-time scaling we've ever had in gaming GPU history?
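Both of those claims are easy to sanity-check, taking HU's percentages at face value:

```python
# 2060 Super vs 2060: 18% more money for 8% more performance
print(f"cost per frame vs the 2060: +{1.18 / 1.08 - 1:.1%}")  # +9.3%

# 2080 Super vs 1080 Ti: +9% performance at the same price, ~2.5 years later
print(f"annualized perf-per-dollar gain: {1.09 ** (1 / 2.5) - 1:.1%}")  # ~3.5%/yr
```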
The RTX 2000 series and Super refreshes look like an exercise in pretending to offer something while not actually offering anything.
True, though the price/performance difference between the Ti and non-Ti cards isn't always consistent, nor is how long it takes between releases.