The Nvidia GTX 1660 Ti Will Launch on February 15 at $279

Really, they are. They are the norm of the business world. AMD is the exception only through their own failures, and they act no differently when they succeed.

Please give examples of this. I have never seen AMD nerf performance across the board in order to make their own products look even better. This was a big part of the AdoredTV video. Games sponsored by AMD always seem to run great for both companies. Then we had the GPP attempt. Also, we can talk about FreeSync vs. G-Sync. Sure, Nvidia has opened it up now, but I kind of feel it was due in part to people running the 2400G/2200G FreeSync glitch. It was really hard for Nvidia to justify the lockout after that.

I will say that both companies have misrepresented their product lineups, e.g. the RX 570-like RX 580s, the DDR4 GT 1030, a version of the RX 560 that had fewer cores if I remember correctly, and so on.
 
I have never seen AMD nerf performance across the board in order to make their own products look even better.

I haven't seen this; further, literally everything you list is normal. You don't see it from AMD not because they wouldn't do it, but because they cannot do it.
 
I haven't seen this; further, literally everything you list is normal. You don't see it from AMD not because they wouldn't do it, but because they cannot do it.

The first sentence just shows a difference in conduct expectations and the 2nd sentence just seems like pointless baiting so no need to argue it further.
 
The first sentence just shows a difference in conduct expectations and the 2nd sentence just seems like pointless baiting so no need to argue it further.

If you expect AMD to always be the underdog, then perhaps you might be surprised when they behave differently...
 
Read the 2060 review at HardOCP and look at their feedback. Like I said, I ain't paying $380 for a fuckin card with 6GB of RAM. Otherwise Nvidia will never learn, and next time the RTX 3070 or whatever it's called will be $450 with 6GB of RAM. No thanks. Lol.
I'm not paying $700-800 for 8GB of RAM either.
 
Not needed and pointless, but continue to wallow in non-techie ignorance ;).
When compared to a 1080ti in VR it shows Nvidia made a mistake. "techie ignorance"? Are you kidding? LoL!
What year are you living in that a $700+ card is okay with 8GB?
 
So why does an 8GB 2080 continually lose to an 11GB 1080 Ti in VR?
Again, a $700+ card with only 8GB. Only Nvidia would dare go backwards.

Why would you attribute that to VRAM amount in VR of all things? The vast majority of VR games are made to run with minimal amounts. The frame time charts from Babeltech don’t indicate any sort of VRAM issues and they blamed early drivers. Dropped frames were in single digits for both cards. If VRAM was an issue you’d see way more than a few dropped frames.

https://babeltechreviews.com/the-rtx-2080-vs-the-gtx-1080-ti-in-vr/3/

Personally if I was offered 8GB and 16GB it would be a tough choice on a RTX card. I’d only go with the 16GB for potential RT reasons, not rasterized. For something like the new Radeon I’d easily go 8GB for cheaper.
 
So why does an 8GB 2080 continually lose to an 11GB 1080 Ti in VR?
Again, a $700+ card with only 8GB. Only Nvidia would dare go backwards.
VRAM means nothing in 2019; Nvidia has some magic cloud DLSS/RTX software to fix it or something, perhaps. Funny how this time they sound like fresh AMD fanboys with the 'X feature will save the world and make up for 4GB of VRAM'.. Yeah, no.
 
This right here is the perfect example of what I have been saying all along: AMD exists to give Nvidia fans better shit for a cheaper price so they can buy cheaper Nvidia cards, lol. Let me tell you, they don't owe anyone anything. They have every fricking right to price the card close to Nvidia for similar performance. Why? Because people want AMD shit to be cheaper than Nvidia because "oh, they are the value brand"!


AMD has absolutely no incentive to make shit cheaper. It just makes them look like the cheap brand, and people bend over to Nvidia to pay higher prices, and if AMD undercuts them, Nvidia will just drop prices and people buy Nvidia.

LOL! Look at the RX 570 and RX 580. People be buying the 1050 Ti just to buy from a company with higher prestige? Haha.

I hope AMD prices their cards similar to Nvidia in the mid-range too. I think pricing them too cheap makes people think they are cheap cards because they don't cost enough. At least Nvidia fanboys will stop thinking AMD is here to do them a favor.

AMD will be just fine selling Zen 2 and Epyc 2. That's where the money is going to be for them.

You're indirectly showing that AMD is responsible for raising prices. If Nvidia drops prices to undercut AMD, that's a good thing for consumers and is the mechanism of competition in action. But Nvidia is a business like anyone else, and they care about profit. Nvidia can be greedy with margins, and there is a certain point below which Nvidia will not follow, and this is where AMD needs to price their products if they want to increase their market share. Selling at the same high pricing as Nvidia will simply alienate their existing customer base, which typically puts high price/performance as a priority when purchasing a card.

Right now, AMD is in a better position to undercut Nvidia and gain market share because they have invested essentially nothing further into R&D on Polaris and do not need to recover R&D costs at this point. Nvidia does, since Turing is a new series with a ton of R&D invested in it on top of being a new architecture. Likely billions.

AMD is not in a position to price their products similarly to Nvidia and needs to undercut them for their own benefit. At equal pricing, Nvidia will outsell AMD. Keep prices high and watch that market share melt away.

For AMD to obtain the premium branding Nvidia has at this point, they need to take the flagship performance crown with a single GPU and do it consistently for a few years. That's the only way they will be able to displace Nvidia and take its position.

At the moment, although overpriced, Nvidia's flagship is 2x as fast as AMD's without a real node difference. That's the biggest the gap has ever been. As a result, AMD deserves its budget moniker because it is simply too far behind in performance overall. They cannot expect to sell equally at the same price to performance based on hate alone.

Overpricing their products just as much as Nvidia does nothing but reduce their own value proposition and lower their desirability vs. Nvidia cards. Giving a good price vs. Nvidia at the very least earns AMD a better review for their product, which can raise that particular product's impressions.

Except the 590 has gotten little positive press given its release price. It is also down to $260 with 3 free games. IIRC, it has had free games since its release. Now, even at $260 it's not really worth the extra $50-60 over a 580 unless you want the free games and 10% or so better performance. I sure as hell haven't seen Nvidia offering any good 1060 deals in the past two months competing with 570/580 deals.

I won't be surprised if for the first 2 months or so, depending on supply/demand, we'll see AIB 1660 Tis pushing $300+. The current RAM issue reminds me of the GTX 960 2GB vs. 4GB debates. I was wrong then; the 4GB was worth the extra 25-50 bucks if you wanted to keep the card more than 2 years.

Edit: Newegg deal on the ASRock 590: $240 with a $20 rebate, for $220.

This is simply the market correcting itself and a price drop happening because of the RTX 2060. In the history of video cards, the pricing of the RX 590 was unprecedented in a bad way. Never has a respin of the same chip been more expensive for the GPU market than previous iterations. At worst they have been the same price, but never more expensive. The game bundle is the red herring meant to draw people's focus away from this fact; look at the attractive bundles all AMD cards were getting at the same time and it becomes clear AMD was trying to mark up the price of the third iteration of Polaris, with the game bundle meant to distract from it. But considering Vega 56/64 got the same bundle and the RX 580/570 got the same bundle minus 1 game, the 60 dollar difference in price at the time (80 at street pricing) was for the hardware, not the software, since these companies get games cheap.

However, if this anti-technology, anti-consumer move had succeeded, we could see GPU series stay on the market for 4 years.

The RTX 2060 is the best card to launch over the last 12 months in a poor series of cards. At 280 dollars, the RX 590 barely made any sense to begin with, which is why it typically got lackluster reviews. However, because of the viral marketing teams, it was generally better received in forums than it should have been. Look at the discussion threads for this particular review and the tone was eerily positive.

The problem with consumers embracing this type of behavior is that it can lead to heavy stagnation in performance and price over time. What if Nvidia stops releasing new architectures and series and starts milking the same GPU, sustaining the product with respins, like AMD is doing with Polaris and what Intel has been doing?

This would become 8% performance jumps every 16 months, and if one of these jumps was a bit bigger than that, we would have a 12% jump with a 27% increase in price. That is something I don't want to see happen to the GPU industry, because outside of the bump in performance with Ryzen compared to Phenom II, it has become a very boring hobby, and the GPU market is the only thing semi-exciting. And this is what people don't appreciate about Nvidia.
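To put rough numbers on that stagnation scenario (the 8%-per-16-months cadence and the 12%/27% jump are the post's hypotheticals, not measured data), a quick back-of-envelope sketch:

```python
# Back-of-envelope: annualized performance growth under a hypothetical
# cadence of +8% every 16 months, and the perf-per-dollar effect of a
# one-off +12% performance jump that raises the price by 27%.

def annualized_gain(jump: float, months: float) -> float:
    """Convert a per-release gain into an equivalent yearly growth rate."""
    return (1 + jump) ** (12 / months) - 1

perf_per_year = annualized_gain(0.08, 16)        # ~5.9% per year
perf_per_dollar_change = 1.12 / 1.27 - 1         # ~-11.8%: perf/$ drops

print(f"{perf_per_year:.3f}")            # 0.059
print(f"{perf_per_dollar_change:.3f}")   # -0.118
```

Under those assumed numbers, performance per dollar would actually decline on the "bigger" jump, which is the stagnation the post is worried about.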

Nvidia has been pouring the most money into R&D in terms of percentage and annual growth. What has this translated into for consumers?

They have pumped out 2 to 3 times the number of new chips AMD has, and they have released 3 different architectures (Kepler, Maxwell, and Turing) while AMD is still on GCN. And this isn't a superficial quantity number either; it has led to bigger jumps in price to performance.

What Nvidia has definitely been doing better than AMD is not rebranding and not refreshing products with respins. This generally means faster products between generations and, because of this, higher jumps in price to performance as older tech gets obsoleted and replaced with smaller, faster, cheaper chips sooner.

https://www.hardocp.com/article/2018/08/07/nvidia_gpu_generational_performance_part_2/3
https://www.hardocp.com/article/2018/09/04/amd_gpu_generational_performance_part_1/4

Because Nvidia is not rebranding and is developing new architectures rather than rehashing old ones, the jumps in performance have been more substantial. AdoredTV tried to paint Nvidia as the bad guy for having smaller jumps in performance over time, but AMD has been far worse in this respect, as the above performance links show. This isn't about greed as much as it is the laws of physics. New nodes used to become accessible to Nvidia/AMD more frequently, and jumps in IPC were easier to obtain.

What people completely forget about is the R&D aspect of making a chip. People are trying to justify Vega VII's price based on the cost of HBM2 (which is certainly not $320 at this point, considering that was the price of 2x HBM2 during the Vega period and prices have certainly fallen since). Whereas Nvidia spends 2.2 billion annually on R&D and amortizes this over 2 years (4+ billion now) as a new generation gets developed, AMD is spending maybe 300 million annually on GPU R&D at most, because of the outsourcing of labor to China for GPU development and the lion's share of R&D going to CPUs (AMD spent 1.35 billion total).

Yes, Nvidia is a very profitable company, but considering nearly half of Nvidia's revenue now comes from the professional market, where margins are 80% plus (1.76 billion gaming; 1.41 billion professional, auto, IP, data center), it should be a tremendously profitable company, particularly with the mining spike, which will all but disappear in the following quarter.

I didn't say the 1660 is anti-consumer; that was in reference to their actions in general: closed standards, only just allowing (a really shitty) implementation of FreeSync, paying developers to focus only on Nvidia optimisations, cheating (both are guilty here), hell, up until the 1000 series they didn't even give users more than 8-bit colour outside of DX apps, ffs. Fuck Nvidia.
I also don't disagree about AMD rebrands, but Nvidia is guilty of recycling low-end products too; they both do it...
The 1060 is a 580 competitor, and the 590 is more for the 2060 and to test 12nm as a pipe cleaner (and it failed), but yes, AMD needs a new product here, hence Navi this year. So for the next few months your criticisms are valid. But just because AMD hasn't got its release cycle quite aligned doesn't mean they should be singled out for doing the same shit. Look at the 1050 Ti: people buy that over a 5xx which is faster, because green logo.. that's ignorance, misinformation, and lack of education and research.

The 7970 was faster than everything Nvidia had at the time, even the Titan, so I don't see what the pricing issue is. I bought one within a month of launch and it was the best GPU I've ever had. A 40% OC on the reference cooler is usually unheard of.
People use the same to justify the insane 2080 Ti pricing; that has nothing to do with AMD. The 2080 is matched by the VII apparently, so again, it's up to consumers to bring that price down by not buying that expensive shit. AMD isn't a charity either, and just because less-informed or brainwashed people think they are, or expect them to be 'cheaper because AMD', that's not my fault. With drivers much better in recent years and the power circuitry and card design being more robust, I think AMD has a far better quality product in most cases, but no one knows or cares about that. Vote with your wallet. With CPUs, AMD has a compelling offer, hence stealing market share from Intel.
I don't buy mid-to-low-end cards; they have historically been recycled, rebranded shit from the last generation on both sides.. also I have been running 1440p since 2011.. so no dice there. Not everyone has piles of cash like Nvidia to make a million different dies each year.

AMD has at times been the value brand when they can't compete, which has practically come to an end. They are not a charity, they are a corporation.

AMD uses open standards because it is cheaper for them and because it is the only way their standards get adopted at all.

Closed standards are expensive to enforce. When AMD initially tried to push Mantle as a closed standard, it died a quick death.

If you're referring to companies not readily adopting DirectX 12 and staying on DirectX 11, that criticism is a sham. What you need to realize is DirectX 12 puts all the driver responsibility on the game developers rather than on AMD/Nvidia, which is why AMD wants it: less work and cost for them. Developers with deadlines and a desire to target the entire PC market (not just Windows 10 users) have every reason to use DirectX 11. DirectX 11 is way easier to use and program for because a high-level API handles the work, errors in code, and scheduling of resources, and reduces the complexity of the code across multiple pieces of hardware.

https://www.pcgamer.com/what-directx-12-means-for-gamers-and-developers/



It works with consoles because there is only 1 hardware configuration, but when you add the near-infinite combinations of PC hardware, you have a beast that only the most veteran of programmers, with experience writing ancient code without a high-level API, have been able to deal with successfully.

DICE, with Battlefield 1 and V, have definitely failed with their DirectX 12 implementations, as HardOCP has shown.

https://wccftech.com/resident-evil-2-remake-pc-performance-explored/

Resident Evil 2 Remake is another fail where DirectX 12 performs worse than 11.

https://www.hardocp.com/article/2017/03/22/dx12_versus_dx11_gaming_performance_video_card_review/11

In 2017, HardOCP tested a bunch of games, and only 1 game out of 7 actually produced a significant positive result.

Shadow of the Tomb Raider is, I think, the only game in 2018 where the DirectX 12 results were better than DirectX 11.

And this is the problem: when there is far more work required for the code to work properly, and the chances of producing a success that improves performance are low, developers don't have enough incentive to focus primarily on the DirectX 12 version when it also locks them out of non-Windows 10 builds. Nvidia doesn't need to pay developers to use their platform as the lead development platform to build their code around when they represent 70+ percent of the GPU market.

People buying a GTX 1050 Ti over an RX 570 is not Nvidia's fault but consumers' own ignorance. It's not an ignorance unique to the GPU industry; it happens in the CPU one as well, and in any other, including headphones and phones. However, at this point it is also possible that the RX 570/580 market is heavily saturated, considering the age of Polaris 10 (most people who want Polaris already own it, which is very likely with the flood of cheap cards post-mining). The GTX 1050 Ti could very well be selling better because it's a low-power mainstream part that can go into cheap mainstream desktops from HP, Dell, etc., which often have 250-watt power supplies.

Correction: there was no Titan at the release of the 7970, just the GTX 580, which was a 500 dollar card at the time, and of course the 7970, which was 550. The problem with that release is that if a card only has to outperform the last-gen flagship at its price, Nvidia's entire pricing structure with RTX is justified. That is, the RTX 2080/2070/2060 range from slightly faster than their predecessors (as with the RTX 2080 vs. the GTX 1080 Ti) to greatly faster (as with the RTX 2060). So keeping the pricing the same for slightly better performance is justified, while increasing the price greatly with a bigger jump in performance is justified.

A 7970 was only 20% faster than a GTX 580, but more importantly, the 7970 used a smaller die on a relatively cheap 28nm node, meaning it was not in the same class of chip. Like Nvidia today, things look bad for the 7970 when you compare it to the previous generation's pricing and lineup.

The 6970 from AMD launched at 370 dollars. When the 7970 launched, it was initially 30% faster, but at a price of 550 dollars it was 49% more expensive than a 6970, for a chip clearly in the same die-size class.

[Chart: relative GPU performance, 1920×1080]


As a result, it had horrific price to performance.

[Chart: performance per dollar, 1920×1080]



Not all that different, in fact, from the RTX 2080 compared to its contemporaries.

[Chart: performance per dollar, 2560×1440]


The worst offender in my mind was the 7870:

[Chart: performance per dollar, 1920×1080]


Now look at the performance per dollar of the RTX 2060

[Chart: performance per dollar, 2560×1440]


This is the problem, and it is what AMD's viral marketing tactics have done to these forums.

The RTX 2060 is actually a good card even from a price-to-performance standpoint, as it slightly improves upon its predecessor, which was already pretty good. Now look at the 6870's price to performance (initial MSRP $240, street price at the time $155) vs. the 7870's ($350), and you will see that through the pricing of the 7970 and 7870, AMD created the perfect storm for Nvidia to increase pricing. The 7870 had horrible price to performance, was made on a cheap 28nm process, and was only a 213mm² product. But look at the consumer reception: it was relatively positive. The reason is that forums were not nearly as toxic as today.

https://www.techpowerup.com/forums/threads/amd-radeon-hd-7850-hd-7870-2048-mb.161644/

Now look at the review impressions from people on forums for the RTX 2060, particularly on this forum, and you will see the reception is absolutely toxic even though reviewers are not seeing it that way. AMD's marketing plan has turned forums absolutely toxic by having a large global team-red marketing base infiltrate forums and events and highlight anything negative about the competition, particularly videos like AdoredTV's.

https://www.forbes.com/sites/jasone...24-event-to-celebrate-pc-gaming/#4259be634d8b

What videos like AdoredTV's do is highlight only the negative stuff from Nvidia while turning AMD into a tragic hero, but as the pricing of the 7970/7870 shows, AMD is just as responsible for pricing as Nvidia, particularly when they only match the price to performance of Nvidia's products. Now they are doubling down on this strategy by using fake leaks and rumors about unrealistic products and pricing that won't happen, to make the competition's currently existing products look bad (Fiji, Polaris, Vega, and likely Navi fall into this trap), and when AMD products do get released to disappointing fanfare, hoping consumers buy AMD products out of spite for Intel and Nvidia.

The sad thing is it is working, and this manipulation is very real. People selling their GTX 1080 Tis to buy a possibly slower Radeon VII 2 years later at the same price, while losing some money in the selling process, is completely irrational. Particularly when these people's original expectation was to buy a product around 250 dollars with performance about 5-10% slower than this.

I think the very worst thing Nvidia has done for consumers is the Founders Edition crap. This is something I despise and I think is particularly greedy. But the toxicity of the forums is something that should concern us as a community.

What inspired this huge rant is that people are strongly against the RTX 2080 at 699, but when something with similar speed comes out from AMD at 699 (while being the friend of consumers), it's okay because of the expensive bill of materials. And yet this excuse doesn't apply to the RTX 2060 (the GDDR6 is more expensive than GDDR5 and provides twice the bandwidth, and the die size is almost the same as the GTX 1080 Ti's), and it's garbage unless it's 250 dollars. This double standard on these forums is completely irrational and shows how deeply AMD has infiltrated them.
 
People are really selling a 1080 Ti right now for a VII? Where? The only way that makes sense is if they think they can get $600+ used for it, since many want to avoid RTX. Also, the cheapest 2080 in the USA is $695 via PCPartPicker. Two cards are under $700. The highest is $869, and most are $750-799.

As for the bandwidth of the 2060's GDDR6, that's nice that you believe in it. I believe in RAM size being more important, especially when I'm spending $300+ on a video card.

Nvidia has been pouring the most money into R&D in terms of percentage and annual growth. What has this translated into for consumers?

It's led to rising prices for the past 4 years across every level of performance.
 
As for the bandwidth of the 2060's GDDR6, that's nice that you believe in it. I believe in RAM size being more important, especially when I'm spending $300+ on a video card.

I guess they could have used 8GB on a 256-bit bus with slower GDDR5X, but the tradeoffs would probably be similar to a 3GB 1030 over a 2GB 1030.
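For context on that tradeoff, peak memory bandwidth is just bus width times per-pin data rate; a quick sketch (the 12 Gbps GDDR6 figure matches the 1660 Ti's announced memory, while the 256-bit GDDR5X configuration and its 10 Gbps clock are hypotheticals for illustration):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
# The GDDR5X configuration below is a hypothetical alternative, not a real SKU.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

narrow_gddr6 = bandwidth_gbs(192, 12.0)   # 1660 Ti-style: 192-bit GDDR6 @ 12 Gbps
wide_gddr5x = bandwidth_gbs(256, 10.0)    # hypothetical 256-bit GDDR5X @ 10 Gbps

print(narrow_gddr6)   # 288.0
print(wide_gddr5x)    # 320.0
```

A wider bus with slower memory can still match or exceed raw bandwidth; the real tradeoff is die area for the extra memory controllers and board cost for the extra chips.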
 
It's led to rising prices for the past 4 years across every level of performance.

This is incorrect; we've seen prices rising with performance, as opposed to performance per price increasing with prices largely staying flat.

That isn't to say that performance per price isn't increasing, it's just that it perhaps isn't increasing as fast as it has in the past. We can attribute part of that to R&D and production costs outright, and part of that to demand (and variance in demand) over time.
 
This is incorrect; we've seen prices rising with performance, as opposed to performance per price increasing with prices largely staying flat.

That isn't to say that performance per price isn't increasing, it's just that it perhaps isn't increasing as fast as it has in the past. We can attribute part of that to R&D and production costs outright, and part of that to demand (and variance in demand) over time.

Eh, looking at Nvidia's revenue they're not hurting from R&D and production costs.

I'll edit my statement to appease you: "It's led to rising prices for the past 4 years across every level of the product line/stack."
 
Eh, looking at Nvidia's revenue they're not hurting from R&D and production costs.

They've not had revenue issues in over a decade; however, the stuff is getting more expensive to design and more expensive to make on a per-unit basis.
 
They've not had revenue issues in over a decade; however, the stuff is getting more expensive to design and more expensive to make on a per-unit basis.

There has to be a better balance of designing features that are actually useful (e.g. Ray Tracing available in 1 game 5 months after launch) and keeping pricing in line with what people are willing to pay. See also Nvidia's Q4 results and subsequent 15% drop in stock value. A $250 2060, a $400 2070, a $550 2080 and a $800 2080Ti would do wonders for sales and subsequently stock prices. Soft Chinese demand is only part of the puzzle regardless of what Huang says.
 
actually useful

While I get what you're saying overall, I should point out that RTX is 'actually useful'. To the end user, this soon after the technology's introduction, not so much; but in terms of moving real-time graphics technology forward?

Huge.
 
Considering its price level, we are probably talking entry-level 1440p performance or high-end 1080p performance.

What uses that much VRAM at those resolutions?

Really it depends on the usage scenario. Many games will do fine; if you do any type of deep learning, etc., then the RAM makes a big difference.
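As a rough illustration of why plain rasterized games rarely need huge VRAM at these resolutions, the render targets themselves are small; it's textures and compute workloads that dominate. A back-of-envelope sketch (the formats and target counts are illustrative assumptions, not any particular engine's layout):

```python
# Rough render-target memory at 2560x1440. Formats and counts are assumed
# for illustration; real engines vary widely and textures dominate VRAM use.

def target_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one full-resolution render target in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

w, h = 2560, 1440
color = target_mib(w, h, 4)          # RGBA8 back buffer
depth = target_mib(w, h, 4)          # D24S8 depth/stencil
gbuffer = 4 * target_mib(w, h, 8)    # e.g. 4 RGBA16F targets for deferred shading

print(round(color, 1))                    # 14.1
print(round(color + depth + gbuffer, 1))  # 140.6
```

Even a fat deferred-shading setup at 1440p is on the order of a hundred MiB of targets, so the multi-GB figures people see in monitoring tools come from assets, not the framebuffer.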
 
Playing Onward on the Rift using GPU-Z, it says the memory used was 10172MB dedicated and 578MB dynamic. I would say that 6GB would not be all that great. They could have gone with 256-bit and 8GB of memory at least?
 
It says the memory used was 10172MB dedicated and 578MB dynamic. I would say that 6GB would not be all that great.

This isn't actually 'memory used', as in 'memory required', and is the source of much of the confusion. In this case, it's more than likely memory being 'filled' because it's available. Some games do pre-loading and that adds to the confusion.

Of course, that doesn't answer the question really. We'd need to compare cards with the same GPU and differing amounts of memory across various games, resolutions, and detail levels, in order to map out where exactly the limits are.
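A minimal sketch of the comparison matrix such a test would need (card and game names are placeholders; the point is that only same-GPU cards differing in VRAM isolate the memory variable):

```python
# Sketch of a benchmark matrix for isolating VRAM limits: the same GPU with
# different memory sizes, swept across games, resolutions, and detail levels.
# All names below are illustrative placeholders, not real test data.
from itertools import product

cards = ["same-GPU 3GB", "same-GPU 6GB"]   # e.g. a 1060-style 3GB/6GB pair
games = ["Game A", "Game B"]
resolutions = ["1920x1080", "2560x1440"]
details = ["high", "ultra"]

matrix = list(product(cards, games, resolutions, details))
print(len(matrix))   # 16 runs total; a real VRAM limit shows up as frametime
                     # spikes on the smaller-memory card only, at matched settings
```

Frametime traces, not reported "memory used", are what reveal where the actual limit sits.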
 
This isn't actually 'memory used', as in 'memory required', and is the source of much of the confusion. In this case, it's more than likely memory being 'filled' because it's available. Some games do pre-loading and that adds to the confusion.

Of course, that doesn't answer the question really. We'd need to compare cards with the same GPU and differing amounts of memory across various games, resolutions, and detail levels, in order to map out where exactly the limits are.
Yeah, I am not sure how 6GB vs. 11GB plays out in smoothness or detail. But if you are buying a card, you might as well get one with more memory, as modern games seem to be more and more demanding.
 
Yeah, I am not sure how 6GB vs. 11GB plays out in smoothness or detail. But if you are buying a card, you might as well get one with more memory, as modern games seem to be more and more demanding.

Not going to argue against getting more memory if that's the only difference (or perhaps the only difference is a small increase in cost). There's usually other stuff to consider, though.
 
So the GTX 1660 Ti will perform a bit better than the GTX 1060 but still under a GTX 1070?

I guess I'm glad I bought a handful of GTX 1070 cards for $180-200.
 
So the GTX 1660 Ti will perform a bit better than the GTX 1060 but still under a GTX 1070?

I guess I'm glad I bought a handful of GTX 1070 cards for $180-200.

Most are estimating it to be on par with the GTX 1070. I think it will be slightly faster in some scenarios due to the higher bandwidth.
 
Thanks for the update. I was looking for a review and came back here for discussion on the status.
We will not have a review because we will not sign NVIDIA's NDA. And then NVIDIA forbids their AIBs from sampling us as well. GPP still has them upset.
 