More 1660 Ti Pictures Emerge as Launch Nears

Discussion in '[H]ard|OCP Front Page News' started by AlphaAtlas, Feb 11, 2019.

  1. AlphaAtlas

    AlphaAtlas [H]ard|Gawd Staff Member

    Messages:
    1,527
    Joined:
    Mar 3, 2018
    As the 1660 Ti's launch nears, more pictures of the unreleased GPU are leaking onto the internet. One Reddit user posted what appears to be a Galax GeForce 1660 Ti, while Videocardz uploaded a picture of a Palit GTX 1660 Ti StormX. Back in January, an anonymous source told us that the card would launch on the 15th for around $279, but noted at the time that the information "is likely in flux now and changing day to day." More recent rumors (which we haven't confirmed yet) suggest the card will launch on the 22nd.

    Palit is preparing two models of the StormX series. Both cards are equipped with the TU116 GPU and 1536 CUDA cores. The GTX models from Palit feature 6GB of GDDR6 memory. The StormX OC variant features a clock speed of 1815 MHz, while the non-OC variant is set to the default 1770 MHz.
     
  2. alamox

    alamox Gawd

    Messages:
    598
    Joined:
    Jun 6, 2014
    let me guess, it's faster than Radeon VII
     
  3. buttons

    buttons [H]ard|Gawd

    Messages:
    2,007
    Joined:
    Oct 12, 2011
    this should trade blows with an RX 590?
     
  4. filip

    filip [H]ard|Gawd

    Messages:
    1,262
    Joined:
    Aug 15, 2012
    More excess inventory, more problems... Oh sorry, I meant products.
     
  5. Chimpee

    Chimpee [H]ard|Gawd

    Messages:
    1,327
    Joined:
    Jul 6, 2015
    Probably trade blows with a 1070.
     
    Nightfire likes this.
  6. Brian_B

    Brian_B 2[H]4U

    Messages:
    2,089
    Joined:
    Mar 23, 2012
    Did it ever get determined what the performance difference between TU and GP cores is, apart from the Turing RTX features (which presumably won't be present on the 1660)?
     
  7. KATEKATEKATE

    KATEKATEKATE n00b

    Messages:
    16
    Joined:
    Jan 20, 2019
    WCCFTech was reporting this would go for $349 which made no sense given the 2060 is a thing. $279 sounds about right though.
     
  8. Captindecisive

    Captindecisive [H]Lite

    Messages:
    108
    Joined:
    Apr 13, 2017
    If it really only trades blows with a 1070, it's not going to be very impressive and there will be another batch of unsold cards. I saw a 1070 on Newegg today at $299, so they either have to drop the price into the $230-240 range or beat the 1070 by 5-10% to be a viable option.
     
  9. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,267
    Joined:
    Sep 7, 2017
    It can be roughly estimated using the ratios between the RTX 2060 and the Pascal cards. The RTX 2060 has the same number of cores as the GTX 1070 yet performs close to a GTX 1080. If you apply the same ratio, the 1660 Ti should perform very close to a GTX 1070 despite having a core count closer to the GTX 1060.

    I actually think it will perform better than the GTX 1070 in certain DX12 titles due to the greater memory bandwidth, and in Vulkan as well, given how well Turing is doing there overall.

    It should be a new performance/$ champ.
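    A rough back-of-the-envelope version of that ratio estimate, sketched in Python (the core counts are the published figures; the relative performance numbers are ballpark assumptions, with the GTX 1060 as the 1.0 baseline):

        # Turing per-core uplift inferred from the RTX 2060 (1920 cores) landing
        # near a GTX 1080 rather than the equal-core GTX 1070.
        cores    = {"GTX 1060": 1280, "GTX 1070": 1920, "GTX 1080": 2560, "GTX 1660 Ti": 1536}
        rel_perf = {"GTX 1060": 1.00, "GTX 1070": 1.35, "GTX 1080": 1.65}  # assumed ballpark figures

        turing_uplift = rel_perf["GTX 1080"] / rel_perf["GTX 1070"]  # ~1.22x per core vs Pascal

        # Scale the GTX 1060 by the 1660 Ti's extra cores, then by the Turing uplift.
        est = rel_perf["GTX 1060"] * (cores["GTX 1660 Ti"] / cores["GTX 1060"]) * turing_uplift
        print(f"Estimated 1660 Ti: ~{est:.2f}x a GTX 1060 (GTX 1070 is ~{rel_perf['GTX 1070']:.2f}x)")

    That lands at roughly 1.5x a GTX 1060, i.e. right around (or a touch above) GTX 1070 territory.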
     
    Brian_B likes this.
  10. Ski

    Ski Gawd

    Messages:
    929
    Joined:
    Jun 21, 2008
    "1536 CUDA cores."

    It never ceases to amaze me that my 6-year-old card has almost double the number of cores of a 2019 model. Regardless of where the 1660 ends up being positioned, how much things have stagnated in that amount of time blows my mind.
     
  11. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    20,912
    Joined:
    Sep 13, 2008
    or it shows just how inefficient your card is, when a card with half the number of CUDA cores outperforms it.. that's not stagnation, lol..
     
  12. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,267
    Joined:
    Sep 7, 2017
    It's one thing for AMD to give up on the high end, but it is another thing altogether for them to let Nvidia take over the midrange.

    The RX 590 looks like a joke now compared to the 1660 Ti. It will lose in efficiency AND performance/$ the way things are looking.

    Why AMD is not releasing something like a cut-down Radeon VII with half the bandwidth, HBM2, and 48 CUs is beyond me.

    Even if it still lost to the GTX 1660 Ti / RTX 2060 in performance/$, it would at least have the compute and workstation uses, with potential resale value, to justify the extra cost.
     
    Shadowed likes this.
  13. KATEKATEKATE

    KATEKATEKATE n00b

    Messages:
    16
    Joined:
    Jan 20, 2019
    Shader core counts have gone up slowly because just adding more cores doesn't scale well; there have been huge clock and IPC improvements as well. Also, this is a low-midrange card that is much faster and cheaper than your 290X(?) (shot in the dark) while consuming half the power, so it doesn't say much about leading-edge progression.
     
    GoldenTiger likes this.
  14. Shadowed

    Shadowed Limp Gawd

    Messages:
    450
    Joined:
    Mar 21, 2018
    While I doubt AMD can release a cut-down Radeon VII, they absolutely need a new card to compete in the midrange again.

    The RX 480 is so damn old now.

    The 1660 Ti is going to sell like hotcakes.
     
    Silentbob343 likes this.
  15. _mockingbird

    _mockingbird Gawd

    Messages:
    801
    Joined:
    Feb 20, 2017
    ...because HBM2 and a 7 nm die are expensive
     
  16. PhaseNoise

    PhaseNoise [H]ard|Gawd

    Messages:
    1,192
    Joined:
    May 11, 2005
    So those guys suing AMD over Bulldozer are going to go nuts here.
     
  17. _mockingbird

    _mockingbird Gawd

    Messages:
    801
    Joined:
    Feb 20, 2017
    Do you mean low end?

    Polaris is low end.

    Vega is mid-range (and, in the case of the Radeon VII, high end)
     
  18. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,267
    Joined:
    Sep 7, 2017
    $200-$250 cards are still midrange, as there are about three tiers under that. That doesn't change just because team green has about two ultra-high-end and two high-end cards. Vega could be considered upper midrange.
     
  19. _mockingbird

    _mockingbird Gawd

    Messages:
    801
    Joined:
    Feb 20, 2017
    People are confused.

    When it was said that Navi will be released first for the mid-range, that just means below the Radeon VII.
     
  20. illli

    illli [H]ard|Gawd

    Messages:
    1,181
    Joined:
    Oct 26, 2005
    I think the rumor was approximately 15% faster than the 1060. But this being a "Ti", maybe it will match a 1070. Honestly, this whole generation hasn't been too impressive to me. Price/performance is not up to par with previous generations (just my opinion).
     
  21. ole-m

    ole-m Limp Gawd

    Messages:
    407
    Joined:
    Oct 5, 2015
    Performance of the VII, V64 or V56 has never been the issue; the issue has been TDP and pricing that, for the most part, is no cheaper than the Nvidia counterparts, though some good deals show up from time to time.
    Some (like me) picked one up at hilarious pricing, but I feel $440 at launch was an acceptable price, and not even a good one, because the hardware really isn't good.

    The GTX 1660 Ti will compete with the RX 590.
     
  22. Nolan7689

    Nolan7689 [H]ard|Gawd

    Messages:
    1,156
    Joined:
    Jun 5, 2015
    I’m not disagreeing at all, but the 290X is still a 1080p beast today. That probably has more to do with games not pushing the cards as much.

    But yeah Ski, looking purely at the number of cores is pointless. It's apples and oranges across architectures, and while I'll jokingly always say "more = better", you can just look at the Vega architecture to know that more cores doesn't equal more performance. Case in point: Vega 56 and the 1080 Ti both boast 3584 cores. They aren't even in the same league as each other, and whichever 6-year-old card you're referring to is also well beneath what even the Vega 56 can do.
     
    KATEKATEKATE likes this.
  23. KATEKATEKATE

    KATEKATEKATE n00b

    Messages:
    16
    Joined:
    Jan 20, 2019
    Sorry, didn't mean to disparage the 290X; it is (and was) a great card! My first dip into 4K gaming (30p, but it was 2014 after all) was on an OC'd, fast and loud Sapphire reference model. :)

    I might get flamed for saying this, but I actually appreciate that Xbone/PS4 ports are keeping base requirements reasonable, because it keeps good older cards relevant. I disagree with the notion that it's holding back graphics; big titles are still pushing the limits of new hardware, the downward scalability is just broader. imho
     
    Nightfire likes this.
  24. KATEKATEKATE

    KATEKATEKATE n00b

    Messages:
    16
    Joined:
    Jan 20, 2019
    right! haha they think it's complicated now
    wait till they hear about how many different Cache/Scheduler/ALU/FPU configurations GPU architectures have!
     
  25. focbde

    focbde Gawd

    Messages:
    512
    Joined:
    Jan 31, 2008
    Yes yes yes, but is there going to be a non-raytracing equivalent of the 2080ti? Don't give a crap about raytracing at this early stage, but want an upgrade for my 1080....
     
  26. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    20,912
    Joined:
    Sep 13, 2008
    nope the odds are definitely against you on that one..
     
  27. /dev/null

    /dev/null [H]ardForum Junkie

    Messages:
    13,676
    Joined:
    Mar 31, 2001
    Is this 1660 Ti also going to be gimped at 6GB of VRAM, or 8GB+?
     
  28. blackscreen

    blackscreen Limp Gawd

    Messages:
    248
    Joined:
    Nov 4, 2010
    The picture shows 6GB on that box.
     
    Nightfire likes this.
  29. Bawjaws

    Bawjaws Limp Gawd

    Messages:
    319
    Joined:
    Feb 20, 2017
    I highly doubt it, because Nvidia are unlikely to offer the consumer the chance to buy 2080 Ti levels of (non-RT) performance at a lower price than the RTX 2080 Ti, for the simple reason that that reduces the incentive for anyone to, er, actually buy an RTX 2080 Ti. That might change once/if ray tracing takes off, but right now Nvidia aren't going to cannibalise their own sales.

    The reason that the 1660Ti makes sense is that it apparently doesn't offer equivalent non-RT performance to the RTX 2060, so if you want 2060 levels of performance (even if you don't care a fig for ray-tracing) then you have to pony up for one. The 1660Ti slots in below the 2060 regardless of ray-tracing, but it wouldn't make sense if the only differentiator between the 2060 and 1660Ti was ray-tracing alone, because given the choice who wants to pay more money for no actual real-world benefit?
     
    msshammy and Nightfire like this.
  30. focbde

    focbde Gawd

    Messages:
    512
    Joined:
    Jan 31, 2008
    Oh I highly doubt it too... But a guy can dream.
     
  31. Loose Nut

    Loose Nut Limp Gawd

    Messages:
    352
    Joined:
    Oct 21, 2009
    The " MEH " is real :meh:
     
    Ranulfo likes this.
  32. Raghar

    Raghar Limp Gawd

    Messages:
    207
    Joined:
    Jun 23, 2012
    All these cards have a 192-bit interface, so they all get 3/6 GB of RAM. Nvidia no longer does what my GTX 660 did: connect 3x2 RAM chips to the full 192-bit bus and 2x1 chips to a 128-bit portion (8 chips for 2 GB of RAM; they probably used a single 192-bit controller that could drop to 128-bit operation when it accessed those two most distant chips).

    Nonetheless, a memory array with two different speeds can cause problems, which is why Nvidia only allowed allocating the whole RAM as a single array in 32-bit mode; the limited 32-bit address space forced arrays to be segmented anyway, so each segment could easily be kept at its own speed. A 64-bit OS allows a single address space, and as a consequence Nvidia was forced to block creation of an array that crossed the boundary between the fast 192-bit region and the slower 128-bit region.

    Short version: on a 64-bit OS it's too big a hassle to bother with 8 GB of RAM on a 192-bit interface, and 9 GB of RAM would make a midrange card more expensive and could cut into profits from the high-end 8 GB, 256-bit cards.
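    As a purely illustrative sketch of why that mixed-speed layout is awkward (the segment sizes, bus split and data rate below are hypothetical, chosen only to show the arithmetic):

        DATA_RATE = 6.0  # effective GT/s per pin, hypothetical

        segments = [
            # (label,        bus width in bits, size in GB)
            ("fast segment", 192,               1.5),
            ("slow segment", 128,               0.5),
        ]

        for label, bus_bits, size_gb in segments:
            bandwidth = bus_bits / 8 * DATA_RATE  # bytes per transfer x transfer rate = GB/s
            print(f"{label}: {size_gb} GB reachable at ~{bandwidth:.0f} GB/s")

        # A single flat allocation spanning both segments would run at two different
        # speeds across its range, which is the mismatch described above that the
        # driver has to avoid once a 64-bit OS exposes one flat address space.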