Volta Rumor Thread - Volta spotted in the Wild

Discussion in 'nVidia Flavor' started by Dayaks, Jul 25, 2017.

  1. Chimpee

    Chimpee Gawd

    Messages:
    677
    Joined:
    Jul 6, 2015
    Maybe, but I think a lot of people will be happy with a 50% bump from 1080ti.
     
  2. Factum

    Factum Gawd

    Messages:
    984
    Joined:
    Dec 24, 2014
If you are still unaware by now of how NVIDIA releases its SKUs (and has done ever since GK104), it is really your own fault...
     
  3. geok1ng

    geok1ng [H]ard|Gawd

    Messages:
    1,920
    Joined:
    Oct 28, 2007
Now that is an overstatement. Crapdozer never sold much, but AMD has sold every 570/580 GPU they built. They had so much success that NVIDIA was forced to launch a Miner's Edition 1070 Ti to try to claw back some market share in that price segment. Considering that sales revenue is the main metric for measuring a product's success, I would say that AMD's GPUs are a better product than Threadripper, which is the king of value in its price range.
     
  4. geok1ng

    geok1ng [H]ard|Gawd

    Messages:
    1,920
    Joined:
    Oct 28, 2007
Funny thing: I also decided that I will get a Volta Ti. I just don't know if there will be a Volta Ti, when it will launch, or how much it will cost. :p

    :eek::eek::eek: The [H]orror!!!

You should check out the 4K 120 Hz kits for 27" and 39" panels. I am on the "founders edition" buyer list. We will always need to upgrade. PERIOD.
     
  5. Phlorge

    Phlorge Limp Gawd

    Messages:
    434
    Joined:
    Jun 19, 2014
Yeah, coming from a 1080, that will be huge for me.
     
    Chimpee, AceGoober and {NG}Fidel like this.
  6. Dayman

    Dayman Limp Gawd

    Messages:
    322
    Joined:
    Jul 12, 2017
  7. chx

    chx Gawd

    Messages:
    652
    Joined:
    Jun 21, 2011
I for one can't wait for the 2050 Ti or 1150 Ti or whatever it'll be called. The Quadro P2000 is just a bit too expensive for me (that's a slightly cut-down 1060, reworked to fit a 75 W envelope and a single-slot cooler).
     
  8. sblantipodi

    sblantipodi 2[H]4U

    Messages:
    3,374
    Joined:
    Aug 29, 2010
  9. Dayman

    Dayman Limp Gawd

    Messages:
    322
    Joined:
    Jul 12, 2017


Apparently it was drawing 190 W during LuxMark.
     
  10. Iratus

    Iratus Gawd

    Messages:
    956
    Joined:
    Jan 16, 2003
Wish I knew when it was coming out. My water cooling has been sitting in a box since I needed a warranty replacement on my 1080 Ti, and now I don't know if it's worth putting it back on if I'm going to be eBaying the card in 2-3 months.
     
  11. BrainEater

    BrainEater Gawd

    Messages:
    726
    Joined:
    Jul 21, 2004
    The only thing I want to know is exactly what power connectors I should be planning for.

    :D
     
  12. bezant

    bezant Gawd

    Messages:
    751
    Joined:
    Oct 7, 2009
    Hey, where did you get your wage data from? Just curious, since you never make your numbers up, but always have a solid source.
     
  13. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,622
    Joined:
    Jul 14, 2005
  14. Communism

    Communism Limp Gawd

    Messages:
    158
    Joined:
    Feb 24, 2013
  15. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,622
    Joined:
    Jul 14, 2005
Most likely around a March launch; give them 6 months for volume production and that seems about right.
     
    Armenius likes this.
  16. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,298
    Joined:
    Jul 1, 2016
    January or March.
     
    Armenius, Caffeinated and razor1 like this.
  17. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,509
    Joined:
    Aug 5, 2013
    What do you guys think about the Ampere rumors?
     
  18. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,622
    Joined:
    Jul 14, 2005
    10 nm Volta? probably there
     
    Armenius likes this.
  19. JRUHg

    JRUHg Limp Gawd

    Messages:
    172
    Joined:
    Jan 5, 2016
  20. razor1

    razor1 [H]ardForum Junkie

    Messages:
    9,622
    Joined:
    Jul 14, 2005
Yep, either/or :) End of 2018 sounds right for it too.
     
    Caffeinated likes this.
  21. Communism

    Communism Limp Gawd

    Messages:
    158
    Joined:
    Feb 24, 2013
    The max that a VGA card can have in terms of PCI-E connectors is 8+8 pin if a company wants to stay within PCI-E 4.0 standard and actually be PCI-E certified.
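As a back-of-the-envelope check of that ceiling, here is a sketch using the commonly cited PCIe CEM connector ratings (the wattage figures are assumptions from those ratings, not quoted from the actual PCI-E 4.0 spec text):

```python
# Rough board-power ceiling for an 8+8-pin card, assuming the
# commonly cited PCIe CEM figures (not quoted from the actual spec):
SLOT_W = 75    # power available from the x16 slot itself
PIN6_W = 75    # 6-pin auxiliary connector
PIN8_W = 150   # 8-pin auxiliary connector

max_board_power = SLOT_W + 2 * PIN8_W
print(max_board_power)  # 375 (watts)
```

So an 8+8-pin card tops out around 375 W within the standard, which is why the connector layout on a new card hints at its power target.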
     
  22. Fixall

    Fixall [H]ard|Gawd

    Messages:
    1,142
    Joined:
    May 17, 2011
20% more performance than a 1080 Ti for $650 and I'm a day-one buyer. I'll be coming from a Maxwell Titan X and I can't hold out much longer.
     
    DLGenesis likes this.
  23. AlphaQup

    AlphaQup [H]Lite

    Messages:
    98
    Joined:
    Oct 27, 2014
    I'm in the same boat man. I want to dominate 1440p at 144hz with all the eye-candy turned on, and if this is the first card that can do that for me across the board for the most part (there's always going to be those heavy-weight AAA titles) then I'm going to have a real hard time not jumping on one.

    I have such a hard time waiting for the Ti when I see big performance jumps gen to gen :( It's going to be hard to be complacent with the 1080 and wait for the Ti... although once I'm on the Ti train, it'll be much easier to not get excited about the new XX80 series...
     
  24. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    8,509
    Joined:
    Aug 5, 2013
I hope they lead with a Titan this generation. I'd much rather drop 10+ benjamins right away than have to wait a year+ for the cut part.
Plus it gives you the longest lifespan. But no, they want people to buy a GTX 2080 and then a new Titan 6 months later.

    C'mon nvidia, earn that money.
     
    Armenius likes this.
  25. Caffeinated

    Caffeinated [H]ard|Gawd

    Messages:
    1,525
    Joined:
    Oct 16, 2002
    I do hope AMD brings something competitive again, and SOON! Maybe since Raja is gone, somebody might actually let Vega out of the box.

    Or maybe it is ... kind of underwhelming.
     
  26. trandoanhung1991

    trandoanhung1991 [H]ard|Gawd

    Messages:
    1,039
    Joined:
    Aug 26, 2011
Personally I'm gonna wait till 4K 144 Hz HDR monitors and full-fat Volta both drop, and then sink some serious cash into those two.

    I hope that some crazy stuff like dual 4K HDR VR headsets won't be announced next year. Some Chinese startup already managed to get a prototype with dual 4K screens running.
     
  27. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,298
    Joined:
    Jul 1, 2016
You have to wait for Intel at this point if you want something competitive.
     
  28. Caffeinated

    Caffeinated [H]ard|Gawd

    Messages:
    1,525
    Joined:
    Oct 16, 2002
    Sadly, you may be right!
     
  29. Lmah2x

    Lmah2x Limp Gawd

    Messages:
    431
    Joined:
    Apr 3, 2014
The last 2 Titans have come 3 months after the small chip. At least it's not a 10-month wait anymore like the first 2 Titans.
     
    Armenius and Montu like this.
  30. Montu

    Montu [H]ard DCOTM x4

    Messages:
    6,448
    Joined:
    Apr 25, 2001
Yep, my current Titan was an excellent buy. I'll be waiting for another Titan and will buy again. Looks like I'll be getting close to 2 years out of this one.
     
  31. Caffeinated

    Caffeinated [H]ard|Gawd

    Messages:
    1,525
    Joined:
    Oct 16, 2002
    That sounds interesting. Link?
     
  32. trandoanhung1991

    trandoanhung1991 [H]ard|Gawd

    Messages:
    1,039
    Joined:
    Aug 26, 2011
    Acer/AOC/Asus all announced 4K 144Hz HDR G-Sync AH-VA panels at Computex this year. Though they look to all be delayed to Q1 2018.
    https://www.asus.com/us/Monitors/ROG-SWIFT-PG27UQ/
    https://www.acer.com/ac/en/US/press/2017/255816
    https://www.144hzmonitors.com/monitors/aoc-ag273ug-aoc-ag353ucg/

    I just noticed AOC got a 35" 1440p Ultra-Wide 200Hz HDR G-Sync monitor planned. 27" 4K or 35" Ultra Wide 1440p hmm...

    If you're asking about the VR headset, here:
     
    razor1 likes this.
  33. Communism

    Communism Limp Gawd

    Messages:
    158
    Joined:
    Feb 24, 2013
They're all going to suck until DisplayPort 1.5 (or 2.0, or whatever it ends up being called) and/or HDMI 2.1 are actually implemented. Considering the next DisplayPort standard isn't even a thing yet, and HDMI 2.1 is only tentatively going to be finalized in December, you have a long wait in front of you if you want a product that actually has either of them.
     
    Shintai likes this.
  34. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    11,362
    Joined:
    Jan 28, 2014
DisplayPort 1.4 already has enough bandwidth for 4K HDR at 144 Hz using DSC, and all Pascal cards already have DP 1.4 outputs.
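A quick sanity check of that bandwidth claim, as a sketch: the link figures below are the commonly quoted DP 1.4 HBR3 numbers, and blanking overhead is ignored, so treat the result as approximate.

```python
# DP 1.4 HBR3: 32.4 Gbit/s raw over four lanes; 8b/10b coding leaves
# about 25.92 Gbit/s for pixel data. Blanking overhead is ignored here.
DP14_EFFECTIVE_GBPS = 25.92

def pixel_rate_gbps(width, height, hz, bits_per_pixel):
    """Raw pixel-data rate in Gbit/s, active pixels only."""
    return width * height * hz * bits_per_pixel / 1e9

# 4K at 144 Hz with 10-bit RGB (HDR): 30 bits per pixel.
uncompressed = pixel_rate_gbps(3840, 2160, 144, 30)   # ~35.8 Gbit/s
needed_ratio = uncompressed / DP14_EFFECTIVE_GBPS     # ~1.38:1

print(f"{uncompressed:.1f} Gbit/s needed, {needed_ratio:.2f}:1 DSC ratio")
```

Uncompressed 4K 144 Hz 10-bit does not fit the link, but even a mild DSC ratio, well under DSC's typical 3:1, closes the gap — which is the point being argued.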
     
  35. Communism

    Communism Limp Gawd

    Messages:
    158
    Joined:
    Feb 24, 2013
#1, DSC is a kludge. #2, the monitors announced so far do not support DSC, AFAIK.

There's also no guarantee that DSC won't have additional latency attached to its use.
     
  36. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    11,362
    Joined:
    Jan 28, 2014
1. You have enough information to call DSC a kludge? I don't think a kludge could be visually lossless the way DSC is. DSC is on version 1.2 in DisplayPort 1.4 and is now 5 years old.
2. Specs have not been finalized on the monitors, so you're right, we don't know. However, if you look at Linus' video from CES, we can see that it is connected using only one DisplayPort cable.
3. The maximum latency added by the DSC decoder stage is only up to 8 μs, or 0.008 ms.
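For scale, the quoted 8 μs figure can be put against the frame time at 144 Hz (simple arithmetic using only the number above):

```python
# How much of a 144 Hz frame the quoted 8 us DSC decode latency eats:
frame_time_ms = 1000 / 144      # ~6.94 ms per frame at 144 Hz
dsc_latency_ms = 0.008          # the 8 us figure quoted above
fraction = dsc_latency_ms / frame_time_ms
print(f"{fraction:.3%} of one frame")  # roughly 0.1%
```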
     
  37. Communism

    Communism Limp Gawd

    Messages:
    158
    Joined:
    Feb 24, 2013

"Visually lossless" is a meaningless term. People called MPEG-2 "visually lossless" too, and they said the same about 4:2:0 chroma subsampling.

At best, DSC will be a deterministic compression algorithm which works at the packet level.

At worst, DSC will be a semi-deterministic compression algorithm which works at the frame level.

http://www.vesa.org/wp-content/uploads/2014/04/VESA_DSC-ETP200.pdf

https://mipi.org/sites/default/files/Hsinchu-Hardent-Create-Higher-Resolution-Displays.pdf

Don't buy into hype, dig deep.

The latency hit is obvious if you know how G-Sync works in the first place (and the latency is even worse with an AMD-style adaptive-sync implementation).


The standards documents are talking out of both sides of their mouths: they take the "visually lossless" claim from the full-frame compression type, and the "low latency" claim from the bit-line compression type.

DSC isn't magic.

The reason they are "hyping" it is due almost entirely to them not wanting to spend 1/100th of a cent more per cable in production by upping the transport standards.
     
    Last edited: Nov 17, 2017 at 8:55 AM
  38. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    11,362
    Joined:
    Jan 28, 2014
The original point is that 4K HDR @ 144 Hz is still possible with DisplayPort 1.4.
     
  39. Communism

    Communism Limp Gawd

    Messages:
    158
    Joined:
    Feb 24, 2013
    Don't run away.

    This is just getting to the fun part :p.

I just got done spoon-feeding you; you don't get to quit already. Where's the fun in that? :D
     
    Last edited: Nov 17, 2017 at 9:04 AM
  40. Sycraft

    Sycraft 2[H]4U

    Messages:
    4,002
    Joined:
    Nov 9, 2006
Also, it sounds like it really isn't intended as a long-term solution for higher-res displays; it's more intended for low-power applications, letting you save power on mobile devices by lowering the signaling rate and thus the power consumption.

    Well hang on there, if you think that high speed cables are cheap, you should take a look at the prices some time. It is getting hard to keep making cables faster and faster. The problem is physics. To get more data down a given set of wires, you have to increase either the frequency bandwidth or the SNR. SNR increases are pretty much a non-starter so you are left with more frequency increases. Thing is, as frequency goes up, it gets harder and harder to get that down the wire. You get more cable losses, more noise leaking in, and more reflections/crosstalk. For a good example, look at 1gig vs 10gig copper ethernet. 1 gig officially works over Cat-5e but really will work over Cat-5 out to 100 meters. 10gig requires Cat-6a for 100 meters. Then look at the differences in cables, both prices and construction, to see what it takes to make 6a instead of 5. They are thicker, have tighter tolerances, separators, and so on.
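The bandwidth-vs-SNR tradeoff described above is the Shannon-Hartley limit. A toy calculation (illustrative numbers only, not real cable specifications) shows why SNR is the losing lever:

```python
import math

# Shannon-Hartley: capacity C = B * log2(1 + SNR).
# Illustrative numbers only -- not real cable specifications.
def capacity_gbps(bandwidth_ghz, snr_linear):
    return bandwidth_ghz * math.log2(1 + snr_linear)

base = capacity_gbps(1.0, 1000)           # ~10 Gbit/s
via_bandwidth = capacity_gbps(2.0, 1000)  # double the bandwidth -> ~2x capacity
via_snr = capacity_gbps(1.0, 1000**2)     # same bandwidth, SNR squared -> ~2x

# Doubling capacity takes either 2x the frequency bandwidth or roughly
# the *square* of the SNR -- which is why SNR increases are a non-starter.
print(f"{base:.1f} -> {via_bandwidth:.1f} (2x B) vs {via_snr:.1f} (SNR^2)")
```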

This gets even harder if you want the cable to maintain low latency. See, there are some tricks to pack in more data at a lower frequency, basically making more efficient use of the spectrum. However, more complex signaling not only adds cost to the transmitter and receiver but also adds latency. Straight binary serial signaling is extremely fast; doing something like QAM (as cable modems do) is more complex and adds latency.

    Interconnect speed is a big issue in computing in general, not just displays. It is a PITA to make cables that can reliably pass higher and higher signaling speeds. It's not an unsolvable problem, but it is more than just adding a fraction of a cent to manufacturing costs.