Volta Rumor Thread - Volta spotted in the Wild

When it comes out it will be nerfed, but still 50% better than the 1080 Ti.
Six months later we will see the Ti version, which is what you are looking at now.
F job, just like the 1080.

Maybe, but I think a lot of people will be happy with a 50% bump from the 1080 Ti.
 
When it comes out it will be nerfed, but still 50% better than the 1080 Ti.
Six months later we will see the Ti version, which is what you are looking at now.
F job, just like the 1080.

If by this time you are unaware of how NVIDIA releases its SKUs (and it has done it this way ever since GK104), it is really your own fault...
 
They are also worse off in the GPU segment now than they were with Bulldozer in the CPU segment.

Now that is an overstatement. Crapdozer never sold much, while AMD has sold every 570/580 GPU they built. They had so much success that NVIDIA was forced to launch a Miner's Edition 1070 Ti to try to claw back market share in that price segment. Considering that sales revenue is the main metric for measuring a product's success, I would say that AMD GPUs are a better product than Threadripper, which is the king of value in its price range.
 
I'm just going to wait for the Ti to be released for $500 less and 90% of the performance.

Funny thing: I also decided that I will get a Volta Ti. I just don't know if there will be a Volta Ti, when it will be launched, or how much it will cost. :p

Until higher-res VR comes out I don't have a need to upgrade.

:eek::eek::eek: The [H]orror!!!

You should check the 4K 120 Hz kits for 27" and 39" panels. I am on the "founders edition" buyer list. We will always need to upgrade. PERIOD.
 
I for one can't wait for the 2050 Ti or 1150 Ti or whatever it'll be called. The Quadro P2000 is just a bit too expensive for me (it's a slightly cut-down 1060 made to fit a 75 W envelope and a single-slot cooler).
 
Wish I knew when it was coming out. My water cooling has been in a box since I needed a warranty replacement on my 1080 Ti, and now I don't know if it's worth putting it back on if I'm going to be eBaying the card in 2-3 months.
 
The only thing I want to know is exactly what power connectors I should be planning for.

:D
 
You would just get slower performance at your desired price bracket, and you can already buy that. I don't see the issue, besides that you may feel you belong in a higher bracket than you do? And since the GTX 7900, wages here have gone up 35-40% or so.
Hey, where did you get your wage data from? Just curious, since you never make your numbers up, but always have a solid source.
 
Most likely a March launch; giving them 6 months for volume production seems about right.
 
The only thing I want to know is exactly what power connectors I should be planning for.

:D

The max that a VGA card can have in terms of PCI-E power connectors is 8+8 pin if a company wants to stay within the PCI-E 4.0 standard and actually be PCI-E certified.
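
For reference, here's the usual back-of-envelope math on what that caps board power at. A quick sketch, assuming the standard figures of 75 W from the x16 slot and 150 W per 8-pin plug:

[CODE]
# Rough PCI-E board-power budget for an 8+8 pin card.
# Assumed standard figures: 75 W from the x16 slot, 150 W per 8-pin plug.
SLOT_W = 75
EIGHT_PIN_W = 150

max_in_spec = SLOT_W + 2 * EIGHT_PIN_W
print(f"Max in-spec board power: {max_in_spec} W")  # prints 375 W
[/CODE]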
 
Maybe, but I think a lot of people will be happy with a 50% bump from the 1080 Ti.

20% increased performance over a 1080 Ti for $650 and I'm a day-one buyer. I'll be coming from a Maxwell Titan X and I can't hold out much longer.
 
20% increased performance over a 1080 Ti for $650 and I'm a day-one buyer.

I'm in the same boat, man. I want to dominate 1440p at 144 Hz with all the eye candy turned on, and if this is the first card that can do that for me across the board, for the most part (there are always going to be those heavyweight AAA titles), then I'm going to have a real hard time not jumping on one.

I have such a hard time waiting for the Ti when I see big performance jumps gen to gen :( It's going to be hard to be complacent with the 1080 and wait for the Ti... although once I'm on the Ti train, it'll be much easier to not get excited about the new XX80 series...
 
I hope they lead with a Titan this generation. I'd much rather drop 10+ Benjamins right away than have to wait a year+ for the cut part.
Plus it gives you the longest lifespan. But no, they want people to buy a GTX 2080 and then a new Titan 6 months later.

C'mon nvidia, earn that money.
 
I do hope AMD brings something competitive again, and SOON! Maybe since Raja is gone, somebody might actually let Vega out of the box.

Or maybe it is ... kind of underwhelming.
 
Personally I'm gonna wait till 4K 144 Hz HDR monitors and full-fat Volta both drop, and then sink some serious cash on those two.

I hope that some crazy stuff like dual 4K HDR VR headsets won't be announced next year. Some Chinese startup already managed to get a prototype with dual 4K screens running.
 
I do hope AMD brings something competitive again, and SOON! Maybe since Raja is gone, somebody might actually let Vega out of the box.

Or maybe it is ... kind of underwhelming.

You have to wait for Intel at this point if you want competitive.
 
I hope they lead with a Titan this generation. I'd much rather drop 10+ Benjamins right away than have to wait a year+ for the cut part.
Plus it gives you the longest lifespan. But no, they want people to buy a GTX 2080 and then a new Titan 6 months later.

C'mon nvidia, earn that money.

The last two Titans have been 3 months after the small chip. At least it's not a 10-month wait anymore like the first two Titans.
 
The last two Titans have been 3 months after the small chip. At least it's not a 10-month wait anymore like the first two Titans.

Yep, my current Titan was an excellent buy. I'll be waiting for another Titan and will buy again. Looks like I'll be getting close to 2 years out of this one.
 
Personally I'm gonna wait till 4K 144 Hz HDR monitors and full-fat Volta both drop, and then sink some serious cash on those two.

I hope that some crazy stuff like dual 4K HDR VR headsets won't be announced next year. Some Chinese startup already managed to get a prototype with dual 4K screens running.

That sounds interesting. Link?
 
That sounds interesting. Link?

Acer/AOC/Asus all announced 4K 144 Hz HDR G-Sync AHVA panels at Computex this year, though they all look to be delayed to Q1 2018.
https://www.asus.com/us/Monitors/ROG-SWIFT-PG27UQ/
https://www.acer.com/ac/en/US/press/2017/255816
https://www.144hzmonitors.com/monitors/aoc-ag273ug-aoc-ag353ucg/

I just noticed AOC has a 35" 1440p ultra-wide 200 Hz HDR G-Sync monitor planned. 27" 4K or 35" ultra-wide 1440p, hmm...

If you're asking about the VR headset, here:
 
Acer/AOC/Asus all announced 4K 144 Hz HDR G-Sync AHVA panels at Computex this year, though they all look to be delayed to Q1 2018.
https://www.asus.com/us/Monitors/ROG-SWIFT-PG27UQ/
https://www.acer.com/ac/en/US/press/2017/255816
https://www.144hzmonitors.com/monitors/aoc-ag273ug-aoc-ag353ucg/

I just noticed AOC has a 35" 1440p ultra-wide 200 Hz HDR G-Sync monitor planned. 27" 4K or 35" ultra-wide 1440p, hmm...

If you're asking about the VR headset, here:


They're all going to suck until Displayport 1.5 (or 2.0, or whatever the next version ends up being) and/or HDMI 2.1 are actually implemented. Considering the next Displayport standard isn't even a thing yet, and HDMI 2.1 is only tentatively set to be finalized in December, you have a long wait in front of you if you want a product that actually has either of them.
 
They're all going to suck until Displayport 1.5 (or 2.0, or whatever the next version ends up being) and/or HDMI 2.1 are actually implemented. Considering the next Displayport standard isn't even a thing yet, and HDMI 2.1 is only tentatively set to be finalized in December, you have a long wait in front of you if you want a product that actually has either of them.
Displayport 1.4 already has enough bandwidth for 4K HDR at 144 Hz using DSC, and all Pascal cards already have DP 1.4 outputs.
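
A rough sanity check on that claim, back-of-envelope in Python; I'm assuming HBR3's 32.4 Gbit/s raw rate with 8b/10b encoding, 10-bit RGB for HDR, and ignoring blanking:

[CODE]
# Does 4K HDR @ 144 Hz fit in Displayport 1.4? (assumptions noted above)
pixels = 3840 * 2160
bpp = 30                                  # 10 bits per channel, RGB
refresh_hz = 144

uncompressed_gbps = pixels * refresh_hz * bpp / 1e9  # ~35.8 Gbit/s
link_gbps = 32.4 * 8 / 10                            # ~25.9 Gbit/s after 8b/10b
with_dsc_gbps = uncompressed_gbps / 3                # ~11.9 Gbit/s at ~3:1 DSC

print(f"uncompressed: {uncompressed_gbps:.1f} Gbit/s (over the {link_gbps:.1f} link limit)")
print(f"with ~3:1 DSC: {with_dsc_gbps:.1f} Gbit/s (fits)")
[/CODE]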
 
Displayport 1.4 already has enough bandwidth for 4K HDR at 144 Hz using DSC, and all Pascal cards already have DP 1.4 outputs.

#1, DSC is a kludge, #2, the ones that are announced so far do not support DSC afaik.

There's also no guarantee that DSC won't have additional latency attached to its use.
 
#1, DSC is a kludge, #2, the ones that are announced so far do not support DSC afaik.

There's also no guarantee that DSC won't have additional latency attached to its use.
  1. You have enough information to call DSC a kludge? I don't think a kludge could be visually lossless like DSC is. DSC is on version 1.2 in Displayport 1.4 and is now 5 years old.
  2. Specs have not been finalized on the monitors, so yes, we don't know. However, if you look at Linus' video from CES we can see that it is connected using only one Displayport cable.
  3. Maximum latency added by the DSC decoder stage is only up to 8 μs, or 0.008 ms.
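
To put that 8 μs figure in perspective, here's my quick arithmetic, assuming a 144 Hz frame and 2160 active lines:

[CODE]
# How big is 8 us of DSC decode latency at 144 Hz?
frame_time_us = 1e6 / 144            # ~6944 us per frame
line_time_us = frame_time_us / 2160  # ~3.2 us per scanline
dsc_latency_us = 8

print(f"{dsc_latency_us / frame_time_us:.3%} of one frame")       # ~0.115%
print(f"~{dsc_latency_us / line_time_us:.1f} scanlines of delay")  # ~2.5
[/CODE]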
 
  1. You have enough information to call DSC a kludge? I don't think a kludge could be visually lossless like DSC is. DSC is on version 1.2 in Displayport 1.4 and is now 5 years old.
  2. Specs have not been finalized on the monitors, so yes, we don't know. However, if you look at Linus' video from CES we can see that it is connected using only one Displayport cable.
  3. Maximum latency added by the DSC decoder stage is only up to 8 μs, or 0.008 ms.

"Visually lossless" is a meaningless term. People called mpeg2 "visually lossless" and terms like that as well. People called 4:2:0 "visually lossless" and terms like that as well.

At best, DSC will be a deterministic compression algorithm, which works at the packet level.

At worst, DSC will be a semi-deterministic compression algorithm, which works at the frame level.

http://www.vesa.org/wp-content/uploads/2014/04/VESA_DSC-ETP200.pdf

https://mipi.org/sites/default/files/Hsinchu-Hardent-Create-Higher-Resolution-Displays.pdf

Don't buy into hype, dig deep.

The latency hit is obvious if you know how G-Sync works in the first place.

(Even worse latency if it is implemented the way AMD's adaptive sync is.)


The standards documents are talking out of both sides of their mouths.

They are taking the "visually lossless" claim from the full frame compression type.

They are taking the "low latency" claim from the bit-line compression type.


DSC isn't magic.

The reason they are "hyping" it is due almost entirely to them not wanting to spend 1/100th of a cent more for production per cable by upping the transport standards.
 
The original point is that 4K HDR @ 144 Hz is still possible with Displayport 1.4.
 
The original point is that 4K HDR @ 144 Hz is still possible with Displayport 1.4.

Don't run away.

This is just getting to the fun part :p.

I just got done spoon-feeding you; you don't get to quit already. Where's the fun in that? :D
 
DSC isn't magic.

Also, it sounds like DSC really isn't intended as a long-term solution for higher-res displays; it's aimed more at low-power applications, letting mobile devices save power by lowering the signaling rate, and with it the power consumption.

The reason they are "hyping" it is due almost entirely to them not wanting to spend 1/100th of a cent more for production per cable by upping the transport standards.

Well, hang on there: if you think that high-speed cables are cheap, you should take a look at the prices some time. It is getting hard to keep making cables faster and faster, and the problem is physics. To get more data down a given set of wires, you have to increase either the frequency bandwidth or the SNR. SNR increases are pretty much a non-starter, so you are left with frequency increases. Thing is, as frequency goes up, it gets harder and harder to get the signal down the wire: you get more cable losses, more noise leaking in, and more reflections/crosstalk.

For a good example, look at 1-gig vs 10-gig copper Ethernet. 1-gig officially works over Cat 5e, but will really work over Cat 5 out to 100 meters; 10-gig requires Cat 6a for 100 meters. Then look at the differences between the cables, both in price and in construction, to see what it takes to make Cat 6a instead of Cat 5: they are thicker, have tighter tolerances, use separators, and so on.
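
The bandwidth-vs-SNR trade-off here is basically Shannon capacity, C = B * log2(1 + SNR). A toy sketch with made-up numbers, just to show why SNR gains buy so little compared to frequency bandwidth:

[CODE]
import math

# Shannon capacity: C = B * log2(1 + SNR).
def capacity_gbps(bandwidth_ghz, snr_db):
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_ghz * math.log2(1 + snr_linear)

base = capacity_gbps(1.0, 20)         # a 1 GHz channel at 20 dB SNR
print(capacity_gbps(2.0, 20) / base)  # double the bandwidth -> 2.00x capacity
print(capacity_gbps(1.0, 23) / base)  # double the SNR (+3 dB) -> only ~1.15x
[/CODE]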

This gets even harder if you want the cable to maintain low latency. There are some tricks to pack in more data at a lower frequency, basically making more efficient use of the spectrum, but more complex signaling adds cost to the transmitter and receiver, and it also adds latency. Straight binary serial signaling is extremely fast; doing something like QAM (as cable modems do) is more complex and adds latency.
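
On the QAM point, the payoff per symbol is logarithmic, which is why the complexity (and latency) climbs fast. A toy comparison, again with my own illustrative picks:

[CODE]
import math

# Bits carried per symbol: plain binary vs denser modulation schemes.
# Denser schemes need a lower symbol rate for the same bit rate, but the
# transceiver has to resolve finer signal levels: more DSP, more latency.
for name, levels in [("NRZ (binary)", 2), ("PAM-4", 4), ("256-QAM", 256)]:
    print(f"{name}: {math.log2(levels):.0f} bits/symbol")
[/CODE]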

Interconnect speed is a big issue in computing in general, not just displays. It is a PITA to make cables that can reliably pass higher and higher signaling speeds. It's not an unsolvable problem, but it is more than just adding a fraction of a cent to manufacturing costs.
 