Maxwell GM200 pictured

This should be 50% faster than the 980. Where I currently get a laggy 67FPS, I will have 100FPS.
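Here's the back-of-the-envelope math behind that as a quick Python sketch; the 1.5x factor is just the rumored speedup, not a measured number, and real games rarely scale that linearly:

```python
# Quick sanity check on the claim above, assuming the rumored 50% uplift
# over a GTX 980 translates one-to-one into frame rate (real games rarely
# scale that cleanly once CPU and memory limits kick in).
current_fps = 67       # what I get on the 980 today
rumored_speedup = 1.5  # GM200 vs GM204, per the rumor

projected_fps = current_fps * rumored_speedup
print(f"Projected: {projected_fps:.1f} FPS")  # Projected: 100.5 FPS
```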
 
This should be 50% faster than the 980. Where I currently get a laggy 67FPS, I will have 100FPS.

Not sure if sarcasm or you just have a 100Hz display ;)

I imagine the price on these will be insane, but the rumored 3072 CUDA cores and 12GB VRAM...can't type, drooling...
 
This looks very nice but I guess it will cost about £600+

The 390 / 390X will probably be better for price/performance, unless Nvidia decide to stop charging extreme prices for their top-end cards like the Titan, etc.
 
This looks very nice but I guess it will cost about £600+

The 390 / 390X will probably be better for price/performance, unless Nvidia decide to stop charging extreme prices for their top-end cards like the Titan, etc.

$1,500 USD I bet.
 
I am floored seeing people first salivating over a 3072-shader/12 GB GPU and then arguing about el cheapo monitors and connectors. Get a life, guys; this GPU will love something like 4K@40" and hate 1080p, DVI, 21", and so on. I hope it has DP1.3/HDMI 2.0 only, and that it supports pairing DP1.3 outputs to drive future full 8K monitors.

BTW, those specs are for the professional Quadro card with 12GB, which means the consumer card will have 6GB.
 
I am floored seeing people first salivating over a 3072-shader/12 GB GPU and then arguing about el cheapo monitors and connectors. Get a life, guys; this GPU will love something like 4K@40" and hate 1080p, DVI, 21", and so on. I hope it has DP1.3/HDMI 2.0 only, and that it supports pairing DP1.3 outputs to drive future full 8K monitors.

BTW, those specs are for the professional Quadro card with 12GB, which means the consumer card will have 6GB.

And maybe a reduced number of CUDA cores. Even with that said, it looks to be a monster.
 
I am floored seeing people first salivating over a 3072-shader/12 GB GPU and then arguing about el cheapo monitors and connectors. Get a life, guys; this GPU will love something like 4K@40" and hate 1080p, DVI, 21", and so on. I hope it has DP1.3/HDMI 2.0 only, and that it supports pairing DP1.3 outputs to drive future full 8K monitors.

BTW, those specs are for the professional Quadro card with 12GB, which means the consumer card will have 6GB.

But this will power games for the Oculus Rift quite nicely.
 
Yeah, I'm not shelling out $800-$1000 for one Titanic GPU; I'd rather just buy another STRIX 980 and still come out ahead.
 
DP can output to VGA, DVI, and HDMI, so I don't see the point of putting a big honkin' DVI connector on a card anymore.
 
On topic please; take the CRT vs. panel debate to your own thread.
 
Sorry Kyle, I didn't see that, as I was trying to tell the guy he was so very wrong. :p

Maxwell looks interesting. I would like to see DVI ports still be available. Maybe for the less expensive cards ;)
 
Is this 16nm?

Hope nVidia doesn't price this out of the very upper limits of the mainstream segment (more than $700).
 
Is this 16nm?

Hope nVidia doesn't price this out of the very upper limits of the mainstream segment (more than $700).


Very unlikely. It might be 20nm, but if it is, volume will be severely limited.

16nm processes aren't coming online as fast as expected, 20nm yields and volumes aren't improving as quickly as hoped, and at the same time mobile chip manufacturers (Apple, Qualcomm, etc.) are eating up most of the available capacity.

This means 16nm is very unlikely, and 20nm would only be for extremely low-volume, high-end parts in the near future.

I wouldn't be surprised if both AMD and Nvidia keep their entire next gen of GPUs at 28nm. :(
 
Well, at any rate, I hope it is on a smaller process than 28nm. If not, then I'm waiting for Pascal and whatever comes after AMD's 300 series.
 
Looking forward to it; hopefully EVGA puts out an aftermarket cooler for it if nVidia restricts partners to reference coolers.
 
Zarathustra[H] said:
I wouldn't be surprised if both AMD and Nvidia keep their entire next gen of GPUs at 28nm. :(



Oh yeah, as if that truly is the end of the world :rolleyes:
 
6GB VRAM for the consumer card sounds decent... I'm more interested in the other specs though, as 4GB of VRAM is more than enough for me for the time being (GTX 970).
 
DP can output to VGA, DVI, and HDMI, so I don't see the point of putting a big honkin' DVI connector on a card anymore.

Same here, and HDMI is DVI with a different connector as far as the signaling goes; just use a cable with DVI on one end and HDMI on the other if your display only has a DVI port. As far as the cards go, I'm looking forward to them, but I'll be hanging onto my pair of MSI Gaming GTX 970 cards in SLI, OC'd @ 1506MHz core / 8020MHz memory, as a cool/quiet 24/7 gaming overclock for the time being. I'd be kind of tempted to go for one of the GM200 cards or even two :eek: but I don't have the spare money to throw at them right now... I'd much rather wait for 16nm GPUs with HBM to come around from either camp in early 2016 and hang out with what I have for 4K gaming until then. 16nm + HBM together on new chips should be a humongous jump in performance, and that's where I want to lay down my cash.
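For what it's worth, here's the rough bandwidth math behind that memory overclock as a small Python sketch, assuming the GTX 970's 256-bit bus and treating the quoted 8020MHz as the effective GDDR5 data rate:

```python
# Rough memory-bandwidth estimate for the overclock above, assuming a
# GTX 970's 256-bit GDDR5 bus and taking 8020 MHz as the effective data rate.
bus_width_bits = 256
effective_mhz = 8020   # quoted memory overclock (effective rate)
stock_mhz = 7010       # roughly the 970's stock effective memory clock

def bandwidth_gb_s(mhz, bus_bits=bus_width_bits):
    return (bus_bits / 8) * mhz / 1000  # bytes per transfer * MT/s -> GB/s

print(f"OC:    ~{bandwidth_gb_s(effective_mhz):.0f} GB/s")  # ~257 GB/s
print(f"Stock: ~{bandwidth_gb_s(stock_mhz):.0f} GB/s")      # ~224 GB/s
```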
 
I would really rather have 12 GB. I can see four of these driving 3x 4K displays, which has already been shown to max out the VRAM of 6 GB Titans.

Well, if they follow the last gen, the next Titan will be 12GB and the next top GeForce will be half that.
 
Oh yeah, as if that truly is the end of the world :rolleyes:

Didn't suggest it was, but if they do, there will be smaller performance increases this generation, and cards will likely come with larger and louder cooling solutions in order to get more performance out of 28nm.

Not the end of the world. After all, it's just a video card, but it would be nice if we got 20nm GPUs.

That being said, I'm already considering putting together my first custom water-cooling loop, so maybe I shouldn't worry as much about that :p

(Side note: geez, full-cover water blocks are expensive...)
 
Zarathustra[H] said:
Well, if they follow the last gen, the next Titan will be 12GB and the next top GeForce will be half that.

I can see nVidia pulling shenanigans with VRAM...

970ti = 4GB or 6GB depending on bus width
980ti = same as 970ti
Titan2 = 6GB or 9GB
Titan2 Black = 9GB or 12GB
Titan2Z = 12GB (x2)
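For context on why the options come in those pairs: GDDR5 hangs one chip (or two, in clamshell mode) off each 32-bit channel, so capacity falls out of bus width times chip density. A minimal sketch, assuming 512MB (4Gb) chips, which is my assumption and not a confirmed spec:

```python
# Sketch of how bus width maps to the usual VRAM options, assuming 512 MB
# (4 Gb) GDDR5 chips and one 32-bit channel per chip, with an optional
# second chip per channel ("clamshell" mode). The chip density is an
# assumption for illustration, not a confirmed GM200 spec.
def vram_options_gb(bus_width_bits, chip_mb=512):
    channels = bus_width_bits // 32
    return [channels * chips * chip_mb // 1024 for chips in (1, 2)]

for bus in (256, 384):
    print(f"{bus}-bit bus -> {vram_options_gb(bus)} GB")
# 256-bit bus -> [4, 8] GB
# 384-bit bus -> [6, 12] GB
```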
 
Only Intel has made it to 22nm at a production level (as of my last check). If GlobalFoundries or TSMC have made it to 22nm successfully, they've kept very quiet about it. If there's another place chips like that could be made (and it's not some classified secret), I'd like to hear it.
 
Only Intel has made it to 22nm at a production level (as of my last check). If GlobalFoundries or TSMC have made it to 22nm successfully, they've kept very quiet about it. If there's another place chips like that could be made (and it's not some classified secret), I'd like to hear it.

Plenty of new Apple devices are running on TSMC 20nm chips right now. They aren't little media-device chips, either, since they have more transistors than either the 2- or 4-core mobile Haswells.
 
Only Intel has made it to 22nm at a production level (as of my last check). If GlobalFoundries or TSMC have made it to 22nm successfully, they've kept very quiet about it. If there's another place chips like that could be made (and it's not some classified secret), I'd like to hear it.

Intel is shipping 14nm, see Core M 5Y10. TSMC is shipping 20nm, see Apple's processors.
 
^^and a sweet XPS 13 that I recently bought for a new employee :D Has an i7 5500U, 14nm.
 
Plenty of new Apple devices are running on TSMC 20nm chips right now. They aren't little media-device chips, either, since they have more transistors than either the 2- or 4-core mobile Haswells.

Yep, and they are the reason Nvidia and AMD are delayed getting to 20nm. The likes of Apple and Qualcomm are hogging all the 20nm capacity.
 
^ I see this as good news, as we aren't beta testers for new nodes. By the time AMD and Nvidia go for it, all the major bugs should be ironed out.
 