NVIDIA Releasing Three GTX 800 Graphics Cards In October?

@Dayaks ... You're being too literal for the sake of proving a point. I think everyone knew what I meant. Any time Nvidia releases a new model of a video card, or AMD releases something new, they leapfrog one another in terms of single-digit frames; not always, but normally.

The point being (and why am I explaining this to you? Oh, the literal part) that I think the best bet is waiting on 20nm if you already have a 290 or 780.

2015 is right around the corner. We've been on 28nm for a long, long while.
 
I completely agree. I was a little trolly.
 
I can't wait till October. I normally buy at least a pair of cards from AMD and NV every cycle, so now it's time to go back to NV until AMD answers.
 
Looks like a good time to buy an EVGA card and use their step-up program.
 
Everything in 2014 was delayed to 2015.
So 2015 is basically two years of game releases crammed into one year. Blame next-gen consoles.
 
Can someone explain the ARM feature and why it's supposedly such a game-changer? Is 28nm with ARM a big enough deal on its own, or is it better to wait for 20nm?

The ARM core is supposed to decrease overhead for PCs, a way of increasing performance for people who don't have beefy CPUs. Think of it as a coprocessor. Correct me if anybody can be more specific.
 
Nope, you're correct from what I understand. But are they even bothering with the ARM side on the 800 series, with it being 28nm? I thought they weren't going to put that on this set. I honestly don't know. :confused:
 
The ARM core is supposed to decrease overhead for PCs, a way of increasing performance for people who don't have beefy CPUs. Think of it as a coprocessor. Correct me if anybody can be more specific.

For CUDA-type applications it is amazing, because one of the bottlenecks is having to execute non-parallel code on the CPU. And right now, for NVIDIA, that means taking your data back across the bus to system memory so you can do the work on the CPU, and then, when you're done, sending all the data back to GPU memory. There is quite a tangle of coordination for the programmer to deal with.

That's like driving a few miles to do five minutes of work when you could have just done it from home on your computer. So if they could unify the memory that would be a huge advantage. So basically all the same advantages we hear about with HSA and hUMA, etc.
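A minimal sketch of that round trip, in Python purely for illustration: the `copy_*` helpers are invented stand-ins for cudaMemcpy-style transfers across the bus, and nothing here touches a real GPU.

```python
# Schematic of the GPU<->CPU round trip described above.
# copy_* functions stand in for PCIe transfers (cudaMemcpy-style);
# the "GPU" and "CPU" steps are ordinary Python for illustration.

def copy_device_to_host(device_mem):
    return list(device_mem)            # models a DtoH transfer over the bus

def copy_host_to_device(host_mem):
    return list(host_mem)              # models an HtoD transfer back again

def gpu_parallel_step(device_mem):
    return [x * 2 for x in device_mem]     # the parallel-friendly work

def cpu_serial_step(host_mem):
    return [x + 1 for x in host_mem]       # the non-parallel code the CPU must run

def pipeline(device_mem):
    device_mem = gpu_parallel_step(device_mem)
    host_mem = copy_device_to_host(device_mem)   # trip 1: across the bus
    host_mem = cpu_serial_step(host_mem)
    return copy_host_to_device(host_mem)         # trip 2: back to GPU memory

print(pipeline([1, 2, 3]))  # [3, 5, 7]
```

With unified memory, the two `copy_*` calls and the coordination around them would disappear: the serial and parallel steps would simply operate on the same allocation.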

Whether or not any of that would carry over to game programming, I'm not competent to say. But that would require developers putting in vendor- and architecture-specific code to run on the card's CPU. And you know what it's like getting developers on board.

The long-term play is that it would remove a dependency on Intel in certain high-end scenarios, including HPC or custom game consoles. Basically Tegra on steroids.


Of course, there really hasn't been any talk of this ARM CPU idea from Nvidia for a long, long time, and it has mysteriously disappeared from product schedules. So I think they dropped the idea. Which, to me, is evidence that they intend to de-emphasize their involvement in the HPC market.
 
Yo dawg, I just heard Volta is coming back and it's gonna be 16nm FinFET taping out in early Q2'15, right after the really real 20nm 3rd-gen Maxwell 900-series drops! :D

TSMC claims that all scheduled 16nm tape-outs will be done by the end of 2014, so we'll see. Of course... schedules do change.
 
Nvidia-branded super anti-static, of course! :D
 
TSMC claims that all scheduled 16nm tape-outs will be done by the end of 2014, so we'll see. Of course... schedules do change.

Yeah, I totally made that stuff up about 16nm Volta.. did WCCFtech pick up the story via Sweclockers yet?

Knowing TSMC, another delay seemed completely believable. Making the delay so slight kept it optimistic for the reader. The Volta part was complete hogwash. I think that's the formula for a good tech rumor: something believable, something hopeful, something bullshit. And if the rumor catches on and is repeated enough, even the manufacturer starts to believe it, and things start happening.

Now everybody repeat, 800-series to be more powerful, cheaper and coming in.. August! ;)
 
TSMC claims that all scheduled 16nm tape-outs will be done by the end of 2014, so we'll see. Of course... schedules do change.

Well, they don't even have 20nm in proper production; we won't see it until probably Q2/Q3 2015. Q2 is if they do a refresh on the smaller process tech, but that's optimistic, I think. Unless they plan to use it on the lower-end 800-series chips and put out a revision of the high end later on.
 
Yeah, I totally made that stuff up about 16nm Volta.. did WCCFtech pick up the story via Sweclockers yet?

Knowing TSMC, another delay seemed completely believable. Making the delay so slight kept it optimistic for the reader. The Volta part was complete hogwash. I think that's the formula for a good tech rumor: something believable, something hopeful, something bullshit. And if the rumor catches on and is repeated enough, even the manufacturer starts to believe it, and things start happening.

Now everybody repeat, 800-series to be more powerful, cheaper and coming in.. August! ;)

The reveal is, at least! The only thing is that the availability won't be that great. The 680 case all over again...
 
Well, they don't even have 20nm in proper production; we won't see it until probably Q2/Q3 2015. Q2 is if they do a refresh on the smaller process tech, but that's optimistic, I think. Unless they plan to use it on the lower-end 800-series chips and put out a revision of the high end later on.

They have no reason to delay 16nm, a.k.a. their 20nm FinFET. The 16nm FinFET+ process was 95% ready in April '14 and uses mainly the same tools/equipment as 20nm planar.

TSMC's 20nm process is expected to account for ~20% of TSMC's wafer revenue in Q4 2014, which is in line with what 28nm did in Q4 '12: 22%.

New 28nm GPUs at the end of '14.
20/16nm GPUs Q2/Q3 '15.
16nm FinFET+ Q1/Q2 '16.

TSMC's 10nm, which is actually 14nm FinFET, is supposed to be entering risk production in Q4 '15. That could definitely be delayed, because it requires EUV.
Best case for GPUs on TSMC's 10nm is Q4 '16, but likely sometime in '17.
 
Return of the 6-month product cycle? :3
800 series buyers gonna be maaaaad.

Including me.

Well, it could certainly be stretched out longer than that, and it will be highly dependent on capacity, since 16nm FinFET and FinFET+ will be highly desirable for the mobile markets.
 
Return of the 6-month product cycle? :3
800 series buyers gonna be maaaaad.

Including me.


I daydream too. :) Sadly, they're just fantasies. Given we're already approaching the 18-month mark right now, I'd say even a rapid release cycle of 800 > 900 in under a year would really just kind of even it out. The last time they did that was 400 > 500, for noise, heat, and power consumption reasons.

I have a feeling neither GPU maker wants to blow their load anymore after this TSMC 20nm delay. We are rapidly approaching the end of silicon, with still no real replacement. Getting to 10nm and below is going to be a challenge. I don't understand TSMC's node comparisons to others like Intel's, but LordEC911 seems to be able to see through the labeling.
 
Even if it is 1 year, AMD could beat Nvidia to the punch; they'll have something out within 6 months of the first Maxwell cards, maybe sooner.

Some kind of magical Pirate Islands release could turn the tides.
Could be another case of the 5870 --> GTX 480 which took a good few months. Or 7970 --> GTX 680, but flip AMD/Nvidia.

Although they're talking about Tonga (snooooze) and 290 XT (snoooooze).
Point is, if you choose to sit around and wait for Nvidia then you could die a sad, old, lonely, Maxwell-less man.
 
I think I'll milk my GTX 670 for what it's worth till games start playing badly.
Someone said to wait for the 2nd cycle of cards, so the 900 series it will be.
The thing is, I paid $400.00 for my GTX 670 back in 2012. How many years did I get out of it? Two so far.

This will be a good upgrade for G-Sync monitors.
 
I might follow your line of thinking and stick with what I have or go three way on the cheap (used).
 
The driver thing is/was way overblown, simply because there are more Nvidia fanboys, seeing that AMD can actually compete with them, if not better them.

I dunno. I've done both team red and team green, both CrossFire and SLI, and my experience has led me to a few conclusions:

1) Xfire/SLI is cool, but getting drivers that fix bugs or performance is significantly slower than with a single-card solution, regardless of Nvidia or AMD.

2) Nvidia cards/drivers seem to cause me fewer issues in-game from day one.

3) Games get patched for Nvidia cards sooner if there is a bug specific to that product.

4) Nvidia drivers drop more often, so there are more chances for a bug fix if you ran into a bug.

5) Nvidia drops more drivers for a given card before it gets ignored in the bug-fix/performance-boost area. AMD drops fewer drivers for a given card before ignoring it, but does so over a longer span on the calendar.

I'm less willing to spend as much on the bleeding edge these days, and my conclusion is to just buy the fastest single-GPU Nvidia card, less often. It has resulted in the minimum of hassles, despite the AMD card being a better value for money from a technology standpoint lately. FPS per dollar doesn't really matter to me, given that almost everything is gimped for cross-platform support of consoles. I pay Nvidia a bit more so they pay the studios a bit more and give them a bit more support, and I get a less problematic experience.

With AMD in the console space and the trend toward lower-level hardware interfaces, that will likely undermine my strategy in general, but it's taken quite a while, and it still hasn't actually changed the situation yet.
 
I think I'll milk my GTX 670 for what it's worth till games start playing badly.
Someone said to wait for the 2nd cycle of cards, so the 900 series it will be.
The thing is, I paid $400.00 for my GTX 670 back in 2012. How many years did I get out of it? Two so far.

This will be a good upgrade for G-Sync monitors.

I have the same thinking. The 680 has been a solid performer and until more demanding games start to hit the market, I don't see much point in jumping on the next release as soon as it hits.

My thinking will most likely change as soon as the next-gen console games start porting their way over to PC. I'm looking forward to GTA V and The Witcher 3 as a good gauge of next-gen performance on PC hardware.

This will be the first time in a long while I've skipped a couple of generations.
 
I can't break the division-of-2s pattern I have been going by. (Just realized I was doing it. lol) Left AMD at the 1800XT for the 6800, then the 8800 series, a GTX 260-216, a GTX 470, and GTX 660 Ti SLI. Thinking of going with the top end for once and getting just one GTX 880. I would love to hold out till next year, but who the hell are we all kidding? You know you are going to cave in. I am just not going to fight it. lol :p
 
I heard that. My chance to finally build a respectable, stupidly overpowered rig after almost 10 years is coming up in October, and brother, I'm getting me some HORSEPOWER. I figure 2 will do the job just fine, but if they are at the price point people keep talking about, I might just go ahead and get 3 of them for shits and giggles. :)
 
I think I'll wait for the 28nm Titan II and skip the early 800-series releases. From the leaked benchmarks it's still far too slow to even hit 30 fps in plenty of current-gen games at 4K (even with no AA), while 1080p is still easy enough to drive that a 780 can handle it for probably the next 2 years. And after having SLI 285s, CF 6970s, and SLI 780s, I've pretty much given up on multi-GPU; I'm too sensitive to microstutter, and the noise of multi-GPU is also pretty hard to deal with if you play a lot of games with dialogue/cutscenes. It will take at least an overclocked Titan II that's 70-80% faster than a stock GTX 880 to play 4K at a reasonable level with current-gen games (aka ~45fps+ in Crysis 3 or BF4 with no AA).
 
There really are no leaked benchmarks though; just the images people completely fabricated in Photoshop. So it's too early to really know anything yet.
 
It will take at least an overclocked Titan II that's 70-80% faster than a stock GTX 880 to play 4K

Yes, because the original Titan overclocked was that much faster than a 780......:rolleyes:

Also, where are those 4K benches for an 880 that the "Titan II" could compare itself against at 4K?

Actually the Titan III will be the card to look for....:confused:
 
The original Titan was 40% faster than a 680, and overclocked on air could be almost 100% faster.

I don't know why you think the performance delta between a 430mm^2 card (GTX 880) and a 600mm^2 card (Titan II) is going to be similar to that between the Titan (551mm^2) and the GTX 780 (551mm^2), but I guess we are just going to have to disagree.
 
The Titan has nothing to do with a 680. <----period.

And if that is the case, the price/performance will only be appealing to users who actually need the technology.
 
Nvidia, as always, will hold back the fully unlocked GPU till the end. Just like when they released the GeForce 680 and only now the fully unlocked GeForce 780 Ti.

This. I'll probably wait till next summer, whenever they release the 880 Deluxe/Ultra/Plus, etc. If I get the itch to upgrade I'll just pick up a third GTX 680 for cheap.
 
That would be your best bet, and possibly even hold out longer, because who knows, maybe we will get custom 880 cards in that time, and then along will come the "Ti" to save the best for last. I just wonder what power usage will be like and if they will come with single-slot designs.
 
The original Titan was 40% faster than a 680, and overclocked on air could be almost 100% faster.

Whoa, that math suggests +50% performance on overclock (GPU and VRAM)... very, very few original Titans would hold 1240 MHz even with a BIOS flash. Many were in the 1050-1125 MHz range. The 6GB of VRAM definitely didn't overclock well at all; realistically 100-150 MHz more, max, reliably.
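Spelling out the arithmetic behind that objection, with the multipliers taken from the quoted claim (a quick sanity check, not a benchmark):

```python
titan_stock = 1.40  # "40% faster than a 680" -> 1.40x a GTX 680
titan_oc    = 2.00  # "almost 100% faster"    -> ~2.00x a GTX 680

# The overclock alone would have to supply the ratio between the two:
oc_gain = titan_oc / titan_stock - 1
print(f"{oc_gain:.0%}")  # 43%
```

So "almost 100% faster than a 680" implies roughly a 43-50% overclock over stock, which is the clock headroom being disputed here.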

Nvidia, as always, will hold back the fully unlocked GPU till the end. Just like when they released the GeForce 680 and only now the fully unlocked GeForce 780 Ti.

The 780 Ti 6GB is still MIA...
 
The original Titan was 40% faster than a 680, and overclocked on air could be almost 100% faster.


Wait, are you sure about those numbers? I don't recall the Titan being near that fast. Maybe in some cases, but not the average.

Guru3d GTX 780 Ti Reference Review:

So yeah, the GeForce GTX 780 Ti is a clusterfrack of performance alright, man, what a fast card, exceptional really. We had a great time testing this one GPU, but as you know, we'll be showing SLI scaling versus Ultra High Definition game performance in another article as well. Now, coming from a GeForce GTX 680 you can expect a good 35% more performance, and looking upwards from a GeForce GTX Titan, the performance difference is roughly 5 to 10%.

So that hints at the original Titan being at best 30% and at worst 25% faster than the GTX 680.
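As a quick check of that inference, treating the review's percentages as multipliers (numbers from the quote above, nothing measured here):

```python
ti_over_680   = 1.35          # 780 Ti vs GTX 680: "a good 35% more"
ti_over_titan = (1.05, 1.10)  # 780 Ti vs Titan: "roughly 5 to 10%"

# Divide out the 780 Ti to get Titan vs GTX 680 at each end of the range:
for d in ti_over_titan:
    print(f"Titan over 680: {ti_over_680 / d - 1:.0%}")
# prints 29% and then 23%
```

So the review's own numbers put the original Titan roughly 23-29% ahead of a GTX 680, in the same ballpark as the 25-30% estimate.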
 