When will Nvidia or AMD release their next-gen cards?

The 78xx series should be coming out soon, but I'm not sure if you consider that "next gen" or not, since it will still be VLIW4
 
Current rumors for high-end Nvidia or ATI/AMD 28nm parts (we've been stuck on 40nm forever, it feels like) are late January through early-to-mid March. Take your guess... I'm going to guess they'll be here at the beginning of February myself.
 
AMD will be out several months before Nvidia releases their next-gen card.
Nvidia will be 6 months late to the party, they'll release a card that costs twice as much as ATI, runs twice as hot, uses twice as much power, and offers 20% to 30% more performance. Then they'll wait 6 months and release a refresh that fixes all the problems I just mentioned.

Yeah, we know the drill.
 
It's only been a year since the GTX 580 and Radeon 6870 launched.

Except those weren't real generations. There hasn't been a major improvement since the HD 5870 and GTX 480, and those were in '09.

Next-gen AMD is releasing Jan/Feb, and Nvidia April-ish.
 
Do you even need new hardware with the latest releases? Seems all I read about is how the games are all console ports and aren't really pushing the systems graphically.
 
Nvidia will be 6 months late to the party, they'll release a card that costs twice as much as ATI, runs twice as hot, uses twice as much power, and offers 20% to 30% more performance. Then they'll wait 6 months and release a refresh that fixes all the problems I just mentioned.

Yeah, we know the drill.

And people will still buy Nvidia's botched overpriced card defying all common sense and logic ;)
 
Yes, even a 580 scaled to the 28nm process would be a great improvement.
 
Nvidia will be 6 months late to the party, they'll release a card that costs twice as much as ATI, runs twice as hot, uses twice as much power, and offers 20% to 30% more performance. Then they'll wait 6 months and release a refresh that fixes all the problems I just mentioned.

Yeah, we know the drill.

hehehe :D
 
What happened to paper launches several months before the product becomes available?? If AMD comes out with a 28nm product in January, we should have a paper launch NOW!

I would like to see some benchmarks soon! :)
 
I'm hoping soon. I'd like to look at a 7950 and sell my current 6950 (with maybe 6 hours on it), or have it drive prices down so I can throw in a second 6950 and try CrossFire.
 
Well, the leaked Nvidia slides point to Q3/Q4 2012 for their big enthusiast card; meanwhile we may see mainstream cards in Q2.

AMD, on the other hand, seems to have their fab situation more under control and will have their enthusiast and mainstream cards out anywhere from Q1 to Q2 2012.

Of course, getting parts out early has market share advantages; meanwhile, though, it does give Nvidia time to adjust their parts to be more competitive.

For example, if Nvidia's parts turn out to be slower, Nvidia will up the reference clocks and may also add more VRAM. However, by then AMD could throw down a price drop, which could further erode Nvidia's market share.

The consumer ultimately wins and such is the benefit of having competition in the market.
(Looking straight at you, AMD; get your act together in the CPU arena!)
 
What happened to paper launches several months before the product becomes available?? If AMD comes out with a 28nm product in January, we should have a paper launch NOW!

I would like to see some benchmarks soon! :)

Didn't nV hold a patent on the paper launch?
 
Bit sad about AMD pushing back the release of the 7970 to the March 2012 timeframe. Adding a second 6970 for this reason alone :p.
 
Power Efficiency

We give shits about power efficiency for our gaming PCs?

This isn't oftforum.

Also, it's funny how EpsilonZer0 believes that AMD magically got TSMC to work better for them than for Nvidia. Good job taking fake slides at face value.
 
Hoping nVidia at least has a paper launch around the time AMD starts releasing the 78xx and 79xx series so I can figure out if I'm buying AMD right away or not.

Unless the 79xx series is absolutely amazing, I'm likely looking at a 7850/7870 or a 660Ti to replace my GTX 460. Power consumption is huge for me with it being in my somewhat small HTPC case under the TV.
 
Do you even need new hardware with the latest releases? Seems all I read about is how the games are all console ports and aren't really pushing the systems graphically.

I certainly do. I am running a 120Hz monitor, and certain games at full graphics settings struggle to get anywhere near a steady 120 FPS. I am running 2x 5850, overclocked. And this is at 1920x1080.

eg games: BF3, The Witcher 2
 
We give shits about power efficiency for our gaming PCs?

This isn't oftforum.


Haha. Efficiency used to be a concern for me because my PSUs were 750W and 850W. Then I said screw it, upgraded to 1050W and 1350W units, and haven't looked back.
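For anyone actually sizing a PSU rather than just brute-forcing it with a bigger unit, the headroom math is simple. A quick sketch (all component wattages below are made-up illustrative numbers, not specs for any particular parts):

```python
# Back-of-the-envelope PSU headroom check.
# Component draws are assumptions for illustration only.

parts_w = {
    "cpu": 130,
    "gpu_1": 250,        # e.g. one high-end GPU under load
    "gpu_2": 250,        # second card for CrossFire/SLI
    "board_ram_drives": 80,
}

psu_capacity_w = 850

# Total estimated load and remaining headroom as a fraction of capacity.
load_w = sum(parts_w.values())
headroom = 1.0 - load_w / psu_capacity_w

print(f"Estimated load: {load_w} W on a {psu_capacity_w} W unit "
      f"({headroom:.0%} headroom)")
```

With these assumed numbers even a dual-GPU rig fits inside an 850W unit with some margin, which is why the jump to 1050W+ is more about peace of mind than necessity.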
 
I certainly do. I am running a 120Hz monitor, and certain games at full graphics settings struggle to get anywhere near a steady 120 FPS. I am running 2x 5850, overclocked. And this is at 1920x1080.

eg games: BF3, The Witcher 2

Upgrade to a better monitor then, IPS/PLS. 120Hz TN panels are garbage.
 
What does this have to do with 120 fps?

Well the only way to get 120 FPS visibly drawn is with a 120Hz monitor.

All 120Hz consumer LCD monitors are crap anyways, so get a better monitor, cap at 60 fps and stop taking placebo.
 
I don't think they are in any hurry. With everything consolized, what's the point? It used to be we'd have to tweak to get 1280x1024... now it's 1080p and up with 2-3 different AA settings without even the need to o/c. Games no longer push the hardware like they used to.

I haven't had a title yet I couldn't play at 1920x1200 crystal smooth with a single 6950.
 
I don't think they are in any hurry. With everything consolized, what's the point? It used to be we'd have to tweak to get 1280x1024... now it's 1080p and up with 2-3 different AA settings without even the need to o/c. Games no longer push the hardware like they used to.

I haven't had a title yet I couldn't play at 1920x1200 crystal smooth with a single 6950.

Yeah, these days it's nothing to get most games to run well on a single high-end card on one 2560x1600 monitor. A few years ago, that was challenging to say the least. As a result, the need to push the boundaries just isn't there, as you said. As long as people drool over "1080P" and everything we see is pretty much a console port for antiquated hardware, we're pretty good with what we have. On the high end there is still some need for more if you have multiple monitors, but such configurations are rare. As a result, it's hard for companies to justify releasing brand new cards as frequently. We might see graphics cards stagnate again like we did with the 8800 GTX and GTX 280 at this point. Not sure yet. We'll know as we near the end of the year or the early part of next year. They may very well push back the new stuff and refresh what we already have.

I just wish I knew. I'd be tempted to add a third GTX 580 if I knew I'd get six months of use out of it.
 
I don't think they are in any hurry. With everything consolized, what's the point? It used to be we'd have to tweak to get 1280x1024... now it's 1080p and up with 2-3 different AA settings without even the need to o/c. Games no longer push the hardware like they used to.

I haven't had a title yet I couldn't play at 1920x1200 crystal smooth with a single 6950.

Then you are not playing BF3. With anything more than low settings, that game will choke a 6950 at that resolution, especially with any AA turned on. Otherwise, sure, most games will work just fine. BF3 is just a beast, but also an indicator of games to come. I am looking forward to Nvidia's new release, especially since I prefer single cards.
 
Bit sad about AMD pushing back the release of the 7970 to the March 2012 timeframe. Adding a second 6970 for this reason alone :p.
http://semiaccurate.com/2011/11/17/more-bits-on-hd7000northern-islandsgcn-leak/

Say what you want about Charlie Demerjian, but his insider sources deliver. January it is!

All 120Hz consumer LCD monitors are crap anyways, so get a better monitor, cap at 60 fps and stop taking placebo.
Just because you can't perceive a clear difference between 60Hz and 120Hz doesn't mean others can't (or that they are lying to themselves, as you suggest). Personally, I can't really tell after about 90. Someone who has been playing fast-paced games at a very high competitive level will likely be able to perceive even higher.

Saying that people can't tell past 60 is like saying nobody runs faster than 10 miles per hour. You can't just make a blanket statement about everyone's abilities, unless it was some astronomical figure like "nobody can tell the difference between 10,000Hz and 20,000Hz", or "nobody can run 500 miles per hour".
 
Upgrade to a better monitor then, IPS/PLS. 120Hz TN panels are garbage.

Most people are perfectly fine with a TN panel, I would rather have 2 GTX 580's and a TN than 1 GTX 560 and an IPS. That said, if I had nothing to upgrade anymore, IPS would be rolling in.
 
Just because you can't perceive a clear difference between 60Hz and 120Hz doesn't mean others can't (or that they are lying to themselves, as you suggest). Personally, I can't really tell after about 90. Someone who has been playing fast-paced games at a very high competitive level will likely be able to perceive even higher.

Saying that people can't tell past 60 is like saying nobody runs faster than 10 miles per hour. You can't just make a blanket statement about everyone's abilities, unless it was some astronomical figure like "nobody can tell the difference between 10,000Hz and 20,000Hz", or "nobody can run 500 miles per hour".

I never said that you can't tell the difference.

With my own testing I preferred the higher-quality monitor and lower frame rate, because it gives me more headroom to improve visuals and not notice (literally) jarring changes in frame rate. The jump from 120 to 60 with vsync on is horrible, and if you're running 120Hz you'd better have vsync on, otherwise you'll be running into all sorts of lovely "out of sync" frames.

I can see it if you're still a fan of Source games (CS:S, for one), or Quake 3. Anything modern is flawless and smooth at 60fps/60Hz; that's what the engine is designed around. Anything more and you get funky things happening, which is why most devs ship with vsync turned on. It's actually helpful.

You can feel the difference of higher frame rates, but you can also feel the difference between barely holding 60 FPS and a steady 60 FPS. The responsiveness is completely different.

TL;DR: Depending on the engine, anything more than 60fps is wasted anyway, so you'd be better suited with a better-quality monitor.
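To put rough numbers on the refresh-rate argument above, here's a quick frame-time sketch. It's just arithmetic on the per-frame time budget; the vsync divisor behavior described in the comments is the standard double-buffered case, simplified:

```python
# Frame-time arithmetic behind the 60 Hz vs 120 Hz debate.

def frame_time_ms(fps: float) -> float:
    """Time budget per frame, in milliseconds, at a given frame rate."""
    return 1000.0 / fps

for hz in (60, 90, 120):
    print(f"{hz:3d} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# With plain vsync on a 120 Hz panel, a missed frame deadline snaps the
# effective rate down to the next even divisor (120 -> 60 -> 40 ...),
# which is why a 120 -> 60 drop feels so jarring: each missed interval
# roughly doubles the time a frame stays on screen.
extra = frame_time_ms(60) - frame_time_ms(120)
print(f"Missing one vsync interval adds {extra:.2f} ms per frame")
```

At 120Hz you only have ~8.3ms to finish each frame versus ~16.7ms at 60Hz, which is why a steady 120 FPS is so much harder to hold than a steady 60.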
 
I'm actually glad both AMD and Nvidia have slowed down... too many refreshes are no good. There should be a minimum of one year between new releases.
 
Yeah, these days it's nothing to get most games to run well on a single high-end card on one 2560x1600 monitor. A few years ago, that was challenging to say the least. As a result, the need to push the boundaries just isn't there, as you said. As long as people drool over "1080P" and everything we see is pretty much a console port for antiquated hardware, we're pretty good with what we have. On the high end there is still some need for more if you have multiple monitors, but such configurations are rare. As a result, it's hard for companies to justify releasing brand new cards as frequently. We might see graphics cards stagnate again like we did with the 8800 GTX and GTX 280 at this point. Not sure yet. We'll know as we near the end of the year or the early part of next year. They may very well push back the new stuff and refresh what we already have.

I just wish I knew. I'd be tempted to add a third GTX 580 if I knew I'd get six months of use out of it.

Agreed. I'm running 3D Vision on a pair of 570s, and the only thing that was choking it at 1920x1080 is Rift. Otherwise things like Crysis 2, Batman, and Skyrim are running maxed and without problems.

Though, I might get a pair of 670s to run at some 32x forced AA and 3D with decent frame rates :)
 