We're all gonna love Kepler.....but ask yourself this : Do we really NEED Kepler?

Status
Not open for further replies.

ShuttleLuv

Supreme [H]ardness
Joined
Apr 12, 2003
Messages
7,295
I don't think there's a reason outside of multi-display setups personally, but I want one bad.
 
But a couple SLI cards currently out can provide that with some clocking and tweaking.
 
Speak for yourself. I have a 4870X2 that sounds like a jackhammer when it heats up, and I'm tired of cleaning it out and of the driver issues. It was good value for $370 3+ years ago, but I'm ready to move on.
 
28nm goodness done right, with at least a 35% performance increase over the GTX 560 Ti 448 for $300. I don't need it, but I value it, want it, and am prepared to buy it.
 
Yes, and on single displays too. 120hz and high IQ require a lot of GPU power.
 
Yep, need it for 3D. I plan on buying three at launch if the rumors pan out. I might hold back, however, if there is a 780 or a much better GK110 coming out in the fall when I do my Windows 8 build.
 
Need?

No, but if a single Kepler card will let me max all the eye candy settings on my 30" 2560x1600 monitor, then...

I WANT!!!!!!!! :D
 
See, if we had UHD displays (7,680×4,320, as is supposed to happen within the next 10 years)... you would be asking, "Are dual SLI'd Keplers even enough for a single display?" To which the answer would probably be no. Especially as 30fps is shitty and annoying at any resolution. :D
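For scale, the jump from 1080p to that UHD resolution is easy to quantify. A quick sanity check, just plain arithmetic on the pixel counts above, nothing vendor-specific:

```python
# Pixel counts: UHD (7680x4320) vs. today's common 1080p panel.
uhd = 7680 * 4320  # 33,177,600 pixels per frame
fhd = 1920 * 1080  #  2,073,600 pixels per frame

print(uhd / fhd)  # 16.0 -- sixteen times the pixels to shade every frame
```

So even if a pair of SLI'd Keplers doubled today's single-card throughput, a 16x pixel load would still leave them far short at playable frame rates.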
 
If I can have the same performance as a GTX 580 while producing half the heat, I will be quite happy.
 
Yes we do in the sense of competition and possibly bringing down the prices of 28nm GPUs as a whole.
 
Yes, true, but they are such a small minority. Even fewer than triple-display users, methinks.

I saw the stats somewhere on Rage3D: 120Hz display users far outweigh triple-display users. For serious gamers, Eyefinity is fail; 120Hz is the better option by far.
 
I saw the stats somewhere on Rage3D: 120Hz display users far outweigh triple-display users. For serious gamers, Eyefinity is fail; 120Hz is the better option by far.

Well you could do both like me and have 3D Surround.;)
 
Well you could do both like me and have 3D Surround.;)

Yes, that's true, but pushing 120fps at high IQ across 3 displays is beyond current tech's capabilities. It already takes tri 580s or tri AMD 7-series to run 120fps at 1080p in BF3, for example, with some form of transparency AA.
 
Yes, that's true, but pushing 120fps at high IQ across 3 displays is beyond current tech's capabilities. It already takes tri 580s or tri AMD 7-series to run 120fps at 1080p in BF3, for example, with some form of transparency AA.

See, this is why we need Kepler.;)
 
Yup, agreed, but I doubt it will do it either. 2x Kepler may just be able to push 120fps at 1080p, but 4 will not push 3x 1080p.

For me, a single 27" 120Hz display coupled with a single 27" 1440p IPS display is the go: best of both worlds.
 
Find me a single-GPU solution that gets me 120fps in BF3 set to ultra.

Oh right, it doesn't exist. My 480 SLI can't even handle it.
 
Yeah, I have tri 580s that just manage to sit on a solid 125fps cap in BF3 using 4x TrSSAA, but a SB chip @ ~5GHz is also needed. Tri 7-series is needed to achieve the same results... hopefully dual Kepler can do the job.
 
Yes, we always need the new generation, because it doesn't just increase performance but does it at the same price and power consumption. Hence 'poor people' like us get the previous gen's $500 performance at $300, with much reduced power consumption as an added bonus.
Even if price is no issue, as your thread title suggests, there still isn't a single GPU which runs Crysis 2 at 1080p maxed out at a constant 60fps without breaking a sweat. As for Metro, the number is <50fps for current flagship cards.
 
We need Kepler way more than we need newer, more powerful CPUs!

So HECK YEAH!

3D gaming, Metro 2033, BF3, 3D Surround gamers: best single-card solution available!! I want a 6GB VRAM or better card; hopefully one is in the works. I'll wait for the double-memory MSI TF version, and I hope that doesn't take too long to get released... :eek: Christmas in Spring/Summer, F-YEAH! Me Likey Likey :D Fudzilla is saying March 23 for the GTX 680...
 
I have limited vision and can't see the edges of a monitor larger than 20", so I'm keeping my 20.1" 1680x1050 NEC 20WMGX2 long-term, and I have a backup 20WMGX2 in case this one should fail. I have been hoping that new monitor tech would result in smaller monitors with high resolution, but that doesn't seem to be happening any time soon. Of all things, the iPad 3 is coming close to what I want.

In the meantime, a single 2GB GTX 680 would be ridiculous overkill for 1680x1050 and would last me 3 years easily, just as my old 8800GTX did and my current GTX 275 has. With the smaller resolution, I need to jack AA up as high as possible.
 
I'd like a single card that can drive a 2560x1440 display reasonably well for now.

Hoping one of these will be it if I can get a job to buy one.
 
That was harsh Kyle lol...

Well...I love seeing new technology so bring on Kepler but honestly I don't play a single game that needs one. Or even a 7970 for that matter.
 
That was harsh Kyle lol...

Well...I love seeing new technology so bring on Kepler but honestly I don't play a single game that needs one. Or even a 7970 for that matter.

Sometimes the truth is just Harsh.
 
That was harsh Kyle lol...

Well...I love seeing new technology so bring on Kepler but honestly I don't play a single game that needs one. Or even a 7970 for that matter.

I'll be able to chew up three of them with 3D Surround. For single screen gaming at 1920x1200 there probably wouldn't be a lot of benefit with current generation games but plenty of people will be able to take Kepler to the limits.

So will Kepler go on sale Thursday or next Friday? I'm hoping it's Thursday so that I'll have the weekend to play.
 
Why does Sonda5 still think it will be $300?

I highly doubt it will be $300. I did hear multiple sources say $300 was the original target, but over the past few weeks, sources say NVIDIA plans to jack the price up to $550, which is 7970 pricing. Kind of lame of NVIDIA to do this, considering this new Kepler chip isn't supposed to be "high end", just a mid-range card. It's just that the performance is so good compared to the 7970 that they can market it as high end and milk the extra money out of us.
 
Why does Sonda5 still think it will be $300?

Since this is an "ask yourself this" thread...


ask yourself this: "Why would you keep reading sonDa5's posts?"


I highly doubt it will be $300. I did hear multiple sources say $300 was the original target, but over the past few weeks, sources say NVIDIA plans to jack the price up to $550, which is 7970 pricing. Kind of lame of NVIDIA to do this, considering this new Kepler chip isn't supposed to be "high end", just a mid-range card. It's just that the performance is so good compared to the 7970 that they can market it as high end and milk the extra money out of us.

There's no such thing as "supposed to" for something that hasn't even been released to begin with. It's whatever NVIDIA wants it to be. Hell, we don't even know if GK110 actually exists or if it was some pipe dream NVIDIA came up with. My guess is they tried and GK110 was a complete failure, so they went to their backup plan instead. I mean, think about it: if the 680 is a (rumored) 190W TDP chip, and you look at the size difference between what GK110 should have been and GK104, the TDP must have been insane.
 
Does it BLEND?!!! If it can keep up with my monitor's capability of doing near 120Hz, of course! :)
 
I think I might get one this round to tide me over until the GK110s come out, then I'll go SLI with those and pass this one off to my secondary rig. My goal is to eventually go 3D Surround with three 27" Acer 3D monitors. The delays are making the monstar rig in my sig cry, so I guess I'll hold out on 3D Surround until next year.
 
I don't think there's a reason outside of multi-display setups personally, but I want one bad.
Let me share my experience here, which I think is more similar to a common user's setup: 1920×1080, no 3D or any fancy display setup. Currently I'm using a GTX 295, which is similar to a GTX 480 in terms of performance, apart from its lack of DX11.

There are already quite a number of games that I cannot run at full quality, even with no AA: Battlefield 3, Call of Pripyat, Metro 2033, and DCS A-10.

I would also love to have my fps at a constant 60 to ensure it's as smooth as possible, and even in Skyrim with all quality up, no MSAA, no mods, the fps dips down to 40 in areas with plenty of foliage around.

Ultimately, I want to be able to run games with 4x MSAA, all quality turned on, and getting a constant 60fps.
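That "constant 60fps" goal is easiest to reason about as a frame-time budget. A small back-of-the-envelope sketch, using just the frame rates mentioned above:

```python
# Convert a target frame rate into the per-frame time budget in milliseconds.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(frame_budget_ms(60), 2))  # 16.67 ms to finish each frame
print(round(frame_budget_ms(40), 2))  # 25.0 ms -- the foliage-heavy dips above
```

The point being: a dip from 60 to 40fps means the GPU needed roughly 50% more time per frame, which is why those dips are so noticeable.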

While the GTX 580 is faster than the 480, I wasn't sure the improvement would be enough to achieve that. Who knows, maybe I'll find that the 680 isn't enough either :eek:, but at least it'll be closer than the 580 (and I've waited long enough, lol).

But a couple SLI cards currently out can provide that with some clocking and tweaking.

Another setup I'd like to get away from. Multi GPU produces a lot of heat and some games still require manual profile setup in the driver. A minor inconvenience, but something I'd rather stay away from.
 
Categorically, no. Kepler (or the 79xx) is not needed, given the console-isation of gaming and the anemic line-up of PC games pushing the graphical envelope. As for the argument that there are upcoming games that will need this amount of power, who are we kidding? We know there will be no title requiring a power jump like games of the past, e.g. Far Cry, Crysis, or HL2... those days are over until there is a console refresh. There is nothing my dual 480s don't run exceptionally well, and they will for a long while. Bad for the hobbyist in me, good for my wallet, I suppose.

One could argue that the reduced cost of a die shrink and the improved manufacturing process driven by new GPUs is why we need the new cards, but as we are painfully seeing with AMD and nV pricing schemes, this is not the case.

Looking outside of the PC gaming sphere and into the mobile space, however, one can see how the improved performance per watt will be hugely useful. The mobile space is where AMD and nV are marshalling their strength and where we will see real benefits. PC gaming is, unfortunately, becoming more and more of a niche.
 
I highly doubt it will be $300. I did hear multiple sources say $300 was the original target, but over the past few weeks, sources say NVIDIA plans to jack the price up to $550, which is 7970 pricing. Kind of lame of NVIDIA to do this, considering this new Kepler chip isn't supposed to be "high end", just a mid-range card. It's just that the performance is so good compared to the 7970 that they can market it as high end and milk the extra money out of us.

The $300 rumor was a machination of nV's guerrilla marketing, egged on by wishful (naive?) thinking. It was likely never a target. Why are we still throwing this number around?
 
My GTX 470 isn't running all games at max any more, and I'm at 1920x1080 on a single card. So yeah, I need Kepler, and want it. The only question at this resolution will be: will I be able to max graphics on the 670 when it comes out, or will I "have to" get the 680? Hopefully we will know it all next Friday.
 
Kepler (or the 79xx) is not needed, given the console-isation of gaming and the anemic line-up of PC games pushing the graphical envelope. As for the argument that there are upcoming games that will need this amount of power, who are we kidding? We know there will be no title requiring a power jump like games of the past, e.g. Far Cry, Crysis, or HL2... those days are over until there is a console refresh. There is nothing my dual 480s don't run exceptionally well, and they will for a long while. Bad for the hobbyist in me, good for my wallet, I suppose.
Very true; even my 2-year-old GTX 460 (and 5-year-old C2D CPU) is fast enough for 95% of all new games. The need to upgrade is just not there anymore. Ten years ago, it would have been unthinkable to game on parts this old. Look at it this way:

It's the same as being able to game on a 300MHz CPU with a Voodoo5 in 2002, the year when the Radeon 9700 Pro and 2GHz CPUs were top of the line.
I can tell you, that was impossible. In 2002, ES3: Morrowind required a 500MHz CPU and a GeForce3 to run smoothly.

Ten years later, ES5: Skyrim runs at 30-60 fps, everything maxed, on five-year-old hardware.

One could argue that the reduced cost of a die shrink and the improved manufacturing process driven by new GPUs is why we need the new cards, but as we are painfully seeing with AMD and nV pricing schemes, this is not the case.
The new pricing scheme is unfortunate, but then again: when a GTX 460 is fast enough for all console ports, who cares what a GTX 870 or Radeon HD 8990 will cost?

PC gaming is, unfortunately, becoming more and more of a niche.
Maybe, maybe not. But I am worried about the future of the GPU. We'll see what happens when the next-gen consoles are announced. nV/AMD will at least want to release new GPUs that are "better than the new consoles". That could trigger a new GPU 'arms race'.
If, however, the new consoles are based on crappy HD 6670-class hardware, I suspect the GPU market will collapse. Who will upgrade their card then?
Maybe the future of PC gaming will be an external-HDD-like console-graphics device you plug into your PC. And everyone will have the same hardware. :)
 
That was harsh Kyle lol...

Well...I love seeing new technology so bring on Kepler but honestly I don't play a single game that needs one. Or even a 7970 for that matter.

Harsh, but very true. This thread seems about the same as the 'You know it was coming...' thread asking whether people will buy Kepler, which had only a yes/no option, wasn't very informative, and was missing some key options.

In regards to 'I don't play a single game that needs one': that's a shame. Skyrim and Battlefield 3 are both pretty fun games and probably contenders for a few GOTY awards. If you turn up the AA in BF3, it starts to go over the 1.5GB of VRAM on the 580's lower-end models, and the game starts to run a bit chunky.

Similarly, Skyrim can be customized via ini files to push the view distance a lot further, including the 'Grid' option, which when set to 7 or 8 can use near 3GB due to the amount of stuff being loaded on screen at one time. A setting of 12 crashes after a few seconds even on GTX 580s, according to HardOCP's custom image quality guide for Skyrim. So yes, we do need Kepler for RAM limitations -- or the 7970's 3GB models. People have already mentioned titles. Are you reading the replies people post to your comments?
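For reference, the ini tweak in question looks roughly like this. This is a sketch of the widely circulated uGrids tweak, not official documentation; guides generally keep uGridsToLoad at odd values and set the cell buffer to (uGrids+1)²:

```ini
; Skyrim.ini -- commonly circulated view-distance tweak (community values, use at your own risk)
[General]
uGridsToLoad=7
uExterior Cell Buffer=64 ; (7+1)^2 -- keep in sync with uGridsToLoad
```

The higher the grid value, the more exterior cells are kept fully loaded at once, which is where the VRAM (and crash) pressure comes from.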

With regard to whether we 'NEED' it: nobody really needs new technology. It's always a 'want', with the possible exception of new DirectX versions for titles exclusive to the new DX technology, i.e. a "DX10-only" game back in the days of DX9/10. In reality, everything else is pretty much a want. One could take a single monitor, settle for 25fps and all detail settings on low, and be happy with a 5650 at 1024x768 resolution or less. Games wouldn't look as good as we WANT, but they'd run well enough for what we NEED to experience the gameplay.

I generally tend to buy two top-end cards of each new DX version and SLI/Crossfire them (depending on which company is on top that generation), start out gaming in 1080p glory with 8xAA, and as new models come out each year just drop the AA setting by half until I reach 2xAA, then usually drop the resolution slider down one notch. At that point, the new DX version is usually out and I'm back to 1080p glory with 8xAA from SLI/Crossfiring the new top-end cards. With that in mind, I do like to wait for the higher-memory versions whenever possible so the cards last longer, i.e. 3GB rather than 1.5GB.

So I wouldn't say I feel a need for Kepler at this moment. I'm more curious, for my own sake, whether the 8970 and 780 will be the generation with DX12, and thus time for me to upgrade, or whether that'll be the year after. Frankly, I don't care too much either way, because in any case I know my next upgrade will be that much more significant, and impressive (to me) in how much better my games look.
 