Kepler in March/April?

HardOCP considers it playable, and I take their standard over some "minimum 60fps" wanking any day. (If you OC the card and turn down all the eye candy, it would even be close to 60, I guess.)

If it does not fit your use case, so be it. But there are lots of popular games and lots of users who would be perfectly ok with a high-end GPU and three 1920x1200 monitors. Even the 580 MDT was playable with three 1680x1050 monitors.
 
Yes, a multiplayer game is very playable at 23 FPS... :rolleyes:
 
There's a big difference between "playable," as in I can see stuff moving around and can play the game, and "playable and enjoyable," meaning I get the most out of my games and equipment. At 35fps I can't really play online without jerky aiming and other horribleness; it isn't enjoyable or the best experience, especially in multiplayer, but double that is. After spending a pile on monitors, wouldn't you want something that maximizes the potential, instead of being stuck with crappy low settings or juddery framerates?
 
Not sure yet but I think 3840x800 (3 projectors) is going to be quite enjoyable with my 7970. I'll let you know soon. Even right now at 6048x1200 it's pretty good.

If Kepler isn't capable of 3 screens from a single card then I think nVidia made a mistake.
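For a rough sense of the GPU load these setups represent, here's a quick pixel-count comparison, a minimal Python sketch. It assumes 6048x1200 is 3x1920x1200 plus Eyefinity bezel compensation and 5040x1050 is 3x1680x1050, as discussed in this thread:

[code]
# Pixel counts of the display setups discussed in this thread,
# relative to a single 1920x1200 monitor.
setups = {
    "1920x1200 (single)": 1920 * 1200,
    "3840x800 (3 projectors)": 3840 * 800,
    "5040x1050 (3x1680x1050)": 5040 * 1050,
    "6048x1200 (3x1920x1200 + bezel comp)": 6048 * 1200,
}

base = setups["1920x1200 (single)"]
for name, pixels in setups.items():
    print(f"{name}: {pixels / 1e6:.2f} MP = {pixels / base:.2f}x a single screen")
[/code]

Triple 1920x1200 works out to roughly 3.15x the pixels of a single screen, which is why single-card framerates fall so hard at these resolutions.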
 
I don't know about that, unless you're playing very old games.
 
So you could fire up BF3 with high settings and play across 3x1200p monitors at a minimum 60fps with a single card?


You said modern games, and BF3 isn't the only one. Obviously it's going to depend on what games the buyer will play; some won't consider it sufficient where others might.



HardOCP considers it playable, and I take their standard over some "minimum 60fps" wanking any day. (If you OC the card and turn down all the eye candy, it would even be close to 60, I guess.)


Why not have your own standards? If a game seems to be running kind of sluggish would you wait for HardOCP to write about it before coming up with an opinion? :D

But good point about the video settings. Lots of reviews use the highest settings, and many people assume (or so it seems, based on lots of comments on this forum) that that is the only way it runs. Duh! Just fiddle with the settings till you get something you like. My GTX 460 works fine in BF3 (minimums around 45 and usually between 50 and 65fps) after just a very small amount of tweaking. It actually ran fine at the game's suggested settings (default medium), but I adjusted some higher because I wanted better textures and don't care about AA.
 
So you could fire up BF3 with high settings and play across 3x1200p monitors at a minimum 60fps with a single card? A minimum of 23fps is not playable by any kind of standard. Nor is the 600 series likely to be capable of much better. :p

OK, this post is kind of dumb; of course the 7970 can handle it just fine. There is WAY more to the story than minimum FPS.

http://www.hardocp.com/article/2011/12/22/amd_radeon_hd_7970_video_card_review/9

35.5 FPS average. A dip to 23 may be recorded, but we don't know the frequency of such dips in a given time envelope (hint hint, add more information like this to your reviews). The majority of the time it's in the 30-40 FPS ballpark, which is pretty playable for most people. This was also done with BF3's FXAA on, which causes an absolutely huge performance hit. It's very hard to tell the difference between max post-processing with FXAA and without. If you turn that FXAA off, you're looking at a 15-20 FPS boost to the minimum framerates with basically no IQ loss.

So yeah, 3-monitor gaming on a single 7970 seems plenty doable. Though this is a pointless thing to say, since the people who spend that much on monitors generally spend that much on Crossfire/SLI as well.
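As an aside, the dip statistics asked for above are straightforward to compute from a framerate log; a minimal sketch (Python, with made-up per-second FPS samples; dip_stats is a hypothetical helper, not anything from HardOCP's tooling):

[code]
# Count how often the framerate drops below a threshold and how long
# each dip lasts, given per-second FPS samples (hypothetical data).
def dip_stats(fps_samples, threshold=30):
    dips = []      # duration of each dip, in samples (here: seconds)
    current = 0
    for fps in fps_samples:
        if fps < threshold:
            current += 1
        elif current:
            dips.append(current)
            current = 0
    if current:
        dips.append(current)
    return len(dips), dips

# A made-up stretch averaging ~33 FPS with one brief dip to 23:
samples = [38, 36, 34, 31, 27, 23, 26, 33, 37, 40, 35, 32]
count, durations = dip_stats(samples)
print(f"{count} dip(s) below 30 FPS, lasting {durations} second(s)")
[/code]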
 
Hey, where did all of trinibwoy's posts go?

Lol, no kidding. All of his posts have been wiped as far as I can tell. It's causing this thread on the "nVidia Flavor" forum to report as 7 pages; then when you click on the thread, only 6 pages. :confused:
 
OK, this post is kind of dumb; of course the 7970 can handle it just fine. There is WAY more to the story than minimum FPS.

http://www.hardocp.com/article/2011/12/22/amd_radeon_hd_7970_video_card_review/9

35.5 FPS average. A dip to 23 may be recorded, but we don't know the frequency of such dips in a given time envelope (hint hint, add more information like this to your reviews). The majority of the time it's in the 30-40 FPS ballpark, which is pretty playable for most people. This was also done with BF3's FXAA on, which causes an absolutely huge performance hit. It's very hard to tell the difference between max post-processing with FXAA and without. If you turn that FXAA off, you're looking at a 15-20 FPS boost to the minimum framerates with basically no IQ loss.

So yeah, 3-monitor gaming on a single 7970 seems plenty doable. Though this is a pointless thing to say, since the people who spend that much on monitors generally spend that much on Crossfire/SLI as well.


1) See the graph below the chart.
2) See it dip below 30 fps every minute or so, staying under 30 fps for anywhere from 5 seconds to over a minute...
3) Look at this post.

I would not consider any game that isn't above 30 fps at all times playable in singleplayer. How is that a "high" standard?

You could probably run it at lower IQ, but please don't tell people it's playable at Ultra. If you look closely you'll see that the GTX 580 is actually more adequate, because its dips are not as frequent and not nearly as long as the 7970's.
 
OK, this post is kind of dumb; of course the 7970 can handle it just fine. There is WAY more to the story than minimum FPS.

http://www.hardocp.com/article/2011/12/22/amd_radeon_hd_7970_video_card_review/9

35.5 FPS average. A dip to 23 may be recorded, but we don't know the frequency of such dips in a given time envelope (hint hint, add more information like this to your reviews). The majority of the time it's in the 30-40 FPS ballpark, which is pretty playable for most people. This was also done with BF3's FXAA on, which causes an absolutely huge performance hit. It's very hard to tell the difference between max post-processing with FXAA and without. If you turn that FXAA off, you're looking at a 15-20 FPS boost to the minimum framerates with basically no IQ loss.

So yeah, 3-monitor gaming on a single 7970 seems plenty doable. Though this is a pointless thing to say, since the people who spend that much on monitors generally spend that much on Crossfire/SLI as well.

Think you might have gotten your wires crossed there, buddy. FXAA produces almost no performance hit while improving image quality (a matter of perspective, of course). MSAA is what causes the game to tank, particularly on AMD cards.
 
1) See the graph below the chart.
2) See it dip below 30 fps every minute or so, staying under 30 fps for anywhere from 5 seconds to over a minute...
3) Look at this post.

I would not consider any game that isn't above 30 fps at all times playable in singleplayer. How is that a "high" standard?

You could probably run it at lower IQ, but please don't tell people it's playable at Ultra. If you look closely you'll see that the GTX 580 is actually more adequate, because its dips are not as frequent and not nearly as long as the 7970's.

lol what? :confused:
 
The GTX 580 is also not running ambient occlusion (though to be honest, I don't either). As far as single cards go, it is playable at Ultra, even on the 580, as you pointed out. You may have to turn off a few things like ambient occlusion and FXAA/MSAA, but the IQ hit just isn't there, especially in the intensity of multiplayer, where eye candy is far less noticeable.

http://www.youtube.com/watch?v=sgF1A72wG2Q

I don't care about the brand. A single card from any brand can handle modern games, including BF3 at Ultra, on 3 monitors with acceptable framerates in multiplayer. If turning down some (very) marginal IQ improvements invalidates that for you, I'm sure AMD or nVidia would love to have your extra $500+.
 
35.5 FPS average.

Outstanding... No single card, not the 580, not the 7970, nor the next Nvidia whatever, will be able to do 3 screens well. If people want to cheap out, fine, but playing at 35fps or on low settings in an online shooter is a silly thing to do after you have spent $600+ on monitors.

Your quote agrees with what I said anyhow: "pretty playable" is not "playable." It's almost there, and not the kind of thing you want after spending $1100+ on equipment.

But anyway, back to the topic: people that don't like playing 3 monitors at crappy settings or low framerates will have a second card, thus 2 connectors per card is fine.
 
Why not have your own standards? If a game seems to be running kind of sluggish would you wait for HardOCP to write about it before coming up with an opinion? :D
Heh, I'd love to be able to afford a new graphics card every month and just discard the ones that don't satisfy my gaming needs. :) In the absence of unlimited money I have to rely on reviews.
 
But anyway, back to the topic: people that don't like playing 3 monitors at crappy settings or low framerates will have a second card, thus 2 connectors per card is fine.
All the screens must be connected to the same card in order for Eyefinity or Crossfire to work. It's a different story for nVidia Surround.
 
Agreed, but a new die-shrink process doesn't relate to the GTX 285 in the context you wanted it to. The 260/270/280/285 were all the same architecture, which is the point the other poster was trying to make. What is so difficult for you to understand about that? The 6970 is not the same die shrink or architecture as the 7970. Can't make it any simpler than that!

I'm pretty sure you're the one struggling with understanding here. I never stated the 6970 and 7970 were the same architecture; not sure where you read that.

I said that if you are comparing the % gained by the new architecture, it doesn't make sense to skip the last/best iteration of the previous generation. You're not really jumping from a 280 to a 480; you're jumping from a 285 to a 480. The 285 was a die-shrunk 280 with higher clocks, the best NV could do at the time, and that's what they had. Now you compare it to their next card, the 480, again the best NV could release at the time. Going from the GTX 285 to the GTX 480, best to best, is what yields the meaningful % increase.

Anyway, this has derailed far enough. IMO he was just cherry-picking numbers that best support his argument, which is something you seemed to miss as well.
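For what it's worth, the best-to-best comparison described above is just a percentage gain over the previous flagship; here's a minimal sketch (Python, with hypothetical placeholder FPS numbers, not actual review data):

[code]
# "Best vs. best" generational gain: compare each generation's flagship
# to the next one in the same benchmark. FPS values below are made up.
def gen_gain(old_fps, new_fps):
    return (new_fps - old_fps) / old_fps * 100

gtx285 = 40.0  # hypothetical GTX 285 result in some benchmark
gtx480 = 62.0  # hypothetical GTX 480 result, same benchmark and settings
print(f"GTX 285 -> GTX 480: {gen_gain(gtx285, gtx480):.0f}% gain")  # 55% gain
[/code]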
 
If Charlie writes something good about an Nvidia product, then it's going to rock I guess :D


The short story is that Nvidia (NASDAQ:NVDA) will win this round on just about every metric, some more than others. Look for late March or early April availability, with volumes being the major concern at first. GK104 cards seen by SemiAccurate all look very polished and complete, far from rough prototypes or “Puppies”. A2 silicon is now back from TSMC, and that will likely be the production stepping barring any last second hitches. Currently, there aren’t any.

For the doubters, both of the current cards we saw have two DVI plugs on the bottom, one HDMI, and one DP with a half slot cooling grate on one of the two slot end plates. The chip is quite small, and has 8 GDDR5 chips meaning a 256-bit bus/2GB standard, and several other features we can’t talk about yet due to differences between the cards seen. These items won’t change the outcome though, Nvidia wins, handily.

http://semiaccurate.com/2012/01/19/nvidia-kepler-vs-amd-gcn-has-a-clear-winner/
 
Lol, no kidding. All of his posts have been wiped as far as I can tell. It's causing this thread on the "nVidia Flavor" forum to report as 7 pages; then when you click on the thread, only 6 pages. :confused:

Yeah where the hell did all my posts go? Lol, guess I pissed off somebody.

Edit: hmmm I see them, did they disappear for a while or something?
 
I love how some people on these forums put a winky icon at the end of their posts, almost like they're trying to say they know a secret... I can do that as well ;)

No man, I don't know any more than you do. If I did why would I waste time speculating/arguing on a forum? :) I was just stating the obvious. There will be more info on Kepler in the next few weeks and it will spark far more interesting discussion and provide better context for the 7970.

Regarding the 285, yeah I forgot it was a mini shrink to 55nm. Essentially the same chip though and certainly not on the same level as a G80 or Tahiti from a generational standpoint.
 
I'm pretty sure you're the one struggling with understanding here. I never stated the 6970 and 7970 were the same architecture; not sure where you read that.

I said that if you are comparing the % gained by the new architecture, it doesn't make sense to skip the last/best iteration of the previous generation. You're not really jumping from a 280 to a 480; you're jumping from a 285 to a 480. The 285 was a die-shrunk 280 with higher clocks, the best NV could do at the time, and that's what they had. Now you compare it to their next card, the 480, again the best NV could release at the time. Going from the GTX 285 to the GTX 480, best to best, is what yields the meaningful % increase.

Anyway, this has derailed far enough. IMO he was just cherry-picking numbers that best support his argument, which is something you seemed to miss as well.

Ditto...
 
Outstanding... No single card, not the 580, not the 7970, nor the next Nvidia whatever, will be able to do 3 screens well. If people want to cheap out, fine, but playing at 35fps or on low settings in an online shooter is a silly thing to do after you have spent $600+ on monitors.

Your quote agrees with what I said anyhow: "pretty playable" is not "playable." It's almost there, and not the kind of thing you want after spending $1100+ on equipment.

But anyway, back to the topic: people that don't like playing 3 monitors at crappy settings or low framerates will have a second card, thus 2 connectors per card is fine.

Are you trying to be ignorant, or can you not read? I'll give you the benefit of the doubt and just assume you are attempting a straw man. Running BF3 on Ultra at 5040x1050 with max post-processing and ambient occlusion off is not "low settings." There's no doubt that at stock speeds it can easily pull off around 40-45 FPS without AO & FXAA. Overclocked, a single 7970 averages 50 FPS with everything on, including FXAA & HBAO. (http://www.hardocp.com/article/2012/01/09/amd_radeon_hd_7970_overclocking_performance_review/6)

Let's not forget that the argument of "those who spend $1100 might as well spend another $550" is also fallacious reasoning. Adding another card is a 48% cost increase over the initial price tag. Sorry, you just couldn't be more wrong. Look up the facts next time, especially when they come from this very site. Thanks for playing.
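For reference, the 48% figure is just the second card's price over what was already spent; a quick sketch assuming the thread's rough numbers ($600+ of monitors plus one ~$550 card):

[code]
# Cost increase from adding a second ~$550 card on top of what was
# already spent. Dollar amounts are the thread's rough figures.
initial = 600 + 550        # monitors + first card
second_card = 550
print(f"Second card adds {second_card / initial * 100:.0f}%")  # ~48%
[/code]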
 
Until/unless PLP support ever becomes a reality, I'll be gaming on 1 monitor while using 3. 1920x* is just too painful a downgrade for everything else and 3x2560x1600 is so far beyond my budget it's not even worth thinking about.
 
I suggest you learn to read before suggesting someone else can't.

Enjoy your piss-low standards. :p

:confused:

Lol what? I think this is where I laugh, shake my head, and put you on ignore. Otherwise, you will no doubt continue to reply with asinine comments and fallacious reasoning because you simply can't admit you're wrong (and over something so trivial). What an ego; must be fun being that crazy.
 
:confused:

Lol what? I think this is where I laugh, shake my head, and put you on ignore. Otherwise, you will no doubt continue to reply with asinine comments and fallacious reasoning because you simply can't admit you're wrong (and over something so trivial). What an ego; must be fun being that crazy.

Stop being a little child trying to start arguments. Every time I disagreed with your opinion that sub-60fps was amazingly playable, you insulted me for no good reason and gave zero evidence that it was, other than "it just is." You then barely read anything, make up some weird stories, and rant about me being "ignorant" and "dumb." It doesn't matter what is said; you have convinced yourself that you are right. So, like I said last time, there is no point continuing, yet you decide to do it anyway. Guessing you want the last word or something... :confused:

Just stop already.
 
Until/unless PLP support ever becomes a reality, I'll be gaming on 1 monitor while using 3. 1920x* is just too painful a downgrade for everything else and 3x2560x1600 is so far beyond my budget it's not even worth thinking about.

I feel the same way. I'm buying the U3011 in a few months because I like the center monitor to be the largest of the three.
I had three 23" monitors in Eyefinity, and I didn't like how wide it was; I also disliked the bezels in portrait.
Whichever single card can drive the 30" best is the one I'll buy, and that includes the dual-GPU models. Another user patiently waiting for PLP support.
 
I feel the same way. I'm buying the U3011 in a few months because I like the center monitor to be the largest of the three.
I had three 23" monitors in Eyefinity, and I didn't like how wide it was; I also disliked the bezels in portrait.
Whichever single card can drive the 30" best is the one I'll buy, and that includes the dual-GPU models. Another user patiently waiting for PLP support.

I personally will be getting a 120Hz 27-inch monitor.

I don't do the 3D thing, but having games stay at 120fps would be my ideal goal... of course, getting BF3 to stay at 120fps and never drop is a wet dream.

But I agree, bezels suck ass...
 
Add me to the list of people waiting to see if Kepler can support triple-monitor gaming on one card. I use a single 5870 right now, which means I often end up running games with no AA to make them playable. If the Kepler cards can't do something similar to Eyefinity, I guess I'll go for a 7-series instead (which will hopefully drop in price around the Kepler release).
 