Where did all the GTX 580s go?

One 580 is all you'd need on a single display for a very long time (until the new consoles come out).

[Image: Nope speech bubble]


What you said only makes sense if you add a conditional saying you're also happy to only get 30fps in several titles.
 
I try to look at things realistically. If I can manage Metro, BF3, Arkham City, Skyrim, etc. maxed out at my res @ 60+ fps, I honestly don't see devs pushing it any further than that over the coming years. Why would they? They are developing games for outdated console hardware, and there are only so many more bells and whistles they can add to the PC versions of said games that will take advantage of our beastly cards. Sure, pointless unoptimized variations on things like PhysX might pop up in an attempt to squeeze more out of the PC enthusiast, but I honestly believe that for the majority of the PC gaming market (people running on single displays @ 1920x1200), Kepler cards will be overkill right now. I'm personally sticking with my 580 till the new consoles hit the market, which is honestly what most people should be waiting for before upgrading.
 
Zarathustra[H];1038092611 said:
Agreed.

My guess is as follows based on the article:

GK110: Replaces the GTX590, with two GK104s
GK104: GPU to be used for the 580 and 570 replacements. Will not be out soon.
GK106: GPU to be used to replace the 560 and 560 Ti. May be out early 2012.
GK107: GPU to be used to replace lower-level GPUs, like the 550 and below.

BUT, you may find that the GK106 is as fast as a current 580, even if it isn't its replacement. As such, there would likely be little market left for a 580.

But where is GK100? In Fermi, the first flagship GPU was codenamed GF100, and it was the refreshed GTX 580 that was codenamed GF110.

That's the problem I have with the rumor that's been floating around: GK112 looks more like a refresh of a GK100 which has been mysteriously left out.
 
one 580 is all you'd need on a single display for a very long time, (until the new consoles come out)

Let's put this to rest once and for all shall we?

There are MANY - even older - titles that do not run well on a single monitor with one GTX580.

When I had my GTX580, turning on any kind of anti-aliasing left it just barely sufficient to run even older titles such as S.T.A.L.K.E.R. at 2560x1600.

Even without anti-aliasing turned up, a single card just isn't enough for many titles at 2560x1600, and with AA it only gets worse.

Metro 2033 - for instance - won't even run well with TWO GTX580's if you want to crank up the settings and AA at 2560x1600. It may even be a challenge with three of them... My two 6970's certainly aren't sufficient...

A single GTX580 is fine for single-monitor resolutions if you leave everything at medium, only use very low levels of AA, and play at 1920x1200 or below; otherwise there is DEFINITELY a need for more GPU power.
 
Zarathustra[H];1038095240 said:
Let's put this to rest once and for all shall we?

There are MANY - even older - titles that do not run well on a single monitor with one GTX580.

When I had my GTX580, turning on any kind of anti-aliasing left it just barely sufficient to run even older titles such as S.T.A.L.K.E.R. at 2560x1600.

Even without anti-aliasing turned up, a single card just isn't enough for many titles at 2560x1600, and with AA it only gets worse.

Metro 2033 - for instance - won't even run well with TWO GTX580's if you want to crank up the settings and AA at 2560x1600. It may even be a challenge with three of them... My two 6970's certainly aren't sufficient...

A single GTX580 is fine for single-monitor resolutions if you leave everything at medium, only use very low levels of AA, and play at 1920x1200 or below; otherwise there is DEFINITELY a need for more GPU power.


FXAA inject FTW. About 1% performance hit if any...

I wouldn't run anything at 2560 with a single GPU. That's why I'm sticking with my 120Hz 1680x1050 monitor. I can crank everything up and still get great FPS with a GTX 580.

I don't see the need for 1920 or 2560 for myself.
 
FXAA inject FTW. About 1% performance hit if any...

I wouldn't run anything at 2560 with a single GPU. That's why I'm sticking with my 120Hz 1680x1050 monitor. I can crank everything up and still get great FPS with a GTX 580.

I don't see the need for 1920 or 2560 for myself.

You should try gaming on a 30" 2560x1600 monitor some time. You might wind up liking it :p
 
Zarathustra[H];1038095514 said:
You should try gaming on a 30" 2560x1600 monitor some time. You might wind up liking it :p

Honestly, I don't even want to, 'cause things would get too expensive too fast. For me 1680x1050 is fine, and a single GTX 580 is fine. I can't justify dropping the extra money I'd have to in order to play at 30" with great framerates. With 3 kids and #4 on the way, I'll settle for what I have now ;)
 
Honestly, I don't even want to, 'cause things would get too expensive too fast. For me 1680x1050 is fine, and a single GTX 580 is fine. I can't justify dropping the extra money I'd have to in order to play at 30" with great framerates. With 3 kids and #4 on the way, I'll settle for what I have now ;)

I completely understand that way of thinking. I stayed away from 30" 2560x1600 monitors for a long time because I feared I might like it too much.

Then I tried one, and it turned out I did like it too much :p
 
Zarathustra[H];1038095633 said:
I completely understand that way of thinking. I stayed away from 30" 2560x1600 monitors for a long time because I feared I might like it too much.

Then I tried one, and it turned out I did like it too much :p

2560x1600 S-IPS for the win... I love my 3007WFP-HC :D.
 
Zarathustra[H];1038095633 said:
I completely understand that way of thinking. I stayed away from 30" 2560x1600 monitors for a long time because I feared I might like it too much.

Then I tried one, and it turned out I did like it too much :p

That's what I was afraid of myself so I went with a 27" Dell U2711 for 1440p. Love this monitor, but saved a nice chunk of change versus the U3011.
 
Zarathustra[H];1038095240 said:
When I had my GTX580, turning on any kind of anti-aliasing left it just barely sufficient to run even older titles such as S.T.A.L.K.E.R. at 2560x1600.

That is a complete lie. I run STALKER with visual mods at that resolution, max settings, with a GTX 460...

I have absolutely no issue whatsoever running it, not even close.

You have a Pentium 2 in your system?
 
That is a complete lie. I run STALKER with visual mods at that resolution, max settings, with a GTX 460...

I have absolutely no issue whatsoever running it, not even close.

You have a Pentium 2 in your system?

Well, when I turned on AA, my frame rates were constantly dropping down to just under 30fps (like 27 or 28) when things got busy.

This was on my old Core i7-920 at stock speeds (2.67GHz).

This was, however, in Clear Sky and Call of Pripyat. I played Shadow of Chernobyl on my old monitor (1920x1200) and video card (an unusually well-overclocked GTX470).

I doubt it had anything to do with the CPU, as lowering the resolution (or other graphics settings) increased frame rates.

Sure, it is possible to turn the graphics down so that it plays better, but where is the fun in that? I don't want a stripped down ugly version of the game. I want to play it like it was intended.
 
Honestly, I don't even want to, 'cause things would get too expensive too fast. For me 1680x1050 is fine, and a single GTX 580 is fine. I can't justify dropping the extra money I'd have to in order to play at 30" with great framerates. With 3 kids and #4 on the way, I'll settle for what I have now ;)

It's how it always goes.

"One GTX580 is enough to play anything - all games are console ports - faster graphics cards shouldn't exist"

"What about large monitors or 3D?"

"Oh, I don't like those"
 
I changed over from tri-fire 5970/5870 on a 1920x1080 display to a single GTX 580 at 2560x1600 during the bitcoin craze. Fortunately, this card has more than enough oomph for C&C 3 and Portal 2. However, when I buy a copy of Arkham City, after all the bugs are worked out, that purchase will follow either two 7970s or two GTX 680s, all for the purpose of maintaining high image quality at high fps.

The really big jump will be to a triple 2560 setup and that will definitely require more than one high end video card.
 
Zarathustra[H];1038096639 said:
Well, when I turned on AA, my frame rates were constantly dropping down to just under 30fps (like 27 or 28) when things got busy.

This was on my old Core i7-920 at stock speeds (2.67GHz).

This was, however, in Clear Sky and Call of Pripyat. I played Shadow of Chernobyl on my old monitor (1920x1200) and video card (an unusually well-overclocked GTX470).

I doubt it had anything to do with the CPU, as lowering the resolution (or other graphics settings) increased frame rates.

Sure, it is possible to turn the graphics down so that it plays better, but where is the fun in that? I don't want a stripped down ugly version of the game. I want to play it like it was intended.

There is something wrong with that. I play Call of Pripyat with everything at max settings on a 460 with the Complete graphics mod (http://artistpavel.blogspot.com). I can't even get the frame rate to stutter...

I'm not telling you to turn down the settings; I think something is wrong with your system if your 580 has trouble running it.
 
There is something wrong with that. I play Call of Pripyat with everything at max settings on a 460 with the Complete graphics mod (http://artistpavel.blogspot.com). I can't even get the frame rate to stutter...

I'm not telling you to turn down the settings; I think something is wrong with your system if your 580 has trouble running it.

Call of Pripyat is a demanding game; it's your system I'd suspect, not Harderstyle's. CoP is way beyond the capabilities of a GTX460 for smooth gameplay at max detail.
 
There is something wrong with that. I play Call of Pripyat with everything at max settings on a 460 with the Complete graphics mod (http://artistpavel.blogspot.com). I can't even get the frame rate to stutter...

I'm not telling you to turn down the settings; I think something is wrong with your system if your 580 has trouble running it.
BS, especially if that mod is more demanding than the standard game. Your GTX460 will not even average 40 fps in the standard game on Ultra DX11 with 4x AA at 1920.
 
@Zarathustra, it seems you completely ignored everything I said, and you're exaggerating hugely lol.

Gaming @ 1920x1200, not on 30" displays @ 2560x1600. I'd like to see any game that comes out before the new consoles even touch a single 580 @ 1920x1200.

If there's some huge boom in PC exclusives aimed at enthusiasts for some bizarre reason, I myself will be upgrading, but there just won't be.
 
@Zarathustra, it seems you completely ignored everything I said, and you're exaggerating hugely lol.

Gaming @ 1920x1200, not on 30" displays @ 2560x1600. I'd like to see any game that comes out before the new consoles even touch a single 580 @ 1920x1200.

If there's some huge boom in PC exclusives aimed at enthusiasts for some bizarre reason, I myself will be upgrading, but there just won't be.

We won't do the reading for you. Go look it up. From memory, I recall there being over 20 games that are too demanding for a GTX580 to hold a continuous 60fps at 1920x1080.
 
Don't know where they are going, but I got off my butt and ordered a matching 580 with the reference design before supply runs out!
 
We won't do the reading for you. Go look it up. From memory, I recall there being over 20 games that are too demanding for a GTX580 to hold a continuous 60fps at 1920x1080.

Hahaha, another guy who's exaggerating massively. Total BS.
 
I think the problem lies in what we are considering acceptable. Everyone has their own definition of this.
 