GTX Titan or GTX 690

GhoztGT

Hi, Hard Forum!

I am new to this forum, although I did lurk around from time to time to check out certain discussions, and I was wondering if you fellow [H] peeps can help me out.

I was going to invest $999 in the GTX 690 this week, and with all the news of Titan I honestly don't know what to invest in now, and I don't know what the benefits are of having a dual-GPU card as opposed to a single-GPU card (Titan).

So if you guys can help me out with my decision, I'd be very grateful. Thanks!
 
Wait until this coming morning and you'll know for sure; rumor has it Titan is obscenely powerful, potentially more so than the GTX 690 ;). The NDA is supposed to drop within the next 12-18 hours.
 
I would buy the Titan; later on you can add another Titan for SLI.
And that's if money is no object.
 
Let's see what happens when we see this thing in the wild actually being benchmarked.

...but really, Titan all the way. A single extremely powerful GPU over a dual.
 
Too many recommendations without asking the OP's resolution, system specs, etc.

Titan is really only going to be needed for surround 3D, 4K monitors, or surround with 2560-res monitors.
For a single monitor, 1080P 3D, etc., the 690 would be faster. It just depends on the final price, and whether it causes the 690 to drop in price at all. A single 670/680 can already play every game I've tried maxed out at 2560x1440 (no need for high levels of AA at that res, so you'll hear some people say they don't max everything because they can't do 8xAA in certain games at that res, but there is no need for it anyway).
 
Too many recommendations without asking the OP's resolution, system specs, etc.

Titan is really only going to be needed for surround 3D, 4K monitors, or surround with 2560-res monitors.
For a single monitor, 1080P 3D, etc., the 690 would be faster. It just depends on the final price, and whether it causes the 690 to drop in price at all. A single 670/680 can already play every game I've tried maxed out at 2560x1440 (no need for high levels of AA at that res, so you'll hear some people say they don't max everything because they can't do 8xAA in certain games at that res, but there is no need for it anyway).

i7-3770 (non-K) 3.5GHz
8GB of RAM
Sabertooth Z77
GTX 580

The resolution I'm on is 1920x1080. I would get a 680, but the problem is that down the line I will probably end up investing in the 7-series (Maxwell). But I really don't want to spend any more money on a GPU for a long time. That might be impossible, lol, due to technology always getting better, and the 7-series will probably beat Titan, but not by much.
 
At 1080P, you could max every game with high levels of AA with a single 670/680. Since you have a Z77, cheap SLI in the future would be an option too. A 690/Titan for 1080P is way overkill.
 
Not overkill if you want 120FPS for your 120Hz monitor @ 1080P.

He didn't say anything about 120Hz, now did he? ;)
I even mentioned in my post above that for 3D the 690 would be better than the Titan, but for regular 1080P 60Hz there's no reason to have more than a 670, which in itself is already overkill.
 
At 1080P, you could max every game with high levels of AA with a single 670/680. Since you have a Z77, cheap SLI in the future would be an option too. A 690/Titan for 1080P is way overkill.

Yes, I know, but I'm talking about future-proofing my rig for a long time.
 
Yes, I know, but I'm talking about future-proofing my rig for a long time.

Next-gen consoles are focusing less on high performance and more on new ways to play. So with 99% of games being ports, graphics aren't likely to advance much in the next few years.

With a 670 at 1080P 60Hz, there is no game yet that comes close to bringing it to its knees. The 670/680 would be like the 8800 GTX back in the day and stay up to date for a good 2-3 years.


Besides, there is no future-proofing with PCs.


Get a 670 for $300 today and save that other $700 for something else. If a day comes when you need more performance, you could pick up a second 670 for somewhere in the $100s and have 690-level performance then.

Really, a 660 Ti is plenty for 1080P. But with your 580, I wouldn't even bother trying to upgrade yet.

Or get a 670 for $300, then get a 2560-res monitor to make use of it, and still spend less than a 690/Titan.
 
Well, to be fair, the PS4 is going to be a very powerful machine. If it lands at $400, I'll definitely get one.

Granted, not as powerful as a modern PC, but it will be very nice.
 
I just bought a 7950, but I'm thinking about selling it for the Titan if the price is right...
 
Next-gen consoles are focusing less on high performance and more on new ways to play. So with 99% of games being ports, graphics aren't likely to advance much in the next few years.

With a 670 at 1080P 60Hz, there is no game yet that comes close to bringing it to its knees. The 670/680 would be like the 8800 GTX back in the day and stay up to date for a good 2-3 years.

There are quite a few games out, and coming out, that can easily stress a 670 @ 1080P: Far Cry 3, Crysis 3, Metro 2033/Last Light, PlanetSide 2, modded Skyrim, etc.

I have a buddy locally who was playing @ 1080P with a single 680, and PlanetSide 2 in large battles struggles to maintain 50FPS, often dipping into the upper 40s. FC3 also won't do a solid 60FPS with all options maxed, and Crysis 3 (out today) is even more stressful.

OP, if you want to be future-proof with the best single card you can get, not including Titan, then I would look at a 7970 GHz Edition or a 680 with good cooling. They both offer great performance and can overclock to the moon, enough that you can gain 10-15% in some titles.
 
He didn't say anything about 120Hz, now did he? ;)
I even mentioned in my post above that for 3D the 690 would be better than the Titan, but for regular 1080P 60Hz there's no reason to have more than a 670, which in itself is already overkill.
He never said anything about 60Hz either.
 
At 1080P, you could max every game with high levels of AA with a single 670/680...

Absolute rubbish. You cannot max every game at 1080P with any card.

Try maxing Far Cry 3, ArmA 2, and GW2, just to name a few, at ultra max settings with the FPS not dipping below 60.
 
Absolute rubbish. You cannot max every game at 1080P with any card.

Try maxing Far Cry 3, ArmA 2, and GW2, just to name a few, at ultra max settings with the FPS not dipping below 60.

I have no problem playing those games at high to max settings on a single 680 at 1920x1080. It may not be 60 FPS and above at all times, but you're never going to achieve that with a game like ArmA 2 for example, especially on the highest settings. It doesn't matter what rig you have, it just won't happen.
 
Absolute rubbish. You cannot max every game at 1080P with any card.

Try maxing Far Cry 3, ArmA 2, and GW2, just to name a few, at ultra max settings with the FPS not dipping below 60.

Like I said in my post :rolleyes:

Why do some people feel that not having it drop below 60 is required for a game to be maxed, or for an enjoyable experience?

It used to be that as long as you could average 30, it was fine. Then 40-45 became the new number, and then people got stuck on 60.

As long as frames aren't dropping below the mid 20s to low 30s (depending on the game), you aren't going to notice. Most games today use motion blur, which makes it so you can't notice frame drops as much. That's why games like Crysis are playable even if you only average 30 and it drops below 20 at times.


With a single 670 at 2560x1440, Far Cry 3, Crysis 2, and BF3, all maxed out, never drop below 30FPS (again, no super-high AA or settings that make no visual difference and are just for e-peen). And that's 3.7MP vs 2MP, almost double.
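(Quick back-of-the-envelope in Python, just to show where those megapixel numbers come from; it's plain arithmetic on the two resolutions mentioned above, nothing benchmarked:)

    # Pixel counts for the two resolutions being compared.
    pixels_1440p = 2560 * 1440   # 3,686,400 pixels, ~3.7 MP
    pixels_1080p = 1920 * 1080   # 2,073,600 pixels, ~2.1 MP
    print(round(pixels_1440p / 1e6, 1), "MP vs", round(pixels_1080p / 1e6, 1), "MP")
    print("ratio:", round(pixels_1440p / pixels_1080p, 2))  # ~1.78x, i.e. "almost double"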



And about the 120Hz: it's a new thing, and when most people say 1080P without saying 120Hz, they mean 60Hz. 120Hz is rare enough that someone who has a 120Hz monitor will make a point of saying so.



There are always going to be poorly coded games, or games that have server lag. Take Guild Wars 2: an HD 7770 can play it nearly maxed out at 1080P as long as the server is running right. When it's not, you could have 4 Titans at 720P and it wouldn't matter.
 
Like I said in my post :rolleyes:

It used to be that as long as you could average 30, it was fine. Then 40-45 became the new number, and then people got stuck on 60.

Where did you get that 30FPS used to be the norm?
My norm has been 120FPS since 1996 and QuakeWorld, and then Quake 2, Quake 3, etc. in competitive gaming.
I get shivers imagining playing under 60FPS; 60 has been the bare minimum since the last millennium on PC.

Actually, it took me a while to adjust to a new norm of 60FPS after more than 10 years of silky-smooth, lag-free 120FPS gaming.
 
Well, to be fair, the PS4 is going to be a very powerful machine. If it lands at $400, I'll definitely get one.

Granted, not as powerful as a modern PC, but it will be very nice.

It's using an APU. Not all that powerful.

Like I said in my post :rolleyes:

Why do some people feel that not having it drop below 60 is required for a game to be maxed, or for an enjoyable experience?

It used to be that as long as you could average 30, it was fine.

I don't remember a 30FPS AVERAGE ever being the "goal." Not to mention, with today's LCDs actually being inferior to yesteryear's CRTs for motion, a higher FPS is essential to minimize things like input lag.
 
Where did you get that 30FPS used to be the norm?
My norm has been 120FPS since 1996 and QuakeWorld, and then Quake 2, Quake 3, etc. in competitive gaming.
I get shivers imagining playing under 60FPS; 60 has been the bare minimum since the last millennium on PC.

Actually, it took me a while to adjust to a new norm of 60FPS after more than 10 years of silky-smooth, lag-free 120FPS gaming.
I started PC gaming in '06-ish, and most games then were fine averaging 30FPS. And a console game that held 30FPS was rare.
Twitch shooters like that and Counter-Strike, etc. don't have motion blur. That's why I said "depending on the game." But they are so easy to run it doesn't make a difference. What's really crazy is people who want more frames than the refresh rate of their monitor :rolleyes:
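(For reference, the frame-time arithmetic behind the 30/60/120 argument, as a small Python sketch; the numbers are just 1000 ms divided by the frame rate, nothing specific to any card or monitor:)

    # Milliseconds each frame is on screen at the frame rates being argued about.
    for fps in (24, 30, 60, 120):
        print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")
    # 30 FPS = 33.3 ms, 60 FPS = 16.7 ms, 120 FPS = 8.3 ms: going from 30 to 60
    # shaves off twice as many milliseconds per frame as going from 60 to 120.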
 
I started PC gaming in '06-ish, and most games then were fine averaging 30FPS. And a console game that held 30FPS was rare.
Twitch shooters like that and Counter-Strike, etc. don't have motion blur. But they are so easy to run it doesn't make a difference. What's really crazy is people who want more frames than the refresh rate of their monitor :rolleyes:

What monitor are you running with a 30Hz refresh?
 
Where did you get that 30FPS used to be the norm?
My norm has been 120FPS since 1996 and QuakeWorld, and then Quake 2, Quake 3, etc. in competitive gaming.
I get shivers imagining playing under 60FPS; 60 has been the bare minimum since the last millennium on PC.

Actually, it took me a while to adjust to a new norm of 60FPS after more than 10 years of silky-smooth, lag-free 120FPS gaming.

120 since 1996? The Voodoo 1 was lauded as being one of the first consumer graphics cards to even run 3D games at 30 FPS.

I'm not saying that more isn't better (sure, 60 is smoother than 30), but I've never met a single person who's complained about a game only running at 30 fps (talking about minimum). Typically, when I do see people complaining about it, their definition of 30 fps is in reality more like 8-10 fps.
 
What's funny, by the way, about TV and movies is that film is 24FPS; anything over that looks like video/broadcast and not cinematic, i.e. it looks cheesy. Even The Hobbit, which was 48FPS, had people complaining (my friends at least, because we work in VFX) about how it looked like broadcast, haha.

In video games you want more FPS, the more the better, haha, but in fact you are fine with 24FPS for certain games. I have to say that I prefer games that run at least at 45-55FPS, but when I had a 560 Ti some games dipped lower and it didn't bother me, unless they were online, because it put me at a disadvantage.
 
120 since 1996? The Voodoo 1 was lauded as being one of the first consumer graphics cards to even run 3D games at 30 FPS.

I'm not saying that more isn't better (sure, 60 is smoother than 30), but I've never met a single person who's complained about a game only running at 30 fps (talking about minimum). Typically, when I do see people complaining about it, their definition of 30 fps is in reality more like 8-10 fps.

What's funny, by the way, about TV and movies is that film is 24FPS; anything over that looks like video/broadcast and not cinematic, i.e. it looks cheesy. Even The Hobbit, which was 48FPS, had people complaining (my friends at least, because we work in VFX) about how it looked like broadcast, haha.

In video games you want more FPS, the more the better, haha, but in fact you are fine with 24FPS for certain games. I have to say that I prefer games that run at least at 45-55FPS, but when I had a 560 Ti some games dipped lower and it didn't bother me, unless they were online, because it put me at a disadvantage.
Exactly what I've been saying.

Obviously more is better, but there comes a point where people get way too serious about this subject.

Just about every review site I can think of (including [H]) aims for about a 40-45FPS average for a game to be smoothly playable, which is going to put the minimum around 30 or less.


And OT, I couldn't watch The Hobbit at 48FPS. Well, I did, but I was so distracted it took me 2 hours to even start paying attention to the story. I had to see it again at the normal 24FPS to actually like it.
 
:confused:

Who said 30Hz?



I remember when Crysis came out and I had a 3870. I wanted eye candy, so I played it even at 17FPS and still had fun :D

You keep bringing up 30FPS and then say you don't need anything higher than the refresh rate (which isn't actually true; there are still benefits to be had).

I remember when Crysis came out. I had a 4850, couldn't stand 17FPS, and bought an HD 5870.
 
Exactly what I've been saying.

(snip)

Just about every review site I can think of (including [H]) aims for about a 40-45FPS average for a game to be smoothly playable, which is going to put the minimum around 30 or less.


And OT, I couldn't watch The Hobbit at 48FPS. Well, I did, but I was so distracted it took me 2 hours to even start paying attention to the story. I had to see it again at the normal 24FPS to actually like it.

That's not what you've been saying. There's a big difference between a 30FPS average (which is what you said) and a 30FPS minimum (which is what you're saying now).

There's also a difference between watching TV, where additional frames are interpolated in, and a game running at 120FPS.
 
Like I said in my post :rolleyes:

Why do some people feel that not having it drop below 60 is required for a game to be maxed, or for an enjoyable experience?

It used to be that as long as you could average 30, it was fine. Then 40-45 became the new number, and then people got stuck on 60.

As long as frames aren't dropping below the mid 20s to low 30s (depending on the game), you aren't going to notice. Most games today use motion blur, which makes it so you can't notice frame drops as much. That's why games like Crysis are playable even if you only average 30 and it drops below 20 at times.


With a single 670 at 2560x1440, Far Cry 3, Crysis 2, and BF3, all maxed out, never drop below 30FPS (again, no super-high AA or settings that make no visual difference and are just for e-peen). And that's 3.7MP vs 2MP, almost double.



And about the 120Hz: it's a new thing, and when most people say 1080P without saying 120Hz, they mean 60Hz. 120Hz is rare enough that someone who has a 120Hz monitor will make a point of saying so.



There are always going to be poorly coded games, or games that have server lag. Take Guild Wars 2: an HD 7770 can play it nearly maxed out at 1080P as long as the server is running right. When it's not, you could have 4 Titans at 720P and it wouldn't matter.
Go play console games if you like 30FPS. I invest in the PC platform to play games at 60FPS, especially racing sims. Racing sims below 60FPS are a major deal-breaker for me, and the likes of PCARS and rFactor 2 can't be maxed out on your beloved 680-series cards.
 
That's not what you've been saying. There's a big difference between a 30FPS average (which is what you said) and a 30FPS minimum (which is what you're saying now).

There's also a difference between watching TV, where additional frames are interpolated in, and a game running at 120FPS.

You keep bringing up 30FPS and then say you don't need anything higher than the refresh rate (which isn't actually true; there are still benefits to be had).

I remember when Crysis came out. I had a 4850, couldn't stand 17FPS, and bought an HD 5870.

If you guys actually read what I said, it was that a 30FPS average used to be OK, as in a perfectly playable experience.

Then I said, talking about the minimum, not the average frame rate:
As long as frames aren't dropping below the mid 20s to low 30s (depending on the game), you aren't going to notice.

Then I said games like Crysis with lots of motion blur are playable around 30FPS because of the motion blur:
Most games today use motion blur, which makes it so you can't notice frame drops as much. That's why games like Crysis are playable even if you only average 30 and it drops below 20 at times.

And then I went on to say that I play at rates that don't go below 30FPS...
With a single 670 at 2560x1440, Far Cry 3, Crysis 2, and BF3, all maxed out, never drop below 30FPS (again, no super-high AA or settings that make no visual difference and are just for e-peen).


I didn't say a 30 average is a smooth experience today. But again, that depends on the game.


And as for movies/TV, aside from The Hobbit and a couple of others here and there, most things are shot at 23.976FPS, which is the only way I like to watch them. Obviously games are different, and I never said they weren't. It's not like I ever tried to say 120Hz TVs (no such thing as a real one) are like 120Hz monitors.
 
I read what you said and responded to it. I'll respond again...

I don't EVER remember a 30FPS average being considered good, by any stretch of the imagination.

If you know TV and games are different, why even attempt to draw a comparison? It makes it sound like you don't really know they're different.
 
I read what you said and responded to it. I'll respond again...

I don't EVER remember a 30FPS average being considered good, by any stretch of the imagination.

How old are you? Just because YOU don't remember it doesn't mean it isn't true.
 
Well, do you have actual facts to dispute the guy's statement?

I'm trying to find old reviews that state that, but do you know that even today the reviews here on [H] say 44FPS is the rate they find acceptable? Is it so hard to believe that 30FPS was acceptable 20 years ago?

Look at this line from Wikipedia:

The first 3D first-person shooter game for a personal computer, 3D Monster Maze, had a frame rate of approximately 6 FPS, and was still a success. In modern action-oriented games where players must visually track animated objects and react quickly, frame rates of between 30 and 60 FPS are considered acceptable by most, though this can vary significantly from game to game.

6 FPS, 6! Can you see how 30FPS was groundbreaking?
 
I read what you said and responded to it. I'll respond again...

I don't EVER remember a 30FPS average being considered good, by any stretch of the imagination.

If you know TV and games are different, why even attempt to draw a comparison? It makes it sound like you don't really know they're different.

I didn't bring up the TV thing, r3awak3n did. And it wasn't a comparison to games either. I just agreed with him about The Hobbit being weird, which is why I said "OT," which stands for "off topic." You just need to read a little more closely ;)


About 6-8 years ago, console games had a hard time pushing graphics and holding 30FPS. I remember when one of the Ratchet & Clank games came out, and it was a huge selling point that it could maintain 30FPS. For a lot of review sites, a 30FPS average was the target for playable settings. I'm sure many had to raise that to the 40s to stop fighting with people over what is considered smooth, as it depends on the player and the game. Too many people have blown this out of proportion, whether from e-peen, the placebo effect, because someone once said so, or whatever.

Again, though, more is obviously better. But most gamers were happy pushing the graphics until the average was in the 30s. And even today there is nothing really wrong with that if you like a better-looking game over a faster one. There's nothing wrong with walking around an RPG at 30FPS. You wouldn't want to play a twitch shooter like that, though, which is why it's game-dependent.


Back to the original point. To most people today, "maxing" a game out generally means using settings where anything higher doesn't visually change anything (like going to 16xAA vs 8xAA or 4xAA, or using HDAO in Far Cry 3, or using the in-game AA in Battlefield 3, which makes everything blurry) and being able to hold frames constantly above 30FPS.
A GTX 670 is able to do that in every game I've tried that isn't limited by something else (like crappy servers or coding).

A 690 or Titan for a non-3D 1080P display is silly. The OP's (original poster's) GTX 580 really doesn't need to be upgraded for anything currently out.


And considering Crysis 3 looks better than anything the next-gen consoles can do (according to Microsoft, Sony, and Cevat Yerli),
http://www.eurogamer.net/articles/2...gen-consoles-to-match-the-power-of-gaming-pcs
it's a safe bet that Crysis 3 will be the new standard benchmark. So if you can play that the way you like, you won't need an upgrade for a while. (I don't have it and haven't played it yet, so I'm not able to say how it runs.)
 
It was. When people started caring about FPS, the idea was that 30FPS is fine for movies and TV (even though some were still 24FPS), so it should be fine for games. Memory is short.

I'm sorry, but you're probably confusing hardcore PC gamers with PlayStation 1 users. Back then I used to own a cybercafe (hence the name GamingArena) and was all about pro gaming: CPL, PGL, WCG. Everyone tried to game at 120FPS, even if you had to drop to 640x480.

All 3D games in those days ran on Quake engines and Unreal (90% of 3D FPS games), and we all tried to reach the magic 120FPS, and we did, one way or another.

"People started caring about 30FPS"? You mean casual solitaire gamers and console gamers.

Are we talking about hardcore PC gamers here, or am I on the wrong website? I thought I was amongst the PC elite :) I guess I was wrong.
 
I read what you said and responded to it. I'll respond again...

I don't EVER remember a 30FPS average being considered good, by any stretch of the imagination.

If you know TV and games are different, why even attempt to draw a comparison? It makes it sound like you don't really know they're different.

I'm sorry, but you're probably confusing hardcore PC gamers with PlayStation 1 users. Back then I used to own a cybercafe (hence the name GamingArena) and was all about pro gaming: CPL, PGL, WCG. Everyone tried to game at 120FPS, even if you had to drop to 640x480.

All 3D games in those days ran on Quake engines and Unreal (90% of 3D FPS games), and we all tried to reach the magic 120FPS, and we did, one way or another.

"People started caring about 30FPS"? You mean casual solitaire gamers and console gamers.

Are we talking about hardcore PC gamers here, or am I on the wrong website? I thought I was amongst the PC elite :) I guess I was wrong.

You are in the 0.01% of gamers who think they need 120FPS. There are hundreds of millions of gamers out there. About 90% of them are console gamers, or Facebook gamers, or whatever. The other 9.99% is most PC gamers (most of whom are also console gamers, or now tablet and phone gamers), who just want to play the darn thing and don't care what FPS they get, as long as it's not choppy. This is a hardware forum, not an MLG forum.

Then there is a large part of that 9.99% who don't know any better (likely because they are new to PC gaming and just came to it from console gaming) and listen to gamers like you, and think they need a Titan or 690 pushing 120FPS on their 60Hz monitor in order to have a good experience, or to be more "elite" than their Xbox/PS-playing buddies.
 