New Titan X. Make the jump to 1440p G-Sync or go 1080p G-Sync.

heatthegreat

Limp Gawd
Joined
Oct 12, 2007
Messages
424
Title says it all. I'm currently at 1080p 60 Hz, and with the Titan X I feel I need to upgrade my monitor. The question is: should I stay at 1080p and go G-Sync 144 Hz, or get the new Acer 1440p IPS G-Sync 144 Hz? I went with 1080p in the first place because I didn't want to upgrade as often as I would at 1440p, but with G-Sync feeling smooth whether you're at 50 fps or 120, I might reconsider going back to 1440p. Also, is it safe to say that G-Sync lengthens the lifespan of your GPU, since you don't feel you have to upgrade as often when your games are smooth at any framerate? I've never used it before.
 
Even with G-Sync, the difference in smoothness between 50 fps and 120+ is very evident. So it will help, but there's no substitute for raw GPU power.
 
First off, I would not buy a $1000 GPU with a CPU that can't fully push it. You are not going to fully push a Titan in some games at 1440p, never mind 1080p. Heck, in Crysis 3 you can't even maintain 60 fps, or even 50, in some spots with a 2500K. Definitely go 1440p with G-Sync if you are crazy enough to spend that much on a GPU, so you get the best use and experience out of it.

EDIT: And why do you have a 3x2 GB RAM setup? Your CPU needs to run in dual channel for its full potential. Plus, 6 GB is the bare minimum for some games even right now.
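The dual-channel point is just bandwidth arithmetic. A quick sketch, using nominal JEDEC peak rates rather than measured numbers (DDR3-1600 is only an assumed example speed):

```python
# Peak memory bandwidth: transfers/sec * bytes per transfer * channel count.
# These are theoretical JEDEC peaks, not real-world measured bandwidth.
def peak_bandwidth_gbs(mt_per_s: int, channels: int, bus_bits: int = 64) -> float:
    """Nominal peak bandwidth in GB/s for DDR memory."""
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

# A 3x2 GB kit leaves one stick unpaired on a dual-channel controller,
# so part of the memory ends up running at single-channel speed.
print(peak_bandwidth_gbs(1600, channels=1))  # 12.8 GB/s, single channel DDR3-1600
print(peak_bandwidth_gbs(1600, channels=2))  # 25.6 GB/s, matched dual channel
```

Roughly double the peak for a matched pair, which is why a 2x8 GB kit is the sensible fix here.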
 
Guys, I'm going to be brutally honest here. Running two G1 980s in SLI with my ROG Swift monitor at 1440p versus my two EVGA SC 780s in SLI, I can't really tell the difference in almost all games when running G-Sync. I'm running a 2600K @ 4.8 GHz, so don't tell me my CPU is slow; it's churning out some serious fps. If I could go back and do it all over again, I'd keep the 780s and save the cash towards my next GPUs. G-Sync is that good.
 
My wife just went on a work trip and we hooked up the 34" 3440x1440 curved... It's an interesting experience. BF4 is quite different; it feels a lot more immersive. Browsing will have to grow on me.

Not sure if the "this is the one we always had" line will work this time (24" flat to 34" curved is quite the difference).

A single Titan X will push 1440p fine. Some games will still tax it; for example, in Far Cry 4 at 1080p maxed, an OC'd Titan X would likely sit around 60 fps.
 
I'd get the new Acer monitor. I have a Titan X and a 4K G-Sync monitor, and I am considering getting one myself. As nice as the Titan X is, a single card at 4K, even with G-Sync, isn't enough. At least not for me. Unfortunately, my eyes can still tell the difference between 30-59 fps with G-Sync and a steady 60 fps. Plus, I'd like to ditch TN and go back to IPS.

I don't know if that helps, but that's where I'm coming from.
 
I've been told that my 2500K at 4.2 GHz will in fact not bottleneck the Titan, or at least not to the point where it warrants an upgrade. And I can go higher; I just chose 4.2 GHz. Also, the reason my memory is 3x2 GB is that when I built this rig two years ago, I had no clue how dual channel worked. I was in complete noob status. I'll also be moving to 2x8 GB, for 16 GB, with this purchase.
 
I've been told that my 2500K at 4.2 GHz will in fact not bottleneck the Titan, or at least not to the point where it warrants an upgrade. And I can go higher; I just chose 4.2 GHz. Also, the reason my memory is 3x2 GB is that when I built this rig two years ago, I had no clue how dual channel worked. I was in complete noob status. I'll also be moving to 2x8 GB, for 16 GB, with this purchase.

It won't. CPU technology has barely changed since Sandy Bridge. Your 4.2 GHz is essentially the same as the capability of an Ivy Bridge or Haswell CPU with an equal core count. PCIe 2.0 could be a bummer if you go with two or more cards, but otherwise you're just fine. Get that 16 GB of RAM and your new card and enjoy.

I'm personally waiting for this monitor:
http://us.acer.com/ac/en/US/content/model/UM.HB0AA.001
 
I've been told that my 2500K at 4.2 GHz will in fact not bottleneck the Titan, or at least not to the point where it warrants an upgrade. And I can go higher; I just chose 4.2 GHz. Also, the reason my memory is 3x2 GB is that when I built this rig two years ago, I had no clue how dual channel worked. I was in complete noob status. I'll also be moving to 2x8 GB, for 16 GB, with this purchase.
It will most certainly hold the Titan back from its full potential. I even gave you a clear example of that with Crysis 3, where you can't even stay above 50 fps. If you are getting a 1440p G-Sync monitor, it won't matter much, though. You do need to get your memory straight, as 3x2 GB makes no sense with your CPU, and again, 6 GB is the bare minimum for games. Sell that 3x2 GB setup and get yourself a good 2x8 GB, 1600-2133 MHz, 1.5 V RAM kit.

And PCIe 2.0 can hold you back in a couple of games, regardless of what people claim. No big deal overall, but again, you are dropping big coin on a card here.

Personally, I would not spend a thousand bucks on a GPU that I could not fully utilize in all my games. If you are going to do it, then at least go for a 1440p G-Sync screen.
 
First off, I would not buy a $1000 GPU with a CPU that can't fully push it. You are not going to fully push a Titan in some games at 1440p, never mind 1080p. Heck, in Crysis 3 you can't even maintain 60 fps, or even 50, in some spots with a 2500K. Definitely go 1440p with G-Sync if you are crazy enough to spend that much on a GPU, so you get the best use and experience out of it.

EDIT: And why do you have a 3x2 GB RAM setup? Your CPU needs to run in dual channel for its full potential. Plus, 6 GB is the bare minimum for some games even right now.

You're really overstating the bottleneck issue. Most games use well below 4 GB of RAM. And CPUs hardly matter for a good number of games, like Metro: Last Light and Tomb Raider. Going from an AMD 965 at stock settings to a 4670K at 4 GHz gained me a whole 2 fps in Metro: Last Light. Tomb Raider was similar. BF3 did see a good performance jump, though.

A 2500K at 4.2 GHz isn't much slower than your CPU in the majority of games.
 
You're really overstating the bottleneck issue. Most games use well below 4 GB of RAM. And CPUs hardly matter for a good number of games, like Metro: Last Light and Tomb Raider. Going from an AMD 965 at stock settings to a 4670K at 4 GHz gained me a whole 2 fps in Metro: Last Light. Tomb Raider was similar. BF3 did see a good performance jump, though.

A 2500K at 4.2 GHz isn't much slower than your CPU in the majority of games.
Again, 6 GB is listed as the bare minimum for most demanding games. And I thought I made it clear that the fact he is using just three sticks of RAM was also a good reason to go ahead and upgrade it.

"A 2500K at 4.2 GHz isn't much slower than your CPU in the majority of games."

Did I say it was? I said his CPU would not fully push the Titan in all games. I gave Crysis 3 as a worst-case example, where he can't even maintain 50 fps in spots. But I also said it would not really matter if he's getting a 1440p G-Sync monitor.


I am sure my words will get twisted and taken out of context yet again, as that's par for the course around here.
 
You're really overstating the bottleneck issue. Most games use well below 4 GB of RAM. And CPUs hardly matter for a good number of games, like Metro: Last Light and Tomb Raider. Going from an AMD 965 at stock settings to a 4670K at 4 GHz gained me a whole 2 fps in Metro: Last Light. Tomb Raider was similar. BF3 did see a good performance jump, though.

A 2500K at 4.2 GHz isn't much slower than your CPU in the majority of games.

I completely agree with you.

In the average case there is maybe a 20-25% difference in AVX-heavy code between Sandy Bridge and Haswell/Broadwell. Sure, that can rocket up to 75% in very select situations, IF you hunt for them. But what games do you know of that use INT-heavy AVX2 code? Those of us in the know KNOW that the GPU is the limiting factor 99% of the time.

Most games are fine using up to a 4 GB framebuffer, unless you pile on 4x supersampling with FSAA, all running at 4K resolution; then yes, you will hit a VRAM wall fast. Typical 2x or 4x FSAA/MFAA/FXAA and max details are fine on such a card up to 2560x1440. If someone were actually planning to use those insane settings, anyone in their right mind would be spending on a 2P workstation with three more of those Titan X cards.

In most cases you're facing a 10% performance deficit, in an average worst-case scenario, running a Sandy Bridge uArch CPU versus a Haswell CPU. It's why I didn't hesitate to buy a four-year-old i7 2600 at a lower price than I paid for the original i5 3570 that went in this system many moons ago. It's still plenty fast enough for 110% of the things I, and most others, do. Maybe he can't stomach the thought of a $1,000 video card being used with anything less than a $1,000-1,500 Intel Extremely Expensive Edition CPU?
 
Is there any proof that a 2500K at 4.2 to 4.4 GHz is any different from any of the current processors at the same speed when gaming? Not calling you a liar, but I've had more people tell me that it won't cause a bottleneck than tell me it will. I usually like upgrading my CPU when I upgrade my GPU, but it seems there really hasn't been a jump in gaming performance between my 2500K and what is out now. The $300 spent on a new proc can instead go towards more RAM, a bigger SSD, and a new monitor. If you can provide some benchmarks or anything else that should convince me to purchase a new processor, I would be happy to read it and then make my decision.

EDIT: And yes, the 6 GB (3x2 GB) setup was me being a noob two years ago. I have already ordered 2x8 GB for my rig.
 
Is there any proof that a 2500K at 4.2 to 4.4 GHz is any different from any of the current processors at the same speed when gaming? Not calling you a liar, but I've had more people tell me that it won't cause a bottleneck than tell me it will. I usually like upgrading my CPU when I upgrade my GPU, but it seems there really hasn't been a jump in gaming performance between my 2500K and what is out now. The $300 spent on a new proc can instead go towards more RAM, a bigger SSD, and a new monitor. If you can provide some benchmarks or anything else that should convince me to purchase a new processor, I would be happy to read it and then make my decision.

EDIT: And yes, the 6 GB (3x2 GB) setup was me being a noob two years ago. I have already ordered 2x8 GB for my rig.

Read my post above. The difference is negligible, and only perceivable in benchmarks or very select use cases (most of them not in games). Honestly, more and more games are reaching into more threads. MGSV is a good example: it uses four by default but will spawn additional threads as needed, and I've seen it bang on all eight in heavy-load scenarios. With the consoles having eight cores, this will become the norm rather than the exception. Save yourself $300: buy a SB or IB i7 and a GTX 980, and pocket the rest for games. Titans are not worth nearly double the price for an extra 30% of performance, unless you plan on running dual or triple SLI at insane settings that will actually use the majority of that insanely huge 12 GB framebuffer.
 
Is there any proof that a 2500K at 4.2 to 4.4 GHz is any different from any of the current processors at the same speed when gaming? Not calling you a liar, but I've had more people tell me that it won't cause a bottleneck than tell me it will. I usually like upgrading my CPU when I upgrade my GPU, but it seems there really hasn't been a jump in gaming performance between my 2500K and what is out now. The $300 spent on a new proc can instead go towards more RAM, a bigger SSD, and a new monitor. If you can provide some benchmarks or anything else that should convince me to purchase a new processor, I would be happy to read it and then make my decision.

EDIT: And yes, the 6 GB (3x2 GB) setup was me being a noob two years ago. I have already ordered 2x8 GB for my rig.
Yes. With my 2500K at 4.4 GHz there were spots in Crysis 3 that dropped into the 40s, and those same spots are well above 60 fps with a 4770K at the same speeds. This has been pointed out by others in that game, too. I am not going to get into back-and-forth bickering here, as some users will make excuses for or discount any links anyway. In the end your CPU is fine, but again, it simply can't fully push a Titan in every case.

And I did not say you had to upgrade your CPU anyway. I said that if you get a 1440p G-Sync monitor, it would be just fine. At 1440p you would be more GPU-limited, and with G-Sync it would not matter if you happened to drop into the 50s, or even 40s, for a bit in something like Crysis 3.

So, bottom line: if you really want the Titan and want to keep your CPU, then just get a 1440p G-Sync monitor.
 
I agree that there is no point in arguing. You stand by your opinion, I stand by mine. I do agree with you, however, that he would be better off spending that $1000 differently than on a Titan.
 
If your frame rate dips below 50, you will notice it even on a G-Sync monitor. Personally, I noticed it at 60 fps in BF4. Most of the time the frame rate would be in the 90-100 range, but as soon as there were some explosions it would tank to 50-60. This was at max settings at 2560x1440 on a single 980. As soon as I added another 980 for SLI, the frame rate mostly pegged 144 fps, the max the ROG Swift will do with G-Sync enabled, and only went as low as 110-120ish during those explosions. Way better and smoother feeling when the frame rate stays that high.

Just don't make the same mistake I did and benchmark the system with G-Sync enabled! Haha... it limits your frame rate to 144.

So go with either 2x 980s or a Titan, but know that two 980s will be faster!
 
P.S. Using a 1440p 27" ROG Swift monitor, I feel like I'm living the life of luxury every time I sit down in front of it. It's that nice!
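The dips described above show up much more clearly in frame times than in average FPS. A tiny illustrative sketch (made-up numbers, not a real BF4 capture):

```python
# Why average FPS hides the dips you feel: a handful of slow frames
# barely moves the average, but each one is a visible ~18 ms hitch.
frame_times_ms = [10.4] * 95 + [18.0] * 5   # mostly ~96 fps, a few ~56 fps frames

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_ms = max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")              # ~93 fps overall...
print(f"worst frame: {1000 / worst_ms:.0f} fps")  # ...but it dips to 56 fps
```

Which is the point made above: G-Sync removes the tearing and judder, but the slow frames themselves are still there to feel.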
 
I have 970s in SLI and was running an i7-920 at stock clocks. I then upgraded to a 5820K, and there was a huge increase in performance in some game benchmarks and not so much in others. AC: Unity was unplayable with the 920 at 4K, with lots of stuttering; the new CPU fixed almost all of the stutter. Watch Dogs was another where my max FPS went from 30 to almost 60 (although the average only improved by maybe 5 fps overall). In the Batman benchmark I would get minimum values in the single digits with the old 920, and now it's closer to 40.
 
Personally, I think some people waaaayyy overestimate the Titan X.

I am a bit torn between getting a Titan X or a 980 plus a new CPU. You might be able to get a 980 for a decent price on the FS forum.

For the monitor, I just went to a 3440x1440 34" curved and it's a joy.
 
How do you guys have the Titan X already? I thought it wasn't even out yet.....
 
To the OP:

Several reviews point out that the Titan X is on par with 970s in SLI, but it's a single-GPU solution rather than dual, so it has none of the MFAA/DSR problems that come with SLI/G-Sync, etc. And 1440p on 970 SLI is already very playable, barring a few exceptions (AC:U, etc.), so you won't be disappointed with 1440p, especially since it is IPS G-Sync.
 
Get a FW900.

Get a life. :rolleyes:

In any case, I'm sure your love of 300 Hz gaming at 640x480 on a 22.5" monitor weighing 100 pounds has been duly noted by now. So give it a rest already, Jesus.


I would not recommend buying direct from nVidia, if only because we know very little about the quality of their RMA service.

Also, I might be imagining things, but I don't like how nVidia is selling direct -- cutting out the middleman -- and competing with their AIB partners for sales. Based on this alone I would not buy direct from nVidia, and would choose a "branded" version instead.
 
Get a FW900.

!

Get a life.

This times over 9000.

The only ones worth buying come from UnkleVito, and they are $1,000, local pickup only. Even if he were to ship one freight, it'd probably be another couple hundred for shipping, with no warranty against damage that may very well occur in transit. These CRTs are over 10 years old, and even reconditioned, are nearing the end of their reliable lives.

They are insanely heavy. Most desks I see people using will bow or break under 100 lbs of monitor.

A grand will only buy you ~21" of usable screen real estate. In the same price bracket ($1200-1500) you could have a 32" 3840x2160 IPS/*VA display, brand new with a warranty, that has an order of magnitude better chance of arriving on the OP's porch in one piece. And it will always have perfect screen geometry. Can't say that about a 13-year-old CRT, now can you?

Switching times have improved markedly in the last decade and are perfectly usable now. More so on TN panels than wide-viewing-angle panels, but even my 5-year-old LP2475 (released 2 years before I bought one) is sufficient even for FPS games. I have to look for blur/overdrive artifacting, and when it's present, it's minimal.
 
Sooooo, am I good with a 2500K at 4.2 GHz? Lol, I'm not too cheap to upgrade if it's really going to be an issue. But if my 2500K still has the legs to run with my Titan X, I want to get every last ounce out of it. And by the way, thanks for all the replies!

P.S. Numerous threads on the web state that a 2500K won't bottleneck 970 SLI, and that seems to be roughly where the Titan X's power sits?
 
Sooooo, am I good with a 2500K at 4.2 GHz? Lol, I'm not too cheap to upgrade if it's really going to be an issue. But if my 2500K still has the legs to run with my Titan X, I want to get every last ounce out of it. And by the way, thanks for all the replies!

P.S. Numerous threads on the web state that a 2500K won't bottleneck 970 SLI, and that seems to be roughly where the Titan X's power sits?
You have your mind made up that there is no bottleneck at all. Having used a 2500K and a 4770K, I can tell you that there is a little bit, even with a 980, in some games. AGAIN, I already gave you the Crysis 3 example. A 4770K runs rings around a 2500K in parts of that game. I was at full CPU usage in several spots with my 2500K at 4.4 GHz, and it could not even stay out of the 40s, whereas the 4770K stays above 60. That game is the worst-case scenario, like I mentioned, though. In most other games it makes little to no difference if you are going to crank the graphics settings at 1440p.
 
As far as I am aware there are no 1080p G-Sync monitors, so that's not an option unless you're willing to downscale and move away from the monitor's native resolution.
 
I am actually surprised the AOC is that high; I always thought it was the 'budget' G-Sync.
 
There are actually a couple of 1080p G-Sync monitors, but they carry a 50%+ premium compared to equivalent non-G-Sync monitors.

Thanks, yeah, that premium is horrific, especially as they are all crappy TN. I am not paying $400 for a TN monitor; that's bonkers.

Hopefully when FreeSync is on everything it will force NVIDIA to be more realistic with their fees.
 
Get a life. :rolleyes:

In any case, I'm sure your love of 300 Hz gaming at 640x480 on a 22.5" monitor weighing 100 pounds has been duly noted by now. So give it a rest already, Jesus.

Nah, I prefer 2048x1280 at 90 Hz, with IPS color and viewing angles, OLED contrast, and ULMB-type motion quality.
 