I'm going to try Monday/Tuesday to see if I can get an elevated CS representative; the whole ordeal and phone-call nonsense when the card smoked happened on the weekend - same with the CID department yesterday. I honestly at this point don't even want another 760, I'd take a $100 credit and go...
Bought a GTX 760 in September of '13 and had it smoke itself by October of '13 due to infant mortality of a power delivery component on the board (couldn't tell if it was a MOSFET, VRM, or power cap). The 3 year rapid replacement warranty is awesome and I get a new one sent out to me promptly. I...
Our control boards for power range instrumentation and rod step control use Seasonic industrial PSUs and DC-DC boards - if they're good enough for a nuke plant they're more than good enough for me. The only PSU that has failed in 10 years (past lifespan and replacement frequency anyway) wasn't a...
That 40 minute video was holy wtf levels of good. 14% battery was used during that whole video which I would say averaged the upper end of moderate usage with low periods of heavy use - extrapolate that to a full charge and drain and it's just shy of 5 hours of moderate to (sparingly) heavy use...
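Quick back-of-the-envelope on that extrapolation, assuming the drain stays roughly linear over a full charge (the only inputs are the figures from the video):

```python
video_minutes = 40     # length of the video
battery_used = 0.14    # fraction of the battery drained during it

# Assuming the drain rate holds roughly constant for a full 100% -> 0% run
full_runtime_min = video_minutes / battery_used
print(f"~{full_runtime_min:.0f} min, or {full_runtime_min / 60:.1f} hours")
# ~286 min, or ~4.8 hours - i.e. just shy of 5 hours
```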
$171 (shipping included) to the US is a pretty good deal for that triple stand w/ telescoping wings. I got mine on an Amazon sale for like $190...it was also when the MSRP on those was over $300 and most places didn't have it cheaper than $250-275. They've since come down but pennies saved are...
Several actually.
<-- Back
<-- Back more recent picture (ignore all the empty alcohol bottles - haven't done spring cleaning yet)
<-- Front
<-- Front more recent picture (same thing with the empty alcohol bottles)
The more recent pictures were taken after I had to re-set up everything...
^ This is actually what I'm using for my dual 34" LG ultrawides. Exact monitor stand here on Amazon. Another option is to get this stand since you don't really need the telescoping wings (though your monitors will have to angle in slightly).
Dual 34" 3440x1440 monitors here, upgraded from dual 23" 1920x1080. Letterboxed the monitors turn into a centered 27" 2560x1440 picture. I sit about 3.5 feet away from them (they're angled to fit on the desk so some parts are closer to 3 feet others closer to 4 feet but generally I spin the...
If the VRAM isn't being efficiently used to actually improve the experience or visual fidelity of the game, I'd rather have it sit there unused than have the game seek out every last MB the card says it has and use it poorly. The other part of that is the coding and engine optimizations that exist in...
That's incorrect by the very definition of optimize - make the best or most effective use of (a situation, opportunity, or resource). At best you could call the bold/underlined/italicized statement inefficient - but certainly not optimized. There is nothing effective about using 4GB of VRAM to...
Actually it kind of does matter - nobody expects a pair of $325 graphics cards to drive 18.66 million pixels with that level of detail. Sure, it proves the point that shit hits the fan when the VRAM is pushed past its limit, but tell me...what fucking card doesn't shit the bed when its VRAM is...
The game choosing to use what's available is directly related to the quality of programming that went into developing the graphical engine. So yes, I can blame the game for using what's available despite not needing to. I don't see half of my Steam library of 180 some odd games maxing out my...
That's just it, they wouldn't have. If we weren't in an era of shitastic console --> PC ports (ports that are done shitastically, may I add) where lazy coding, a lack of desire to actually optimize the game, and seemingly widespread acceptance of the practice are the norm, nobody would've known for at least...
And if the game was properly (read: efficiently and not lazily) coded it wouldn't just scale to use the maximum memory available if it did not need to. Even rendering 3440x1440 on Very High system spec and settings with 4x MSAA Crysis 3 barely uses 3GB of VRAM despite my 980 having a real 4GB...
What I notice more in the Dying Light video is that the 780 has a smaller FOV...as if it is zoomed in and is missing out on the action at the edges of the screen. The edges of the screen where the effects of stuttering are most apparent as the new peripheral objects are rendered within the FOV...
Yea, that game is a fucking mess code/engine-wise. But seriously, we need to hold the game devs to the same 'hive mind carrying pitchforks' standard nVidia is apparently being held to with this 970 debacle. Are the games they're giving us really "next-gen games"? Let's demand refunds and sue the shit...
Probably a little of both to be honest once they realized the mistake - side note unrelated to your thoughts (directed at another user) - really...dismissing the [H] crew's SLi performance review because they used 'old games' like Crysis 3, Tomb Raider, and BF4?
God damn those bastards for...
I'd like to see the electric motor design that causes that to happen - because that's a terrible motor to begin with. I work on AC and DC motors for a living; orientation matters only when you're talking about coupling to pumps and how that's going to affect seal leakage. It should have no...
So follow #2a and 2b in my earlier post and get a chargeback/refund/hire a lawyer. No sense rehashing the same shit - speak with your wallet instead of saying the same thing for a dozen pages without taking matters into your own hands.
There will always be some minor stutter/hanging present unless your screen refresh is perfectly synced with your graphics card output. Have you tried running your monitor at a lower refresh rate (I notice it's a 144Hz model) more in line with your general framerate to see if the issue is as...
nVidia said the 0.5GB can be heuristically programmed to work more efficiently with the main section of 3.5GB...how does that not fall under the definition of (at minimum) potentially being alleviated by driver updates? The physical hardware hasn't changed since day one...just the people's...
I think it's about time someone summed up the important info presented here, compiled it, and gave it the old /thread, because quite frankly I don't think much more needs to be said than the following:
1) The GTX 970 is more accurately described as a 3.5GB + 0.5GB buffer video card, the latter of...
Maybe I fail to see the point in a monitor much more than that size; I feel a TV would be a better option if you're sitting 5 feet away from the panel as you only need 1080p at that screen size/distance combination.
You'd need a 42" 1440p panel for it to be noticeably better than a 1080p...
Ever thought that's because the pixel density is unfathomably low by today's standards? A mere 77ppi for a screen that size at 2560x1440 - compared to a 27" 2560x1440 monitor that has 109ppi. To be honest I'm surprised any content was displayed awesomely.
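For reference, ppi falls straight out of the resolution and diagonal; the 38" figure below is my own inference from the 77ppi number, not something stated above:

```python
import math

def ppi(width_px, height_px, diag_in):
    """Pixels per inch for a panel given its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diag_in

print(f"{ppi(2560, 1440, 27):.0f} ppi")   # ~109 ppi - the 27" figure quoted
print(f"{ppi(2560, 1440, 38):.0f} ppi")   # ~77 ppi - 2560x1440 stretched over a ~38" panel
```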
^ This
At what point, then, would a straight 4GB instead of 3.5GB + 0.5GB make any sort of difference, since you're still short over 1GB on a 980...this outcry is fucking asinine. Your card still has 4GB but just doesn't pull from it until needed - with a measured performance hit of a whopping 2% or less...
Not worth it unless you can get it essentially free (aka sell the 780's and with that money buy both 970's). I haven't read all of the FarCry 4 stuff on the [H] but what little I did glimpse at showed AMD suffering hard, so definitely a good choice on not going there. Kepler cards with FC4 do seem...
My recommendation would be this, the 100-D28-B13. It's basically the 100-D16-B03 with the bigger vertical bar and the extra mount. You can also convert a D16-B03 to a D28-B13 by buying 3 parts (part numbers/conversions found on the Ergo-tech website which I can't link on the forum for some...
I would, except the entirety of what seems to comprise a white-knighting 16:10 argument is the extra vertical pixels over 1080p...not that 16:9 resolutions are inherently inferior. I'm simply showcasing the fact that for a while now 16:10 has been inferior to 16:9 if you're just rehashing the...
Calling 1920x1200 'mainstream' is like calling 2560x1440 'mainstream' as of a year ago. Granted, as CES 2015 showed, 2560x1440 is gaining popularity, but I still wouldn't call it 'mainstream' by any means. Also, I'd hate to break it to you 16:10 white knights but there's been this thing called...
Ah yes, ye olde A4 - slightly thinner and slightly taller. That makes more sense than it did before; you can't quite get A4 vertically at proper scale even at 1440 on the vertical, though it's really damn close.
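Hedged sanity check on the A4 claim, assuming a 27" 1440p panel at roughly 109ppi (a different panel density changes the answer):

```python
# Does an A4 page at 1:1 physical scale fit vertically in 1440 px?
a4_height_mm = 297
a4_height_in = a4_height_mm / 25.4        # ~11.69"
ppi = 109                                  # assumed panel density (27" 2560x1440)
page_px = a4_height_in * ppi               # ~1274 px for the page alone
print(f"{page_px:.0f} px of 1440 px")
# The page itself fits, but the remaining ~165 px gets eaten by the taskbar,
# title bar, and toolbars - hence "can't quite, but really damn close."
```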
Honestly, you might just feel the way you do because you came from such a large and...
Do you do a lot of programming/coding? That would be the one benefit I really see for the extra height over a 16:9 monitor...though really 1440p isn't far from 1600...closer than 1080 is to 1440 at least. You mentioned not being able to throw up side by side word documents at full size on a 16:9...
You move a lot in FFXIV: ARR? Last I checked the global was still like 2 seconds, so you have time to make a sammich or three before you can do anything again...although maybe that's the issue - so much time is spent not pressing buttons you gotta spin and run around randomly.
On a serious note...
Must've just been general 144Hz issues then or bad modules/panels, that I absolutely know happened (bad panels and people with shit tier DP cables). Either way I feel 144Hz is unnecessary, especially on a 4K panel, unless you like playing exclusively monolithic games with $1k+ graphics card...
Go to the ROG Swift thread on these forums? There's gotta be at least 10 pages' worth of bitching and sad emoticons about how their monitors flip shit whenever they have Gsync and try to boost it to 144Hz. I'm not sifting through 200 pages to find examples, but I do remember around pages 124-128...
Unless you simply want to have the options available, having ULMB and Gsync and 144Hz does you no real good, as Gsync can't be used with ULMB and Gsync doesn't play nicely with 144Hz either (and why pay for shit you won't/can't use all at once). Just stick with 120Hz and either Gsync or ULMB -...