Some users have recently had their accounts hijacked. It appears the now defunct EVGA forums were compromised, and many members were reusing the same password here. We would suggest you UPDATE YOUR PASSWORD and TURN ON 2FA for your account here to further secure it. None of the compromised accounts had 2FA turned on.
Once you have enabled 2FA, your account will be updated soon to show a badge, letting other members know that you use 2FA to protect your account. This should be beneficial for everyone who uses FSFT.
Yeah I was worried about the 100Hz ULMB. I guess we still don't have "The Perfect Monitor." I'm definitely jealous you got yours already. Local PC shop here doesn't even know when they're getting it yet. I need me some Titan X cards and this monitor. Like Pronto. :(
You can still get the ROG Swift in Canada for $867 CAD before taxes. It's a tough decision: weighing the $133 savings, guaranteed Nvidia 3D Vision support, and 1ms response time against an IPS from Acer with an unknown price, unknown availability, and potentially no support for Nvidia...
I responded to another uninformed individual on the interwebs earlier, who had a similar mindset. To save myself some time, I'm just going to copy/paste what I wrote him for your benefit:
I don't have a GSYNC monitor yet. But I will be picking up the new 144Hz 1440p IPS GSYNC monitor from...
GSYNC integration costs manufacturers no more than ~$100 to add in. But manufacturers also want to profit from selling the premium feature. Otherwise they're just putting in a feature that costs them money and that only Nvidia profits from, in the hopes that the GSYNC feature will...
Out of curiosity...if it's a 4ms response time panel, and 144Hz is about 7ms between frames, why would there be a difference in smoothness with a panel that has a 2ms or even 1ms response time when you have ULMB active? If I understand it correctly, the increase in response time...
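For anyone wanting to check the numbers behind this question, here's a quick back-of-the-envelope sketch (the refresh rates are just examples; the ghosting note in the comments is my own hedged reading of why ULMB makes response time matter more, not an official spec):

```python
# Frame interval vs. panel response time: a quick sanity check.
# At a given refresh rate, the time between frames is 1000 / Hz milliseconds.
def frame_interval_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 100, 120, 144):
    print(f"{hz}Hz -> {frame_interval_ms(hz):.2f}ms between frames")

# 144Hz works out to ~6.94ms per frame, so on paper a 4ms gray-to-gray
# transition finishes within the frame window. With ULMB, though, the
# backlight only strobes the panel for a brief moment each refresh, so
# any pixel transition that hasn't fully settled by the strobe shows up
# as ghosting -- one plausible reason a 1ms-2ms panel can still look
# cleaner even when response time < frame interval.
```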
I hope they start making more ultra wide screen displays so people stop doing multi-monitor setups for panoramic gaming. :P I never understood the joy of wide screen when there was even the slightest bit of bezel visible.
Expect it to be in the $899-$1199 CAD range depending on how aggressively they plan to challenge Asus, which happens to be their only competitor.
My personal guess would be $999 CAD.
I gave the ROG Swift a pass (publicly so) due to the filter. I understand some people can benefit from it. But when you put anything, regardless of how "tuned" it is or how "aggressive" you claim it isn't, between the display and your eyes, you are taking away from image quality. ROG Swift...
Doesn't work for everybody. As for the app, the link has been updated. The author has linked to my video and website. The version that was previously distributed was the free version. Not the paid version.
I'm not aware of any recording apps that will record both the desktop and in-game as you tab back/forth. Unless I was running in Windowed mode, which would rob the game of performance. As would the screen capture itself. I could have recorded and stitched videos together as well. But then...
The proof was in the video...GPUs were sitting at 40%-50% usage. The problem is with the way the grass/foliage is handled. All of its movement is calculated by the CPU. That's why they had to run it at a lower 30Hz rate. The limitation isn't from a theoretical standpoint. It's from a very real...
That's just because I forgot to turn the monitor back from 126Hz (testing) to 120Hz (stable). It's monitor artifacts. Not caused by the game or the application.
For those of you who are CPU limited and are tired of the low performance in Crysis 3, you may want to check out this video. If you are GPU capped, then you likely won't see any benefit. My results were a 27% FPS increase on a 4945MHz i7 3770K with Tri-SLI GTX 680's. GPUs still had room to go...
You're certainly correct. I know the Kepler series actually has an H.264 encoder built into the hardware, outside of generic GPU acceleration. I wonder if that helps at all. Or how Tesla or Titan compare to the 580 they tested in the article. Sounds like something I may need to do a review on? =D
From the link you listed, the GTX 580 encode looks much better than the CPU encode. But upon further review you seem to be right. Looks like the majority of my encoding has been purely CPU based. There is still one thing I'm looking forward to with the Titan, however. And that's Smooth Video...
Not that I've heard...and especially not for lossless codecs. Nor with higher-end modern GPUs and OpenCL. Unless you have info that I'm missing? Besides, even if there were some loss in quality, it would be nothing compared to the quality loss from YouTube's final compression.
I don't think that's fair. My reason for doing it is that I'm finding some compatibility issues with Quad-SLI, even though I enjoy the performance I get from it. I had to disable and unplug one of my cards, running in Tri-SLI now to resolve my problem. 2 or 3 Titans would be a great solution...
But someone could get nearly 3 GTX 670's (on sale) for the same price as a single Titan. I think the Titan is targeted towards a different crowd. The crowd that will spend whatever money without worry. Especially knowing that it's a limited production. If the performance numbers are real...
I think that changed with the release of the GTX 680. It was supposed to be the 670 or even the 660, considering how much smaller a die they used for it compared to the GTX 580. But if they had released that as the 660, and released the actual 680 as the Titan, that would have been a card...
Depends what you call a "Few Months." With the Titan launching in February, and the 780 possibly coming as late as Q4, that's potentially a good three-quarters of a year between releases. And with the Titan being a limited edition VERY SMALL QUANTITY run for enthusiasts only who want to get top...
AMD said it would be about 45% faster than the 7970. (note: this was before they said the 8000 series was just a rebrand of 7000 series for oem. So for the purpose of our discussion, I suppose we're talking about what is likely to be named the 9000 series) It's just hearsay at this point and...
Can you show where it's confirmed that the 780 will be on the same 28nm process? And a 60%-90% boost after a year and a half isn't out of place, given past GPU performance increases and Moore's law on the matter.
You sound awfully sure of yourself. If the 780 was only 15%-30% faster than the 680, that would leave it behind the performance of the 8000 series AMD cards.
I run 1440p 120Hz. All settings maxed out. 32x CSAA. Between 4x-8x transparency AA. Doesn't help when a game like Farcry 3 has me limited to as low as ~80fps in some areas while the GPUs never go above 70%.
The only game that's actually maxed out all 4 cards that I can think of is Skyrim...
It would be impossible for Nvidia to put out a 780 using 28nm, without making it equal to or inferior to the Titan. But seeing your comment regarding the 3770k vs 3930k for gaming...I'm not sure how much credit I should give you. As current generation games only use a limited number of cores...
And honestly...I'm ok with that. And I'll tell you why. In 90% of games today, you will be CPU limited before you're GPU limited. I found this out the hard way when I picked up 4x GTX 680 Classified 4GB for some Quad-SLI fun. The 3770K only OCs to 4.6GHz, unfortunately. But even the extra 9%...
Actually, it would be very unlikely for the 780 to "Crush" the 690. Pretty well no new generation of card has been able to dominate its predecessor running in SLI. That would require a nearly 2x increase, and that really doesn't happen.
Based on some numbers I've ran, the GTX 780 should...
There is a difference between a "popular OC product" and an "only buying it to OC product." There is absolutely nothing special about these monitors except that they can be OC'd. It's not based on an assumption. It's based on an expectation. The monitor costs nearly 50% more. And you...
You should wait a few days for proper thermal bonding before overclocking a CPU too high. But I'm not sure if we're discussing best practices or people's impatience. Either way...as with all equipment, the warranty is void once OC'd. That means outside of the first few weeks, if anything comes...
Realistically, as is the expectation, and as we wrote as a precaution before you're allowed to purchase, we made sure people are aware that overclocking anything (CPU, video card, RAM, monitor, etc...) voids the manufacturer's warranty. Including models like Intel's multiplier-unlocked K series...
Nope. Other than the 1%, there is no markup by us. There is some risk, but we attempted to limit it.
1) Fortunately, my PayPal account is a 13-year verified business account
2) All but the "explain to wife" part is covered by the warranty + free return shipping
3) We only ship to PayPal addresses so...
Because we did a sort of organized group-buy ahead of time to buy the entire stock of monitors for our members that required the payment to be sent at once. And because $4.60 hasn't been an issue for...anyone, actually. Haven't heard any complaints about it.
And, I think, considering we sold out of the monitors in 21 minutes, our members have a pretty good amount of faith in us. And hopefully once everybody gets their monitors, all of the fear will go away. Similar to how it was at the beginning when we first started hearing about buying monitors...
I've been busy with the site, so I haven't checked in here. You know it's a bad sign when it's 1am, you're at a bar, and you're remote-desktop'ing to your Korean contact to talk about monitors. But I felt I needed to respond here.
1) The stand. Some people hate it. Some people are ok with it...
If you want to test whether it's a software/driver/configuration issue or a monitor issue, leave your computer sitting in your BIOS. If you come back later and get a garbled screen...it's the monitor. If it's clear, then it's a driver/setting issue as stated earlier.