G-Sync is never coming to AMD video cards; it's proprietary nV tech.
Quote me. Soon AMD will announce their own version, which will be exclusive to Samsung's next batch of 144Hz TN panels.
I hope it doesn't turn out like AMD's HD3D. They basically released it and said, "fuck it" and completely forgot about supporting it - and from what I remember, Samsung was one of the partners on board for HD3D.
I never liked 3D vision or HD3D, but nvidia at least maintained support for it. I would hope if AMD releases something similar that they will actually, you know, maintain support...
Quote me. Soon AMD will announce their own version which will be exclusive to Samsung's next batch of 144Hz TN panels.
"Their own version" of the same tech going into G-Sync isn't G-Sync, though. G-Sync is as proprietary as TruForm or TrueAudio, and two competing vendor-specific monitor standards is dumb as hell.
I think that vendors are waiting for AMD to give them the go-ahead, not the other way around.
Based on what? Gonna need a citation.
The $150 premium over the 290X is for the 780 Ti, not a custom 290X. I got kids in the next room and a noisy GPU is not an option... A nice, cool Lightning or something like that is still my No. 1 choice for 1200p.
Goal-post shifting has never convinced anyone of anything and makes you look bad. Also, an OC'd GTX 780 or 780 Ti will still make quite a bit of noise. Quiet stock air-cooled HSFs for heavily OC'd top-end cards don't exist. None of them are going to run at cool temps either, if that is your concern. Not that it should be, so long as temps are within the manufacturer's range. They're all going to push 90C+ easily on air when heavily OC'd with stock coolers.
And I still love the GTX 780.
If you already have a current high-end GPU then yes, buying an R9 290/X is a bad idea. Of course, so would be buying a 780 Ti, since it's only 5-10% faster than a GTX 780 for quite a bit more... unless you're going to run SLI, while OC'd of course, right? But then you can expect lots of heat to get dumped into your house, and the noise will probably be deafening, driving you and your kids insane with sleep deprivation + heat stroke*.
LOL, Nvidia fanboys out in full force. They will defend Nvidia pride and try to downplay uber mode as much as possible; they must do everything to make the 780 Ti look a lot better, even if it isn't.
Wow, I had no idea you could get a 39" 4K for $700!
Looks like it only has HDMI 1.4, which means gaming at 4K 120Hz is a no-go: over HDMI 1.4, 4K output is capped at 30Hz.
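For anyone curious where the 30Hz cap comes from: HDMI 1.4's TMDS clock tops out at 340 MHz, and the pixel clock a mode needs scales with refresh rate. A rough back-of-the-envelope sketch (the frame totals are the standard CEA-861 4K timings, 4400x2250 including blanking; the script is just illustrative):

```python
# Why HDMI 1.4 tops out at 4K @ 30 Hz: the required pixel clock
# must fit under the interface's 340 MHz TMDS clock limit.

HDMI_1_4_MAX_PIXEL_CLOCK_MHZ = 340  # TMDS clock cap for HDMI 1.4

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (incl. blanking) x fps."""
    return h_total * v_total * refresh_hz / 1e6

for hz in (30, 60, 120):
    clk = pixel_clock_mhz(4400, 2250, hz)  # CEA-861 4K frame totals
    ok = clk <= HDMI_1_4_MAX_PIXEL_CLOCK_MHZ
    print(f"4K @ {hz} Hz needs ~{clk:.0f} MHz -> "
          f"{'fits HDMI 1.4' if ok else 'exceeds HDMI 1.4'}")
```

4K30 needs about 297 MHz, which squeaks under the cap; 4K60 already needs roughly double that, which is why it had to wait for HDMI 2.0.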
Never. 16:10 is gone. Accept the bigger sizes and resolutions.
Zarathustra[H];1040378636 said:I wouldn't even consider getting any 16:9 monitor.
Shouldn't you guys be using SSAO in Far Cry 3? I've always been told that HDAO and HBAO are vendor specific.
Far Cry 3 implements a new and improved version of HDAO that uses full 3D camera space position data to detect valleys in the scene that should be shaded darker, and attenuates the lighting based on valley angle. This effect can be computed at full resolution, or at half resolution, depending upon the hardware resources available. In order for a half resolution HDAO buffer to be re-matched with the main color scene, a DirectCompute accelerated bilateral dilate and blur is performed to ensure that AO properly meets objects from the full screen resolution scene. In Far Cry 3 this technique has been significantly improved in both performance and quality.
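The "bilateral dilate and blur" step is the interesting part of that description: when AO is computed at half resolution, a naive upscale would smear darkening across depth discontinuities, so each full-res pixel instead blends nearby low-res AO samples weighted by depth similarity. A minimal, unoptimized sketch of that idea (NumPy, with a hypothetical exponential depth weight; the game's actual DirectCompute implementation will differ):

```python
import numpy as np

def bilateral_upsample(ao_half, depth_half, depth_full, sigma_d=0.1):
    """Upsample a half-res AO buffer to full resolution, weighting each
    low-res neighbor by depth similarity so occlusion doesn't bleed
    across object edges (the depth-aware 'dilate' idea)."""
    H, W = depth_full.shape
    hh, hw = depth_half.shape
    out = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            ys, xs = y // 2, x // 2          # matching half-res sample
            w_sum = v_sum = 0.0
            for dy in (0, 1):                # 2x2 low-res neighborhood
                for dx in (0, 1):
                    ny = min(ys + dy, hh - 1)
                    nx = min(xs + dx, hw - 1)
                    # weight falls off as the depths diverge
                    w = np.exp(-abs(depth_full[y, x] - depth_half[ny, nx]) / sigma_d)
                    w_sum += w
                    v_sum += w * ao_half[ny, nx]
            out[y, x] = v_sum / w_sum
    return out
```

With constant depth this degenerates to a plain blend; across a depth edge, the mismatched samples get near-zero weight, so the AO stays crisp against the full-res geometry.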
I agree because fuck logic and value.
$1000 buys 1x 30" 2560x1600 monitor
$1000 buys 2-3x 27" 2560x1440 monitors
$1000 buys 2x Crossover 2755AMGs, which support 3840x2160 over HDMI @ 30Hz.
Which solution offers more screen real estate? The same pricing scheme applies to budget monitors: 2x 16:9 1080p monitors = 1x 16:10 1200p monitor.
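Putting rough numbers on that real-estate claim (pixel counts only; the prices and configurations are the ones from the post, not current market figures):

```python
# Total pixels per ~$1000 configuration from the comparison above.
# (count, horizontal res, vertical res); the 2755AMG pair is driven
# at 3840x2160 over HDMI, at 30 Hz.

configs = {
    "1x 30in 2560x1600":      (1, 2560, 1600),
    "2x 27in 2560x1440":      (2, 2560, 1440),
    "2x 2755AMG @ 3840x2160": (2, 3840, 2160),
}

totals = {name: c * w * h for name, (c, w, h) in configs.items()}
for name, px in totals.items():
    print(f"{name}: {px / 1e6:.1f} megapixels")
```

The 2160p pair comes out around 16.6 megapixels versus roughly 7.4 for the 1440p pair and 4.1 for the single 30", which is the whole argument; whether 30Hz is tolerable is a separate question.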
1.) Does the Crossover 2755AMG support 3840x2160?
2.) For a monitor that is roughly a fifth of the price of its competition, how bad is it?
30Hz ANYTHING -> NO.
1.) Yes
2.) The 2755AMG is an excellent monitor (review nearly finished) aside from the Plasma Deposition Coating (subjective; I'm not a fan of its blacks), the lack of a real warranty, the stand, and the inability to easily exchange or return it.
Agreed.
According to what I have gleaned from eBay and the review, it is a 1440p monitor
It is a native 1440p monitor, but it accepts a 3840x2160 signal over HDMI @ 30Hz.
AMD FTW!
Spamming to 50 posts FTL.
Both cards are great pieces of kit... but it's becoming more and more apparent that raw speed numbers (in reviews or benchmarks) shouldn't be the only factor on which a person bases their purchasing decision.
All the AMD cards (290 series included) are struggling to run that less-than-graphically-demanding simulation... and have been for a while.
It's a fair comment, but unfortunately smaller titles very commonly have huge gulfs in performance between GPU manufacturers. If it's not big enough to warrant code in the drivers to optimise it, it'll probably run terribly on whichever brand the developers didn't have in their testing machines.
I'm hoping that once the aftermarket R9 290Xs come out, and the bit/litecoin craze calms down, the prices of the GTX 780 Ti are driven down so I can step up to one and only pay a minimal difference.