GeForce GTX 780 Ti vs. Radeon R9 290X 4K Gaming @ [H]

Two AMD R9 290s for just $100 more than a single 780 Ti? Kind of a no-brainer. I'm switching back to AMD. To get a 60 fps minimum in BF4 I have to run the game on low/medium on my GTX 670.

Nvidia should be ashamed of themselves for the prices they've been pushing the last six months. Top of the line video cards should not cost more than $500. EVER. No excuses, no bullshit. If I need two of them to play games the way I want then they should never EVER cost more than $500.

Hopefully two 290s with Mantle can play BF4 at 4K 60fps+. I'll buy a Westinghouse 4K with a DP 1.2 to HDMI 2.0 adapter the second they become available.

NV's eyes are swelling with tears of shame. Fortunately, they have enough $100 bills from the sales to wipe away the tears.
 
Can't watch videos here, but why on earth would it need profiles, and how is it possible to make a game that can't work with a technology that replaces V-Sync?
I think he was saying that games that do not work correctly with it will not be given the G-Sync option in the game's profile. That way you will know ahead of time that it does not work, just like how ambient occlusion or some other settings are grayed out for some games in their profiles. And he was saying that some games already tried to compensate for V-Sync issues on their own, and they ended up not being compatible with G-Sync.
 
I wouldn't say the only things you would be saving are noise and power. You're also getting a game bundle, so if you like just one of those games, that's already 50 dollars right there. Add in the overclockability out of the box (no need for a non-stock cooler) and you have something maybe not worth 150 dollars more, but I think easily 100 dollars more.
 
I think he was saying that games that do not work correctly with it will not be given the G-Sync option in the game's profile. That way you will know ahead of time that it does not work, just like how ambient occlusion or some other settings are grayed out for some games in their profiles. And he was saying that some games already tried to compensate for V-Sync issues on their own, and they ended up not being compatible with G-Sync.

Sounds like Rage, though the problem on the surface appears to be pretty rare and doesn't have much bearing until we get more information. I'd still rather have G-Sync over any other incoming technology, and it'd be nice to get some word from AMD as to when they'll be supporting G-Sync monitors.
 
Those games are worth ~$15 now, go check on eBay.
 
I wouldn't say the only things you would be saving are noise and power. You're also getting a game bundle, so if you like just one of those games, that's already 50 dollars right there. Add in the overclockability out of the box (no need for a non-stock cooler) and you have something maybe not worth 150 dollars more, but I think easily 100 dollars more.

I prefer NV cards this generation, but I don't think anyone is paying $50 for these games. They're all selling for 10-15 bucks a pop at best; free game codes cause price depreciation fairly rapidly. The same thing happened with all of the Never Settle titles.
 
I kind of like seeing it proven that some games are so demanding that not even today's flagship cards can run them at max video settings. It's at 4K, granted, but it's nice to see the argument that software is behind the hardware capability get shot down... good for the consumer, as it gives GPU designers motivation to really make strides in performance and efficiency with each new hardware generation.

Thanks for another awesome review!
 
Hopefully sooner than later. $3K+ for a display is still at the insane level.

I'm still on 1080p until 1440p displays are affordable. $600+ for a good 1440p is nuts. I refuse to buy those cheap B-grade panels with defects.

Where is my 4K, <2ms, 144Hz display? (My god the GPU power required for this)

Sometime late next year is when 4K monitors should get a little cheaper. You really are paying for all that real estate until more monitors come to market and cause a price drop.
 
Maybe off topic, but for those who have used 4K for games, what is it really like? I've only seen the demo video at Best Buy and thought it was great looking. I am curious if it's like the difference between 800x600 and 1080p, or is it not as great of a jump.
 
Forbes doesn't make the 290X sound too good. Your article was great and has informed me on what to buy in the future.

Thanks,

Kerpal
 
Where is the talk about HDMI 2.0? Is AMD or Nvidia going to support it anytime soon?

4K gaming is going to take place in the living room on 4K TVs, not on a 31.5" screen that's overpriced at $3,500.
 
People who are bringing up G-Sync need to remember that it requires a part you add to the monitor, or buying a monitor with the part. Not all monitors are capable of taking the G-Sync module.

It won't work with games that have fixed fps profiles. These are mainly poor console ports.

If you are going to mention a feature as part of the cost calculation, you need to keep in mind the cost of getting G-Sync, especially when bitching about the price of a 4K monitor.
 
I'm glad this review didn't pretend like the fan noise was a deal-breaker for absolutely everybody (because clearly, absolutely everybody prioritizes the card's dB above EVERYTHING ELSE </s>) like another review site which shall remain unnamed (the writer admitted this bias, then proceeded to try to justify it again).
 
I know I speak largely for myself, but it would be awesome to see some CUDA GPU rendering tests with these cards; it's a good contrast from the normal gaming scene, but alas, I know this isn't the community for that. On a related note, I'm receiving my 780 Ti this week, so if anyone is interested in seeing me do some render tests, lemme know, I'll be more than happy to display the results for you visualization folks.
 
Maybe off topic, but for those who have used 4K for games, what is it really like? I've only seen the demo video at Best Buy and thought it was great looking. I am curious if it's like the difference between 800x600 and 1080p, or is it not as great of a jump.

For gaming it's not really all that impressive in most situations. For professional applications be it image/video editing, or CAD/CAM it's insanely useful.

Also your comparison isn't all that good. Back in ye old 4:3 days it was more 800x600 as the standard definition with 1600x1200 being the highest definition most consumer monitors supported. The higher end consumer monitors (more often pro monitors in most cases) spat out 2048x1536 or 2304x1440, in some cases even higher (though this required really specific hardware).

The jump in quality from 800x600 to 1600x1200 was massive and worth the asking price. The higher resolutions often weren't even supported by games and the increase wasn't nearly as visually impressive when the game was actually in motion. For static images though you could really tell.
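
For a rough sense of scale, here's the back-of-the-envelope pixel math (just my own illustrative arithmetic, nothing from the review):

Code:
# Back-of-the-envelope pixel counts for the resolutions discussed above.
# Illustrative arithmetic only; perceived sharpness also depends on
# screen size and viewing distance.
resolutions = {
    "800x600": (800, 600),
    "1600x1200": (1600, 1200),
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160 (4K UHD)": (3840, 2160),
}

base = 800 * 600
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>20}: {pixels:>9,} pixels ({pixels / base:.1f}x 800x600)")

The 1080p-to-4K jump is the same 4x pixel ratio that 800x600-to-1600x1200 was, while 4K is roughly 17x the pixels of 800x600.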

Nvidia should be ashamed of themselves for the prices they've been pushing the last six months. Top of the line video cards should not cost more than $500. EVER. No excuses, no bullshit. If I need two of them to play games the way I want then they should never EVER cost more than $500.

Video cards over 500 bucks have been a thing since the 6800, video cards going over 600 bucks have been a thing since the 8800. Top of the line mobos didn't used to cost what they do now either.

Plus the markets for just about everything are moving to super premium markets that are price exclusive, and cheaper stuff that's deliberately lower quality for the rest to help increase the appeal of the exclusive. This isn't a bad thing, this is the proper free market, low regulation, capitalist system working. Point blank the people who deserved more are taking home their fair share of the pie now and idiotic liberal schemes to build a middle class have been gradually dismantled. If you don't like it move to a job that pays more or a more protectionist country, this is a period of great progress.
 
..this is the proper free market, low regulation, capitalist system working. Point blank the people who deserved more are taking home their fair share of the pie now and idiotic liberal schemes to build a middle class have been gradually dismantled. If you don't like it move to a job that pays more or a more protectionist country, this is a period of great progress.

[image: kUKVtpz.png]
 
Sad part is that he's right, but it's a pretty obvious conclusion, and one hell of an overblown response to forum kids complaining about video card prices :D.
 
Zarathustra[H];1040372873 said:
When we start seeing decent 16:10 4k monitors at the $1000 - $1500 price range, I will start considering them a little bit more seriously.

Never. 16:10 is gone. Accept the bigger sizes and resolutions.
 
Where is the talk about HDMI 2.0? Is AMD or Nvidia going to support it anytime soon?

4K gaming is going to take place in the living room on 4K TVs, not on a 31.5" screen that's overpriced at $3,500.

DP 1.2 to HDMI 2.0 adapters + cheap Chinese 4K = Happy Panda.
 
Sad part is that he's right, but it's a pretty obvious conclusion, and one hell of an overblown response to forum kids complaining about video card prices :D.

I'm 32, make a solid living, have no financial limitations and still think >$500 is just ridiculous for a videocard.
 
I agree on using the über mode for testing, for all the reasons mentioned in the article. The thing is, those exact same arguments would justify maxing out the power and temp limits for the GK110 cards, with only one exception: it can't be done without 3rd party software. But since that software is the likes of Precision X or Afterburner, I don't see the problem: everyone is using them anyway. If anything, it's easier than opening your case and flipping a switch.

I would argue that for GK110, those software limits and their adjustability are official features by Nvidia and leaving them unaltered is artificially holding back the performance of Boost 2.0 video cards.
 
I agree on using the über mode for testing, for all the reasons mentioned in the article. The thing is, those exact same arguments would justify maxing out the power and temp limits for the GK110 cards, with only one exception: it can't be done without 3rd party software. But since that software is the likes of Precision X or Afterburner, I don't see the problem: everyone is using them anyway. If anything, it's easier than opening your case and flipping a switch.

I would argue that for GK110, those software limits and their adjustability are official features by Nvidia and leaving them unaltered is artificially holding back the performance of Boost 2.0 video cards.

The fan on NV GPUs is already capable of going to max fan speed by default. The only thing the switch does on the 290X is change the fan from a locked 40% cap to a locked 55% cap. So NV GPUs already have the advantage with the fan not being locked, while even in Uber mode, it is locked at 55%.

The only thing you could change on the NV GPUs is to raise the power level, and the offset. When you do that, you are overclocking. We'd then have to also raise the power target and so forth on the AMD side, to be fair.

So using Uber mode is only making the cards more similar. Even in Uber, they aren't "the same" since one is locked at 55% fan, and the other gets to go to whatever it wants with no caps. Even in Uber mode, NV GPUs have the advantage still on fan.

Uber mode is NOT overclocking, people need to get that through their heads first. It just raises the fan from 40% to 55%. We are not manipulating power or clock offset.

It is a hardware implementation on all R9 290X's, and does not require any software manipulation.

We will continue to test in Uber mode for all the reasons stated in the article.
 
I agree on using the über mode for testing, for all the reasons mentioned in the article. The thing is, those exact same arguments would justify maxing out the power and temp limits for the GK110 cards, with only one exception: it can't be done without 3rd party software. But since that software is the likes of Precision X or Afterburner, I don't see the problem: everyone is using them anyway. If anything, it's easier than opening your case and flipping a switch.

I would argue that for GK110, those software limits and their adjustability are official features by Nvidia and leaving them unaltered is artificially holding back the performance of Boost 2.0 video cards.

IMO, this is not an overclock versus overclock GPU contest. I'm sure that article will be forthcoming at some point - what you seem to be asking for is a max OC 290X versus max OC 780 Ti comparison. That isn't the intent of this article. I've stated multiple times that I do not like the 290X compromise for Uber mode, but with that being said, what you're asking for is not in any way comparable. Uber mode is a factory-warrantied preset that isn't equal to overclocking. In fact, Uber mode merely guarantees consistent boost clock speeds, while adjusting power targets and offsets for the 780 Ti is overclocking.
 
^ All Uber mode does is set the fan to be able to go up to 55%, versus 40%. It doesn't do anything else. And the fan is still dynamic, it doesn't have to run at 55%, but it can go up to it, if it needs to, with the higher fan cap.

Whereas, on NVIDIA GPUs the fan cap is already at 100%. The fan can go up to 100% by default if it wants to.

Therefore, even at Uber mode, the NV GPUs still have an advantage in fan profile, and even at Uber mode the 290X is still capped, versus the NVIDIA.

290X = Quiet Mode fan cap 40%, Uber Mode fan cap 55% (can go up to 55% if needed)
NV GPUs = No fan cap by default, can go up to 100% if needed

So you tell me which one has the advantage out the gate.
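
If it helps, here is the difference in rough pseudo-code form (an illustrative sketch only, not AMD's or NVIDIA's actual fan control logic; the curve and the numbers are made up):

Code:
# Illustrative sketch of a temperature-driven fan controller with a speed cap.
# Not AMD's or NVIDIA's actual fan control logic; the curve is made up.
def fan_speed(temp_c, temp_target_c=95, fan_cap_pct=55):
    """Ramp fan demand with temperature, then clamp it to the cap."""
    if temp_c <= 60:
        demanded = 20.0                     # idle floor
    else:
        # Ramp linearly from 20% at 60C up to 100% at the temp target.
        demanded = 20.0 + 80.0 * (temp_c - 60) / (temp_target_c - 60)
    return min(demanded, fan_cap_pct)       # the cap is the only real difference

# 290X Quiet mode: fan_cap_pct=40; 290X Uber mode: fan_cap_pct=55;
# NV reference coolers: fan_cap_pct=100 (effectively uncapped).
print(fan_speed(94, fan_cap_pct=40))   # pinned at 40 in Quiet mode
print(fan_speed(94, fan_cap_pct=55))   # pinned at 55 in Uber mode
print(fan_speed(94, fan_cap_pct=100))  # free to go to ~97.7 uncapped

The demanded fan speed still responds to temperature either way; in this picture the only thing the switch changes is the value of the cap.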
 
I wouldn't call it an advantage since this is how cards have worked for a while. More like AMD is disadvantaged because they castrated the fan profile to avoid noise.
 
The fan on NV GPUs is already capable of going to max fan speed by default. The only thing the switch does on the 290X is change the fan from a locked 40% cap to a locked 55% cap. So NV GPUs already have the advantage with the fan not being locked, while even in Uber mode, it is locked at 55%.

The only thing you could change on the NV GPUs is to raise the power level, and the offset. When you do that, you are overclocking. We'd then have to also raise the power target and so forth on the AMD side, to be fair.

So using Uber mode is only making the cards more similar. Even in Uber, they aren't "the same" since one is locked at 55% fan, and the other gets to go to whatever it wants with no caps. Even in Uber mode, NV GPUs have the advantage still on fan.

Uber mode is NOT overclocking, people need to get that through their heads first. It just raises the fan from 40% to 55%. We are not manipulating power or clock offset.

It is a hardware implementation on all R9 290X's, and does not require any software manipulation.

We will continue to test in Uber mode for all the reasons stated in the article.

I agree with your logic regarding the über mode and with your decision to run your tests using it.

The problem is that you don't use the same logic on the power and temperature limits for the GK110. Your reasoning in using the über mode had three arguments (bolded):

1. It is an official performance mode: True. But what is your definition of "official"? I think the balancing act of GPU Boost 2.0 is quite official, looking at Nvidia's own slides or, for example, Tom Petersen's appearance on PCPer's GTX Titan live review. Moving those two limits is not overclocking, it's allowing the card to sustain the max boost clocks. Until you use the offset slider or give the card permission to use voltage bins above the stock setting, there is no increase in max clocks. It just lets the card hold those max clocks more consistently. To me, that looks like exactly what über mode does for Hawaii cards.

2. It is easily obtained by any and all Hawaii users: True. But what makes maxing the power and temp limit on a GK110 more difficult? The only difference is that instead of a physical switch, you have two software sliders. Oh yes, and you need to install software that just about everyone uses anyway.

3. You don't want to "artificially" hold back performance: What makes maxing the power and temp limits more "artificial" than enabling the über mode? Is your threshold for "artificial" really the line between software and hardware? So if the über mode was a switch in CCC, you wouldn't enable it? Are the driver changes that AMD made to equalize the fan RPM of all the cards "artificial", too since they are after all software?

This isn't a big deal as such, it's just that your own arguments don't seem to keep their logic when going from AMD to Nvidia. The only real difference is that AMD has a hardware switch and Nvidia has two software sliders. It's not overclocking in either case, just altering the balancing limits that each manufacturer has seen fit to ship as the default setting.

"If you are spending that much money on a video card, you don't want to hold it back and not get your money's worth" is what you say in the article. The same is true for GK110, even more so since it is after all even more money. :D
 
Official means you have an official way of doing it, something provided by the company itself. If Nvidia had an uber mode hardware switch which increased the power and temp target, they'd use that. Same if they had one in the Nvidia control panel. But when it requires a third-party application to achieve, it can't be argued to be an official alternative.

And keep in mind, just as Nvidia's cards are being held back some with default, official settings, so are the AMD cards being held back some in the official performance mode.
 
Official means you have an official way of doing it, something provided by the company itself. If Nvidia had an uber mode hardware switch which increased the power and temp target, they'd use that. Same if they had one in the Nvidia control panel. But when it requires a third-party application to achieve, it can't be argued to be an official alternative.

And keep in mind, just as Nvidia's cards are being held back some with default, official settings, so are the AMD cards being held back some in the official performance mode.

So when a company says this on their official webpage, it's not official because of 3rd party?

More Control, Less Noise

GPU Boost 2.0 also offers improved user-control to GTX TITAN and GTX 700 Series owners, who can tweak the Boost behavior by increasing or decreasing the Temperature Target with third-party software. This allows users to decrease the maximum temperature, speed, and noise output of a GPU when working or playing older games, and to ramp everything up to max when playing the likes of Metro: Last Light.

SOURCE

There really are no clear-cut answers in all of this variable clock rate debate. For example, the basic philosophy behind GPU Boost 2.0 and AMD's new PowerTune is very much the same, but the ways each gives end users the ability to manipulate the parameters are different. That difference is that AMD included a physical switch to get rid of most of the throttling, whereas Nvidia decided to do the same by co-operating with 3rd parties on software. Both used very similar PR in promoting the customizable nature of their balancing technologies, so to rule one way as "artificial" and the other as "natural" is a judgement call. Not necessarily a wrong call, but debatable.
 
Maybe, but instead of including an option for a higher performance mode in their control panel, which Nvidia could easily have done, they chose a solution which they knew would really just be used by a smaller number of their customers. I'd certainly call that 'less official'.
 
Note the fact that the quote says "with third-party software".

That's the difference. With the AMD card, it's a switch put on the card by AMD, no software manipulation. With NVIDIA you have to install third party utilities to modify software settings. If NVIDIA had a hardware performance switch, we'd use it. We are going with officially supported performance modes, AMD has two modes, Quiet and Uber. NVIDIA has just one mode.
 
Nice review Brent. How soon till an overclocked vs. overclocked review is done? Curious what cooling measures need to be taken to OC the R9 290Xs, and what the resultant temperatures and % improvements in performance are.
 
and it'd be nice to get some word from AMD as to when they'll be supporting G-Sync monitors.
G-Sync is never coming to AMD video cards, it's proprietary NV tech.

The concept behind G-Sync is a nice idea, but as a proprietary tech which ties your monitor to a particular IHV's video cards, it's just plain stupid to buy. Unless you're rich and have money to burn, that is.
 
Note the fact that the quote says "with third-party software".

That's the difference. With the AMD card, it's a switch put on the card by AMD, no software manipulation. With NVIDIA you have to install third party utilities to modify software settings. If NVIDIA had a hardware performance switch, we'd use it. We are going with officially supported performance modes, AMD has two modes, Quiet and Uber. NVIDIA has just one mode.

If that is where you draw the line, then that is completely fine. What you could have done is include that note in the article. To say this...

Third, we do not want to "artificially" hold back the performance of a video card. If we do so, what are we showing really? We want the potential performance to come through.

...and then say nothing about the nature of GPU Boost 2.0 and its configurability leaves the story half-told IMHO.

But enough about this. The article was top notch, keep them coming!
 
Yes, it should also have a VGA input... The one I will maybe buy is an Anderson, which is probably a rebranded Seiki. Whether a VGA input helps, I don't know; it's been some years since I was a gamer.
 
http://hisense-usa.com/tvs/XT880/55/0

This is another one whose price should be around $2,000:

Terminals:
RF Input: 1
CVBS Video Input: 1
L/R Audio Input for CVBS: 1
YPbPr Video Input: 1
L/R Audio Input for YPbPr: 1
RGB Input (15-pin D-Sub for PC): 1
Audio Input for RGB: 1
HDMI/HDCP Input: 4 (1x UHD, 1x ARC, 1x DVI, 1x HDMI)
Digital Audio Output: 1x Optical
USB (v3.0/2.0): 3
Earphone / Analog Audio Output: 1
RJ45 (Ethernet): 1

But it tells me that 4K prices will come down next year.
 