The 750Ti needs an "Uber Mode"

Stoly

Supreme [H]ardness
Joined
Jul 26, 2005
Messages
6,713
The 750 Ti is really no match for the R7 265 on performance, but once you OC it, it can match the 265 and still use less power.

From what I've seen, pretty much every card can do over 1.21 gigawatts... I mean GHz... and more under boost. Yet it will still work with a 400W PSU and draw less than 90W.

NVIDIA should have pulled an AMD and offered an "Uber Mode" for gaming and a "normal" mode for HTPC use.
 
For Nvidia to make every card have a higher boost, they need to be certain every card coming off the production line can meet those clocks.
Apparently, they can't guarantee it.

Also, something something power requirements. NVIDIA wants people to put 750 Tis in their old Dells running generic-brand 200W PSUs.

As for the R7 265, it's not even released yet. And it requires a 6-pin connector, which makes it a non-option for a lot of people.
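For a rough sense of why a 60 W card with no external power connector works in those boxes, here's a back-of-envelope budget check - the GPU TDP is NVIDIA's official figure, but the CPU and rest-of-system numbers are illustrative guesses, not measurements:

```python
# Rough PSU budget check for dropping a GTX 750 Ti (60 W TDP, no
# external power connector) into an OEM box with a 200 W supply.
# The CPU and rest-of-system figures are illustrative guesses.

PSU_WATTS = 200
GPU_TDP = 60          # GTX 750 Ti official TDP
CPU_TDP = 65          # e.g. a typical OEM desktop CPU
REST_OF_SYSTEM = 40   # drives, fans, motherboard; rough estimate

total = GPU_TDP + CPU_TDP + REST_OF_SYSTEM
print(f"Worst-case draw ~{total} W of {PSU_WATTS} W "
      f"({PSU_WATTS - total} W headroom)")
```

Even under worst-case assumptions, the system stays comfortably under the supply's rating - which is exactly the market a connector-less card targets.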
 
They seem to have a lot of headroom. I've been playing around with my SuperClocked edition today and I've got it stable at 1409 MHz so far - that's the boost clock, of course. I can clock it a little higher and it will pass some benches, but I was getting driver crashes after a few minutes of gaming. It seems pretty solid so far at 1409 MHz.
I'm debating buying the FTW version with the extra 6-pin, just to see if the extra voltage would let it clock higher.
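For scale, here's a quick sketch of how much headroom that 1409 MHz figure represents, assuming NVIDIA's reference boost clock of 1085 MHz for the 750 Ti (factory-overclocked cards ship higher than reference):

```python
# How big is a 1409 MHz overclock? Reference boost for the
# GTX 750 Ti is 1085 MHz; factory-OC cards ship above that.

REFERENCE_BOOST_MHZ = 1085
STABLE_OC_MHZ = 1409

headroom_pct = (STABLE_OC_MHZ - REFERENCE_BOOST_MHZ) / REFERENCE_BOOST_MHZ * 100
print(f"Overclock headroom: {headroom_pct:.1f}% over reference boost")
```

Roughly a 30% clock bump over reference, on a card that still stays under its stock power limit, is unusual headroom.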

 
Last edited:
The 750 Ti is really no match for the R7 265 on performance, but once you OC it, it can match the 265 and still use less power.

From what I've seen, pretty much every card can do over 1.21 gigawatts... I mean GHz... and more under boost. Yet it will still work with a 400W PSU and draw less than 90W.

NVIDIA should have pulled an AMD and offered an "Uber Mode" for gaming and a "normal" mode for HTPC use.

I doubt they will encourage it until they've cleared off enough GTX 650/660 stock.
 
I doubt they will encourage it until they've cleared off enough GTX 650/660 stock.

As it is, even the GTX660 is in trouble simply because the GTX750Ti can keep up with it. Forget about performance per watt - on a straight performance-for-price basis, the GTX660 loses to the GTX750Ti. (To add insult to injury, the GTX660's memory bus is half again as wide as the GTX750Ti's - 192-bit vs. 128-bit.)
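That performance-for-price argument can be sketched numerically. The TDPs below are the official figures, but the fps numbers and street prices are purely illustrative assumptions, not benchmark results:

```python
# Back-of-envelope perf-per-dollar and perf-per-watt comparison.
# TDPs are the official figures; the fps numbers and street prices
# are illustrative assumptions, NOT benchmark results.

cards = {
    "GTX 660":    {"fps": 60, "price_usd": 190, "tdp_w": 140},
    "GTX 750 Ti": {"fps": 52, "price_usd": 150, "tdp_w": 60},
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['price_usd']:.3f} fps/$, "
          f"{c['fps'] / c['tdp_w']:.3f} fps/W")
```

Under these assumptions the 660 wins on raw frames, but the 750 Ti comes out ahead on both fps per dollar and (by roughly 2x) fps per watt.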

If you remember the old engineer's joke "The difficult we do right away - the impossible takes a little longer.", the GTX750Ti is the real-world proof, as it does what was long thought - by nearly everyone - to be impossible.
 
All the various reviews show the 660 is still pushing quite a few more frames than the 750 Ti if you simply forget about watts and go for pure power.

The 750 Ti chokes pretty hard on the latest games because its memory bandwidth is low, forcing quality settings down just to get playable rates.

Besides Skyrim, which is more CPU-bound, I'd say the 750 Ti is only good up to 1600-wide resolutions.
 
An AMD card sold at MSRP? Hahahahaha. That was good, tell me another one.
 
An AMD card sold at MSRP? Hahahahaha. That was good, tell me another one.
If you're not in the US, it can make sense.

As it is, even the GTX660 is in trouble simply because the GTX750Ti can keep up with it. Forget about performance per watt - on a straight performance-for-price basis, the GTX660 loses to the GTX750Ti. (To add insult to injury, the GTX660's memory bus is half again as wide as the GTX750Ti's - 192-bit vs. 128-bit.)
Another example is how gimped the GTX 460's clock speeds were so they could clear out those pathetic GTX 465 cards.
 
If you remember the old engineer's joke "The difficult we do right away - the impossible takes a little longer.", the GTX750Ti is the real-world proof, as it does what was long thought - by nearly everyone - to be impossible.
Did people genuinely believe it was impossible? NVIDIA just did good engineering work with a greater focus on power consumption: there's nothing magical about Maxwell that I'm aware of. As NVIDIA explains, the greatest benefit came simply from further partitioning SP clusters into smaller units.

"Simply" being used loosely, of course.
 
Did people genuinely believe it was impossible? NVIDIA just did good engineering work with a greater focus on power consumption: there's nothing magical about Maxwell that I'm aware of. As NVIDIA explains, the greatest benefit came simply from further partitioning SP clusters into smaller units.

"Simply" being used loosely, of course.

wonderfield - the denial, even with the numbers being easily duplicated, is still present.
Lots of folks STILL don't want to believe that a GPU with a 128-bit memory bus can keep up with a GPU with twice the memory bus, even with the data staring them right in the face.

Basic Kepler is plenty powerful - that much isn't in dispute. However, it's not efficient (as has been typical for most "1.0" GPU architectures), and definitely not as efficient as even enthusiasts would prefer - not even at the highest end (GTX Titan and GTX Titan Black). Titan and Titan Black are for those who want pure power, with efficiency being irrelevant.

TegraK1, on the other hand, is all about vastly increased efficiency, since there isn't enough room in a smartphone or tablet or other targeted device to apply boatloads of power - there isn't enough display to even take advantage of all the power available. The direct inverse of basic Kepler - maximum efficiency, but pure power is mostly irrelevant.

Maxwell splits the difference - Kepler's power, but TegraK1's efficiency.

In racing terms, GTX Titan and GTX Titan Black are drag racers (Top Fuel drag racers, to be exact) - quarter-mile beasts untouchable by more plebeian hardware in terms of sheer speed/power. However, that lack of efficiency makes them unsuitable for longer distances, or tracks that require you to make turns (ovals, tri-ovals, etc.).

TegraK1 is hyper-efficient - think Fiat 500 Abarth. Energy-sipping marathoners - built for those long distance endurance races (Rolex 24, etc.).

Maxwell overall is a more-efficient Kepler; GM107 in particular is a proof of concept, targeting the overlong overhang that GTX650Ti and GTX660 represent. Efficiency CAN make up (to an extent) for lack of power - GTX650Ti and GTX650Ti BOOST proved that compared to GTX550Ti. GTX750 and GTX750Ti prove it yet again, except this time GTX650Ti and Ti BOOST are the victims (along with, to an extent, even GTX660). GM107 isn't designed, or even intended, to threaten GTX760; however, it does provide a 7-series GPU to slot in just below it in terms of both price and performance. (Remember, until this GPU came around, the 7-series had NO real midrange, let alone mainstream, members - while the GTX760 has occupied that role, it wasn't by design so much as by default.)

Base Kepler has an efficiency problem - that much we have known simply from the data that testing GTX Titan, let alone other 7-series GPUs, has coughed up, and from the utter lack of a sub-GTX760. AMD's R7 rebadges are designed to (in the short term) take advantage of that lack - even AMD wasn't expecting it to continue forever.

The disruptive factor of Maxwell overall is vast improvements in efficiency - due to the Kepler heritage, it's not really lacking in terms of power. The disruptive factor of GTX750/750Ti in particular is that the increased efficiency can utterly erase (and cover up for) certain flaws (such as lack of bandwidth) that were thought to be unaddressable by gains in efficiency. No, Maxwell didn't reinvent the wheel - but the efficiency gains it brings to the table are huge and without precedent, especially in this particular space.
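The bandwidth gap being covered up here is easy to quantify from the reference specs (192-bit at 6.0 Gbps effective for the GTX 660, 128-bit at 5.4 Gbps effective for the GTX 750 Ti):

```python
# Peak memory bandwidth from bus width and effective memory clock.
# Reference specs: GTX 660 = 192-bit at 6.0 Gbps effective,
# GTX 750 Ti = 128-bit at 5.4 Gbps effective.

def bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    """(bus width / 8) bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * effective_gbps

print(f"GTX 660:    {bandwidth_gbs(192, 6.0):.1f} GB/s")
print(f"GTX 750 Ti: {bandwidth_gbs(128, 5.4):.1f} GB/s")
```

That works out to roughly 144 GB/s vs. 86 GB/s - a 40% deficit that Maxwell's efficiency gains largely paper over in practice.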
 
All the various reviews show the 660 is still pushing quite a few more frames than the 750 Ti if you simply forget about watts and go for pure power.

The 750 Ti chokes pretty hard on the latest games because its memory bandwidth is low, forcing quality settings down just to get playable rates.

Besides Skyrim, which is more CPU-bound, I'd say the 750 Ti is only good up to 1600-wide resolutions.

Which is supposed to be the case - the GTX660 has the extra power available to it, while the GTX750Ti does not.

You are insisting on trading power for efficiency.

Until now, if you wanted to even consider an nV 7-series GPU, that was your only option, in fact. GTX750 (in both base and Ti trim) brings massive efficiency gains to the table, while sacrificing surprisingly little in terms of deliverable power. And exactly what IS the loss in terms of frames per second in Skyrim (GTX750Ti compared to higher-bandwidth GPUs from both AMD and nVidia), all else being equal? I'm not saying there isn't one - the question is where it is and how much it is.

There ARE users that have to (for whatever reason) make the reverse tradeoff - increased efficiency, not increased power. That is what Maxwell (GM107 now, and GM200 in the future) is about.
 
TegraK1, on the other hand, is all about vastly increased efficiency, since there isn't enough room in a smartphone or tablet or other targeted device to apply boatloads of power - there isn't enough display to even take advantage of all the power available. The direct inverse of basic Kepler - maximum efficiency, but pure power is mostly irrelevant.


TegraK1 is based on Kepler; no way it's more efficient than Maxwell.
 
TegraK1 is based on Kepler; no way it's more efficient than Maxwell.

I said that TegraK1 is more efficient than base Kepler, but has less power than either Kepler or Maxwell.

Tegra doesn't need as much power as Maxwell, let alone Kepler, because it is dealing with smaller display sizes - how large is the largest TegraK1 display, in screen size OR resolution?

Maxwell has greater efficiency than base Kepler - in other words, less of the power it has is wasted; that is, in fact, a contribution from the engineering of TegraK1. However, it is not designed to trade power shots with Kepler - yet. Maybe some future smaller-node iteration of Maxwell COULD trade power shots with GTX Titan Black or even GTX Titan - however, GM107 isn't it.

GM107, due to greater efficiency (not just compared to Kepler, but even to Fermi), can darn near trade power shots with Fermi at a far lower TDP - the disbelief comes in due to GM107's price, which is less than even refurbished/reconditioned Fermi, in addition to GM107 having as little as half the memory bus of Fermi. THAT is where the mutters come in.
 
Which is supposed to be the case - the GTX660 has the extra power available to it, while the GTX750Ti does not.

You are insisting on trading power for efficiency.

Until now, if you wanted to even consider an nV 7-series GPU, that was your only option, in fact. GTX750 (in both base and Ti trim) brings massive efficiency gains to the table, while sacrificing surprisingly little in terms of deliverable power. And exactly what IS the loss in terms of frames per second in Skyrim (GTX750Ti compared to higher-bandwidth GPUs from both AMD and nVidia), all else being equal? I'm not saying there isn't one - the question is where it is and how much it is.

There ARE users that have to (for whatever reason) make the reverse tradeoff - increased efficiency, not increased power. That is what Maxwell (GM107 now, and GM200 in the future) is about.

I understand - for those with small boxes or a low-power PSU without a 6-pin, these are going to be golden, but for those with full towers and 600W+ PSUs, where power draw means nothing, the 660 is more powerful.

If the 660 would drop to MSRP like it was supposed to months ago - around $170 - they would fly off the shelves.

As for frames in Skyrim, the 750 Ti easily stays above 60 maxed out, but on some of the more demanding games it chokes at HD resolutions unless quality is turned down; at 1600-wide, not so much.

If only they would have released a 755 or something on 28nm with double what the 750 does at around 120W TDP, that would be killer - but sadly it would beat the 760 and ruin its sales.
 
I understand - for those with small boxes or a low-power PSU without a 6-pin, these are going to be golden, but for those with full towers and 600W+ PSUs, where power draw means nothing, the 660 is more powerful.

If the 660 would drop to MSRP like it was supposed to months ago - around $170 - they would fly off the shelves.

As for frames in Skyrim, the 750 Ti easily stays above 60 maxed out, but on some of the more demanding games it chokes at HD resolutions unless quality is turned down; at 1600-wide, not so much.

If only they would have released a 755 or something on 28nm with double what the 750 does at around 120W TDP, that would be killer - but sadly it would beat the 760 and ruin its sales.

That area is the turf that the GTX760 occupies - that is why this is a GTX750Ti or GTX750, not a GTX760 replacement.

There ARE GTX760 variants that use but a single power connector, such as the ASUS GTX760 DirectCU II; even better, they are priced just above the GTX750Ti. If it weren't for the GTX750Ti, those would be the top of my shortlist.

The reasons that the GTX750Ti replaces the GTX760 DirectCUII atop the shortlist are threefold:

1. Extremely minimal power requirements - it can trade blows (in the games that I play) with even GTX660, while using less power than even GTX550Ti (which it would replace).

2. Direct streaming support - this is the lowest-priced card that supports NVIDIA GameStream, superseding the GTX650Ti (which it will replace in nVidia's lineup as well).

3. SLI? My interest in SLI is zero - my interest in cryptocurrency is also zero (except maybe as a "tinkerer's project"). If either were a higher-priority interest, there is the GTX760 DirectCU II and its clones.
 
There are clearly two lines of thinking at play here. First is the power desktop user who doesn't give a shit about power consumption. For this user there are obvious choices that deliver better performance per dollar than GM107. The second user, though, is someone looking for an SFF or mobile system with the best graphics performance possible - for this, GM107 is a fairly incredible product IMO. GM107 will be used in a wide variety of ultrabooks and SFF systems and will outperform anything else in the 60W TDP range by a mile. More than a mile, in fact - it is so much faster than everything else at 60W TDP that it isn't even funny: more than twice the speed of the 65W TDP R7 250.

Again, for the desktop user with a 600W-plus PSU who doesn't care about power, there are clearly better options. But GM107 was designed for mobile first, and as such the obvious focus was on efficiency. I think that's pretty cool for two reasons: one, this thing can be used in so many small devices; two, GM200 and GM204 should be very promising once scaled up with tons of CUDA cores.
 
There are clearly two lines of thinking at play here. First is the power desktop user who doesn't give a shit about power consumption. For this user there are obvious choices that deliver better performance per dollar than GM107. The second user, though, is someone looking for an SFF or mobile system with the best graphics performance possible - for this, GM107 is a fairly incredible product IMO. GM107 will be used in a wide variety of ultrabooks and SFF systems and will outperform anything else in the 60W TDP range by a mile. More than a mile, in fact - it is so much faster than everything else at 60W TDP that it isn't even funny: more than twice the speed of the 65W TDP R7 250.

Again, for the desktop user with a 600W-plus PSU who doesn't care about power, there are clearly better options. But GM107 was designed for mobile first, and as such the obvious focus was on efficiency. I think that's pretty cool for two reasons: one, this thing can be used in so many small devices; two, GM200 and GM204 should be very promising once scaled up with tons of CUDA cores.

xoleras - that is indeed the point I have been trying to make.

I pointed out that choosing a refurbished GTX550Ti as an in-place upgrade was basically making the best of a poor economic situation. The embarrassing part (for me) is that GM107 (in GTX750Ti trim) not only spanks a refurbished GTX550Ti, but can trade blows with a GTX650Ti or even a GTX660 - despite significant power and memory-bus handicaps - for the same price as a GTX650Ti and less than a refurbished GTX660. Ignoring that sort of performance would be not merely silly, but stupid.

An "Uber Mode" GTX750Ti would likewise be silly - especially with the GTX760 just above it in terms of nVidia's lineup and price. If the GTX760 would be a better performance fit, but you want a single-connector design, there is at least one GTX760 that fits that spec - ASUS' GTX760 DirectCU II; further, the DirectCU II supports SLI, which the GTX750Ti specifically does not.

GM107 is nastily and messily disruptive - that much is certain. However, it also addresses several issues that nVidia has to face:

1. Lack of a midrange 7-series GPU: GTX760 has the space by default, not design.

2. The resultant overhang of GTX65x/66x - GTX650/Ti/Boost, and even GTX660, have become the Windows XP of nVidia's GPU lineup - hanging around and hanging around far longer than they are supposed to.

3. I have nothing against folks needing more power; all I'm saying is that not everyone needs it, or can even use it. That is what midrange GPUs are supposed to be for.
 
I was looking at the MSI 750 Ti with the heatpipes. But that model is the OC model. Does anyone know if you can keep it on stock speeds?

One odd thing, though: at the time of this writing it's $170, while the MSI GT 640 was under $100 when I bought it. However, the GT 640 is not the model with the heatpipes, and it's noisy.
 
I was looking at the MSI 750 Ti with the heatpipes. But that model is the OC model. Does anyone know if you can keep it on stock speeds?

One odd thing, though: at the time of this writing it's $170, while the MSI GT 640 was under $100 when I bought it. However, the GT 640 is not the model with the heatpipes, and it's noisy.

You can always underclock a factory-overclocked card, using the same utility you'd use to monitor or raise clocks.
 