Adding A GeForce GTX 750 Ti To An OEM PC

HardOCP News

PC Perspective decided to add a GeForce GTX 750 Ti to a handful of budget OEM computers just to see what would happen.

In our article today we saw increases as high as 10.2x - an order of magnitude increase in the average frame rate!! Bioshock Infinite on the Lenovo H520 went from a completely unplayable mess to an ultra-smooth 1920x1080 title with plenty of room for image quality increases.
 
Great video and really hits the mark on why this card is so great. It's really been the first card ever that you can stick in just about any system and see huge gains in gaming.
 
Great video and really hits the mark on why this card is so great. It's really been the first card ever that you can stick in just about any system and see huge gains in gaming.

That's the point I've been trying to make - I haven't seen a GPU - or any one-part upgrade - with this wide-ranging an impact ever. Not at any price. (And yes, that includes ALL of 3dfx's cards.)

At even the high end (non-reference), the GTX 750 Ti is not even $175 USD. (My own leading choice among the non-reference cards - EVGA's GTX 750 Ti - is $150 USD, either direct from EVGA or Amazon; I chose it because of the full-size HDMI-out and because it *still* doesn't require additional power, yet it can go in any motherboard that supports PCI Express x16 discrete graphics. Unless you're a real cheapskate, the question - if you are looking to upgrade from low-end discrete or onboard graphics - is more why NOT the GTX 750 Ti, as opposed to WHY the GTX 750 Ti.)
 
My HTPC needed a graphics card - it's using integrated currently and stutters pretty badly in places.

Think they'll be able to make a passively cooled version of this card?
 
Didn't we talk about this a while ago, and it was deemed impossible by some idiots? Mainly due to the power supply. I will repeat: you don't need an expensive 1000 W power supply to run a graphics card. As long as you have a quality 350 W power supply, it'll do - even for the heavy-duty video cards that have a power connector on them.
 
I love how even the garbage 250 W stock units can power the 750 Ti; I saw a few people claiming they'd have to go out and buy a "decent" 400 W+ unit to power them.

As it stands now, anybody with any kind of PC built within the last 4 years or so can destroy next-gen consoles for an entry fee of $150. Hilarious, and sad. I'm excited to see what the future brings.
 
The only problem with this card (and I have one on the way) is that the miners are going to be all over them like a fat kid on cake. So if you think you might want one, it might be wise to jump now, because in a few weeks they might be $300 and not $200.
 
Most of the miners have been complaining that the price is too high for the hash rate they will get. I don't mine, so I don't know. I just know it's not an AMD card, so maybe it'll keep a low profile in the mining community :D

I got mine today (EVGA 750 Ti SC), and I can say it's a beastly little card for the money. It runs very cool, and with a little adjustment I had mine running a touch over 1400 MHz core clock. Stock, it was boosting to 1320(ish). I slid the slider up to +135, and it passed a couple of runs of Valley and Fire Strike. I may end up switching this one for an FTW card, just to see what having the extra 75 W from the 6-pin does.
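For anyone curious how that slider math shakes out, here's a quick back-of-the-envelope sketch (the clock numbers are the ones from this post, and GPU Boost behavior varies card to card, so treat it as illustrative only):

```python
# Back-of-the-envelope GPU Boost math for the EVGA 750 Ti SC post above.
# Numbers are the poster's; actual boost clocks vary with temp and load.
stock_boost_mhz = 1320   # observed stock boost, per the post
offset_mhz = 135         # slider offset applied in the overclocking tool

requested_mhz = stock_boost_mhz + offset_mhz
print(f"requested boost: ~{requested_mhz} MHz")  # ~1455 MHz

# The poster saw "a touch over 1400 MHz", i.e. the card settled below the
# requested clock -- GPU Boost still enforces its power/thermal limits.
observed_mhz = 1400
gain_pct = 100 * (observed_mhz - stock_boost_mhz) / stock_boost_mhz
print(f"realized gain over stock boost: ~{gain_pct:.0f}%")  # ~6%
```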
 
That's the point I've been trying to make - I haven't seen a GPU - or any one-part upgrade - with this wide-ranging an impact ever. Not at any price. (And yes, that includes ALL of 3dfx's cards.)

You just had to throw that 3dfx thing in there, didn't you? Quadrupling the resolution, going from 8-bit to 16-bit color, 3D this and 3D that - this was IMO the biggest change you could add to your computer.

[Image: GLQuake at 640x480]
 
Can't wait to see what Maxwell can really achieve once we get down to 20nm :eek:
 
You just had to throw that 3dfx thing in there, didn't you? Quadrupling the resolution, going from 8-bit to 16-bit color, 3D this and 3D that - this was IMO the biggest change you could add to your computer.

[Image: GLQuake at 640x480]

Didn't 3dfx cards use Glide in Quake? Either way, I remember getting OpenGL working on my Riva 128 for the first time in Quake 2, and my jaw hit the floor. Good times back then.
 
I love how even the garbage 250 W stock units can power the 750 Ti; I saw a few people claiming they'd have to go out and buy a "decent" 400 W+ unit to power them.

As it stands now, anybody with any kind of PC built within the last 4 years or so can destroy next-gen consoles for an entry fee of $150. Hilarious, and sad. I'm excited to see what the future brings.

There may be exceptions to everything, but my experience with a 420 W Enermax (maybe in 2005) was that when I was playing Unreal Tournament 2004, the power supply would get really hot. In those days, maybe CPUs and GPUs were less efficient. I'd like to be cheap and buy a 350 W PSU, but the last time I bought one it was a Seasonic 520 W model. As a matter of fact, I've read that heat is the #1 cause of electronics failure. (The temps on my last machine are really low - like under 30°C.)
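To put rough numbers behind the "a quality 350 W will do" claim, here's a minimal power-budget sketch; every wattage in it is a ballpark assumption for a typical budget OEM box, not a measurement:

```python
# Rough system power budget for a budget OEM box plus a GTX 750 Ti.
# All wattages are ballpark assumptions for illustration, not measurements.
components_w = {
    "CPU (mainstream dual/quad core)": 65,
    "GTX 750 Ti (60 W TDP, no 6-pin)": 60,
    "Motherboard + RAM":               30,
    "HDD/SSD + optical":               15,
    "Fans, USB, misc":                 10,
}

load_w = sum(components_w.values())
print(f"estimated full load: ~{load_w} W")  # ~180 W

# Even a mediocre 250 W OEM unit covers this with headroom; a quality
# 350 W unit keeps the PSU near its efficiency sweet spot (~50% load).
for rating in (250, 350):
    print(f"{rating} W PSU load factor: {load_w / rating:.0%}")
```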
 
Didn't 3dfx cards use Glide in Quake? Either way, I remember getting OpenGL working on my Riva 128 for the first time in Quake 2, and my jaw hit the floor. Good times back then.

No, it's OpenGL. I think there might have been a Glide version?

But when it comes to graphics cards making huge leaps in gaming, the list would go like this:

3dfx Voodoos
GeForce 1 for hardware T&L
GeForce 3 for pixel and vertex shaders
Radeon 9700 for having real DX9

Past that, nothing much. But if the GeForce GTX 750 Ti is that good, then we could see a resurgence of graphics evolution. My only regret is that the card isn't 256-bit memory, which you'd think would be standard today.
 
Past that, nothing much. But if the GeForce GTX 750 Ti is that good, then we could see a resurgence of graphics evolution. My only regret is that the card isn't 256-bit memory, which you'd think would be standard today.

Yeah, but think about that.

If we are getting THIS performance-per-dollar with a 128-bit memory bus and 28nm process...

Imagine Maxwell with a 384-bit bus on 20nm!

:eek:
 
I was thinking of buying one...but the benchmarks show it as a bit slower than a 660.
 
Yeah, but think about that.

If we are getting THIS performance-per-dollar with a 128-bit memory bus and 28nm process...

Imagine Maxwell with a 384-bit bus on 20nm!

:eek:

The big thing about the GeForce GTX 750 is its power requirement. Performance isn't that amazing, because a Radeon 7850 or R7 265 will consistently beat it. For people who bought crap computers that don't have a power connector for more power-hungry graphics cards, the 750 makes sense.

With more powerful PCs, a 7850 or 265 makes sense, if power consumption isn't an issue.
 
My HTPC needed a graphics card - it's using integrated currently and stutters pretty badly in places.

Think they'll be able to make a passively cooled version of this card?

Easily - the thing only uses 60 W.
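A quick sanity check on why 60 W is passively coolable (the temperature limits here are assumptions; the 60 W figure is the card's rated board power):

```python
# Why 60 W is passively coolable: the heatsink just needs a low enough
# thermal resistance. Temperature limits below are assumptions.
tdp_w = 60          # GTX 750 Ti rated board power
t_ambient_c = 35    # warm case interior (assumed)
t_target_c = 80     # conservative GPU temperature target (assumed)

# Required heatsink-to-ambient thermal resistance: deltaT / P
theta_required = (t_target_c - t_ambient_c) / tdp_w
print(f"required thermal resistance: {theta_required:.2f} °C/W")  # 0.75 °C/W

# Large passive heatsinks reach roughly 0.5-1.0 °C/W with some case
# airflow, so a passive 750 Ti is feasible -- just bulky.
```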
 
The 750 Ti looks even more impressive once you realize that it performs like a GTX 480.

Man, that thing sucked power.
 
That's the point I've been trying to make - I haven't seen a GPU - or any one-part upgrade - with this wide-ranging an impact ever. Not at any price. (And yes, that includes ALL of 3dfx's cards.)

At even the high end (non-reference), the GTX 750 Ti is not even $175 USD. (My own leading choice among the non-reference cards - EVGA's GTX 750 Ti - is $150 USD, either direct from EVGA or Amazon; I chose it because of the full-size HDMI-out and because it *still* doesn't require additional power, yet it can go in any motherboard that supports PCI Express x16 discrete graphics. Unless you're a real cheapskate, the question - if you are looking to upgrade from low-end discrete or onboard graphics - is more why NOT the GTX 750 Ti, as opposed to WHY the GTX 750 Ti.)

I feel like this is a Voodoo3/Counter-Strike type of moment. We have our gaming PCs, now we can make one for our brother cheaply, etc.
 
If we are getting THIS performance-per-dollar with a 128-bit memory bus and 28nm process...
Imagine Maxwell with a 384-bit bus on 20nm!
:eek:
Performance per dollar is not a strong suit of GM107 right now, though I do think Nvidia priced it perfectly in the overall scheme of things.

Die size requirements and wafer costs will push your 384-bit 20nm ASIC with a huge performance-per-dollar increase back into 2015, unless things get extremely heated up between AMD and Nvidia.
 
Actually no.

It's a MiniGL port. It's pretty much ONLY running the subset of OpenGL that Quake used - not full OpenGL.

I didn't say it was full OpenGL. Either way, it's still OpenGL and not Glide.
 
The story I find even more compelling is just how close in performance the 750 is to the 750 Ti.
http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/11

They test several games, at several different quality levels.

The first thing that became apparent to me is that the 750 Ti is really held back by its 128-bit memory bus (the 750 is also 128-bit). When things are really cranking, the 750 and 750 Ti tend to be even MORE similar in performance, suggesting that even though the 750 Ti has more shader cores and whatnot, the 128-bit bus is not feeding it enough for that to matter when stuff is really piled onto it. It also makes its extra GB of VRAM almost worthless in conventional games.
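For reference, the bandwidth arithmetic behind that bottleneck looks like this; 5.4 Gbps is the 750 Ti's reference GDDR5 data rate, and the 256-bit row is a hypothetical same-clock comparison:

```python
# Peak memory bandwidth = bus width (bits) x effective data rate / 8.
def peak_bw_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits * data_rate_gbps / 8

print(peak_bw_gbs(128, 5.4))  # 86.4 GB/s -- GTX 750 Ti as shipped
print(peak_bw_gbs(256, 5.4))  # 172.8 GB/s -- hypothetical 256-bit version
```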

But even when you relax settings a bit so that the bus isn't saturated, the 750 Ti isn't that much better. There are only about four instances in the whole test where it is a full 10 fps better than the 750. The overall gameplay experience is basically the same. I have a feeling a [H] review would come to the same conclusion.

And then you look at the overclocking section: a very attainable overclock on the 750 makes it perform neck and neck with a 750 Ti, for barely any extra power usage.

And then you see that you can buy a 750 with a similar overclock as its default, for $4 more than the cheapest 750. Pretty interesting.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814487026
 
I had some fun with the Kyro II, one of the most random cards I've owned.

I went with the 9700 Pro in AIW trim (in fact, it's still on my shelf, to be eventually passed down for a system with AGP) for several reasons:

1. Solid DX9 support (which was the case with all of R300).
2. No underclocking (this AIW was the first to share core clocks with its non-AIW counterpart).
3. No real compromises. (Between the DX9 support and the solid TV tuner, it was ideal for HTPC use while doubling as a solid gamer's card - something nVidia never developed.)

Like the parent 9700 Pro, it was nastily and messily disruptive - despite the seemingly high $400+ MSRP.
 
The story I find even more compelling is just how close in performance the 750 is to the 750 Ti.
http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/11

They test several games, at several different quality levels.

The first thing that became apparent to me is that the 750 Ti is really held back by its 128-bit memory bus (the 750 is also 128-bit). When things are really cranking, the 750 and 750 Ti tend to be even MORE similar in performance, suggesting that even though the 750 Ti has more shader cores and whatnot, the 128-bit bus is not feeding it enough for that to matter when stuff is really piled onto it. It also makes its extra GB of VRAM almost worthless in conventional games.

But even when you relax settings a bit so that the bus isn't saturated, the 750 Ti isn't that much better. There are only about four instances in the whole test where it is a full 10 fps better than the 750. The overall gameplay experience is basically the same. I have a feeling a [H] review would come to the same conclusion.

And then you look at the overclocking section: a very attainable overclock on the 750 makes it perform neck and neck with a 750 Ti, for barely any extra power usage.

And then you see that you can buy a 750 with a similar overclock as its default, for $4 more than the cheapest 750. Pretty interesting.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814487026

I get that the GTX 750 series (including the Ti) is indeed held back by the memory bus (the same is true for AMD's HD 77xx) - the shocker is that the GTX 750/750 Ti holds its own against the GTX 650, GTX 650 Ti, and GTX 660, which draw extra power that the reference GTX 750/750 Ti does not (and the GTX 660 has a wider memory bus besides). That is why both are compelling merely as-is, for both performance-per-watt and price-for-performance reasons. They didn't have to beat any of them - and under normal circumstances, you wouldn't expect a card hobbled like this to even match up against any of them. Yet match up it does - hence the massive disruption it represents.
 
You just had to throw that 3dfx thing in there, didn't you? Quadrupling the resolution, going from 8-bit to 16-bit color, 3D this and 3D that - this was IMO the biggest change you could add to your computer.

[Image: GLQuake at 640x480]

I threw it in for a reason - I've owned two of them (and still have the Monster 3D II in 12 MB Revision E trim). Like the first one I bought (a "brand X" basic Voodoo 1), it would be paired with ATI graphics cards (AIWs) over its usage life - it spent the longest period tag-teamed with an ATI AIW 8500DV, a massive disruptor in its own right. With this dynamic duo, I could run D3D, Glide, MiniGL, etc. - letting each card play to its strength. The Voodoo2 had one weakness compared to the AIW - no 32-bit color support. However, with 16-bit color, even the Voodoo2 was a solid D3D card, and I would use it as such where 32-bit either had issues or was simply missing - Forsaken was one of two games (Hellbender, the space shoot-em-up sequel to both Fury3 and Terminal Velocity, was the other) where I could show off the D3D performance the Voodoo2 was capable of.
 
The first thing that became apparent to me is that the 750 Ti is really held back by its 128-bit memory bus (the 750 is also 128-bit). When things are really cranking, the 750 and 750 Ti tend to be even MORE similar in performance, suggesting that even though the 750 Ti has more shader cores and whatnot, the 128-bit bus is not feeding it enough for that to matter when stuff is really piled onto it. It also makes its extra GB of VRAM almost worthless in conventional games.

But even when you relax settings a bit so that the bus isn't saturated, the 750 Ti isn't that much better. There are only about four instances in the whole test where it is a full 10 fps better than the 750. The overall gameplay experience is basically the same. I have a feeling a [H] review would come to the same conclusion.
That's my main gripe with this card: the 128-bit memory bus. There shouldn't be graphics cards with 128-bit buses anymore - at least not at $100+.
 
That's my main gripe with this card: the 128-bit memory bus. There shouldn't be graphics cards with 128-bit buses anymore - at least not at $100+.

At this point, it's a quibble, as it keeps up with (and in quite a few cases beats) GPUs with wider memory buses AND higher power draw - one victim is no less than the GTX 550 Ti, a darling among refurbished GPUs for valid reasons. (The GTX 550 Ti has a 192-bit memory bus and requires a 6-pin power feed, while the GTX 750 has a 128-bit memory bus and uses only what it gets from the motherboard. So tell me - exactly why is the GTX 550 Ti losing?)
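The perf-per-watt gap in one snippet; the TDPs are the official board powers, and the "roughly equal performance" premise is this post's claim, so the relative-performance figures are assumptions:

```python
# The efficiency gap in one line of arithmetic. TDPs are NVIDIA's rated
# board powers; "roughly equal performance" is this post's premise, so
# the perf = 1.0 entries are assumptions, not benchmark data.
cards = {
    "GTX 550 Ti (192-bit bus, 6-pin)":   {"tdp_w": 116, "perf": 1.0},
    "GTX 750 (128-bit bus, slot power)": {"tdp_w": 55,  "perf": 1.0},
}

ppw = {name: c["perf"] / c["tdp_w"] for name, c in cards.items()}
for name, value in ppw.items():
    print(f"{name}: {value:.4f} perf/W")

ratio = ppw["GTX 750 (128-bit bus, slot power)"] / ppw["GTX 550 Ti (192-bit bus, 6-pin)"]
print(f"GTX 750 delivers ~{ratio:.1f}x the performance per watt")  # ~2.1x
```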

nVidia itself answered that question in their Maxwell presentation - efficiency, efficiency, efficiency (derived from Maxwell's mobile sibling, Tegra). Tegra is far more efficient than base Kepler because it's designed for areas where available power (and available screen) are at a premium - however, Tegra doesn't have the raw horsepower that base Kepler does, for those same reasons. We have been rather painfully aware that base Kepler - even the GTX Titan or Titan Black - is not as efficient as we would like. Maxwell is more like Kepler 1.5 at this point - while clearly based on Kepler, it is fabricated on the same 28nm node as Kepler, and therefore isn't truly second-generation. GM200 and later will likely be 20nm and the true second-generation parts, and WILL cause further disruption.
 
This seems like a good option for an HP Z200 workstation (i3-540 system) I have, where the power supply connector to the motherboard is proprietary, so the power supply cannot really be upgraded (unless I went the auxiliary-supply or dual-supply route). I'd like to turn this into an HTPC/light-gaming computer for the living room television.
 
That's my main gripe with this card: the 128-bit memory bus. There shouldn't be graphics cards with 128-bit buses anymore - at least not at $100+.

Rising wafer costs increase the cost per mm².
You need a minimum of ~120 mm² to have the perimeter needed for a 128-bit bus, and ~200 mm² for a 256-bit one. Obviously other factors come into play, but it's a pretty good generalization.
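Here's that perimeter argument in numbers, assuming a square die for simplicity (real floorplans also spend edge length on PCIe and display I/O, so the real minimums land higher than bare geometry suggests):

```python
# Memory PHYs sit on the die edge, so pad count scales with perimeter,
# not area. Assume a square die; real floorplans are messier.
import math

def square_die_perimeter_mm(area_mm2: float) -> float:
    """Perimeter of a square die with the given area."""
    return 4 * math.sqrt(area_mm2)

for area in (120, 200):
    print(f"~{area} mm2 die: ~{square_die_perimeter_mm(area):.1f} mm of edge")
# ~120 mm2 -> ~43.8 mm; ~200 mm2 -> ~56.6 mm. Perimeter grows only with
# sqrt(area), and PCIe/display I/O claim a fixed share of the edge, which
# is why doubling the bus width pushes the minimum die size up so sharply.
```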
 
I love how even the garbage 250 W stock units can power the 750 Ti; I saw a few people claiming they'd have to go out and buy a "decent" 400 W+ unit to power them.

As it stands now, anybody with any kind of PC built within the last 4 years or so can destroy next-gen consoles for an entry fee of $150. Hilarious, and sad. I'm excited to see what the future brings.

That sounds like exaggeration. If the PS4 has a 7850-class GPU, which outperforms the 750 Ti, the PS4 wins, albeit by a small margin. Anybody know how this compares to the Xbone? I believe that runs something like a 7790. And of course the 7850 beats the 7790.
 
What's all the hoopla here? Many games are still CPU-bottlenecked. Are you trying to tell me that BF3 is going to go from 1600x900 @ 25 fps to 1920x1200 @ 60 fps on the same system with nothing more than a $150 GPU upgrade? I call shenanigans.
 
Rising wafer costs increase the cost per mm².
You need a minimum of ~120 mm² to have the perimeter needed for a 128-bit bus, and ~200 mm² for a 256-bit one. Obviously other factors come into play, but it's a pretty good generalization.

The Radeon HD 7850 is about the same price and is 256-bit. The Radeon R7 265 is the same card, but I can't find any for sale anywhere - most likely due to bitcoin. In every benchmark, the 7850 outperforms the 750. It's pretty clear that the 750 is limited by its 128-bit memory.

Nvidia has a killer product because the 750 needs far less power, but for everyone else with a decent power supply, the 7850 is the better choice.
 
The Radeon HD 7850 is about the same price and is 256-bit. The Radeon R7 265 is the same card, but I can't find any for sale anywhere - most likely due to bitcoin. In every benchmark, the 7850 outperforms the 750. It's pretty clear that the 750 is limited by its 128-bit memory.

Nvidia has a killer product because the 750 needs far less power, but for everyone else with a decent power supply, the 7850 is the better choice.

I thought the 265 cards had not been released yet. I had seen a post that rumored early March for the 265. It seems a lot of people on here are comparing it to the 750Ti cards, but no one has one yet that I've seen. :D
 
Think they'll be able to make a passively cooled version of this card?
It's not impossible. Gainward used to produce passive versions of the GTX 680, which was a 195W TDP part. Just don't expect a passively-cooled version at the $149 price point.

That's my main gripe with this card: the 128-bit memory bus. There shouldn't be graphics cards with 128-bit buses anymore - at least not at $100+.
NVIDIA's plan is to beat narrow buses with fast local cache. Maxwell has 2 MB of shared L2 cache, up from Kepler's 256 KB, and Maxwell's successor, Volta, will have stacked DRAM with about a terabyte per second of bandwidth to the GPU. Higher-end Volta parts will still have external memory, but there'll be enough stacked DRAM to make a narrow bus a complete non-issue.
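A rough sketch of why a big L2 offsets a narrow bus - every cache hit is DRAM traffic that never touches the 128-bit interface. The hit rates and request rate below are illustrative assumptions, not measured values:

```python
# How a larger on-chip cache compensates for a narrow memory bus:
# every L2 hit is a DRAM access that never touches the 128-bit bus.
# Hit rates below are illustrative assumptions, not measured values.
dram_bw_gbs = 86.4   # GTX 750 Ti peak DRAM bandwidth (128-bit @ 5.4 Gbps)

def effective_dram_traffic(requested_gbs: float, l2_hit_rate: float) -> float:
    """DRAM traffic remaining after the L2 filters out hits."""
    return requested_gbs * (1 - l2_hit_rate)

for hit_rate in (0.3, 0.5, 0.7):  # small-cache vs big-cache scenarios (assumed)
    need = effective_dram_traffic(150.0, hit_rate)  # 150 GB/s of requests (assumed)
    ok = "fits" if need <= dram_bw_gbs else "exceeds"
    print(f"L2 hit rate {hit_rate:.0%}: {need:.0f} GB/s to DRAM ({ok} {dram_bw_gbs} GB/s)")
```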

High-end Maxwell parts will still have a fat bus.
 
I'm running an Acer Predator 7710.
As far as I know/can tell, the PCI Express slot in it is PCIe 1.0.

Is there a point to getting this card as a video upgrade, or would the card having to drop down to PCIe 1.0 be too negative an impact?
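For what it's worth, the per-generation PCIe x16 bandwidth math looks like this; whether a 750 Ti actually saturates a 1.0 x16 link is workload-dependent, so take the closing comment as a rule of thumb, not a guarantee:

```python
# Per-direction PCIe x16 bandwidth by generation. PCIe 1.x/2.0 use 8b/10b
# encoding (80% efficiency); 3.0 uses 128b/130b.
gens = {
    "PCIe 1.0": (2.5, 8 / 10),     # GT/s per lane, encoding efficiency
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (8.0, 128 / 130),
}

for name, (gtps, eff) in gens.items():
    gbs_x16 = gtps * eff * 16 / 8  # GB/s across 16 lanes, one direction
    print(f"{name} x16: {gbs_x16:.1f} GB/s per direction")
# 1.0: 4.0, 2.0: 8.0, 3.0: ~15.8 GB/s. A midrange card like the 750 Ti
# rarely saturates even a 1.0 x16 link, so the hit is usually small.
```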
 