Does the 750 Ti spell doom for AMD?

geok1ng

What are your thoughts on the 750 Ti reviews so far? From what I have gathered, NVIDIA has achieved major improvements in performance/power with Maxwell. Considering that heat is the major limiting factor in video card performance these days, generating half the heat allows future NVIDIA cards to reach up to twice the performance of same-size chips today. :)
 
I personally think they're going to nerf it and keep this chip around for a couple of years. There's no way a company will release a card which performs 100% better just one generation later. I personally think we'll see 50% better performance at the same tier on the 880 Ti and Titan 2 or whatever, then next year they'll release the 980 with another 50% more or less.

I don't think it has ever been done in a single generational leap. What was the biggest leap?
 
The performance gains aren't spectacular, as each GPC is weaker compared to Kepler's, though there are more of them to compensate. I think the biggest performance leap was from the Radeon HD 3870 to the HD 4870: both were built on the same manufacturing process, yet the latter had 2.5 times more shaders and offered up to 3 times the performance in games with anti-aliasing enabled. The HD 5870 could take second place, as it offered almost twice the performance of the HD 4870, and the X800 XT offered 2.5 times the performance of the 9800 XT.

I think that after 2009 the performance gains at the high end have been more limited: the HD 6970 wasn't much faster than the HD 5870, and when the HD 7970 launched it was around 30-70% faster than the HD 6970. Only now, in more advanced games and with better drivers, is the HD 7970 around twice as fast as the HD 6970, which doesn't seem to have aged well with current games.
 
What are your thoughts on the 750 Ti reviews so far? From what I have gathered, NVIDIA has achieved major improvements in performance/power with Maxwell. Considering that heat is the major limiting factor in video card performance these days, generating half the heat allows future NVIDIA cards to reach up to twice the performance of same-size chips today. :)

Right now? Doom? Nah. The 750 Ti is amazing from a technical standpoint, but NVIDIA isn't even attempting to compete on a price/performance basis. They have, however, won the performance/watt war by a MILE. Some SFF users care more about efficiency, while other desktop users prefer maximum performance: the former will prefer the 750 Ti for performance/watt, while anyone wanting max performance will opt for the 150W TDP R9 265. NV doesn't really have to compete on price in NA/Canada anyway, since all of AMD's lower-tier cards are price-inflated. But back to the point. The answer is no. AMD generally has better or comparable price/performance in the sub-$200 market even taking their price inflation into account. Of course, AMD's comparable offerings are far worse than the 750 Ti in terms of power consumption; AMD isn't even close to matching Maxwell's efficiency.

The real worry for AMD is when NVIDIA pushes the Maxwell uarch onto 20nm and scales it up with thousands of CUDA cores. Given the efficiency of GM107, that has amazing implications for GM200 and GM204: even on the worse 28nm node, NVIDIA basically *halved* the power required for a given level of performance. Put in context, this means a scaled-up Maxwell could essentially achieve 780 Ti performance at half the power level, 125-150W TDP, on 28nm. Now imagine Maxwell scaled up to 3000+ CUDA cores on the better 20nm node in a 250W TDP card. I'm pretty optimistic about GM204/GM200 given how well GM107 is doing; NV stated in their most recent financial conference call that GM200 and GM204 are slated for Q3 this year. Should be an exciting GPU release IMO.
 
There's a lot more going on inside a GPU that impacts performance beyond just heat.

So, no... your theory that half the heat = twice the performance is, unfortunately, wrong.
 
It'll depend on how well the design scales on 20nm. It looks pretty sound at the moment. However, until nvidia's cards can mine as well as AMD's, they don't have much to worry about (although that day is approaching).
 
There's a lot more going on inside a GPU that impacts performance beyond just heat.

So, no... your theory that half the heat = twice the performance is, unfortunately, wrong.

It could allow for much higher overclocks which could result in a 50% increase in performance but 2 times is just silly
 
There's a lot more going on inside a GPU that impacts performance beyond just heat.

So, no... your theory that half the heat = twice the performance is, unfortunately, wrong.

Well, this.

Plus it depends on the target business market of a company. A couple of generations ago NVIDIA was running hot, real hot, while AMD used less power. Now it's the reverse. In a few gens it might be the reverse again.

As long as there is real competition there will be innovation; otherwise there will be even more price gouging than we see now, with $150-over-MSRP and $999 cards.
 
It could allow for much higher overclocks which could result in a 50% increase in performance but 2 times is just silly

Is it? The GT 640 has a 65W TDP and the 750 Ti a 60W TDP, and the 750 Ti is over 2 times faster, which was my original point. Another way to look at it: the 650 Ti Boost has a 135W TDP while the 750 Ti has a 60W TDP, and the 750 Ti nearly matches 650 Ti Boost performance, while exceeding plain 650 Ti performance at roughly half the TDP requirement. So you can say it's silly; however, *roughly* 2 times the performance in the same TDP window was basically realized with GM107. And this is on 28nm, mind you, not even 20nm yet.

Now, it's entirely true that big Maxwell may not scale linearly; however, mobile Kepler and desktop Kepler actually did scale on a fairly linear basis. Two times the performance in the same TDP window was basically achieved with the 750 Ti, but I suppose AMD fans would argue otherwise for GM200 or GM204 just because they don't want to see NVIDIA succeed. Which is fine; those products won't be released until Q3. It's an entirely valid argument that we're discussing theoretical rather than actual performance, so I'm not disagreeing or agreeing. But to say it's "silly"? I dunno. GM107 basically achieved it, so we'll see. In any case, I do think the efficiency of GM107 has great implications for Big Maxwell. Even if it isn't quite 2 times the performance in the same TDP window, GM107's efficiency means great things for the big Maxwell parts. It also has great implications for the Maxwell mobile parts, which will be in a ton of ultrabook designs with mobile dGPUs this year.
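The perf/watt argument above can be sanity-checked with some rough arithmetic. A minimal sketch; the relative-performance numbers here are illustrative assumptions based on the figures quoted in this thread, not benchmark data:

```python
# Rough perf-per-watt comparison. The "relative perf" values are assumptions
# for the sake of the arithmetic (GT 640 = 1.0), not measured benchmarks.
cards = {
    # name: (TDP in watts, assumed relative performance)
    "GT 640":           (65, 1.0),
    "GTX 750 Ti":       (60, 2.1),   # "over 2 times faster" per the post above
    "GTX 650 Ti Boost": (135, 2.2),  # assumed slightly ahead of the 750 Ti
}

for name, (tdp, perf) in cards.items():
    print(f"{name:18s} TDP={tdp:3d}W  perf/watt={perf / tdp:.3f}")
```

Under these assumed numbers, the 750 Ti more than doubles the GT 640's perf/watt and roughly doubles the 650 Ti Boost's, which is the "2x in the same TDP window" claim in a nutshell.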
 
No, I don't think it spells doom for AMD. They're going to do just fine. Both NVIDIA and AMD are waiting on the die shrinks to 20nm anyways so who knows what AMD has in store once better processes are available. I will say that the 750 Ti seems to be an incredible card for the money.

I think the real hilarious part here is that the $150 750 Ti runs games faster than the $500 Xbox One. Consoles are typically inferior to current-gen PCs, but man, I can't recall a console being beaten by a low-end budget card 3 months after release.
 
GM107 consumes much less power than its predecessor, but it is also larger. It's not as though NVIDIA has completely solved every problem from every angle, but I do think they're in a fairly good position. I foresee numerous notebook wins for Maxwell (particularly on the Apple side), and, hopefully, a pretty reasonable high-end lineup, but nothing Earth-shattering. Certainly nothing that would spell doom for AMD.
 
No, I don't think it spells doom for AMD. They're going to do just fine. Both NVIDIA and AMD are waiting on the die shrinks to 20nm anyways so who knows what AMD has in store once better processes are available. I will say that the 750 Ti seems to be an incredible card for the money.

I think the real hilarious part here is that the $150 750 Ti runs games faster than the $500 Xbox One. Consoles are typically inferior to current-gen PCs, but man, I can't recall a console being beaten by a low-end budget card 3 months after release.

Absolutely spot on. AMD and NV are going for different buyers with their sub-$200 offerings. Obviously, for HTPC/SFF use where efficiency matters, the 750 Ti is a better buy. However, AMD still has better price/performance, NA/Canada price hikes notwithstanding. AMD will do just fine; they certainly aren't "doomed".

I'm very optimistic about Big Maxwell. Like I said, GM107 spells absolutely great things for GM200/204, and I do think GM107 is very indicative of how well they will do. GM107 delivers twice the performance at the same wattage, which is just insane from a technical standpoint. And just to be clear: I'm also interested to see how AMD reacts; just because I prefer NV these days doesn't mean I don't want AMD to thrive. I want AMD in a strong position. Competition drives the industry away from stagnation, so I do want a strong AMD despite my preferences. It will be interesting to see if/when AMD has an answer on 20nm. I do think Maxwell will hit well before whatever AMD has, but I'm 100% certain that AMD is quietly working on something.
 
Looking over the reviews, the R9 265 is ~25% faster but the 750 Ti draws ~25% lower system power. I'd like to see a review downclock the 265 by 25% to see if the power consumption is a wash between the two. Temperature-wise, the 265 runs cooler at stock (see the Anandtech review). The deciding factor is mining performance, and being a 270X/270 derivative, the 265 will fly off the shelves at $150. The other thing is that the 750 Ti cut costs by removing the 6-pin power connector and circuitry; time will tell how stable it is drawing power from just the PCI-E slot.

http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/22
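The downclock question above can be roughed out with a simple power-scaling model. Dynamic power scales roughly as P ∝ f·V²; at fixed voltage that is P ∝ f, and if voltage can also be dropped roughly in proportion to frequency it approaches P ∝ f³. This is a crude sketch, not measured data, and the 150W board power for the R9 265 is an assumption:

```python
# Back-of-envelope: what would a 25% downclock do to R9 265 board power?
# Crude model, not measurement; 150W stock board power is assumed.
p_board = 150.0
scale = 0.75                             # 25% downclock

fixed_voltage = p_board * scale          # P ∝ f at fixed voltage, ~112W
with_undervolt = p_board * scale ** 3    # P ∝ f^3 if voltage drops with f, ~63W
print(f"fixed voltage: {fixed_voltage:.0f}W, with undervolt: {with_undervolt:.0f}W")
```

So a plain downclock alone probably wouldn't make power a wash with the 60W 750 Ti, but an aggressive downclock-plus-undervolt could get into the same ballpark under this model.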
 
I appreciate the power to performance ratio and the new architecture, but an AMD killer? Really?
 
As a miner I really don't see the place for this card unless they dropped the price at least another $50 or more. We're still limited to 5-6 PCI-E slots on motherboards, so for only $50 less than a 270 card you're losing 150 KH/s or more per card. If these cards could push out 300-350 KH/s for $100, then yes, it would be a very good card to have. The fact remains, though, that you can pick up 270 cards for $200 or less and they will all do 450 KH/s or more.

No one uses the R9 265 cards for mining. They are pretty terrible, and for just a bit more money you can have a 270 card.
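The mining math above comes down to hashrate per dollar of card. A quick sketch using the rough figures in this thread; the 750 Ti scrypt hashrate here is an assumption, not a benchmark:

```python
# KH/s-per-dollar comparison. Prices and hashrates are the rough figures
# quoted in this thread; the 750 Ti scrypt rate is an assumption.
cards = {
    "GTX 750 Ti": (150, 265),   # ($ price, assumed ~265 KH/s scrypt)
    "R9 270":     (200, 450),   # "450KH/s or more" per the post above
}

for name, (price, khs) in cards.items():
    print(f"{name}: {khs / price:.2f} KH/s per dollar")
```

Under these numbers the 270 still wins on KH/s per dollar, and since each rig has a limited number of PCI-E slots, per-card hashrate matters even more than the ratio.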
 
Looking over the reviews the R9 265 is ~25% faster but 750 Ti is ~25% lower system power. I'd like to see a review down clock the 265 performance by 25% to see if the power consumption is a wash between the two. Temperature wise the 265 runs cooler at stock (see Anandtech review). Deciding factor is mining performance but being that it's a 270X/270 derivative it'll fly off the shelves at $150. The other thing is the 750 Ti cut costs by removing the 6-pin power connector and circuitry but time will tell how stable it is drawing power from just the PCI-E connector.

http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/22
Temperature is largely dependent on the cooler used, and from the reviews I've seen the 265 uses a pretty robust cooler, so I don't think the Maxwell design is hot by any means.

I also think the lack of a 6-pin PCI-E connector is a non-issue. PCI-E slots can deliver 75W on their own (a 6-pin PCI-E connector adds 75W, an 8-pin 150W). This board shouldn't have any problems since it's well under 75W. NVIDIA wouldn't release a card that had stability issues over something as trivial as the power connector.
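Those per-source limits make the power budget easy to check. A small sketch of the arithmetic, using the PCI-E CEM spec limits mentioned above (the `board_budget` helper is illustrative, not a real API):

```python
# PCI-E power budget check: a x16 slot supplies up to 75W on its own;
# each 6-pin connector adds 75W and each 8-pin adds 150W (CEM spec limits).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget(six_pins=0, eight_pins=0):
    """Maximum board power available from the slot plus external connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget())              # 75  -> a 60W 750 Ti fits on slot power alone
print(board_budget(six_pins=1))    # 150
```

A 60W TDP card sits comfortably inside the 75W slot budget, which is exactly why the connector can be dropped.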
 
No shit NVIDIA's next-gen arch beats AMD's current-gen arch; guess what, Maxwell also beats NVIDIA's current-gen arch.

We have yet to see AMD's next-gen arch. Compare GPUs from within the same generation, FFS.
 
Doom is a little extreme.

Did any generation of intel processors DOOM AMD cpus?
 
No. Only something like 20% of the budget-oriented gamer segment cares about a ~30W difference when equal or similar cards from AMD go for ~$50 less. To make that power saving worth it, you would need to run the card for years...
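The "years to make it worth it" claim can be made concrete with a break-even estimate. A sketch; the electricity price and daily usage are assumptions, not figures from this thread:

```python
# Break-even sketch: how long must a 30W saving run to recoup a $50 premium?
watts_saved = 30
price_premium = 50.0   # dollars, per the post above
kwh_price = 0.12       # $/kWh, an assumed (roughly US-average) rate

hours = price_premium / (watts_saved / 1000 * kwh_price)
years_at_4h_per_day = hours / (4 * 365)
print(f"{hours:.0f} hours, ~{years_at_4h_per_day:.1f} years at 4h/day")
# ~13889 hours, i.e. roughly 9.5 years of 4h/day gaming
```

At typical residential rates, the power saving alone takes far longer than a card's useful life to offset a $50 price gap, which supports the point above.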
 
I also think the lack of a 6-pin PCI-E connector is a non-issue. PCI-E slots can deliver 75W on their own (a 6-pin PCI-E connector adds 75W, an 8-pin 150W). This board shouldn't have any problems since it's well under 75W. NVIDIA wouldn't release a card that had stability issues over something as trivial as the power connector.

If I had a choice I'd rather my GPU draw power directly from the PSU than through the mobo. Plus, for overclockers, more consistent power direct from the PSU is an advantage. My guess is that a large chunk of the 750 Ti's power efficiency comes from eliminating the 6-pin power connector and circuitry, and less so from the GPU redesign, since it's still limited by the same 28nm budget.
 
The X800 XT offered 2.5 times the performance of the 9800 XT.
One of the few "wins" in my purchasing history. I usually get what I want but end up spending too much for it.

The x800XT felt like I got my money's worth and then some.
 
No. Only something like 20% of the budget-oriented gamer segment cares about a ~30W difference when equal or similar cards from AMD go for ~$50 less. To make that power saving worth it, you would need to run the card for years...

It may not spell doom, but it certainly spells trouble.

Look at what AMD has in the GTX750Ti price range, and the power and performance of what's in that range. Most of the GPUs in that price range (until now) required additional power simply to yield decent (as in GTX550Ti or even HD7770-level) performance - it was also why I went with refurbished GTX550Ti instead.

Now, along comes the GTX750Ti, which requires less power than even GTX550Ti (which is no slouch, even today) while outperforming it every which way. With refurbished GTX660 dwindling in supply, and GTX650Ti flat-out going away altogether, mainstream gamers (as opposed to mining or multi-GPU usage) are getting seriously squeezed. The base GTX750 is going to put pressure on AMD's passively-cooled HD66xx simply due to the performance delta - after all, it costs either little more or no more, and can fit into the same usage scenarios, while delivering greater performance, both on outright and per-watt bases. Unless you are totally in denial, that is not ignorable.
 
If I had a choice I'd rather my GPU draw power directly from the PSU than through the mobo. Plus, for overclockers, more consistent power direct from the PSU is an advantage. My guess is that a large chunk of the 750 Ti's power efficiency comes from eliminating the 6-pin power connector and circuitry, and less so from the GPU redesign, since it's still limited by the same 28nm budget.
Every GPU already draws some power from the PCI-E slot, this is nothing new. The extra power connectors are supplementary.

My guess is that a large chunk of the 750 Ti's power efficiency comes from eliminating the 6-pin power connector and circuitry, and less so from the GPU redesign, since it's still limited by the same 28nm budget.
That makes no sense. The amount of power the GPU draws has nothing to do with whether or not the board features an extra 6-pin power connector. If you could plug your refrigerator into multiple outlets would it suddenly use more power? No. The chip either has enough power to run or it doesn't. Video cards not needing external power isn't anything new, this is just the most powerful card with a low power draw that we've seen so far.

NVIDIA and AMD build their cards to adhere to PCI-E standards. If a particular motherboard can't run a 750 Ti through the PCI-E bus the problem is with the motherboard's implementation of the PCI-E spec not NVIDIA.
 
Thing is, most off-the-shelf PCs won't have these in them.

Also, most average consumers won't spend that much on a GPU.

Great achievement, but the real benefit won't hit until the higher-level cards appear. This seems more of a testing/proof-of-concept card. Nothing exciting about the card itself; that will be reserved for the higher end.
 
As long as AMD cards are the crypto mining champs, nobody is going anywhere.
 
Hmm, for HTPC use low power is good. But with a decent APU it's all rather moot, as it doesn't really need an additional card at all.

So the potential market for these cards is cheap OEM desktops that use tiny PSUs?
 
Thing is, most off-the-shelf PCs won't have these in them.

Also, most average consumers won't spend that much on a GPU.

Great achievement, but the real benefit won't hit until the higher-level cards appear. This seems more of a testing/proof-of-concept card. Nothing exciting about the card itself; that will be reserved for the higher end.

Kind of where I sit right now. Though, in my opinion, the GTX 750 itself steals the show. It absolutely bitch-slaps the 650 it's designed to replace. Perfect ITX/LAN-box GPU.
 
Doom for AMD from a card that has no performance?

I'm all for low power usage, but we've been down this road in the past. It used to be the other way around and it didn't hurt anyone then; it won't spell doom now. I guarantee you that most typical consumers couldn't care less. It's there to give the bench junkies and each side's fanboys something to discuss.
 
For a general HTPC this card is massive overkill. Very good card, just not something aimed at that segment.
 
Hmm, for HTPC use low power is good. But with a decent APU it's all rather moot, as it doesn't really need an additional card at all.

True. The 750 Ti still requires a CPU. Hard to compete with a $129 A10-7850K APU.
 
If I had a choice I'd rather my GPU draw power directly from the PSU than through the mobo. Plus, for overclockers, more consistent power direct from the PSU is an advantage. My guess is that a large chunk of the 750 Ti's power efficiency comes from eliminating the 6-pin power connector and circuitry, and less so from the GPU redesign, since it's still limited by the same 28nm budget.

You have absolutely no idea what you are talking about.:rolleyes:
 
It may not spell doom, but it certainly spells trouble.

Look at what AMD has in the GTX750Ti price range, and the power and performance of what's in that range. Most of the GPUs in that price range (until now) required additional power simply to yield decent (as in GTX550Ti or even HD7770-level) performance - it was also why I went with refurbished GTX550Ti instead.

Now, along comes the GTX750Ti, which requires less power than even GTX550Ti (which is no slouch, even today) while outperforming it every which way. With refurbished GTX660 dwindling in supply, and GTX650Ti flat-out going away altogether, mainstream gamers (as opposed to mining or multi-GPU usage) are getting seriously squeezed. The base GTX750 is going to put pressure on AMD's passively-cooled HD66xx simply due to the performance delta - after all, it costs either little more or no more, and can fit into the same usage scenarios, while delivering greater performance, both on outright and per-watt bases. Unless you are totally in denial, that is not ignorable.

This card is good for HTPCs and similar small-form-factor PCs, or users with crap PSUs. Regarding gaming, it's slower than the R7 265; not everyone cares about performance per watt, and 50W+ is nothing for some users...

[attached benchmark charts, including Far Cry 3 framerates]
Also, the TDP of this card is a limiting factor and the OC scaling is bad, so that makes the 265 quite a bit better a pick for gamers with a good enough PSU.
 
I think the 750/750 Ti will be featured in many Steam boxes.

BTW, the card begs for a PCIe power connector, and even more so for a higher TDP limit.

From what I've seen, pretty much all the cards can do 1300+ MHz on boost.

I'm sure someone will volt-mod it and get insane speeds.
 
NV fans grasping at straws.

I understand the draw of the card and why NVIDIA released it. It's the most efficient card ever made and would be an amazing addition to a Steam box or mid-range PC. I like that its power requirements allow it to be used as a drop-in upgrade for aging systems, and that it has enough power to run modern games at 1080p on high/medium settings with a great user experience.

With that said, I don't know why people lean so far to one side or the other. So many people think this card is just garbage because it's not an 880 GTX or 390X. A lot of people spew all over tech forums that if you don't have top of the line, it's not worth having, and miss the point that 99% of the world doesn't need a top-end video card.

I don't think it has anything to do with being a fan of either brand; I think it's just misconceptions about who this card is meant for.
 
Hmm, for HTPC use low power is good. But with a decent APU it's all rather moot, as it doesn't really need an additional card at all.

So the potential market for these cards is cheap OEM desktops that use tiny PSUs?

Or those with single 6-pin GPUs (anything from GTS450 up to GTX5xx from nV, or HD5xxx/6xxx, and even some HD7xxx from AMD) looking for improved performance.

What we have been leaving out is that GTX750Ti has better performance than refurbished Fermi at both the same price and a vastly reduced TDP; nVidia itself makes the point that GTX750Ti has greater performance than GTX550Ti.

GTX750Ti makes a valid option for straight-up performance reasons in this sector - the far-lower TDP stretches the envelope further than anything previously (not just nVidia, but even AMD) - you can upgrade graphics performance without requiring you to throw the entire system away.
 
I understand the draw of the card and why NVIDIA released it. It's the most efficient card ever made and would be an amazing addition to a Steam box or mid-range PC. I like that its power requirements allow it to be used as a drop-in upgrade for aging systems, and that it has enough power to run modern games at 1080p on high/medium settings with a great user experience.

With that said, I don't know why people lean so far to one side or the other. So many people think this card is just garbage because it's not an 880 GTX or 390X. A lot of people spew all over tech forums that if you don't have top of the line, it's not worth having, and miss the point that 99% of the world doesn't need a top-end video card.

I don't think it has anything to do with being a fan of either brand; I think it's just misconceptions about who this card is meant for.

Noticed that, have you?

I went with a refurbished GTX550Ti (after a LONG history of AMD, and prior to that ATI, GPUs) because AMD had diddly-squat in my price/performance range. Segue to today, and they still don't.

However, GTX750Ti solves several issues (for me specifically) as it is:

1. Direct support for streaming on-the-fly to TwitchTV without either punching a hole in my wallet or being a power pig. (Remember, the previous lowest-end nV GPU to support such streaming was GTX650Ti - which is priced higher than GTX750Ti, which will replace it.)
2. Full-sized HDMI support (several non-reference designs). The biggest *sin* that the reference GTX550Ti (and reference GTX750Ti) commit is the use of mini-HDMI-out - fortunately, there are several non-reference cards with full-sized HDMI-out, and none of them are any larger than my GTX550Ti.
3. Improved scalability with CPU upgrades.
 
Noticed that, have you?

I went with a refurbished GTX550Ti (after a LONG history of AMD, and prior to that ATI, GPUs) because AMD had diddly-squat in my price/performance range. Segue to today, and they still don't.

However, GTX750Ti solves several issues (for me specifically) as it is:

1. Direct support for streaming on-the-fly to TwitchTV without either punching a hole in my wallet or being a power pig. (Remember, the previous lowest-end nV GPU to support such streaming was GTX650Ti - which is priced higher than GTX750Ti, which will replace it.)
2. Full-sized HDMI support (several non-reference designs). The biggest *sin* that the reference GTX550Ti (and reference GTX750Ti) commit is the use of mini-HDMI-out - fortunately, there are several non-reference cards with full-sized HDMI-out, and none of them are any larger than my GTX550Ti.
3. Improved scalability with CPU upgrades.

You are a great example of who this card was made for, and I think NVIDIA hit the mark on where they put it in their lineup. There are a HUGE number of people like you with the same kind of system setup and the same needs. I think AMD did a good job with the 250X and the 265, but for the same price I would rather take a card like the 750 Ti, with some of the software features that NVIDIA offers.
 