GIGABYTE GTX 980 Ti G1 GAMING Video Card Review @ [H]

Who keeps GPUs for 2 or even 3 years, though, if you're a semi-hardcore gamer?

He said that with the generation of graphics cards 2-3 years ago, AMD was marketing their cards as having more memory than NVIDIA's, whereas in this generation AMD cards have less memory and they say it doesn't matter.

His comment has nothing to do with keeping a card for 2 or 3 years.
 
But when AMD cards began to appear with 6GB as standard (back in the HD 7970 days, IIRC), AMD fans crowed about the advantage over NV's measly 4GB, better for xfire, etc. [H] failed to find much support for that line of thinking in testing the AAA titles that were current at the time.

Now the shoe is on the other foot, and 2-3 years down the road there are titles where 4GB is just not enough. [H] demonstrates the truth of this with one representative sample, and you are offended, it seems.

That's the thing about computing technology--the bits don't lie. And neither does the [H].

You are a bit off there. The big-boy cards were the HD 7990, which was 2x3GB (really 3GB effective), and the GTX 690, which was 2x2GB (2GB effective). The single-GPU HD 7970 and GTX 680 were 3GB and 2GB, respectively. All logic should have shown the GTX 690 bogging down, but there wasn't much evidence of this. (Too bad Dying Light wasn't around then. :p)

More recent was the R9 290X with 4GB of RAM versus the GTX 780 Ti with only 3GB. I don't think that got anywhere near the attention of the VRAM shortcomings of the Fury X. (Especially since the 780 Ti was more likely to be run in SLI than the awkward Fury X.)

No one really cares what the fanboys are yelling about; we really only care about what the reviewers say, since that has the most influence on purchases.
 
Just set the damn fan speed at 60-70%, which is more than enough to cool most aggressive overclocks!
My car can reach 150 mph, but for some reason I tend to keep it below 75 mph. :rolleyes:

I was making a point about the review, obviously, where Kyle mentions that 100% fan speed isn't loud when it quite obviously is. :rolleyes: Well, according to everyone else that reviewed this card using a sound meter, and those of us that have seen them first-hand, the card is loud. Even at normal operating fan speeds the card is much louder than my SC+. This is important to some people.

Context is your friend!
 
He said that with the generation of graphics cards 2-3 years ago, AMD was marketing their cards as having more memory than NVIDIA's, whereas in this generation AMD cards have less memory and they say it doesn't matter.

His comment has nothing to do with keeping a card for 2 or 3 years.

You, sir or madam, are correct! :)
 
You are a bit off there. The big-boy cards were the HD 7990, which was 2x3GB (really 3GB effective), and the GTX 690, which was 2x2GB (2GB effective). The single-GPU HD 7970 and GTX 680 were 3GB and 2GB, respectively. All logic should have shown the GTX 690 bogging down, but there wasn't much evidence of this. (Too bad Dying Light wasn't around then. :p)

More recent was the R9 290X with 4GB of RAM versus the GTX 780 Ti with only 3GB. I don't think that got anywhere near the attention of the VRAM shortcomings of the Fury X. (Especially since the 780 Ti was more likely to be run in SLI than the awkward Fury X.)

No one really cares what the fanboys are yelling about; we really only care about what the reviewers say, since that has the most influence on purchases.

Then if that's the case, you are also off there. First, there were 6GB HD 7970s and 4GB GTX 680s, but in those days even 4GB was overkill for games. However, AMD people said, "we are more future-proof because we can pick 6GB cards for 2560x1440 gaming," etc. Suddenly 4GB is more than enough for 4K... oh well. :rolleyes:

Also, you are wrong, as the GTX 780 and 780 Ti were severely attacked in games like Watch Dogs, and in every review [H] pointed out that while those were stronger cards than the 290 and 290X (at that time), the latter offered smoother and better gameplay due to higher VRAM capacity, as the 780 and 780 Ti were a stuttering fest in that game due to VRAM.
 
Then if that's the case, you are also off there. First, there were 6GB HD 7970s and 4GB GTX 680s, but in those days even 4GB was overkill for games. However, AMD people said, "we are more future-proof because we can pick 6GB cards for 2560x1440 gaming," etc. Suddenly 4GB is more than enough for 4K... oh well. :rolleyes:

Also, you are wrong, as the GTX 780 and 780 Ti were severely attacked in games like Watch Dogs, and in every review [H] pointed out that while those were stronger cards than the 290 and 290X (at that time), the latter offered smoother and better gameplay due to higher VRAM capacity, as the 780 and 780 Ti were a stuttering fest in that game due to VRAM.

Thx, Araxie, thought I was losing my marbles for a second there.
 
I am happy your boyfriend has your back, Commander, but I really don't care about what the manufacturers claim, only the reviewers, as I stated earlier. As far as the 7970 and 680 go, most were 3GB and 2GB, so let's stick with those reviews. (I am not even sure [H] reviewed those VRAM-bloated cards, anyhow.)

Let's go back to the 4GB 290X and 3GB 780 Ti. Those cards WERE reviewed recently - at 4K resolution, mind you. Here is a link to the conclusion and their opinion on the VRAM:

http://www.hardocp.com/article/2013...ti_vs_radeon_r9_290x_4k_gaming/7#.VcvZaCZViko

Paraphrasing, they say there is no advantage at 4K with 4GB of RAM over 3GB, at least with a single card. So in essence, 3GB is enough for a card with 290X performance, though 4GB will gimp a Fury X, which really isn't much faster.

Hell, let's go even more recent. How about GTX 970s in SLI? Surely those would suffer from 4GB of RAM, since they blow a single Fury X out of the water. These cards were tested at 4K as well as NV Surround, but there was no mention of VRAM concerns.

I just think it is strange that 3 GB was seen as enough for 4k last gen, but 4 GB is not enough for 1440p this gen.
 
Then let me quote something extremely important straight from the review, and thanks for providing it:

Brent_Justice said:
The reason why we aren't seeing the difference between VRAM capacity right now is because these cards aren't fast enough to exploit the kind of game settings that would push the limits of VRAM. At this resolution these just aren't fast enough to exploit the highest in-game settings, and high levels of AA.

And then:

Brent_Justice said:
We think the difference in VRAM capacity isn't going to show itself until you are running SLI and CrossFire plus you need to be running the right game that loads up the VRAM. Right now, that is few and far between in the games that demand such a large VRAM capacity. BF4 is one game that may show a difference, and we have lots planned for BF4 testing in the future.

Another thing to take into consideration is the game suite, which is completely different: those are actually pretty old games, and a newer generation of cards behaves completely differently in them. In fact, just check the quality settings used for the heavier games in that review - Crysis 3 at "medium" - when, for example, an overclocked 980 Ti can handle that game at very high settings at 4K with similar or slightly higher FPS, and the VRAM requirement is certainly higher at those settings than at medium. Can you see it? Run those games on modern cards and the VRAM issue becomes more apparent, especially in SLI/CrossFire configurations, where A LOT of horsepower can be brought to bear to run those games easily at 4K but you can still run into VRAM limitations.

Also, at that time we didn't have games like Watch Dogs, Dragon Age: Inquisition, Shadow of Mordor, Dying Light, GTA V, etc. Newer games have higher VRAM requirements, and that's what [H] is pointing out, and that's the thing you have to understand: for ACTUAL and RECENT games, 4GB is starting to simply not be enough. If we deleted Dying Light from the [H] benchmark suite and added Shadow of Mordor or Dragon Age: Inquisition instead, things would be even worse, even for the 980 Ti. And it's good that Brent and Kyle always point out how bad AMD's marketing is, as they are the ones alerting the consumer not to fall for it; if we let AMD wash our brains with crappy marketing, we could be buying the R9 390X for 4K gaming and the Fury X for 5K gaming. :D
 
I am really considering this for my new Skylake build, but it seems this card is hit or miss. What are other good options for a factory-overclocked 980 Ti like this? I'm also looking at the ASUS Strix and Zotac Omega/Extreme cards. One of these will fit in a HAF 922, right? I won't be doing manual overclocking and would rely on anything the manufacturer has in place for automatic overclocking.
 
Got my eBay coupon in for 10% of my sales in July - woohoo! Got another Gigabyte 980 Ti G1 on the way. Why the hell do I need it for 1080p? I don't, I'm just insane in the membrane. Insane in the brain!
 
"the G1 OC "destroys (omg) the Radeon FuryX, so the fury should cost 490$ to be competitive". On that logic, why the TitanX sould cost 1000$? The G1 beats the stock Titan to. And dont give me the same crap "the Titan is a workstation card and has 12 GB Ram" the extra 6GB wont cut the 400$ premium.
The fury X, without the wattercooler would cost arround 80$ less, nobody takes that into account.
 
I agree; I did not understand how they came to the $499 logic for the Fury X. The Fury X to me should be about a $579-599 card. The Fury should be $499.
 
Who keeps GPUs for 2 or even 3 years, though, if you're a semi-hardcore gamer?

I can make it that far if I go CrossFire or SLI. Still using my two GTX 670s, because it's only with the GTX 980 Ti that I would find any significant performance boost. The $670-700 price tag is enough to make me hesitate, but I will probably get one this fall and then eventually get another for SLI once one seems underpowered.
 
"the G1 OC "destroys (omg) the Radeon FuryX, so the fury should cost 490$ to be competitive". On that logic, why the TitanX sould cost 1000$? The G1 beats the stock Titan to. And dont give me the same crap "the Titan is a workstation card and has 12 GB Ram" the extra 6GB wont cut the 400$ premium.
The fury X, without the wattercooler would cost arround 80$ less, nobody takes that into account.

Actually, as I recall, when the 980 Ti came out they did point out that its performance and price point made the Titan X look much worse when comparing price to performance.

As for the water cooler on the Fury X, sorry, but AMD does not get points for their card needing to be water cooled to perform, and that does not justify any extra being charged for it.
 
i love talking about vram and 4k, but i haven't read all the posts, so sorry if i say something that's already been said.

anyhow, who cares what people last generation or before said about vram. games change, and with that, so does the hardware they prefer.

games are starting to use quite a bit more vram (even at 1080p), people are playing at higher resolutions these days, which equals more vram, and GPUs are getting beefy enough to turn on all the eye candy and use stupid amounts of traditional AA (not shader-based). all this requires more vram (esp. the last one).
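
rough napkin math backs up the AA point: a single 1440p render target with 32-bit color and 32-bit depth is already ~28 MiB, and msaa multiplies every sample. a minimal sketch of the arithmetic, assuming just plain color + depth per sample (real engines allocate plenty more on top):

```python
# napkin math: raw render-target cost of 4x msaa at 1440p.
# assumes 4 bytes color + 4 bytes depth per sample; real engines
# add g-buffers, shadow maps, and textures on top of this.
w, h = 2560, 1440
bytes_per_sample = 4 + 4          # 32-bit color + 32-bit depth
samples = 4                       # 4x msaa

no_aa = w * h * bytes_per_sample
msaa4 = no_aa * samples
print(f"no aa: {no_aa / 2**20:.0f} MiB, 4x msaa: {msaa4 / 2**20:.0f} MiB")
# -> no aa: 28 MiB, 4x msaa: 112 MiB
```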

but of course, you need two things for high end settings: (1) the capacity, aka vram, and (2) the performance (GPU architecture and vram bandwidth). without both, either is fairly useless by itself.

we can all expect texture resolution and eye candy to increase in future games, and unless compression gets vastly better, we can expect more vram to be needed.

gta5, basically a last gen game, with my settings, uses just under 4GB of vram at 1440p w/ 4x msaa (god, does msaa do gta5 well). i need more vram. i need more performance. i need both.
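
if you want to watch that number yourself while playing instead of trusting the in-game estimate, polling the driver is easy enough. a minimal sketch, assuming the nvidia-ml-py package (imported as pynvml) is installed; gpu index 0 and the 2-second interval are arbitrary choices:

```python
# minimal sketch: poll actual vram usage via nvml while a game runs.
# assumes nvidia-ml-py (imported as pynvml) and an nvidia driver.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first gpu; an assumption
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"vram used: {mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MiB")
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()
```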

hope pascal can do 4k + ~4x msaa on a single die.
 
I am happy your boyfriend has your back, Commander, but I really don't care about what the manufacturers claim, only the reviewers, as I stated earlier.

Whoa there, Tex. Not so fast on the trigger. My original post wasn't addressed to you, and I'm not sure what you think we disagree about so much anyway. 6GB cards vs. 4GB cards WERE widely available in the generation I recalled; the memory difference made for no advantage then, and it does now. What's to quibble over?

As far as my boyfriends are concerned, it's like RAM--it's not the size that counts, it's how you use it...
 
Sorry, I wilt away like Roger from American Dad if I don't let out my bitchiness once in a while.
 
"the G1 OC "destroys (omg) the Radeon FuryX, so the fury should cost 490$ to be competitive". On that logic, why the TitanX sould cost 1000$? The G1 beats the stock Titan to. And dont give me the same crap "the Titan is a workstation card and has 12 GB Ram" the extra 6GB wont cut the 400$ premium.
The fury X, without the wattercooler would cost arround 80$ less, nobody takes that into account.

... Are you implying that anyone around here thinks the Titan X is value for money? Nobody is saying that. That thing is ludicrously overpriced and always has been.

Also, nobody calls the Titan X a workstation card; GM200 is a pure SP piece of silicon, unlike GK110. Not that the excuse ever really flew outside of the fanboy crowd with the original Titan and Titan Black, since NVIDIA marketed them for gaming.

Would the Fury X be cheaper without a CLC? Absolutely, without question. But it would also be slower, because the clock speed would need to be wound back, much like the stock Fury, and the Fury X is already slower than the equivalent offerings in the 980 Ti department (stock or otherwise).
 
"the G1 OC "destroys (omg) the Radeon FuryX, so the fury should cost 490$ to be competitive". On that logic, why the TitanX sould cost 1000$? The G1 beats the stock Titan to. And dont give me the same crap "the Titan is a workstation card and has 12 GB Ram" the extra 6GB wont cut the 400$ premium.
The fury X, without the wattercooler would cost arround 80$ less, nobody takes that into account.

No one takes it into account because it means fuck all to the consumer. Why it costs what it does is not important at all. The cost to produce is AMD's problem, not the end user's. What matters in the end is what it offers us: price, performance, features, support. Those four things matter. Cost of production, profit, return on investment; those are all things for AMD and its investors to worry about. If people feel a product is not worth the asking cost, then that's all there is to it.
 
No one takes it into account because it means fuck all to the consumer. Why it costs what it does is not important at all. The cost to produce is AMD's problem, not the end user's. What matters in the end is what it offers us: price, performance, features, support. Those four things matter. Cost of production, profit, return on investment; those are all things for AMD and its investors to worry about. If people feel a product is not worth the asking cost, then that's all there is to it.

Extremely well put, +1 for you!
 
Too bad mine and many other people's G1s had serious coil whine issues, and others have had problems with dead DVI ports out of the box. Pretty poor QA, honestly. That, and the fans get insanely loud to keep that card cool at higher temps with those dinky 80mm fans. I did a vid of it:

https://www.youtube.com/watch?v=pAILvRdtP6s

After RMAing my first G1 because of coil whine, my second one had it also. Then I replaced my power supply due to an awesome deal on an EVGA 850 G2, and guess what happened... no more coil whine!! The card is absolutely silent, and it was the PSU all along. I suspect this happens more often than not.
 
After RMAing my first G1 because of coil whine, my second one had it also. Then I replaced my power supply due to an awesome deal on an EVGA 850 G2, and guess what happened... no more coil whine!! The card is absolutely silent, and it was the PSU all along. I suspect this happens more often than not.

WTF... well, what the hell would I even replace my Corsair AX860 with? It's basically a Seasonic.
 
WTF... well, what the hell would I even replace my Corsair AX860 with? It's basically a Seasonic.

Dunno. It was kind of a perfect storm for me. I've had my OCZ 1000W through a few GPU setups going back to GTX 580 SLI. Always had coil whine, but always figured it was the nature of the beast. Been running single cards since my GTX 690, then a 980, and now a 980 Ti, and had CW. Even switched mobos a few months ago... CW. Figured it was time to replace my 1kW (which still worked fine but really was overkill anyway) and all but stole the 850 G2 off eBay. Lo and behold... absolutely no coil whine. Like I said, I RMAed my first 980 Ti G1 thinking it would go away with luck; it didn't. Again, I wonder how many "good" cards took the rap for the PSU, which was probably the cause. Of course, YMMV.
 
Dunno. It was kind of a perfect storm for me. I've had my OCZ 1000W through a few GPU setups going back to GTX 580 SLI. Always had coil whine, but always figured it was the nature of the beast. Been running single cards since my GTX 690, then a 980, and now a 980 Ti, and had CW. Even switched mobos a few months ago... CW. Figured it was time to replace my 1kW (which still worked fine but really was overkill anyway) and all but stole the 850 G2 off eBay. Lo and behold... absolutely no coil whine. Like I said, I RMAed my first 980 Ti G1 thinking it would go away with luck; it didn't. Again, I wonder how many "good" cards took the rap for the PSU, which was probably the cause. Of course, YMMV.

So do you think something with the rail from the PSU was causing coil whine from the card, or was the noise coming from the PSU the whole time? If it was the former, I'd be very interested in a technical explanation of what causes that. (Not TOO technical, but you know...) Might be something PSU reviewers need to watch for.
 
So do you think something with the rail from the PSU was causing coil whine from the card, or was the noise coming from the PSU the whole time? If it was the former, I'd be very interested in a technical explanation of what causes that. (Not TOO technical, but you know...) Might be something PSU reviewers need to watch for.

I know for an absolute fact all the coil whine was from my GPUs, but I honestly don't feel like selling a perfectly working used PSU just to buy a new one on a whim that it MIGHT help, on a slim chance.
 
I know for an absolute fact all the coil whine was from my GPUs, but I honestly don't feel like selling a perfectly working used PSU just to buy a new one on a whim that it MIGHT help, on a slim chance.

I changed my PSU during the ownership of my 1st GTX 980 Ti, going from a 1000W EVGA G2 to a PS model; coil whine was identical across both PSUs. If it sounds like it's coming from the card... it's probably the card, given the number of reviews indicating such.
 
So what has been the all-around solution for this? I just got mine today, and the coil whine is insanely loud, and one of the DVI ports doesn't even work...? Just RMA through Amazon and hope for a good one, or switch to a different brand altogether?
 
If one of your DVI ports doesn't work, why wouldn't you RMA it?

I mean, obviously, but I mean about the whine: should I RMA it for another G1 Gaming or go with another brand? I've always had Gigabyte and don't really want to RMA this thing 20 times, I guess? I'm getting a new PSU next week, so I might just hold onto it and see if that changes anything, but ;{.
 
Bottom line: If you spend that kind of money on a GPU and it makes a ton of noise that isn't directly related to the cooling solution (and for a darn good reason), RMA once and then return for a refund.
 
Read the review and decided to purchase, as I do trust your reviews. Here is feedback on my experience with the card so far.

Despite the "SOC GPU Gauntlet™ Sorting," the ASIC quality of 64.1% is the lowest-rated card I have ever owned - gutted! However, with the F4 BIOS, stock voltage, and the core set to 1225MHz, it boosts to 1402MHz with the fans spinning at ~2100rpm (~60%) and temps in check at ~70C. My rig is quiet, the fans at 60% are not noticeable, and so far I have not noticed coil whine, so all good; I just need to overcome my prejudice against such a low ASIC quality score.
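
For anyone wanting to sanity-check their own sample the same way, here is a minimal telemetry sketch, assuming the nvidia-ml-py package (imported as pynvml); note that NVML reports fan duty as a percentage rather than rpm:

```python
# minimal sketch: log boost clock, temperature, and fan duty once per
# second to confirm the card holds its boost (e.g. ~1402MHz) under load.
# assumes nvidia-ml-py (imported as pynvml); gpu index 0 is an assumption.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(gpu)  # percent of max, not rpm
        print(f"{clock} MHz | {temp} C | fan {fan}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```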

However, above ~60% the fans get progressively noisy in a bad way. There is both the air-movement whoosh sound and, much more undesirable, a "singing/ringing" sound, exactly as captured in this video clip of various cards including the G1.

http://www.computerbase.de/videos/2015-06/geforce-gtx-980-ti-partnerkarten-lautstaerke/

Based on that clip, I would rate this card the worst of the four examples for fan noise. It may keep the card cool, but spinning above 70% is too loud for my living room, as other family members watching TV have made clear to me.
 
So, I thought I had pretty decent overclocking on my Gigabyte G1 980 Ti when I first bought it. Over time I discovered that it was pretty unstable. I'm back to stock clocks AND I've increased the voltage just to get it stable there.

And my 2nd DVI port didn't work initially, but then it did, and now it doesn't again - piece of crap... I was going to RMA it and probably will eventually, but I don't feel like being without a card for a month right now.

Overall, not really happy with the card, especially considering what most of the review samples have been like.
 
I got a new PSU from RMA because my old Antec 750's fan was making noise... they sent me a brand-new Antec High Current Gamer 750M, and now the card is dead silent... zero whine, when before it was insanely loud... so I guess the PSU does matter, but why so sensitive?
 
I've just got one of these cards, and I can say only one thing:

Loud and hot.

The fan-off mode is straight-up useless: when the fans turn off at idle, the ambient temp around the card edges up over 50C, causing my case fan to spin up, which it never did while I had my CrossFire 290X configuration, not even during heavy gaming.

After I manually set the fans to never stop, this problem is solved. But as soon as I put load on the card, the fans start spinning up, and anything over 50% is louder than what I'd accept. The noise is comparable only to the reference 290X that I used to have, until I replaced its cooler with water.

I'll check if it's possible to solve the issue with a manual fan profile, as the default one is very aggressive: the fans spin at 60% at 50C and 80% at 70C.
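
In case it helps anyone else fighting the stock curve, here is a minimal sketch of a software fan profile. It assumes Linux/X11 with manual fan control unlocked via the driver's Coolbits option, the nvidia-ml-py package (imported as pynvml) for temperature reads, and the nvidia-settings attributes GPUFanControlState and GPUTargetFanSpeed; attribute names vary by driver generation, and the curve points are made-up examples, so treat it as illustrative only:

```python
# minimal sketch: a gentler software fan curve than the stock
# 60%@50C / 80%@70C profile. assumes Coolbits is set so that
# nvidia-settings accepts manual fan control; all curve points
# below are example values, not recommendations.
import subprocess
import time
import pynvml

CURVE = [(40, 30), (55, 45), (65, 55), (75, 70), (85, 100)]  # (temp C, fan %)

def fan_for(temp):
    for t, s in CURVE:
        if temp <= t:
            return s
    return CURVE[-1][1]  # pin to max above the last point

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
subprocess.run(["nvidia-settings", "-a", "[gpu:0]/GPUFanControlState=1"],
               check=True)
try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        subprocess.run(["nvidia-settings", "-a",
                        f"[fan:0]/GPUTargetFanSpeed={fan_for(temp)}"],
                       check=True)
        time.sleep(5)  # re-evaluate the curve every few seconds
finally:
    pynvml.nvmlShutdown()
```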

What is a safe GPU temp for the 980 Ti? I don't care if I have to settle for factory clocks; I'd rather have that than the noise.
 
I have the Gigabyte Windforce non-G1 version; I wonder if I can flash this VBIOS onto it?
 
Read the review and decided to purchase, as I do trust your reviews. Here is feedback on my experience with the card so far.

Despite the "SOC GPU Gauntlet™ Sorting," the ASIC quality of 64.1% is the lowest-rated card I have ever owned - gutted! However, with the F4 BIOS, stock voltage, and the core set to 1225MHz, it boosts to 1402MHz with the fans spinning at ~2100rpm (~60%) and temps in check at ~70C. My rig is quiet, the fans at 60% are not noticeable, and so far I have not noticed coil whine, so all good; I just need to overcome my prejudice against such a low ASIC quality score.

However, above ~60% the fans get progressively noisy in a bad way. There is both the air-movement whoosh sound and, much more undesirable, a "singing/ringing" sound, exactly as captured in this video clip of various cards including the G1.

http://www.computerbase.de/videos/2015-06/geforce-gtx-980-ti-partnerkarten-lautstaerke/

Based on that clip, I would rate this card the worst of the four examples for fan noise. It may keep the card cool, but spinning above 70% is too loud for my living room, as other family members watching TV have made clear to me.

Other issues aside, [H] GPU editors have been stating that the ASIC quality test is not a valid measurement of real-world performance potential. It seems that your experience bears this out.
 
I got a new PSU from RMA because my old Antec 750's fan was making noise... they sent me a brand-new Antec High Current Gamer 750M, and now the card is dead silent... zero whine, when before it was insanely loud... so I guess the PSU does matter, but why so sensitive?

I think that if [H] could investigate and find out the connection between PSUs and coil whine, it would be a major scoop. Something is going on here. Once they knew what caused it, they could make testing for it a part of their PSU reviews.
 
Coil whine - when I used to work at a GPU company, the engineers kinda explained it to me: certain components are sensitive to noise feedback, and when they hit a specific frequency due to load/power draw, they exhibit the "coil whine." He also told me it's usually because the circuit design is shit, as the ASIC designers didn't account for this properly. From memory, FurMark seemed to be the best tool to cause this, as it creates an unrealistic GPU load that earlier GPUs did not account for, hence the "frying" or damaging of said cards.
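
That load dependence is also why uncapped frame rates (a menu rendering at 1000+ FPS, or FurMark) so often set it off, and why capping the frame rate is a common mitigation: it moves the excitation frequency. A minimal sketch of the frame-limiter idea, with an arbitrary example target; real games do this in-engine or via the driver:

```python
# minimal sketch: cap a render loop's frame rate so the gpu's load
# frequency, and with it any coil whine, stays out of the worst band.
# TARGET_FPS is an arbitrary example value; render_frame is a stub.
import time

TARGET_FPS = 120
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    pass  # placeholder for real draw calls

while True:
    start = time.perf_counter()
    render_frame()
    leftover = FRAME_TIME - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)  # sleep away the rest of the frame budget
```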

Disclaimer - I am paraphrasing from memory; I don't remember all the technical details, and I am not an electrical engineer.

I do have a Gigabyte Windforce (non-G1), but it does use a similar PCB design - dual 8-pin with dual DVI - and I'm using a Seasonic X-850 without any issues. I've only stressed the GPU with Unigine Valley/Heaven.
 