GTX 970/980 specs @ Techpowerup, plus 3dmark gpu-score leaks... Neo: "Woah."

So many of you guys talking about getting more than one card don't even factor in scaling issues, which are usually driver- and game-dependent.

A premium is paid for one high-end card, as compared to two slightly lower models, precisely to avoid that scaling sacrifice.

As far as this second-gen Maxwell is concerned, we knew this was coming; the 750 Ti made it really clear this would be a game changer.

I'd imagine everyone here understands the concept of a premium for the flagship. However some of us run 4k monitors aka early/bleeding-edge tech and a single "premium" card cannot handle our needs ;).
 
And only about 1% of people who buy video cards would even mod the bios.

When is the last time you read a review of a video card on a major hardware website that showed they used a modded bios to overclock a card and compare to other stock video cards?

Saying the card will overclock better with a modded BIOS is like saying a Mustang will run faster with a turbo...

Skyn3t bios and msi afterburner voltage unlock
 
I'm not sure Maxwell is a game-changer. It seems like just a typical iteration of architectures. Another good one, but no real new tech or anything like that.

Being forced to 28nm kind of dampens Nvidia's ability to completely uncork it, yes; but no Denver cores, no unified memory, etc. It'll be like the Fermi-to-Kepler move.
 
I'm not sure Maxwell is a game-changer. It seems like just a typical iteration of architectures. Another good one, but no real new tech or anything like that.

Being forced to 28nm kind of dampens Nvidia's ability to completely uncork it, yes; but no Denver cores, no unified memory, etc. It'll be like the Fermi-to-Kepler move.

I think that you are right but I hope that you are wrong.
 
175W?! This chip is begging for a dual GPU card. Give me a 990 for <$1000 please!

Anyone thinking this is going to be a "175W" GPU is dreaming. 175W cards do NOT come equipped with nearly 300W of power delivery capability (75W PCIe slot, 75W 6-pin, and 150W 8-pin) just for looks. The 750 Ti OCs draw about 60-65W through the PCIe slot (75W limit), and power use does not typically scale... yet some are somehow believing that nV is going to magically deliver 4x the perf at just 3x the power cost??
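To put rough numbers on the skepticism here (all figures are taken from the post above; purely illustrative):

```python
# Power-delivery budget implied by the rumored connector layout
PCIE_SLOT_W = 75    # PCIe x16 slot spec limit
SIX_PIN_W = 75      # 6-pin PEG connector spec limit
EIGHT_PIN_W = 150   # 8-pin PEG connector spec limit

max_delivery = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(max_delivery)  # 300 W of capability vs. a rumored 175 W TDP

# The "4x perf at 3x power" math, scaled from the 750 Ti's ~60 W draw
power_ratio = 175 / 60
print(round(power_ratio, 2))  # ~2.92x the power budget
```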
 
$400 for a card slower than a 780 Ti? Well, that is disappointing. So roughly GTX 780 speeds... which is only 15% or so faster than my GTX 670 that cost $400 over two years back.

Even if they clock it up it will still be a pretty minimal upgrade from what we could buy at the same price 3 years ago.
 
$400 for a card slower than a 780 Ti? Well, that is disappointing. So roughly GTX 780 speeds... which is only 15% or so faster than my GTX 670 that cost $400 over two years back.

Even if they clock it up it will still be a pretty minimal upgrade from what we could buy at the same price 3 years ago.

A stock GTX 980 is faster than a 780 Ti by a good margin (10%+) and will likely cost $499, which is $200 less than the 780 Ti. Additionally, it will consume much less power, generate less heat and noise, and have 1GB more VRAM. How is that disappointing?

Probably 8GB 3-6 months down the road.

Within a month or so, around late October / early November supposedly.

Anyone thinking this is going to be a "175W" GPU is dreaming. 175W cards do NOT come equipped with nearly 300W of power delivery capability (75W PCIe slot, 75W 6-pin, and 150W 8-pin) just for looks. The 750 Ti OCs draw about 60-65W through the PCIe slot (75W limit), and power use does not typically scale... yet some are somehow believing that nV is going to magically deliver 4x the perf at just 3x the power cost??

Maxwell is ludicrously power efficient, and the GTX 980 actually has two 6-pin connectors ;) for reference models. Maxwell's TDP maximum in the BIOS on the GTX 750 Ti is actually 38 watts, i.e. it throttles itself by default if you exceed that. So no, it does not consume 65W at reference.
 
A stock GTX 980 is faster than a 780 Ti by a good margin (10%+) and will likely cost $499, which is $200 less than the 780 Ti. Additionally, it will consume much less power, generate less heat and noise, and have 1GB more VRAM. How is that disappointing?

A 10% performance increase is nothing to get excited about, especially considering the price drop. Less power is nice, but how much money does it save in the long run (2-3 years)? Less heat/power is nice, but unless your current card is killing itself, it isn't worth upgrading unless you have money to burn (nothing wrong with that).

My main concern is for the 970, though. If the jump is only 15%, it isn't worth forking over the extra $300 or so to upgrade to it. Hopefully it turns out to be faster. As for heat, my current GTX 670 rarely pushes over the low 70s even on the hottest days, so heat is not a concern for me.
 
I have done nothing of the sort; I made a general statement: don't believe rumors.

I am correct in my statement. One should never take rumors at face value.
 
I actually was saying that tongue in cheek. I agree with you.
 
I get worked up that some people get worked up about rumors and actually make buying decisions based on them.

It is not logical.

Yep.

I plan to buy 980 if it beats 780Ti since I am currently running a GTX 690. If it equals a 780Ti I'll just wait till the big die comes out.
 
I get worked up that some people get worked up about rumors and actually make buying decisions based on them.

It is not logical.

Of course it is. Silver/gold are dropping. Rumors say it will rise after we bomb the terrorists, so it will go up again! :D
 
I get worked up that some people get worked up about rumors and actually make buying decisions based on them.

It is not logical.

Heh, personally I sold my 780 a couple of months ago when it wasn't cutting it for my new 4K monitor ;). Shortly after, when I was deciding whether to go 780 Ti or SLI 780s, the rumors started circulating of a new card hitting soon... at which point it would have been silly to get an old-gen one :p. I don't normally spend time on the waiting game, but if a new release is coming soon it would be a waste, or at least safe to assume it would be, to buy an old card setup that had been out for quite a while by then.
 
I get worked up that some people get worked up about rumors and actually make buying decisions based on them.

It is not logical.


This is what drives me most nuts about rumor threads. You hear some pretty far-out shit, buying-decision-wise, and it starts a shitnami every time.
 
...
Maxwell is ludicrously power efficient, and the GTX 980 actually has two 6-pin connectors ;) for reference models. Maxwell's TDP maximum in the BIOS on the GTX 750 Ti is actually 38 watts, i.e. it throttles itself by default if you exceed that. So no, it does not consume 65W at reference.

Please note I did mention OC, and they most certainly do draw well above the TDP (38.5W) set in the GPU's BIOS:

Nv states

Graphics Card Power (W): 60W

And as awesome as people like to make out Maxwell's efficiency, it only just surpassed AMD's already-high perf/W mark set 2+ years ago with Pitcairn and then Bonaire.

Granted, perf/W is largely dependent upon the application used in the comparison; however, when you look at AMD's lower-end products and how/where they compare to the 750 Ti, the perf largely falls in line with the power consumption: AMD parts consume 12-25% more power but offer 15-20% more perf.

Hard to say how efficient an architecture as a whole is based on only ONE part. One can't simply say that "Maxwell is ludicrously power efficient" because a low-end part does well. After all, if you look at Cape Verde -> Pitcairn and then Bonaire, they are incredibly efficient as well; however, we see that when the architecture scales up, power climbs and efficiency (%) drops.
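Plugging the quoted ranges into a quick perf/W comparison (a sketch only; `relative_perf_per_watt` is a made-up helper name, and the percentages are the ones stated above):

```python
def relative_perf_per_watt(perf_ratio, power_ratio):
    """perf/W of one card relative to another, given perf and power ratios."""
    return perf_ratio / power_ratio

# AMD part vs. 750 Ti: +15-20% perf at +12-25% power (ranges from the post)
best_case = relative_perf_per_watt(1.20, 1.12)   # most perf, least extra power
worst_case = relative_perf_per_watt(1.15, 1.25)  # least perf, most extra power
print(round(best_case, 2), round(worst_case, 2))  # ~1.07 and ~0.92
```

So depending on the workload picked, the AMD parts land slightly ahead of or slightly behind the 750 Ti in perf/W, which is the "largely falls in line" point above.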
 
I get worked up that some people get worked up about rumors and actually make buying decisions based on them.

It is not logical.

This. Hence why one should take rumors with a grain of salt. It's only when said rumor comes from a reliable source (rarely) that I take it seriously.
 
This is what drives me most nuts about rumor threads. You hear some pretty far-out shit, buying-decision-wise, and it starts a shitnami every time.

Rumor threads are the best! You have people working themselves up over something that doesn't really matter in the grand scheme of things. I can wait and examine the benchmarks, and not work myself up at night waiting for the next post.
 
People, just wait for the actual cards.

If it's not your cup of tea, get over it. The people who will buy these, "WILL buy these".
 
So basically, the 900 series will have better texture compression, enabling a 50% reduction in required VRAM bandwidth, right?
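Taking the 50% figure at face value (a sketch only; real compression ratios vary by workload), cutting the bytes moved in half makes a bus behave like one with twice the throughput:

```python
def effective_bandwidth(raw_gbs, traffic_reduction):
    # If compression removes `traffic_reduction` of the bytes actually moved,
    # the bus delivers the equivalent of raw / (1 - reduction)
    return raw_gbs / (1.0 - traffic_reduction)

raw = 224.0  # GB/s for a 256-bit bus at 7 GHz effective memory clock
print(effective_bandwidth(raw, 0.5))  # 448.0 GB/s equivalent at a 50% cut
```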
 
And as awesome as people like to make out Maxwell effeciency it only just surpassed AMD's already high mark perf/w set 2+ years ago with the pitcairn and then bonaire.

How DARE YOU COME IN HERE WITH LOGIC BACKED UP BY FACTS?!?! Are you drunk?! If you don't repent by chugging a keg of green Kool-Aid, I will have to take away your internetz!!! :D

Bravo sir, bravo... Notice not one of the usual Nvidiots even attempted to refute your claim ;)...


Back on topic: I think many of you are putting WAY too much faith in the huge L2 cache / new wide front end that you think is going to keep that 256-bit bus from choking over 1440p with any sort of AA... If those leaked specs are true, there is just so little bandwidth. I want to be wrong, but I think everything Maxwell was promised to be has been pushed to Pascal. It sure is going to be super interesting to see all the fanboys go crazy when the benchmarks drop. :p
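For reference, raw bandwidth falls straight out of bus width and effective memory clock; the 7 GHz figure below is an assumption based on the leaked specs being discussed:

```python
def bandwidth_gbs(bus_bits, eff_clock_ghz):
    # bytes per transfer (bus_bits / 8) times effective transfers per second
    return bus_bits / 8 * eff_clock_ghz

print(bandwidth_gbs(256, 7.0))  # 224.0 GB/s (rumored 980)
print(bandwidth_gbs(384, 7.0))  # 336.0 GB/s (780 Ti's wider bus, for comparison)
```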
 
Can someone explain to me why power efficiency is needed in a gaming card?

I'm not looking for the same speed at the same power use; I just want more performance, plain and simple. AMD seems to grasp this.
 
Can someone explain to me why power efficiency is needed in a gaming card?

I'm not looking for the same speed at the same power use; I just want more performance, plain and simple. AMD seems to grasp this.

Some people want to save on the bill. A gaming PC with SLI or CrossFire, if used for some hours a day, can increase your bill by a little. Also, the bigger the power consumption, the bigger the PSU must be, and bigger PSUs generally cost more than smaller ones.

Another important factor is heat: a GPU that requires a lot of power produces a lot of heat, which means your PC will run hotter and be noisier.

Plain and simple, uh-huh?
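A back-of-envelope sketch of the bill argument (the $0.12/kWh rate and the hours are assumed values, not from the thread):

```python
def yearly_cost(extra_watts, hours_per_day, price_per_kwh=0.12):
    # kWh drawn over a year by the extra wattage, times the electricity rate
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. a card drawing 200 W more, gamed on 4 hours a day
print(round(yearly_cost(200, 4), 2))  # ~35.04 dollars a year
```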
 
Can someone explain to me why power efficiency is needed in a gaming card?

I'm not looking for the same speed at the same power use; I just want more performance, plain and simple. AMD seems to grasp this.

It reduces the bill of parts, for one thing. More importantly though power is the ultimate limiter of performance.
 
Can someone explain to me why power efficiency is needed in a gaming card?

I'm not looking for the same speed at the same power use; I just want more performance, plain and simple. AMD seems to grasp this.

I don't care about the electricity bill, but power efficiency is very nice from a noise/performance point of view. Not only that, but because it takes less to cool the GPU, chances are we can use it in more builds with less performance sacrifice (SFF systems, laptops, etc.).
 
Plain and simple, uh-huh?

If you can afford 500 bucks for a card, you will manage the few dollars at the end of the year for the power bill. The whole argument is absurd; it's like buying a muscle car for a fortune and then worrying about the gasoline price. Muscle cars = 4K gaming.

There is a reason why Intel sells its ultra-low-power CPUs in a separate category; the new 8-cores are energy hogs, yet people are drooling over their performance.

It's like you are building a case in case those rumors turn out true, yet you are being duped into paying 500 bucks for a mid-range card that possibly won't be able to handle 4K.

Noise? Use headphones.

More importantly though power is the ultimate limiter of performance.

The 295X2 disagrees with you.
 
It reduces the bill of parts, for one thing. More importantly though power is the ultimate limiter of performance.

Power efficiency helps you build a more affordable computer as well: a lower-wattage PSU, less extreme cooling, a more compact build, and money saved. Power is also a limiter of performance if it climbs too high; there are power limits on the PCIe slot and the power cables. The more you can save on power, the less heat you need to exchange, and the more performance can improve through clock speed or additional shaders.
 
Flagship cards aren't for affordable PCs but for high-end rigs; buy a 970 or 750 Ti if you need an affordable PC.

With this, Nvidia will move flagship cards into the Titan range, increasing the price of the mid-range ones.

If those specs turn out to be true, of course, but the more I think about it, the more logical it is in a consumer-leeching way.
 
I'd rather pick Card C: 70 fps at 500W :cool:

Tell me, what incentive does a 780 user have to buy a 980?

Ironic, since I even bought an over-1kW PSU for a new GPU/CPU upgrade; at least Intel didn't fall short.
 
If you want power efficiency, buy a powerful card and then underclock and undervolt it a bit.
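A first-order sketch of why undervolting pays off: dynamic CMOS power scales roughly with V squared times f, so voltage cuts count twice (illustrative ratios, not measurements):

```python
def relative_power(v_ratio, f_ratio):
    # Dynamic power ~ C * V^2 * f; ratios are new/old voltage and clock
    return v_ratio ** 2 * f_ratio

# e.g. 10% lower voltage and 5% lower clock
print(relative_power(0.90, 0.95))  # ~0.77x the dynamic power for ~5% less perf
```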
 
If you want power efficiency, buy a powerful card and then underclock and undervolt it a bit.

I don't know if it works that way.

At least for CPUs, Intel implemented an 80% margin, so you can undervolt 20% max, to protect its Q, T, S, and E model sales I assume.

But then again, I never even considered or had any reason to do that with my GPU. Winter is coming ;)
 
If Flagship Card:

2012 Card A = 60 FPS at 500W

2014 Card B = 60 FPS at 300W

I'm choosin Card B

That's really what it should read. Even though you still get 60 fps two years later, at what cost to performance? 0% gain, just better power efficiency? If there is no substantial performance increase, and Card A is starting to compromise settings in today's games, like turning down AA or setting textures to medium, then Card B with the same performance at 200 watts less still doesn't let you turn the settings up. So why bother??

If I want to save power, I have already done so by changing all the lights in my house and buying Energy Star rated electrical appliances. I want power in my rig, not a hipster. ;)
 
I don't know that it matters that much for a gamer (I have a mini-ITX system, so it does interest me). I do understand Nvidia's perspective, though, when they said the entire architecture was designed from the ground up with a focus on mobile, since those are markets that are very interesting to them.
 